The UK has said that students in England and Wales will no longer receive exam results based on a controversial algorithm after accusations that the system was biased against students from poorer backgrounds, Reuters and BBC News report. The announcement followed a weekend of demonstrations at which protesters chanted “fuck the algorithm” outside the country’s Department for Education.
Instead, students will receive grades based on their teachers’ estimates after formal exams were canceled due to the pandemic. The announcement follows a similar U-turn in Scotland, which had previously seen 125,000 results downgraded.
chants of “fuck the algorithm” as a speaker talks of losing her place at medical school because she was downgraded. pic.twitter.com/P15jpuBscB
— huck (@HUCKmagazine) August 16, 2020
In the UK, A-levels are the set of exams taken by students around the age of 18. They’re the final exams taken before university, and they have a huge impact on which institution students attend. Universities make offers based on students’ predicted A-level grades, and usually, a student will have to achieve certain grades to secure their place.
In other words: it’s a stressful time of year for students, even before the country’s exam regulator used a controversial algorithm to estimate their grades.
As the BBC explains, the UK’s Office of Qualifications and Examinations Regulation (Ofqual) relied primarily on two pieces of information to calculate grades: the ranking of students within a school and their school’s historical performance. The system was designed to generate what are, on a national level, broadly similar results to previous years. Overall, that’s what the algorithm accomplished, with The Guardian reporting that results are up on previous years, but only slightly. (The percentage of students achieving an A* to C based on the algorithm’s grading rose by 2.4 percentage points compared to last year.)
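To make the mechanism concrete, the approach described above can be sketched in a few lines of Python. This is purely illustrative, not Ofqual’s actual model: it assumes a school supplies its students ranked best-to-worst and a historical grade distribution (both hypothetical inputs here), then hands out grades top-down in proportion to the school’s past results.

```python
def assign_grades(ranked_students, historical_distribution):
    """Illustrative sketch of rank-based grade allocation.

    ranked_students: list of student IDs, best performer first.
    historical_distribution: {grade: fraction of past students},
    ordered from highest grade to lowest, e.g.
    {"A*": 0.1, "A": 0.2, "B": 0.4, "C": 0.3}.
    Returns {student: grade}.
    """
    n = len(ranked_students)
    grades = {}
    i = 0
    cumulative = 0.0
    for grade, fraction in historical_distribution.items():
        cumulative += fraction
        # Students ranked inside this cumulative share of the cohort
        # receive this grade, mirroring the school's past distribution.
        cutoff = round(cumulative * n)
        while i < cutoff:
            grades[ranked_students[i]] = grade
            i += 1
    # Any rounding leftover falls to the lowest available grade.
    lowest = list(historical_distribution)[-1]
    while i < n:
        grades[ranked_students[i]] = lowest
        i += 1
    return grades


# A school whose past cohorts split 50/50 between A and B:
print(assign_grades(["Amir", "Beth", "Cara", "Dev"],
                    {"A": 0.5, "B": 0.5}))
# → {'Amir': 'A', 'Beth': 'A', 'Cara': 'B', 'Dev': 'B'}
```

The sketch also shows where the unfairness comes from: no matter how strong this year’s top student is, they can never receive a grade better than the best grade in their school’s historical distribution.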
Results from independent schools rose more than those from state schools
But it’s also led to thousands of grades being lowered from teachers’ estimates: 35.6 percent of grades were adjusted down by a single grade, while 3.3 percent went down by two grades, and 0.2 percent went down by three. That means a total of almost 40 percent of results were downgraded. That’s life-changing news for anyone who needed to achieve their predicted grades to secure their place at their university of choice.
Worse still, data suggests that fee-paying private schools (also known as “independent schools”) disproportionately benefited from the algorithm used. These schools saw the share of grades at A and above increase by 4.7 percentage points compared to last year, Sky News reports. Meanwhile, state-funded “comprehensive” schools saw an increase of less than half that: 2 percentage points.
A variety of factors seem to have biased the algorithm. One theory put forward by FFT Education Datalab is that Ofqual’s approach varied depending on how many students took a given subject, and this decision seems to have led to fewer grades being downgraded at independent schools, which tend to enter fewer students per subject. The Guardian also points out that what it calls a “shockingly unfair” system was happy to boost the number of “U” grades (aka, fails) and round down the number of A* grades, while one university lecturer has pointed out other failings in the regulator’s approach.
Fundamentally, however, because the algorithm placed so much importance on a school’s historical performance, it was always going to cause more problems for high-performing students at underperforming schools, where the individual’s work would be lost in the statistics. Average students at better schools, meanwhile, seem to have been treated with more leniency. Part of the reason the results have caused so much anger is that this outcome reflects what many see as the wider biases of the UK’s education system.
The government’s decision to ignore the algorithmically determined grades will be welcome news to many, but even using teachers’ predictions comes with its own problems. As Wired notes, some studies have suggested such predictions can suffer from racial biases of their own. One study from 2009 found that Pakistani pupils were 62.9 percent more likely than their white counterparts to be predicted a lower score in one set of English exams, and that results for boys from Black and Caribbean backgrounds can spike when they’re assessed anonymously starting from age 16.