Students uncover method to manipulate AI grading algorithm

Spamming relevant keywords, many of which could be found online, saw students' grades rise to potentially perfect scores

Adam Smith
Thursday 03 September 2020 13:17 EDT


Students in the United States have discovered a way to cheat an algorithm marking their work by simply typing out a list of words relevant to the topic.

Edgenuity, an online platform for virtual learning, was found to be using an artificial intelligence system that scanned students’ answers for keywords it expected to see.

The discovery was made after one child received a 50 per cent grade on an assignment – marked within seconds by the algorithm – and experimenting with other responses revealed the method Edgenuity appeared to be using to determine grades.

The flaw was highlighted by Dana Simmons, an associate professor of history at the University of California.

“My kid started Jr. High last week. He couldn't stop talking about how much he loved his History teacher. This afternoon we found him in tears, overcome by stress and self-doubt. His grade for his first short answer homework: 50/100”, Simmons tweeted.

Through trial and error, Simmons realised what was necessary to beat the algorithm and informed her child.

“Two full sentences, followed by a word salad of all possibly applicable keywords. 100% on every assignment. Students on @EdgenuityInc, there's your ticket. [My child] went from an F to an A+ without learning a thing,” she said.

Simmons criticised educators’ use of algorithms to mark students’ work, but said there were benefits in using the technology to gauge their progress.

One student told The Verge that they submitted keywords when they were “completely clueless”, an approach that apparently worked “more often than not”.

Another student said the keywords could usually be found online, “nine times out of ten”, although they avoided using them in every answer in order to hide their actions.

“If activities include keywords that are used for determining a system-assigned score, the student will earn a zero per cent if none of the keywords are included in the response, and will earn 100 per cent if at least one keyword is included in the response,” reads Edgenuity’s website.
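That all-or-nothing rule is what makes the “word salad” trick work. The sketch below is a hypothetical illustration of such keyword matching in Python; the function name, keyword list and example answers are invented for illustration and are not Edgenuity’s actual implementation.

```python
def grade_response(response: str, keywords: list[str]) -> int:
    """Hypothetical sketch of the rule quoted above: 100 per cent if the
    response contains at least one expected keyword, zero if it contains none.
    This is an illustration, not Edgenuity's actual code."""
    text = response.lower()
    return 100 if any(kw.lower() in text for kw in keywords) else 0


# A "word salad" of keywords scores full marks; a keyword-free answer scores zero.
print(grade_response(
    "Two full sentences. mercantilism trade navigation acts colonies",
    ["mercantilism", "Navigation Acts"]))                  # 100
print(grade_response("I am not sure.", ["mercantilism"]))  # 0
```

Under a rule like this, the quality of the sentences is irrelevant: appending every plausible keyword to the end of an answer is enough to guarantee full marks.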

Teachers do have the ability to review grades given by the algorithm. But students complained that they did not appear to be doing so.

“If the teachers were looking at the responses, they didn’t care,” one student commented.

The Independent has reached out to Edgenuity for comment.

The use of an algorithm to grade students is also infamous in the United Kingdom, where many A-level students received significantly lower marks than expected last month.

Moreover, research showed that A-level grades at sixth form colleges were around 20 per cent more likely to be downgraded than those at independent schools, disproportionately affecting poorer students.

However, the algorithm also caused “rampant” grade inflation in niche subjects such as Latin and classics.

Many students approached law firms to help navigate the “narrow” appeals process for GCSE and A-levels.

It was later confirmed that a review of Ofqual’s statistical model would be conducted, and it has been suggested that online exams could be taken next year rather than relying on an algorithm.

Rather than analysing keywords as Edgenuity’s system did, Ofqual’s algorithm moderated grades based on the historical performance of each school.

FFT Education Datalab, a non-profit organisation which publishes research on education policy, showed how the number of students who could achieve the highest grades was lowered, while one student would be awarded an “unmarked” grade.

“This seems rather harsh given that the model prediction is for fewer than one pupil (2.30 per cent, when each pupil counts as 3.70 per cent) to achieve this grade”, the non-profit wrote.
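For context on those figures: if each pupil counts as 3.70 per cent, the cohort is roughly 27 pupils (1 ÷ 0.037 ≈ 27), so a prediction of 2.30 per cent works out at well under one pupil. A quick check of that arithmetic, with the cohort size inferred rather than stated here:

```python
# Rough check of the figures quoted above; the cohort size is inferred, not stated.
pupil_share = 0.037                        # each pupil counts as 3.70 per cent
cohort = round(1 / pupil_share)            # roughly 27 pupils
predicted_pupils = 0.023 * cohort          # model prediction of 2.30 per cent
print(cohort, round(predicted_pupils, 2))  # 27 0.62 -> fewer than one pupil
```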

As has been shown in other fields, such as algorithms used for policing or private social credit systems in the UK, these systems can suffer from the same economic and racial biases found in wider society.

“Children from a certain background may find their assessment is downgraded,” Stephen Curran, a teacher and education expert, told Wired.

In Scotland, children from poorer backgrounds were twice as likely to have their results downgraded as those from richer ones.
