Please don’t let an algorithm have the final say on Leaving Cert grades

18 May 2020

Image: © arrowsmith2/Stock.adobe.com

Algorithms built on biased data can never be unbiased, and the recent decision to adjust grading for the Leaving Cert based on previous school performance will weigh heavily on DEIS students, writes Elaine Burke.

It’s stressful to be a Leaving Cert student in any year, but 2020 has taken it to a new level. From the closure of schools months before the end of term to the dragged-out will-they-won’t-they cancellation decision from the Department of Education, this year’s students must be run ragged.

Now, at least, we have some certainty. The usual final-year exams for secondary school students in 2020 have been cancelled. Students can accept a grade set by their teachers or opt to sit the exams at an undetermined time in the future, when it is safe to do so. Sorted.

Except, Irish Independent columnist Colette Browne spotted a fatal flaw in this simple plan. In the Government’s guide for calculating the grades for students, the answer to a question on the processing of school data advises that: “The teachers’ estimated marks from each school will be adjusted to bring them into line with the expected distribution for the school.” This “expected distribution” will be derived from statistical analysis of data sets held by the State Examination Commission (SEC).
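To see why this worries people, consider a minimal sketch of what distribution-based adjustment could look like. This is a hypothetical illustration only — the SEC has not published its actual method, and the function, marks and school data below are all invented for the example. One simple way to force teacher estimates “into line” with an expected distribution is to map each estimate, by rank, onto the school’s historical marks:

```python
# Hypothetical sketch of distribution-based standardisation. This is NOT
# the SEC's actual algorithm (which has not been published); it only
# illustrates how rank-based adjustment to a historical curve behaves.

def adjust_to_expected_distribution(teacher_marks, historical_marks):
    """Map each teacher-estimated mark onto the school's historical
    distribution by rank: the highest estimate receives the highest
    historical mark, the second highest the second, and so on."""
    # Student indices, ordered from highest to lowest teacher estimate
    ranked = sorted(range(len(teacher_marks)),
                    key=lambda i: teacher_marks[i], reverse=True)
    # The top N marks from the school's historical distribution
    expected = sorted(historical_marks, reverse=True)[:len(teacher_marks)]
    adjusted = [0] * len(teacher_marks)
    for position, student in enumerate(ranked):
        adjusted[student] = expected[position]
    return adjusted

# A teacher submits a top mark of 90 for a standout student, but this
# school's historical marks top out at 75 -- so the estimate is pulled
# down to fit the "expected" curve, however legitimate it was.
teacher_marks = [90, 60, 55]
historical_marks = [75, 62, 58, 50]
print(adjust_to_expected_distribution(teacher_marks, historical_marks))
# [75, 62, 58]
```

Under a scheme like this, a genuinely exceptional student at a historically low-scoring school cannot “survive the algorithm”: no matter how high the teacher’s estimate, the output is capped by what the school has done before.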

I don’t have to tell you that, statistically, schools from disadvantaged areas – known in Ireland as DEIS schools – don’t perform as well in such historical data sets. As with all inequalities, those with the money to invest do better than those without.

Virgin Media News’ political correspondent Gavan Reilly put it to Minister for Education Joe McHugh, TD, that this approach would discriminate against DEIS schools. “He seemed satisfied that if a teacher submitted a high-and-legit grade for a pupil, it would still survive the algorithm and emerge as a high grade at the other end,” Reilly tweeted.

And that’s when my stomach hit the floor.

The Department of Education is willing to put the fate of this year’s Leaving Cert students in the hands of an algorithm that is somehow, miraculously, meant to be unbiased. That’s a lot of faith being put into a concept as yet unproven in science and technology.

Algorithms, by their very nature of sticking to what is known and analysing and making decisions based on what has gone before, are as inherently biased as the society that makes them. There is a vast body of research calling into question whether an algorithm can be unbiased, and most respected experts will attest to the need for human intervention in machine-led decision-making.

“If machines are to be trusted, we need to watch over them,” warned Will Douglas Heaven, MIT Technology Review’s senior editor covering AI. In his recent piece exploring how our present unprecedented behaviours are turning automation on its head, he discussed how human intervention is required in even the most mundane algorithms. Because even though its application potential can sound like something straight out of sci-fi, it must be remembered that machine learning is still a blunt instrument.

Edge cases are not the strength of a system that’s trained to look for common denominators and act according to precedent. But human life is far more complex than binary data. If you could represent all of human life in ones and zeroes, what a bland mess it would be. You’d lose all the edges, all the features, all the differences that make that life what it is.

‘Edge cases are not the strength of a system that’s trained to look for common denominators and act according to precedent’

Minister McHugh says that adjusting grades to fall in line with schools’ past performance is an effort to standardise the grading that will be done by teachers across the country. He also stands firm in his belief that this untested system will be able to account for changes not represented in the historic data, such as the recent introduction of a higher-level class that was previously unavailable.

However, Labour education spokesperson Aodhán Ó Ríordáin, TD, has challenged the Department of Education’s confidence in the machine and his party is urging the Irish Human Rights and Equality Commission to investigate this use of “school profiling”.

When I was in fourth year in my DEIS secondary school in Coolock, the whole school was buzzing with the news that one of the previous year’s Leaving Cert students had done the seemingly impossible: she had been awarded 600 points.

For anyone reading from outside of Ireland and wondering about this figure’s significance, this is the maximum number of points you can achieve in the Leaving Certificate examinations. A perfect score. It was a figure that until that moment I never thought was possible, and I’m sure many of my peers felt the same. It spurred me on, knowing that I could and should aim as high as the top. If one girl from Mercy College Coolock could do it, why not more of us?

For our teachers, encouraging us to even aim for a third-level education was a goal in and of itself. To have a model of inspiration showing us that we could not only make it to college but saunter into higher education with our pick of the courses on offer was an incredible jolt.

I didn’t score 600 points in my Leaving, but I often think of that girl who did. Every time someone writes off DEIS schools or tries to judge a student’s further education prospects based on where they come from, I have the exceptional example. I know the anomaly. I met the edge case. I saw something unprecedented that historical statistics could not. I understand what a computer could not.


Elaine Burke is the host of For Tech’s Sake, a co-production from Silicon Republic and The HeadStuff Podcast Network. She was previously the editor of Silicon Republic.

editorial@siliconrepublic.com