● The algorithm's prejudice: "low scores in the past, low scores this year"

When A-level results, the UK's high-school achievement qualification, were released on August 13th, some 300,000 university applicants from England, Wales and Northern Ireland began to protest. The grades calculated and issued by a computer algorithm, loosely referred to as artificial intelligence (AI), came out far lower than the grades the schools themselves had assessed.

As one of the countries hit hardest by the global COVID-19 pandemic, the UK was unable to hold the A-level exams this summer and instead used a computer algorithm to calculate grades. The Direct Centre Performance model chosen by the UK education authorities was designed to derive grades from a range of data.

Students and experts charged that the algorithm systematically marked down students from relatively disadvantaged backgrounds, and called for a thorough investigation into how it calculated grades.

Take 18-year-old Philip, a university applicant: teachers at his west London school predicted two As and a B, enough to secure his place at the University of Exeter, but the algorithm awarded him one B and two Cs. He lost his place at Exeter.

Nearly 40 percent of students in England, like Philip, received grades lower than their teachers had predicted. Students at government-funded state schools in particular received far lower grades than those at private schools, and many ended up unable to get into the universities they wanted.

The backlash was fierce, with teenage students protesting in front of the Department for Education and chanting, "F*** the algorithm!" As the outcry grew, Education Secretary Gavin Williamson announced that the grades predicted by teachers, not those calculated by the algorithm, would be accepted.

The algorithm used by the UK education authorities was adopted in the name of fairness: it adjusted the distribution of top and bottom grades so that the 2020 cohort would receive grades similar to students in previous years, and it factored in the scores and rankings teachers predicted for their students. The problem was that it was also built to reflect each school's past exam results, which tilted grades in favour of students from affluent backgrounds.

Private schools in the UK, where parents pay tuition, usually have small cohorts whose grade distributions are hard to standardise with a statistical model. For those cases the algorithm gave more weight to teacher-predicted grades, which benefited private-school students, who are disproportionately wealthy and white.
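The mechanics can be illustrated with a simplified sketch. The code below is not Ofqual's actual Direct Centre Performance model; it is a toy illustration, assuming a hypothetical school with a historical grade distribution, a teacher-supplied ranking of this year's students, and a fallback to teacher-predicted grades for very small cohorts.

```python
# Toy illustration only: NOT the actual Direct Centre Performance model.
# Assumes a hypothetical school with (a) last year's grade distribution and
# (b) a teacher ranking and teacher-predicted grades for this year's cohort.

def standardise(historical_distribution, ranked_students, predicted_grades,
                small_cohort_threshold=5):
    """Assign grades by forcing this year's cohort into last year's distribution.

    historical_distribution: dict mapping grade -> share of last year's cohort,
                             listed from best grade to worst, e.g. {"A": 0.1, ...}
    ranked_students:         list of student names, best first (teacher ranking)
    predicted_grades:        dict mapping student -> teacher-predicted grade
    """
    n = len(ranked_students)

    # For very small cohorts the statistics are unreliable, so fall back to
    # the teacher-predicted grades (the exception that favoured small cohorts
    # at private schools in the UK case).
    if n <= small_cohort_threshold:
        return {s: predicted_grades[s] for s in ranked_students}

    # Otherwise ignore individual predictions: hand out grades so the
    # proportions match the school's historical distribution, walking down
    # the teacher's ranking from best to worst.
    results = {}
    index = 0
    for grade, share in historical_distribution.items():
        quota = round(share * n)
        for student in ranked_students[index:index + quota]:
            results[student] = grade
        index += quota
    # Any students left over by rounding get the lowest grade listed.
    lowest = list(historical_distribution)[-1]
    for student in ranked_students[index:]:
        results[student] = lowest
    return results


# Example: at a school with weak past results, even students the teachers
# predicted straight As are capped by the historical distribution.
history = {"A": 0.1, "B": 0.3, "C": 0.6}            # last year's grade shares
ranking = ["Priya", "Tom", "Ada", "Ben", "Kim",
           "Lee", "Mo", "Sam", "Ana", "Joe"]        # teacher ranking, best first
predicted = {s: "A" for s in ranking}               # teachers predicted A for all
print(standardise(history, ranking, predicted))     # only Priya keeps an A
```

In this toy example only one of ten students can receive an A, however strong the cohort, which is the "low scores in the past, low scores this year" effect students were protesting against.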

"The problem is that there can be many different algorithms to solve the fairness problem," said Helena Webb, chief researcher at Oxford University's Department of Computer Science. It might be said that the method is fair at a national level, but it can produce a completely different result than a fair score for an individual. It doesn't reflect that, so state school students get a lower grade than renowned private school students who have earned higher grades in the past.”

Eighteen-year-old Josh Wicks, from a school in Chippenham in the west of England, had been predicted two A*s and an A in his school's own assessment but was awarded three As by the algorithm. "What makes me angry is its attitude towards state schools," he said. "The algorithm assumes that because the school didn't get high marks last year, students can't get good marks this year."


● Algorithms cling to the past… deepening inequality

Algorithms are now at work across most of society, from social media feeds and visa applications to facial recognition and test scoring. New technology can look like a way out for cash-strapped governments and for companies chasing innovation. But experts have long warned about algorithmic bias, and as automation spreads, so does the concern.

"The A-level test problem is just the tip of the iceberg," said Cory Crider, co-founder of Foxglove, which deals with the abuse of digital technology. The algorithm reflects the biases found in the raw data used. I said. Cryder pointed out that one should be wary of blaming technology alone.

"Whoever says it's a technology problem is lying. What's wrong with the UK test is that they have made political choices to minimize credit inflation. It's not a technology problem, it's a political choice."

Foxglove and the Joint Council for the Welfare of Immigrants recently challenged the algorithm the UK Home Office uses to process visas, arguing that it is biased against applications from certain countries and pushes them towards refusal. Foxglove contends that the visa screening system pours past bias and discrimination into a computer program, reinforcing both.

"We're looking at how the visa processing system works, and we are redesigning the process to make it safer and more refined," the British Home Office said. "However, the immigrant welfare council's claim is unacceptable." . Cryder says that the fact that historical data lead to algorithmic bias can be clearly confirmed in other fields, as is the US police's predictive crackdown system.

In June, the city of Santa Cruz, California, banned the use of such systems, saying the software used by police officers could discriminate against people of color. "We have technology in our society that can target people of color. This is technology we don't need," said Mayor Justin Cummings. "The problem is the data being fed in," Crider says. "Old data goes into the algorithm, and the biases that exist in that data are reflected in the algorithm."

Helena Webb of Oxford's Department of Computer Science points to the same problem.

"What's in question is the historical data the algorithms learn from. A number of facial recognition systems have appeared, and the problem is that they have largely been trained on the faces of white men. So the systems deployed in the field recognise white men's faces well but perform poorly on women's faces, and especially on the faces of women of color. There is a problem with the data fed into the algorithms and with how it is fed in."

Webb says fixing these problems requires diversifying the data sets fed into algorithms and reflecting the views of a wider range of people in their design.
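A minimal sketch of the kind of audit Webb describes is shown below: before training, check how the examples in a dataset are distributed across demographic groups. The records, group labels and threshold are hypothetical placeholders, not data from any real system.

```python
from collections import Counter

# Hypothetical face-dataset records: (image_id, demographic_group).
# A real audit would use far richer metadata; this only checks representation.
records = [
    ("img001", "white man"), ("img002", "white man"), ("img003", "white man"),
    ("img004", "white man"), ("img005", "white woman"),
    ("img006", "woman of color"),
]

counts = Counter(group for _, group in records)
total = sum(counts.values())

print("Training-set composition:")
for group, count in counts.most_common():
    share = count / total
    # Arbitrary illustrative threshold for flagging under-representation.
    flag = "  <-- under-represented" if share < 0.25 else ""
    print(f"  {group:16s} {count:3d} ({share:.0%}){flag}")
```

Run on this toy sample, the audit flags "white woman" and "woman of color" as under-represented, which is exactly the imbalance Webb argues gets baked into deployed systems.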


● Contactless technology spreads… can we ever be free of AI bias?

Experts hope the controversy over the UK's grading algorithm will lead to stronger oversight of the technology, pointing out that institutional supervision of new AI (artificial intelligence) systems is still thin.

"Some technologies risk repeating the prejudices that exist in our society," said Instagram chief Adam Mosseri. "Alongside our efforts to keep bias out of our products and services, we need to look closely at where it exists and remove it." Facebook, Instagram's parent company, recently created a team charged with checking its systems for bias.

Automated systems now shape our lives in countless ways. "Is there any area of everyday life where algorithms aren't used?" Crider asked, adding, "I hope a movement against the misuse of algorithms will emerge."

US broadcaster CNN reported that algorithms built on past data intensify inequality by reinforcing the biases embedded in that data, and that the uproar over the UK's algorithmic exam grading illustrates exactly those side effects.

The controversy over "algorithm grades" triggered by COVID-19 ended with the British government allowing students to use either the algorithm-calculated score or the school-assessed score for university admissions. But as COVID-19 pushes more of life online, automated systems built on computer algorithms are spreading ever deeper into daily routines. We may not even know whether an AI warped by prejudice is already evaluating us and shaping our daily lives.

(Photo = Getty Images Korea)