Twitter appears more likely to spot white people in photos than black people, tests by users showed this week.

Media wrote about "racist algorithms", but can we call it that?

And how does discrimination arise in computer systems?

It all started last week with a conversation between the white student Colin Madland and a black faculty member via the video calling service Zoom.

On Zoom it is possible to set a virtual background, so that others cannot see which room you are actually in.

But the software failed to detect the black man's face, causing his head to be blended into the background.

As a result, only a shirt with no head above it was visible.

The student posted a screenshot on Twitter to raise the issue, and in doing so encountered another error.

Twitter shows cropped previews of images that can be expanded, but in this case the mobile app's preview cropped to show only the white man.

The tweet gained attention and not long after, many other users were experimenting with pictures of white and black people.

In many cases, the preview turned out to crop to the white person.

"any guesses?" - Colin Madland (@colinmadland), September 19, 2020

Everything starts with the training data

Media spoke of "racist algorithms", but we have to be careful with that label, says neuroinformatician Sennay Ghebreab of the University of Amsterdam.

"The effects can be discriminatory. The problem with calling algorithms racist is that it shifts the problem from society to the algorithm."

Algorithms partly determine what people see on the internet, based on a mathematical formula.

Often they learn to make decisions on their own, based on the information they are fed.

"Algorithms do not come out of the blue," says Marlies van Eck, researcher in technology and (tax) law at Radboud University.

"In the case of facial recognition, images of faces are put in a pool of training data. If those are mainly faces of white men, the system only knows those examples."
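The effect Van Eck describes can be sketched with a toy simulation. This is purely illustrative: the two groups, the 2-D "face features", and the nearest-template detector are all invented for the example and have nothing to do with Twitter's or Zoom's actual systems. The point is only that a detector tuned on an imbalanced training pool works well for the overrepresented group and poorly for the underrepresented one.

```python
import random

random.seed(42)

def sample(center, n, spread=1.0):
    # Synthetic "face features" for one hypothetical demographic group,
    # drawn around a group-specific center.
    return [(random.gauss(center[0], spread), random.gauss(center[1], spread))
            for _ in range(n)]

# Imbalanced training pool: 95 faces from group A, only 5 from group B.
group_a_center, group_b_center = (0.0, 0.0), (4.0, 4.0)
train = sample(group_a_center, 95) + sample(group_b_center, 5)

# "Training": the detector's template is simply the mean of the pool,
# so it ends up sitting almost on top of group A's distribution.
template = (sum(x for x, _ in train) / len(train),
            sum(y for _, y in train) / len(train))

def detects(point, threshold=3.0):
    # A face is "detected" if its features lie close enough to the template.
    dx, dy = point[0] - template[0], point[1] - template[1]
    return (dx * dx + dy * dy) ** 0.5 <= threshold

# Evaluate on fresh, balanced test data.
test_a = sample(group_a_center, 200)
test_b = sample(group_b_center, 200)
rate_a = sum(map(detects, test_a)) / len(test_a)
rate_b = sum(map(detects, test_b)) / len(test_b)
print(f"detection rate, group A: {rate_a:.2f}")
print(f"detection rate, group B: {rate_b:.2f}")
```

With a 95-to-5 split in the training pool, the detection rate for group A comes out far higher than for group B, even though nothing in the detector mentions the groups explicitly: the bias lives entirely in the data.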

'Diversity in developer teams is very important'

Both Ghebreab and Van Eck say it starts with teams working on these types of algorithms.

"They must consist of people from different backgrounds," says Ghebreab.

"If you look at the diversity in those teams at large tech companies, it is completely out of proportion to reality."

It is one of the priorities in the search for a solution.

"You can never keep everything out of the systems, but you can try to minimize it," says Ghebreab.

Van Eck agrees, adding that it is never possible to predict exactly what will happen with new innovations.

"You can make a hammer that meets all the requirements, and then see someone else use it in a completely different way."

Van Eck emphasizes the importance of diversity with an example from the real world: "There are staircases with transparent steps. That is very annoying for a woman in a skirt. Then I think: yes, a man was at work here again."

It often goes wrong with algorithms

In practice, algorithms fed (unconsciously or otherwise) with biased data can cause problems.

There was, for example, the now discontinued SyRI system, with which governments and organizations could detect fraud by collecting a lot of data from citizens and applying algorithms to it.

Based on this, a risk analysis was made that could indicate which people might have committed fraud.

A judge ruled that the system was in violation of human rights.

And The Washington Post wrote last year about an algorithm used in computers in American hospitals.

One study revealed that the program caused white patients to receive specialist care more often than patients with a different skin color.

“We live in an increasingly data-driven world,” says Ghebreab.

"Wherever algorithms are used, there is a chance that discrimination will be magnified. It can show up in Zoom, but also with insurers who have to make decisions for citizens. Or with models used in hospitals. There are many examples."

'Let yourself be taught'

Ghebreab says some catching up is needed.

"Many of the algorithms in use today have foundations that were laid 15 years ago. They have a history. It takes time to get the errors out."

According to him, there is not only a role for the makers of these systems.

"I would say to citizens: let yourself be educated about algorithms. Only when you have a basic idea of how they work do you know how your data is used and how it affects you. This development also starts with people. People make algorithms, apply them, and can improve them. It is not the computer's turn, but society's."