Technology used in self-driving cars has a racial bias that makes autonomous vehicles more likely to drive into people with darker skin tones, a new study claims.
Researchers at the Georgia Institute of Technology found that detection systems, such as the sensors and cameras used in self-driving cars, are better at detecting people with lighter skin tones.
That makes them less likely to spot black pedestrians and to stop before crashing into them. Tests on eight image-recognition systems confirmed the bias: on average, their accuracy was five per cent lower for people with darker skin.
To test the hypothesis, the scientists divided a large pool of pedestrian images into two groups, lighter and darker skin, using the Fitzpatrick scale, a standard dermatological classification of skin tone.
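The comparison described above can be sketched as a simple per-group accuracy calculation. This is an illustrative reconstruction, not the researchers' code: the grouping (Fitzpatrick types I–III as lighter, IV–VI as darker) follows the article, but the detection records are invented sample data.

```python
# Hypothetical sketch: each record pairs a pedestrian's Fitzpatrick
# skin type (1-6) with whether the detector spotted them (True/False).
# Types 1-3 are grouped as "lighter", 4-6 as "darker", per the study's
# two-way split. The sample data below is invented for illustration.

def accuracy_by_group(records):
    """Return average detection accuracy for each skin-tone group."""
    groups = {"lighter (I-III)": [], "darker (IV-VI)": []}
    for skin_type, detected in records:
        key = "lighter (I-III)" if skin_type <= 3 else "darker (IV-VI)"
        groups[key].append(detected)
    # Mean detection rate per group (skip empty groups).
    return {k: sum(v) / len(v) for k, v in groups.items() if v}

sample = [(1, True), (2, True), (3, True), (3, False),
          (4, True), (5, False), (6, True), (6, False)]
print(accuracy_by_group(sample))
# With this invented sample: lighter group 0.75, darker group 0.5
```

A real evaluation would run each of the eight detectors over thousands of labelled pedestrian images, but the gap between the two group averages is the quantity the study reports.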
Even when the researchers varied the time of day or partially obstructed the image-detection systems' view, the accuracy gap persisted.
It is not the first time that machine learning and vision systems have been shown to have an in-built bias.
Researchers at the Massachusetts Institute of Technology (MIT) found that Amazon's facial recognition software, Rekognition, had a harder time identifying the gender of female and darker-skinned faces.