
Self-driving cars more likely to hit black people than whites – study sparks racism fears

March 07, 2019 at 08:16 am | Tech & Innovation

Mildred Europa Taylor | Staff Writer

Automated cars on the road. Pic credit: Phys.org

Experts say people are a few years away from the reality of climbing into a self-driving car of their own and getting to their destination without touching a steering wheel.

But already there have been concerns about how safe these automated cars are and how they might worsen traffic, among other issues.

The latest worrying development about such vehicles is that they are more likely to drive into black people than whites, a new study claims.

According to researchers at the Georgia Institute of Technology, state-of-the-art detection systems, including the sensors and cameras used in self-driving cars, are better at detecting people with lighter skin tones.

This makes automated cars less likely to recognize people with dark skin and to stop before hitting them, the study titled “Predictive Inequity in Object Detection” noted.

Vox reports that the researchers began with a simple question: How accurately do state-of-the-art object-detection models, like those used by self-driving cars, detect people from different demographic groups?

To find out, the authors of the study looked at a large dataset of images that contain pedestrians. They divided up the people using the Fitzpatrick scale, a scientific system for classifying human skin tones from light to dark.

The researchers then analysed how often the models correctly detected the presence of people in the light-skinned group versus how often they got it right with people in the dark-skinned group.

Detection was five percentage points less accurate, on average, for the dark-skinned group, according to the results. Even when researchers controlled for variables such as the time of day or partial obstruction of the image-detection system's view, the average accuracy gap remained the same.
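To illustrate the kind of comparison the study describes, the sketch below computes per-group detection rates and their gap. The data here is hypothetical, made up purely for illustration, and the grouping simply mirrors the light/dark split the researchers derived from the Fitzpatrick scale; it is not the study's actual dataset or code.

```python
# Hypothetical sketch of a per-group detection-accuracy comparison.
# Each entry records whether the model detected a pedestrian (True/False).

def detection_rate(results):
    """Fraction of pedestrians the model correctly detected."""
    return sum(1 for r in results if r) / len(results)

# Made-up outcomes chosen to mirror the reported five-point gap
light_skinned = [True] * 93 + [False] * 7   # 93% detected
dark_skinned = [True] * 88 + [False] * 12   # 88% detected

gap = detection_rate(light_skinned) - detection_rate(dark_skinned)
print(f"Accuracy gap: {gap * 100:.1f} percentage points")  # 5.0
```

A real evaluation would repeat this comparison under controlled conditions (time of day, occlusion) before attributing the gap to skin tone, as the study's authors did.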

“We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sort of recognition models,” the authors of the study said.

As it stands, the study has not yet been peer-reviewed, meaning its findings have not been formally validated by the scientific community.

Nonetheless, many people in the AI (artificial intelligence) community are pushing for the results to be taken into consideration, given that most automotive companies do not publicly release their studies of these models, Blavity reports.

Nearly 10 million cars with self-driving features are expected to be on the road by 2020, according to BI Intelligence, a leading market research firm.

A few years after 2020, the world will see fully autonomous vehicles that can drive on roads and handle different scenarios with little or no interaction from the driver, the firm predicted.

Self-driving cars can sense their environment and navigate around obstacles, says AARP Life Insurance Programme, adding that these cars “obey traffic laws and reach a preselected destination – even rerouting due to traffic, accidents or construction – by way of built-in cameras, radar, sonar, GPS and infrared sensors.”

Experts are hoping that the concerns raised about automated cars will not be overlooked as people wait for such cars to become their everyday rides.

This is not the first time bias has been spotted in machine learning and computer vision systems, according to the Independent.

In January, researchers at the Massachusetts Institute of Technology (MIT) found that Amazon's facial recognition software, Rekognition, had a tougher time identifying a person's gender if they were female or darker-skinned.
