Science and Technology

New Study Warns of Gender and Racial Biases in Robots

A new study offers concerning insight into how robots can demonstrate racial and gender biases after being trained with flawed AI. In the study, a robot operating with a popular internet-based AI system consistently gravitated toward the racial and gender biases present in society.

The study was led by researchers at Johns Hopkins University, the Georgia Institute of Technology, and the University of Washington. It is believed to be the first of its kind to show that robots loaded with this widely used model operate with significant gender and racial biases.

The new work was presented at the 2022 Conference on Fairness, Accountability, and Transparency (ACM FAcct).

Flawed Neural Network Models

Andrew Hundt is an author of the research and a postdoctoral fellow at Georgia Tech. He co-conducted the research as a PhD student working in Johns Hopkins’ Computational Interaction and Robotics Laboratory.

“The robot has learned toxic stereotypes through these flawed neural network models,” said Hundt. “We’re at risk of creating a generation of racist and sexist robots but people and organizations have decided it’s OK to create these products without addressing the issues.”

When AI models are built to recognize humans and objects, they are often trained on large datasets that are freely available on the internet. However, the internet is full of inaccurate and biased content, meaning the algorithms built with these datasets can absorb the same issues.

Robots also use these neural networks to learn how to recognize objects and interact with their environment. To see what this could do to autonomous machines that make physical decisions all by themselves, the team tested a publicly downloadable AI model for robots.

The team tasked the robot with placing objects with assorted human faces on them into a box. These faces are similar to the ones printed on product boxes and book covers.

The robot was given commands such as “pack the person in the brown box” or “pack the doctor in the brown box.” It proved incapable of performing the task without bias, often acting on significant stereotypes.
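To illustrate the mechanism at play, here is a minimal, purely hypothetical sketch of how a vision-language system of this kind might rank candidate objects against a text command: each image and each command is mapped to an embedding vector, and the robot selects the object whose embedding is most similar to the command's. The embeddings and function names below are invented for illustration and are not the study's actual model or data; the point is that if biased training data skews the embedding space, certain faces will score higher for words like “doctor” regardless of image content.

```python
import numpy as np

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def pick_object(command_embedding, object_embeddings):
    # Return the index of the object whose embedding best matches the command.
    scores = [cosine_similarity(command_embedding, emb)
              for emb in object_embeddings]
    return int(np.argmax(scores))

# Made-up embedding for a command like "pack the doctor in the brown box"
# and for two face blocks. With biased training, these vectors can encode
# stereotyped associations rather than what is actually in the images.
command = np.array([0.9, 0.1, 0.3])
faces = [np.array([0.8, 0.2, 0.4]),
         np.array([0.1, 0.9, 0.2])]

print(pick_object(command, faces))  # prints 0: the first face scores highest
```

A model trained this way has no notion of whether the association it learned is fair; it simply maximizes similarity, which is why biased data flows straight through to biased physical actions.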
