#GAReads | Artificial Intelligence Has a Problem With Gender and Racial Bias. Here's How to Solve It
Machines can discriminate in harmful ways.
I experienced this firsthand when I was a graduate student at MIT in 2015 and discovered that some facial analysis software couldn’t detect my dark-skinned face until I put on a white mask. These systems are often trained on images of predominantly light-skinned men. And so I decided to share my experience of the coded gaze, the bias in artificial intelligence that can lead to discriminatory or exclusionary practices.