Facial recognition technology is known to struggle to recognize black faces. The underlying reason for this shortcoming runs deeper than you might expect, according to researchers at MIT.
Speaking during a panel discussion on artificial intelligence at the World Economic Forum Annual Meeting this week, MIT Media Lab director Joichi Ito said it likely stems from the fact that most engineers are white.
"The way you get into computers is because your friends are into computers, which is generally white men. So, when you look at the demographic across Silicon Valley you see a lot of white men,” Ito said.
Ito relayed an anecdote about how a graduate researcher in his lab had found that commonly used libraries for facial recognition have trouble reading dark faces.
“These libraries are used in many of the products that you have, and if you’re an African-American person you get in front of it, it won’t recognize your face,” he said.
Libraries are collections of pre-written code developers can share and reuse to save time instead of writing everything from scratch.
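To make “library” concrete, here is a minimal sketch, not taken from the article or from Buolamwini’s work, of how a product might reuse a shared face-detection library rather than writing detection code from scratch. It assumes Python with OpenCV installed and a hypothetical image file named photo.jpg; the pre-trained Haar-cascade detector it loads ships with OpenCV.

```python
# Minimal sketch: reusing a shared face-detection library (OpenCV) rather than
# writing detection code from scratch. "photo.jpg" is a hypothetical input file.
import cv2

# Load a face detector that ships pre-trained with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

# Read the image and convert it to grayscale, which the detector expects.
image = cv2.imread("photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# detectMultiScale returns a bounding box for each face the model finds;
# a face the underlying model handles poorly may simply yield no box at all.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Detected {len(faces)} face(s)")
```

The failure mode Ito describes is silent in this style of API: when the pre-trained model does not characterize a face well, the call simply returns an empty result rather than an error.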
Joy Buolamwini, the graduate researcher on the project, told Recode in an email that software she used did not consistently detect her face, and that more analysis is needed to make broader claims about facial recognition technology.
“Given the wide range of skin-tone and facial features that can be considered African-American, more precise terminology and analysis is needed to determine the performance of existing facial detection systems,” she said.
“One of the risks that we have of the lack of diversity in engineers is that it’s not intuitive which questions you should be asking,” Ito said. “And even if you have design guidelines, some of this stuff is kind of feel decision.”
“Calls for tech inclusion often miss the bias that is embedded in written code,” Buolamwini wrote in a May post on Medium.
Reused code, while convenient, is only as good as the training data it learns from, she said. In the case of facial recognition, that means the faces included in the training data.
“A lack of diversity in the training set leads to an inability to easily characterize faces that do not fit the normal face derived from the training set,” wrote Buolamwini.
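As a purely hypothetical illustration, not Buolamwini’s experiment and not real demographic data, the toy sketch below trains a classifier on samples drawn overwhelmingly from one invented group and then scores each group separately; every name and number in it is made up.

```python
# Toy, entirely synthetic illustration of a skewed training set.
# "Group A" and "group B" are invented; no real demographic data is involved.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, center):
    """Generate n toy 2-D samples for one group, with a group-specific label rule."""
    X = rng.normal(loc=center, scale=1.0, size=(n, 2))
    y = (X[:, 0] + X[:, 1] > 2 * center).astype(int)
    return X, y

# Training data: 950 samples from group A, only 50 from group B.
X_a, y_a = make_group(950, center=0.0)
X_b, y_b = make_group(50, center=3.0)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([X_a, X_b]), np.concatenate([y_a, y_b])
)

# Score fresh samples from each group separately instead of in aggregate.
for name, (X_test, y_test) in {
    "group A": make_group(500, center=0.0),
    "group B": make_group(500, center=3.0),
}.items():
    print(name, "accuracy:", round(model.score(X_test, y_test), 3))
```

Scoring the groups separately is the point: an aggregate accuracy over a skewed test set can look fine while masking exactly the per-group gap Buolamwini describes.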
To cope with limitations in one project involving facial recognition technology, she wrote, she had to wear a white mask so that her face could “be detected in a variety of lighting conditions.”
“While this is a temporary solution, we can do better than asking people to change themselves to fit our code. Our task is to create code that can work for people of all types.”
This article originally appeared on Recode.net.