Google's new dermatology app misses the mark for BIPOC people
The technology was trained on 64,387 images, only a tiny portion of which featured dark-skinned people
Daniel Sone, National Institutes of Health
Earlier this year, Google released an artificial intelligence-based dermatology app called Derm Assist. Based on pictures alone, it can recognize 288 different skin conditions — with one big caveat: The app, developed out of a 2020 Google study published in Nature Medicine, was built on a 64,387-image database heavily biased toward images of white and light brown skin.
In fact, only 3.5 percent of the images used to train the algorithm showed people with Fitzpatrick skin types V and VI, which include brown, dark brown, and black skin. Most of the images in the database belonged to people with fair skin, darker white skin, or light brown skin, according to the study.
So if you have dark or black skin, the app is likely to over- or underdiagnose your skin conditions. This isn't surprising given Google's recent history. The company made news earlier this year by unceremoniously firing artificial intelligence ethics researchers Timnit Gebru and Margaret Mitchell. It's clear that Google doesn't want to put real effort into developing ethical apps and artificial intelligence. The company's workforce is only 3.7 percent Black, and just 1.6 percent of its employees are Black women, according to a 2020 diversity report.
As much as we like to think that medicine and health are objective, they aren't. For example, Black and Latinx people were several times more likely to die of COVID-19 than white people, and Black people hospitalized in an intensive care unit for heart failure are less likely than white people to be treated by a cardiologist. Google continues to perpetuate these inequalities through its research applications.