Thursday 12 May 2022

Google Introduces Monk Skin Tone Scale to Improve AI Color Equity

When it comes to trailblazers in the field of color equity, Google doesn’t grace the top of many lists. But there’s a contingent within the company trying to change that. At its I/O 2022 conference, Google introduced a tool it intends to use to improve color equity through representation. It’s a set of ten color swatches that correspond to human skin tones, running the whole gamut from very light to very dark. And Google open-sourced it on the spot.

Fairness is a major problem in machine learning. It’s already difficult enough to reduce human values to an algorithm. But there are different kinds of fairness — twenty-one or more, according to one researcher. Statistical fairness is not the same as procedural fairness, which is not the same as allocational fairness. What do we do when different definitions of fairness are mutually exclusive? Instead of trying to write one formula to rule them all, Google has taken a different approach: “Start where you are.”
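
That tension is concrete, not just rhetorical. The minimal Python sketch below (toy numbers of my own, not anything from Google) shows a set of predictions that satisfies demographic parity across two groups while badly violating equalized odds, so improving one notion of fairness can mean giving up another.

```python
# Toy illustration (not Google's method): two common fairness definitions
# can disagree about the very same predictions for two demographic groups.

def rates(y_true, y_pred):
    """Return (positive prediction rate, true positive rate, false positive rate)."""
    ppr = sum(y_pred) / len(y_pred)
    pos = [p for t, p in zip(y_true, y_pred) if t == 1]
    neg = [p for t, p in zip(y_true, y_pred) if t == 0]
    tpr = sum(pos) / len(pos) if pos else 0.0
    fpr = sum(neg) / len(neg) if neg else 0.0
    return ppr, tpr, fpr

# Hand-picked labels and predictions -- purely illustrative numbers.
group_a = rates(y_true=[1, 1, 0, 0], y_pred=[1, 1, 0, 0])
group_b = rates(y_true=[1, 0, 0, 0], y_pred=[0, 1, 0, 1])

print("Group A (PPR, TPR, FPR):", group_a)  # (0.5, 1.0, 0.0)
print("Group B (PPR, TPR, FPR):", group_b)  # (0.5, 0.0, ~0.67)

# Demographic parity compares positive prediction rates: 0.5 vs 0.5 -- satisfied.
# Equalized odds compares TPR and FPR across groups: 1.0 vs 0.0 -- badly violated.
```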

Where we are is in a state of desperately unequal digital representation. Google is the largest search purveyor on the planet, by a long shot. Run an incognito search on Google Images for “CEO,” and what you get is a sea of white male faces, two of whom are Elon Musk. Search for “woman,” and it’s absolutely true that the results skew young, slender, white, and able-bodied. But one of the faces the search returned was a deepfake of a pale young woman, generated by NVIDIA’s StyleGAN. I’ve written about this particular deepfake in a previous article, so it surprised me to see her face again. I had to double-check that I was in incognito mode, but I was.

There are nearly eight billion humans on this planet, and most of them are people of color. There’s a kind of poetry in the idea that Google’s search algorithm, rather than showing a brown or Black person, would prefer to return a woman who doesn’t actually exist.

Introducing the Monk Skin Tone Scale

The ten-shade scale was developed by Harvard sociologist and ethicist Dr. Ellis Monk, in collaboration with Google. “In our research, we found that a lot of the time people feel they’re lumped into racial categories, but there’s all this heterogeneity with ethnic and racial categories,” Dr. Monk said in a statement. “And many methods of categorization, including past skin tone scales, don’t pay attention to this diversity. That’s where a lack of representation can happen…we need to fine-tune the way we measure things, so people feel represented.”

The Monk Skin Tone Scale. Image: Google/Dr. Ellis Monk

Google announced that it will use the Monk Skin Tone Scale (MST) to improve racial and color representation in search results, where the scale will make it much easier to surface, for example, information on Black hair colors and textures. And the shades aren’t named: not a single café au lait or chocolate comparison in sight. (Are you listening, Pantone?) The company is also building the scale into Google Photos, where it will become part of “a new set of Real Tone filters that are designed to work well across skin tones and evaluated using the MST Scale.”
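
What does “evaluated using the MST Scale” look like in practice? One common approach, sketched below as an assumption rather than a description of Google’s actual pipeline, is disaggregated evaluation: annotate a test set with MST tones, then report the quality metric per tone bucket and track the gap between the best- and worst-served buckets. The record format and field names here are made up for illustration.

```python
from collections import defaultdict

# Sketch of disaggregated evaluation: report a quality metric per MST tone
# bucket (1-10) instead of a single aggregate number. The record format and
# the "correct" metric are assumptions, not Google's actual pipeline.

def per_tone_report(records):
    """records: iterable of dicts like {"mst_tone": 7, "correct": True}."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        totals[r["mst_tone"]] += 1
        hits[r["mst_tone"]] += int(r["correct"])

    accuracy = {tone: hits[tone] / totals[tone] for tone in sorted(totals)}
    gap = max(accuracy.values()) - min(accuracy.values())
    return accuracy, gap

# Tiny made-up test set: a real evaluation would use human-annotated images.
sample = [
    {"mst_tone": 2, "correct": True},
    {"mst_tone": 2, "correct": True},
    {"mst_tone": 9, "correct": True},
    {"mst_tone": 9, "correct": False},
]
accuracy, gap = per_tone_report(sample)
print(accuracy)                          # {2: 1.0, 9: 0.5}
print(f"best-to-worst gap: {gap:.2f}")   # the number a fairness review would track
```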

In addition to using the MST scale to improve color equity, the search titan outlined plans for a “standardized way to label web content. Creators, brands and publishers will be able to use this new inclusive schema to label their content with attributes like skin tone, hair color and hair texture.”
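
The announcement doesn’t spell out what that schema looks like, so the snippet below is purely hypothetical: a Python sketch of content labels using the three attributes named in the quote, with placeholder keys and values that are not a published vocabulary.

```python
import json

# Hypothetical sketch only: Google's actual schema was not spelled out in the
# announcement. The attribute names mirror the quote ("skin tone, hair color
# and hair texture"); the keys and values below are placeholders, not a real
# published vocabulary.

content_labels = {
    "contentUrl": "https://example.com/tutorials/protective-styles",
    "skinTone": "monk-07",      # placeholder: a tone on the 10-point MST Scale
    "hairColor": "dark-brown",  # placeholder value
    "hairTexture": "coily",     # placeholder value
}

print(json.dumps(content_labels, indent=2))
```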

Google intends to roll out MST/Real Tone features across Android, iOS, and Web services over the next several months.
