Testing face recognition on people of color isn't a bad thing unless you do it the wrong way

It's no secret that any software that takes more than a cursory "look" at people can have issues when the person in front of it has darker skin. Microsoft had issues with the Kinect, and even a simple heart-rate sensor on your smartwatch can struggle here. Google is trying to avoid these kinds of issues by training its face unlock system with data from people of color, in the hope that more data fed into the system can help overcome the problem.

You can't use any sort of face recognition software unless it works for everyone.

I think most of us can agree that this is a good thing. We're all people, and though skin color is simply a cosmetic difference, it is a valid concern in some cases. Working to make it less of one is a great idea. But it's always going to be something that's almost uncomfortable to talk about, because it focuses on the idea that skin color implies a difference beyond the visual. It's human nature to try to be inoffensive and politely shy away from anything outside our comfort zone.

More: I'm big and black and heart rate monitors are terrible for people like me

The issue isn't really anything we should feel uncomfortable about; that's just how we're wired. But taking on something that's already controversial and bungling it beyond belief is something Google should know better than to do, and having third-party contractors deceive people and target vulnerable folks like the homeless to gather this data is downright stupid.


In case you're not aware, that's exactly what Google is accused of doing, according to the New York Daily News. Interviews with current and former employees claim they were directed to target the homeless because they would be less likely to talk to the media, to tell subjects they were simply playing a "selfie game," and to point out that the $5 gift card they would receive could be exchanged for cash in certain states.

As mentioned, gathering this data is for a good cause. We saw Apple do the same thing prior to the launch of Face ID and for the same reason — you only fix the issue of face recognition working so poorly with darker skin by collecting more data. Google needs to do this for the launch of the Pixel 4. It's not what Google was doing here or why it was doing it — it's how.

The problem isn't what Google is doing, it's how.

Using trickery to exploit anyone is never a good look. When you're targeting a group of people for a specific reason, you need to be up-front with them and let them know why you need their help. I can't speak for anyone besides myself, but I don't think someone with darker skin would think what Google is trying to do is a bad thing. The trope that "I have friends who are black" is tired and old, but I do, and none of them think collecting more data to make face recognition better for people of color is something Google needs to hide. This small sample isn't conclusive of anything in itself, and once again we have to realize that the whole subject of skin color can be touchy, but there is only one way to go about this sort of testing: honestly.

This whole mess makes a bad situation worse, and now it seems like Google may feel that most homeless people aren't white, or that people of color are easier to fool, and that makes me more than a little upset. I'm sure I'm not alone here, and there are plenty of people who may think twice the next time Google wants an opinion or some personal data.

Google, be better.

Jerry Hildenbrand
Senior Editor — Google Ecosystem

Jerry is an amateur woodworker and struggling shade tree mechanic. There's nothing he can't take apart, but many things he can't reassemble. You'll find him writing and speaking his loud opinion on Android Central and occasionally on Twitter.

19 Comments
  • This isn't a mess, there was nothing to make worse, and this website's constant race-baiting over it is annoying. I also have black friends, and also family, as I am black. This kind of White Knight ******** over this is annoying. Leave color out of it, and you have no article and the NYT has no article, as this is not a controversy. You and this website want it to be, for clicks. Hey white dude, stop telling us black people when we are being treated unfairly, WE DON'T NEED YOU TO TELL US. Whitesplaining is the worst, most annoying thing to come out of this SJW trash.
  • Dude, you are not black, I've read other posts from you. Plus, no black person in their right mind would use your user name... Just stop. While I agree, as an actual black person, that this is not that big a deal, the article makes some great points and I'm glad there are people like Jerry out here who are cognizant and clear-thinking on issues like these.
  • I have been black for 45 years, I don't need to explain it to you. Let's see: my handle is "slave." My grandfather was grandson to a slave, and some would say that black people are still slaves. NFL players live under a slave mentality. I am a slave to my debt, as my student loans determine everything I do. It's a metaphor, you fool. What's not a metaphor is the stupidity of this article and the attached outrage. Stop using black people as a tool to express your outrage. We don't need the White Knights riding up to save us.
  • Any "outrage" is about Google doing this so stupidly and trying to deceive people. Face scanning needs more data from people with dark skin. So scan people with dark skin and fix the issue, not trick homeless people or lie to people about what you're doing. I knew that there would be people gining me gruff for writing this but I just don't care. It needs said — stop doing stupid things, Google.
  • Assuming that Google actually did what they are accused of doing. We really don't have a lot of information at the moment. And even accepting the reporting as accurate, it's also not clear if Google realized what was happening. That's not great, but not the same as actively agreeing to fool people.
  • I'm old-fashioned and liked the FPS where it was. Plus, I was very excited about the base model Pixel 4, until I read about the 90Hz display with a 2,800mAh battery. Huh? This will be an $800+ phone, and I can't understand why Google would do this thing!
  • Jerry, sorry. Google, you, and all the people writing about this, and many other things, need to learn something. There is no "people of color" or "colored people," there are only people. Is there a colorless or transparent person? No. Is a specific skin tone the standard for the human race? No. We should talk about how companies like Apple and Google keep segmenting the human race by releasing "color correct" features. Hey, smart people, release these when they work for EVERYONE! Let's be clear: I'm not calling any of you (bloggers, tech gurus, or even the companies, which are a reflection of the people in them) racists. I'm just saying that we should get over the color blindness and be fair to ALL HUMAN BEINGS.
  • I'm just going to jump in here before anyone comes in and says you're racist for not recognizing others' skin tones. This whole idea that people with different skin tones need to be treated differently is ridiculous, and is itself racist. So thank you for your comment. Common sense has become far too scarce as of late.
  • So, everyone is the same color? Computers and cameras see everyone as pink? Or green? No, human skin tone ranges quite widely, and as such computers need to be trained on a wide variety of skin colors in order to do any processing based on appearance. It's not even just skin color, as some bone structure features, especially in the face, are linked to ethnicity. The problem comes when we as a society have layered ourselves. There are many more European descendants in tech than African descendants, and Asian descendants are rare in the decision-making areas of tech. This means that in order to train their software, tech companies need to go out of their way to train with non-white people. Edit: I do want to point out that I think we as a society need to work at increasing access to tech-sector jobs (among other sectors) for all races and ethnicities, but while we do that, it's not unreasonable to find ways to ensure all people are represented in the software algorithms.
  • Yes, we should. But the only way to do that, IN THIS CONTEXT, is to recognize that some skin tones and facial features seem to trip facial recognition at a rate that is unacceptably high. Pretending that this is not happening is the exact opposite of being fair to all people. On the other hand, being fair to all people requires that people be treated equitably and honestly, with no reference to skin tone, no matter what your noble goals are. And that's the problem here: Google is accused of trying to do something by being dishonest, and it disproportionately affected people with a particular range of skin tone *because of their skin tone.* Trying to pretend that "color doesn't exist" is, at best, stupid. Because while it SHOULD NOT matter, the reality is that it does. Not because there is actually something inherent to those colors, but that doesn't change the fact that it matters.
  • You guys are funny. Jerry, well said. I think the formula Google used overall sucked. A $5 gift card for my face data? Yikes, that's low. So they target a group that would be more willing and less likely to spread the word... awkward, but okay. But then misleading them is wrong no matter the color. And of course, the fact that they are people of color (yes, in this scenario that is important to note) makes it look even worse. But you won't see black people up in arms over it as a racial thing. At least I'm not. But the bottom line, like Jerry said: Google, be better.
  • My question is this: why wasn't this part of the equation from the inception of this technology? Someone earlier made the exact same point: there is no such thing as a non-colored person, except Larry Bird. "Larry is not white, Larry is clear." The fact that this oversight happened and only dawned on folks now is where the damage is.
  • Probably because of the demographics of the people designing the product and doing the testing. The same reason psychological studies are a reliable indicator of what college students think, but not necessarily anyone else. The same reason that it takes decades and big data to realize that some groups don't respond to medications the way young white men do, or don't present the same symptoms for heart attacks. Our so-called representative samples represent the people around the developers, because that is where the guinea pig population comes from first.
  • The only real issue here is that the temps didn't tell, or were told not to tell, the testers that they would be keeping scans of their faces. Otherwise this is pretty much product testing 101. There's no issue of race here at all. They were targeting various skin tones to make the product better.
  • Well, you see, 10+ years ago race wouldn't have been brought up, but people have to grasp at straws when they can use the race card.
  • 10+ years ago race wouldn't have been brought up because no one would have tried to single out a race to build a better algorithm. The designers would have just left it as it was and hoped for the best.
  • I don't think race has anything to do with this. Google was just not honest and transparent about the testing. We should not make everything about race. The overwhelming majority of people are not racist and make an effort to view people as individuals.
  • I'm not surprised at the "brilliance" Google has displayed here. In the 1990s there was a medical study concerning cancers of the breast and uterus. All the people used in the study were MEN! 🧐🤔
  • So, you don't want companies to target specific demographics when they need to (women for breast and uterus research, minorities for algorithms)?