Google under attack for ‘illegally’ targeting homeless people of color in facial recognition data collection

Mohammed Awal October 14, 2019
The facial recognition device is pictured here on a Randstad employee's desk. Pic credit: Daily News

Tech giant Google is facing serious scrutiny following reports that it funded a facial recognition project that targeted people of color using dubious tactics.

The company embarked on the massive facial recognition project to build a diverse face database for its upcoming Pixel 4 smartphone, aiming to avoid accusations of racial bias.

Google’s facial recognition technology has in the past had a tougher time identifying people with darker skin. To avoid a recurrence, the company hired temps to collect face scans from a variety of people on the street, offering $5 gift cards as an incentive, reports the New York Daily News.

In a detailed report, The News cited several people who worked on the project as saying that Google’s desperation for the data led to “questionable and misleading methods.”

The workers are known as Google TVCs, an acronym for temps, vendors or contractors. These temps were deployed to target homeless people in Atlanta and unsuspecting students on college campuses around the U.S., among others.

Randstad, the third-party employment firm that engaged the temps, instructed them to go after people of color, hide the fact that people’s faces were being recorded, and even lie to maximize their data collection.

Some were told to gather the face data by characterizing the scan as a “selfie game” similar to Snapchat, The News reported.

One source said workers were told to say things like, “Just play with the phone for a couple of minutes and get a gift card,” and, “We have a new app, try it and get $5.”

“We were told not to tell (people) that it was video, even though it would say on the screen that a video was taken,” a source said, adding that video of each user was stored under each TVC’s profile and periodically reviewed in performance meetings.

“If the person were to look at that screen after the task had been completed, and say, ‘Oh, was it taking a video?’… we were instructed to say, ‘Oh it’s not really,’” they said.

Acknowledging the data collection, a Google spokesperson said: “We regularly conduct volunteer research studies. For recent studies involving the collection of face samples for machine learning training, there are two goals. First, we want to build fairness into Pixel 4’s face unlock feature. It’s critical we have a diverse sample, which is an important part of building an inclusive product.”

“And second, security. Pixel 4’s face unlock will be a powerful new security measure, and we want to make sure it protects as wide a range of people as possible.”

Last Edited by: Kent Mensah | Updated: October 14, 2019
