© 2021 MICHIGAN RADIO

Young Black girl says facial recognition software mistake barred her from entering skating rink

Photo: John Seung-Hwan Shin / Wikimedia Commons, Creative Commons license (https://creativecommons.org/licenses/by/3.0/deed.en)

A Black teenage girl said she was turned away from a Livonia skating rink after being flagged by its facial recognition software.

But the girl, 14-year-old Lamya Robinson, said she had never been to the rink before.

Robinson and her parents told a local television station she was mistaken for another girl, who was banned from the rink for allegedly fighting.

"That is not me. Who is that?" she recalled thinking, she told Fox 2 Detroit. She says she was kicked out and left standing outside alone at night.

Her mother Juliea Robinson called it racial profiling.

Critics of facial recognition technology have said the algorithms are less accurate at processing darker skin. Research has found the "poorest accuracy consistently found in subjects who are female, Black, and 18-30 years old." The study "Gender Shades," published in the Proceedings of Machine Learning Research, found disparities in accuracy between light-skinned men and darker-skinned women.

Lamya's father, Derrick Robinson, told Michigan Radio that the family is withholding further comment while considering its legal options.

The skating rink, Riverside Arena, did not respond to requests for comment from Michigan Radio.

The rink issued a statement to the television station, saying this was its "usual process" and that it was hard to look into the matter while the line was long.

"The software had her daughter at a 97 percent match. This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of, if there was a mistake, we apologize for that," the statement read.

The story spread across Michigan and beyond. Representative Rashida Tlaib (D-MI) shared it on Twitter, adding, "(f)acial recognition technology is racist."

Tawana Petty is the national organizing director with Data 4 Black Lives, an organization that aims to use ethical data science to make change in the lives of Black people.

Credit Lester Graham / Michigan Radio
Project Greenlight uses facial recognition technology in Detroit.

Petty said she considers facial recognition a "data weapon" that can lead to the criminalization of Black and brown people. She said the Robinsons' case frustrates her because it is something she and many others predicted.

"It's very sad that it takes for a 14-year-old child to be extracted from a roller skating rink for a lot of people to say, 'Wow, that's now, that's a bit too far.' She should never have had to have that experience."
 

Petty, who is based in Detroit, said that because of Project Greenlight — cameras around the city connected to police headquarters — there could be many other stories like the Robinsons' that never reached the public eye.
 
"We're just lucky in this case that law enforcement wasn't called on her."

Police use of facial recognition technology in Michigan

Robert Williams was wrongfully arrested by the Detroit police.
Credit ACLU of Michigan

A Black Farmington Hills resident, Robert Williams, was wrongfully arrested in his driveway, in front of his wife and children, by Detroit police in 2020.

Williams' arrest was based on a faulty facial recognition match and no other corroborating evidence, according to his lawsuit against the City of Detroit, its police chief, and a detective.

He said in testimony to Congress that the incident has traumatized his family.

"How does one explain to two little girls that a computer got it wrong, but the police listened to it anyway — even if that meant arresting me for a crime I didn’t commit?" he wrote in an American Civil Liberties Union essay. "And as any other Black man would be, I had to consider what could happen if I asked too many questions or displayed my anger openly, even though I knew I had done nothing wrong."

"I get angry when I hear companies, politicians, and police talk about how this technology isn’t dangerous or flawed or say that they only use it as an investigative tool. If any of that was true, I wouldn’t have been arrested."

Retailers 

Data 4 Black Lives has partnered with Ban Facial Recognition, which tracks corporations that use the surveillance technology, including Macy's. It also lists stores that are considering the software, such as 7-Eleven, McDonald's, and Walgreens.

Petty said her group hopes to raise public awareness so people can make informed decisions about their information being scraped.

"It's just a very pervasive targeting type of system that has really drastic implications when leveraged by law enforcement. But it also is a tremendous invasion of privacy on every level. And not to mention the fact that folks are not giving consent for this," she said.
