A Black teenager in the US was barred from entering a roller rink after a facial-recognition system wrongly identified her as a person who had been previously banned for starting a fight there.
Lamya Robinson, 14, had been dropped off by her parents at Riverside Arena, an indoor rollerskating space in Livonia, Michigan, at the weekend to spend time with her pals. Facial-recognition cameras installed inside the premises matched her face to a photo of somebody else apparently barred following a skirmish with other skaters.
Robinson was thus told to leave the premises by staff. She said the person in the image couldn't possibly be her because she had never been to the skating rink before. Her parents, Juliea and Derrick, are now mulling whether to sue Riverside Arena.
“To me, it’s basically racial profiling,” Lamya’s mother told Fox 2 Detroit. “You’re just saying every young Black, brown girl with glasses fits the profile and that’s not right.”
One of the arena’s managers later called Lamya’s mother to discuss the issue. And in a statement, the biz said: “The software had her daughter at a 97 percent match. This is what we looked at … if there was a mistake, we apologize for that.”
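A "97 percent match" sounds authoritative, but in most face-recognition deployments that figure is just a similarity score between feature vectors crossing a threshold, not a calibrated probability that two photos show the same person. Here is a minimal illustrative sketch (the embeddings, threshold, and function names are hypothetical, not the arena vendor's actual system) showing how two different faces with similar features can still clear a high similarity bar:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe, gallery, threshold=0.97):
    # A "97 percent match" typically means a similarity score crossed
    # a threshold like this one -- it is NOT a calibrated probability
    # that the two images depict the same person.
    return cosine_similarity(probe, gallery) >= threshold

# Toy embeddings for two *different* people whose features the model
# happens to place close together in embedding space.
person_a = np.array([0.90, 0.10, 0.40])
person_b = np.array([0.88, 0.12, 0.42])

print(is_match(person_a, person_b))  # True -- a confident-looking false match
```

The point of the sketch: when a model clusters certain faces tightly, as studies have found it does more often for women and darker-skinned people, a high score against a banned person's photo can be confidently wrong.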
Lots of mistakes to be found
Facial-recognition technology is controversial. Experts in the AI research community, lawyers, and even law enforcement have called on Congress to place a moratorium on using the software in the real world. Several projects have shown that the algorithms involved generally struggle to accurately identify women and people of color, such as Lamya.
The House judiciary committee held a hearing just this week on the effects of facial recognition in law enforcement. Robert Williams, a Detroit man who was wrongly arrested and detained for 30 hours after a false facial-recognition match, testified.
“I grew up in Detroit, and I know from that experience that the fact of the matter is that people that look like me have long been more subject to surveillance, heavy policing, and mass incarceration than some other populations,” he said. “I worry that facial recognition technology, even if it works better than it did in my case, will make these problems worse.”
There is no federal-level regulation of the technology in America, however, and Congress seems unlikely to act on the issue. Instead, individual states and cities have their own rules, which vary in terms of how and where facial-recognition cameras can be used.
In Maine, for example, state officials cannot use the technology, nor contract third parties to do so, except in cases involving serious crimes or to search for registered vehicles. Elsewhere, in Portland, Oregon, facial-recognition cameras are not allowed inside any public or private places, from grocery stores to train stations.
Many states, however, are pretty lax about it. Banks in Florida and North Carolina use facial-recognition systems to monitor customers and, in some cases, shoo away homeless people loitering outside. ®