January 27, 2024 at 1:49 pm

Rite Aid’s Facial Recognition Profiled Shoppers As Potential Thieves, So FTC Shuts It Down

by Laura Lynott

Imagine going shopping and, without knowing why, being instantly followed around a store or stopped from buying something.

It seems hard to believe that such a thing could happen to most of us. But this actually did happen to thousands of people across several states.

A facial recognition system at drugstore chain Rite Aid was found to have misidentified Black, Latino, and Asian people, as well as women, as ‘likely’ shoplifters.


The tech was supposed to reduce shoplifting, but instead it wound up identifying thousands of innocent shoppers as potential thieves.

This led to employees placing certain people under increased surveillance, stopping them from buying items, or even accusing them, in front of friends, family, or other customers, of having previously committed crimes.

It seems hard to even imagine, but this facial recognition was used in hundreds of stores in New York City, LA, San Francisco, Philadelphia, Baltimore, Detroit, Atlantic City, Seattle, Portland (Oregon), Wilmington (Delaware), and Sacramento (California).

The system was in operation from October 2012 until July 2020. But last month, the Federal Trade Commission (FTC) reprimanded Rite Aid over its use of the system and banned the retailer from using such technology for five years, along with other penalties.


FTC Bureau of Consumer Protection director Samuel Levine released a statement: “Rite Aid’s reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers’ sensitive information at risk.

“Today’s groundbreaking order makes clear that the Commission will be vigilant in protecting the public from unfair biometric surveillance and unfair data security practices.”

According to the FTC’s legal complaint, people were unfairly targeted by the technology, which was deployed mostly in non-white neighborhoods.

Innocent children were also flagged, including an 11-year-old girl.

“The girl’s mother told Rite Aid that she had missed work because her daughter was so distraught by the incident,” reads the FTC complaint.

Rite Aid issued a statement saying the company will comply with the FTC ruling, while adding that the tech was part of a pilot program deployed “in a limited number of stores.”

The fact that the system was in operation for so long should be alarming to anyone concerned about civil liberties.

It’s a good thing the authorities and the company recognized this as something that had to change, and put a stop to it.

If you enjoyed that story, check out what happened when a guy gave ChatGPT $100 to make as much money as possible, and it turned out exactly how you would expect.