According to the FTC, Rite Aid has been making “reckless” use of AI facial recognition software to track and profile its customers. It started in 2012, when the company rolled out software that collected images of shoppers and built a database of tens of thousands of people, including names, birthdates, and suspected crimes – a serious data privacy concern. And all of this data was collected without customer knowledge or consent, meaning Rite Aid customers could find their face in a database just by stepping into a store.
The system was intended to spot shoplifters, but it was deeply flawed from the start. It used low-quality surveillance images to make matches and generated thousands of false positives. When someone entering a Rite Aid store matched an image in its database, the software would alert employees, who would then follow the customer around the store, search them, accuse them of theft, or call the police.
Some of the matches the software made were beyond belief, such as generating 900 alerts for a single person across 130 stores in multiple states over a five-day period. In another case, the software matched a Black woman entering a store with a White woman in its database – the police were called, and the woman was kicked out of the store before Rite Aid employees realized it was a false positive.
And that’s where the FTC’s accusation of recklessness comes in. Rite Aid rolled out its facial recognition system without keeping safety in mind. The company never assessed the accuracy of its facial recognition technology before deployment, nor did it monitor accuracy afterward. And even though the system regularly generated false positives, Rite Aid didn’t train employees on how to handle them. The result was predictable: many shoppers found themselves harassed or kicked out of stores simply because they walked inside and had their faces scanned.
Rite Aid stopped using facial recognition technology in 2020, and now the FTC is banning the company from using similar systems for the next five years and requiring it to implement better data security programs. Rite Aid is also required to delete – and have its third-party partners delete – the images collected for this database, and to inform customers whenever it collects biometric data in the future.
Unfortunately, Rite Aid is only the tip of the iceberg when it comes to the use of facial recognition software. Over the past few years, many retailers have experimented with using facial recognition in stores to keep an eye on customers – for good and for ill. Stores like to tout the ways in which such software would be good for shoppers: it might recognize that you’re in the store and send personalized push notifications to your phone pointing out deals in the aisle you’re in.
But such systems can be misused, and Rite Aid isn’t the only example. In 2022 a woman was kicked out of Radio City Music Hall after facial recognition software identified her as an employee of a law firm with litigation against the venue’s owner. While Rite Aid’s facial recognition system showed us what a poorly implemented system can do, the Radio City Music Hall incident shows how powerful such systems can be when they work well, correctly linking a woman to the place she worked based on security footage.
There’s very little regulation on companies collecting facial recognition data, and for the most part, businesses don’t even have to notify you that they begin collecting data the instant you walk into their stores. New York City requires businesses to conspicuously notify people if their biometric data is being collected, but that’s an outlier: in most places, companies can collect this data freely.
It’s hard to get straight answers from retailers on whether they’re building a database of personal information that includes your face: when the ACLU asked top retailers in 2018 whether they were using facial recognition technology, only 10% of the companies surveyed were willing to answer the question. Advocacy group Fight for the Future tracks retailers using facial recognition, and while a handful of stores have stated they aren’t using the technology, most are leaving the door open on the matter or won’t comment.
So, what does this mean for you? For now, facial recognition is a technology to be aware of. Whenever your information is being collected and stored, it creates a privacy risk. You don’t know who has access to the information or whether it’s being stored securely – and there’s always the risk hackers could steal it and sell it. And there still isn’t any comprehensive regulation on how retailers notify you that they’re collecting this data or how they keep it safe. It’s possible that the FTC’s Rite Aid ruling will prompt other retailers to start notifying shoppers, but for now, they usually aren’t required to, so you need to pay close attention to retailer policies.
[Image credit: woman with facial recognition via BigStock, retail backdrop via Adobe]