
Business News

Rite Aid wrongfully used facial recognition to accuse customers of shoplifting, FTC says

The Federal Trade Commission said Wednesday that Rite Aid secretly used facial recognition on customers for nearly a decade and wrongfully accused many of shoplifting.

The FTC said Black and Asian customers were more likely than white customers to be misidentified as people who had shoplifted or tried to. Women were more likely to be misidentified than men.

According to the complaint, Rite Aid contracted with two companies to help create a database of images of people — considered to be “persons of interest” because Rite Aid believed they engaged in or tried to engage in criminal activity at one of its retail locations — along with their names and other information, such as any criminal background data.

In a statement, Rite Aid said the program was used at only a few stores and discontinued in 2020.

Under the settlement, the agency barred Rite Aid from using facial recognition technology in stores or online for five years. The FTC confirmed to NBC News that it’s the first time it has ordered any entity to stop using facial recognition.

“Rite Aid’s reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers’ sensitive information at risk,” Samuel Levine, the director of the FTC’s Bureau of Consumer Protection, said in a news release.

A bankruptcy court must approve the order, because Rite Aid filed for bankruptcy protection in October.

In its statement, the FTC said Rite Aid started using “artificial intelligence-based facial recognition technology” in 2012 to identify customers who may have been shoplifting or engaging in other undesirable behavior. It contracted with two companies to develop a database of people who had shoplifted or been accused of shoplifting in the past.

People walk past a Rite Aid in Queens, N.Y., on Oct. 16. (Anthony Behar / Sipa USA via AP file)

The database was often filled with low-quality images from store security cameras, employees’ phones or news stories, along with names and other information, including any criminal background data.

The system sometimes identified shoppers as people who had been accused of shoplifting at other stores thousands of miles away, or flagged the same person at dozens of stores around the country.

The FTC said the system generated thousands of false positives, more often in stores in neighborhoods with large Black and Asian populations than in predominantly white ones.

“Employees, acting on false positive alerts, followed consumers around its stores, searched them, ordered them to leave, called the police to confront or remove consumers, and publicly accused them, sometimes in front of friends or family, of shoplifting or other wrongdoing,” the FTC said.

Customers weren’t informed that Rite Aid was using the technology, and employees were “discouraged” from revealing it, the agency said.

It also said Rite Aid didn’t take steps to mitigate the harm caused by falsely accusing customers of shoplifting, didn’t regularly monitor or test its system, and didn’t tell employees that the system could make mistakes. When the company switched to a technology that let employees report bad matches and required them to use it, it didn’t follow up to make sure employees actually did so.

Notably, the FTC didn’t fine Rite Aid. Justin Brookman, the director of technology policy for Consumer Reports, said the agency seemed to have prioritized establishing a precedent regarding companies’ responsibilities when they use facial recognition technology in stores.

“I get the impression that it’s growing in practice, so the FTC wanted to get ahead of this to some extent,” Brookman said.

The FTC said in May that companies that use biometric technologies such as facial recognition have to implement them fairly and mitigate the harm they could cause. It also said companies that offer biometric technology can’t make false promises about its accuracy or effectiveness.

In 2010, just two years before Rite Aid started using the facial recognition technology, it settled FTC charges that it had failed to protect the medical and financial records of customers and employees. That settlement carried a $1 million penalty, and the FTC said that by implementing the facial recognition system and failing to oversee its vendors, Rite Aid violated it.

The agency said Rite Aid will be required to implement “a robust information security program, which must be overseen by the company’s top executives.”

This post appeared first on NBC News.
