AI governance and biometric privacy takeaways from the FTC’s Rite Aid settlement

The Federal Trade Commission (“FTC”) has banned Rite Aid from using facial recognition technology for surveillance purposes for five years, underscoring the FTC’s expectations for deployments of biometric and artificial intelligence (“AI”) technologies used for surveillance.

The FTC settled an enforcement action against Rite Aid for allegedly engaging in unfair facial recognition surveillance practices and failing to implement a comprehensive security program to protect the security, confidentiality, and integrity of customers’ personal information. The proposed FTC order serves as an example for businesses using or contemplating the use of AI-enabled technologies for consumer surveillance or other initiatives that may cause similar consumer harms. More broadly, the settlement signals the FTC’s expectations for AI governance and what the agency considers a baseline for algorithmically fair deployments of biometric surveillance technology.

Background

The Rite Aid enforcement action demonstrates the FTC’s continued focus on issues involving biometrics and algorithmic fairness. In May 2023, the FTC issued a policy statement highlighting the proliferation of biometric information technologies and warning of potentially significant risks to consumers if those technologies are mismanaged. The FTC cautioned businesses to take care that their collection and use of biometric information is fair and does not cause substantial consumer injury. Earlier, in June 2022, the FTC had issued a similar warning about potential consumer harms from certain uses of AI, expressing the agency’s “significant concerns” that AI tools can be inaccurate, biased, and discriminatory by design and can incentivize reliance on increasingly invasive forms of commercial surveillance. The agency has echoed these concerns in Congressional reports, blog posts, and joint statements with other federal agencies.

Key areas of FTC concern

According to the FTC complaint associated with this enforcement action, Rite Aid deployed AI-based facial recognition technology in hundreds of retail pharmacy locations to identify customers who might pose a risk of shoplifting or other criminal behavior. Based on alerts generated by these automated biometric surveillance systems, Rite Aid employees took actions deemed harmful to consumers, including banning them from the premises, publicly accusing them of criminal behavior, and reporting them to the police. The FTC alleged that many of these actions were based on false-positive results, which disproportionately impacted women and certain racial or ethnic groups.

The FTC complaint alleged that Rite Aid’s use of facial recognition technology constituted an unfair practice under Section 5 of the FTC Act because Rite Aid failed to take reasonable measures to prevent foreseeable harms to consumers. Per the FTC, despite these foreseeable risks, Rite Aid implemented the biometric surveillance technology without adequately considering consumer safety.

Specifically, the FTC alleged that Rite Aid failed to:

  • Address potential risks of misidentification, including heightened risks for certain consumers because of their race or gender.

  • Perform due diligence to test or verify the accuracy of the technology prior to implementation (including diligence on its technology vendors).

  • Enforce image quality controls necessary for the technology to function accurately.

  • Adequately train employees tasked with operating the technology.

  • Consistently monitor or test the technology’s accuracy, once deployed.

  • Implement and maintain a comprehensive information security program, as required under a 2010 settlement with the FTC resolving separate allegations regarding the protection of sensitive personal information.

Stipulated order terms

The FTC’s proposed order includes the following provisions:

  • Use of Facial Recognition or Analysis Systems Prohibited. Rite Aid is prohibited from using automated facial recognition technology for surveillance purposes for five years. 

  • Deletion of Images and Analyses. Rite Aid must delete or destroy all photos, videos, and derived data collected with its facial recognition technology and provide written confirmation to the FTC. Rite Aid must also identify, inform, and instruct all third parties who received any of the above data to do the same.

  • Monitoring Program Implementation. After the five-year ban expires, Rite Aid may use an automated facial recognition surveillance system only if it has first developed, and consistently maintains, a documented monitoring program that meets specific criteria, including:

    • appointing a qualified employee to supervise the program,

    • conducting risk assessments before deployment, and

    • on at least an annual basis, implementing safeguards to control identified risks, evaluating and adjusting the program to reflect any changes that significantly affect its effectiveness, and presenting reports on the program and its evaluations to the board and senior executives.

  • Notice and Complaint Procedures. Rite Aid must provide written notice to consumers, identifying the specific types of biometric information collected, when their biometric data is enrolled in a database associated with a biometric security or surveillance system and when Rite Aid takes any action against them that could lead to specific harms based on a result produced by such a system. Rite Aid must also provide clear and conspicuous notice to consumers about the deployment of facial recognition or other biometric surveillance technologies in its stores. Additionally, Rite Aid must promptly investigate and respond in writing to consumer complaints regarding actions taken against them.

  • Required Retention Limits for Biometric Information. Rite Aid must develop a written retention schedule detailing purposes for collecting biometric information and setting out a timeframe for deletion of such data within five years.

  • Prohibition Against Misrepresentations. Rite Aid is prohibited from misrepresenting its data security and privacy practices.

  • Information Security Program. Rite Aid must implement a comprehensive data security program and obtain third-party assessments of the program.

Takeaways for businesses

Although the actions and prohibitions detailed in the FTC’s proposed order are meant to address the specific conduct alleged against Rite Aid, the order also provides valuable insight for businesses using or contemplating the use of similar technologies. Indeed, as Commissioner Bedoya noted in a statement regarding the Rite Aid order, the case can and should be viewed as part of the FTC’s broader trend toward promoting algorithmic fairness. The order suggests that businesses contemplating the use of AI-supported biometric technologies for surveillance purposes should consider the following:

  • Informing consumers when they are enrolled in a biometric security or surveillance system, how to contest their entry into that system, when action is taken against them based on that enrollment, and how they can contest those actions;

  • Developing a consumer complaint system that allows the business to effectively and promptly address consumer concerns regarding the outputs produced by algorithmic determinations, especially where biometric information technologies are involved;

  • Conducting robust testing, including testing for bias on the basis of protected characteristics;

  • Conducting regular, detailed assessments of how inaccuracies may arise from training data, hardware and software issues, and differences between training and deployment environments;

  • Performing ongoing testing, at least annually, under conditions that materially replicate those in which the AI or biometric technology systems are deployed;

  • Modifying or terminating systems if the business is unable to address the risks identified through its assessment and testing;

  • Making AI and biometric governance a top priority at the board level and dedicating the resources and programs necessary to facilitate compliance with potential future government regulations.

 

 

Authored by W. James Denvil, Donald DePass, Alyssa Golay, and Pat Bruny.
