AI Insurance Company Faces Class Action Lawsuit Over Use of Biometrics

After a tweeting mishap, Lemonade Inc., an AI-powered insurance company, faces a class action lawsuit for allegedly violating New York laws against the nonconsensual use of biometric data by using facial recognition technology to analyze videos submitted as part of the claims process.

Artificial intelligence and big data are key elements of Lemonade’s appeal to consumers and investors, but those same tools sparked concern on social media when Lemonade mentioned its use of facial recognition to analyze videos. In a series of now-deleted tweets, Lemonade said it was gathering more than 1,600 “data points” on its customers, “100 times more data than traditional insurance companies,” to be analyzed by a “charming artificial intelligence bot” that then crafts insurance policies and quotes. Those data points include videos made and submitted by customers. Lemonade’s AI bot analyzes the videos for fraud and can supposedly detect “non-verbal cues” that traditional insurers cannot. According to the class action lawsuit, Lemonade also tweeted that this process “ultimately helps…reduce [its] loss ratios” and its “overall operating costs.”

These tweets raised concerns among Twitter users about the collection of facial biometric data, including the possibility of discrimination based on race and other traits. In response, Lemonade tweeted that it does not use, and “does not attempt to build,” an AI that uses physical or personal characteristics to deny claims. Instead, Lemonade explained that it requests a video during the claims process because “it’s better for [its] customers” and that “the term non-verbal cues was a poor choice of words” to describe the facial recognition technology it uses to flag claims submitted by the same person under different identities.

In August 2021, plaintiff Mark Pruden filed a putative class action lawsuit in the Southern District of New York alleging that Lemonade violated New York statutory and common law by “collecting, storing, analyzing, or otherwise using the biometric data of thousands of its customers without their authorization or consent,” and contrary to its privacy policy. The claims include violation of New York’s deceptive trade practices statute, breach of contract, breach of implied contract, and unjust enrichment.

As of December 2021, the case has been on hold while the parties pursue settlement negotiations.

Biometric data continues to be a hot topic among consumers, regulators, and plaintiffs’ attorneys, especially amid growing consumer concern about how and why such data is collected. Businesses should take care to clarify what they are doing, obtain unambiguous consent from their customers, and exercise caution when posting on social media.