Grammarly sued by former ‘expert’ over alleged identity‑stealing AI feature
Grammarly touts its AI “Expert Review” as a productivity boost, yet The Verge reports journalist Julia Angwin is suing, claiming the feature hijacked her identity without consent.
Key Facts
- Key company: Grammarly
Grammarly’s “Expert Review” AI, which promised users curated insights from leading journalists and scholars, is now at the center of a class‑action lawsuit filed by veteran investigative reporter Julia Angwin. According to The Verge, Angwin discovered that her byline and writing style were being used by the feature without her consent, a claim echoed in the Wired report that first broke the story of Grammarly’s broader practice of cloning real‑world experts for commercial gain. The complaint alleges that Grammarly, operating under its parent Superhuman, violated Angwin’s privacy and publicity rights by using her identity to generate AI‑driven editing suggestions, a use the filing argues is unlawful without her permission.
The lawsuit also implicates other high‑profile figures whose names appeared in the AI’s output. The Verge identified Casey Newton, a prominent tech journalist, and several current Verge staffers, including editor‑in‑chief Nilay Patel, as part of the unauthorized “expert” pool. Screenshots of Grammarly’s suggestions show these individuals’ bylines attached to AI‑generated text, illustrating the breadth of the alleged infringement. Angwin’s complaint, filed on Wednesday, seeks damages for the unauthorized commercial exploitation of her persona and asks the court to halt the feature’s deployment.
In response, Grammarly’s CEO Shishir Mehrotra announced an immediate suspension of the “Expert Review” agent. As reported by The Verge, Mehrotra said the tool was originally designed “to help users discover influential perspectives and scholarship relevant to their work, while also providing meaningful ways for experts to build deeper relationships with their fans.” He added that the company “hear[s] the feedback and recognize[s] we fell short on this,” and pledged to “rethink our approach going forward.” The company also rolled out an opt‑out portal earlier in the week, allowing writers and academics to request removal of their identities from the AI’s training data, though the portal was introduced only after the issue was publicly exposed.
TechCrunch echoed the criticism, noting that the “Expert Review” feature “is just missing the actual experts.” The outlet pointed out that Grammarly’s marketing materials touted the AI as a bridge between users and authoritative voices, yet the underlying implementation relied on synthetic replicas rather than genuine, consented contributions. That discrepancy raises broader questions about the ethical limits of large‑language‑model personalization, particularly when models are trained on publicly available writing that can be reconstituted into a recognizable likeness of its author.
The controversy arrives amid a wave of scrutiny over AI companies’ use of personal data for model training. ZDNet recently covered Grammarly’s broader AI suite, which includes agents that detect AI‑generated text and automatically generate citations, underscoring the firm’s ambition to embed AI deeply into everyday writing workflows. However, the Angwin lawsuit spotlights a legal frontier: whether the commercial repurposing of a person’s public writing—without explicit licensing—constitutes a violation of publicity rights. If the court sides with Angwin, it could set a precedent that forces AI developers to obtain clear consent before leveraging individual identities in consumer‑facing products.
Industry observers are watching the case closely, as it may reshape how AI firms balance personalization with privacy. The outcome could compel companies to redesign features that rely on “expert” personas, shifting toward opt‑in models or anonymized data pipelines. For now, Grammarly’s suspension of the feature signals a tentative retreat, but the lawsuit’s resolution will determine whether the company must overhaul its entire approach to AI‑driven expert curation.
This article was created using AI technology and reviewed by the SectorHQ editorial team for accuracy and quality.