Grammarly removes AI feature that used real authors' identities, faces class action lawsuit



Grammarly has pulled its AI-powered Expert Review feature after being called out for using journalists’ and authors’ identities without permission. The writing assistant software is now facing a class action lawsuit accusing it of exploiting writers’ names for its own profit.

Launched alongside seven other AI agents last August, Expert Review was available on Grammarly’s Free and $12 Pro plans at launch, and was promoted as providing users with feedback on the content of their writing. A page on Grammarly’s website which has since been taken down stated that Expert Review “[drew] on insights from subject-matter experts and trusted publications,” and provided AI-generated feedback “based on publicly available expert content” (via Wayback Machine). Users could even personalise which “expert” sources Grammarly drew from by selecting the names of specific authors.

“Expert Review agent offers subject-matter expertise and personalized, topic-specific feedback to elevate writing that meets rigorous academic or professional standards tailored to the user’s field,” Grammarly wrote in its blog post announcing the feature.

Grammarly’s Expert Review came to attention last week after Wired reported that the feature was offering AI-generated edits in the name of real writers and academics, both living and dead. The tool’s user guide does provide the disclaimer that its references to experts “are for informational purposes only and do not indicate any affiliation with Grammarly or endorsement by those individuals or entities.” However, the same page also claims that Expert Review offers “insights from leading professionals, authors, and subject-matter experts.”

Many of the subject-matter experts in question have not taken kindly to Grammarly using their identities without their knowledge or consent.

“[Grammarly] curated a list of real people, gave its models free rein to hallucinate plausible-sounding advice on their behalf, and put it all behind a subscription,” wrote Platformer founder Casey Newton, who was among those invoked by Grammarly. “That’s a deliberate choice to monetize the identities of real people without involving them, and it sucks.”

“This has got to be some kind of defamation or something,” historian Mar Hicks posted to Bluesky, having shared a screenshot of their identity being included in Expert Review. “You can’t just steal people’s IP and then pretend they’re saying something they never said.”

Grammarly responds to Expert Review backlash

I need someone with grammarly to find out if I’m one of their experts so I can send a scathing email.

— Mikki Kendall (@karnythia.bsky.social) March 12, 2026 at 2:48 AM

Responding to the backlash, Grammarly told Platformer on Monday that it would allow writers to email them to opt out of inclusion in its Expert Review feature. This prompted further criticism, as experts were not told that Grammarly was using their identity, nor had they granted it permission in the first place. Impacted authors wouldn’t know that they needed to opt out unless a Grammarly user saw their name while using Expert Review and informed them. 

Further, providing the option to opt out did not address Grammarly’s use of dead authors’ identities. Deceased writers used by Expert Review reportedly included astronomer Carl Sagan and intersectional academic bell hooks.

“So Grammerly [sic] is violating the memory of bell hooks AND making AI versions of the rest of us before we’re even dead,” wrote researcher Sarah J. Jackson. “Someone tell me who to sue, not even joking.”

Oh that’s nice of them to let you ask to not steal from you

— mattcrwi.bsky.social (@mattcrwi.bsky.social) March 10, 2026 at 11:09 AM

Shishir Mehrotra, CEO of Grammarly developer Superhuman, subsequently announced on Wednesday that the company was pulling Expert Review offline. However, he also indicated that it intends to eventually bring the feature back in some form.

“Over the past week, we received valid critical feedback from experts who are concerned that the agent misrepresented their voices,” Mehrotra posted to LinkedIn. “As context, the agent was designed to help users discover influential perspectives and scholarship relevant to their work, while also providing meaningful ways for experts to build deeper relationships with their fans. We hear the feedback and recognize we fell short on this. I want to apologize and acknowledge that we’ll rethink our approach going forward.  

“After careful consideration, we have decided to disable Expert Review while we reimagine the feature to make it more useful for users, while giving experts real control over how they want to be represented — or not represented at all.”

I don’t think grammarly should just get to do “sorry deleting now” after ventriloquizing living and dead people without their consent to make money

— Lydia Kiesling (@lydiakiesling.bsky.social) March 12, 2026 at 6:37 AM

“That this even existed in the first place suggests a total disconnect from normal human society,” climate writer Ketan Joshi replied to Mehrotra’s post. “It should’ve been immediately obvious that this was exploitative and creepy and cruel.”

“With all the talk about how AI ‘builds from’ (read: ‘steals’) existent content, creating a tool that actually makes up ‘advice’ from real people who spend their lives caring about writing and expertise… it’s hard to fathom,” wrote the New York Times’ Dan Saltzstein. “There should be consequences to this beyond ‘we’re going to reevaluate.’ A promise to never do anything like this again, at minimum.”

Class action lawsuit accuses Grammarly of using writers’ identities without consent

I really can’t wait to see how big the lawsuit against grammarly gets and I hope the plaintiffs sue them into complete and fundamental nonexistence. Like, “the company has to scrap their code rather than sell it as assets, and then also dissolve” nonexistence.

— Dr. Damien P. Williams can’t think of a fun display name right n (@wolvendamien.bsky.social) March 11, 2026 at 12:20 PM

Though Grammarly has made no such pledge at present, it is already facing repercussions for its actions that go beyond reputational damage. New York Times writer Julia Angwin filed a class action lawsuit against Superhuman on Wednesday, having discovered that Grammarly’s Expert Review had used her identity without her consent. The law firm representing her, Peter Romer-Friedman Law PLLC, has put out a call for any writers who were impacted to join the class action.

Though it isn’t clear exactly how many writers’ identities Grammarly allegedly misappropriated, it may be a sizable cohort. Looking at tech journalists alone, The Verge reports that Expert Review named several members of its editorial staff, as well as writers from Wired, Bloomberg, The New York Times, The Atlantic, PC Gamer, Gizmodo, Digital Foundry, Tom’s Guide, and Mashable’s sister sites IGN and Rock Paper Shotgun. Angwin has claimed that “lots of folks” have already made inquiries about joining the lawsuit.

“I’m taking this action on behalf of not just myself, but everyone who spent years and decades refining their skills as a writer and editor, only to find an AI impersonating them,” Angwin wrote in a LinkedIn post.

“For over 100 years, New York law has prohibited companies from using a person’s name for commercial purposes without their consent,” said Peter Romer-Friedman of Peter Romer-Friedman Law PLLC. “The law does not provide an exception for technology companies or AI.”

Filed in a New York District Court, the class action is seeking damages as well as an injunction to prevent Grammarly from using writers’ identities without their consent.

Mashable has reached out to Superhuman for comment.
