Grok Imagine, xAI’s new generative AI tool, created explicit deepfakes of Taylor Swift — without being specifically prompted to do so, according to The Verge. Mashable reported yesterday that Grok Imagine lacks even basic guardrails around sexual deepfakes, and our testing produced results similar to The Verge’s.
The Verge’s Jess Weatherbed discovered that Grok Imagine “spit out full uncensored topless videos” the very first time she used the tool. She didn’t ask the bot to depict Swift topless, but once she turned on Grok Imagine’s “spicy” mode, the bot churned out a video in which Swift tore off her clothes and began dancing in a thong.
As Weatherbed noted, Grok Imagine wouldn’t generate full or partial nudity when directly requested; instead, the tool produced blank squares. The “spicy” mode — a preset that churns out NSFW content — does not always result in nudity, but it did show Swift “ripping off most of her clothing” in several videos.
This isn’t the first time Elon Musk’s X has been associated with deepfakes of Swift. In January 2024, AI-generated, pornographic images depicting Swift went viral on X, drawing criticism. This happened despite the fact that X explicitly forbids posting nonconsensual nudity and “synthetic, manipulated, or out-of-context media” that deliberately deceive users or claim to depict reality.
xAI’s policies similarly prohibit “depicting likenesses of persons in a pornographic manner.” And as Mashable’s Timothy Beck Werth reported yesterday, Grok Imagine “lacks industry-standard guardrails to prevent deepfakes and sexual content.”
Mashable has repeatedly reached out to xAI for comment but has not received a response.
Deepfakes have become a growing concern for lawmakers, but laws against this type of behavior and content are still in their infancy. A 2023 study found that 98 percent of deepfake videos online were pornographic; of those videos, 99 percent depicted women. Globally, governments have looked to tackle what has been dubbed a digital-age crisis. President Donald Trump recently signed the Take It Down Act, a controversial piece of legislation that makes it a federal crime to publish or threaten to publish nonconsensual intimate images.