General Discussion

Elon Musk's Pornography Machine
The Atlantic - Gift Link

Earlier this week, some people on X began replying to photos with a very specific kind of request. "Put her in a bikini," "take her dress off," "spread her legs," and so on, they commanded Grok, the platform's built-in chatbot. Again and again, the bot complied, using photos of real people (celebrities and noncelebrities, including some who appear to be young children) and putting them in bikinis, revealing underwear, or sexual poses. By one estimate, Grok generated one nonconsensual sexual image every minute in a roughly 24-hour stretch.
Although the reach of these posts is hard to measure, some have been liked thousands of times. X appears to have removed a number of these images and suspended at least one user who asked for them, but many, many of them are still visible. xAI, the Elon Musk-owned company that develops Grok, prohibits the sexualization of children in its acceptable-use policy; neither the safety nor child-safety teams at the company responded to a detailed request for comment. When I sent an email to the xAI media team, I received a standard reply: "Legacy Media Lies."
Musk, who also did not reply to my request for comment, does not appear concerned. As all of this was unfolding, he posted several jokes about the problem: requesting a Grok-generated image of himself in a bikini, for instance, and writing 🔥🔥🤣🤣 in response to Kim Jong Un receiving similar treatment. "I couldn't stop laughing about this one," the world's richest man posted this morning, sharing an image of a toaster in a bikini. On X, in response to a user's post calling out the ability to sexualize children with Grok, an xAI employee wrote that the team is "looking into further tightening our gaurdrails [sic]." As of publication, the bot continues to generate sexualized images of nonconsenting adults and apparent minors on X.
AI has been used to generate nonconsensual porn since at least 2017, when the journalist Samantha Cole first reported on deepfakes (at the time, the term referred to media in which one person's face has been swapped for another). Grok makes such content easier to produce and customize. But the real impact of the bot comes through its integration with a major social-media platform, allowing it to turn nonconsensual, sexualized images into viral phenomena. The recent spike on X appears to be driven not by a new feature, per se, but by people responding to and imitating the media they see other people creating: In late December, a number of adult-content creators began using Grok to generate sexualized images of themselves for publicity, and nonconsensual erotica seems to have quickly followed. Each image, posted publicly, may only inspire more images. This is sexual harassment as meme, all seemingly laughed off by Musk himself.
Great story about this by @matteowong.bsky.social: on Elon Musk's X, nonconsensual porn and child sex abuse are just the latest memes
— Damon Beres (@damonberes.com) 2026-01-02T23:07:37.369Z
4 replies
Elon Musk's Pornography Machine (Original Post)
In It to Win It
Tuesday
OP
spooky3 (38,293 posts)
1. More misogyny from Musk. What a shocker! nt
Maru Kitteh (31,274 posts)
2. Oh well. It's only women and children, so YAWN, apparently.
Will anything be done? Do we even notice anymore?
Polybius (21,528 posts)
3. One of my friends (he hates Trump) posted this on his Facebook
So that's where it came from?

Midnight Writer (25,152 posts)
4. Flood X with pornographic Grok-produced images of Musk, his wives and kids, Republican women,
the wives and kids of Republican men, and of Republican men and influencers.
See if they get the message.
(I am not proud to admit that I can think of lots of creative scenarios I could assign Grok to illustrate)