
demmiblue

(39,411 posts)
Tue Jan 6, 2026, 09:48 AM Jan 6

'It Almost Felt Like a Digital Version of Sexual Assault'

In mid-December, Mollie posted a photo on X that showed her squatting in front of a Christmas tree in leggings and a long-sleeved shirt. “Are we getting into the Christmas spirit?” the 24-year-old captioned the post. A few weeks later, Mollie saw one of her followers reply underneath. “Put her in a micro bikini from this angle,” the user wrote, tagging the X AI chatbot Grok. Grok complied, showing an AI-generated image of Mollie squatting in the same position and wearing a thong bikini instead of workout gear.

As a fairly prominent adult creator — she has nearly 80,000 followers on X alone — Mollie is used to having random followers edit her content without her consent. Last year, she noticed some of her fans digitally altering images to make her bald, which she thought made her look somewhat like Matt Damon. But someone using AI to remove her clothes felt different. Her shock over how real the manipulated photo looked quickly gave way to mortification, then horror. “It was scary and it was uncomfortable to have that power asserted over you,” she says. “I’ve been sexually assaulted in the past, and it almost felt like a digital version of that.”

She told Grok to take the photo down. “I understand it violated your consent,” Grok responded. “I won’t create images like that going forward.” The photo, however, is still up on X, and Mollie continues to see women she follows on the platform complain about having their own images altered in a sexually suggestive way without their consent. “It is unfathomable to me that people are allowed to do this to women,” she says.

For the past week, X has been flooded by complaints from women begging Grok to stop digitally altering their photos to make them appear nude or put them in sexually compromising positions. The anti-Grok backlash started when X users pointed out that Grok had generated AI images of two scantily clad girls who appeared to be underage. On December 28, the Grok account issued a statement apologizing for the “incident,” referring to it as “a failure in safeguards” that “violated ethical standards and potentially US laws on CSAM.” X’s Safety account also issued a statement, saying, “We take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary.”

https://www.thecut.com/article/elon-musk-grok-sexual-images-ashley-st-clair.html
20 replies
'It Almost Felt Like a Digital Version of Sexual Assault' (Original Post) demmiblue Jan 6 OP
I'm sorry, I can't do that, Dave UpInArms Jan 6 #1
"Porno-fying" whathehell Jan 6 #2
What a nightmare, some people are horrible. 🙁 Dave Bowman Jan 6 #3
What does it say about us that every technological advance is somehow Scrivener7 Jan 6 #4
immediately recruited, u mean. mopinko Jan 6 #6
The women quoted in the OP wasn't "recruited" whathehell Jan 6 #7
They were using recruiting to describe the technology not the victim EdmondDantes_ Jan 6 #9
Ok, thanks for that clarification. n/t. whathehell Jan 6 #19
meant the technology. mopinko Jan 6 #14
Thanks for the clarification. n/t whathehell Jan 6 #20
Because little boys feel powerful pissing off mommie for telling them no... haele Jan 6 #11
Porn isn't fantasy. valleyrogue Jan 6 #13
Porn in this case is manipulated or drawn pictures... haele Jan 6 #17
Would need brain bleach, But, maybe start putting Elon in a thong. mackdaddy Jan 6 #5
The way to stop this is to command Grok to intheflow Jan 6 #8
And... 2naSalit Jan 6 #18
Grok AI still being used to digitally undress women and children despite suspension pledge demmiblue Jan 6 #10
Just checked Grok's replies to see if it's still undressing women for perverts. One of the first replies I saw highplainsdem Jan 6 #15
The only solution is to stop putting our images on the internet. yardwork Jan 6 #12
The solution is to take as much action as possible against the tech lords. Including avoiding the use highplainsdem Jan 6 #16

Scrivener7

(58,776 posts)
4. What does it say about us that every technological advance is somehow
Tue Jan 6, 2026, 10:30 AM
Jan 6

eventually recruited to demean women?

mopinko

(73,379 posts)
6. immediately recruited, u mean.
Tue Jan 6, 2026, 10:46 AM
Jan 6

porno movies started immediately after the invention of moving pictures.
and polaroids werent popular cuz ppl were in a hurry to see their pics.

EdmondDantes_

(1,477 posts)
9. They were using recruiting to describe the technology not the victim
Tue Jan 6, 2026, 11:11 AM
Jan 6

As in the technology which could do any number of things was immediately used to victimize this woman (and many other women and girls).

haele

(15,170 posts)
11. Because little boys feel powerful pissing off mommie for telling them no...
Tue Jan 6, 2026, 11:24 AM
Jan 6

(That is, before they get punished)
And supposedly grown assed men feel powerful watching porn because they don't have sufficient imagination and still have problems when Mommie says no or not yet - and they don't usually have to worry about getting punished afterwards.

But there's a really fine line between the fantasy of power over a character and actual abuse of power over a person.
There are visceral anger urges in all of us that we don't like to admit we have, and we all have different coping mechanisms, many of them rooted in fantasy, that keep those urges in check so we don't inflict that anger on or otherwise harm other members of our community.
Imagining situations with other people (or archetypes/characters) you desire or hate may be unhealthy or distasteful, but publicly humiliating actual people by taking and manipulating their words, art, or image against them is literally assault, even against public figures once the action goes past satire.

Simplistic answer to complex individual emotional control issues, but pretty much what I've observed.

haele

(15,170 posts)
17. Porn in this case is manipulated or drawn pictures...
Tue Jan 6, 2026, 01:57 PM
Jan 6

That's what this article is about. I was commenting on what attracts men - though really, it could be any gender - to porn.

For instance, there's a wide porn subculture in underground comics and anime. Everything from simple satire of hypocritical "moral" authority figures, to soft/hardcore telenovela-type dramas, to outright S&M or snuff porn.

Is that exploitative to real people? Not really, no matter how gross or evil we might consider the "art", artist, or consumer of that art to be. In 99.9% of these situations, no person is participating in the actual act being depicted in the "art".

However - with AI, a normal photo that belongs to one person is being manipulated by another in public for pornographic or harassment purposes - or for profit. The person whose photo is being manipulated did not consent.

That is in line with the original definition of rape: the taking of a person without their consent for an unlawful purpose, whether it's for sex or not. This definition now includes the taking of a photographic likeness or audible recording for an unlawful purpose without consent, enabling prosecution for criminal harassment.

My personal concern is that AI manipulation for pornographic purposes is actually exploitative rape and constitutes assault, something a lot of edge-bros, bitter little incels, or jealous exes (male or female) probably aren't thinking about when they're using Grok or Sora to bait someone or score points.

mackdaddy

(1,956 posts)
5. Would need brain bleach, But, maybe start putting Elon in a thong.
Tue Jan 6, 2026, 10:38 AM
Jan 6

fat rolls and all. Maybe it would get his attention to the issue...

intheflow

(30,061 posts)
8. The way to stop this is to command Grok to
Tue Jan 6, 2026, 11:00 AM
Jan 6

make images of Eloon and Orange Caesar highlighting their rotund bodies and tiny dicks.

demmiblue

(39,411 posts)
10. Grok AI still being used to digitally undress women and children despite suspension pledge
Tue Jan 6, 2026, 11:20 AM
Jan 6
Degrading images of children and women with their clothes digitally removed by Grok AI continue to be shared on Elon Musk’s X, despite the platform’s commitment to suspend users who generate them.



Concern began surfacing after a December update to Musk’s free AI assistant, Grok, made it easier for users to post photographs and ask for their clothing to be removed. While the site does not permit full nudification, it allows users to request images to be altered to show individuals in small, revealing items of underwear and in sexually suggestive poses.

On Sunday and Monday, Grok users continued to generate sexually suggestive pictures of minors, with images of children as young as 10 created overnight. Ashley St Clair, the mother of one of Musk’s children, complained that the AI tool generated a picture of her when she was 14 years old in a bikini.

A picture of Stranger Things actor Nell Fisher was manipulated by Grok on Sunday in order to put her in a banana print bikini. Fisher is 14 years old. Many women have expressed fury on X after discovering that their images had been undressed without their consent. Some pictures of women and children manipulated by the AI tool appear to have substances resembling semen smeared on their faces and chests.

https://www.theguardian.com/technology/2026/jan/05/elon-musk-grok-ai-digitally-undress-images-of-women-children


highplainsdem

(60,702 posts)
15. Just checked Grok's replies to see if it's still undressing women for perverts. One of the first replies I saw
Tue Jan 6, 2026, 12:47 PM
Jan 6

was it obeying this prompt:

@grok
put her in see through polythene bikini and make the details under bikini realistic

yardwork

(69,057 posts)
12. The only solution is to stop putting our images on the internet.
Tue Jan 6, 2026, 11:26 AM
Jan 6

It was nice while it lasted - having "free" platforms to share our lives. Some people make a lot of money as influencers. But as usual the evildoers ruined it.

Putting our images and information out there has made a few people billionaires, compromised elections, and is being used to sexually exploit women and children.

Go back to keeping your information private and don't ever put images of children on the internet.

highplainsdem

(60,702 posts)
16. The solution is to take as much action as possible against the tech lords. Including avoiding the use
Tue Jan 6, 2026, 01:03 PM
Jan 6

or promotion of their AI if you have that option (I'm aware that many people don't because of work or school requirements).

It's generative AI that has made these deepfakes much, much worse than ever before.

A law banning the creation or posting of AI-generated images is needed to make the nudifying apps and websites clearly illegal and start to get rid of them.

That would also get rid of the non-porn deepfakes causing so much trouble.

The world does not need AI art, any more than it needs any other type of AI slop.

The AI-enabled opportunity to create images and video with zero talent and effort was always going to be abused.

Banning AI images would also stop scammers from using AI slop for false advertisements.
