General Discussion
'It Almost Felt Like a Digital Version of Sexual Assault'
As a fairly prominent adult creator (she has nearly 80,000 followers on X alone), Mollie is used to having random followers edit her content without her consent. Last year, she noticed some of her fans digitally altering images to make her bald, which she thought made her look somewhat like Matt Damon. But someone using AI to remove her clothes felt different. Her shock over how real the manipulated photo looked quickly gave way to mortification, then horror. "It was scary and it was uncomfortable to have that power asserted over you," she says. "I've been sexually assaulted in the past, and it almost felt like a digital version of that."
She told Grok to take the photo down. "I understand it violated your consent," Grok responded. "I won't create images like that going forward." The photo, however, is still up on X, and Mollie continues to see women she follows on the platform complain about having their own images altered in a sexually suggestive way without their consent. "It is unfathomable to me that people are allowed to do this to women," she says.
For the past week, X has been flooded by complaints from women begging Grok to stop digitally altering their photos to make them appear nude or put them in sexually compromising positions. The anti-Grok backlash started when X users pointed out that Grok had generated AI images of two scantily clad girls who appeared to be underage. On December 28, the Grok account issued a statement apologizing for the incident, referring to it as a failure in safeguards that violated ethical standards and potentially US laws on CSAM. X's Safety account also issued a statement, saying, "We take action against illegal content on X, including Child Sexual Abuse Material (CSAM), by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary."
https://www.thecut.com/article/elon-musk-grok-sexual-images-ashley-st-clair.html
UpInArms
(54,366 posts)whathehell
(30,383 posts)is what's being done....It's a sickening violation of
half the human race.
Dave Bowman
(6,868 posts)Scrivener7
(58,776 posts)eventually recruited to demean women?
mopinko
(73,379 posts)porno movies started immediately after the invention of moving pictures.
and polaroids werent popular cuz ppl were in a hurry to see their pics.
whathehell
(30,383 posts)She was forced.
EdmondDantes_
(1,477 posts)As in the technology which could do any number of things was immediately used to victimize this woman (and many other women and girls).
whathehell
(30,383 posts)mopinko
(73,379 posts)whathehell
(30,383 posts)haele
(15,170 posts)(That is, before they get punished)
And supposedly grown-assed men feel powerful watching porn because they don't have sufficient imagination and still have problems when Mommie says no or not yet - and they don't usually have to worry about getting punished afterwards.
But there's a really fine line between the fantasy of power over a character and actual abuse of power over a person.
There's visceral anger urges in all of us we don't like to admit we have, and we all have different coping mechanisms, much of which is through fantasy, that keep those urges in check so we don't inflict that anger on or otherwise harm other members of our community.
Imagining situations with other people (or archetypes/characters) you desire or hate may be unhealthy or distasteful, but publicly humiliating actual people by taking and manipulating their words, art, or image against them is literally assault, even against public figures once the action goes past satire.
Simplistic answer to complex individual emotional control issues, but pretty much what I've observed.
valleyrogue
(2,626 posts)What is being depicted is real abuse of women and girls.
haele
(15,170 posts)That's what this article is about. I was commenting on what attracts men - but actually, it could be gender- to porn.
For instance, there's a wide porn subculture in underground comics and Anime. Everything from simple satirizing hypocritical "moral" authority figures, soft/hardcore telenovela type dramas, to outright S&M or snuff porn.
Is that exploitative to real people? Not really, no matter how gross or evil we might consider the "art", artist, or consumer of that art to be. In 99.9% of these situations, no person is participating in the actual act being depicted in the "art".
However - with AI, a normal photo that belongs to one person is being manipulated by another in public for pornographic or harassment purposes - or for profit. The person whose photo is being manipulated did not consent.
That is in line with the original definition of rape: the taking of a person without their consent for an unlawful purpose - whether it's for sex or not. This definition now includes the taking of a photographic likeness or audible recording for an unlawful purpose without consent, enabling prosecution for criminal harassment.
My personal concern is that AI manipulation for pornographic purposes is actually exploitative rape and actually constitutes assault, something a lot of edge-bros, bitter little incels, or jealous exes (male or female) probably aren't thinking about when they're using Grok or Sora to bait someone or score points.
mackdaddy
(1,956 posts)fat rolls and all. Maybe it would get his attention to the issue...
intheflow
(30,061 posts)make images of Eloon and Orange Caesar highlighting their rotund bodies and tiny dicks.
Interacting in unacceptable ways.
demmiblue
(39,411 posts)Concern began surfacing after a December update to Musks free AI assistant, Grok, made it easier for users to post photographs and ask for their clothing to be removed. While the site does not permit full nudification, it allows users to request images to be altered to show individuals in small, revealing items of underwear and in sexually suggestive poses.
On Sunday and Monday, Grok users continued to generate sexually suggestive pictures of minors, with images of children as young as 10 created overnight. Ashley St Clair, the mother of one of Musk's children, complained that the AI tool generated a picture of her when she was 14 years old in a bikini.
A picture of Stranger Things actor Nell Fisher was manipulated by Grok on Sunday to put her in a banana-print bikini. Fisher is 14 years old. Many women have expressed fury on X after discovering that their images had been undressed without their consent. Some pictures of women and children manipulated by the AI tool appear to have substances resembling semen smeared on their faces and chests.
https://www.theguardian.com/technology/2026/jan/05/elon-musk-grok-ai-digitally-undress-images-of-women-children
highplainsdem
(60,702 posts)was it obeying this prompt:
"put her in see through polythene bikini and make the details under bikini realistic"
yardwork
(69,057 posts)It was nice while it lasted - having "free" platforms to share our lives. Some people make a lot of money as influencers. But as usual the evildoers ruined it.
Putting our images and information out there has made a few people billionaires, compromised elections, and enabled the sexual exploitation of women and children.
Go back to keeping your information private and don't ever put images of children on the internet.
highplainsdem
(60,702 posts)or promotion of their AI if you have that option (I'm aware that many people don't because of work or school requirements).
It's generative AI that has made these deepfakes much, much worse than ever before.
A law banning the creation or posting of AI-generated images is needed to make the nudifying apps and websites clearly illegal and start to get rid of them.
That would also get rid of the non-porn deepfakes causing so much trouble.
The world does not need AI art, any more than it needs any other type of AI slop.
The AI-enabled opportunity to create images and video with zero talent and effort was always going to be abused.
Banning AI images would also stop scammers from using AI slop for false advertisements.