
Elon Musk has tried to laugh off the issues with Grok, seeming to blame users rather than his platform. LIONEL BONAVENTURE/AFP/Getty Images
I stumbled on it by accident. I spend far less time on X, what used to be Twitter, than I did back in the day – when it was an interesting place to bat around ideas, even if it got hostile at times. Now that it’s a sludge pit of misinformation and hate-filled opinions, lurking is the best option – even if it is frequently anxiety-inducing.
If I didn’t have to lurk for my job, I wouldn’t be there at all.
But X is still, somehow, the platform of choice for much of the discourse. So lurk journalists must. I was doing just that when I recently encountered an instruction to X’s AI. “Hey Grok,” it went, “put her in a bikini.”
Huh?
This instruction appeared to be from someone who disagreed with the poster’s political views. It sent me down a rabbit hole that might have made even Hugh Hefner uncomfortable.
Grok, put me in a micro bikini. Grok, replace my clothes with clear tape.
Many of these “requests” were written as if they were coming from the posters themselves – depicted in the photos as young women, if anyone is naïve enough to believe that.
These non-consensual, technology-assisted manipulations are vile and should be illegal, not trending. Yet this digital undressing has gone from the dark web to the mainstream, exploding over the holidays.
Copyleaks, an AI-manipulated media-detection and AI content-governance platform, reported on Dec. 31 that Grok was generating roughly one non-consensual sexualized image per minute.
Grok has “undressed” children, including adolescents. Also targeted are public figures – celebrities, politicians. And everyday people, perhaps women whose opinions run contrary to the Grok-using keyboard warriors who decide to publicly give these ladies what for.
With this grotesque trend, the internet – and thus the world – has become a little less safe for women. Why post and risk having a photo turned into an AI-generated nude for the world to see?
And yet posting is an expected part of the work many women do – whether they’re promoting a film or proposed legislation. (Or a newspaper column.) Now, for doing their jobs (or living their lives) they risk this sexualized abuse, on top of all the other online hate.
Suzie Dunn, assistant professor at Dalhousie University’s Schulich School of Law and interim director of its Law and Technology Institute, called this a “really sad moment” this week.
“It’s going to silence women and pull them away from being in public spaces,” she told me.
Yes.
“When you normalize this type of sexual abuse, it just contributes to the larger normalization of violence against women and girls.”
Elon Musk initially tried to laugh this off, reposting a photo of a bikini on a toaster on Jan. 2. Hardy har har. No doubt he’s got a large cyber-posse who think people like me are too uptight and need to just chill. Can’t we take a joke?
There is nothing funny, nothing, about taking real images of women (or men) and turning them into fake nudes for the world to see. This nakedly hostile act is nothing to laugh at. It is harassment.
It doesn’t end with swimsuits either. There have been requests to put swastikas on those bikinis, to add cigarette burns, bruises or blood to these images of women.
Also observed: Grok was asked to put a victim of Switzerland’s New Year’s fire into a bikini – and, this is truly sick, to do the same to the corpse of the woman shot by ICE in Minneapolis this week.
“Thanks for pointing this out,” the Grok account responded to criticism of this on Thursday, acknowledging that the image came from the ICE incident. “I generated a modified version without recognizing the context, which was inappropriate. I’ll aim to better detect sensitive content in requests.”
“I.”
With outrage building, Mr. Musk has changed his tune somewhat, at least publicly, although he appears to blame users, rather than his platform. “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content,” he posted on Jan. 3.
Various governments are rattling their regulatory sabres, but what shouldn’t be complicated – it’s wrong, full stop, to non-consensually depict real people in the nude, even if it’s a deepfake nude – becomes mired in a web of different legal jurisdictions.
Perhaps governments will reconsider using X as the go-to platform for communication. On Wednesday, Britain’s House of Commons women and equalities committee said it would no longer do so, as it works to prevent violence against women and girls.
Maybe the silver lining of this horror show will be X finally being delegitimized – long after it was stripped of its actual legitimacy.