Chat, I think we’re in the bad place
YouTube removed a channel that was dedicated to posting AI-generated videos of women being shot in the head following 404 Media’s request for comment. The videos were clearly generated with Google’s new AI video generator tool, Veo, according to a watermark included in the bottom right corner of the videos.
The channel, named Woman Shot A.I, started on June 20, 2025. It posted 27 videos, had over 1,000 subscribers, and had more than 175,000 views, according to the channel’s publicly available data.
All the videos posted by the channel follow the exact same formula. The nearly photo-realistic videos show a woman begging for her life while a man with a gun looms over her. Then he shoots her. Some videos have different themes, like compilations of video game characters like Lara Croft being shot, “Japanese Schoolgirls Shot in Breast,” “Sexy HouseWife Shot in Breast,” “Female Reporter Tragic End,” and Russian soldiers shooting women with Ukrainian flags on their chest.
“The AI I use is paid, per account I have to spend around 300 dollars per month, even though 1 account can only generate 8-second videos 3 times,” the channel’s owner wrote in a public post on YouTube. “So, imagine how many times I generate a video once I upload, I just want to say that every time I upload a compilation consisting of several 8-second clips, it’s not enough for just 1 account.”
Woman Shot A.I’s owner claimed they have 10 accounts. “I have to spend quite a lot of money just to have fun,” they said.
Where do you draw the line between things like grindhouse or other schlocky horror genres and “simulated snuff films”? They’re both doing the same thing, using the same digital or practical effects, to achieve the same crossed-wire response of mixing fear and arousal. You want some kind of “you must be backed by a company at least so and so big to be considered legitimate and thus allowed to produce violent fictional content that also involves or goes alongside some sort of sexualization” standard, the way Amazon does for novels? Maybe an “I trust a judge will know it when he sees it” standard?
Fuzzy subjective lines differentiating “acceptable” storytelling from content that you suspect someone might be enjoying a little too much aren’t good, especially since the context it sits in can wildly change that. There are people carefully documenting every time someone gets eaten in a movie, with summaries and timestamped links to a clip of it: that’s literally someone getting off to otherwise normal action/horror movies where someone is dying on screen. How do you deal with that context? Retroactively ban the films because you learned someone got off to them? Ban clipping out violent scenes specifically, for fear that someone might get off to it?
Like the person the article’s about is shit because he was giving google money (bad), posting fetish content on youtube (bad), and was also openly a racist and presumably every other flavor of reactionary bigot too since those all go hand in hand (extremely bad), not because he was
making/purchasing highly ritualized fantasy fetish videos involving scary or violent themes.

And since I mentioned the horror genre: yeah, a lot of horror movie writers/directors/producers were/are also complete shitbags, but so were/are the writers/directors/producers of a lot of every other genre of movie too. The problem is more with men, and particularly men with any sort of influence or status, than it is with the genre. That makes all the attendant problems entirely a matter of “look at the actual person involved and what they say and do, rather than the sort of content they’re working with, because the worst people are usually doing the blandest stuff for all that you can find monsters everywhere”, with the obvious caveat that someone making content that seems to sexualize children is an immediate red flag and requires closer scrutiny to differentiate legitimate non-sexualizing content (e.g. fiction about growing up queer and dealing with confusion and repression, or a work criticizing the ways patriarchal society sexualizes and grooms girls towards fulfilling their designated role) from illegitimate sexualizing content (e.g. Made in Abyss, one of the worst things anyone has ever made).