Chat I think we’re in the bad place
YouTube removed a channel that was dedicated to posting AI-generated videos of women being shot in the head following 404 Media’s request for comment. The videos were clearly generated with Google’s new AI video generator tool, Veo, according to a watermark included in the bottom right corner of the videos.
The channel, named Woman Shot A.I, started on June 20, 2025. It posted 27 videos, had over 1,000 subscribers, and had more than 175,000 views, according to the channel’s publicly available data.
The videos posted by the channel follow the same basic formula. The nearly photo-realistic videos show a woman begging for her life while a man with a gun looms over her; then he shoots her. Some videos vary the theme, with compilations of video game characters like Lara Croft being shot, “Japanese Schoolgirls Shot in Breast,” “Sexy HouseWife Shot in Breast,” “Female Reporter Tragic End,” and Russian soldiers shooting women with Ukrainian flags on their chests.
“The AI I use is paid, per account I have to spend around 300 dollars per month, even though 1 account can only generate 8-second videos 3 times,” the channel’s owner wrote in a public post on YouTube. “So, imagine how many times I generate a video once I upload, I just want to say that every time I upload a compilation consisting of several 8-second clips, it’s not enough for just 1 account.”
Woman Shot A.I’s owner claimed they have 10 accounts. “I have to spend quite a lot of money just to have fun,” they said.
look, this may be a wild take, but I do not in fact think people should be allowed to share snuff films anywhere on the internet, simulated or not. I mean, I assume you understand this reasoning wouldn’t fly for simulated CSAM, right? Snuff films are not in fact more acceptable than CP.
The two aren’t comparable. It is perfectly acceptable to show people being killed on mainstream TV shows and movies. Child abuse is something that is only ever implied, never depicted directly. One is clearly more taboo than the other.
So if someone puts up a bunch of clips from movies of people being killed should we take those down too?
How long does a clip have to be before it crosses the threshold from being snuff to being a movie scene? What other criteria must it meet?
I mean this seriously, because this is a very real barrier you’re going to run into when dealing with this. I assume you don’t want to prevent depictions of people being killed from appearing in any movie or TV show? Maybe you do? Do you? Real question again. I don’t know, and I want to clarify this is not an argument; I’m not really judging or anything. I’m sincerely engaging.
EDIT: Oh, and we have to talk about video game killing too after the above questions, I guess.
there’s no way to know if the AI used acted movie scenes or actual snuff, it could’ve scraped a police body cam video of a woman getting shot who can’t consent to being in the AI hallucination slop
That’s a genuinely horrifying thought, one that should be used to stop AI companies from mass scraping content, or to force them into some degree of transparency about what they train on.
Just going off the thumbnails the article saved, it looks like schlocky B-movie effects. In particular it looks a whole lot like the famous head explosion from Scanners.
and a few years ago will smith was eating spaghetti from his eyes
That’s nothing, you should’ve seen what he was doing in AI footage
My point was it looks like it’s drawing on a specific sort of dramatic movie effect that’s nicely in focus in the center of a shot, rather than the comparatively subdued or indistinct effects that real footage would impart.
Where do you draw the line between things like grindhouse or other schlocky horror genres and “simulated snuff films”? They’re both doing the same thing, using the same digital or practical effects, to achieve the same crossed-wire response of mixing fear and arousal. Do you want some kind of “you must be backed by a company at least such-and-such big to be considered legitimate, and thus allowed to produce violent fictional content that also involves or goes alongside some sort of sexualization” standard, the way Amazon does for novels? Maybe an “I trust a judge will know it when he sees it” standard?
Fuzzy, subjective lines between “acceptable” storytelling and content that you suspect someone might be enjoying a little too much aren’t good, especially since context can wildly change that. There are people carefully documenting every time someone gets eaten in a movie, with summaries and timestamped links to a clip of it: that’s literally someone getting off to otherwise normal action/horror movies where someone is dying on screen. How do you deal with that context? Retroactively ban the films because you learned someone got off to them? Ban snipping violent scenes specifically, for fear that someone might get off to it?
Like the person the article’s about is shit because he was giving Google money (bad), posting fetish content on YouTube (bad), and was also openly a racist and presumably every other flavor of reactionary bigot too, since those all go hand in hand (extremely bad), not because he was ~~making~~ purchasing highly ritualized fantasy fetish videos involving scary or violent themes.

And since I mentioned the horror genre: yeah, a lot of horror movie writers/directors/producers were/are complete shitbags, but so were/are the writers/directors/producers of a lot of every other genre of movie. The problem is more with men, and particularly men with any sort of influence or status, than with the genre. That makes all the attendant problems entirely a matter of “look at the actual person involved and what they say and do, rather than the sort of content they’re working with, because the worst people are usually doing the blandest stuff, for all that you can find monsters everywhere.” The obvious caveat is that someone making content that seems to sexualize children is an immediate red flag and requires closer scrutiny to differentiate legitimate non-sexualizing content (e.g. fiction about growing up queer and dealing with confusion and repression, or a work criticizing the ways patriarchal society sexualizes and grooms girls toward fulfilling their designated role) from illegitimate sexualizing content (e.g. Made in Abyss, one of the worst things anyone has ever made).