
AI-generated child sexual abuse images could flood the internet. A watchdog is calling for action

114 comments
  • I am sort of curious, because I don't know: of all the types of sexual abuse that happen to children, i.e. being molested by family or acquaintances, being kidnapped by the creep in the van, being trafficked for prostitution, abuse in church, etc. etc... in comparison to these cases, how many cases deal exclusively with producing imagery?

    Next thing I'm curious about: if the internet becomes flooded with AI generated CP images, could that potentially reduce the demand for RL imagery? Wouldn't the demand-side be met? Is the concern normalization and inducing demand? Do we know there's any significant correlation between more people looking and more people actually abusing kids?

    Which leads to the next part: I play violent video games and listen to violent aggressive music and have for many years now and I enjoy it a lot, and I've never done violence to anybody before, nor would I want to. Is persecuting someone for imagining/mentally roleplaying something that's cruel actually a form of social abuse in itself?

    Props to anybody who asks hard questions btw, because guaranteed there will be a lot of bullying on this topic. I'm not saying "I'm right and they're wrong", but there's a lot of nuance here, and people here seem pretty quick to hand the govt and police incredible powers for... I dunno... how much gain, really? You'll never get back rights that you throw away. Never. They don't make 'em anymore these days.

  • Normally I err on the side of 'art' being separated from actual pictures/recordings of abuse. It falls under the "I don't like what you have to say, but I will defend your right to say it" idea.

    Photorealistic images of CP? I think that crosses the line, and it needs to be treated as if it were actual CP, as it essentially enables real CP to proliferate.

    • I keep seeing people post this same idea, and I see no proof that it would actually happen.

      Why would you need "real" CP if there's like-for-like-quality AI CP out there?

      Also, aside from going out of our way to wreck the lives of individuals who look at the stuff, are there any actual concrete stats that say we're preventing any significant amount of RL child abuse by giving up rights to privacy or paying FBI agents to post CP online and entrap people? I don't get behind the "if it theoretically helped one single child, I'd genocide a nation" BS. I want to see what we've gained so far from these policies before I agree to giving the govt more power by expanding them.

    • Photorealistic images of CP? I think that crosses the line, and needs to be treated as if it was actual CP as it essentially enables real CP to proliferate.

      While I absolutely don't want to sound like I'm defending the practice (because I'm not), I'm really not too sure of this. If this was true, would similar logic apply to other AI-generated depictions of illegal or morally reprehensible situations? Do photorealistic depictions of murder make it more likely that the people going out of their way to generate or find those pictures will murder someone or seek out pictures of real murder? Will depictions of rape lead to actual rape? If the answer to those or other similar questions is "no", then why is child porn different? If "yes", then should we declare all the other ones illegal as well?

      It's not that I think AI-generated child porn should be accepted, let alone encouraged, by any means, but as was pointed out, it might actually be counterproductive to ruin someone's life over AI-generated material in which there is factually no victim, as reprehensible as the material may be; just because something is disgusting to most of us doesn't mean that's a good justification for making it illegal if there is no victim.

      The reason why I'm not convinced of the argument is that a similar one has been used when eg. arguing for censorship of video games, with the claim that playing "murder simulators" which can look relatively realistic will make people (usually children) more likely to commit violent acts, and according to research that isn't the case.

      I'd even be inclined to argue that being able to generate AI images of sexualized minors might make it less likely for the person to move on to eg. searching for actual child porn or committing abuse, as it's a relatively easier and safer way for them to satisfy an urge. I wouldn't be willing to bet on that, though.

  • 🤖 I'm a bot that provides automatic summaries for articles:

    NEW YORK (AP) — The already-alarming proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos, a watchdog agency warned on Tuesday.

    In a written report, the U.K.-based Internet Watch Foundation urges governments and technology providers to act quickly before a flood of AI-generated images of child sexual abuse overwhelms law enforcement investigators and vastly expands the pool of potential victims.

    In a first-of-its-kind case in South Korea, a man was sentenced in September to 2 1/2 years in prison for using artificial intelligence to create 360 virtual child abuse images, according to the Busan District Court in the country’s southeast.

    What IWF analysts found were abusers sharing tips and marveling about how easy it was to turn their home computers into factories for generating sexually explicit images of children of all ages.

    While the IWF’s report is meant to flag a growing problem more than offer prescriptions, it urges governments to strengthen laws to make it easier to combat AI-generated abuse.

    Users can still access unfiltered older versions of Stable Diffusion, however, which are “overwhelmingly the software of choice ... for people creating explicit content involving children,” said David Thiel, chief technologist of the Stanford Internet Observatory, another watchdog group studying the problem.

    Saved 78% of original text.