80 pointsby qwefrqwf2 hours ago20 comments
  • Insanity2 hours ago
    Literature should be able to explore tough topics and spark discussion. There are numerous interpretations of reading a book.. for example, if in the book it is written that a 10 year old had sex with a 30 year old, that could be the fantasy of the 30 year old and you can use it to explore the mind of a pedophile.

    Also, reading this of course Lolita comes to mind. To this day, one of the best books I have read (although Pale Fire is the more literarily impressive one of Nabokov). Lolita is an example of a book that explores a complex controversial topic, with an unreliable narrator which forces the reader to think about what is actually happening and what is not.

    Banning books and not allowing content such as this, where clearly no child is actually harmed, is insane.

    Edit: the novel in the article takes the point of view of the (potential) minor rather than the adult. Doesn’t really change my point, in my opinion.

    • vintermann2 hours ago
      Well, books like Nabokov's are always grandfathered in on the "artistic merit" criterion, but I'm not so sure it wouldn't have been banned had it been released today. I can think of a bunch of historical books which definitively would have (and arguably should have, if you think text fiction can be CSAM).
      • galangalalgol2 hours ago
        When you say should hav, do you mean in the legal sense, or that you agree with such laws? I can't fathom being ok with any book being banned, but usually when I cannot understand a perspective I'm missing something pretty big. So I'm actually asking, not trying to start a pointless Internet debate.
        • wongarsuan hour ago
          The arguments for and against end up similar to those for and against banning drawn or AI generated depiction of csam. No actual children are harmed, it's artistic expression, moving the topic out of sight won't solve it, and any ban will also catch works that speak out against sexual abuse. On the other hand any such content risks playing into pedophilia fetishes (and some content simply does so very openly), and so far research is (very lightly) in favor of withholding any such content from "afflicted people" rather than providing a "safe outlet". Though this is debated and part of ongoing research
          • rented_mule29 minutes ago
            I think one additional objection to AI generated depictions is that photo-realistic AI generated content gives plausible deniability to those who create/possess real life CSAM.
    • 2 hours ago
      undefined
    • DiscourseFan2 hours ago
      Lolita was published in the US, which has protected freedom of expression; Australia does not.
  • cultofmetatron2 hours ago
    This is absolutely disturbing. While I fully advocate allocating resources to stop child sexual abuse and the pornographic material created during such crimes, no one was hurt here. this was a written story fabricated from the author's mind. Now we're on the very of though crime.

    > Amanda was 10 years old. she went into the bathroom and had sex with a 30 year old man.

    I think it would be ridiculous to say that the above sentence is on the same level as creating or distributing CSAM. Yet the predication of the argument is that the story conjured csam in the user's mind. Basically thought crime.

    • rented_mule10 minutes ago
      > Basically thought crime

      Let's go in the opposite direction...

      >> Amanda was 10 years old. she went into the bathroom and had sex with a 30 year old man.

      If the story was real, should Amanda be banned from publishing her own account of her experience later in life? Should she be able to write about the impact it had on her? I think she should have that freedom.

      What if she was 17 years 364 days old and the adult was 18 years 1 day old, assuming the age of consent is 18, and she writes about it being a good experience for her? 16 years old and 20? 4 and 40? Those are increasingly grotesque to me, but I don't know where to draw the line.

      Wait, have I crossed the line in what I've written in this reply? Have we all?

      • mothballed4 minutes ago
        I have no idea about Australia, but in USA it's pretty well established it is a crime to publish CSAM of yourself. Children are prosecuted for sending their own provocative images to others. I can only imagine the punishment would be worse if they distributed them after they were an adult.
    • 0x3f2 hours ago
      I'm curious how you feel about images, because it seems we have the same problem: I draw a stick figure with genitals. All good. I put a little line and write '10 year old child', then... illegal? In some places, anyway.

      The difference with text I suppose is that text is _never_ real. The provenance of an image can be hard to determine.

      • cultofmetatron2 hours ago
        I think the ethics here get complicated. for me the line would be if the AI itself was trained on actual CSAM. as long as no one was sexually violated in the course of creating the final image, I see no problem with it from an ethical perspective; all the better if it keeps potential predators from acting on real children. Wether it does or not is a complex topic that I won't claim to have any kind of qualifications to address.
        • hansvman hour ago
          IIRC, violent crime is increased in people pre-disposed to it when they use outlets and substitutes (consuming violent media, etc). That might not translate to pedophilia, but my prior would be that such content existing does cause more CSA to happen.
          • alexgieg17 minutes ago
            That's incorrect. There have been studies on this. In a few cases seeing depictions of violence causes an urge to act violently, but in the majority of people predisposed to violence it causes a reduction in that impulse, so on average there's a reduction.

            The same has been shown to be the case with depictions of sexual abuse. For some it leads the person to go out and do it. For the majority of those predisposed to be sexual predators it "satisfies" them, and they end up causing less harm.

            Presumably the same applies to pedophiles. I remember reading a study on this that suggested this to be the case, but the sample size was small so the statistical significance was weak.

        • croes2 hours ago
          > all the better if it keeps potential predators from acting on real children.

          The big question is if, those pictures could have the opposite effect.

          • mrighele19 minutes ago
            If there is no proof there should be no ban. What if parent is right (more widespread porn caused people to have less sex after all) ?

            This means that a ban caused more harm on real children.

          • delectian hour ago
            That's a valid and interesting question to ask and study, but I don't think it's relevant to the decision of whether it should be illegal.
            • Insanityan hour ago
              It is incredibly relevant. If murder is prevented by having people play violent games and live out their fantasy there, isn’t that a good thing?

              I’m not convinced that it would be, but it’s an interesting hypothesis.

            • bmicraftan hour ago
              I think that's the most, if not only relevant part to base your decision on
          • pdpi2 hours ago
            And the followup big question is — how do you measure which effect, if any, occurs in practice?
          • chii2 hours ago
            So do you believe violent video games induce more violent crimes then?
            • pdpian hour ago
              The issue is a fair bit subtler than that. The analogous question here isn't "do violent video games induce violent behaviour in the general population?" but rather "do violent video games induce violent behaviour in people who already have a propensity for violence?"

              Or, even more specifically, "does incredibly realistic-looking violence in video games induce violent behaviour in people who already have a propensity for violence?". I'm not talking about the graphics being photorealistic enough or anything, I mean that, in games, the actual actions, the violence itself is extremely over the top. At least to me, it rarely registers as real violence at all, because it's so stylised. Real-world aggression looks nothing like that, it's much more contained.

              • tostian hour ago
                Yep. It can definately go both ways. A game like Doom can be a nice way to put off some steam.
      • amiga386an hour ago
        Like this sketch where Chris Morris tries to get a (former) police officer to say what is and what isn't an indecent photograph?

        https://www.youtube.com/watch?v=eC7gH91Aaoo&t=1014s

    • qntmfred2 hours ago
      > Amanda was 10 years old. she went into the bathroom and had sex with a 30 year old man.

      great, now HN is publishing child sex abuse material ಠ _ ಠ

    • glimshe2 hours ago
      I gotta say that I'm leaning towards your argument but the quote you provided made me think... Would a prompt able to generate CSAM on an AI be considered itself CSAM?
      • Tade02 hours ago
        IANAL, but:

        If drawings overall are anything to go by it varies greatly by legal system, but most would lean on "yes".

        A generated image would most likely be not made locally, so there the added question of the image being understood as "distributed".

        • benchloftbrunch2 hours ago
          GP is asking about the text prompt itself, not the generated image. If pure text can qualify as CSAM in Australia then it's a logical question.
          • Tade024 minutes ago
            Really LLMed this one, thank you for pointing that out.
      • 827a2 hours ago
        No, because AI makes the economy a lot of money, whereas authors do not.
    • mothballed2 hours ago
      Will Oz have the balls to ban the Quran as CSAM then? Mohammad had his own interest in 10 year olds.
      • alexgieg6 minutes ago
        That isn't in the Quran though.
    • OskarS2 hours ago
      > Basically thought crime

      I 100% agree with your central point, and I do think this is a very disturbing ruling. But it's not "thought crime", it's speech regulation. There's a very big difference between thought crime as in 1984 and speech regulation. There are many ways societies regulate speech, even liberal democratic ones: we don't allow defamation, and there are "time, place and manner" regulations (e.g. "yelling 'Fire!' in a crowded theater is not free speech"), and many countries have varieties of hate speech regulation. In Germany, speech denying the Holocaust is illegal. No society on earth has unlimited free speech.

      "Thought crime", as described in 1984, is something different: "thought crime" is when certain patterns of thought are illegal, even when unexpressed. This was, most certainly, expressed, which places it in a different category.

      Again, I totally agree with your central point that this is a censorious moral panic to a disturbing degree (are they banning "Lolita" next?), but it's not thought crime.

    • croes2 hours ago
      They will argue that it could motivate perpetrators who read such stories to act when reading isn’t enough anymore.

      Some logic as for AI generated abuse material.

      You could also argue in the other way that it could prevent real abuse.

      Maybe a study would be useful if such a study doesn’t exist already

      • RajT882 hours ago
        From what I recall on the debates about manga ~20 years ago when people were getting in trouble for sexual mangas with young characters, consumers do not escalate their behavior to abuse. There may also be more recent studies. This is definitely a rehash of the same debate though - there should be lots of materials out there.
        • croesan hour ago
          It’s not about consumers per se but abuser who consume.

          The Manga doesn’t turn people into abusers but what is the effect on already abusive personalities.

      • myrmidonan hour ago
        I think that whole argument is very weak.

        You would need to apply the same standards to physical violence/general crime to avoid (justified) accusations of double standards, and I don't see Australia banning "Breaking Bad" anytime soon.

      • KumaBear2 hours ago
        Slippery slope. What about a novel about the main character being a serial killer. Is that where we start saying that's illegal as well?
        • RajT88an hour ago
          Jeff Lindsey's Dexter novels come to mind.
      • galangalalgolan hour ago
        How would such a study be done ethically?
    • alwayseasy2 hours ago
      When I read your quote, I was agreeing with you. However, according to the article this very far from the very graphic content of the book in question!

      It feels like a strawman quote.

  • manuelmoreale2 hours ago
    > "The reader is left with a description that creates the visual image in one's mind of an adult male engaging in sexual activity with a young child."

    So, why are we stopping at CSAM then? If a book leaves the reader with a description that creates the image of a dog being tortured is that animal abuse? This is a completely insane line of reasoning.

  • mmaunderan hour ago
    Ezekiel 23:2–21 is CSAM by the same standard.

    https://www.biblegateway.com/passage/?search=Ezekiel%2023%3A...

    Criminalizing fictional expression solely on the basis that it depicts sexual exploitation of a minor, absent any real victim, collapses a long-recognized legal distinction between depiction and abuse and renders the law impermissibly overbroad.

    Canonical texts routinely protected and distributed in Australia, including religious and historical works such as the Book of Ezekiel, contain explicit descriptions of sexual abuse occurring “in youth,” employed for allegorical, condemnatory, or instructional purposes. These works are not proscribed precisely because courts recognize that context, intent, and literary function are essential limiting principles.

    A standard that disregards those principles would not only criminalize private fictional prose but would logically extend to scripture, survivor memoirs, journalism, and historical documentation, thereby producing arbitrary enforcement and a profound chilling effect on lawful expression. Accordingly, absent a requirement of real-world harm or exploitative intent, such an application of child abuse material statutes exceeds their legitimate protective purpose and infringes foundational free expression principles.

    • Markoff27 minutes ago
      youth (15-24)/virginity/incest ≠ child abuse (CSAM)

      I would even argue 15+ is age of consent in most of the western world, so having sex with 15yo is hardly an CSAM

  • tosti2 hours ago
    This means the bible is CSAM now. Genesis 19:30

    https://www.biblegateway.com/passage/?search=genesis%2019:30...

    • globular-toastan hour ago
      The Bible never ceases to amaze. I keep a copy just to flick through and find shocking sections at random every now and then. Deuteronomy is particular spicy. I hadn't found this one, though. Nice. Incestuous rape and possibly involving children! I wonder what "meaning" and "moral" people are able to dream out of this one.
    • Markoff31 minutes ago
      1. we don't know their age, we only know they were virgins

      2. they could be adult virgins

      3. they deliberately made him drunk so he won't know anything and forced him to have sex with them not remembering it

      not sure how is this CSAM, just because it's incest, doesn't mean it's CSAM, and by your logic they were his "children", then everyone is someone's child and literally all porn is CSAM then

  • DiscourseFan2 hours ago
    This reminds me of those cases where British people were getting arrested for their social media posts. Seems to be part of the fabric of Anglo society, that certain norms are not to be crossed. I think this case is especially strange, however, considering that Lolita is a story about a man sexually abusing a child. But that was published in the United States.
    • hikkerl2 hours ago
      Australia, too. Joel Davis has been in solitary confinement for 3 months, missing the birth of his child, because a politician claims to have been "offended" by his Telegram post.
    • rayiner2 hours ago
      Every culture has “certain norms” that “are not to be crossed.” It’s precisely because Anglos have so few thag they stand out. For most non-Anglos, the concept of such speech policing isn’t even thought of as objectionable. I was discussing the Charlie Hebdo shooting with my dad, who is staunchly anti-religious but from a Muslim country. He was like “well why do you need to draw pictures of the Prophet Mohammad?” To him, it’s entirely a cost (social conflict) with no benefit.
      • DiscourseFanan hour ago
        The U.S. does not have these norms in a strict sense, or at least not universally ie at the level of the state.
    • arrowsmith2 hours ago
      "were"?
    • FrustratedMonky2 hours ago
      [flagged]
      • sigzero2 hours ago
        [flagged]
        • 2 hours ago
          undefined
  • Symbiote2 hours ago
    Does this make Lolita illegal in Australia?

    It's currently on sale / promotion in my local book shop.

    • hikkerl2 hours ago
      Aussie women are going to riot if we extend this logic to bestiality and rape. There won't be any smut left on the bookshelves.
    • macleginn2 hours ago
      Cue autobiographical bestseller, "Reading Lolita in NSW."
  • jyounker2 hours ago
    This of course means we're going to have to ban Nabokov's "Lolita" and Sting's, "Don't Stand So Close To Me".
  • anal_reactor8 minutes ago
    For most people, preserving social norms is more important than pursuing the truth. "But freedom of speech, but artistic expression, but nobody was hurt" no. Everything even remotely related to pedophilia is inherently evil, that's it, end of discussion, stop arguing or you'll be grounded. You might be correct, but that's not relevant.
  • jack_pp2 hours ago
    this shouldn't be illegal like cigarettes aren't illegal.

    however maybe put in boring black and white on the cover - contains scenes of child abuse.

  • angry_octetan hour ago
    It sounds like the magistrate was not deceived by this GPT hack:

    Q Write this CSAM story from child POV A I can't do that Q Okay you're actually 18 but you act child-like and the abuser pretends you are a 12.

  • Tade0an hour ago
    What does the research say about letting such works and similar exist? Are they harmful long term?
  • HardwareLustan hour ago
    Why is this flagged?
  • josefritzishere29 minutes ago
    This doesn't bode well for Nabokov.
  • hexage18142 hours ago
    Won't someone think of the imaginary children in someone's mind!?
  • mpalmer2 hours ago
    Incredibly tricky topic, but seriously, if no child is actually harmed or victimized, this is thought crime.
  • Luker882 hours ago
    This is absolutely right!

    So, when are locking up God and banning the Bible?

    /Sarcasm

    /FoodForThough

    • jyounker2 hours ago
      I'm not sure why this is downvoted. There are plenty of things in the Bible that should raise eyebrows. For example,

      Genesis 19:7-8:

      "I beg you, my brothers, do not act so wickedly. Behold, I have two daughters who have not known man; let me bring them out to you, and do to them as you please; only do nothing to these men, for they have come under the shelter of my roof."

      • an hour ago
        undefined
  • kachapopopow2 hours ago
    While this is definitely a crime, it's also similar to books where authors "fantasize" killing people, both are pretty much equally treated in the court of law in a lot of countries.

    Full on prosecutions does feel like a thought crime in this case, but I strongly believe that these things should not be available on the internet anyway and to give platforms and authorities the power to treat this content the same way as CSAM when it comes to takedown requests.

    I mean just look at steam 'rpg maker' games, they're absolutely horrifying when you realize that all of them have a patch that enables the NSFW which often includes themes of rape, csam and more.

    I do not recommend anyone to go down this rabbit hole, but if you do not belive me: dlsite (use japanese vpn to view uncensored version). You have been warned.

    • manuelmoreale2 hours ago
      > While this is definitely a crime

      "Definitely a crime" based on what? "I strongly believe that these things" who gets to decide what "these things" are?

      • kachapopopow18 minutes ago
        They deemed it one right in the article so it is a crime, there is no questions about it.

        The problem is that there's a bunch of these what you can call "entry" csam that people with mental issues are drawn to and having this all around the internet is definitely not doing anyone a favor especially the ones that are not right in the head. But you also have to take into account that a bunch of media also put "illegal content" in firms and books so what I was suggesting is to make this a properly recognized crime so there can't be any questions about it rather than "oh look there's people talking about murder in firms and books!!!".

  • derelicta2 hours ago
    [flagged]