A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

  • CameronDev@programming.dev

    That kinda sounds reasonable. Especially if it can prevent someone going down that rabbit hole. Good job, PH.

  • FraidyBear@lemmy.world

    Imagine a porn site telling you to seek help because you’re a filthy pervert. That’s gotta push some people to get help, I’d think.

  • FinishingDutch@lemmy.world

    Sounds like a good feature. Anything that stops people from doing that is great.

    But I do have to wonder… were people really expecting to find that content on PornHub? That site certainly seems legit enough that I doubt they’d have that stuff on there. I’d imagine most actual content would be on the dark web and specialty groups, not on PH.

    • CameronDev@programming.dev

      PH had a pretty big problem with CSAM a few years ago; they ended up wiping ~2/3 of their user-submitted content to try to fix it. (Note: they wiped all non-verified user-submitted videos; not all of it was CSAM.)

      And I’m guessing they are trying to catch users who are trending towards questionable material: “College”✅ -> “Teen”⚠️ -> “Young Teen”⚠️⚠️⚠️ -> “CSAM”🚔, etc.
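The escalation idea above could be sketched as a toy severity filter. To be clear, the terms, weights, and threshold below are invented purely for illustration; nothing here reflects Pornhub’s actual system.

```python
# Hypothetical sketch: score each search in a session against a severity
# list and warn once the session reaches a threshold. All terms and
# weights are made up for illustration.
SEVERITY = {"college": 0, "teen": 1, "young teen": 3}

def should_warn(search_history, threshold=3):
    """Return True once any search in the session matches a term whose
    severity meets the threshold."""
    worst = 0
    for query in search_history:
        q = query.lower()
        for term, weight in SEVERITY.items():
            if term in q:
                worst = max(worst, weight)  # track worst term seen
    return worst >= threshold

print(should_warn(["college party", "teen yoga"]))   # False
print(should_warn(["college party", "young teen"]))  # True
```

A real system would presumably be far more sophisticated (session history, fuzzy matching, evolving term lists), but the basic shape of "trip a warning as searches trend worse" looks something like this.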

      • FinishingDutch@lemmy.world

        Wow, that bad? I was aware they purged a lot of ‘amateur’ content over concerns regarding consent to upload/revenge porn, but I didn’t know it was that much.

          • azertyfun@sh.itjust.works

            Eeeeeeeh. There’s nuance.

            IIRC there were only a handful of verified CSAM videos on the entire website. It’s inevitable; it happens everywhere with UGC, including on here. Anecdotally, in the years leading up to the purge PH had already cleaned up its act, and from what I saw pirated content was rather well moderated. However, this time the media made a huge stink about the alleged CSAM, payment processors threatened to pull out (they are notoriously puritan; it’s caused a lot of trouble for lemmynsfw’s admins, for instance), and so, regardless of the validity of the initial claims, PH had to do something to win back the trust of payment processors. So they basically nuked every video that did not have a government ID attached.

            Now, if I may speculate a little: one of the reasons it happened this way is probably that, due to its industry position, PH is way better moderated than most (if not all) websites of its size and had already verified a bunch of its creators. At the same time, the rise of OnlyFans and similar websites means that real amateur content has all but disappeared, so there was less and less reason to allow random UGC anyway. The high moderation costs probably didn’t make much sense anymore.

            • root@precious.net

              Spot on. The availability of CSAM was overblown by a well-funded special interest group (Exodus Cry). The articles about it were pretty much ghostwritten by them.

              When you’re the biggest company in porn you’ve got a target on your back. In my opinion they removed all user content to avoid even the appearance of supporting CSAM, not because they were guilty of anything.

              PornHub has been very open about normalizing healthy sexuality for years, while also providing interesting data access for both scientists and the general public.

              “Exodus Cry is an American Christian non-profit advocacy organization seeking the abolition of the legal commercial sex industry, including pornography, strip clubs, and sex work, as well as illegal sex trafficking.[2] It has been described by the New York Daily News,[3] TheWrap,[4] and others as anti-LGBT, with ties to the anti-abortion movement.[5]”

              https://en.wikipedia.org/wiki/Exodus_Cry

              • azertyfun@sh.itjust.works

                They’re the fuckers who almost turned OF into Pinterest as well? Not surprising in retrospect. The crazy thing is how all news outlets ran with the narrative and payment processors are so flaky with adult content. De-platforming sex work shouldn’t be this easy.

    • Ace! _SL/S@ani.social

      It had all sorts of illegal things before they purged everyone unverified due to legal pressure.

    • silasmariner@programming.dev

      were people really expecting to find that content on PornHub?

      Welcome to the internet 😂 where people constantly disappoint/surprise you. (What word is that? Dissurprise? Disurprint?)

    • 520@kbin.social

      So… Pornhub has actually had problems with CSAM. It used to be much more of a YouTube-like platform where anyone could upload.

      Even without that aspect, there are a looot of producers that don’t do their checks well, and a lot of underage actresses that fall through the cracks.

  • Mostly_Gristle@lemmy.world

    The headline is slightly misleading. 2.8 million searches were halted, but according to the article they didn’t attempt to figure out how many of those searches came from the same users. So thankfully the number of secret pedophiles in the UK is probably much lower than the headline might suggest.
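The distinction this comment draws, between a count of searches and a count of people searching, can be shown with a few lines of code. The user IDs and numbers below are made up for illustration; the article gives no per-user breakdown.

```python
# Toy illustration of why a raw search count overstates the number of
# people involved: a few users can generate many searches each.
# These "users" and counts are entirely made up.
searches_by_user = ["user_a", "user_a", "user_a", "user_b", "user_a", "user_b"]

total_searches = len(searches_by_user)     # what a headline would count
unique_users = len(set(searches_by_user))  # what readers tend to assume

print(total_searches)  # 6
print(unique_users)    # 2
```

Without deduplication by user, 2.8 million halted searches could in principle come from a much smaller pool of repeat searchers.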

        • Lemmy@lemm.ee

          Same thing for me when I was 13. I freaked the fuck out when I saw a Wikipedia article on the right. I thought I was going to jail the next day lmfao.

      • Dran@lemmy.world

        I’d think it’s probably not a majority, but I do wonder what percentage it actually is. I do have distinct memories of being like 12 and trying to find porn of people my own age instead of “gross old people” and being confused why I couldn’t find anything. Kids are stupid lol, that’s why laws protecting them need to exist.

        Also good god when I become a parent I am going to do proper network monitoring; in hindsight I should not have been left unattended on the internet at 12.

        • kylian0087@lemmy.world

          I was the same back then, and I came across some stuff that was surprisingly easy to find. Only later did I realize how messed up that was.

          I think monitoring is good, but there’s a fine line not to cross with your child’s privacy. If they suspect anything, they sure know how to work around it, and then you lose any insight.

        • Rinox@feddit.it

          It’s not about laws, it’s about sexual education. Sexual education is a topic that can’t be left to the parents and should be explained in school, so as to give the kids a complete knowledge base.

          Most parents know about sex as much as they know about medicines. They’ve had some, but that doesn’t give them a degree for teaching that stuff.

        • Piece_Maker@feddit.uk

          Sorry I know this is a serious subject and not a laughing matter but that’s a funny situation. I guess I was a MILF hunter at that age because even then I was perfectly happy to knock one out watching adult porn instead!

  • pHr34kY@lemmy.world

    4.4 million sounds a bit excessive. Facebook marketplace intercepted my search for “unwanted gift” once and insisted I seek help. These things have a lot of false positives.
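The false-positive complaint above is a classic failure mode of naive keyword filters. Here is a small sketch of one way it happens; the flagged term is illustrative only and not any platform's real blocklist.

```python
import re

# Sketch of why keyword filters misfire: a bare substring match on
# "teen" also fires on unrelated words like "canteen". The blocked
# term is invented for illustration.
def naive_flag(query):
    return "teen" in query.lower()

def word_boundary_flag(query):
    # \b restricts the match to whole words, cutting false positives.
    return re.search(r"\bteen\b", query.lower()) is not None

print(naive_flag("canteen equipment"))          # True  (false positive)
print(word_boundary_flag("canteen equipment"))  # False
print(word_boundary_flag("teen"))               # True
```

Even with word boundaries, context-free matching (as in the “unwanted gift” example) still produces plenty of false positives, which may be why platforms accept them as the cost of casting a wide net.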

  • Socsa@sh.itjust.works

    Google does this too, my wife was searching for “slutty schoolgirl” costumes and Google was like “have a seat ma’am”

    • prole@sh.itjust.works

      Google now gives you links to rehabs and addiction recovery centers when searching for harm reduction information about non-addictive drugs.

        • gapbetweenus@feddit.de

          Sexuality is tightly connected to societal taboos; as long as everyone involved is a consenting adult, it’s no one else’s business. There is no need for, or benefit in, moralizing people’s sexuality.

          • nickwitha_k (he/him)@lemmy.sdf.org

            To be clear, I absolutely agree. I’m not saying that people are immoral for liking some plaid. It’s just a kind of fetish that seems less “natural” (like spanking or BDSM) and more amplified in popular media, in a parallel to the sexualization of children in response to feminism (see: Brooke Shields’ experience), and that makes it one that I’m not comfortable participating in. But for those whose brains don’t make such associations and who are being safe, sane, and consensual, I wish wonderful, freaky times.

            • gapbetweenus@feddit.de

              I’m not saying that people are immoral for liking some plaid.

              Fetishizing school uniforms worn by children gives some serious Steven Tyler vibes.

              fetish that seems less “natural”

              Sure sounds like you are. And you also sound rather judgy about it. Maybe it’s just a language thing, but at least that’s my impression.

              • nickwitha_k (he/him)@lemmy.sdf.org

                It may well be my communication. The first statement was something of a half-joke at the expense of the rock singer and the normalization of predatory behavior towards minors that he and others engaged in during the height of rock’s popularity, not at the expense of people who like to engage in age-play.

                I am very accepting of others’ kinks and do not judge individuals for activities that are safe, sane, and consensual. Accepting the people and their ethically-sound activities does not mean that one cannot have preferences and perceptions on the activities themselves. Our preferences and perceptions are shaped to a degree (large or small) by our experiences. Mine are most definitely colored to a significant degree by my own early childhood trauma, which makes anything approaching age-play, power-play, and CNC, even just by indirect association in my own thought processes, uncomfortable and unsexy to me.

                I also find scat-play pretty disgusting (tbf, that’s probably part of the kink for some) but, I’m not going to turn someone away, unless they’ve not showered since their last session.

                • gapbetweenus@feddit.de

                  Our preferences and perceptions are shaped to a degree (large or small) by our experiences. Mine are most definitely colored to a significant degree by my own early childhood trauma, which makes anything approaching age-play, power-play, and CNC, even just by indirect association in my own thought processes, uncomfortable and unsexy to me.

                  Even if it means nothing from an internet stranger, sorry to hear you had traumatic childhood experiences. Makes sense that you are uncomfortable with said practices.

                  I also find scat-play pretty disgusting (tbf, that’s probably part of the kink for some)

                  We can agree on something here.

          • r3df0x ✡️✝☪️@7.62x54r.ru

            It’s still weird to sexualize children. It’s less weird when it’s teenagers and everyone is of age but it’s a weird thing to engage in constantly.

            • gapbetweenus@feddit.de

              It’s sexualizing children in the same way that daddy porn sexualizes incest: you’re taking fantasies at face value without looking into what’s actually going on.

  • _cnt0@sh.itjust.works

    Non-paywall link: https://web.archive.org/web/20240305000347/https://www.wired.com/story/pornhub-chatbot-csam-help/

    There’s this lingering implication that there is CSAM at Pornhub. Why bother with “searches for CSAM” if it does not return CSAM results? And what exactly constitutes a “search for CSAM”? The article and the linked one are incredibly opaque about that. Why target the consumer and not the source? This feels kind of backwards and like language policing without really addressing the problem. What do they expect to happen if they prohibit specific words/language? That people searching for CSAM will just give up? Do they expect anything beyond them changing the used language and go for a permanent cat and mouse game? I guess I share the sentiments that motivated them to do this, but it feels so incredibly pointless.

    • TheBlackLounge@lemm.ee

      Lolicon is not illegal, and neither is giving your video a title that implies CSAM.

      That raises the question: what about pedophiles who intentionally seek out simulated CP to avoid hurting children?

        • CaptainEffort@sh.itjust.works

          Which is, imo, pretty dumb. If it gives these people an outlet that literally hurts no one, I say they should be allowed to use it. Without it they’ll just go to more extreme lengths to get what they need, and as such may go to places where actual real life children are being abused or worse.

          So while it’s still disgusting and I’d rather not think about it, if nobody’s being hurt then it’s none of my business. Let them get out their urges in a safe way that doesn’t affect anybody else.

          • afraid_of_zombies@lemmy.world

            I imagine the concern is that it would look identical to the real thing, which blurs the lines. Kinda like how governments really hate it when toy makers make toy guns look too real, and why I have to tell airport security that I would like my bag searched now, since there are homemade-looking electronic devices in it.

            I guess in theory some government could make a certification system, where legal simulated CP has some digital watermark or something. But that would involve the government paying someone to review child porn for a living, which is a hard sell to taxpayers and a hard role to fill. Maybe the private sector would be willing to do it, but that is a big ask.

            I am not sure I agree with you or disagree with you. Maybe all of us would be better off if there is a legal and harmless way for pedos to get what they want. Or maybe it is bad to encourage it at all even in a safe way, like if they consume that stuff it will make them more likely to seek out real children.

            It definitely isn’t a great situation. It would be great if the condition could be cured some day.

            • YarHarSuperstar@lemmy.world

              This covered a lot of my concerns and thoughts on the topic. I want these people to be able to seek help, and possibly even have a legal outlet that harms no one, not even someone who has to view that stuff for a living (so maybe we get AI to do it? I don’t know). It’s complicated, but I believe it’s similar to an addiction in some ways and should be treated as a health issue, assuming they haven’t hurt anyone and want help. This is coming from someone with health issues, including addiction, and also someone who is very empathetic and sympathetic to any and all struggles of folks who are just trying to live better.

              • afraid_of_zombies@lemmy.world

                I can’t even imagine the amount of money it would cost for someone to pay me to watch and critique child porn for a living. I have literally been paid money in my life to fish a dead squirrel that was making the whole place stink out from underneath a trailer in July, and I would pick doing that professionally over watching that filth.

      • Clbull@lemmy.world

        Depends on the jurisdiction. Indecent illustrations and ‘pseudo-photographs’ depicting minors are definitely illegal in the UK (Coroners and Justice Act 2009). Several US states are also updating their laws to clamp down on this too.

        I’m also aware that it’s illegal in Switzerland because a certain infamous rule 34 artist fled his home country to evade justice for that very reason.

      • archomrade [he/him]

        I imagine high exposure (for individuals who are otherwise not explicitly searching for such material) could inadvertently normalize that behavior IRL.

        • CaptainEffort@sh.itjust.works

          Like how video games supposedly normalize violence? Are you going to go shoot a bunch of people because GTA exists?

          Ffs guys what year is this? Thought we were past this silly mindset.

          • archomrade [he/him]

            Deciding that you’re going to pull someone out of their car and clap them with a rocket launcher has a significantly higher situational barrier than finding yourself in a close relationship with a child who trusts you enough that you can abuse it in a moment of impulse.

            • CaptainEffort@sh.itjust.works

              You think abusing a child is easier than, say, punching someone in the face as you would do in video games?

              Dude if you genuinely think that I’d recommend reaching out to someone…

              In all seriousness tho, way to take the most extreme video game example possible to dismiss my point. Video game violence can have an extremely low “situational barrier”, but that doesn’t mean that video games will make you do those things.

              • archomrade [he/him]

                Nobody is saying that porn ‘makes you’ do or become anything. But the stories told in video games are clearly fiction in form and content: you’re a soldier in the future fighting aliens, you’re a member of an elite group of time-traveling assassins, you’re an aspiring ex-convict with unlimited lives and pockets that carry an entire arsenal of weapons in a tank top and shorts. Porn, by contrast, is written to make the fantasy seem just plausible enough that you can place yourself in as the subject (which is why the situations in pornography are always so contrived).

                The situations wherein you might plausibly choose to sexually exploit a child aren’t nearly as implausible as ones where you could violently assault someone without immediate risk and consequence. Just look at how often porn dialogue waves away the likely objections: “we’re not actual siblings, you’re just my STEP brother”, “I won’t tell anybody”, “I just turned 18, I don’t want to be the only virgin in college”, etc.

        • _cnt0@sh.itjust.works

          Like exposure to gay people and gay content makes you gay? (/s if it wasn’t obvious)

          • squid_slime@lemmy.world

            No, very different. But if someone hasn’t come out, then having gay media around will normalize being gay, and I’d assume they could come out with less stigma. But this is a painfully ignorant and insulting comparison.

            • _cnt0@sh.itjust.works

              but this is a painfully ignorant and insulting comparison

              Only if you condemn the disposition and not its unacceptable form of execution. From where I stand, being attracted to children is as acceptable as men being attracted to men; abusing children is as unacceptable as men raping men. If it is, in your book, fine to condemn pedophiles for being pedophiles, then Christian fundamentalists are totally fine hating homosexuals for being homosexual. Don’t get me wrong: I’m neither condoning nor encouraging the (sexual) abuse of children. Unlike you, I’m just not a hypocrite about different sexual orientations/preferences that nobody chooses. The only qualitative difference is that in one case one side cannot consent and needs better protection from society. The only point I am (consistently) trying to make here is that I find it highly dubious that the measures described in the article have any impact on said required protection, and that the article completely fails to provide any shred of evidence, or even an indication, that they do.

              • archomrade [he/him]

                TW: discussions about sexual abuse

                If it is, in your book, fine to condemn pedophiles for being pedophile, then christian fundamentalists are totally fine hating homosexuals for being homosexual.

                Fetishizing an abusive sexual behavior is not the same as same-sex attraction. We would be having the same conversation if we were talking about rape porn between adults: it’s the normalization of the abusive behavior that we’re primarily concerned with, not the ethics of watching simulated abuse in general.

                While I don’t believe that banning simulated material would be helpful, it is completely reasonable to suggest that cautioning individuals about the proximity of their search to material that is illegal - and the risks associated with consuming it - would be preventative against future consumption.

                Especially considering Pornhub is only placing cautions around that material and isn’t removing that content generally. It’s hard to read your objections as anything other than pedophilia apologia.

                • _cnt0@sh.itjust.works

                  Being attracted to an abusive sexual behavior is not the same as being attracted to a consenting behavior between adults.

                  And I did not even hint at anything even close to the contrary.

                  We would be having the same conversation if we were talking about rape porn between adults: […]

                  Which is exactly the comparison I made.

                  […] it’s the normalization of the abusive behavior that we’re primarily concerned with, not the ethics of watching simulated abuse in general.

                  I wasn’t talking about the normalization of anything anywhere. You inject a component that wasn’t the subject of our conversation, to defend a point I wasn’t questioning (a red herring).

                  While I don’t believe that banning simulated material would be helpful, […]

                  Another topic which we could discuss, but which - again - you just injected.

                  […]it is completely reasonable to suggest that cautioning individuals about the proximity of their search to material that is illegal - and the risks associated with consuming it - would be preventative against future consumption.

                  And again: I’m asking for qualitative and quantitative proof of that. It is the one and only thing I was and am questioning about the article.

                  Especially considering Pornhub is only placing cautions around that material and isn’t removing that content generally.

                  The point to our discussion being what?

                  It’s hard to read your objections as anything other than pedophilia apologia.

                  You seem to have major trouble with text comprehension and staying on track with discussions.

                • Gabu@lemmy.world

                  Minor complaint: try to get an empty paragraph between the spoiled text and the non-spoiled text whenever possible - makes it easier to read.

                  Regarding the discussion, you’re both right at the end of the day. Limiting exposure to illegal and immoral-adjacent material is obviously in society’s interest, but at the same time the implication that a glorified ad for a mental illness helpline is a good solution is ludicrous - it’s at the absolute bottom of the barrel when it comes to the kinds of issues we should be working on.

              • squid_slime@lemmy.world

                Pedophilia is usually caused by a neurological disorder or a power fantasy. Would you call rape a sexual orientation? It’s a preference at best, and it’s not a sexual orientation, since orientation is tied to gender, not age.

                As for condemning pedophiles: I don’t condemn them unless they act on their urges. I do, however, fully support their seeking help.

                • _cnt0@sh.itjust.works

                  Would you call rape that isn’t happening rape?

                  As for condemning pedophiles: I don’t condemn them unless they act on their urges.

                  Up until this point everything you said read exactly like you would. Seems we’re finally on the same page?

            • Schadrach@lemmy.sdf.org

              How so? If CP and things adjacent to it (drawn stuff, “teen” porn, Catholic schoolgirl outfits, etc.) are going to promote and encourage people to molest children, why wouldn’t gay porn promote and encourage homosexuality?

              Like, this is one of those things that feels a lot like picking and choosing based on preference. I suspect violence in media being a historic right-wing talking point is the only reason it’s not on the bad list along with sexy women and loli stuff.

              • squid_slime@lemmy.world

                this is an entirely different discussion. My point and issue is with the comparison being in poor taste, like I said previously I’d be equally annoyed if someone made a comparison with heterosensuality and beastiality one is normal and the other is morally wrong.

                Edit: my mistake I thought you replied to a different comment.

                We are products of our environment. I do believe that we are affected by the things around us; I’d imagine we’d have a lot more pedophiles if CP was on TV. Look at any industry built on abuse: people don’t go in thinking they’ll be the bad guy and fuck up someone’s day, they themselves are introduced to it through their environment.

          • afraid_of_zombies@lemmy.world
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            3
            ·
            4 months ago

            Not exactly a fair analogy. First off, it is willful exposure to CP, not incidental. Secondly, the concern isn’t that someone is oriented towards children; the concern is the action. We can’t, and should never, attempt to police a person’s mind. We can, however, as a society demand that adults don’t rape kids. Homosexuality is not the same: the vast majority of western society is fine with the action. So even if you could demonstrate a link between watching gay porn and being more willing to have gay sex, it wouldn’t matter.

          • archomrade [he/him]
            link
            fedilink
            English
            arrow-up
            7
            arrow-down
            8
            ·
            4 months ago

            I’m going to go ahead and treat this as if it’s an earnest comparison because there shouldn’t be any room for ambiguity:

            Fuck right off with that analogy. Pedophilia and the sexual behaviors that result from it are immensely damaging to children, who cannot meaningfully consent to sexual relationships, whereas the sexual behaviors between consenting adults are not.

            I don’t really care if you were speaking in jest. If you were, I’d recommend you delete that comment before someone takes it seriously.

            • _cnt0@sh.itjust.works
              link
              fedilink
              English
              arrow-up
              8
              arrow-down
              6
              ·
              edit-2
              4 months ago

              Yah … I already answered that: https://sh.itjust.works/comment/9541949

              but this is a painfully ignorant and insulting comparison

              Only if you condemn the disposition and not its unacceptable form of execution. From where I stand, being attracted to children is as acceptable as men being attracted to men. Abusing children is as unacceptable as men raping men. If it is, in your book, fine to condemn pedophiles for being pedophiles, then christian fundamentalists are totally fine hating homosexuals for being homosexual. Don’t get me wrong, I’m neither condoning nor encouraging the (sexual) abuse of children. Unlike you, I’m just not a hypocrite about different sexual orientations/preferences that nobody chooses. The only qualitative difference is that in one case one side cannot consent and needs better protection by society. The only point I am (consistently) trying to make here is that I find it highly dubious that the measures described in the article have any impact on said required protection, and that the article completely fails to provide any shred of evidence, or even an indication, that they do.

              • archomrade [he/him]
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                8
                ·
                4 months ago

                Pedophilia is defined by its “unacceptable” (what a strange way of spelling ‘abusive’) behavior; homosexuality is not.

          • barsoap@lemm.ee
            link
            fedilink
            English
            arrow-up
            3
            arrow-down
            6
            ·
            edit-2
            4 months ago

            Only a very, very small percentage of paedophiles are exclusive paedophiles. This is more like a bi person becoming more gay (or straight) by exposing themselves to more gay (or straight) porn. People can focus in on particular aspects of their sexuality or ignore others, and that’s before fetishisation comes into play where the mind projects sexual meaning onto stuff that’s not primitively (as in instinctively) sexual.

            Yes. Even if you’re a 110% straight dude, if you set your mind to it, with enough practice, you can learn to enjoy sucking dick, or at least having your dick sucked by a cute femboy. At the same time, mere exposure to gay porn doesn’t do the same, and that’s not a contradiction: your usual 110% straight dude has no interest whatsoever in setting his mind to learning how to enjoy sucking dick. There’s neither inclination nor reason to; the porn is just going to go straight past him. 90% straight? Much more likely. Neither is going to lose his original attraction to women, though; the most you get is nothing happening on that front because he’s occupied elsewhere. And that’s exactly where we want the sexuality of paedophiles to be: occupied elsewhere.

            EDIT: I’ll assume the downvotes come from people not realizing just how plastic our mind is and not random reactionaries. Not on my lemmy.

      • _cnt0@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        2
        ·
        4 months ago

        Like anything on the internet wasn’t tracked. If need be people will resort to physically exchanging storage media.

        • Blueberrydreamer@lemmynsfw.com
          link
          fedilink
          English
          arrow-up
          12
          arrow-down
          1
          ·
          4 months ago

          But having that tracking shown to you has a very powerful psychological effect.

          It’s pretty well established that increasing penalties for crimes does next to nothing to prevent those crimes. But what does reduce crime rates is showing how people were caught for crimes, making people believe that they are less likely to ‘get away with it’.

          Being confronted with your own searches is an immediate reminder that the searcher is doing something illegal, and that they are not doing so unnoticed. That’s wildly different than abstractly knowing that you’re probably being tracked somewhere by somebody among billions of other people.

          • _cnt0@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            1
            ·
            4 months ago

            And where is the quantification and qualification for that? Spoiler: it’s not in the article(s), and not one Google search away. Does Nintendo succeed in stopping piracy with its show trials? If you have a look around here, it looks more like people are doubling down.

            • Blueberrydreamer@lemmynsfw.com
              link
              fedilink
              English
              arrow-up
              1
              ·
              4 months ago

              I mean, I know Google has been shitty lately, but Wikipedia isn’t hard to find: https://en.m.wikipedia.org/wiki/Deterrence_(penology)

              I’d wager Nintendo has put some fear into a few folks considering developing emulators, but that’s the only comparison to be made here. The lack of any real consequences for individuals downloading roms is why so many are happy to publicly proclaim their piracy.

              Now, I bet if megaupload added an AI that checked users uploads for copyrighted titles and gave everyone trying to upload them a warning about possible jail time, we’d see a hell of a lot less roms and movies on mega.

    • Jojo@lemm.ee
      link
      fedilink
      English
      arrow-up
      11
      arrow-down
      3
      ·
      4 months ago

      Why target the consumer and not the source?

      If for no other reason than it doesn’t have to be either/or. If you can meaningfully reduce demand for a “product” as noxious as CSAM, you should expect the rate of production to slow. There are certainly efforts in place to prevent that production from ever being done, and to prevent it from being shared/hosted once it is, but I don’t think attempting to reduce demand in this way is going to hurt.

      • _cnt0@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        2
        ·
        4 months ago

        Does it reduce the demand, though? Where are the measurements attesting to that? If history has shown one thing, it is that criminalizing things creates criminals. Did Prohibition stop people from making, trading, or consuming alcohol? How does this have any meaningful impact on the abuse of children? The article(s) completely fail to elaborate on that end. I’m missing the statistics/science here. What are the measuring instruments to assess any form of success? Just that searches were blocked and people were shown some links? TL;DR: is this something with an actual positive impact, or just an exercise in virtue signaling and a waste of time and money? Blind “fixes” are rarely useful.

        • archomrade [he/him]
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          4
          ·
          4 months ago

          It might not reduce demand in individuals already seeking out that material, but it would certainly reduce introduction to it, and demand, in the long run.

    • afraid_of_zombies@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      4 months ago

      Maybe liability, or pretending to help? That way they can claim later on: “we care about people struggling with this issue, which is why, when they search for terms related to it, we offer the help they need”. Kinda like how if you search for certain terms on Google it pops up a suicide hotline on top.

      Ok Google just because I looked up some stuff on being sad in winter doesn’t mean I am planning to put a gun in my mouth.

  • Blackmist@feddit.uk
    link
    fedilink
    English
    arrow-up
    45
    arrow-down
    6
    ·
    4 months ago

    Did it? Or did it make them look elsewhere?

    The amount of school uniform, braces, pigtails and step-sister porn on Pornhub makes me think they want the nonces to watch.

      • Blackmist@feddit.uk
        link
        fedilink
        English
        arrow-up
        13
        ·
        4 months ago

        I kind of want to trigger it to see what searches it reacts to, but at the same time I don’t want my IP address on a watchlist.

      • michaelmrose@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        4 months ago

        Reasonable adults sites don’t return obviously sketchy things for reasonable queries. EG you don’t search boobs and get 12 year olds.

      • PM_Your_Nudes_Please@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        4 months ago

        And what days were those? Cuz you pretty much need to go all the way back to pre-internet days. Hell, even that isn’t far enough, cuz Playboy’s youngest model was like 12 at one point.

        • femtech
          link
          fedilink
          English
          arrow-up
          1
          ·
          4 months ago

          Wtf? For real? Was CP not federally illegal when they did that?

        • The Snark Urge@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          4 months ago

          Depressing, isn’t it? I was more talking about how prevalent “fauxcest” has become in porn more recently. I guess that’s just my cross to bear as an only child 💅

    • EdibleFriend@lemmy.world
      link
      fedilink
      English
      arrow-up
      8
      ·
      edit-2
      4 months ago

      given the amount of extremely edgy content already on Pornhub, this is kinda sus

      Yeah… I am honestly curious what these search terms were, and how many of those searches were ACTUALLY looking for CP. And of those… how many are now flagged somehow?

      • Arsonistic@lemmy.ml
        link
        fedilink
        English
        arrow-up
        2
        ·
        4 months ago

        I know I got the warning when I searched for young gymnast or something like that cuz I was trying to find a specific video I had seen before. False positives can be annoying, but that’s the only time I’ve ever encountered it.

  • ocassionallyaduck@lemmy.world
    cake
    link
    fedilink
    English
    arrow-up
    32
    arrow-down
    7
    ·
    4 months ago

    This is one of the more horrifying features of the future of generative AI.

    There is literally no stopping it at this stage: AI generated CSAM will be possible soon thanks to systems like SORA.

    This is disgusting and awful. But one part of me hopes it can end the black market of real CSAM content forever. By flooding it with infinite fakes, users with that sickness can look at something that didn’t come from a real child’s suffering. It’s the darkest of silver linings I think, but I spoke with many sexual abuse survivors who feel the same about the loli hentai in Japan, in that it could be an outlet for these individuals instead of them finding their own.

    Dark topics. But I hope to see more actions like this in the future. If pedos can self isolate from IRL interactions and curb their ways with content that harms no one, then everyone wins.

    • gapbetweenus@feddit.de
      link
      fedilink
      English
      arrow-up
      36
      arrow-down
      1
      ·
      edit-2
      4 months ago

      The question is whether consuming AI CP helps regulate a pedophile’s behavior or enables a progression of the condition. As far as I know, that is an unanswered question.

        • gapbetweenus@feddit.de
          link
          fedilink
          English
          arrow-up
          14
          arrow-down
          3
          ·
          4 months ago

          For porn in general, yes - I think the data is rather clear. But for cp or related substitute content it’s not that definitive (to my knowledge), be it just for the reason that it’s really difficult to collect data on that sensitive topic.

      • HonoraryMancunian@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        1
        ·
        4 months ago

        Another question is: how will the authorities know the difference? An actual CSAM-haver can just claim it’s AI.

            • cumming_normi@yiffit.net
              link
              fedilink
              English
              arrow-up
              4
              arrow-down
              4
              ·
              4 months ago

              Because “CSAM” states abuse as the third word in the acronym. Machine learning could (in theory, I lack knowledge on the current implementations) be trained without any children being abused (in any traditional sense anyway) and used to produce the content without any real children being involved (ignoring training data).

              The downvotes likely come from a difference in definition between abuse and CP, images of nonexistent people cannot realistically harm anyone.

              • FilthyHookerSpit@lemmy.world
                link
                fedilink
                English
                arrow-up
                5
                arrow-down
                1
                ·
                4 months ago

                Personally, I don’t think it’s arbitrary. A child in a sexual scenario is a depiction of abuse. Normal, healthy children don’t engage in such behaviors.

    • yamanii@lemmy.world
      link
      fedilink
      English
      arrow-up
      6
      ·
      4 months ago

      What do you mean soon? Local models from civitai have been able to generate CSAM for at least 2 years. I don’t think it’s possible to stop it unless the model creator does something to prevent it from generating naked people in general, like the neutered SDXL.

      • ocassionallyaduck@lemmy.world
        cake
        link
        fedilink
        English
        arrow-up
        1
        ·
        4 months ago

        True. For obvious reasons I haven’t looked too deeply down that rabbit hole because RIP my search history, but I kind of assumed it would be soon. I’m thinking more specifically about models like SORA though. Where you could feed it enough input, then type a sentence to get video content. That is going to be a different level of darkness.

    • Zorque@kbin.social
      link
      fedilink
      arrow-up
      1
      ·
      4 months ago

      Are… we looking at the same article? This isn’t about AI generated CSAM, it’s about redirecting those who are searching for CSAM to support services.

      • ocassionallyaduck@lemmy.world
        cake
        link
        fedilink
        English
        arrow-up
        1
        ·
        4 months ago

        Yes, but this is more about mitigating the spread of CSAM, and my feeling is it’s going to become somewhat impossible soon. AI-generated porn is starting to flood the market, and this chatbot is also one of those “smart” attempts to mitigate this behavior. I’m saying that very soon it will be something users don’t have to go anywhere to get, if the model can just fabricate it out of thin air. So the chatbot mitigation is only temporary, and the dark web of actual CSAM material will become overwhelmed and swamped in tidal waves of artificially generated CP. It’s an alarming ethical dilemma we are on the horizon of, and we need to think about it.

      • ocassionallyaduck@lemmy.world
        cake
        link
        fedilink
        English
        arrow-up
        13
        arrow-down
        5
        ·
        4 months ago

        So your takeaway is I’m… Against AI generative images and thus I “protest too much”

            I can’t tell if you’re pro-AI and dislike me, or pro-loli-hentai and thus dislike me.

            Dude, AI images and AI video are inevitable. To pretend that doesn’t have huge effects on society is stupid. It’s going to reshape all news media, very quickly. If reddit is 99% AI-generated bot spam garbage with no verification of what is authentic, reddit is functionally dead, and we are on a train with no brakes in that direction for most public forums.

          • ocassionallyaduck@lemmy.world
            cake
            link
            fedilink
            English
            arrow-up
            2
            ·
            4 months ago

            You should probably research the phrase “protest too much” and the word “schtick” then.

            I’m not trying to clutch pearls here, as another poster here commented this isn’t a theoretical concern.

            • Varyk@sh.itjust.works
              link
              fedilink
              English
              arrow-up
              1
              ·
              4 months ago

              You aren’t trying to clutch pearls, but your pearls were just so available you felt you had to jump on the bandwagon to reply to a two-day old comment?

              Nobody said this was a theoretical concern, and it’s okay if you don’t understand the phrases “protest too much” and “schtick”, but you can ask for the definitions and relevance directly instead of fishing.

        • Varyk@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          14
          ·
          4 months ago

          Ah, one of the “using words they don’t understand” crew.

          And several hours late, too.

          Swinging for the fences, aren’t you?

  • Kairos@lemmy.today
    link
    fedilink
    English
    arrow-up
    21
    ·
    4 months ago

    Oh, it’s just an experiment. The headline made me think someone was suing over this.

    • Gabu@lemmy.world
      link
      fedilink
      English
      arrow-up
      18
      ·
      4 months ago

      Not since the wipe, AFAIK. Still, at the bottom of the page you can (or at least could, haven’t used their services in a while) see a list of recent searches from all users, and you’d often find some disturbing shit.

    • BowtiesAreCool@lemmy.world
      link
      fedilink
      English
      arrow-up
      19
      arrow-down
      3
      ·
      4 months ago

      If you read the paragraph that’s literally right there, it says the chatbot appeared when certain terms were searched by the user.

      • KrankyKong@lemmy.world
        link
        fedilink
        English
        arrow-up
        9
        ·
        4 months ago

        …That paragraph doesn’t say anything about whether or not the material is on the site though. I had the same reaction as the other person, and I didn’t misread the paragraph that’s literally right there.

  • Kusimulkku@lemm.ee
    link
    fedilink
    English
    arrow-up
    16
    arrow-down
    1
    ·
    4 months ago

    I was wondering what sort of phrases get that notification, but mentioning that might be a bit counterproductive.

    • Thorny_Insight@lemm.ee
      link
      fedilink
      English
      arrow-up
      8
      ·
      4 months ago

      I’m not sure if it’s related but as a life-long miniskirt lover I’ve noticed that many sites no longer return results for the term “schoolgirl” and instead you need to search for a “student”

    • Squire1039@lemm.eeOP
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      1
      ·
      4 months ago

      ML models have been shown to be extraordinarily good at statistically guessing your words. The words covered are probably comprehensive.

      • Kusimulkku@lemm.ee
        link
        fedilink
        English
        arrow-up
        13
        ·
        4 months ago

        I think the other article talks about it being a manually curated list because while ML can get correct words it also gets random stuff, so you need to check it isn’t making spurious connections. It’s pretty interesting how it all works

    • Bgugi@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      4 months ago

      Aylo maintains a list of more than 28,000 banned terms in multiple languages, which is constantly being updated.

      I’d be very curious what these terms are, but I wouldn’t be surprised if “pizza guy” or “school uniform” would trigger a response.
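
      A deny-list intervention like the one described in the quote could be sketched roughly as follows. This is a hypothetical illustration only: the placeholder terms, function names, and normalized-substring matching are all assumptions on my part, since Aylo’s actual implementation is not public (and the real system reportedly covers 28,000+ terms in multiple languages, likely with far more sophisticated matching).

```python
import re

# Placeholder deny-list; the real list and its contents are not public.
BANNED_TERMS = {"example banned phrase", "another banned term"}


def normalize(query: str) -> str:
    """Lowercase the query and replace punctuation with spaces, so
    trivial variations of a banned term (case, hyphens) still match."""
    return re.sub(r"[^a-z0-9 ]+", " ", query.lower()).strip()


def check_search(query: str) -> str:
    """Return 'intervene' if the query contains a banned term
    (show the warning/chatbot instead of results), else 'allow'."""
    q = " ".join(normalize(query).split())  # collapse repeated spaces
    for term in BANNED_TERMS:
        if term in q:
            return "intervene"
    return "allow"
```

      For example, `check_search("Example BANNED-phrase video")` would trigger the intervention, while an ordinary query would be allowed through. A naive substring check like this also shows why false positives (such as the “young gymnast” example mentioned elsewhere in this thread) are hard to avoid with keyword lists alone.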

  • n3uroh4lt@lemmy.ml
    link
    fedilink
    English
    arrow-up
    14
    arrow-down
    1
    ·
    4 months ago

    The original report from the researchers can be found here: https://www.iwf.org.uk/about-us/why-we-exist/our-research/rethink-chatbot-evaluation/ Researchers said:

    The chatbot was displayed 2.8 million times between March 2022 and August 2023, resulting in 1,656 requests for more information and Stop It Now services; and 490 click-throughs to the Stop It Now website.

    So out of 4.4 million banned queries, the chatbot was displayed only 2.8 million times (within the date interval in the quote above), and there were only 490 click-throughs to seek help. Ngl, kinda underwhelming. And I also think, given the amount of extremely edgy content already on Pornhub, this is kinda sus.

    • laughterlaughter@lemmy.world
      link
      fedilink
      English
      arrow-up
      2
      ·
      edit-2
      4 months ago

      It’s not really that underwhelming. Disclaimer: I don’t condone child abuse. I find it abhorrent, and I will never justify it.

      People have fantasies, though. If a dude searches for “burglar breaks in and has sex with milf,” does that mean that he wants to do this in real life? Of course not (or god I hope not!) So, some people may have searched for “dad has sex with young babysitter” and bam! Bot! Some people have a fetish for diapers - there are tons of porn of adults wearing diapers and having sex. Not my thing, but who am I to judge? So again, someone searches “sex with diapers” and bam! Bot!

      Let’s not forget that as much as pornhub displays a sign saying “Hey, are you 18?” a lot of people will lie. And those young folks will also search for stupid things.

      So I don’t think that aaaaaall 1+ million searches were done by people with actual pedophilia.

      The fact that 1,600 people decided to click and inform themselves, in the UK alone, well, that’s a lot, in my opinion, and it should be something to commend, not to just say “eh. Underwhelming.”