Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were overwhelming. Those images are burnt into my mind, and I would love to be rid of them, but I don’t know how, or if that is even possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any way.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

    • flatbield@beehaw.org · 1 year ago

      They will still need a developer to set this up, and presumably it should be added as an option to the main code base. I thought I heard that the Beehaw admins are not developers.

      There are a number of other issues driving the admins to dump Lemmy. The same applies there.

    • thySatannic@beehaw.org · 1 year ago

      Wait… why is no access to CSAM hashes a good thing? Wouldn’t it be easier to detect this material if the hashes were public?! I feel like I’m missing something here…

        • sarmale@lemmy.zip · 11 months ago

          Question: from what I saw, it seems like every CSAM image ever found is assigned a new hash. Isn’t it unscalable to assign a separate hash to everything? And does that mean that most CSAM images being matched were detected before?
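
          For context on the question above: hash-matching systems such as Microsoft’s PhotoDNA compare the hash of each uploaded image against a database of hashes of previously identified material, so yes, they can only match images that were detected and catalogued before. The per-image cost does not grow with the size of the database, because set membership lookup is effectively constant time. A minimal sketch of the lookup step (assuming a plain SHA-256 hash for illustration; real systems use perceptual hashes, which also match resized or re-encoded copies):

          ```python
          import hashlib

          # Hypothetical database of hashes of previously identified images.
          # Real deployments load these from vetted hash lists; the set here
          # only illustrates that lookup cost is independent of database size.
          known_hashes = {
              hashlib.sha256(b"previously-identified-image-bytes").hexdigest(),
          }

          def is_known(image_bytes: bytes) -> bool:
              """Check an upload against the hash database.

              Average O(1) per lookup, no matter how many
              millions of hashes the database holds.
              """
              return hashlib.sha256(image_bytes).hexdigest() in known_hashes

          print(is_known(b"previously-identified-image-bytes"))  # True
          print(is_known(b"never-seen-before-image-bytes"))      # False
          ```

          The flip side is exactly what the question suggests: a brand-new image has no entry in the database, so hash matching alone cannot catch it; its hash is only added after the image has been identified and verified elsewhere.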