Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain at a level no human mind could handle, and the AI experiences this pain for the equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or does it not matter because it all happened in some weirdo’s basement?

  • AnUnusualRelic@lemmy.world · 3 months ago

    Anthropomorphism has long been treated as a big bad thing, the catch-all excuse for regarding animals as the stupid things they were supposed to be. Thankfully, we’re coming back from that.

    It doesn’t mean the animals function the same way we do. But they do function in a lot of very similar ways.

    • CALIGVLA@lemmy.dbzer0.com · 3 months ago

      My point is that I can’t see how they can “gossip” or “tell stories”. If that isn’t textbook anthropomorphism, I don’t know what is.

      • AnUnusualRelic@lemmy.world · 3 months ago

        It’s shorthand for information sharing, which they certainly do. Crows will absolutely tell one another about lots of stuff, such as people who have harmed them.