Pack it up. Nothing can be funnier than this. agony-minion

Or perhaps, nothing can be funny anymore. desolate

  • Frank [he/him, he/him]@hexbear.net
    11 points · edited · 3 months ago

    These systems aren’t intelligent because they’re not trying to develop a langford basilisk to put us out of our collective misery.

    Edit: the “langford basilisk” is a concept from science fiction: an image that, for whatever reason, causes damage to the human mind. Usually the conceit is that it encodes information the mind can’t process, resulting in a severe seizure or similar outcome. David Langford explored the idea in some depth starting with a short story called “BLIT,” which is a meditation on terrorism, weapons proliferation, hate, the dangers of rapid scientific discovery, and also a Nazi gets pwned.

    • UlyssesT [he/him]@hexbear.netOP
      13 points · 3 months ago

      What if R0k0’s Bas!l!sk but it wants to keep millions to billions of simulations around of everyone it doesn’t like to be an unwilling audience to endless tedious cringe? no-mouth-must-scream

      • buckykat [none/use name]@hexbear.net
        17 points · 3 months ago

        Roko’s basilisk is very funny because it’s just a version of Pascal’s Wager where, if you think it’s bullshit, god just goes “understandable, have a nice day,” and only punishes you if you believe in it but don’t sufficiently obsess over it.

        • UlyssesT [he/him]@hexbear.netOP
          15 points · edited · 3 months ago

          Like so many other things techbros bloviate about, it’s been thought of before, but because “history is bunk” and other clichés, they keep believing they’re the first to discover concepts, stumbling over them while thinking they’re trailblazers.

          Similarly, “simulation theory” is just bazinga deism.

            • UlyssesT [he/him]@hexbear.netOP
              9 points · 3 months ago

              I am going to struggle session about this with you just a little bit: I call it bazinga deism because the claim that the universe, everything in it, and all the natural laws that govern it are “just a computer program,” and that there’s some programmer(s) outside of it who set it all in motion and sort of stepped away, sounds pretty damn deistic to me.

              I agree it is also solipsism in application, especially because its primary adherents really want to see other people as “NPCs” to justify dehumanizing them.

              • buckykat [none/use name]@hexbear.net
                7 points · 3 months ago

                It does have that deistic element to it, but it’s primarily solipsistic because they don’t want to live in the simulated universe and accept it and its programmed natural laws; they want to escape the simulation because they believe it’s all fundamentally unreal.

                • UlyssesT [he/him]@hexbear.netOP
                  8 points · 3 months ago

                  So many of those fucks are the ones on the top of the monstrous system destroying the planet and all they seem to be interested in is trying to escape it, whether by fantastical fiefdoms on Mars or by “waking up” from the “simulation.”

                  The system is that fucked. They don’t seem satisfied with it, either. Then again, they tend to be psychological leaky buckets that can’t ever be satisfied.

        • Frank [he/him, he/him]@hexbear.net
          6 points · 3 months ago

          It’s the most “what reading literally no philosophy at all and scoffing at the entire liberal arts your whole life does to an mf” thing possible.

        • UlyssesT [he/him]@hexbear.netOP
          4 points · edited · 3 months ago

          Pretty unimpressive machine god if it’s only as uncreative and petty in its desire for revenge as the average creepy libertarian computer toucher yud-rational

          They want to make a cringe god in their own cringe image. kelly

      • Frank [he/him, he/him]@hexbear.net
        2 points · 3 months ago

        Very different concept. Lovecraft stuff is “ooh, these cosmic higher-dimensional beings are so weird they drive men mad!”

        A Langford basilisk is based on the idea that your mind is analogous to a computer, and the basilisk image is visual data that causes an unrecoverable hard crash. There’s nothing magical about the image; the problem happens when your brain tries to make sense of what it is seeing.