• uglyface [he/him]@hexbear.net
    8 days ago

    I use it to dump logs into so I can spend more time on hexbear at work and less time doing any troubleshooting I’m being paid for. I hope they log and steal all my employer’s data

  • nothx [he/him]@hexbear.net
    8 days ago

    Though the vast majority of people surveyed didn’t engage emotionally with ChatGPT, those who used the chatbot for longer periods of time seemed to start considering it to be a “friend.” The survey participants who chatted with ChatGPT the longest tended to be lonelier and get more stressed out over subtle changes in the model’s behavior, too.

    Holy shit, we have come into the era of digital waifus.

    Don’t get me wrong, I think that loneliness in our modern society is endemic and dangerous, but I can’t help but chuckle about having parasocial relationships with a chatbot…

  • CarbonScored [any]@hexbear.net
    8 days ago

    This is the wrong conclusion. This looks a lot less “Something Bizarre Is Happening to People Who Use ChatGPT” and more like “Something Bizarre Already Happened to People Who Use ChatGPT a Lot”. Like yeah no shit, lonely people will reach for whatever company they can get, however shit it is.

  • peppersky [he/him, any]@hexbear.net
    8 days ago

    How many people in real life do you consistently and repeatedly talk about actual things with, “bounce ideas off,” or actually engage with the world alongside? Can’t be that many. I don’t find it at all strange that people would get attached to an AI chatbot that does these things

  • gingerbrat [she/her]@hexbear.net
    8 days ago

    This is so disturbing. Like, what does it tell us about our society (nothing new, I know) if people are so lonely that they form parasocial attachments to a shitty chatbot?

    • nothx [he/him]@hexbear.net
      8 days ago

      It’s really sad… not surprising at all considering the current sociopolitical climate. In the past these people were finding niche online communities of like-minded people, which at least promoted a healthier form of socializing. This new shift seems super dangerous and scary tho…

    • KobaCumTribute [she/her]@hexbear.net
      8 days ago

      It really doesn’t seem all that surprising: people form attachments like that to basically anything at the drop of a hat. It usually happens with things that move around on their own like animals or mobile robots, but it also gets done to tools and simple machines, to plants, to places, even to entirely fictional constructs that exist only within an abstract space in their heads. Broadly speaking, people project spirits into things (so to speak) as something like a facet of how empathy works.

      Like, people become emotionally attached to robots that only move when they personally operate a control panel to move them, and become distraught if those unemotive and unresponsive machines are damaged or destroyed. Now make the machine talk back in a somewhat coherent way that mimics actual interaction and life, and it’s going to kick that phenomenon into overdrive. If you attach a synthetic face to it, it’ll happen even harder.