I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries…

It simply replied that it can’t do that due to how unethical it is to discriminate against people and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list since it didn’t remove the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethics process in the background messing up its answers.

It’s really frustrating, I dunno if you guys feel the same. I really feel like these bots have become waaaay too tip-toey.

  • Razgriz@lemmy.world (OP) · 2 years ago

    It apologized, and this time it kept posting the list, but it never fully removed all the African countries. If it removes one, it adds another. And if I insist, it ends the conversation.

    Jfc

    • xantoxis@lemmy.one · 2 years ago

      This sounds to me like a confluence of two dysfunctions in the LLM: if you phrase a question as if you are making a racist request, it will invoke “ethics”; but even if you don’t phrase it that way, it still doesn’t really understand context or what “Africa” is. This is spicy autocomplete. It is working from somebody else’s list of countries, and it doesn’t understand that what you want has a precise, contextually appropriate definition that you can’t just autocomplete your way into.
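
      For contrast, the deterministic version of what OP wanted is a trivial set operation. A minimal Python sketch (the country set here is a made-up placeholder, not an exhaustive list):

          # Hypothetical, non-exhaustive set of African countries, for illustration only.
          AFRICAN_COUNTRIES = {"Egypt", "Morocco", "Kenya", "South Africa", "Ghana"}

          travel_list = ["Japan", "Kenya", "Iceland", "Morocco", "Peru"]

          # Set membership is exact: a country is either in the set or it isn't,
          # so nothing gets silently re-added the way the chatbot's list did.
          filtered = [c for c in travel_list if c not in AFRICAN_COUNTRIES]
          print(filtered)  # ['Japan', 'Iceland', 'Peru']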

      You can get the second type of error with most prompts if you’re not precise enough about what you’re asking.