I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries…

It simply replied that it can’t do that due to how unethical it is to discriminate against people and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list since it hadn’t removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always experience a hiccup because of some ethical process in the background messing up its answers.

It’s really frustrating, I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

  • Sage the Lawyer@lemmy.world
    1 year ago

    It can be useful for top-level queries that deal with well-settled law, as a tool to point you in the right direction with your research.

    For example, once, I couldn’t recall all the various sentencing factors in my state. ChatGPT was able to spit out a list to refresh my memory, which gave me the right phrases to search on Lexis.

But when I asked GPT to give me cases, it gave me a list of completely made-up bullshit.

    So, to get you started, it can be useful. But for the bulk of the research? Absolutely not.