• shufflerofrocks@beehaw.org · 15 points · 1 year ago

    No lol

    ChatGPT sucks at giving proper answers in my experience. I tried using it to generate code and to summarise documents, and it sucked at both.

    I can google well enough to almost always get what I need, but I can also see areas where Google search is pretty shit right now: almost everything non-tech related that I google gives me a feed of keyword-stuffed, SEO-bloated pages with no actual content.

    Problem is, I don’t think any other search engine comes close to Google. I’ve tried DuckDuckGo, and it’s crap in my experience.

    I’ve had good luck with Yandex, but everything else is meh.

    What are your search engine recommendations?

  • JackFromWisconsinA · 11 points · 1 year ago

    Not because of ChatGPT. But I encourage everyone to stop using Google and use one of the many other search engines. Break that Google supremacy.

  • Barbarian@lemmy.ml · 10 points · 1 year ago

    ChatGPT is absolutely NOT the right tool for the job if you want answers to questions. People need to understand that ChatGPT is not an AI. It doesn’t even have a concept of truth and fiction, let alone the ability to tell them apart.

    It is a very sophisticated and very advanced autocomplete, using probabilities to predict the next word (or token, if you want to get technical) over and over again until probability says it’s done. It’s great for writing boilerplate documents without any facts it has to get right, for creative writing (although it tends to produce very clichéd text, understandably), and for boilerplate code that’s been written millions of times before in its training data. Do NOT use it for anything where facts matter.
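
    If it helps to picture the “autocomplete on repeat” idea, here’s a minimal toy sketch in Python. The NEXT_TOKEN_PROBS table, the generate helper, and every probability in it are invented purely for illustration; the real model learns something vastly larger from its training data:

        import random

        # Toy table of next-token probabilities. These numbers are made up for
        # illustration; a real model learns them from huge amounts of text.
        NEXT_TOKEN_PROBS = {
            "<start>": {"The": 0.6, "A": 0.4},
            "The": {"cat": 0.5, "dog": 0.3, "code": 0.2},
            "A": {"dog": 0.6, "cat": 0.4},
            "cat": {"sat": 0.7, "ran": 0.3},
            "dog": {"barked": 0.8, "sat": 0.2},
            "code": {"compiles": 0.6, "crashes": 0.4},
            "sat": {"<end>": 1.0},
            "ran": {"<end>": 1.0},
            "barked": {"<end>": 1.0},
            "compiles": {"<end>": 1.0},
            "crashes": {"<end>": 1.0},
        }

        def generate(max_tokens: int = 20) -> str:
            """Pick a likely next token over and over until the table says stop."""
            token, output = "<start>", []
            for _ in range(max_tokens):
                candidates = NEXT_TOKEN_PROBS[token]
                # The next token is chosen by probability alone; nothing here checks
                # whether the resulting sentence is true, only whether it is likely.
                token = random.choices(list(candidates), weights=list(candidates.values()))[0]
                if token == "<end>":
                    break
                output.append(token)
            return " ".join(output)

        print(generate())  # e.g. "The cat sat" or "A dog barked"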

  • cavemeat@beehaw.org · 10 points · 1 year ago

    I moved to a non-google search engine. The problem with ChatGPT is that it sounds very plausible and truthful, but is often just making shit up.

    • noodlejetski@beehaw.org · 8 points · 1 year ago (edited)

      it sounds very plausible and truthful, but is often just making shit up.

      I’ve seen someone call it “mansplaining as a service”.

    • Mersampa@beehaw.org · 7 points · 1 year ago

      It helps to understand what ChatGPT is, and what it isn’t.

      ChatGPT does not understand anything you say. And it only produces one word at a time (technically part of a word, but let’s keep it simple).

      What it’s doing is guessing the most likely next word based on the words that have come before it. Think of your phone’s keyboard: it probably suggests words for what to type next. ChatGPT is like hitting the recommended word over and over until it has an answer. It’s spouting words based on how likely each word is to come next. That is all.

      It uses advanced machine learning to do that; whether that counts as AI is for the reader to decide. But it’s certainly not planning out a thoughtful answer for you.

      And that’s not even taking into account that the training data largely comes from the internet, the place where people continuously make shit up.
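
      To make the “keep hitting the suggested word” idea concrete, here’s a rough sketch of a phone-keyboard-style autocomplete. The tiny corpus and bigram table are made up and nothing like the real thing in scale, but it’s the same basic loop:

          from collections import Counter, defaultdict

          # A tiny "training corpus". A phone keyboard builds suggestions from text
          # you have typed before; a language model does this with vastly more text.
          corpus = "the cat sat on the mat and the cat ran off the mat".split()

          # Count which word tends to follow which word (a bigram table).
          following = defaultdict(Counter)
          for current, nxt in zip(corpus, corpus[1:]):
              following[current][nxt] += 1

          def autocomplete(word: str, length: int = 8) -> str:
              """Keep 'hitting the suggested word': always take the most common follower."""
              words = [word]
              for _ in range(length):
                  suggestions = following.get(words[-1])
                  if not suggestions:
                      break
                  # Pick the single most likely next word - nothing more thoughtful than that.
                  words.append(suggestions.most_common(1)[0][0])
              return " ".join(words)

          print(autocomplete("the"))  # e.g. "the cat sat on the cat sat on the"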

  • noodlejetski@beehaw.org · 9 points · 1 year ago (edited)

    given that ChatGPT often gives you answers that sound right but are completely wrong - including, but not limited to, the dates and causes of death of the very people asking it for biographical notes about themselves, made-up articles, and made-up legal cases - I don’t see how anyone can use it as a search engine replacement.

    • SolarSailer@beehaw.org · 2 points · 1 year ago

      Glad someone mentioned the lawyer who screwed up by submitting fake case citations that ChatGPT made up. It will be interesting to see what comes of this.

      Additionally, not a lot of people realize that they’ve agreed to an indemnification clause when using ChatGPT (or what that means).

      Basically, OpenAI can send you the legal bills for any lawsuits that come from your use of ChatGPT. So if you “jailbroke” ChatGPT and posted an image of it telling you the recipe for something illegal, OpenAI could end up with a lawsuit on their hands, and they would bill that user for all of the legal fees incurred.

      Possibly the first case of this we’ll see will be related to the defamation case that a certain mayor in Australia could bring against OpenAI: https://gizmodo.com/openai-defamation-chatbot-brian-hood-chatgpt-1850302595

      Even if OpenAI wins that lawsuit, they will most likely bill the user who posted the image of ChatGPT defaming the mayor.

    • argv_minus_one@beehaw.org · 2 points · 1 year ago (edited)

      Imagine being told that you’re dead.

      “You died on 2 August 1979. You were on a fishing ship at sea that sank with all hands.”

      You are a computer programmer, you were born in 1985, you’ve never been within 200 miles of any body of water larger than a river, and come to think of it, you’ve always had a peculiar fear of oceans and lakes.

      Did the AI make up a story of your death…or is it somehow aware that that’s the day your previous life ended?

  • Mindless_Enigma@beehaw.org · 7 points · 1 year ago

    Not at all. ChatGPT is a great tool for helping with brainstorming or creating a base for further work, but I wouldn’t rely on it at all for accurate answers to questions. ChatGPT’s main function is to create responses to prompts that look like real answers. It has no inclination to give you a correct answer and really has no clue what a correct or incorrect answer is. With the amount of effort you have to put in to research and verify that the answer ChatGPT gave you is correct, you’re probably better off skipping it and doing the research yourself.

  • runekn@beehaw.org · 7 points · 1 year ago (edited)

    No. It is a language model, not a wiki. It’s a very good model that can output correct info, but there is no rigid mechanism restricting it to facts alone.

  • jherazob@beehaw.org · 7 points · 1 year ago

    Tried it for work a few times. Every single time it would give good information mixed in with complete, hard-to-spot bullshit that would break the result. So no, I’m ignoring the thing for now, except as a toy.

  • jjsearle@lemmy.ml · 5 points · 1 year ago

    I find that too often ChatGPT will just make things up when asked about anything beyond the basics. Most of the time it is quicker to search for the official documentation and read it than to rely on ChatGPT’s answers being right.

  • drowned Phoenician@feddit.de · 4 points · 1 year ago

    I feel like most of my searches are too specific - at least specific enough that I wouldn’t trust ChatGPT on the details anyway and would have to confirm the facts by googling.

  • !ozoned@lemmy.world@beehaw.org · 4 points · 1 year ago

    I de-googled years ago. DuckDuckGo works just fine for me and if it doesn’t I can just add a !g to my search results and it’s proxied through DDG. Proton handles all the services they can. Even put LineageOS on my Android devices.

    If anyone has any other recommendations for degoogling, I’d love to hear them. :-)

    So no. I also don’t use ChatGPT though.

    • noodlejetski@beehaw.org · 3 points · 1 year ago

      I can just add a !g to my search results and it’s proxied through DDG.

      no it’s not. it redirects you to a regular old Google search result page, same as if you went to google dot com and searched from there.
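
      in other words, a bang is just a shortcut that sends your query straight to the other site’s own search page, roughly like this (illustrative example, not exact URLs):

          DDG search:  !g lemmy federation
          lands on:    https://www.google.com/search?q=lemmy+federation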