• snooggums · 21 points · 26 days ago

    This is absolutely in line with who buys into AI hype, and why it is infuriating to try to convince them that they are reading way too much into how it seems to know things, when all it is doing is returning results that are statistically likely to be found helpful by the audience it is designed for.

    I have said that LLMs and other AI are designed to return what people want to see/hear. They don’t know anything and will never be useful as a knowledge base or an independently functioning diagnostic tool.

    It certainly has uses, but it isn’t going to solve all the things promoted by the AI hype train.

    • MagicShel@programming.dev · 7 points · 26 days ago

      I don’t buy into it, but it’s so quick and easy to get an answer that, if it’s not something important, I’m guilty of asking an LLM and calling it good enough.

      There are no ads and no SEO. Yeah, it might very well be bullshit, but most Google results are also bullshit, depending on the subject. If it doesn’t matter, and it isn’t easy to tell whether I’m getting bullshit from a website anyway, an LLM is good enough.

      I took a picture of discolorations on a sidewalk and asked ChatGPT what was causing them, because my daughter was curious. It said metal left on the surface rusts and leaves behind those streaks. But they all had holes in the middle, so we decided there had been metallic rocks mixed into the surface that had rusted away.

      Is that for sure right? I don’t know. I don’t really care. My daughter was happy with an answer and I’ve already warned her it could be bullshit. But curiosity was satisfied.

        • Gaywallet (they/it)@beehaw.org OP · 18 points · 26 days ago

        Is that for sure right? I don’t know. I don’t really care. My daughter was happy with an answer and I’ve already warned her it could be bullshit. But curiosity was satisfied.

        I’m not sure if you recognize this, but this is precisely how mentalists, psychics, and others in similar fields have always existed! Look no further than Pliny the Elder or Rasputin for folks who made a career out of magical and mystical explanations for everything and gained great status for it. ChatGPT is in many ways the modern version of these individuals, gaining status for having answers to everything that seem plausible enough.

          • MagicShel@programming.dev · 6 points · 26 days ago (edited)

          She knows not to trust it. If the AI had suggested “God did it” or some metaphysical bullshit, I’d reevaluate. But I’m not sure how I’d even describe that in a Google search. Sending a picture and asking about it is really fucking easy. Important answers aren’t easy.

          I mean, I agree with you. It’s bullshit and untrustworthy. We have lots of conversations about this, actually, because I caught her cheating at school with it, so there’s a lot of supervision and talk about which uses are appropriate and which aren’t, and about how we can inadvertently bias it by the questions we ask. It’s actually a great tool for learning skepticism.

          But for some things, a reasonable answer just to satisfy your brain is fine whether it’s right or not. I remember spending an entire year in chemistry learning absolute bullshit, only to be told the next year that it was all garbage and here’s how it really works. It’s fine.

      • snooggums · 11 points · 26 days ago (edited)

        Yes, treating AI answers with the same skepticism as web search results is a decent way to make them useful. Unfortunately, the popular AI systems seem to use several times as much energy to give answers that aren’t even as reliable as Google used to be.

        Back in the day, Google was using the same ‘was this information useful’ signal to rank results, before the SEO craze took off.

        And yes, if the stains look like rust and there is a gap in the middle, then there was a ferrous rock in the mix that rusted away. I have a spot like that on my sidewalk and on a stone slab thing, and I found out what caused it from someone who works with those materials!