ChatGPT use declines as users complain about ‘dumber’ answers, and the reason might be AI’s biggest threat for the future::AI for the smart guy?

  • nottheengineer@feddit.de
    1 year ago

    It definitely got dumber. I stopped paying for Plus because the current GPT4 isn’t much better than the old GPT3.5.

    If you check downdetector.com, it’s obvious why they did this. Their infrastructure just couldn’t keep up with the full-size models.

    I think I’ll get myself a proper GPU so I can run my own LLMs without worrying that they could stop working for my use case.

    • anlumo@feddit.de
      1 year ago

      GPT4 needs a cluster of around 100 server-grade GPUs that cost more than $20k each. I don’t think you have that lying around at home.

      • nottheengineer@feddit.de
        1 year ago

        I don’t, but a consumer card with 24GB of VRAM can run a model that’s about as powerful as the current GPT3.5 in some use cases.

        And you can rent some of that server-grade hardware for a short time to do fine-tuning, which lets you surpass even GPT4 in some niches.
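The 24GB claim checks out on the back of an envelope: a quantized model’s weights need roughly parameters × bits-per-weight ÷ 8 bytes of memory. A minimal sketch of that arithmetic (the 33B parameter count and 4-bit quantization are illustrative assumptions, not a specific model, and this ignores KV-cache and activation overhead):

```python
def weight_vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Rough GiB of VRAM needed just to hold a model's quantized weights."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1024**3

# A hypothetical 33B-parameter model quantized to 4 bits per weight:
print(round(weight_vram_gb(33, 4), 1))  # ~15.4 GiB, fits on a 24 GB consumer card

# The same model at full 16-bit precision would not fit:
print(round(weight_vram_gb(33, 16), 1))  # ~61.5 GiB
```

By the same math, even aggressive quantization won’t squeeze a GPT4-scale model onto one card, which is why the fine-tune-a-smaller-model route matters for niche use cases.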