• Saik0@lemmy.saik0.com · ↑138 · 6 months ago

To me this is even worse, though. They’re using your electricity and CPU cycles to grab the data they want, which lowers their bandwidth bills.

      It happening “locally” while still sending all the metadata home is just a slap in the face.

    • 👍Maximum Derek👍@discuss.tchncs.de · ↑38 ↓4 · 6 months ago

      That’s a pretty big joke, but I think the bigger joke is calling LLMs AI. We taught linear algebra to talk real pretty and now corps want to use it to completely subsume our lives.

      • grue@lemmy.world · ↑14 ↓3 · 6 months ago

        I think the bigger joke is calling LLMs AI

        I have to disagree.

        Frankly, LLMs (which are based on neural networks) seem a Hell of a lot closer to how actual brains work than “classical AI” (which basically boils down to a gigantic pile of if statements) does.

        I guess I could agree that LLMs are undeserving of the term “AI”, but only in the sense that nothing we’ve made so far is deserving of it.

          • grue@lemmy.world · ↑8 ↓1 · edited · 6 months ago

            I’m not talking about interacting with it. I’m talking about how it’s implemented, from my perspective as a computer scientist.

            Let me say it more concretely: if even shitty expert systems, which are literally just flowcharts implemented in procedural code, are considered “AI” – and historically speaking, they are – then the bar is really fucking low. LLMs, which at least make an effort to kinda resemble the structure of biological intelligence, are certainly way, way above it.
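The “flowchart implemented in procedural code” style of a classic expert system can be sketched in a few lines. This is purely illustrative — the rules and names below are made up, not from any real system:

```python
# A toy rule-based "expert system": every branch is one hand-written rule,
# exactly like a node in a flowchart. (Hypothetical rules for illustration.)
def classify(has_feathers: bool, can_fly: bool, lays_eggs: bool) -> str:
    if has_feathers:
        if can_fly:
            return "bird"
        return "flightless bird"
    if lays_eggs:
        return "reptile (probably)"
    return "mammal (probably)"

print(classify(True, True, True))   # bird
```

All of the “knowledge” lives in hand-coded conditionals; nothing is learned from data — which is why the bar for historical “AI” is so low.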

            • degen · ↑2 · 6 months ago

              I’m actually sad that the state of AI deserves the hate it gets. Neural networks are so sick, just going through the example of detecting a diagonal on a 2x2 grid was like magic to me. And they made me second guess simulation theory for quite a while lmao
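For anyone curious, the 2x2 diagonal detector mentioned above can be wired up by hand in a few lines. This is a sketch with hand-picked weights (not trained), treating the grid as four binary pixels:

```python
# Tiny hand-wired neural net that fires on the two diagonal patterns of a
# 2x2 binary grid. Weights are chosen by hand for illustration, not learned.
def step(x: float) -> int:
    # Threshold activation: 1 if the weighted sum is positive, else 0.
    return 1 if x > 0 else 0

def is_diagonal(px: list) -> int:
    # px = [top_left, top_right, bottom_left, bottom_right]
    # Hidden unit 1 fires only on the main diagonal [1, 0, 0, 1].
    h1 = step(px[0] + px[3] - px[1] - px[2] - 1.5)
    # Hidden unit 2 fires only on the anti-diagonal [0, 1, 1, 0].
    h2 = step(px[1] + px[2] - px[0] - px[3] - 1.5)
    # Output unit ORs the two detectors together.
    return step(h1 + h2 - 0.5)

print(is_diagonal([1, 0, 0, 1]))  # 1 (main diagonal)
print(is_diagonal([1, 1, 0, 0]))  # 0 (top row, not a diagonal)
```

Two hidden units and one output unit are enough to separate the diagonals from the other fourteen patterns — the kind of thing that does feel like magic the first time you see it.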

              Tangentially, blockchain was a similar phenomenon for me. Or at least trust networks. One idea was to just throw away Certificate Authorities. Basically federate all the things, and this was before we knew about the fediverse. It gets all the hate because of crypto, but it’s cool tech. The CA thing would probably lead to a bad place too, though.

    • Aniki 🌱🌿@lemm.ee · ↑19 ↓1 · edited · 6 months ago

      Runs locally, mirrors remotely.

      To ensure a seamless customer experience when their hardware isn’t capable of running the model locally or if there is a problem with the local instance.

      microsoft, probably.