• Aurenkin@sh.itjust.works
    2 months ago

    I think the big turning point for that could be the ability to run some advanced models (by today’s standards) on device. Would definitely unlock some pretty cool use cases.

    • AA5B@lemmy.world
      2 months ago

      I know we’re supposed to hate Apple here, but this is a big reason I’m excited about the upcoming event. I really like their path of on-device AI. I’ve been reading some of their case studies on making models work in limited-memory situations, and their SoCs already include multiple specialized processing units that you can imagine being extended to support on-device AI. Now let’s find out what they can deliver.

      • Aurenkin@sh.itjust.works
        2 months ago

        Yes, Google has also moved in this direction with Tensor and Gemini Nano. I expect to see a lot more movement here over the next few years, since there’s also a big financial incentive to offload all that compute cost onto devices.