• TheDoctor [they/them]@hexbear.net
    6 months ago

    I hope for everyone’s sake that he knows he’s lying. There will indeed be further optimizations in AI computation energy efficiency. There will eventually be ASICs for training models that use non-standard floating point formats optimized for LLM training, and those will be more energy efficient. But the idea that LLMs, or any near-future iteration on them, will be the catalyst for those optimizations is nonsense.
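
    To be concrete about what a non-standard float format buys you: something like bfloat16 keeps float32’s 8 exponent bits but cuts the mantissa down to 7 bits, so the multiplier hardware can be much smaller and less power-hungry. A rough Python sketch of the idea (real hardware rounds to nearest rather than truncating, and FP8 formats cut even further):

    ```python
    import struct

    def float32_to_bfloat16_bits(x: float) -> int:
        """Keep only the top 16 bits of a float32: same sign and 8-bit exponent,
        mantissa cut from 23 bits to 7. The narrower mantissa is why bfloat16
        multipliers are so much cheaper in silicon than float32 ones."""
        bits32 = struct.unpack(">I", struct.pack(">f", x))[0]  # raw float32 bits
        return bits32 >> 16

    def bfloat16_bits_to_float32(bits16: int) -> float:
        """Widen bfloat16 bits back to a float32 value (low mantissa bits zero-filled)."""
        return struct.unpack(">f", struct.pack(">I", bits16 << 16))[0]

    x = 3.14159265
    y = bfloat16_bits_to_float32(float32_to_bfloat16_bits(x))
    print(x, "->", y)  # 3.14159265 -> 3.140625: precision lost, dynamic range kept
    ```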

      • TheDoctor [they/them]@hexbear.net
        6 months ago

        Even more delusional is the claim about solving advanced maths. There are whole, really intensive areas of computer science dedicated to manipulating mathematical symbols and solving advanced maths. They don’t fall under the umbrella of machine learning, and no number of GPU cores will change that.
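
        For anyone unfamiliar, that’s the territory of computer algebra systems: symbols get manipulated by exact rewrite rules, not by statistical pattern-matching. A quick illustration with SymPy (just one such system, picked here for familiarity):

        ```python
        import sympy as sp

        x = sp.symbols("x")

        # Exact simplification by rule application, no approximation involved
        print(sp.simplify(sp.sin(x) ** 2 + sp.cos(x) ** 2))       # 1

        # Exact symbolic integration and differentiation
        print(sp.integrate(sp.exp(-x ** 2), (x, -sp.oo, sp.oo)))  # sqrt(pi)
        print(sp.diff(x ** 3 * sp.log(x), x))                     # 3*x**2*log(x) + x**2
        ```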