Humans are “trained” with maybe ten thousand “tokens” per day
Uhhh… you may wanna rerun those numbers.
It’s waaaaaaaay more than that lol.
and take only a couple dozen watts for even the most complex thinking
Mate’s literally got smoke coming out of his ears lol.
A single Wh is 860 calories…
I think you either have no idea wtf you are talking about, or you just made up a bunch of extremely wrong numbers to try and look smart.
Humans will encounter hundreds of thousands of tokens per day, ramping up to millions in school.
A human, by my estimate, has burned about 13,000 Wh by the time they reach adulthood. Maybe more depending on activity levels.
While yes, an AI costs substantially more Wh, it also is done in weeks, so it’s obviously going to be way less energy efficient due to the exponential laws of resistance. If we grew a functional human in like 2 months it’d prolly require way WAY more than 13,000 Wh during the process for similar reasons.
Once trained, a single model can be duplicated infinitely. So it’d be more fair to compare how much millions of people cost to raise, compared to a single model to be trained. Because once trained, you can now make millions of copies of it…
Operating costs are continuing to go down and down and down. Diffusion-based text generation just made another huge leap forward, reporting around a twentyfold efficiency increase over traditional GPT-style LLMs. Improvements like this are coming out every month.
True, my estimate for tokens may have been a bit low. Assuming a 7 hour school day where someone talks at 5 tokens/sec you’d encounter about 120k tokens. You’re off by 3 orders of magnitude on your energy consumption though; 1 watt-hour is 0.86 food Calories (kcal).
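The arithmetic in that last reply can be sanity-checked. A minimal sketch, assuming a 7-hour school day, ~5 tokens/sec of heard speech, and ~2,000 kcal/day of human energy use (the kcal/day figure is an assumption, not from the thread):

```python
# Rough sanity check of the numbers in this thread (estimates, not measurements).

WH_PER_KCAL = 1 / 0.86  # 1 Wh = 0.86 food Calories (kcal), so ~1.16 Wh per kcal

# Token estimate: 7-hour school day at ~5 tokens/sec of heard speech
tokens_per_day = 7 * 3600 * 5
print(tokens_per_day)  # 126000, i.e. roughly 120k tokens

# Human energy to adulthood: assumed ~2000 kcal/day over 18 years
kcal_per_day = 2000
wh_to_adulthood = kcal_per_day * WH_PER_KCAL * 365 * 18
print(round(wh_to_adulthood))  # ~15.3 million Wh, i.e. ~15,000 kWh

# Ratio against the 13,000 Wh figure claimed upthread
print(round(wh_to_adulthood / 13_000))  # ~1175x, about 3 orders of magnitude
```

So the "off by 3 orders of magnitude" claim holds under these assumptions: the 13,000 Wh estimate appears to come from treating 1 Wh as 860 kcal instead of 860 small calories (0.86 kcal).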