ChatGPT is just 1 year old. And only recently has it become feasible to run LLMs locally on-device. Apple couldn’t really have planned for it hardware-wise.
A common demand of these LLMs and Stable Diffusion models is large amounts of memory, which Apple has always been stingy with. Only with the newer hardware do I foresee them increasing base memory, if they want to make AI more accessible.