I expect creative destruction, like what happened with the dotcom bubble. A ton of GenAI companies will go bust and the market will be flooded with cheap GPUs and other AI hardware, which will be snapped up on the cheap, and enthusiasts and researchers will use them to make actually useful stuff.
These are compute GPUs that don’t even have graphics ports.
Yes, my point is that the compute from those chips can still be used, maybe for actually useful machine learning tools that get developed later, or some other technology that can make use of this kind of parallel computing.
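To make that concrete, here is a minimal sketch of the kind of generic data-parallel job these cards run (plain CUDA; the kernel, sizes, and values are just illustrative, not from any real workload). The same silicon that does model inference will happily do any one-thread-per-element math:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Generic data-parallel kernel: y[i] = a * x[i] + y[i].
// Nothing AI-specific here; any one-thread-per-element job maps onto the GPU the same way.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                     // made-up problem size, purely illustrative
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the example short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected 5.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

Swap the kernel body for a physics update, a signal filter, or a ray intersection test and the launch pattern stays exactly the same.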
I know of at least one company that uses CUDA for ray tracing, for what I believe is ground research, so there are definitely already some useful things happening.
I mean, there are a lot of applications for linear algebra, although I admit I don’t fully know in what way “AI” uses it or which other uses overlap with it.
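Roughly speaking, the way “AI” uses it is that a neural network layer boils down to multiplying a big weight matrix by an input vector, over and over. Here is a rough, illustrative CUDA sketch of that single operation (naive kernel, made-up sizes, no bias or activation), which is the same matrix math that shows up in graphics, simulation, and other number crunching:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// A fully connected layer is essentially y = W * x: one weight matrix times one input vector.
// Naive matrix-vector kernel: each thread computes one output element.
__global__ void matvec(int rows, int cols, const float *W, const float *x, float *y) {
    int r = blockIdx.x * blockDim.x + threadIdx.x;
    if (r < rows) {
        float acc = 0.0f;
        for (int c = 0; c < cols; ++c)
            acc += W[r * cols + c] * x[c];
        y[r] = acc;
    }
}

int main() {
    const int rows = 1024, cols = 1024;  // made-up layer size, purely illustrative
    float *W, *x, *y;
    cudaMallocManaged(&W, rows * cols * sizeof(float));
    cudaMallocManaged(&x, cols * sizeof(float));
    cudaMallocManaged(&y, rows * sizeof(float));
    for (int i = 0; i < rows * cols; ++i) W[i] = 0.001f;
    for (int i = 0; i < cols; ++i) x[i] = 1.0f;

    matvec<<<(rows + 255) / 256, 256>>>(rows, cols, W, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected ~1.024)\n", y[0]);
    cudaFree(W); cudaFree(x); cudaFree(y);
    return 0;
}
```

Real frameworks call tuned libraries like cuBLAS instead of a hand-rolled loop, but the underlying operation is the same, which is why the hardware is useful well beyond GenAI.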
I’m waiting on the A100 fire sale next year.