Tux@lemmy.world to Technology Memes@lemmy.world · 10 hours ago
Software: Then vs Now [image post, cross-posted to memes@lemmy.ml]
Mako_Bunny@lemmy.blahaj.zone · 9 hours ago
What Python code runs on a graphics card?

apfelwoiSchoppen@lemmy.world · 8 hours ago
Phyton, not Python. 🙃
BougieBirdie@lemmy.blahaj.zone · 7 hours ago
Python has a ton of machine learning libraries; I'd maybe even go so far as to say it's the de facto standard for developing AI. There are also CUDA libraries that, by design, run code directly on the card.
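To illustrate the point above, here is a minimal sketch of Python code executing on a graphics card, assuming an NVIDIA GPU and the CuPy library (one of the CUDA libraries mentioned) are available:

```python
import cupy as cp  # CuPy: NumPy-compatible arrays backed by CUDA

# Allocate an array directly in GPU memory
x = cp.arange(1_000_000, dtype=cp.float32)

# Element-wise math launches CUDA kernels on the graphics card
y = cp.sqrt(x) + 1.0

# The reduction also runs on the GPU; only the final scalar
# is copied back to the host for printing
print(float(y.sum()))
```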
Tarogar@feddit.org · 8 hours ago
Yes, it's possible, even though it doesn't happen by default. The CPU can be, and often still is, the bottleneck, and you bet you can run shitty code on it.