• RecallMadness@lemmy.nz

      While suing everyone else that makes shovel handles that work with your shovel heads.

    • Baggie@lemmy.zip

      God I hope so, but the next thing will likely be even more stupid than this, the same way this was dumber than NFTs and crypto.

  • dogslayeggs@lemmy.world

    I didn’t know there were that many PC gamers out there. /s

    Seriously, though, the pivot from making video cards to investing in AI and crypto is kinda genius. The crypto thing mostly fell into their laps, but they leaned in. The AI thing, though, I’m not sure how they decided to focus on that or who first pitched the idea to the board, but that was business genius.

    • dkc@lemmy.world

      To your point, when I look at both crypto and AI I see a common theme: they both need a lot of computation, call it supercomputing. Nvidia makes products that provide a lot of compute. Until Nvidia’s competitors catch up, I think they’ll do fine as more applications that require a lot of computation are found.

      Basically, I think of Nvidia as a supercomputer company. When I think of them this way, their position makes more sense.

      • Aceticon@lemmy.world

        Also, those things are highly parallelizable and mainly deal with vector and matrix data, so the same “lots of really simple but fast processing units, optimized for vector and matrix operations, working in parallel” that works fine for modern 3D graphics turns out to also work fine for things like neural networks. In graphics, each point of a frame displayed on the screen can be calculated in parallel with all the other points (in what’s called a fragment shader), most 3D data is made of vectors, and the transforms are matrices. In neural networks, the neurons in each layer are quite simple and can all be processed in parallel (if the architecture weren’t layered like that, GPUs would be far less effective for it).

        To a large extent Nvidia got lucky that the stuff that became fashionable now works by doing lots of simple, highly parallelizable computations; otherwise it would’ve been the makers of CPUs that gained from the rise of this compute-hungry tech.
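
        To make that concrete, here’s a minimal NumPy sketch (the layer sizes, names and data are invented for illustration) of why a neural-network layer maps so well onto this hardware: every output neuron is an independent dot product, so the whole layer collapses into one matrix multiply that a GPU can spread across thousands of cores.

        ```python
        import numpy as np

        rng = np.random.default_rng(0)

        # A dense layer: 512 input features -> 256 output neurons.
        # W holds one weight row per output neuron; b is the bias vector.
        W = rng.standard_normal((256, 512)).astype(np.float32)
        b = rng.standard_normal(256).astype(np.float32)
        x = rng.standard_normal(512).astype(np.float32)  # one input vector

        # Each of the 256 outputs depends only on W, b and x, never on the
        # other outputs, so all of them can be computed simultaneously.
        y = np.maximum(W @ x + b, 0.0)  # matrix-vector product + ReLU

        print(y.shape)  # (256,)
        ```

        The fragment-shader case is the same shape of problem: one small, independent computation per pixel instead of per neuron.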

    • kromem@lemmy.world

      They were doing that for years before it became popular. The same tech for video graphics just so happened to be useful for AI and big data; they doubled down on supporting enterprise and research efforts back when it was a tiny field, before their competitors did, and continued to specialize as it grew.

      Supporting niche uses of your product can sometimes pay off if that niche hits the lottery.

      • webghost0101@sopuli.xyz

        Hardware made for heavy computing being good at stuff like this isn’t all that shocking, though. The biggest gamble is whether a new technology will take off at all. Nvidia, just like Google, has the capital to diversify: bet on all the horses at once and drop the losers later.

    • chrash0@lemmy.world

      same as with crypto. the software community started using GPUs for deep learning, and they were just meeting that demand

    • RecallMadness@lemmy.nz

      They were first to market with a decent GPGPU toolkit (CUDA) which built them a pretty sizeable userbase.

      Then when competitors caught up, they made it as hard as possible to transition away from their ecosystem.

      Like Apple, but worse.

      I guess they learned from their gaming heyday that not controlling the abstraction layer (e.g. OpenGL, DirectX, etc.) means they can’t do lock-in.
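
      To illustrate the lock-in concretely, here’s a hedged sketch using Numba’s CUDA bindings in Python (the `saxpy` kernel and all names are my own illustration, not from any Nvidia docs; running it needs an Nvidia GPU and the CUDA toolkit). Code written against CUDA like this runs only on Nvidia hardware, and a codebase full of such kernels is exactly the switching cost that keeps users in the ecosystem:

      ```python
      import numpy as np
      from numba import cuda  # Numba's CUDA bindings: Nvidia GPUs only

      @cuda.jit
      def saxpy(a, x, y, out):
          # Each GPU thread computes one element of the result.
          i = cuda.grid(1)
          if i < x.size:
              out[i] = a * x[i] + y[i]

      n = 1 << 20
      x = np.ones(n, dtype=np.float32)
      y = np.full(n, 2.0, dtype=np.float32)
      out = np.zeros_like(x)

      threads_per_block = 256
      blocks = (n + threads_per_block - 1) // threads_per_block
      saxpy[blocks, threads_per_block](3.0, x, y, out)  # CUDA launch syntax

      print(out[:4])  # [5. 5. 5. 5.]
      ```

      Porting one kernel like this to ROCm/HIP or SYCL is easy; porting thousands of them, plus the libraries built on top, is not.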

    • slacktoid@lemmy.ml

      To their credit they’ve been pushing GPGPUs for a while. They did position themselves well for accelerators. Doesn’t mean they don’t suck.

    • swayevenly@lemm.ee

      DLSS was a necessity to deliver performance gains at a pace their raw hardware couldn’t keep up with.

  • zewm@lemmy.world

    All that value and they still can’t get their video cards to work worth a shit in Linux.

    • DaPorkchop_@lemmy.ml

      Why does everyone always complain about Nvidia support on Linux? I’ve been using Nvidia GPUs on Ubuntu and Debian for years and it has never required any more effort than ‘sudo apt install nvidia-driver’.

      • zewm@lemmy.world

        It’s not difficult to install the drivers. I recently had to swap out my 3090 for an AMD card because Wayland just crashes and works poorly with Nvidia.

        • TheGrandNagus@lemmy.world

          You should probably rephrase that to say Nvidia crashes and works poorly with Wayland.

          Saying Wayland works badly with Nvidia is a bit like saying Linux doesn’t support Photoshop, rather than the other way around.

            • TheGrandNagus@lemmy.world

              Not really, the wording completely changes who is at fault.

              When you say Wayland doesn’t work for Nvidia, it’s blaming Wayland, but Linux/Wayland isn’t at fault here; Nvidia is, for providing drivers that aren’t fit for purpose.

              If Nvidia drivers broke on Windows, nobody would say “Windows is broken for Nvidia”, they’d say the opposite, but with Linux we act like the problem is Wayland, for some reason.

      • bitwolf@lemmy.one

        In my experience, newer kernels plus Wayland plus Nvidia is a huge mess.

        I switched to AMD and have had 0 downtime, all the cool features nvidia touts, and fully working Wayland with no effort at all.

    • Victor@lemmy.world

      I’ve been using a 2080 Super since 2020 and it’s been mostly gravy. Granted, I’ve not been using anything Wayland-related. But I’m gaming on Steam and shit and it works wonderfully. Better performance than on Windows. Though there is some slight audio delay. A few milliseconds over Windows.

      I’ve been looking to switch to Hyprland, but it was a bit glitchy with gaming and screen sharing sometimes, so I’m holding off on that until I jump over to the AMD ship. It’ll be sweet.

    • mal3oon@lemmy.world

      What card are you using? Their Linux support over the past few years has been impressive. They even have open source drivers now (still beta). And thanks to Proton, gaming is seamless on Linux. I don’t see the issue you’re describing?

      • zewm@lemmy.world

        I was using a 3090 but had to swap to an AMD card due to too many crashes and visual glitches/artifacts.

    • dev_null@lemmy.ml

      I’ve been using Nvidia cards on Linux for many years and never had issues. I did have issues with the laptop cards (Optimus switching), but on the desktop it was always flawless for me.

      • accideath@lemmy.world

        I mean, they work. But the drivers aren’t as feature-complete as AMD’s or Intel’s. Wayland support was a strict no until very recently, gamescope support is still very hit-and-miss, and they’re less stable than the competition. They’re completely usable though. My 1650 runs well, most of the time.

        • dev_null@lemmy.ml

          When I was in the market for a new card 2 years ago I looked into AMD, but learned that they don’t work as well as Nvidia for GPU passthrough to VMs, which I need to work. I’d love to switch because Nvidia is a shit company, but AMD GPUs just don’t work for my use case.

          I’m curious though because I don’t know what I’m missing. What are the features in AMD drivers that make it more complete?

          • accideath@lemmy.world

            As I said, AMD works much better with Wayland and gamescope, and thus has, for example, HDR and VRR support. Besides that, their Linux drivers are open source and more stable.

            But to my knowledge, AMD GPUs pass through just fine to VMs? What was your problem with them?

            • dev_null@lemmy.ml

              Do many distros use Wayland now? I use Kubuntu and it doesn’t, so that probably explains why I never ran into any issue with that. Gamescope looks like some Wayland tool too from what I see. I don’t have an HDR monitor either. Looks like good stuff, that I just never needed so never noticed it not working.

              But to my knowledge, AMD GPUs pass through just fine to VMs? What was your problem with them?

              I asked on the VFIO subreddit back then and was told AMD cards have a bug where you have to restart the PC to switch between host and VM (which makes it no better than dual-booting, since you have to restart anyway); this was not the case on Nvidia.

              So now that Nvidia has open source drivers and works on Wayland, what’s the difference? Just gamescope?

      • zewm@lemmy.world

        I guess you aren’t using Wayland. It’s abysmal with Wayland. Especially Electron apps. They just flicker and crash.

          • lud@lemm.ee

            Pretty sure Wayland is installed by default and maybe even enabled by default on new installs.

            On the login screen there should be a button to switch between x11 and Wayland.

            • dev_null@lemmy.ml

              Oh yeah there is a button to switch on the login screen, but X11 is the default and I never saw a reason to switch the default.

              • lud@lemm.ee

                Personally, I wouldn’t say “it doesn’t use Wayland” when it absolutely can with a single mouse click and it works great.

  • frezik

    Last year’s Nvidia keynote at Computex had Jensen trying to get the audience to join an awkward, AI-generated sing-along. The market thought this was great and sent the market cap over $1T.

    For this year’s keynote, Jensen wandered the stage like he was looking for his cat while rambling about language models. The market thinks this is great and sent the market cap over $3T.

    For the CEO of the second biggest company on Earth, he is a shockingly bad speaker and completely ill-prepared. For some reason, the market loves this guy.

    • Flying Squid@lemmy.world

      Is it that the market loves him or is it that a CEO’s keynote isn’t really that big a deal and is mostly an ego-stroking event?

      Because I’m guessing what the market actually loves is the new products that are announced.

      • frezik

        That’s the thing: no new products were announced.

        • bitwolf@lemmy.one

          For consumers. They’re pushing out giant power-hungry GPUs for data centers to power LLMs.

          Most of the valuation is likely consumers hyping the bull run, plus speculation about just how much B2B revenue they will get.

          • frezik

            They didn’t, though. Blackwell was announced before this, and there weren’t any real specifics beyond showing some prototypes. There was some software stuff about improving Pandas and pregenerated LLMs. That’s about it.

          • jj4211@lemmy.world

            This weekend I proposed to my girlfriend, here’s what it taught me about B2B sales…

      • mal3oon@lemmy.world

        Their main growth driver is data centers; when that demand dries up within 2 years, a bubble will pop. Especially if the underlying neural-network architectures change, the need for this kind of high-performance hardware will decrease.

        • frezik

          See also: Sun Microsystems, who made tons of servers that drove the dotcom boom. They didn’t fare so well afterward.

          This is a “grab the pile of cash and be happy” situation.

        • Aux@lemmy.world

          CUDA has a lot of applications outside of AI. They’ll just refocus on the next bubble and will continue hoarding wealth.

          • lud@lemm.ee

            No, not really: the GPUs that datacenters are buying don’t even have display outputs, so they’re useless to the vast majority of home users.

            If you run a home lab then maybe.

          • Aceticon@lemmy.world

            Well, there was a period after the last Bitcoin bubble burst when the best way to get a good graphics card for cheap was buying a used one from the Bitcoin mining operations that were closing down.

  • flop_leash_973@lemmy.world

    The real game now is how long it will last before the hype dies and the floor falls out of “AI”, taking a good chunk of their stock gains with it.

    • bamboo@lemm.ee

      I don’t think generative AI is going anywhere anytime soon. The hype will eventually die down, but it’s already proved its usefulness in many tasks.

      • Neshura@bookwormstory.social

        Is AI useful? Maybe. But is it profitable? AI will go the same way the dotcoms did: there will be a massive crash, and at the end of it you’ll see who actually had their pants on.

        • Nighed@sffa.community

          Nvidia IS making a profit on it though. It’s the whole “in a gold rush, sell shovels” thing.

          • Neshura@bookwormstory.social

            My point is more that their revenue stream will temporarily take a giant hit during that: while everyone is busy going bankrupt, the few AI companies that do make a profit with it will have better things to do than buy new accelerators right that instant.

        • bamboo@lemm.ee

          It can be quite profitable. A ChatGPT subscription is $20/month right now, or $240/year. A software engineer in the US costs between $200k and $1m a year with all benefits and support costs considered. If that $200k engineer can use ChatGPT to save 2.5 hours in a year, then it pays for itself.
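
          A quick sanity check of that arithmetic (a throwaway sketch; the salary and hours are the figures quoted above, not real market data):

          ```python
          # Break-even math for a ChatGPT subscription vs. engineer time.
          subscription_per_year = 20 * 12              # $240/year
          engineer_cost_per_year = 200_000             # low end of the quoted range
          working_hours_per_year = 52 * 40             # 2080 hours
          cost_per_hour = engineer_cost_per_year / working_hours_per_year  # ~$96

          breakeven_hours = subscription_per_year / cost_per_hour
          print(f"break-even at {breakeven_hours:.1f} hours saved per year")
          # -> ~2.5 hours, matching the claim above
          ```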

          • Neshura@bookwormstory.social

            It’s quite funny that you think ChatGPT is making a profit on that $20 subscription if it’s replacing a software dev.

            The bust won’t be because it’s not profitable to use AI, but because the companies selling the service cannot do so at rates that are both profitable and actually marketable. Case in point: OpenAI has not made a single cent of profit so far (or at least has not reported one). The way AI is currently shoved in everywhere is not sustainable, because most of these new platforms cannot recoup the cost of running an AI model.

            • bamboo@lemm.ee

              OpenAI is a non-profit. Further, US tech companies usually take many years to become profitable. It’s called reinvesting revenue; more companies should be doing that instead of stock buybacks.

              Let’s suppose hosted LLMs like ChatGPT aren’t financially sustainable and go bust, though. As a user, you can also just run them locally, and as smaller models improve, this is becoming more and more popular. It’s likely how Apple will be integrating LLMs into their devices, at least in part, and Microsoft is going that route with “Copilot+ PCs” that start shipping next week.

              Integration aside, you can run 70B models on an overpriced $5k MacBook Pro today that are maybe half as useful as ChatGPT. The cost to do so exceeds the cost of a ChatGPT subscription, but to use my numbers from before, a $5k MacBook Pro running Llama 3 70B would only have to save an engineer one hour per week to pay for itself in the first year. In subsequent years only the electricity costs would matter, which for a current-gen MacBook Pro would be about equivalent to the ChatGPT subscription in expensive energy markets like Europe, or half that or less in the US.

              In short, you can buy overpriced Apple hardware to run your LLMs, do so with high energy prices, and it’s still super cheap compared to a single engineer such that saving 1 hour per week would still pay for itself in the first year.
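
              The same back-of-the-envelope check works for the local-hardware route (again, only the figures quoted in this thread, not real benchmarks or prices):

              ```python
              # Does a $5k machine for local LLMs pay for itself in year one?
              hardware_cost = 5_000                  # the hypothetical MacBook Pro
              cost_per_hour = 200_000 / (52 * 40)    # same ~$96/h engineer as before

              hours_to_break_even = hardware_cost / cost_per_hour  # ~52 hours
              print(f"{hours_to_break_even:.0f} hours, i.e. "
                    f"{hours_to_break_even / 52:.1f} hour(s)/week for a year")
              # -> ~52 hours total, about one saved hour per week, as claimed
              ```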

              • Neshura@bookwormstory.social

                Yeah, I don’t know why you keep going on about people using AI when my point was entirely that most of the companies offering AI services don’t have a sustainable business model. Being able to do that work locally, if anything, strengthens my point.

          • frezik

            I’ve seen pull requests filled with ChatGPT code. I consider my dev job pretty safe.

            • bamboo@lemm.ee

              ChatGPT isn’t gonna replace software engineers anytime soon. It can increase productivity, though; that’s the value LLMs provide. If someone made a shitty pull request filled with obvious ChatGPT output, that’s on them, not the technology. Blaming ChatGPT for a programmer’s bad code is like blaming the autocomplete in their editor: just because the editor suggests it doesn’t mean you have to, or should, accept it if it’s wrong.

    • Damage@feddit.it

      Well, they also make good silicon that is apparently useful for many different things, and that may not change… If it’s good for the next fad as well, they’ll just stay on top.

  • MystikIncarnate@lemmy.ca

    I feel like the executives are all in this “AI” echo chamber. Like, most people grossly misunderstand what AI is, what it does and what it cannot do, with current tech… And all the execs are sitting around in a circle jerk making up solutions using AI, for which there is no problem to solve.

    Don’t get me wrong, some companies are doing cool shit with it. Not necessarily practical shit, but cool nonetheless. Other companies just seem to be drinking the AI Kool-Aid and throwing it at fucking everything for no goddamned reason just to get in on the hype. Investors are close behind, trying to ride the coattails of their “success” to riches, and it’s all just a self-reaffirming system with no basis in reality.

    Nvidia is the one profiting here; all this AI smoke and mirrors needs something to run on top of, and they’re selling the physical tools to make it go. Whether it goes somewhere useful or drives off a goddamned cliff doesn’t matter to Nvidia in the slightest. They made their money. Get wrecked.

  • starman@programming.dev

    I wonder why AMD stock hasn’t really gone up significantly (500% vs Nvidia’s 3000% over the last five years). They make GPUs too.

    • GoodEye8@lemm.ee

      Because of a lot of things. On the graphics side, RTX and DLSS left AMD catching up (even if RTX isn’t really that big of a deal now), then there was Nvidia cards being better at crypto mining, and now it’s Nvidia cards being better at AI computation plus Nvidia pivoting into the AI hardware space…

      If you want to boil it down to the undeniable, it’s that Nvidia is just better at marketing. Everyone knows what Nvidia is doing. What is AMD doing? Besides playing catch-up to Nvidia.

    • lorty@lemmy.ml

      This is all AI hype, in which Nvidia is sadly far ahead of its competitors.

      • guacupado@lemmy.world

        Problem is, that’s why they’re jacking up their prices and pumping out GPUs so quickly that a good chunk of them are DOA, and their customer service sucks too. No one likes dealing with Nvidia on any level, which is why everyone is making their own ASICs to get away from having to buy Nvidia GPUs.