• Buffalox@lemmy.world · 2 months ago

    Back in 2017 I predicted a stock price over $100 when that happened.
    It took about 3-4 years longer than expected, but congratulations to AMD on their successful fight back from the brink of bankruptcy.

    • Snot Flickerman@lemmy.blahaj.zone · 2 months ago

      Not to diminish the hard work AMD has put in, but this is at least partially down to Intel’s ongoing issues with quality assurance (or rather, the lack of it); it’s arguable that AMD holds its stronger position partly because of Intel’s weakness over the last 10 years.

      • Fubarberry@sopuli.xyz · 2 months ago

        Having a usable product while your opponents continually shoot themselves in the foot is a viable market strategy.

          • frezik · 2 months ago

            Sony is also really good at this. With the PS2 against the Dreamcast, they walked on stage, said “$299”, and walked off. Later, the PS3 was struggling against the XB360, but then the Red Ring of Death issues popped up and Sony pulled way ahead. Microsoft then tried a bunch of Kinect crap with the next generation, and Sony said “do you want to play games? Buy a PS4. It will play games” and won that generation outright.

            Tons of other problems with Sony, but they are masters of taking advantage of competitors’ mistakes.

      • Buffalox@lemmy.world · 2 months ago

        Absolutely. If Intel hadn’t been resting on their laurels for 5 years on desktop performance, and had made 6- and 8-core CPUs themselves before Ryzen arrived, Ryzen would not have been nearly as successful. This was followed by the catastrophic Intel 10nm fab failures, which allowed AMD to stay ahead even longer.

        So absolutely, AMD has been helped a lot by Intel failing to react in time, and then failing in execution when they did react.
        Still, I think congratulations are in order, because Ryzen was such a huge improvement on desktop and server that they absolutely deserve their success. Threadripper was icing on the cake, and completely trashed Intel in the workstation segment.

        And AMD exposed Intel’s weakness in the face of real competition. Arm and Nvidia had already done that in their respective areas, but AMD did it in Intel’s core business.

        • aard@kyu.de · 2 months ago

          For people who weren’t looking for a developer workstation back then: Threadripper suddenly brought the performance of a Xeon workstation costing more than $20k for just a bit over $2k.

          That suddenly wasn’t a “should I really invest that much money?” situation, but an “I’d be stupid not to; the productivity increase will pay for it over the next month or so” one.

          • boonhet@lemm.ee · 2 months ago

            > productivity increase will pay for that over the next month or so

            Found the fellow Rust developer

            Cargo build universe

        • Snot Flickerman@lemmy.blahaj.zone · 2 months ago

          For sure. And as someone who has been stuck running Linux on an Intel box after being spoiled by all-AMD for about 6 years, I gotta say: the fact that a lot of AMD stuff “just works” in Linux, while you have to jump through hoops for the same from Intel, is probably a big reason they’re picking up in datacenters, too. Datacenters don’t usually run on fucking Windows Server; they usually run Linux, and AMD just plays better with Linux at the moment. (In my personal experience, anyway.)

          • Buffalox@lemmy.world · 2 months ago

            Yes, this too is a real turnaround compared to the “old times”. Intel used to be the safe choice; that’s definitely not the case anymore.

        • frezik · 2 months ago

          Their entire architecture also seems to be just plain behind now. The Ultra 2xx series of processors is not only on TSMC, but on a better node than AMD is using for the Ryzen 9000 series. You wouldn’t know it from the benchmarks, though, in either performance or efficiency.

      • frezik · 2 months ago

        Their market caps crossed well before the 14th-gen issues. Intel seems to be rushing things specifically because they’re trying to catch up to AMD, and is sacrificing too much to get there.

    • Shadywack@lemmy.world · 2 months ago

      What the fuck??? Insert Jumanji meme “What year is it?”

      Numbers check out too. Wintel, slayed, and we didn’t even notice.

      • palordrolap@fedia.io · 2 months ago

        The whole ring -3 / MINIX business a while back put a serious amount of FUD into the market and Intel has been on the wane ever since.

        This is not necessarily unfounded FUD either. MINIX is literally there, lurking inside all modern Intel processors, waiting to be hacked by the enterprising ne’er-do-well. (NB: This is not to say that there aren’t ways to do similar things to AMD chips, only that MINIX is not present in them, and it’s theoretically a lot more difficult.)

        Then bear in mind that MINIX was invented by Andrew Tanenbaum, someone Linus Torvalds has had disagreements with in the past (heck, Linux might not exist if not for MINIX and Linus’ dislike of the way Tanenbaum went about it), and so there’s an implicit bias against MINIX in the data-centre world, where Linux is far more present than it is on the desktop.

        Thus, if you’re a hypothetical IT manager and you’re going to buy a processor for your data-centre server, you’re ever so slightly more likely to go for AMD.

        • frezik · 2 months ago

          Note that Linus’ disagreement was largely over design decisions and microkernel stuff. Linus actually respects Tanenbaum a great deal; Tanenbaum’s book on operating systems is a CS classic and was a direct influence on the young Linus.

          • palordrolap@fedia.io · 2 months ago

            Pretty sure my own education had a Tanenbaum book in amongst it, from which I learned a number of things. In another world, one where my brain isn’t its own worst enemy, I could well be one of those IT managers. There the FUD would have been the main factor in my decision. Probably. Because I’m not sure I’d be completely happy if it was a Linux buried in the chipset either. Especially one largely outside my control.

        • Laser@feddit.org · 2 months ago

          I’d guess this is less about MINIX vs. Linux and more about ultimately having 0 control over or insight into it.

    • ripcord@lemmy.world · 2 months ago

      Their P/E is 125

      One fucking hundred and twenty five.

      That’s more than twice Nvidia’s. It’s completely disconnected from reality.

  • schizo@forum.uncomfortable.business · 2 months ago

    Granite Rapids is probably going to win some of that back: a lot of the largest purchasers of x86 chips in the datacenter were buying Epycs because you could stuff more cores into a given amount of rack space than you could with Intel, but the Granite Rapids parts have flipped that back the other way.

    I’m sure AMD will respond with EVEN MORE CORES, and we’ll just flop around with however many cores you can stuff into $15,000 CPUs and thus who is outselling whom.

    • aard@kyu.de · 2 months ago

      It’s not just cores: it’s higher performance per rack unit while keeping power consumption and cooling needs the same.

      That allows rack performance upgrades without expensive DC upgrades, and AMD has been killing dual- and quad-socket systems from Intel with single- and dual-socket Epycs ever since launch. Their 128-core part has a bit too high a TDP, but pick a slightly lower core count and you can still run it in a rack configured for the power and cooling needs of over a decade ago.

      Granite Rapids has too high a TDP for that: you either upgrade your DC, or accept lower performance per rack unit.

    • frezik · 2 months ago

      It’s not just performance, though. It’s also trust. If performance per watt were all that mattered, AMD would have cornered the server market years ago. Intel held on because they were considered rock-solid stable, which is very important in a server. That trust was completely broken by the recent instability issues.

      • schizo@forum.uncomfortable.business · 2 months ago

        I didn’t think the consumer-level chip immolation carried over to their Xeons?

        If it did, holy crap, they’re mega-ultra-turbo-plaid levels of screwed.

        • frezik · 2 months ago

          Not quite that; it’s more that the entire thing calls Intel’s competence into question.