You heard him, 4090 users: upgrade to a more powerful GPU.

  • Lemminary@lemmy.world · ↑21 ↓1 · 2 years ago

    "We optimized it for the very high end of computers. The issue is your wallet." Kek, mf’ing W.

  • lustyargonian@lemm.ee · ↑20 ↓1 · edited · 2 years ago

    Damn, this is a pathetic response. He could’ve said, “We’ve tried our best to make it as polished as possible before launch, and we’re working towards further optimising it to give you the best experience, wherever you play.” Even if they did jackshit, it wouldn’t have come across as condescending and snarky. Maybe he wasn’t prepared for a tough question on the spot right at the beginning of the interview, but it does show how he thinks about his games. In his mind, the game running at all on PC is optimised enough.

    I’m not saying he’s bad for not making Creation Engine the most optimised engine on the planet; I’m saying he’s bad for not acknowledging that it’s currently among the most demanding engines despite looking merely half as good as Cyberpunk 2077 or, idk, Arkham Knight.

    • Edgelord_Of_Tomorrow@lemmy.world (OP) · ↑11 · 2 years ago

      It’s not even about graphics alone.

      They’re clearly building their games in an extremely inefficient way. Starfield isn’t doing anything that other games with much lower requirements haven’t already done.

      You see evidence of this in their previous games. One of the major performance issues with Fallout 4, for example, was that instead of building their cities in performant ways, they literally plonked every building into the world as an individual asset, which thrashed the CPU for no reason. Modders had to merge them all into one model to significantly improve performance. Their games are full of things like this, and Starfield will be no different.
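The draw-call point above can be made concrete with a toy cost model (all numbers are invented for illustration; this is not Creation Engine code):

```python
# Toy model of why merging static meshes helps: per-draw-call CPU
# overhead dominates when a city is thousands of individually placed
# assets. Both constants below are assumptions, for illustration only.

PER_DRAW_CALL_US = 25   # assumed CPU cost to submit one draw call (microseconds)
BUILDINGS = 2000        # assumed count of individually placed city assets

def cpu_submit_time_us(draw_calls: int) -> int:
    """CPU time spent per frame just submitting draw calls."""
    return draw_calls * PER_DRAW_CALL_US

separate = cpu_submit_time_us(BUILDINGS)  # one call per building
merged = cpu_submit_time_us(1)            # one call for a precombined mesh

print(f"separate assets: {separate / 1000:.1f} ms of CPU per frame")
print(f"precombined:     {merged / 1000:.3f} ms of CPU per frame")
```

The real precombine system also involves culling and material batching, but the draw-call arithmetic alone shows why submitting each building separately hammers the CPU.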

      • Chailles@lemmy.world · ↑1 · 2 years ago

        Unless I’m completely mistaken here, modders didn’t combine the buildings; that’s how they are by default. Mods, however, sometimes needed to break said system, which resulted in massively degraded performance.

        • Edgelord_Of_Tomorrow@lemmy.world (OP) · ↑3 · 2 years ago

          Nah, the Boston performance was terrible in vanilla. The precombination fixes made huge performance improvements. There were issues with mods breaking precombined meshes, but that was a separate problem.

    • Silverseren@kbin.social · ↑6 ↓1 · 2 years ago

      Why would he? Todd hates everyone who plays his games and cares only about separating money from pockets. Fallout 76 made that quite clear to everyone.

    • MonkeyKhan@feddit.de · ↑0 · 2 years ago

      If he gave a standard appeasing PR statement without following it up at all, that would somehow be preferable? This may be snarky, but at least you know what to expect.

      • lustyargonian@lemm.ee · ↑0 · 2 years ago

        I mean, yeah I guess this does help temper expectations that they are done optimising, so maybe you’re right, being blunt is probably for the best.

  • Katana314@lemmy.world · ↑16 ↓2 · 2 years ago

    Since negative opinions travel fast, I’m just gonna say my GPU is actually below the minimum requirements, though admittedly I upgraded my CPU last year. The game’s minimum is a GTX 1070 Ti; I just have a regular GTX 1070.

    In my case, it’s doing a LOT of dynamic resolution and object blurring nonsense to get the game to run smoothly, but it does run smoothly. I get to see the character faces during conversations, I can see what I’m doing, there’s no hitching, etc. New Atlantis looks ugly, but that might change if I get a new GPU.
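The dynamic-resolution behaviour described above is essentially a feedback loop on frame time. A minimal sketch (the target and step values are invented, not the game’s actual tuning):

```python
# Minimal dynamic-resolution controller: shrink the render scale when a
# frame runs over budget, grow it back when there is headroom.
# TARGET_FRAME_MS and the 0.05 step size are illustrative assumptions.

TARGET_FRAME_MS = 16.7  # 60 fps frame budget

def adjust_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the render scale toward the frame-time target, clamped to [0.5, 1.0]."""
    if last_frame_ms > TARGET_FRAME_MS * 1.05:    # over budget: render fewer pixels
        scale -= 0.05
    elif last_frame_ms < TARGET_FRAME_MS * 0.90:  # headroom: sharpen the image
        scale += 0.05
    return max(0.5, min(1.0, scale))

scale = 1.0
for frame_ms in (22.0, 21.0, 19.0, 16.0, 14.0):  # a load spike, then recovery
    scale = adjust_scale(scale, frame_ms)
    print(f"frame {frame_ms:4.1f} ms -> render scale {scale:.2f}")
```

This is why a weak GPU can still feel smooth: the controller trades pixels (hence the blur) for consistent frame times.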

  • m-p{3}@lemmy.ca · ↑10 · 2 years ago

    The missing part is that the user with a 4090 complaining had a CPU from 2017 🥴

    • Vordus@lemmy.blahaj.zone · ↑5 · 2 years ago

      Considering that this thing runs great on a Series S (which is CPU-heavy but has a weak graphics card), that makes so much more sense.

    • Capt. Wolf@lemmy.world · ↑3 ↓2 · 2 years ago

      Yeah, I’m not buying that either. I’m on a 2014 i7 and a 3060, playing on ultra. My sole issue was not running the game on an SSD, which I resolved yesterday. That kid is clearly playing on a potato and lying.

      • NuPNuA@lemm.ee · ↑1 · 2 years ago

        I’m shocked at how many PC users are still running HDDs, given that SSDs have been standard on consoles for three years now.

        • Viking_Hippie@lemmy.world · ↑0 · 2 years ago

          They’ve pretty much been standard for gaming and for holding the OS on PC for 5 years, if not more. HDDs are still good for storage, but only luddites and people trying to save money in the stupidest way would keep their games on them.

      • rambaroo@lemmy.world · ↑0 · edited · 2 years ago

        Playing on ultra on a 3060? So you’re getting 20-30 fps? Because that’s what it gets on mine with a much newer CPU. I had to turn it down to medium-high to average 45 fps.

    • rambaroo@lemmy.world · ↑1 ↓2 · 2 years ago

      Gotta love the Bethesda fanboys upvoting this one cherry-picked comment. There are like 70 comments in there, with all different combos of system specs, complaining about performance.

  • rDrDr@lemmy.world · ↑10 ↓1 · edited · 2 years ago

    It’s perfectly optimized. I’m getting a rock solid 30fps. /s

    Seriously though, I think it’s fine. Especially indoors and in space, it performs well and looks incredible. New Atlantis is kinda ugly and janky though.

    • whats_a_refoogee@sh.itjust.works · ↑2 · 2 years ago

      id Tech is nowhere near flexible enough for something like Starfield, or even Skyrim. That’s partially why it’s so efficient. It simply isn’t fit for the task.

      And the Bethesda developers are intimately familiar with Creation Engine; achieving the same level of productivity with something new would take a long time. Switching engines is not an easy thing.

      Not to say that Creation Engine isn’t a cumbersome mess. It has pretty awful performance and stability and is full of bugs, but on the other hand it’s extremely flexible, which has allowed its games to have massive mod communities.

      • rambaroo@lemmy.world · ↑1 ↓1 · 2 years ago

        If Bethesda can’t take the time to do it, then who can? People act like they’re some small-time developer, but they’re not. They simply refuse to expand their dev team to do things like a redesign.

        Creation Engine is not going to hold up well for another 6 years; there’s no way their cell-loading system will be considered acceptable by the time ES6 comes out. The number of loading screens in Starfield is insane for a modern game. This company badly needs new talent.

    • NuPNuA@lemm.ee · ↑2 ↓1 · 2 years ago

      You realise custom engines are built for specific game types, right? id Tech is great for creating high-fidelity FPS games with linear levels and little environment interactivity. That’s not what Bethesda makes, though.

      • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 🏆@yiffit.net · ↑2 ↓1 · edited · 2 years ago

        They could do everything they usually do, but better, if they used Unreal. They don’t need a custom engine; they just need an engine that isn’t over two decades old with a bunch of shit taped to it to make it look modern. Not to mention, id already built a custom engine that handles much of what Bethesda RPGs do when they made RAGE. They could have used that, with the only issue being learning it. Not sure what their turnover rate is like… maybe they’re just too used to Gamebryo/Creation to be able to switch now. It might take too long to learn anything new. Plus, it would have to come with a toolset; if they didn’t release those easy-to-use modding tools, there’d be rioting in the streets.

        • NuPNuA@lemm.ee · ↑1 ↓1 · 2 years ago

          As far as I know, Bethesda are unusual among modern devs in that they have a small team for the size of game they make, but strong staff retention, so they have huge amounts of institutional knowledge about how they do things. Shifting to a new engine would basically mean starting from scratch at a company level. Unlike Ubisoft or Activision, they can’t just throw several thousand devs at a game to brute-force development either.

          • rambaroo@lemmy.world · ↑3 · edited · 2 years ago

            But that’s their biggest problem. There’s no reason for them to have a small, unchanging team. It’s very, very obvious that they never get an influx of new ideas. Starfield feels like it was made in 2016, and the optimization effort is comically bad. The writing is still mostly boring, campy and naive, like it was written by a 15-year-old Mormon. The facial animations are incrementally better than Fallout’s but still noticeably worse than much older games like The Witcher 3. I could go on.

            It’s not a bad game at all, but it could’ve been so much better if Bethesda execs weren’t greedy cheapasses and the dev team were open to changing their process.

            This is why Bethesda needs to be criticized instead of constantly getting fellated by fanboys. ES6 will be an outdated mess, because Bethesda never sees any feedback except over-the-top praise for half-assing their games.

            • Blue@lemmy.world · ↑1 ↓1 · edited · 2 years ago

              Fanboys downvote you, but you’re right. Even though I love the Fallout franchise: the same gameplay loop, the same engine, potato faces in 2023, outdated animations, etc. Right now I’d prefer Microsoft to force Obsidian to take over the next Fallout, and to ban Todd Howard from ever putting one foot in the dev team, or even the building. He can go fuck himself and his shit engine.

    • Edgelord_Of_Tomorrow@lemmy.world (OP) · ↑3 ↓2 · 2 years ago

      Exactly this. It was only two generations ago that idTech was an open-world engine. id can, and have, made it do whatever they want, and to suggest that with Bethesda money (let alone MICROSOFT money) id couldn’t build a better engine with development workflows similar to Creation’s is just dishonest.

      • NuPNuA@lemm.ee · ↑1 · 2 years ago

        I assume you’re talking about Rage, which had an open-world map but nowhere near the level of simulation systems of a Bethesda game. In fact, I remember back at the time most of us saying the map was pointless, as it was just a way to travel between levels with nothing to do in it.

        • peppersky@feddit.de (banned) · ↑1 ↓1 · 2 years ago

          There are no “levels of simulation systems” in Starfield. NPCs don’t even have schedules in this game, they literally just stand around in the same spot 24/7.

          • NuPNuA@lemm.ee · ↑1 ↓2 · 2 years ago

            It’s still keeping track of lots of variables across a big play space at any time, regardless of NPC schedules.

            They tried schedules once, with Oblivion, and clearly it didn’t add enough to the game and the player’s experience to return to.

            • Cypher@lemmy.world · ↑1 · 2 years ago

              They tried that once with Oblivion

              They advertised that with Oblivion’s AI but never delivered on half the claims.

              Go look at the pre-release claims of the Radiant AI and what was actually delivered.

            • peppersky@feddit.de (banned) · ↑0 ↓3 · 2 years ago

              “keeping track of lots of variables” doesn’t cost CPU time though, since nothing that isn’t on the same map as you is ever relevant for anything. Their engine just fucking sucks.

              • rambaroo@lemmy.world · ↑3 · edited · 2 years ago

                Keeping track of variables doesn’t use CPU time? OK, man. I’m all for hating on Bethesda’s shitty engine, but that’s just not true. At the very least it tracks what NPCs are doing off-screen, which is how they end up at your ship when you tell them to go there. They will actually walk to your ship if you don’t get there first.

                On the other hand, it’s basically guaranteed that Bethesda spent zero effort optimizing that. I bet it’s the same code they ran in Skyrim.
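For the curious: off-screen tracking like this can be nearly free if positions are resolved lazily on demand instead of being simulated every frame. A hedged sketch of the general technique (illustrative only, not Bethesda’s implementation; WALK_SPEED is an assumption):

```python
# Store when an off-screen NPC started walking and how far it must go;
# compute arrival only when something queries it, not every frame.

from dataclasses import dataclass

WALK_SPEED = 2.0  # assumed metres per second

@dataclass
class OffscreenNPC:
    start: float     # game-time the walk began (seconds)
    distance: float  # metres from origin to the player's ship

    def has_arrived(self, now: float) -> bool:
        # O(1) on-demand check: no per-frame CPU cost while unseen
        return (now - self.start) * WALK_SPEED >= self.distance

npc = OffscreenNPC(start=0.0, distance=300.0)
print(npc.has_arrived(now=60.0))   # 120 m covered -> False
print(npc.has_arrived(now=200.0))  # 400 m covered -> True
```

Done this way, an NPC “walking to your ship” costs a couple of floats of state and one comparison when you arrive, which supports the point that the bookkeeping itself is cheap if implemented sensibly.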

      • Dark Arc@social.packetloss.gg · ↑1 · 2 years ago

        It’s a shame idTech is no longer released publicly. It would’ve been amazing to see what people could do with the beast of an engine that powered DOOM Eternal, especially modders.

    • vagrantprodigy@lemmy.whynotdrs.org · ↑0 ↓1 · 2 years ago

      I know they don’t want to switch, but it would be worth making the swap to something like Unreal, even if it takes a few years of customization to get the open-world stuff right. Creation Engine just feels so old.

  • hairinmybellybutt@lemmy.world · ↑7 ↓1 · 2 years ago

    I’m a game developer, and I’m ashamed by this.

    When chip production halts because of the climate, you’ll see programmers optimizing their code again.

    Jeez, I hope this economy crashes.

  • Based on my experience playing the game on an unsupported GPU and still getting a solid 60 fps as long as no NPCs are in the vicinity, I don’t think it’s the GPU side of things that needs optimization. It’s whatever hits the CPU.

    • Kowlown@lemmy.world · ↑4 · 2 years ago

      It’s the CPU. I had to throttle the process to be able to play. This game is a CPU resource hog.

  • vagrantprodigy@lemmy.whynotdrs.org · ↑7 ↓2 · 2 years ago

    Lol, no they didn’t. They didn’t even test adequately; more than a few GPUs that meet the requirements didn’t work when early access launched.

  • MuhammadJesusGaySex@lemmy.world · ↑4 · 2 years ago

    I have a 3080 ti, and a 12700k, and 32 gigs of ddr5, and a 2 terabyte ssd. It runs great for me. I don’t understand the problem. /s

    • MxM111@kbin.social · ↑1 · 2 years ago

      So, this system runs it fine? Good to know. I was worried that my computer would not be able to run it smoothly, but now no worries at all.

      • Krotiuz@kbin.social · ↑1 · 2 years ago

        I’ve got an 8086K and a 3080, running on a 4K screen. With ultra settings and FSR at 80%, I’m getting 35-40 fps, which honestly doesn’t feel too bad. It’s noticeable sometimes, but it’s a lot smoother than the numbers suggest.

        Because my CPU is a little long in the tooth, I’ve probably gone a bit hard on the visuals, but my framerate didn’t improve much by lowering them. The engine itself has never really liked going past 60 fps, so I don’t know why people expected to be able to run 100+ frames at 4K or something.

        • Edgelord_Of_Tomorrow@lemmy.world (OP) · ↑4 · 2 years ago

          Sorry mate, but 35 fps on a 3080 with FSR is just objectively bad performance.

          Starfield isn’t doing anything in terms of graphics or gameplay that other games that run 3-4 times as well aren’t also doing.

          • deranger@sh.itjust.works · ↑0 · 2 years ago

            That’s because they’re CPU limited, mate.

            Easy 1440p60 on ultra everything with no scaling on my 3090. Frequently up in the 80-90 FPS range. This game runs fine. It’s not a “teetering mess” as you say.

              • Krotiuz@kbin.social · ↑1 · 2 years ago

                Completed some testing on my end: using Intel’s PresentMon and sitting at a 35 fps average in New Atlantis, my GPU Busy time is pegged at about 99% of the frame time, so the CPU is barely a factor.

                I do get a bit of a CPU limitation when it’s raining, but nothing significant; it dropped to about 30 fps.

                Trying 1440p with the same settings as the 3090 above got me around 50 fps. 1440p is also almost half the pixels of 80% of 4K, so that’s not helping my GPU much!
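The pixel comparison above can be sanity-checked directly. How close “almost half” is depends on whether FSR’s 80% applies per axis (how render scales usually work) or to the total pixel count; both interpretations are shown below, and either one is an assumption:

```python
# Pixel counts: native 1440p vs 4K rendered at an "80%" FSR scale.

def pixels(w: int, h: int) -> int:
    return w * h

native_1440p = pixels(2560, 1440)                       # 3,686,400 px
per_axis_80 = pixels(int(3840 * 0.8), int(2160 * 0.8))  # 3072 x 1728 = 5,308,416 px
area_80 = int(pixels(3840, 2160) * 0.8)                 # 6,635,520 px

print(f"1440p vs 80%-per-axis 4K: {native_1440p / per_axis_80:.0%}")  # ~69%
print(f"1440p vs 80%-of-area 4K:  {native_1440p / area_80:.0%}")      # ~56%
```

Either way, 1440p is well over half the pixels of the 80%-scaled 4K target, so a 50 fps result there versus 35-40 fps at 4K is roughly in line with the pixel-count difference.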

                • deranger@sh.itjust.works · ↑0 · 2 years ago

                  I’d really not expect the performance difference between a 3090 and a 3080 to be that large, and the only difference I can think of in our systems is the CPU (5800X3D vs 8086K).

                  New Atlantis is a smooth 60+ fps with every setting maxed out at 1440p.

              • deranger@sh.itjust.works · ↑0 · edited · 2 years ago

                Considering that CPU is less powerful than what’s in the Xbox Series S, which does 1080p30, I’m not at all surprised they’re getting a similar frame rate.

                If this was a “teetering mess” you would have heard it in the Gamers Nexus benchmarks. Steve says nothing to this end, and the game benches predictably across the spectrum of hardware tested.

        • avater@lemmy.world · ↑0 · edited · 2 years ago

          For me the game runs pretty well, at ~90+ fps on high with FSR2 enabled.

          5800X3D, 64 gigs of RAM, and a 6900 XT I snagged cheap during the great GPU price collapse. By the looks of the game, that seems pretty reasonable to me.

          • rambaroo@lemmy.world · ↑0 · 2 years ago

            AMD users are having a better time with it, unsurprisingly. I wish I hadn’t gone for Nvidia but too late for that.

            • avater@lemmy.world · ↑0 ↓1 · edited · 2 years ago

              I think there will be patches, and some updates to NVIDIA’s shitty driver, that will fix things in the future :) Otherwise, yeah, maybe get an AMD GPU next time; don’t fall for the NVIDIA marketing. I’ve been using Radeons since the 9800 Pro bundle with Half-Life 2 and have never had any issues with them or their drivers.

              • rambaroo@lemmy.world · ↑2 · edited · 2 years ago

                Hopefully. I’ve always been more of an AMD/ATI fan, but for this laptop the deal worked out better with an Nvidia card. Next time I’m not settling for it: AMD CPU and GPU is the way to go, especially because I’m trying to daily-drive Linux now and the driver side is much, much nicer with AMD.

      • DontMakeMoreBabies@kbin.social · ↑0 · 2 years ago

        I’ve got that but with a 4080 - no issues.

        I admittedly feel like I went full retard on my build and seriously hope these specs aren’t what’s necessary…

        • RogueBanana@lemmy.zip · ↑0 · 2 years ago

          Hey, at least you don’t have to upgrade for a while; you could probably run it for another year if Todd is generous.

    • Piecemakers@lemmy.world · ↑0 · edited · 2 years ago

      You had me in the first sentence, and then I realized it was sarcasm. 🤪 I’m running a similar rig, but it’s primarily for rendering work, etc., so for juuust a second there, I wondered if it was falling behind. 😅🤓