This feels like the discussion from when Fallout 4 was released. People were getting 15 fps in locations like the outdoor cities, and Bethesda blamed it on their PCs.
(Dev computers are $4000 top-of-the-line systems)
(Testing computers are the $4000 top-of-the-line systems they used last year)
The answer is likely just that Bethesda's coders are bad at optimization. They don't have to put out a quality product, so they never learn how to make one.
Yeah, when Fallout 4 released I was getting sub-30 fps in city areas. It was atrocious. I haven't had anything like that with Starfield, though. Seems hardware-specific.
How did that end up, then? Did they put out patches and optimize it, or what?
They fixed it, but it took a few months, maybe even a year. It was a wild time. Lots of conspiracy theories about the console version vs. the PC one.