I’m running a 2080 in my desktop, and I haven’t run into a game yet that I need a better card to run (at least, not with my 1080p monitors, which I prefer over higher-res ones). I also got a Framework 16, with the discrete Radeon 7700S GPU, and everything on it runs butter-smooth.
Point is, I can’t see justifying a $1,600 card just so I can run Starsector at the same 60 FPS I already get. It would take some truly groundbreaking game to make me consider upgrading to one of these ridiculous cards, and I don’t see that happening given the current consoles out there.
It is undeniably satisfying, though, to turn all of a game’s settings up to maximum without performance tanking, but you and I (same card, but a 1440p screen) are not the target audience. This is for people who want (and can afford) at least 4K with ray-tracing in the latest games, all at triple-digit frame rates - or who are actually using the card for non-gaming applications: even our old 2080 is a beast at tasks like offline rendering, scientific computing, and machine learning, and a 4090 is of course several times better at those.
I know this is going way off-topic, but I love providing a bit of perspective: the fastest supercomputer in 1996 was the Hitachi CP-PACS/2048 at 368.20 GFLOPS; in 1997, it was the Intel ASCI Red/9152 at 1.338 TFLOPS. An RTX 2080 achieves 314.6 GFLOPS at 64-bit precision (the precision used by the TOP500 list of supercomputers), and an RTX 4090 achieves 1.290 TFLOPS. Granted, despite similar processing power on paper (and FLOPS being hardly an objective measure for comparing vastly different architectures and systems), even these ancient supercomputers still have modern GPUs beat on sheer memory capacity alone (although latency is of course far worse): 128 GB (2,048 nodes × 64 MB each) in the case of the Hitachi system, for example.
This is for people who want (and can afford) at least 4K with ray-tracing in the latest games and all of this at triple-digit frame rates
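For anyone who wants to play with the comparison above, here is a quick back-of-the-envelope script. All figures are simply the ones quoted in the post (vendor spec sheets and TOP500 listings), not anything I have measured myself:

```python
# Rough comparison of 1990s supercomputers vs. modern GPUs at FP64,
# using the numbers quoted above (spec-sheet values, not benchmarks).

specs_gflops_fp64 = {
    "Hitachi CP-PACS/2048 (1996)": 368.20,
    "Intel ASCI Red/9152 (1997)": 1338.0,
    "NVIDIA RTX 2080 (2018)": 314.6,
    "NVIDIA RTX 4090 (2022)": 1290.0,
}

for name, gflops in specs_gflops_fp64.items():
    print(f"{name}: {gflops:,.1f} GFLOPS (FP64)")

# The Hitachi's aggregate memory: 2,048 nodes at 64 MB each.
nodes, mb_per_node = 2048, 64
total_gb = nodes * mb_per_node / 1024
print(f"Hitachi aggregate memory: {total_gb:.0f} GB")  # 128 GB
```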
Sure, but are there really enough people who fit into that category to justify these cards? Based on the 4080 series sales, it seems not, but they’re still coming out with even beefier, more expensive cards anyway.
I know this is going way off-topic
In a similar vein, I grew up around IT because my mom worked on mainframes. I remember many nights of sitting under her desk at 3am because she got called in as Production Support when jobs would ABEND. When I was in high school, wanting to learn more about mainframes, I set up Hercules, a mainframe System/370 emulator (it looks like it supports z/OS now as well), and managed to find boot media for System/370 and MVS. The desktop computer I was running the emulator on (a Gateway, showing my age) was more powerful than the original mainframe hardware.
at least, not with my 1080p monitors, which I prefer over higher-res ones
Blasphemy!
4K monitors are beautiful for normal desktop usage: text is crisp and clean, with smooth curves and none of the blockiness that comes from low resolution. And with modern scaling settings you can even have 4K text and 1080p graphics at the same time, with the same performance as native 1080p.
still laughs in 1080 Ti
Is it just me or does that guy look like Jon Snow?