According to Bethesda Support, even the Intel Arc A770 GPU does not meet Starfield’s PC minimum requirements.

  • ninjan@lemmy.mildgrim.com
    10 months ago

    I’m placing zero blame on developers here, but it’s just a fact that Intel can’t reasonably optimize its drivers for all games, past and present, in such a short time. And developers haven’t had access to the card remotely long enough for it to be part of the testing for any game releasing this year or next (outside small titles, maybe, but those generally don’t need special treatment driver-wise). AMD and Nvidia have literal decades of head start. So while I would’ve wanted Intel to do a better job, I’m not trivializing the monstrous task either, and all things considered they’ve done OK. Not great, not horrible.

    If that wasn’t clear in the articles you read, then those outlets wanted the clicks and engagement that come from vaguely implying that Intel is killing its GPU division.

    “Falsehood flies, and the Truth comes limping after it” – Jonathan Swift

    • Dudewitbow@lemmy.ml
      10 months ago

      It’s not like Intel never had GPU drivers (they’ve had iGPUs forever); they just never had to constantly update them for the gaming audience.

      Let’s not pretend features like Intel’s Quick Sync, which arrived with Sandy Bridge iGPUs to do video encoding, didn’t reshape how companies handled encoding for viewing (which would lead to NVENC and AMD VCE) or scrubbing in professional use.

      The GPU driver team has existed for a while now; it’s just that they were never severely pressured to update it specifically for gaming, as they really didn’t have anything remotely game-ready until arguably Tiger Lake’s iGPU.