
Todd Howard asked on-air why Bethesda didn't optimise Starfield for PC: 'We did [...] you may need to upgrade your PC'

lol. Has anyone found ways to optimize Starfield on their PC, like reducing stuttering, FPS drops, etc.?

243 comments
  • I installed an optimized textures mod and instantly improved my performance by like... 20 frames, maybe more.

    I have an RX 6600 XT that can run Cyberpunk on high no problem. C'mon Bethesda, the game is really fun, but this is embarrassingly bad optimization.

  • Games are somehow too CPU heavy these days even though they aren't simulating the entire world like Kenshi does, just the stuff around you, so even after upgrading my GPU I can barely hit 30fps. Had the same problem with Wo Long, Hogwarts Legacy and Wild Hearts.
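
    For what it's worth, the "just stuff around you" part usually looks something like the distance-culled update loop below. This is a hypothetical Python sketch of the general technique, not anything from Starfield's actual engine, and the names and radius are made up:

        import math
        from dataclasses import dataclass

        @dataclass
        class Actor:
            x: float
            y: float

        def full_update(actor, dt):
            # the expensive per-frame work: AI, pathfinding, animation, physics
            pass

        def coarse_update(actor, dt):
            # cheap placeholder tick for far-away actors (e.g. advance a schedule)
            pass

        def update_world(actors, player_x, player_y, dt, sim_radius=150.0):
            # Only actors near the player get the full simulation each frame;
            # everything else gets a cheap tick. Even so, the "full" bucket can
            # easily eat a CPU core if each actor's update is heavy.
            for a in actors:
                near = math.hypot(a.x - player_x, a.y - player_y) <= sim_radius
                (full_update if near else coarse_update)(a, dt)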

    • This is what happens when consoles improve their CPUs.

      Suddenly they've got more cycles than they know what to do with, so they waste them on frivolous, unnoticeable shit. Now you don't have that extra headroom to get you from console 30fps to PC 60fps+. The consoles are on much more even footing with PCs than the underpowered (even at release) PS4 and Xbox One ever were.

      You'll struggle to find a CPU that does double what a PS5's can, and if the game is held back by single-thread performance (likely), there's nothing you can do to get double that.
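
      Rough numbers, just to put the 30-to-60fps jump in perspective (Python; the 30 ms of per-frame CPU work is an assumed figure for illustration, not a measurement):

          # Frame-time budgets at the two common targets
          budget_30 = 1000 / 30   # ~33.3 ms per frame at 30fps
          budget_60 = 1000 / 60   # ~16.7 ms per frame at 60fps

          # Suppose the console burns ~30 ms of single-threaded CPU work per
          # frame (assumed number, purely for illustration). To hit 60fps that
          # same work has to finish inside the 16.7 ms budget:
          console_cpu_ms = 30.0
          needed_uplift = console_cpu_ms / budget_60
          print(f"~{needed_uplift:.1f}x the single-thread speed needed")  # ~1.8x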

    • I agree. I have an i7-8700K and a 2080 Super, which I'd call mid to high level specs, and I have a terrible time running Wild Hearts and Starfield. Such a damn shame too; as a big MHW and MHR fan I was really looking forward to Wild Hearts and just couldn't run it well at all. At this point I'm just not surprised when a triple-A game runs like dog water on my system. Usually these games are free on Game Pass, so I try them out and five minutes later I uninstall.

      Indies are where it's at nowadays.

      • I wouldn't consider an 8700K or a 2080 Super high level specs, or even mid level, right now.

        Consider that an 8700K is slower than a 13400F, which today is considered the absolute lower end of the mid range; realistically the 13600K or 13700 is the mid range on the Intel side.

        To be blunt, the 8700K is 5 years old.

        The 2080S is... well, look at a current GPU benchmark chart and make the call yourself.

        I think a lot of people just don't appreciate how out of date their hardware is relative to the current consoles.

  • Instead of cracking jokes he should improve the piss poor optimization.

    Can't even render 50fps consistently on a Strix 3090 OC at 1620p (accounting for resolution scale; quick math at the end of this comment), what a joke.

    Edit: Scratch that, it's even worse: averaging around 40fps with the HUB (Hardware Unboxed) Quality settings, so not even on Ultra, and my 12900K is nowhere near bottlenecking.

    What a joke.
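
    (For anyone doing the 1620p math: the render scale applied to the output resolution gives the internal resolution, e.g. a 75% scale at a 4K output lands on 1620p. Quick Python check, with those example numbers assumed:)

        # Example only: 4K output with a 75% render scale
        output_w, output_h = 3840, 2160
        render_scale = 0.75
        render_w = output_w * render_scale   # 2880
        render_h = output_h * render_scale   # 1620 -> "1620p"
        print(f"internal render resolution: {int(render_w)}x{int(render_h)}")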
