lol. has anyone found ways to optimize starfield for their pc, like reducing stuttering, FPS drops, etc?

  • gamermanh@lemmy.dbzer0.com · 1 year ago

    I’ll believe they actually optimized their PC port when their PC ports start showing some signs of effort at being actual PC ports

    No FOV slider after FO76 had one is, to me, a great sign of how little Bethesda actually cares about the platform that keeps them popular (thanks to mods)

    • givesomefucks@lemmy.world · 1 year ago

      They don’t want to put the work in for the biggest benefit of PC gaming.

      I don’t think any PC should be able to run a game well at max settings on release. Leave room for it to still look good five years from now.

      And have the bare-bones settings work on shitty systems that need upgrading.

      Bethesda just wants to make games that run on a very small slice of PCs, and who can blame them when they can’t even release a stable game on a single console? They’re just not good at it.

      • verysoft@kbin.social · 1 year ago

        > I don’t think any PC should be able to run a game well at max settings on release. Leave room for it to still look good five years from now.

        This is the mentality they want you to have. And it’s a shit one. PCs should be able to run any game well when it comes out.

  • Vertelleus@sh.itjust.works · 1 year ago

    I’m glad everyone is beta testing this game for me.
    I’ll wait until it’s $20 on steam with all DLC for spaceship horse armor.

  • Poob@lemmy.ca · 1 year ago

    This game is not pretty enough to push my 3080 ti as hard as it does. I get around 40fps at max settings.

    • LazerFX@sh.itjust.works · 1 year ago

      What… are you doing in the background? I’ve got a 3070 and a 4K monitor, and I get between 50 and 60 FPS with all the settings I can fiddle with enabled. I use RivaTuner to pipe statistics to a little application that drives my Logitech G19 with a real-time frame graph, CPU usage, memory load, and GPU load. The game uses multiple cores pretty well and generally makes use of the hardware.

      – edit – Thanks for pointing out I made totally the wrong comment above, changing the meaning of my comment 180°…

  • DoucheBagMcSwag@lemmy.dbzer0.com · 1 year ago

    I think the issues people are having are at 4K and resolutions above 1080p.

    Then again, I’m on ultra with a 3060 Ti at 1080p using the DLSS performance mod, getting about 90 FPS and 45 in cities.

    • Freeman@lemmy.pub · 1 year ago

      I really only play at 1080p. Without a DLSS mod, even with a 4060 Ti I only get about 30 FPS using optimized settings in the larger cities.

      I imagine it’s only worse at larger resolutions. Their FSR implementation is definitely a core cause of the issue, especially for Nvidia cards.

      • DoucheBagMcSwag@lemmy.dbzer0.com · 1 year ago

        Fuck man, what is going on?? I’m on a 3060 on ultra at 1080p getting about 90 FPS most of the time and 45 in cities.

        What’s your FPS with the DLSS mod?

        • Freeman@lemmy.pub · 1 year ago

          With the DLSS-FG mod and the settings outlined here, I am getting 65-90 with the “Quality” settings on the 4060 Ti. Without the DLSS-FG mod and those exact same settings, I get 24-40ish: 24-30 in places like New Atlantis, 40ish in places like caves.

          For my 1070 (TL;DR: my house was hit by lightning, which took out my 3060 12 GB, so I had to use a backup) and my 1650 Ti Max-Q, I basically need to set the preset to low, turn indirect shadows up to medium (there seems to be a bug with textures being really blurry on low), and then set FSR scaling back up to 100 manually. With that, I get about 30-35 FPS. Those cards are on the lower end of the hardware requirements. The 4060 Ti finally came in around the 5th, and it’s been quite a nice improvement, but it’s a shame this game needs mods to hit 60 FPS at all.

          That said, the game plays alright at 30 FPS, especially if you opt for a controller and some slight motion blur. If you try KB+M at 30 FPS, it’s pretty rough, since the camera movements are much more precise and responsive. I learned with Fallout 76 that Bethesda really only seems to develop and playtest with a controller, so forcing KB+M can work but can be buggy. I don’t really mind outputting to my TV and playing on the couch, though. It’s a nice relaxing experience vs. sitting in an office chair.

  • circuitfarmer@lemmy.sdf.org · 1 year ago

    Thing is, I did upgrade my PC. Starfield runs acceptably, but not to the level it should given my hardware.

    I’d much rather hear that they’re working on it in a patch rather than be gaslit into thinking it already runs well.

  • redfellow@sopuli.xyz · 1 year ago

    They didn’t optimize it for consoles either. The Series X has the graphical grunt of roughly an RTX 3060, yet it’s capped to 30 FPS and looks worse than most other AAA games that offer variable framerates up to 120 FPS. Todd says they went for fidelity. Has he played any recent titles? The game looks like crap compared to many games from the past few years, and requires more power.

    The real reason behind everything is the shit they call the Creation Engine: an outdated hot mess of an engine that’s technically behind pretty much everything the competition is using. It’s beyond me why they haven’t scrapped it; that should have happened after FO4 already.

    • Huschke@programming.dev · 1 year ago

      And don’t forget the constant loading screens. A game that has so many of them shouldn’t look this bad and run this poorly.

    • PatFusty@lemm.ee · 1 year ago

      Correct me if I’m wrong, but don’t they limit frametimes so they can reduce TV stuttering? The NTSC standard for TVs is 29.97 or 59.94 fps. I assume they chose 30 FPS so it can be used more widely, and if it were scaled to 60 it would just increase frametime lag. Again, I’m not sure.

      Also, comparing CE2 to CE1 is like comparing UE5 to UE4. And I don’t remember exactly, but doesn’t Starfield use the Havok engine for animations?

      Edit: rather than downvote just tell me where I am wrong
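
      For what it’s worth, the exact NTSC numbers can be checked with a line of arithmetic (my own quick check, not from the thread): the rates come from dividing the nominal 30/60 by 1.001, which gives 29.97, not 29.94, for the frame rate.

      ```python
      # NTSC rates are the nominal 30/60 fps divided by 1.001
      ntsc_frame = 30000 / 1001  # frame rate ≈ 29.97 fps
      ntsc_field = 60000 / 1001  # field/refresh rate ≈ 59.94 Hz
      print(round(ntsc_frame, 3), round(ntsc_field, 3))
      ```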

      • Nerdulous@lemm.ee · 1 year ago

        Not to put too fine a point on it, but you’re wrong because your understanding of frame generation and displays is slightly flawed.

        Firstly, most people’s displays, whether a TV or a monitor, are at least minimally capable of 60 Hz, which it seems you correctly assumed. That said, most TVs and monitors aren’t capable of what’s called variable refresh rate (VRR). VRR allows the display to match however many frames your graphics card puts out, instead of the graphics card having to match your display’s refresh rate. This eliminates screen tearing and gives you the best frame times available, as each frame is generally created and then immediately displayed.

        The part you might be mistaken about, from my understanding, is the frame time lag. Frame time is the inverse of FPS: the more frames generated per second, the less time between frames. Under circumstances where there is no VRR and the frame rate does not align with the display’s native rate, there can be frame misalignment. This occurs when the monitor is expecting a frame that is not yet ready; it will use the previous frame, or part of it, until a new frame becomes available. This can result in screen tearing or stuttering, and yes, in some cases it adds additional delay between frames. In general, though, a framerate above 30 FPS will feel smoother on a 60 Hz display than a locked 30 FPS, where every frame is guaranteed to be displayed twice.
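
        To make the pacing point concrete, here’s a small sketch (mine, not from the thread) that maps frames rendered at a given FPS onto a fixed 60 Hz display and counts how many refresh cycles each frame stays on screen; uneven counts are what shows up as judder.

        ```python
        REFRESH_HZ = 60

        def refresh_counts(fps, frames=12):
            """Count how many 60 Hz refreshes display each rendered frame."""
            counts = []
            for i in range(frames):
                # Frame i is ready at time i/fps; in refresh units that is i * REFRESH_HZ / fps.
                start = i * REFRESH_HZ / fps
                end = (i + 1) * REFRESH_HZ / fps
                # Without VRR, the display repeats the latest finished frame at every refresh.
                counts.append(int(end) - int(start))
            return counts

        print(refresh_counts(30))  # locked 30 on 60 Hz: every frame shown exactly twice
        print(refresh_counts(40))  # 40 on 60 Hz: frames alternate 1 and 2 refreshes (judder)
        ```

        A locked 30 gives perfectly even pacing (each frame held for two refreshes), while an unaligned rate like 40 alternates hold times, which is the misalignment described above.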

        • PatFusty@lemm.ee · 1 year ago

          Thanks, I was recently reading about monitor interlacing and I must have jumbled it all up.