• brucethemoose@lemmy.world

    Trying to run Borderlands at 4K sounds about as stupid to me as…

    On the contrary, it should be perfectly runnable at 4K because it's a 2025 FPS game and the cel-shaded graphics should be easy to render.

    ‘Unreal Engine’ is no excuse either. Look at Satisfactory: it renders literally thousands of dynamic machines with Lumen, like butter, on 2020 GPUs and a shoestring budget. Tell me that’s a sluggish engine.

    This is on Gearbox, who’ve developed on Unreal for 2 decades. And ‘sorry, we’ll work on it’ would have been a fine response…

    • CheeseNoodle@lemmy.world

      My understanding is that Unreal is a garbage engine for optimization at face value: it has a lot of useful tools, but also a lot of incorrect or dated documentation for them, and some ship with default settings that are just configured wrong. If you put effort into optimization, configure things correctly, and only use tools like Nanite or Lumen in their actual use cases (rather than throwing them on everything), you can get some pretty fantastic results.

      TL;DR: good but complex tools with bad defaults, marketed as low-effort with exaggerated claims.

      • brucethemoose@lemmy.world

        Gearbox has developed on Unreal Engine since 2005. They have ~1,300 employees.

        I’m sorry, I know game dev is hard. But if small, new studios can get it to work, Gearbox should get it to fly. They have no excuse.

    • cerebralhawks@lemmy.dbzer0.com

      Right, because Cyberpunk looks amazing in 2025, and it ran like shit five years ago at launch.

      But 4K gaming is more demanding than people realise, and gaming companies shouldn’t be getting so much flak for it.

        • paraphrand@lemmy.world

          Eh, I guess it is. But it also isn’t. We have stagnated a bit in the rasterization and VRAM departments when you talk about the affordable entry-level cards that most people buy.

          And the PlayStation 5 has not changed since it launched…

          Sure, newer cards have ray tracing hardware, new video encoder pipelines, and some other things. But when you look at classic rasterization on xx60 and xx70 class cards, those aren’t the sorts of leaps we used to get. Node shrinks are not what they used to be.

          A 3060 is about as fast as a 2070 Super, just as an example I have first-hand experience with. There used to be much larger performance gaps between generations.

      • brucethemoose@lemmy.world

        Honestly, Cyberpunk’s raytracing runs like poo for how good it looks, compared to Lumen (or KCD2’s CryEngine). I don’t like any of the RT effects except RT reflections: both RT shadow options flicker, and RT lighting conflicts with the baked-in lighting yet doesn’t replace it if you mod it out.

        Most of Cyberpunk’s prettiness comes from good old rasterization, more than most people realize.

        PTGI looks incredible, but it’s basically only usable with mods and a 4090+.