• cerebralhawks@lemmy.dbzer0.com · 8 hours ago

    I don’t get it. Borderlands 1 ran at something like 720p (less?) on native hardware (Xbox 360) and it looked great, mostly because of the cel-shaded graphics. Cel-shading was (and maybe still is) code for low-res, and it covers it up well. So I feel like anyone trying to push 4K Borderlands (Gearbox, or players) is in the wrong here, and that may be part of the problem. Now, if Borderlands 4 can’t run at 1080p or 1440p on good hardware, that’s Gearbox’s problem to fix. Trying to run Borderlands at 4K sounds about as stupid to me as Animal Crossing at 8K, which apparently has been accomplished with a very powerful computer and an emulator. (AC for Switch is not performance-locked to the hardware; it’ll scale up to whatever you run it on.) But it just raises the question: “fucking WHY?”

    I feel like the discourse around Borderlands 4 should be “is it fun” followed at a great distance by “is the story compelling enough to draw me in?” I did not have plot expectations for the first one, and it exceeded them.

    Disclaimer: played and beat BL1 and all its DLC. Dabbled in 2 and 3. Have not played BL4 yet. Looking forward to a Siren play on my Series X at some point, but not now.

    • brucethemoose@lemmy.world · 7 hours ago

      Trying to run Borderlands at 4K sounds about as stupid to me as…

      On the contrary, it should be perfectly runnable at 4K, because it’s a 2025 FPS game and the cel-shaded graphics should be easy to render.

      ‘Unreal Engine’ is no excuse either. Try something like Satisfactory rendering literally thousands of dynamic machines on a shoestring budget with Lumen, like butter, on 2020 GPUs, and tell me that’s a sluggish engine.

      This is on Gearbox, who’ve developed on Unreal for 2 decades. And ‘sorry, we’ll work on it’ would have been a fine response…

      • CheeseNoodle@lemmy.world · 7 hours ago

        My understanding is that Unreal is a garbage engine for optimization at face value: it has a lot of useful tools, a lot of incorrect/dated documentation for those tools, and some of them just ship with badly configured default settings. If you put effort into optimization, configure things correctly, and only use tools like Nanite or Lumen in their actual use cases (rather than just throwing them on everything), you can get some pretty fantastic results.

        TL;DR: Good but complex tools, marketed as low-effort, with bad defaults and exaggerated marketing.
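
        As a rough illustration of what “configuring correctly” can mean in practice, here is a minimal UE5 C++ sketch that switches a few rendering console variables away from the heavyweight defaults. The specific CVars and values are assumptions about a stock UE5 project, not anything from Gearbox’s actual setup.

        ```cpp
        // Minimal sketch, assuming a stock UE5 project: dial back a few
        // heavyweight rendering defaults via console variables at runtime.
        // CVar names/values are from public UE5 documentation; treat the
        // chosen values as illustrative, not a recommendation for any
        // particular game.
        #include "HAL/IConsoleManager.h"

        static void ApplyLeanerRenderingDefaults()
        {
            IConsoleManager& CVars = IConsoleManager::Get();

            // Use screen-space GI instead of Lumen where full GI isn't needed
            // (0 = none, 1 = Lumen, 2 = screen space in stock UE5).
            if (IConsoleVariable* GI = CVars.FindConsoleVariable(TEXT("r.DynamicGlobalIlluminationMethod")))
            {
                GI->Set(2, ECVF_SetByCode);
            }

            // Likewise for reflections (0 = none, 1 = Lumen, 2 = screen space).
            if (IConsoleVariable* Refl = CVars.FindConsoleVariable(TEXT("r.ReflectionMethod")))
            {
                Refl->Set(2, ECVF_SetByCode);
            }

            // Render at a reduced internal resolution and let upscaling fill the gap.
            if (IConsoleVariable* ScreenPct = CVars.FindConsoleVariable(TEXT("r.ScreenPercentage")))
            {
                ScreenPct->Set(75.0f, ECVF_SetByCode);
            }
        }
        ```

        Whether any of those choices is appropriate depends entirely on the game; the point is just that these are per-project knobs, not fixed engine costs.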

        • brucethemoose@lemmy.world · 7 hours ago

          Gearbox has developed on Unreal Engine since 2005. They have ~1,300 employees.

          I’m sorry, I know game dev is hard. But if small, new studios can get it to work, Gearbox should get it to fly. They have no excuse.

      • cerebralhawks@lemmy.dbzer0.com · 7 hours ago

        Right, because Cyberpunk looks amazing in 2025, and it ran like shit five years ago at launch.

        But 4K gaming is more demanding than people realise, and gaming companies shouldn’t be getting so much flak for it.

          • paraphrand@lemmy.world · 3 hours ago

            Eh, I guess it is. But it also isn’t. We have stagnated a bit in the rasterization and VRAM departments when you talk about the affordable entry-level cards that most people buy.

            And the PlayStation 5 has not changed since it launched…

            Sure, newer cards have ray tracing hardware, new video encoder pipelines, and some other things. But when you look at classic rasterization on xx60- and xx70-class cards, it’s not the sort of leap we used to get. Node shrinks are not what they used to be.

            A 3060 is about as fast as a 2070 Super, just as an example I have first-hand experience with. There used to be much larger performance gaps between generations.

        • brucethemoose@lemmy.world · 7 hours ago

          Honestly, Cyberpunk’s ray tracing runs like poo for how good it looks, compared to Lumen (or KCD2’s CryEngine). I don’t like any of the RT effects except RT Reflections; both RT shadow options flicker, and RT lighting conflicts with the baked-in lighting, yet doesn’t replace it if you mod it out.

          Most of Cyberpunk’s prettiness comes from good old rasterization, more than most people realize.

          PTGI looks incredible, but it’s basically only usable with mods and a 4090+.