"If you're not happy using the tools available to you to improve frame rate and you're not happy with the frame rate you have, you should play a different game"
I don’t get it. Borderlands 1 ran at like 720p (less?) on its native hardware (Xbox 360) and it looked great, mostly because of the cel-shaded graphics. Cel-shaded was code (maybe still is) for low-res, and it covers the low res up well. So I feel like anyone trying to push 4K Borderlands (Gearbox, or players) is in the wrong here, and that may be part of the problem. Now if Borderlands 4 can’t run at 1080p or 1440p on good hardware… that’s Gearbox’s problem to fix. Trying to run Borderlands at 4K sounds about as stupid to me as Animal Crossing at 8K — apparently that has been accomplished with a very powerful computer and an emulator. (AC for Switch is not performance-locked to the hardware; it’ll scale up to whatever you can run it on.) But it just begs the question “fucking WHY?”.
I feel like the discourse around Borderlands 4 should be “is it fun” followed at a great distance by “is the story compelling enough to draw me in?” I did not have plot expectations for the first one, and it exceeded them.
Disclaimer: played and beat BL1 and all its DLC. Dabbled in 2 and 3. Have not played BL4 yet. Looking forward to a Siren play on my Series X at some point, but not now.
"Trying to run Borderlands at 4K sounds about as stupid to me as…"
On the contrary, it should be perfectly runnable at 4K, because it’s a 2025 FPS and the cel-shaded graphics should be easy to render.
‘Unreal Engine’ is no excuse either. Look at something like Satisfactory: a shoestring-budget game rendering literally thousands of dynamic machines with Lumen, like butter, on 2020-era GPUs. Then tell me that’s a sluggish engine.
This is on Gearbox, who’ve developed on Unreal for 2 decades. And ‘sorry, we’ll work on it’ would have been a fine response…
My understanding is that Unreal is a garbage engine for optimization at face value: it has a lot of useful tools, a lot of incorrect or dated documentation for those tools, and some of them ship with defaults that are just kind of configured wrong. If effort is put into configuring things correctly and only using tools like Nanite or Lumen in their actual use cases (rather than just throwing them on everything), you can get some pretty fantastic optimization.
TL;DR: good but complex tools with bad defaults, oversold as low-effort by exaggerated marketing.
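To make the “configure it properly” point concrete, here is a rough sketch of the kind of startup tuning a UE5 project can do through console variables. The cvar names below are real Unreal Engine 5 variables, but the helpers (SetRenderCVar, ApplyLeanRenderingDefaults) and the chosen values are purely my own illustration of trading Lumen, virtual shadow maps and TSR for cheaper techniques a cel-shaded shooter might not miss; it is not anything Gearbox actually ships.

```cpp
// Sketch only (assumes an Unreal Engine 5 project): swap some of the expensive
// UE5 defaults for cheaper rendering paths at startup. The console-variable
// names are real UE5 cvars; the values are illustrative, not a recommendation.
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

static void SetRenderCVar(const TCHAR* Name, int32 Value)
{
    // FindConsoleVariable returns nullptr if the cvar doesn't exist in this build.
    if (IConsoleVariable* Var = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        Var->Set(Value);
    }
}

// Call once at startup, e.g. from a GameInstance Init override.
void ApplyLeanRenderingDefaults()
{
    SetRenderCVar(TEXT("r.DynamicGlobalIlluminationMethod"), 2); // SSGI instead of Lumen GI
    SetRenderCVar(TEXT("r.ReflectionMethod"), 2);                // screen-space reflections instead of Lumen
    SetRenderCVar(TEXT("r.Shadow.Virtual.Enable"), 0);           // classic shadow maps instead of virtual shadow maps
    SetRenderCVar(TEXT("r.AntiAliasingMethod"), 2);              // TAA instead of the heavier TSR
}
```

In practice this kind of tuning usually lives in DefaultEngine.ini (the [/Script/Engine.RendererSettings] section and the scalability groups) rather than in code, but the point stands: the heavyweight UE5 features are opt-out, and a team has to make those choices deliberately.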
Gearbox has developed on Unreal Engine since 2005. They have ~1,300 employees.
I’m sorry, I know game dev is hard. But if small, new studios can get it to work, Gearbox should get it to fly. They have no excuse.
Right, because Cyberpunk looks amazing in 2025, and it ran like shit five years ago at launch.
But 4K gaming is more demanding than people realise and gaming companies shouldn’t be getting so much flak for it.
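For a sense of scale (my own back-of-the-envelope arithmetic, not something from the thread): native 4K shades four times as many pixels per frame as 1080p and 2.25 times as many as 1440p, before any per-pixel effect gets more expensive.

```cpp
// Quick pixel-count comparison behind "4K is more demanding than people realise".
#include <cstdio>

int main()
{
    const double p1080 = 1920.0 * 1080.0; // 2,073,600 pixels
    const double p1440 = 2560.0 * 1440.0; // 3,686,400 pixels
    const double p2160 = 3840.0 * 2160.0; // 8,294,400 pixels

    std::printf("4K vs 1080p: %.2fx the pixels per frame\n", p2160 / p1080); // 4.00x
    std::printf("4K vs 1440p: %.2fx the pixels per frame\n", p2160 / p1440); // 2.25x
    return 0;
}
```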
Why are we comparing a game released 5 years ago to one released today? Hardware is much more capable now.
Eh, I guess it is. But it also isn’t. We have stagnated a bit in the rasterization and VRAM departments when you talk about affordable entry level cards that most people buy.
And the PlayStation 5 has not changed since it launched…
Sure, newer cards have ray tracing hardware and new video encoder pipelines and some other things. But when you look at classic rasterization, and look at xx60- and xx70-class cards, it’s not the sort of leap we used to get. Node shrinks are not what they used to be.
A 3060 is as fast as a 2070 Super, just as an example I have first-hand experience with. There used to be much larger performance gaps between generations.
Honestly, Cyberpunk’s ray tracing runs like poo for how good it looks, compared to Lumen (or CryEngine in KCD2). I don’t like any of the RT effects except RT Reflections; both RT shadow options flicker, and RT lighting conflicts with the baked-in lighting, yet doesn’t replace it if you mod the baked lighting out.
Most of Cyberpunk’s prettiness comes from good old rasterization, more than most people realize.
PTGI looks incredible, but it’s basically only usable with mods and a 4090+.