• cerebralhawks@lemmy.dbzer0.com · 7 hours ago

    Right, because Cyberpunk looks amazing in 2025, and it ran like shit five years ago at launch.

    But 4K gaming is more demanding than people realise, and gaming companies shouldn’t be getting so much flak for it.
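
    As a rough sense of scale: 4K (3840 × 2160) is about 8.3 million pixels per frame, versus about 2.1 million at 1080p (1920 × 1080), so the GPU is shading roughly four times as many pixels before ray tracing or anything else even enters the picture.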

      • paraphrand@lemmy.world · edited · 3 hours ago

        Eh, I guess it is. But it also isn’t. We have stagnated a bit in the rasterization and VRAM departments when you talk about the affordable entry-level cards that most people buy.

        And the PlayStation 5 has not changed since it launched…

        Sure, newer cards have ray tracing hardware, new video encoder pipelines, and some other things. But when you look at classic rasterization on xx60- and xx70-class cards, it’s not the sort of leap we used to get. Node shrinks are not what they used to be.

        A 3060 is as fast as a 2070 Super, just as an example I have first-hand experience with. There used to be much larger performance gaps between generations.

    • brucethemoose@lemmy.world · edited · 7 hours ago

      Honestly, Cyberpunk’s ray tracing runs like poo for how good it looks, compared to Lumen (or KCD2’s Crytek engine). I don’t like any of the RT effects except RT Reflections: both RT shadow options flicker, and RT lighting conflicts with the baked-in lighting yet doesn’t replace it if you mod it out.

      Most of Cyberpunk’s prettiness comes from good old rasterization, more than most people realize.

      PTGI (path-traced global illumination) looks incredible, but it’s basically only usable with mods and a 4090+.