
    Cyberpunk 2077 shows why DLSS is pretty much mandatory for 4K next-gen gaming

    After years of waiting, Cyberpunk 2077 has finally arrived, and the launch is, well, a bit of a mess. Visual glitches, performance problems and social issues all plague this game – and some of its visual effects can physically harm some people, according to this report from Game Informer.

    I really haven't played enough of Cyberpunk 2077 to comment on the content of the game – I'm only a couple of hours in. What I can say, however, is that the game is extremely demanding to run, and I don't like that I can't turn everything up to max and get a buttery-smooth 60 fps at 4K without thinking about it. There's an Nvidia GeForce RTX 3090 in my machine – wasn't Nvidia marketing this thing as an 8K gaming card?

    Nvidia RTX 3090

    "8K" sure, Jan.  (Image credit: Nvidia)

    Performance problems

    Cyberpunk 2077 hails from the same developer that indirectly got me into the hardware enthusiast world in the first place with The Witcher 2 a decade ago. When that game came out, I started overclocking and tinkering with my hardware to try to hit 60 fps at high settings. The thing is, though, I didn't have access to the high-end hardware I have right now.

    If memory serves correctly, I was running a rig with an Intel Core i5-2500K, 4GB of RAM and an AMD Radeon HD 5870. This was in 2011, mind you, so it was a pretty solid mid-range build. I was able to play pretty much everything else at the time at 1080p, but I struggled with The Witcher 2.

    My roommate at the time had a PC that was way better than mine and could max the game out. I would just sit in their room and gawk at the screen while they played it on the big-screen TV they inexplicably used as a monitor. I was determined to get to the point where I could max it out at a good framerate.

    To this day, whenever I get a new graphics card in my personal machine, I still boot up The Witcher 2 with UberSampling enabled just to see how it runs. I'm sure many people have games like this, whether it's Crysis, Metro, or whatever.

    Cyberpunk 2077 is probably going to end up being that game for a lot of people, too. There are still plenty of visual glitches, but when the game is working as intended and you can max it out with ray tracing at 4K, it's genuinely stunning – and it will probably remain one of the best-looking PC games for years to come.

    For right now, though, even the best hardware on the market struggles to run the game at the highest resolution without DLSS in Performance mode – a setting which, if you're not aware, renders the game at a lower resolution and uses AI to upscale the image back up to your native resolution.
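    To put rough numbers on that, here's a quick sketch of what each DLSS mode asks the GPU to actually render for a 4K output. The per-axis scale factors are my own assumptions based on the commonly cited DLSS 2.0 presets, not official figures from Nvidia or CD Projekt Red:

    ```python
    # Sketch: approximate internal render resolutions for DLSS at 4K output.
    # The per-axis scale factors below are assumed from commonly cited
    # DLSS 2.0 presets, not official Nvidia documentation.
    OUTPUT_W, OUTPUT_H = 3840, 2160  # native 4K target

    PRESETS = {
        "Quality": 0.667,
        "Balanced": 0.58,
        "Performance": 0.50,
        "Ultra Performance": 0.333,
    }

    for name, scale in PRESETS.items():
        w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
        pixel_share = scale * scale * 100  # share of native 4K pixels actually shaded
        print(f"{name:>17}: {w}x{h} (~{pixel_share:.0f}% of native 4K pixels)")
    ```

    If those factors are right, Performance mode shades only about a quarter of the pixels of native 4K (roughly 1920x1080 internally), which is why it makes such a dramatic difference in games like this.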

    Even turning off ray tracing doesn't help that much: in our experience, it boosted the frame rate from 24 fps to 33 fps on the RTX 3090 at 4K. That's a massive 38% performance boost, don't get me wrong, but at that framerate the game still doesn't feel great.
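    Frame times make it clearer why 33 fps still feels rough. Here's a quick back-of-the-envelope calculation using the figures above:

    ```python
    # Frame time (milliseconds per frame) for the frame rates quoted above.
    for label, fps in (("RT on, 24 fps", 24), ("RT off, 33 fps", 33), ("60 fps target", 60)):
        print(f"{label:>14}: {1000 / fps:.1f} ms per frame")
    # 24 fps -> 41.7 ms, 33 fps -> 30.3 ms, 60 fps -> 16.7 ms.
    # Even after the ~38% uplift, each frame still takes nearly twice as
    # long to render as it would at a steady 60 fps.
    ```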

    On its own, this really isn't an issue. Having a few games that truly push the limits of what PC hardware can do is half the fun of PC gaming in the first place – at least for me, personally. The problem right now, though, is that more and more games are taking this approach.

    Watch Dogs Legion

    With RT on, Watch Dogs looks amazing (Image credit: Ubisoft)

    What is it with these dystopian action games?

    Another recent release that seems to be very dependent on DLSS at high resolutions is Watch Dogs: Legion – another futuristic, dystopian action game, this time about hacking and piloting drones around London or something.

    Again, I'm a hardware nerd: most of the time I play a game just long enough to see how it pushes hardware, then move back to playing WoW for hours on hardware that's way too beefy for that aging title. But in Watch Dogs: Legion, much like in Cyberpunk 2077, the ray tracing on offer genuinely looks great, thanks to all the reflective surfaces throughout London.

    At launch, we were seeing around 30-40 fps at max settings with ray tracing enabled and DLSS on Performance mode at 4K with the RTX 3080. Since then, patches and drivers have helped improve performance significantly, to the point where it no longer turns into a slideshow the second you hop into a car. But we're still not at the point where someone without a $1,499 GPU can just turn everything up without Nvidia's new upsampling tech. 

    I love DLSS, and really, I have since its inception with the Turing GPUs back in 2018. It's genuinely a great way for people to experience high-resolution gaming without having to fork over the money for a graphics card that can brute-force its way to 4K. We're at the point, though, where there are five graphics cards on the market that can brute-force that resolution: the RTX 3080, RTX 3090, RTX 3070, Radeon RX 6800 XT and Radeon RX 6800 are all capable of delivering a solid 60 fps or better at 4K.

    I genuinely don't know what the future holds for upcoming PC games, but what I don't want to see is developers leaning entirely on DLSS and AMD's Contrast Adaptive Sharpening (CAS) to get around optimizing their games for people who don't have thousands of dollars to spend on the latest PC hardware.

    It was only a few months ago that we were supposed to see the start of 8K gaming on PCs. Right now, though, it's looking less like 8K is going to be the focus and more like 4K is being pushed back out of reach of budget-minded consumers, just as graphics cards capable of sustaining that resolution are becoming genuinely affordable. Hopefully games like Cyberpunk 2077 and Watch Dogs: Legion are just a temporary speed bump while developers get used to next-generation hardware, and not a sign of 4K gaming becoming inaccessible again.
