Our 8K test system
PC built by Chillblast
Motherboard: Asus ROG STRIX Z390-E GAMING
CPU Cooler: Noctua NH-U14S
Case fans: Noctua NF-A14 PWM Chromax 140mm
Processor: Intel Core i9-9900K, 8 Cores / 16 Threads
GPU: Nvidia GeForce RTX 3090
Storage: 500GB Samsung 970 EVO Plus M.2 PCIe
Secondary Storage: 2TB Samsung 860 QVO
Power: Corsair RM850x 80 PLUS Gold 850W PSU
Case: Fractal Design Vector RS Tempered Glass
RAM: G.Skill Trident Z RGB 32GB DDR4 3200MHz
Monitor: Dell UltraSharp UP3218K
You may have noticed that a little game called Cyberpunk 2077 has just launched. Following years of development, and many, many delays, the latest game from CD Projekt Red, the Polish company behind The Witcher series of games, has now hit PC (along with games consoles), offering a graphically ambitious vision of what a future Earth might look like.
While the game had been one of the most hotly anticipated titles in a long time, with hype levels almost matching what we'd usually see around a new GTA launch, the long development time saw a number of controversies arise. For a start, there were concerns about how the delays, and the reliance on 'crunch' (where employees work extremely long hours), had impacted employee welfare.
The fact that many reviews of the final game mention numerous bugs suggests that, even with the delays, the game may have needed longer in the oven.
There have also been concerns about the marketing and content of the game, which, while trying to be 'edgy', often came off as transphobic and misogynistic. The depiction of race in the game has also been the subject of much debate. The gaming forum Resetera has a comprehensive thread that lists the controversies surrounding the game.
While this article looks at the performance of the game on PC, especially when it comes to 8K resolution, we feel that we need to highlight these concerns as well. They certainly dampened our excitement for the game.
However, from a hardware and software perspective, the design and scope of Cyberpunk 2077 are incredibly impressive. The TechRadar team, and, we're sure, many other gamers, are keen to see just how well this game runs on the mighty Nvidia RTX 3090 – currently the most powerful consumer gaming GPU around.
Utilizing advanced graphical effects like ray tracing, as well as using Nvidia's own DLSS technology to help the game run well despite its demanding graphics, Cyberpunk 2077 could either be a glorious showcase for the future of games, or a bloated system-killer that even the mightiest of rigs can't run. Which one will it be? Let's find out.
Cyberpunk 2077 8K performance
We ran Cyberpunk 2077 on our 8K test rig using the latest game-ready drivers provided by Nvidia, along with the Day 0 patch for the game itself, which aims to address several bugs and issues in the final game.
As usual, we started our tests with the highest preset, which in Cyberpunk 2077 is 'Ray Tracing Ultra'. As the name suggests, this puts most graphical settings at ultra levels, and adds intensive ray tracing effects. By default on this setting, DLSS is set to 'Auto'.
We set the resolution to 8K (7680 x 4320) and ran through a scene in a busy bar in Night City, the bustling metropolis that acts as the main setting for the game. The bar is filled with patrons, as well as bright and vibrant neon lights, smoke effects, reflections and other atmospheric details. At the highest settings, this game really does impress, and the sense of place is remarkable.
However, unsurprisingly, at 8K, the game runs terribly, only managing 18.6fps (frames per second) on average. It's a pretty game, sure, but at that fps it's very unpleasant to play. It's definitely not the kind of experience you'd pay $1,500 for, which is the price of the RTX 3090.
Ray Tracing Ultra isn't the very highest setting, however. You can actually whack a few settings, including ray tracing, up to 'Psycho' (again, that edginess…). The difference in image quality isn't really noticeable, but the corresponding drop in frame rate certainly is, with an average of just 15.4fps.
While this isn't surprising, what did impress us was what a difference DLSS (Deep Learning Super Sampling) makes. DLSS renders the game at a lower resolution, then uses AI and machine learning on the GPU to upscale the image to the target resolution, improving performance at ultra-high resolutions without too much impact on graphical fidelity.
We'd already seen it boost frame rates in games like Watch Dogs Legion at 8K, and in Cyberpunk 2077 it makes a huge difference to playability – though only when using the 'Ultra Performance' setting. This is the most aggressive DLSS setting, and the one that takes the most liberties with image quality, but the results are astounding: with DLSS Ultra Performance mode on, Cyberpunk 2077 jumped to 31.8fps on average in the same scene. Suddenly, Cyberpunk 2077 at 8K is playable.
8K or not 8K?
The leap from 18.6fps to 31.8fps for such a graphically demanding game at 8K may seem too good to be true, but is it?
On the one hand, it allows you – or more specifically, Nvidia – to boast that, yes, the RTX 3090 is an 8K-capable graphics card. The issue is, it's not really 8K. For a start, the game isn't using 8K assets: with so few people playing at 8K, it makes no sense to produce them when most people will be playing at 1080p, or 4K at most.
On top of that, with Ultra Performance mode enabled, DLSS is actually upscaling a 1440p image to 8K. It's still a demanding task, but it's not native 8K. That said, with DLSS on, you're getting a much better image on an 8K screen than if you were playing the game at 1440p on the same screen.
[Image comparison: DLSS off vs DLSS Ultra Performance]
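To put some rough numbers on that, here's a minimal Python sketch of the upscaling arithmetic. The 1/3-per-axis render scale for Ultra Performance mode is an assumption on our part, but it matches the 1440p internal resolution mentioned above:

```python
# Minimal sketch: estimate what the GPU is actually rendering for 8K output.
# Assumes DLSS Ultra Performance uses a 1/3 per-axis render scale, which
# matches the 1440p internal resolution quoted above.

OUTPUT_8K = (7680, 4320)
ULTRA_PERFORMANCE_SCALE = 1 / 3  # assumed per-axis render scale

def internal_resolution(output, scale):
    """Return the (width, height) rendered before DLSS upscales the image."""
    return round(output[0] * scale), round(output[1] * scale)

render_w, render_h = internal_resolution(OUTPUT_8K, ULTRA_PERFORMANCE_SCALE)
native_pixels = OUTPUT_8K[0] * OUTPUT_8K[1]
rendered_pixels = render_w * render_h

print(f"Internal render resolution: {render_w} x {render_h}")                  # 2560 x 1440
print(f"Pixels rendered: {rendered_pixels / native_pixels:.0%} of native 8K")  # ~11%
```

Rendering roughly a ninth of the pixels, and letting the AI upscaler fill in the rest, is where most of that recovered performance comes from.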
When we actually start getting games with 8K assets, the comparison between native 8K and DLSS 8K will likely highlight the compromises DLSS makes, but for the moment, you're getting a good trade-off in terms of improved performance without a massive hit to image quality.
Brute-forcing 8K resolution using DLSS, then, is possible, but it's not what the tech is made for. Instead, it's there to help maintain frame rates at 4K.
Dropping the settings to Ray Tracing Medium, we got a native frame rate of 24.7fps. That's closer to the 30fps we'd consider the absolute minimum for a playable experience, but you do notice the drop in ray tracing effects. At ultra, when we walked up to a door with a window in it, we got a wonderful effect where lights behind us were realistically reflected in the glass – this is absent at medium settings.
[Image comparison: Ray Tracing Ultra vs Ray Tracing Medium]
Turning DLSS to Ultra Performance boosted the fps to 38.2 on average, with a maximum of 44.1fps. Now the game was starting to feel more slick and responsive.
Turning ray tracing off, and using the Ultra settings, the same scene returned a poor 12.2fps. This is because on the Ultra preset, DLSS is turned off completely. Putting it onto Ultra Performance saw frame rates leap to 50.2fps, showing just what kind of heavy lifting DLSS is doing here.
On the High preset with DLSS on Ultra Performance, we hit 54.9fps, but it was only on medium with DLSS on Ultra Performance where we broke the magic 60fps mark, hitting 68.2fps.
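To put those averages in context, here's a quick Python sketch that converts each figure from our 8K tests into a frame time (1,000ms divided by the fps), showing how far each setting sits from the ~33.3ms budget of 30fps and the ~16.7ms budget of 60fps:

```python
# Convert the average frame rates we measured at 8K into per-frame times.
# 30fps allows roughly 33.3ms per frame; 60fps allows roughly 16.7ms.

results_8k = {
    "Ray Tracing Ultra, DLSS Auto": 18.6,
    "Ray tracing at 'Psycho' settings": 15.4,
    "Ray Tracing Ultra, DLSS Ultra Performance": 31.8,
    "Ray Tracing Medium, no DLSS": 24.7,
    "Ray Tracing Medium, DLSS Ultra Performance": 38.2,
    "Ultra (no ray tracing), no DLSS": 12.2,
    "Ultra (no ray tracing), DLSS Ultra Performance": 50.2,
    "High, DLSS Ultra Performance": 54.9,
    "Medium, DLSS Ultra Performance": 68.2,
}

for setting, fps in results_8k.items():
    print(f"{setting:<48} {fps:5.1f} fps  (~{1000 / fps:4.1f} ms/frame)")
```

Only the Medium preset with DLSS Ultra Performance dips under the 16.7ms needed for a steady 60fps.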
In other games, setting the game to the lowest settings just to get a playable experience at 8K isn't worth it at all – we found Assassin's Creed Valhalla at 8K on its lowest settings looked more like a PS2 game at times.
Cyberpunk 2077, however, still looks very impressive at medium settings. But would you be happy to run it at those settings having bought an RTX 3090? Probably not.
Action stations
The scene we used for our initial 8K tests was a fairly relaxed one, despite featuring numerous NPCs and atmospheric effects, so we also wanted to see how the RTX 3090 handled a hectic action scene.
So, we played through an early mission that involves some stealth and ends in a gun fight. At 8K on the Ray Tracing Ultra preset, with DLSS on Auto, the average frame rate crawled along at 18.8fps. Needless to say, this was an unpleasant experience, and it made the shootout all but impossible.
Putting DLSS to Ultra Performance, however, got us a pretty solid 30fps during the same scene. While this is half of the 60fps minimum most PC gamers expect, it did at least mean the game was playable, and we were able to take down baddies with relative ease thanks to the smoother and faster frame rates.
Dropping the settings to medium, with DLSS Ultra Performance on, we hit 64fps. This was a much more enjoyable experience, but it meant the graphics once again took a hit.
A taste of the future
So, on one level, yes, the RTX 3090 can play Cyberpunk 2077 at 8K. That's a pretty remarkable achievement for Nvidia's hardware, especially as the game itself still seems to need quite a bit of work to run smoothly. We didn't encounter any game-breaking bugs while playing, but there were a few obvious instances of textures popping in after scenes had loaded.
When so much care has clearly been taken with the graphics, especially the ray tracing implementation, to make the world feel immersive, those bugs unfortunately undermine that immersion and pull you out of it.
We also encountered a bug whereby, at 8K, menus and subtitles only render on half the screen. It's very annoying, though given how few people play at 8K it probably affects only a handful of players. Still, it would be nice to see it fixed soon.
Also, while Cyberpunk 2077 does run at 8K, it needs one of the most expensive GPUs in the world to do so, and with heavy compromises made in terms of the graphics. It's yet another example of gamers having to sacrifice a lot of graphical bells and whistles to achieve 8K, something that just isn't worth it.
For example, playing the same bar room scene at 4K with Ray Tracing Ultra and DLSS on, we hit 82fps (compared to 31.8fps at 8K with the same settings). The game looks phenomenal in places at 4K, so there really is no need to play at 8K and sacrifice over half the performance.
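As a rough back-of-the-envelope check on that trade-off, the sketch below sets the output pixel counts at 4K and 8K against the bar-scene averages we measured (82fps and 31.8fps) at the same settings:

```python
# Compare output pixel counts with the Ray Tracing Ultra + DLSS averages
# we measured in the bar scene at 4K and 8K.

resolutions = {"4K": (3840, 2160), "8K": (7680, 4320)}
measured_fps = {"4K": 82.0, "8K": 31.8}

pixels_4k = resolutions["4K"][0] * resolutions["4K"][1]
pixels_8k = resolutions["8K"][0] * resolutions["8K"][1]

print(f"8K pushes {pixels_8k / pixels_4k:.0f}x the pixels of 4K")             # 4x
print(f"Frame rate drops by {measured_fps['4K'] / measured_fps['8K']:.1f}x")  # ~2.6x
```

DLSS absorbs some of that fourfold jump in output pixels, but you're still giving up well over half the frame rate for the move to 8K.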
The real winner in this test, we think, is Nvidia's DLSS tech. It really is impressive how much of a difference it can make to the game, while minimizing the impact on graphical effects.
As we mentioned earlier, using DLSS to get a game like Cyberpunk 2077 to run at 8K is cool, but it's not what it's for. The fact that DLSS could mean someone with an RTX 2060 Super, for example, could run Cyberpunk 2077 at 1440p with ray tracing and high effects is far more useful – and exciting. What we don't want to see, however, is game devs using DLSS as a crutch, where they ship unoptimized games that run like crap until you turn DLSS on.
As for Cyberpunk 2077 itself, there's no doubt that this is a game that looks spectacular in places on high-end PC hardware. It's just such a shame that the bugs, along with some of the content of the game (and its marketing) are going to spoil the experience for some people.