Hitman 3 arrived this week in what became the series' biggest digital launch of all time - and rightly so. As the culmination of the trilogy, Hitman 3 caps off the generation with the most impressive iteration of the Glacier Engine to date, with all existing Hitman content retrofitted into the latest game and essentially 'remastered' for next-gen consoles. PC owners can rejoice too - the game does everything that PS5 and Series X can and more besides, thanks to some nice upgrades and further scalability.
In preparing this article, IO Interactive furnished us with the exact PC settings each console version runs at. So, by using these settings on PC at the appropriate resolution, it's possible to see how a PC GPU fares in comparison to the consoles, with some interesting observations. First of all though, I'd like to give IO some kudos for a more fully featured PC options menu. The lack of granularity in settings was a real issue I had with Hitman 2, but this improved over time and those efforts carry over directly into Hitman 3, allowing high-end PC users to push fidelity beyond console quality.
That starts with shadow quality, which sees PS5 operate at the medium preset, with Xbox Series X upping that to high. Ultra allows PC users to refine the effect further and in my tests had very little impact on performance. Screen-space ambient occlusion is interesting, in that all console builds operate at the 'minimum' setting - perhaps not surprising when dropping from ultra to minimum can add 20 per cent to frame-rate. Minimum can look a little rough, though, and I'd say that medium is the optimal setting - you claw back 16 per cent in frame-rate versus ultra and halo artefacts are minimised in the process.
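To put those percentages in perspective, here's a quick sketch converting the quoted frame-rate deltas into frame-time savings - the 60fps ultra baseline is my assumption for illustration, not a measured figure:

```cpp
#include <cstdio>

int main() {
    // Assumed baseline: 60fps at the ultra SSAO setting (illustrative only).
    const double ultra_fps   = 60.0;
    const double minimum_fps = ultra_fps * 1.20; // ~20 per cent faster, as measured
    const double medium_fps  = ultra_fps * 1.16; // ~16 per cent faster, as measured

    // Frame-time is where the saving actually lands.
    std::printf("ultra:   %.2f ms\n", 1000.0 / ultra_fps);   // 16.67ms
    std::printf("medium:  %.2f ms\n", 1000.0 / medium_fps);  // ~14.37ms
    std::printf("minimum: %.2f ms\n", 1000.0 / minimum_fps); // ~13.89ms
    return 0;
}
```

In other words, medium recovers most of minimum's frame-time saving while avoiding the worst of its artefacts.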
Screen-space reflections are similarly taxing, with the next-gen consoles at the equivalent of PC's medium setting. That's what I'd recommend for optimised settings too - dropping from high to medium gains four per cent of performance, and while grainy artefacts appear as a result, they don't distract in most scenes. Another important setting is mirror quality - a render-to-texture effect that effectively sees the game drawing the same scene from an extra angle. When a mirror dominates the scene, the performance hit is more noticeable. Here I found an anomaly - IO tells us that PS5 renders at PC's medium setting, but on PC, medium does not appear to change fidelity compared to high when it really should. If medium is fixed, I'd recommend it based on the PS5's showing, but for now, high will have to do.
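Planar mirrors like this are typically handled by reflecting the camera across the mirror plane and rendering the scene a second time into a texture, which is why the cost scales with how much of the frame the mirror occupies. A minimal sketch of the reflection step - my illustration of the general technique, not Glacier Engine code:

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Reflect a point across the mirror plane n.p + d = 0 (n must be unit length).
// Applying this to the camera position (and mirroring its basis vectors the
// same way) yields the virtual camera that renders the mirror's view.
Vec3 ReflectAcrossPlane(const Vec3& p, const Vec3& n, float d) {
    const float k = 2.0f * (p.x * n.x + p.y * n.y + p.z * n.z + d);
    return { p.x - k * n.x, p.y - k * n.y, p.z - k * n.z };
}

int main() {
    // Example: mirror on the plane z = 0, camera two units in front of it.
    const Vec3 camera   = { 0.0f, 1.7f, 2.0f };
    const Vec3 mirrored = ReflectAcrossPlane(camera, { 0.0f, 0.0f, 1.0f }, 0.0f);
    std::printf("mirrored camera: %.1f %.1f %.1f\n",
                mirrored.x, mirrored.y, mirrored.z); // 0.0 1.7 -2.0
    return 0;
}
```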
Level of detail is the last meaningful setting you should take a look at - the higher you set it, the more detail you'll see in distant objects. IO pegs the next-gen consoles at the medium setting, but on balance I'd recommend the high preset for optimised settings, as I find the object pop-in a tad distracting on medium and low - and indeed on the consoles. Then there's the motion blur setting to consider - its performance hit is only really noticeable when fast-moving objects take up most of the camera view, where switching from high to off produced up to 15 per cent of extra performance. But this is a real outlier - I'd recommend tuning it to whatever you think looks best. For the record, the consoles disable it.
From there we move on to other PC-exclusive settings, kicking off with simulation quality. There are two options here - base and best - with all consoles using the former in order to cut down on CPU load. By opting for the base configuration, NPCs in the distance effectively update at half-refresh and other features are also pared back - though in truth, only the NPCs were noticeable to me. IO recommends the base option for those with quad-core CPUs, but I'd opt for best: it ran just fine for me on a Ryzen 5 3600, the mainstream consumer champion CPU of the moment.
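IO hasn't detailed how the simulation scaling works internally, but the visible behaviour - distant NPCs updating at half-refresh on the base setting - maps onto a familiar update-throttling pattern. A minimal sketch under that assumption; all names and the distance threshold here are hypothetical:

```cpp
#include <cstddef>
#include <vector>

enum class SimQuality { Base, Best };

struct NPC {
    float distance_to_camera = 0.0f; // hypothetical unit: metres
    // ... animation and AI state would live here
};

// Hypothetical per-frame tick: on 'base', far-away NPCs update every other
// frame, halving their CPU cost; staggering by index spreads the work evenly.
void UpdateNPCs(std::vector<NPC>& npcs, SimQuality quality,
                unsigned frame_index, float far_threshold = 30.0f) {
    for (std::size_t i = 0; i < npcs.size(); ++i) {
        const bool far_away = npcs[i].distance_to_camera > far_threshold;
        if (quality == SimQuality::Base && far_away && ((frame_index + i) % 2 != 0))
            continue; // skip this frame - an effective half-refresh update
        // FullNPCTick(npcs[i]); // AI + animation update would run here
    }
}
```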
| Setting | DF Optimised Settings | PlayStation 5 |
| --- | --- | --- |
| Shadow Quality | High | Medium |
| SSAO | Medium | Minimum |
| Screen-Space Reflections | Medium | Medium |
| Mirror Quality | High | Medium |
| Level of Detail Quality | High | Medium |
| Simulation Quality | Best | Base |
| Motion Blur Quality | Medium (Optional) | Off |
| Variable Rate Shading | Quality (If Supported) | N/A |
| Texture Quality | Ultra | Ultra |
The final option is variable rate shading, which looks to be the less impressive tier one implementation - the variant also supported by Intel integrated GPUs - applying a single shading rate to objects in the scene, rather than adapting to the perceptual similarity of colour or speed of movement as found in the tier two VRS seen in titles like Gears 5 or the Wolfenstein games. Two options are present - quality and performance - and I can't recommend the latter, which noticeably impacts overall image quality. If your GPU supports VRS, there's a three per cent frame-rate gain from the quality mode, based on analysis of the game's benchmark.
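For reference, this is roughly what tier one looks like at the API level in Direct3D 12: one shading rate set per batch of draws via a single command-list call, with no per-tile rate image as in tier two. The rate mapping below is my guess at how a quality/performance split might look, not IO's actual values:

```cpp
#include <d3d12.h>

// Query which VRS tier the GPU exposes (tier 1: per-draw rates only;
// tier 2 adds per-tile rate images and combiners).
D3D12_VARIABLE_SHADING_RATE_TIER QueryVRSTier(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))))
        return D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
    return options6.VariableShadingRateTier;
}

// Tier 1: the rate applies to everything drawn until the next call.
// Hypothetical mapping: 'quality' shades at half rate horizontally,
// 'performance' at quarter rate, for objects the game deems suitable.
void SetBatchShadingRate(ID3D12GraphicsCommandList5* cmd_list, bool performance) {
    cmd_list->RSSetShadingRate(
        performance ? D3D12_SHADING_RATE_2X2 : D3D12_SHADING_RATE_2X1,
        nullptr); // null combiners = default behaviour
}
```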
All in all, our optimised settings are similar to the compromises made on consoles like PlayStation 5 and Xbox Series X, but they are purposefully set higher in those areas where I believe quality can be gained without a meaningful performance downgrade. Running optimised settings up against max in the benchmark, you get a nice 17 per cent performance advantage, and the game looks much the same in most scenarios. You'll see in the video how this all plays out across a range of Nvidia and AMD GPUs, but I was also impressed with CPU performance.
With Hitman 3, DX12 is the standard and the DX11 renderer is deprecated, boding well for improved CPU performance. I tested a scene that proved particularly heavy on the CPU in Hitman 2 and found that even the mainstream Ryzen 5 3600 ran beautifully - even its lowest frame-rates were in the 80s. In percentage terms, the Ryzen 9 3900X ran about 17 per cent faster than the 3600, while the Intel Core i9 10900K delivered a 32 per cent advantage.
Bearing in mind we could get a nigh-on exact match for console settings via IO's assistance, I thought it might be interesting to see how PC GPUs stack up against PlayStation 5 and Xbox Series X using a stress test scenario from Hitman 2's Miami stage - one of the only places you'll find any drop beneath 60fps on PS5. You'll find my complete workings in the video at the top of this page, but perhaps not too surprising is the fact that AMD's Radeon RX 5700 XT provides the closest match to Sony's new machine. It possesses four more Navi compute units than PS5, but runs at a slower clock. I had some fun with this, and the comparisons with Series X at 4K resolution are also interesting, though I suspect memory bandwidth is a significant factor in the result, with particle overdraw being the most obvious challenge presented by the scene.
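The raw compute arithmetic shows why the two land so close: RDNA's peak FP32 throughput is CUs × 64 lanes × 2 ops (FMA) × clock. The clock figures below are the publicly quoted ones - PS5's 2.23GHz cap and the 5700 XT's rated game/boost clocks - used purely as illustration:

```cpp
#include <cstdio>

// Peak FP32 throughput for an RDNA GPU, in TFLOPS.
double tflops(int compute_units, double clock_ghz) {
    return compute_units * 64 * 2 * clock_ghz / 1000.0;
}

int main() {
    std::printf("PS5 (36 CUs @ 2.23GHz max):    %.2f TFLOPS\n", tflops(36, 2.230));
    std::printf("5700 XT (40 CUs, game clock):  %.2f TFLOPS\n", tflops(40, 1.755));
    std::printf("5700 XT (40 CUs, boost clock): %.2f TFLOPS\n", tflops(40, 1.905));
    return 0;
}
// ~10.28 vs ~8.99-9.75 TFLOPS - close enough that memory bandwidth and
// real-world clock behaviour decide the comparison.
```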
All told, the PC version of Hitman 3 is excellent - it looks great and runs well, and I hope to see further improvements made to this release over time. In particular, I look forward to seeing what kind of ray tracing may be added in the future - the current screen-space reflections are good, but a number of levels in Hitman 3 look like they could really benefit from full-blooded RT alternatives. It would also be great to see DLSS on the Glacier Engine. Typically, Nvidia's AI upscaling produces 'better than native' image quality in games with heavy TAA and post-processing, so I'd be interested to see how it fares on a more precise, pristine-looking title like this one. Hopefully we'll see that in the future, but right now, Hitman 3 on PC comes highly recommended.