Due to time constraints, all benchmarks on this page were carried out with the GeForce driver 497.29 or the Adrenalin 21.12.1. The editors checked the image quality with the newer drivers and found no changes in this regard.

    God of War offers better graphics on PC than on PlayStation 4 and PlayStation 5, with the differences being quite visible in some scenes and only minor in others. A direct comparison is possible because the game uses the console graphics in the "Original" graphics preset.

    PC graphics with small improvements

    The two biggest differences are found in the shadow rendering and the sharpness of detail. The game does not show more detail on the PC, but existing detail is rendered sharper in places. This depends heavily on the object: while some show no difference, others are visibly sharper.

    Shadow improvements fall into two distinct categories. The shadows themselves are rendered at a higher resolution on the PC and are much more contoured than on the consoles. In addition, more shadows are cast on the PC, some of which fade into nothing on the console. At the same time, the ambient occlusion takes a visible step forward. In the console version it is applied only very sparingly, which is why many objects cast hardly any shadows of their own and often appear to float in the air. This works better on the PC.

    [Image comparison: best possible graphics on the PC vs. original graphics from the PlayStation 4/5]

    The display of volumetric lighting effects also takes a visible step forward on the PC. Where they are used intensively, they appear quite blocky on the PlayStation but noticeably finer on the PC. The screen-space reflections are also calculated at a higher resolution and are therefore visibly sharper.

    In addition, the developers have apparently increased the tessellation factors of individual objects on the PC, which now appear more three-dimensional and less flat. However, this is by no means always the case and is generally only minimally noticeable. Apart from that, the editors did not notice any significant differences in a direct comparison.

    [Image comparison: best possible graphics on the PC vs. original graphics from the PlayStation 4/5]


    High performance loss for the better graphics (update)

    Better graphics cost a lot of performance in God of War. This is presumably also because the engine has to do some things it simply wasn't designed for. What is striking is that the performance loss is higher at low resolutions than at high ones. Accordingly, many of the changes appear to burden not only the graphics card but also the processor.

    In numbers, this means that with maxed-out PC graphics at 1,920 × 1,080, the GeForce RTX 3080 delivers average frame rates that are 47 percent lower than with the original PlayStation graphics at the same resolution, which leaves just a little over half. At 2,560 × 1,440 the loss is 43 percent; at 3,840 × 2,160 it is "only" 37 percent.
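
    To make the arithmetic explicit: a 47 percent loss leaves 53 percent of the original frame rate, hence "a little over half". A minimal Python sketch of the figures above:

        # Remaining frame rate after the measured performance losses
        # (GeForce RTX 3080, max PC graphics vs. "Original" preset).
        losses = {"1,920 x 1,080": 0.47, "2,560 x 1,440": 0.43, "3,840 x 2,160": 0.37}
        for resolution, loss in losses.items():
            remaining = 1.0 - loss
            print(f"{resolution}: {remaining:.0%} of the original FPS remain")
        # 1,920 x 1,080: 53% -- just a little over half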

    [Benchmark charts: PC vs. PlayStation graphics at 1,920 × 1,080, 2,560 × 1,440 and 3,840 × 2,160]

    Radeon and GeForce do not behave identically

    The Radeon RX 6800 XT behaves similarly, but not quite identically. In Full HD, the performance loss is almost identical at 46 percent, so there are no differences between the AMD and the Nvidia GPU here. In WQHD the loss is 40 percent, which is still similar but already shows a trend. The gap is largest in Ultra HD: with many pixels to render, the Radeon loses 31 percent with the best PC graphics, 6 percentage points less than the GeForce. Whether the Radeon RX 6800 XT has problems with few pixels or the GeForce RTX 3080 has problems with many pixels cannot be determined.
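
    Note that the gap is measured in percentage points, not percent; a quick check with the values from the benchmarks above:

        # Ultra HD performance loss, max PC graphics vs. "Original" preset.
        geforce_loss = 0.37   # GeForce RTX 3080
        radeon_loss = 0.31    # Radeon RX 6800 XT
        print(f"Gap: {(geforce_loss - radeon_loss) * 100:.0f} percentage points")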

    With the original PlayStation 4 graphics, high frame rates are no longer a real problem. The GeForce GTX 1060 and the Radeon RX 580 hover around the 50 FPS mark, with slight advantages for the AMD graphics card. That does not make for a perfect gaming experience in Full HD, but God of War is at least playable this way. All other graphics cards, whether new or old, easily exceed the 60 FPS mark.

    With PlayStation graphics, it doesn't have to be high-end anymore

    At 2,560 × 1,440, the Radeon RX 5700 XT and the GeForce RTX 2070 Super easily deliver more than 60 FPS with the original preset; with the better PC graphics, the former is overwhelmed and the latter does not quite reach 50 frames per second. At 3,840 × 2,160, the Radeon RX 6700 XT and the GeForce RTX 3060 Ti are also playable with the original preset, with the AMD graphics card delivering just under 60 FPS and the Nvidia counterpart a bit more.

    AMD FSR and Nvidia DLSS in comparison

    God of War offers both Nvidia's DLSS in version 2.3.4.0 and AMD's competing FidelityFX Super Resolution 1.0. What both features lack, however, is a manual image-sharpening control; this is not possible with the game's own means.

    With DLSS, that is not necessary either. The game's own TAA offers good image stability even at low resolutions, but image sharpness is not a strength of the temporal anti-aliasing: the picture is very blurry in Full HD, improves from WQHD onwards, and the blur has not completely disappeared even in Ultra HD. Since DLSS replaces the in-game anti-aliasing, it does not have this problem.

    DLSS sharpens the image and repairs objects


    As a result, even DLSS on Performance in Ultra HD, and thus a render resolution of just Full HD, delivers a sharper image than native resolution. DLSS on Quality is just a little bit better; a sharper picture is currently not possible in God of War. At no point does the image appear oversharpened, but simply as it should have looked with the TAA.
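
    The render resolutions mentioned follow from the per-axis scale factors of the DLSS modes (Nvidia's published values: roughly 67 percent for Quality, 50 percent for Performance). A small sketch:

        # Per-axis render scale of the DLSS modes (Nvidia's published values).
        dlss_scale = {"Quality": 2 / 3, "Performance": 1 / 2}
        out_w, out_h = 3840, 2160   # Ultra HD output resolution
        for mode, s in dlss_scale.items():
            print(f"DLSS {mode}: renders at {round(out_w * s)} x {round(out_h * s)}")
        # Performance: 1920 x 1080 -- the "just Full HD" render resolution above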

    A second benefit of DLSS is image reconstruction. The temporal anti-aliasing also does a good job in this respect at 3,840 × 2,160, but DLSS is simply a little better, so that very fine lines, such as thin branches, are occasionally "rebuilt" by DLSS. The advantage in the game is small, but still worth mentioning.

    In terms of image stability, on the other hand, DLSS does not quite match the native anti-aliasing, which does a really good job in this respect. With DLSS on Quality, a few edges flicker a bit more than with TAA, but that is complaining at a high level. Only with DLSS on Performance does the picture become more restless in several places. The flickering is not annoying, but the stability is no longer comparable to native Ultra HD.


    Smearing is back in GoW - and how

    Despite these small weaknesses in image stability, DLSS has so far delivered a very convincing result in this analysis of God of War, and it has to be said that DLSS on Quality simply looks better in the game than the native resolution, despite significantly fewer render pixels. However, for some inexplicable reason, an old bugbear of DLSS is making a comeback in God of War, and quite violently at that: we're talking about smearing, which hardly exists anymore with the latest DLSS versions.

    In God of War, however, the smearing is suddenly clearly pronounced again; it can be seen, sometimes more and sometimes less, in many sequences. Thin objects, primarily branches, are the problem, and there are a lot of them in God of War. But hair is also problematic and can, with certain camera movement vectors, smear as badly as DLSS did in its worst smearing days.

    FSR is amazingly stable but fuzzy

    Unlike Nvidia's DLSS, AMD's competing FSR does not have its own temporal component, which depending on the game is sometimes a major disadvantage and sometimes less of a problem. The latter applies to God of War: the image stability is surprisingly good, which is simply down to the TAA, which is very good in this regard.

    With FSR set to "Ultra Quality", the image doesn't flicker any more than at native resolution, something FSR rarely manages in other games. Even with FSR on Quality, the image stability is still good, even if weaknesses become noticeable. And even with the more aggressive FSR modes, the flickering remains astonishingly contained.

    In terms of image stability, FSR is thus hardly inferior to DLSS. However, since FSR relies on the game's TAA, image sharpness degrades visibly at lower render resolutions. And since DLSS already beats the native resolution in terms of image sharpness, it is clearly superior to FSR here. Even with FSR on "Ultra Quality", the image sharpness decreases compared to native resolution, and DLSS performs significantly better in this discipline. From FSR on Quality downwards, the game gets pretty blurry; DLSS on Quality is massively sharper at the same render resolution.

    In return, there are no problems

    Since FSR, unlike DLSS, has no temporal component, FidelityFX Super Resolution cannot recover lost detail. As already described, this is not a problem in God of War. On the flip side, there are no graphical problems either, so the image never smears. And since the developers apparently apply the CAS portion of FSR only lightly for resharpening, this causes no disadvantages either.
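
    For illustration, here is a heavily simplified, hypothetical sketch of the idea behind contrast-adaptive sharpening (CAS): sharpen less where local contrast is already high, so strong edges don't ring. This is not AMD's actual RCAS shader, just the core concept in Python/NumPy:

        import numpy as np

        def simple_cas(img, sharpness=0.2):
            # Heavily simplified, illustrative contrast-adaptive sharpen --
            # NOT AMD's actual RCAS shader. img: 2-D grayscale array in [0, 1].
            p = np.pad(img, 1, mode="edge")
            n, s = p[:-2, 1:-1], p[2:, 1:-1]   # north/south neighbours
            w, e = p[1:-1, :-2], p[1:-1, 2:]   # west/east neighbours
            c = img
            lo = np.minimum.reduce([n, s, w, e, c])
            hi = np.maximum.reduce([n, s, w, e, c])
            # Adaptive amount: large where the neighbourhood still has
            # headroom (low local contrast), small near saturated edges.
            amp = np.clip(np.minimum(lo, 1.0 - hi) / np.maximum(hi, 1e-5), 0.0, 1.0)
            k = -amp * sharpness                # negative lobe per pixel
            return np.clip((c + k * (n + s + w + e)) / (1.0 + 4.0 * k), 0.0, 1.0)

    On a flat region the numerator and denominator cancel and the pixel passes through unchanged; only areas with moderate local contrast get a sharpening boost.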

    Without smearing, DLSS would be almost perfect

    And what should now be used in God of War, DLSS or FSR? With a Radeon, the question answers itself: as is well known, DLSS is not available there. And since FSR does a good job in God of War, that is no great loss either. FSR on "Ultra Quality" delivers a convincing result on the monitor, perhaps the best FSR has ever achieved in a game. Only the lack of image sharpness is annoying, and AMD's CAS would have to be used to sharpen it, which can potentially lead to other problems. In case of performance problems, FSR can be used without hesitation; the image quality is better than with a simply reduced resolution.

    [Videos: smearing with DLSS]

    Nvidia's DLSS actually delivers an almost perfect result in GoW and has the potential to clearly surpass native rendering in the game. That is impressive considering how many fewer render pixels are available for the image. Unfortunately, the very pronounced smearing clouds what is otherwise an excellent result. As long as it is not noticeable, DLSS on Quality is clearly superior to the native resolution.

    If, on the other hand, the smearing is noticed, it is difficult to ignore. And so for most gamers, native resolution will remain the setting of choice, while DLSS remains the option for FPS problems. But if Nvidia reduces the smearing in God of War via a driver or patch, there will be a new, clear winner. Until then, DLSS on Quality is the mode of choice when using DLSS.

    The FPS gains of DLSS and FSR

    The performance gain of the upscaling techniques is limited in God of War, at least with a high-end graphics card. FSR on Quality, and thus a render resolution of WQHD, accelerates the Radeon RX 6800 XT by a not-too-high 37 percent; on the GeForce RTX 3080 it is an even lower 29 percent. While this is a decent performance boost, the same quality level usually gives a slightly larger boost in other games. That is not down to FSR, though, because the game's own resolution scaling does not perform any better: God of War simply doesn't scale all that well with fewer render pixels.
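
    The render resolutions named here follow from FSR 1.0's published per-axis scaling ratios (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x):

        # FSR 1.0 per-axis scaling ratios (AMD's published values).
        fsr_ratio = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
        out_w, out_h = 3840, 2160   # Ultra HD output resolution
        for mode, r in fsr_ratio.items():
            print(f"FSR {mode}: renders at {round(out_w / r)} x {round(out_h / r)}")
        # Quality: 2560 x 1440 (WQHD), Performance: 1920 x 1080 (Full HD)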

    If FSR is turned up higher, the FPS boost increases accordingly, but in terms of quality this is simply not recommended. FSR on Performance, and thus a Full HD render resolution, accelerates the Nvidia GPU by 48 percent and the AMD counterpart by 63 percent. FSR thus brings more performance in the action game on a Radeon than on a GeForce, something no other title has shown so far. However, it is likely that the Radeon simply copes much better with fewer pixels in God of War than the GeForce, which is often the case in the RDNA 2 vs. Ampere duel.


    DLSS looks better, but brings less performance

    As usual, the FPS gain from Nvidia's DLSS is lower than that of AMD's FSR, since the computation time of the neural network is significantly longer than that of the spatial upscaler. DLSS on Quality accelerates the GeForce RTX 3080 by 19 percent, so FSR ends up 8 percent faster at the same render resolution; the difference corresponds to DLSS's longer computing time. DLSS on Performance then accelerates the GeForce by 34 percent; here the gap to FSR on Performance (both with Full HD render resolution) is a slightly higher 10 percent.
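
    The relative figures can be verified from the gains themselves: at the same render resolution, the ratio of the two sped-up frame rates gives FSR's lead over DLSS.

        # FSR's lead over DLSS at the same render resolution
        # (gains measured on the GeForce RTX 3080, from the text above).
        for mode, fsr_gain, dlss_gain in [("Quality", 0.29, 0.19),
                                          ("Performance", 0.48, 0.34)]:
            lead = (1 + fsr_gain) / (1 + dlss_gain) - 1
            print(f"{mode}: FSR is {lead:.0%} faster than DLSS")
        # Quality: 8%, Performance: 10% -- matching the figures above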
