God of War was released as a flagship title on the PlayStation 4 in 2018, and a visually enhanced version for the PC will follow on Friday. ComputerBase has already taken a detailed look at the technology, including Nvidia DLSS and AMD FSR, and created numerous graphics card benchmarks.
Even with a GeForce GTX 1060 or Radeon RX 580, God of War is still playable, albeit not optimally.
The newly added graphics cards are:
- AMD Radeon RX 580
- AMD Radeon RX Vega 64
- AMD Radeon RX 5700 XT
- AMD Radeon RX 6700 XT
- Nvidia GeForce GTX 1060
- Nvidia GeForce GTX 1080
- Nvidia GeForce RTX 2060
- Nvidia GeForce RTX 2070 Super
After years of Sony developing exclusive games solely for its own PlayStation consoles, these titles have lately been making their way to the PC after some delay. Horizon Zero Dawn was the first, followed by Days Gone. And now God of War follows, another major title that has so far only been available on the PlayStation 4 and PlayStation 5, and one of the best-rated single-player games with a big campaign ever.
The PC version of God of War was not handled by the original developer, Santa Monica Studio, but by Jetpack Interactive. It brings more than the usual PC adjustments such as mouse and keyboard controls, a free choice of resolution and unlimited frame rates.
There has also been a step forward in graphics quality. In addition, Nvidia's AI-based upsampling DLSS 2.0 is supported, as is AMD's competing technology FidelityFX Super Resolution (FSR). Nvidia Reflex has also made it into the game, as have ultrawide resolutions in 21:9 format. There is no ray tracing.
Several small improvements for the PC
Even if the graphics have received some updates (more on this on the third page of the article), you can see that the engine developed by Santa Monica Studio is more than three and a half years old. As a result, the visuals cannot keep up with current graphics heavyweights such as Marvel's Guardians of the Galaxy, Call of Duty: Vanguard or Forza Horizon 5.
You can still see the high development budget
That's complaining on a pretty high level, though, because when it comes to artwork and production design, God of War is still absolutely top-notch: you can tell that the game was made on a big budget and by talented developers. Above all, the atmosphere created by the visuals, the animations and the level of detail is still very good. God of War is no longer in the top graphics category, but it is still enough for a place in the upper midfield, which is an absolutely respectable result after almost four years.

A Graphics Menu That Offers More Than the Bare Minimum
God of War has received a decent graphics menu on the PC. In addition to the usual detail options with the levels "Low", "Original" (which corresponds to the original PlayStation graphics), "High" and "Ultra", there are four different graphics presets, with the reflections going one step further to "Ultra+".
There is also an FPS limiter that can be configured between 30 and 120 FPS in 10 FPS increments, and AMD FSR or Nvidia DLSS can be switched on. However, manual sharpening of the image is not possible with either method, and no such function exists when neither option is used. In addition, the graphics menu shows sample screenshots of the effects of the individual settings, although descriptions are missing. The whole thing is rounded off by a VRAM utilization indicator.
There is No Real Full-Screen Mode
There is one special feature to consider when it comes to the resolution: God of War does not offer a true exclusive full-screen mode on the PC. Instead, the game always runs at the resolution set for the Windows desktop; otherwise, only a windowed mode is available. This makes sense for many things, but has the disadvantage that, for example, when using AMD's or Nvidia's downsampling, the higher resolution has to be set on the Windows desktop, because the game itself does not offer downsampling.
Alternatively, AMD's FSR or Nvidia's DLSS can be used. Both are qualitatively preferable to the game's own upscaling, but fewer resolution levels are then available, since FSR and DLSS work with fixed resolutions. The game's own upscaler is more flexible, and it is the way a lower render resolution is set in God of War: a slider configures the render resolution in 10 percent increments from 50 to 100 percent of the desktop resolution. The advantage of this method is that the game's HUD is consistently displayed in the best possible quality. However, a comparison with the windowed mode shows that between 5 and 10 percent of the performance gain from the lower resolution is lost. Why the value is so unusually high remains unclear.
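For illustration, here is a minimal Python sketch of the arithmetic behind those options; it assumes the in-game slider scales both axes equally (the usual convention for render-scale sliders) and uses the published per-axis factors of the FSR 1.0 quality modes for comparison:

```python
# Minimal sketch: render resolutions produced by a per-axis scale slider
# versus the fixed per-axis factors used by FSR quality modes.
# Assumption: the in-game slider scales width and height equally.

DESKTOP = (3840, 2160)  # Ultra HD desktop resolution

def render_resolution(scale_percent, desktop=DESKTOP):
    """Render resolution for a given per-axis scale in percent."""
    w, h = desktop
    return round(w * scale_percent / 100), round(h * scale_percent / 100)

# The game's own slider: 50 to 100 percent in 10-percent steps.
for pct in range(50, 101, 10):
    w, h = render_resolution(pct)
    share = w * h / (DESKTOP[0] * DESKTOP[1])
    print(f"In-game scaler {pct:3d}%: {w} x {h} ({share:.0%} of the pixels)")

# FSR works with fixed per-axis ratios instead (published FSR 1.0 values;
# DLSS uses similar factors for its quality modes).
FIXED_MODES = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
for name, factor in FIXED_MODES.items():
    print(f"FSR {name}: {round(DESKTOP[0] / factor)} x {round(DESKTOP[1] / factor)}")
```

Halving the slider therefore leaves only a quarter of the pixels to render, which is why the upper steps are the more sensible ones.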
The test system and the benchmark scene
All benchmarks were run on an Intel Core i9-12900K (test) at default settings. The Asus ROG Maximus Z690 Apex (BIOS 0702) with the Z690 chipset served as the motherboard, so graphics cards could be run with PCIe 4.0. The CPU was cooled by a Noctua NH-D15S with a centrally installed 140 mm fan, and 32 GB of memory (Corsair Vengeance, 2 × 16 GB, DDR5-5400, 40-40-40-84) was available to the processor. Windows 11 21H2 with all updates was installed, just like the game, on an NVMe M.2 SSD with PCIe 4.0. Resizable BAR was used on supported graphics cards from both AMD and Nvidia, and HAGS, where supported, was also active.

The Adrenalin 22.1.1 or GeForce 497.29 was used as the driver; the GeForce RTX 3080 was tested with the GeForce 511.17. The AMD driver and the GeForce 511.17 are already officially optimized for God of War. However, since the latter is currently only available for the GeForce RTX 3080, all other Nvidia graphics cards had to make do with the older GeForce 497.29, which was not yet fully optimized for the game.
The 25-second test sequence takes place in Alfheim shortly after the travel portal.
The scene is hard on the GPU and is demanding thanks to a high level of detail, lots of vegetation and volumetric lighting. Although there are still some scenes with higher demands, God of War mostly runs slightly faster than in the test sequence.
The maximum possible graphics details are used for the resolutions 1,920 × 1,080, 2,560 × 1,440 and 3,840 × 2,160. The resolutions are set via the game's own scaling function on an Ultra HD monitor. AMD FSR and Nvidia DLSS are disabled unless explicitly mentioned.
| Resolution | Graphics details |
| --- | --- |
| 1,920 × 1,080 | Ultra preset + reflections on Ultra+ |
| 2,560 × 1,440 | Ultra preset + reflections on Ultra+ |
| 3,840 × 2,160 | Ultra preset + reflections on Ultra+ |
Benchmarks in Full HD, WQHD and Ultra HD
God of War is a game whose frame rate isn't that dependent on the resolution. This is not because the CPU is limiting or because the game requires little GPU power; the opposite is actually the case, and a fast graphics card is necessary for the full quality. But the number of pixels to be rendered is apparently simply not the limiting factor. And so, for example, the Radeon RX 6800 XT is only 17 percent slower when switching from Full HD to WQHD and only 27 percent slower from WQHD to Ultra HD. That is unusually little; in Doom Eternal, the FPS losses are significantly higher at 23 percent and then 45 percent. This sounds good at first, but it also means that the performance in Full HD does not increase significantly compared to the higher resolutions.

Accordingly, it is not that easy in God of War to even reach the 60 FPS mark with full PC graphics, even in 1,920 × 1,080. That is not absolutely necessary either, but 50 FPS should be the minimum target in the test scene. And even those are only achieved from a GeForce RTX 2070, GeForce RTX 3060 Ti or Radeon RX 6700 XT upwards. With an old entry-level GPU such as the GeForce GTX 1060 or Radeon RX 580, the game shouldn't even be launched without a significant reduction in detail, and the Radeon RX Vega 64 and GeForce RTX 2060 also do not achieve playable FPS values.
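To put the scaling behavior described above into perspective, a quick back-of-the-envelope calculation in Python compares the growth in pixel count per resolution step with the FPS losses quoted for the Radeon RX 6800 XT:

```python
# Back-of-the-envelope: pixel count growth per resolution step versus the
# FPS losses quoted above for the Radeon RX 6800 XT in God of War.

resolutions = {
    "Full HD": (1920, 1080),
    "WQHD": (2560, 1440),
    "Ultra HD": (3840, 2160),
}
# Measured FPS loss per step, taken from the text above.
fps_loss = {("Full HD", "WQHD"): 0.17, ("WQHD", "Ultra HD"): 0.27}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for (lo, hi), loss in fps_loss.items():
    growth = pixels[hi] / pixels[lo] - 1
    print(f"{lo} -> {hi}: +{growth:.0%} pixels, -{loss:.0%} FPS")
```

The pixel count more than doubles from WQHD to Ultra HD, yet the frame rate drops by barely more than a quarter, which underlines how little the resolution limits this engine.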
In 2,560 × 1,440 you need a GeForce RTX 2070 Super, GeForce RTX 3060 Ti or Radeon RX 6700 XT (again) for 50 FPS; for 3,840 × 2,160, at least a GeForce RTX 3080 or Radeon RX 6800 XT is required. The requirements are high for a game that is now more than three years old, but the much more frugal original PlayStation graphics can help here.
God of War – 1920 × 1080
God of War – 2560 × 1440
God of War – 3840 × 2160
GeForce is sometimes significantly faster than Radeon
God of War is Nvidia territory: GeForce graphics cards sometimes run significantly better than their AMD counterparts. This is especially true for Ultra HD, but the same behavior is also evident at lower resolutions. The GeForce RTX 3080 and Radeon RX 6800 XT usually perform about equally fast in Ultra HD; in God of War, however, the Nvidia graphics card delivers a decent 17 percent more average FPS, and the percentile FPS are even a whopping 33 percent better. Down at Full HD, the advantage shrinks to 8 and 21 percent, but remains significant even then.

This does not always apply to old graphics cards. The Radeon RX 580 is 12 percent faster than the GeForce GTX 1060, and the Radeon RX Vega 64 is only just beaten by the GeForce GTX 1080. But that is of no use in the duel between Ampere and RDNA 2: the gap is greatest for the fast graphics cards, but it persists across the current generation down to the slower models. The GeForce RTX 3070 is as fast as or faster than the Radeon RX 6800, the GeForce RTX 3060 Ti has the Radeon RX 6700 XT under control, and the GeForce RTX 3060 is as fast as the Radeon RX 6600 XT. In other AAA games, by contrast, the Radeons perform better.
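A brief note on the percentile FPS mentioned above: the metric is derived from individual frame times rather than a plain average, so slow frames weigh in more honestly. Here is a minimal sketch of how such a value can be computed from a frame-time capture; the sample data and the exact percentile are assumptions, since ComputerBase does not disclose its precise formula:

```python
import statistics

# Hypothetical frame-time capture in milliseconds (not real measurement data).
frame_times_ms = [16.2, 16.5, 16.1, 17.0, 16.4, 24.8, 16.3, 16.6, 16.2, 19.5]

def average_fps(times_ms):
    """Average FPS: total frames divided by total capture time."""
    return 1000 * len(times_ms) / sum(times_ms)

def percentile_fps(times_ms, pct=99):
    """FPS at the pct-th percentile of frame times: only (100 - pct) percent
    of frames are slower than this, so it reflects stutter, not the mean."""
    slow_frame = statistics.quantiles(times_ms, n=100)[pct - 1]
    return 1000 / slow_frame

print(f"average FPS:    {average_fps(frame_times_ms):.1f}")
print(f"percentile FPS: {percentile_fps(frame_times_ms):.1f}")
```

The gap between the two numbers is exactly what the 33 percent figure above captures: the slower the worst frames, the further the percentile FPS falls below the average.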
Good frame times, even if not perfectly smooth
On a Radeon RX 6800 XT, God of War shows average frame times with strengths and weaknesses. There is a constant up and down in the time between the output of individual frames, but the outliers are all quite small; even medium-sized frame-time spikes, otherwise the standard in games, do not occur in 2,560 × 1,440. This is positive for the gaming experience, and with sufficient frame rates the controls feel pleasantly smooth. However, when the frame rate drops below the 50 FPS mark, the not-so-great frame pacing becomes noticeable. The situation is not optimal, but far from a problem.
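That "up and down" between frames can be made visible by comparing each frame time with its predecessor. A small illustrative sketch follows; the data and the spike threshold are invented for illustration and are not the article's measurements:

```python
# Illustrative frame-pacing check: flag frames whose render time jumps
# sharply relative to the previous frame. Data and threshold are invented.

frame_times_ms = [16.3, 16.8, 16.1, 16.9, 16.2, 28.0, 16.4, 16.7, 16.2, 16.8]
SPIKE_FACTOR = 1.5  # a frame taking 50% longer than its predecessor counts as a spike

for prev, cur in zip(frame_times_ms, frame_times_ms[1:]):
    if cur > prev * SPIKE_FACTOR:
        print(f"spike: {prev:.1f} ms -> {cur:.1f} ms (+{cur / prev - 1:.0%})")
```

Small, constant variations as on the Radeon barely register with this kind of check; it is the isolated large jumps that are perceived as stutter.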
Framepacing is smoother on a GeForce
The GeForce RTX 3080, on the other hand, clearly has better frame pacing in the game. It is not entirely perfect either; there is still a slight up and down between the individual frames. However, the differences are so small that they hardly matter. At high frame rates there is no difference between the gaming experience on a Radeon and a GeForce, but at low FPS the GeForce has the edge. It should be mentioned that God of War apparently likes to reload data in some places, which is noticeable as a short but perceptible hitch. This does not happen during combat, but it can occur during normal exploration of the area.

8 GB VRAM is enough for all situations
8 GB of VRAM is sufficient for God of War, including in Ultra HD. The game sometimes allocates significantly more, but the additional memory does not seem to be genuinely needed; at least the editors could not find any performance or graphics problems in this case. In Full HD, 6 GB of memory on the graphics card is enough for full texture details.

All benchmarks on this page were carried out with the GeForce 497.29 or the Adrenalin 21.12.1 due to time constraints. The image quality was checked by the editors with the newer drivers; there were no changes in this regard.
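Readers who want to check the VRAM behavior on their own system can poll the used memory while playing. On GeForce cards, the nvidia-smi command-line tool that ships with the Nvidia driver reports this value; a minimal sketch (Radeon owners need a different tool):

```python
import subprocess
import time

# Poll used VRAM once per second for a minute via nvidia-smi
# (ships with the Nvidia driver; this sketch is Nvidia-only).
for _ in range(60):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    print(f"VRAM used: {out.stdout.strip()} MiB")
    time.sleep(1)
```

Note that such tools report allocated memory, which is why the game can appear to "use" far more than it actually needs, exactly as described above.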