Discrete Gaming Performance

As stated on the first page, here we take both APUs from DDR4-2133 to DDR4-3466 and run our testing suite at each stage. For our gaming tests, we are only concerned with real-world resolutions and settings for these games. It would be fairly easy to adjust the settings in each game to a CPU limited scenario, however the results from such a test are mostly pointless and non-transferable to the real world in our view. Scaling takes many forms, based on GPU, resolution, detail levels, and settings, so we want to make sure the results correlate to what users will see day-to-day.

Civilization 6

First up in our APU gaming tests is Civilization 6. Originally penned by Sid Meier and his team, the Civ series of turn-based strategy games are a cult classic, and many an excuse for an all-nighter trying to get Gandhi to declare war on you due to an integer overflow. Truth be told I never actually played the first version, but I have played every edition from the second to the sixth, including the fourth as voiced by the late Leonard Nimoy. It is a game that is easy to pick up, but hard to master.

(1080p) Civilization 6 on ASUS GTX 1060 Strix 6GB, Average Frames Per Second
(1080p) Civilization 6 on ASUS GTX 1060 Strix 6GB, 99th Percentile

In Civilization 6, both CPUs saw gains of over 10% in both average and 99th percentile frame rates.
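For readers who want to reproduce this kind of figure from their own runs, the quoted uplift is simply the relative change between the slowest and fastest memory results. A minimal sketch, using hypothetical FPS numbers rather than the measured review data:

```python
def percent_gain(base_fps: float, new_fps: float) -> float:
    """Relative uplift of new_fps over base_fps, in percent."""
    return (new_fps / base_fps - 1.0) * 100.0

# Hypothetical example values, NOT the review's measured numbers:
# an average of 80 fps at DDR4-2133 rising to 90 fps at DDR4-3466.
gain = percent_gain(80.0, 90.0)
print(f"{gain:.1f}% uplift")  # prints "12.5% uplift"
```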

Ashes of the Singularity (DX12)

Seen as the holy child of DX12, Ashes of the Singularity (AoTS, or just Ashes) was the first title to actively explore as many DX12 features as it possibly could. Oxide Games, the developer behind the Nitrous engine that powers the game, has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.

(1080p) AoTS on ASUS GTX 1060 Strix 6GB, Average Frames Per Second
(1080p) AoTS on ASUS GTX 1060 Strix 6GB, 99th Percentile

Ashes was again a little inconsistent in its memory scaling with a discrete GPU: gains in average frame rates were available up to about DDR4-2866, but there was not much more beyond that.

Rise Of The Tomb Raider (DX12)

One of the newest games in the gaming benchmark suite is Rise of the Tomb Raider (RoTR), developed by Crystal Dynamics, and the sequel to the popular Tomb Raider which was loved for its automated benchmark mode. But don’t let that fool you: the benchmark mode in RoTR is very much different this time around. Visually, the previous Tomb Raider pushed realism to the limits with features such as TressFX, and the new RoTR goes one stage further when it comes to graphics fidelity. This leads to an interesting set of requirements in hardware: some sections of the game are typically GPU limited, whereas others with a lot of long-range physics can be CPU limited, depending on how the driver can translate the DirectX 12 workload.

(1080p) RoTR on ASUS GTX 1060 Strix 6GB, Average Frames Per Second
(1080p) RoTR on ASUS GTX 1060 Strix 6GB, 99th Percentile

Similar to Ashes, RoTR only sees meaningful gains over the base memory option. That being said, the Ryzen 3 2200G did get additional boosts in the percentile frame rates going up to DDR4-3466.

Shadow of Mordor

The next title in our testing is a battle of system performance with the open world action-adventure title, Middle Earth: Shadow of Mordor (SoM for short). Produced by Monolith and using the LithTech Jupiter EX engine and numerous detail add-ons, SoM goes for detail and complexity. The main story itself was written by the same writer as Red Dead Redemption, and it received Zero Punctuation’s Game of The Year in 2014.

(1080p) Shadow of Mordor on ASUS GTX 1060 Strix 6GB, Average Frames Per Second
(1080p) Shadow of Mordor on ASUS GTX 1060 Strix 6GB, 99th Percentile

Mordor seemingly prefers its processor without SMT. Raising the memory frequency saw small gains in percentile frame rates.

F1 2017

Released in the same year as the title suggests, F1 2017 is the ninth variant of the franchise to be published and developed by Codemasters. The game is based around the F1 2017 season and has been licensed by the sport's official governing body, the Fédération Internationale de l'Automobile (FIA). F1 2017 features all twenty racing circuits and all twenty drivers across ten teams, and allows F1 fans to immerse themselves in the world of Formula One with a rather comprehensive world championship season mode.

(1080p) F1 2017 on ASUS GTX 1060 Strix 6GB, Average Frames Per Second
(1080p) F1 2017 on ASUS GTX 1060 Strix 6GB, 99th Percentile

Total War: WARHAMMER 2

Not only is the Total War franchise one of the most popular real-time tactical strategy titles of all time, with Sega delving into settings such as the Roman Empire, the Napoleonic era, and even Attila the Hun, but more recently the series has moved into the world of Games Workshop via the WARHAMMER titles. Developer Creative Assembly has built its latest RTS battle title on the much talked about DirectX 12 API, just like the original Total War: WARHAMMER, so that it can benefit from all the associated features that come with it. The game itself is very CPU intensive and is capable of pushing any top-end system to its limits.

(1080p) Total War: WARHAMMER 2 on ASUS GTX 1060 Strix 6GB, Average Frames Per Second
(1080p) Total War: WARHAMMER 2 on ASUS GTX 1060 Strix 6GB, 99th Percentile

Comments

  • Meat Hex - Thursday, June 28, 2018 - link

    I would not read into these graphs and tests too much as they are only showing you the % gained from higher speed memory and not the actual FPS. You're still going to have a higher FPS on the 2400G vs the 2200G.
  • PeachNCream - Thursday, June 28, 2018 - link

    The 2200G is a reasonable CPU and the price is good for the performance you get back. If you do upgrade to a dedicated graphics card later (unlikely given the HTPC role you're aiming for due to heat, power, noise, and space concerns) the dGPU benchmarks show most of the games measured demonstrate the 2200G is relatively close in performance to the 2400G so there's that as well.
  • sing_electric - Thursday, June 28, 2018 - link

    Especially for an HTPC, "good enough" performance is often EXACTLY what you want, especially when you're considering chips on the same architecture/process, since the 2200G will make less heat, making the entire system run cooler, which, in turn, can mean a quieter system.
  • Lolimaster - Friday, June 29, 2018 - link

    It's all about the extra thread that will minimize stuttering from 4c/4t cpu, and will handle better a dgpu higher than a 1060.
  • GreenReaper - Friday, June 29, 2018 - link

    To be honest it might make the most sense to buy the APU, then the dGPU later, then a later-model CPU using 7nm architecture to replace the APU.
  • drzzz - Thursday, June 28, 2018 - link

    After reading the article I was expecting a conclusion to talk heavily about DDR-3333 memory and how it performed the best overall and clearly point out that speeds over that seemed to fall off and maybe some insightful thoughts on why that was seen. Given the spikes at 2933 and 3333 compared to 2400 it would seem to indicate there is some steady state synergy with the IF, memory and the caching mechanics that only manifest at specific bounds in the frequency increase. Again expected more about these two points in a conclusion vice the 2133 to 3466 differences that were talked about. The performance at 2933 and 3333 makes me curious if there is some logical hard design choice AMD made that would make memory selection easier for us once we identify it and if the same factors play into all the Zen based CPU's. I find it interesting that the xx33 speeds seem to be the strong points. So I am curious what would 2533 and 3733 look like. I know 3733 is not a realistic option. If the xx33 speeds are the best performing across the spectrum I would seriously love to know why and if same is true for Zen base CPU's.
  • peevee - Tuesday, July 3, 2018 - link

    "Given the spikes at 2933 and 3333"

    No spikes, the steps are different in size.
    And of course the simply misleading slowing of latencies on slower frequencies by setting CL etc to the same value.
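The latency point raised here is easy to verify with a quick calculation: the true access latency in nanoseconds depends on both the CAS latency and the clock, so holding CL constant while raising the data rate actually lowers absolute latency rather than keeping it equal. A minimal sketch of the standard formula (the CL value below is illustrative, not necessarily the timings used in the review):

```python
def cas_latency_ns(data_rate_mts: float, cl: int) -> float:
    """True CAS latency in ns: CL cycles at a clock of data_rate/2 MHz."""
    clock_mhz = data_rate_mts / 2.0   # DDR transfers twice per clock cycle
    return cl / clock_mhz * 1000.0    # cycles / MHz -> nanoseconds

# Same CL at different data rates gives different absolute latency:
for rate in (2133, 2933, 3466):
    print(f"DDR4-{rate} CL16: {cas_latency_ns(rate, 16):.1f} ns")
```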
  • zodiacfml - Thursday, June 28, 2018 - link

    Do a review of the GT 1030 with DDR4.
  • PeachNCream - Thursday, June 28, 2018 - link

    Do you mean the GT 1030 that uses DDR4 on the card as opposed to the GDDR5 model? If that's the case, I'd like to second that. The performance difference between the two memory types would be worth analyzing. As the GDDR5 model is a bit ahead of AMD's APUs, I'd imagine the DDR4 model would cost enough performance to hand the performance advantage back to a 2400g, but it'd be useful to see that play out in Anandtech's benchmarks.
  • TheWereCat - Thursday, June 28, 2018 - link

    Gamers Nexus released their review yesterday.
    Apparently the GT1030 DDR4 is so starved for memory bandwidth that at best it performs "only" 50% worse than the GDDR5 version and at worst it falls 2x short.
    That makes it worse than a 2200G paired with a single channel 2400MHz DDR4.
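The bandwidth starvation described above can be estimated from bus width and data rate alone. A back-of-envelope sketch, assuming the commonly listed GT 1030 specifications (64-bit bus, roughly 6000 MT/s for the GDDR5 model versus roughly 2100 MT/s for the DDR4 model; exact figures may vary by board):

```python
def mem_bandwidth_gbs(data_rate_mts: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s from data rate (MT/s) and bus width."""
    return data_rate_mts * 1e6 * (bus_bits / 8) / 1e9

gddr5_card = mem_bandwidth_gbs(6000, 64)      # ~48.0 GB/s
ddr4_card = mem_bandwidth_gbs(2100, 64)       # ~16.8 GB/s
single_channel_ddr4 = mem_bandwidth_gbs(2400, 64)  # ~19.2 GB/s for one channel

# The DDR4 card has roughly a third of the GDDR5 card's bandwidth,
# and even less than a single channel of DDR4-2400 system memory.
print(gddr5_card, ddr4_card, single_channel_ddr4)
```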
