104 Comments
NedHej - Wednesday, December 16, 2020 - link
Why, for the integrated tests, are you showing 360pMin and 1080pMax? One of those I'm never going to use, and the other I'm not going to expect an IGPU to deliver.
Sure the 1080pMax is good to know but why not pick something sensible, like 1080pMin or 720pMedium, to suggest a setup people might actually want to know about?
lucasdclopes - Wednesday, December 16, 2020 - link
I entered the comment section to say the same thing. For IGP, 1080p max is too much but 360p/600p low is too little. Needs a 768p medium.
StevoLincolnite - Wednesday, December 16, 2020 - link
Yeah. Also agree. 720P is what I would be targeting if I intend to game on integrated graphics. 360P/480P and 1080P are not my use cases/expectations.
n13L5 - Saturday, January 16, 2021 - link
Yeah, this test was done for, or by, a theoretical person rather than a gamer; a gamer would want settings that give them an idea of whether it is going to be useful for gaming or not.
I also missed the GTX 1650, just to see how APUs compare to the cheapest *modern* card you can buy at $160 or so (unless there's a bitcoin mining boom).
Granted, if you buy a $160 GPU and stay in the price range of the 4750G, you'd need to find a used i5-7xxx or so on eBay for under $100.
But that would surely run better than anything they tested here, other than the GTX-1080...
jakky567 - Sunday, December 20, 2020 - link
I'd argue 768p is a weird resolution in the context of these chips, as I'd expect them to be used with an external display. In games, if you're using a lower resolution, I think 720p makes more sense anyhow.
n13L5 - Wednesday, December 30, 2020 - link
True. Even so, people do play and benchmark these on YT @ 1080p with somewhat reduced settings.
It's not horrible looking, but obviously no ray tracing there...
ricebunny - Wednesday, December 16, 2020 - link
Tests “World’s best APUs”, but does not include Apple M1??
Tomatotech - Wednesday, December 16, 2020 - link
I admire the M1 but it isn't an APU. That APU name is just AMD's branding and only applies to their chips. (Intel's iGPUs aren't tested here either.)
nandnandnand - Wednesday, December 16, 2020 - link
Hogwash. If Intel makes a CPU with a faster iGPU than AMD, then it's a better APU. Period.
Technically, the Xbox Series X and PS5 have the world's best APUs. But good luck running your own OS on them.
ricebunny - Wednesday, December 16, 2020 - link
Actually, the review does include a soldered Intel iGPU of the Tiger Lake family, plus a Broadwell-family desktop CPU.
@Ian: Can you specify in more detail per game what the max settings are?
Assuming that for Civ6 that means Ultra settings (by default without AA), I get 33 FPS on the M1 mini. That's 65% faster than the 4750G and entirely playable.
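For reference, here is what that "65% faster" figure implies arithmetically. This is a back-of-envelope sketch using only the numbers quoted in the comment above, not figures taken from the review's data tables:

```python
# Back-calculating the 4750G result implied by the comment above.
# Both inputs (33 FPS, "65% faster") come from the comment itself,
# so treat the output as illustrative rather than as review data.
m1_fps = 33.0
claimed_speedup = 0.65          # "65% faster"
implied_4750g_fps = m1_fps / (1 + claimed_speedup)
print(f"Implied 4750G Civ6 result: {implied_4750g_fps:.1f} FPS")  # ~20 FPS
```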
shady28 - Wednesday, December 16, 2020 - link
That's a big deflection. It only includes modern Intel CPUs in the *discrete GPU* tests. Given the title of the article, it should have them in the regular iGPU tests. But that wouldn't fit the narrative or the title now, would it? This article needs to be pulled and re-written.
Fulljack - Thursday, December 17, 2020 - link
@shady28 That's because the Intel Gen 9 and 9.5 iGPUs from Skylake and its derivatives are actually worse than Broadwell's Gen 8 iGPU with eDRAM. Comparing them would be useless.
Flunk - Wednesday, December 16, 2020 - link
If you want to split all the hairs, the M1 is an SoC. The difference being that an APU requires at least one other chip to function, the southbridge (theoretically; how you count "chips" here is somewhat up in the air), and an SoC doesn't.
Fataliity - Wednesday, December 16, 2020 - link
Actually, Ryzen on desktop can run without the chipset. The chipset, though, adds more PCIe lanes for NVMe, etc. So only one mITX board (I think it's an ASRock?) actually might go without the chipset.
throAU - Tuesday, December 22, 2020 - link
To be fair, pretty much exactly nobody in the market for an AMD APU-based machine is in the market for an M1-based machine at the moment.
So whilst they're both processors including integrated graphics, they're totally different market segments.
bananaforscale - Wednesday, December 16, 2020 - link
Can you actually buy an M1 anywhere not soldered onto a Mac motherboard? It can't be compared reliably because the platform is completely different; most tests would have to be run through emulation and so on.
Frantisek - Sunday, December 20, 2020 - link
Understand and even agree, but even in emulation those tests would be interesting, to see the current state of the art of Apple technology. Most games run on the M1, if not natively or via Rosetta, then as PC versions via CodeWeavers. And they are playable.
Frantisek - Sunday, December 20, 2020 - link
Checked Deus Ex and there is a test showing it on the M1 at a custom 1080p setting with 25-34 fps.
TheinsanegamerN - Wednesday, December 16, 2020 - link
No kidding. I mean, what about the Qualcomm 627 in my phone? Or the VIA chip in my network controller? You can't have a comprehensive review without those!!!!
Kuhar - Thursday, December 17, 2020 - link
ricebunny - you are a hoot! :) that's what you are. ricefunny :)
Fulljack - Thursday, December 17, 2020 - link
Comparing x86 with ARM64 just won't be fair, as there are too many variables at play. But comparing it as an Apple laptop against a general x86 laptop, e.g. an MSI Prestige or Acer Swift, would be fair, though.
Frantisek - Sunday, December 20, 2020 - link
What is weird is that there is not a single word about the M1. Not even in the conclusion, where the author goes through many architectures, even consoles, but omits the hot new ball on the playground. Seems like a lack of communication among the authors here.
regsEx - Wednesday, December 23, 2020 - link
So are Intel non-F CPUs, which are also APUs. Even the 10900K is an APU.
29a - Wednesday, December 16, 2020 - link
I agree, the data in this article is useless because no one would ever use these settings. Why not use settings that people would actually play at?
0ldman79 - Thursday, December 17, 2020 - link
I tend to agree.
It doesn't match up with the rest of the benches, but when we're comparing it to the GTX 950, GT 1030, etc.: 360P, no one plays that; 1080P max, no one can play that.
I either wind up playing at 720P max or 1080P medium on my laptop, which would be comparable.
Irata - Thursday, December 17, 2020 - link
Agree. Even having a 1080p minimum option would be useful, and perhaps the inclusion of more e-sports titles. Fortnite, CS:GO and similar games should have pretty decent frame rates @ 1080p with reduced quality settings.
MDD1963 - Friday, December 18, 2020 - link
Thank God we got *lots* of results of the APU's performance when paired with a discrete GPU, however! (This lets the uninformed overlook the fact that discrete graphics still blows chunks...) Showing results below 720P is, frankly, utterly retarded...
nunya112 - Saturday, December 19, 2020 - link
I think for a laptop integrated GPU, 720p maxed-out settings should be the target, and for a 125W desktop APU, 1080P medium should be OK, as a 1080P GPU card alone would be 60W+.
ET - Sunday, December 20, 2020 - link
Yeah, same comment. I'd have loved to see tests relevant to how people will want to play the games with such an iGPU.
Also, it would have been nice to see the comparisons to the 3400G and 2400G in the discrete GPU and CPU sections.
n13L5 - Wednesday, December 30, 2020 - link
Good point... 360p... what?
lucasdclopes - Wednesday, December 16, 2020 - link
Great article. But this APU seems severely bandwidth-starved in the IGP tests. Would love to see some OC memory results, since a lot of people have been able to push the memory clocks really high with these chips.
tamsysmm - Wednesday, December 16, 2020 - link
For a short reference, I did a few tests with Phoronix Test Suite 10.0.1 for one of our builds for a business customer.
AMD Ryzen 5 PRO 4650G @ 3.7 GHz / ASRock B550M PRO4 MB
Unigine Superposition 1.0
Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL
2 x 8192 MB 3200MHz Kingston, avg 32.8 fps (max 43.3 fps), (KVR32N22S8/8*2)
2 x 8192 MB 3600MHz Kingston, avg 34.7 fps (max 47.9 fps), (HX436C17PB4AK2/16)
4 x 8192 MB 3600MHz Kingston, avg 37.0 fps (max 51.9 fps), (HX436C17PB4AK2/16*2)
~ 5.79% faster with 12.5% more speed with 2 sticks (~ 10.06% max fps)
~ 12.8% faster with 12.5% more speed with 4 sticks (~ 19.86% max fps)
Naturally 3600MHz sticks had better timings.
3200MHz@jedec settings (22-22-22), 3600@XMP settings (17-19-19, both 2 & 4 sticks)
I would assume four 3200MHz sticks would also bring a speedup compared to two sticks. Two dual-rank (DR) sticks would probably also be beneficial compared to two single-rank (SR) ones.
Unfortunately I did not have time for actual OC testing. I need to do these "lab tests" remotely and failed OC tests do not work very well :-)
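As a quick sanity check of the scaling percentages quoted above, here is a small sketch that reuses only the average-FPS figures from this comment (nothing below comes from the review itself):

```python
# Recompute the memory-scaling gains from the Superposition averages above.
baseline_2x3200 = 32.8   # avg FPS, 2 x DDR4-3200 (JEDEC)
avg_2x3600 = 34.7        # avg FPS, 2 x DDR4-3600 (XMP)
avg_4x3600 = 37.0        # avg FPS, 4 x DDR4-3600 (XMP)

for label, fps in (("2 x 3600", avg_2x3600), ("4 x 3600", avg_4x3600)):
    gain_pct = (fps / baseline_2x3200 - 1) * 100
    print(f"{label}: +{gain_pct:.2f}% vs 2 x 3200")  # ~5.8% and ~12.8%, matching the comment
```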
dwillmore - Wednesday, December 16, 2020 - link
I like how you wrote the model # in Sharpie right under where AMD laser-etched it into the IHS.
shabby - Wednesday, December 16, 2020 - link
He should have written that it's also an AMD Ryzen, because it's not printed anywhere on the CPU...
Ian Cutress - Wednesday, December 16, 2020 - link
In strong light, it's always hard to tell what CPU you're pulling out of a box (and my boxes hold 50+ CPUs). Sharpie is very easy to see, especially at a quick glance.
Olaf van der Spek - Wednesday, December 16, 2020 - link
I only see average FPS values... where do I find the 99th percentile frame time values?
bananaforscale - Wednesday, December 16, 2020 - link
The A9-9820 probably isn't an Xbox APU. The numbering and specs are off.
Ian Cutress - Wednesday, December 16, 2020 - link
It's the Xbox One S (Edmonton) with the CPU clocked at 2350 MHz.
GreenReaper - Friday, December 18, 2020 - link
Did a search on it and I found the long but very interesting https://thechipcollective.com/posts/cynical/cato/
Reflex - Wednesday, December 16, 2020 - link
I'd love to find a way to use this buying trick to acquire a Ryzen 3900 non-X for a reasonable price. I'm using a B350 board, so I can't just run the X version at 65W, which is what I'm after for my ITX system.
CrispySilicon - Wednesday, December 16, 2020 - link
Poor Broadwell. Give it the latest DDR3 and some XMP love, sheesh. Even my 5775C HTPC has some damn 1866 DDR3L in it.
alufan - Wednesday, December 16, 2020 - link
Will this be repeated when the new APUs come out?
Ditto others' comments re 720p, which is probably the APU sweet spot.
lightningz71 - Wednesday, December 16, 2020 - link
I have to agree with other posters; neglecting to include test results for these APUs with overclocked RAM severely reduces the utility of this review. Given the lengths that people will have to go to to acquire these processors, they must have a use case that requires maximum performance of the iGPU, and will more than likely overclock their RAM. For me, if I were sourcing one of these, it would be going into a tiny ITX case with no room for a discrete video card, and I would be overclocking my RAM to its limits. Not having at least DDR4-4000 with tight timings does a disservice. And I would be fine with all the competing desktop APUs having overclocked RAM as well.
29a - Thursday, December 17, 2020 - link
Ian claims that not enabling XMP is the more realistic scenario because he thinks that the majority of people who build their own PCs won't go into the BIOS and enable XMP. Seriously.
lmcd - Friday, December 18, 2020 - link
I hadn't on any of my builds until this year, and I guarantee most of my friends fall into the same boat. Most build guides don't focus on RAM speed, so it's not common knowledge.
29a - Friday, December 18, 2020 - link
Then you and your friends don't know how to properly build a computer.
robbro9 - Wednesday, December 16, 2020 - link
I was really surprised at how well the 2400G/3400G has held up to these newer APUs. I was thinking they would be left further behind, but they remain very competitive. Guess it shows that, APU-wise, AMD has not advanced a whole lot over the last few years either.
Hoping the next-gen Zen 3 parts bring a big graphics boost to the table as well. With the mid/low-end games I typically play, that would mean no more graphics card purchases for me (unless I wanted a mid-life-cycle refresh or some such)...
TheinsanegamerN - Wednesday, December 16, 2020 - link
A big part is memory bandwidth. While AMD has improved, it's not like we have 5000 MHz DDR4 feeding these chips. AMD's focus has also been more on the efficiency of the GPU as opposed to outright performance.
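To put rough numbers on that bandwidth point, here is a small sketch of the theoretical peak figures involved. These are textbook calculations, not measurements from the review, and the iGPU has to share this bandwidth with the CPU cores:

```python
# Theoretical peak bandwidth of a dual-channel DDR4 setup:
# transfer rate (MT/s) x 8 bytes per 64-bit channel x 2 channels.
def dual_channel_ddr4_gbps(mt_per_s):
    return mt_per_s * 8 * 2 / 1000  # GB/s

print(dual_channel_ddr4_gbps(3200))  # 51.2 GB/s, e.g. the DDR4-3200 JEDEC kits discussed here
print(dual_channel_ddr4_gbps(5000))  # 80.0 GB/s for the hypothetical "5000 MHz" kit above
# For comparison, a GDDR5 GT 1030 has roughly 48 GB/s of memory bandwidth all to itself.
```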
GeoffreyA - Thursday, December 17, 2020 - link
Was surprised myself how well they've held up. Looking forward to seeing Cezanne in action.
Cloakstar - Wednesday, December 16, 2020 - link
Try 4 sticks of RAM in Bank + Channel Interleave mode. AMD APU gaming performance goes up by 1/3, leaving only 3 of the IGP tests under 30 FPS on the 4750G.
Cloakstar - Wednesday, December 16, 2020 - link
(All tests were done with 2 sticks of RAM, so channel interleave only.)
tamsysmm - Wednesday, December 16, 2020 - link
I'd say that is a bit optimistic. I got a speedup in my testing, but not 33%.
Unigine Superposition 1.0
Resolution: 1920 x 1080 - Mode: Fullscreen - Quality: Low - Renderer: OpenGL
2 x 8192 MB 3600MHz Kingston, avg 34.7 fps (max 47.9 fps), (HX436C17PB4AK2/16)
4 x 8192 MB 3600MHz Kingston, avg 37.0 fps (max 51.9 fps), (HX436C17PB4AK2/16*2)
peevee - Wednesday, December 16, 2020 - link
4 channels of DDR5 directly on the APU package (not routed through the motherboard), 200W+, combining iGPU power with a discrete GPU of at least the same architecture, and we are talking (I am buying).
ArcadeEngineer - Sunday, December 20, 2020 - link
Four channels' worth of DDR5 chips is far larger than any CPU socket.
peevee - Wednesday, December 23, 2020 - link
3 sides of a CPU board can fit 4 SODIMM slots. Even 8, with 2 stacked one above the other. They can be horizontal, vertical (but that will limit the size of a radiator) or slanted.
Danvelopment - Wednesday, December 16, 2020 - link
These chips were a real disappointment.
Not because of their performance, but because of their OEM-only status.
They were exactly what I needed for an 8-core mini-ITX server with good multi... single-threaded performance (many single-threaded streams), a pico-PSU and no expansion slots, and they were never made available. Some stock did appear here and there, as the article explains, but the pricing was extreme.
Thus I still have a huge full tower with an R7 2700, and my next server will either be an 8-core Xeon E5 v2 on one of those crazy Chinese X79 motherboards with dual M.2 NVMe slots in ITX.
Or a massive 2x14-core 2 GHz dual Xeon E5 v3 with a total system price close to buying just one of those processors.
I lose out on single-thread performance but gain on pure throughput.
foxalopex - Wednesday, December 16, 2020 - link
If you don't mind gambling a bit (I did and got what I was looking for), these chips can be found on AliExpress for a reasonable price.
Lucky Stripes 99 - Thursday, December 17, 2020 - link
Similar story here. I wanted to build a couple of nearly silent mini-ITX APU systems to replace my older Haswell desktops and I kept hitting walls. Zen+ APUs are getting long in the tooth and are incompatible with newer A520 and B550 boards (and X570 isn't optimal either). Zen 2 APUs have been stupidly expensive until recently and have questionable warranty support when purchased from Asian resellers. Zen 3 APUs are coming, but we don't know when, or if they'll be OEM-only like their predecessors. Intel Comet Lake desktop CPUs lack an Iris Plus iGPU option, so you're stuck with horrible UHD 630 performance. Rocket Lake CPUs with an Xe iGPU are coming, but given Intel's recent schedule misses, who knows when.
lmcd - Friday, December 18, 2020 - link
Rocket Lake won't slip, it's a 14nm part lol
ozzuneoj86 - Wednesday, December 16, 2020 - link
I have to echo the comments of others. Why benchmark 360p or 480p for integrated graphics? How is that even remotely relevant?
The funny thing is, I work with retro computers and as I type this I have a 3dfx Voodoo 2 on a test bench right now, stress (heat) testing at 640x480. It runs like butter at that res, at least in games from 1997-1999.
Why would anyone need to know how a modern IGP runs at resolutions similar to or lower than what 3D accelerators used 23 years ago? What games even support 360P (480x360... a display resolution not normally used by PCs at any time period), and how could you even read menus at that res??
Rudde - Wednesday, December 16, 2020 - link
The lowest resolutions are there to remove any GPU bottleneck. It's to measure CPU performance, not iGPU performance.
ozzuneoj86 - Thursday, December 17, 2020 - link
Why would they remove the GPU bottleneck on the Integrated Graphics test page? Isn't the whole point of that page to test the IGP performance?
No one buys a new computer to play at 1080P at 19fps OR to play at 480x360 at 70fps. Just show the most realistically playable setting for a given game. For most games that'd be 1280x720 or 1366x768. If it can't break 30fps at those settings, most people will simply not play a game, or they may drop it to 1024x768 or some other somewhat common resolution. They used reasonable resolutions for some tests; I just don't get why they used 360P and 1080P on others, when the results were obviously not going to reflect anything that anyone would ever actually use the IGP for.
I feel bad, because this site is one of the few left that bothers to do detailed written reviews like this (...bye, Techreport); I just don't understand the choice of tests over the last several years.
erotomania - Wednesday, December 23, 2020 - link
Exactly, Joe. Exactly. Maybe small rodents make a comeback one day, although the chances are approaching zero.
zakelwe - Wednesday, December 16, 2020 - link
How come AnandTech is posting this rather than discrete graphics reviews such as the Nvidia 3xxx series and AMD 6xxx? What is going on with your once great site?
Rudde - Wednesday, December 16, 2020 - link
The GPU testing is done in California and this is done in the UK. Obviously they have a lot of trouble in California (they've mentioned the wildfires).
quorm - Wednesday, December 16, 2020 - link
Lol
29a - Thursday, December 17, 2020 - link
This is really the excuse they are using.
lmcd - Friday, December 18, 2020 - link
If you've ever dealt with PG&E you'd immediately recant, and offer sacrifice to whatever responsible deity protected you from their incompetency.
29a - Thursday, December 17, 2020 - link
Anand sold it and it went to shit, with terrible articles that could be good, like this one.
Qasar - Friday, December 18, 2020 - link
29a, if you are that unhappy with AT, WHY do you keep coming here and, from what it seems, wasting your time? Or do you just come here to whine and complain?
RSAUser - Wednesday, December 16, 2020 - link
Issues with this review:
Missing 720p, which is the most common gaming resolution for chips like these.
No check or attempt at any form of memory overclock, or at seeing if there's an easy one-click option and using it as a comparison point as well, since iGPUs are all memory sensitive.
No frame time graph; average FPS is not even close to the whole story if there are frame drops, and at most of these averages I'd expect them to drop substantially at times.
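To illustrate why an average can hide frame drops, here is a small sketch with made-up frame times; the numbers are purely hypothetical and are not measurements from this review:

```python
import statistics

# Hypothetical run: 95 smooth frames at ~60 fps plus 5 hitches at 20 fps.
frame_times_ms = [16.7] * 95 + [50.0] * 5

avg_fps = 1000 / statistics.mean(frame_times_ms)
p99_frame_time = sorted(frame_times_ms)[int(0.99 * len(frame_times_ms)) - 1]

print(f"Average: {avg_fps:.1f} fps")                       # ~54 fps, looks fine
print(f"99th percentile frame time: {p99_frame_time} ms")  # 50 ms, i.e. 20 fps stutters
```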
Rudde - Wednesday, December 16, 2020 - link
If they'd overclock RAM on one system, they'd have to do it on all systems. They are currently testing them at manufacturer clocks for the CPU, RAM and GPU. Overclocking deserves its own article, but I'm afraid it'd require too much time to do with every processor in #CpuOverload
linuxgeex - Wednesday, December 16, 2020 - link
It's about revealing the performance characteristics of the APU when either the CPU is the bottleneck (low resolution) or the GPU is the bottleneck (high resolution), and then comparing those limitations to the field so you can see how these APUs perform at those extremes. It's not about "let's see if we can squeeze a playable game out of this".
TelstarTOS - Wednesday, December 16, 2020 - link
Average CPUs and bad APUs. Anything that performs worse than a 1650 is useless for gaming.
foxalopex - Wednesday, December 16, 2020 - link
I actually recently picked up a Ryzen 5 Pro 3400GE from the AliExpress market. For a 35-watt chip with ECC support and a built-in GPU, it makes an absolutely awesome little Linux server. I'm planning on adding a discrete GPU using PCI passthrough to make up for the graphics performance.
zamroni - Wednesday, December 16, 2020 - link
Desktop doesn't need an on-die integrated GPU.
Intel's Kaby Lake-G design, which has a Radeon die in the CPU package, is actually better for desktop.
schujj07 - Wednesday, December 16, 2020 - link
I do wonder how much of Tiger Lake's advantage in the iGPU tests that it wins has to do with it having 33% more RAM bandwidth.
cbm80 - Wednesday, December 16, 2020 - link
"This means that the only company taking socketed desktop graphics seriously right now is AMD..."
This is a strange statement given that AMD's integrated graphics offerings are a generation behind non-integrated, and the ones you can actually buy are two generations behind and have limited motherboard support.
Leeea - Wednesday, December 16, 2020 - link
I see AnandTech is continuing the moronic policy of using JEDEC memory speeds.
At least enable XMP!
xMetaRidley - Thursday, December 17, 2020 - link
Seriously, that is incredibly stupid. Anyone running JEDEC has UNKNOWINGLY misconfigured their system. Why is a tech review site afraid to go into the BIOS and set the RAM speed properly? It probably fell back to 2133 MHz with ridiculous timings. Considering that Zen is sensitive to memory latency, it makes the entire review worse than worthless.
It would be like benchmarking a router's WAN performance by hooking it up to a 1.5 Mbps DSL modem, then saying "we feel this benchmark is appropriate because much of the rural US is limited to this speed".
Kuhar - Thursday, December 17, 2020 - link
I guess you two guys didn't read the article, did you?! It states clearly what was done about memory SPEED and about memory TIMINGS.
29a - Thursday, December 17, 2020 - link
I know, it sucks. The reasoning is they think people who build their own computers won't go into the BIOS and enable XMP.
BlueScreenJunky - Thursday, December 17, 2020 - link
My conclusion is that we need better low-power discrete GPUs. The GT 1030 is over 3 years old and still faster than IGPs. If Nvidia could release an RTX 3030 with a 30W TDP, it would be perfect for small machines.
nero10578 - Thursday, December 17, 2020 - link
Why would you not test 720p, which is what people use integrated GPUs for? 1080p high is too high, and 360p, well, no one plays at that resolution. Also, wouldn't it be better to test the GT 1030 using the APUs? To remove any performance differences coming from the CPU side, which from a 2600 to a 4750G is massive.
Fujikoma - Thursday, December 17, 2020 - link
Thank you for the review. These are typically the types of processors I use to build systems for older family members. Almost exclusively used for web surfing, online purchases, YouTube, Facebook, Office/Libre, burning music CDs, simple/archaic games, and one individual using Photoshop. I will agree, as others have stated, that 720P really should be the minimum monitor size, mainly because it's more expensive to find a monitor with a lower resolution than a cheap 720P/768P/1080P LCD. I don't mind the 1080P max tests, because it's nice to see what kind of a brick wall the iGPU will be hitting. I just finished computer builds for my brother and ex-wife, which leaves replacing other family members' really old APU systems. Now I know why I don't see much of a selection anymore. Shame, because I enjoy DIY over buying a basic box from an OEM, and this just makes it more of a hassle.
Superunknown9898 - Thursday, December 17, 2020 - link
Anyone else notice the Ryzen 4750G struggling to beat the 3600X in games but dominating in a few office benchmarks? I figured the 4750G would make a clean sweep, seeing as it runs at identical frequencies to the 3600X but has more L1 and L2, two more cores, plus a monolithic die.
maindan - Thursday, December 17, 2020 - link
What a gross oversight not to include the new Apple M1 CPU here, which is essentially an APU.
Inexcusable.
qlum - Thursday, December 17, 2020 - link
Not really the area of focus for AnandTech, but the 4700G has a small niche as the best CPU for memory frequency records.
Shivansps - Thursday, December 17, 2020 - link
I just logged in to say that's the wrong way to weigh IGP perf... Resolutions for an IGP are 720P/768P min/med and 900P min/med depending on the game; 1080p full is too much and anything below 720p is worthless. 900P tends to be the sweet spot in performance and quality for an IGP, but not every game can be run at that resolution.
Jacek Jagosz - Thursday, December 17, 2020 - link
Are those CPUs even supported on B450? ITX boards are much more expensive on B550 than B450, and if you want a PC with only an APU inside then you don't want to spend too much on it.
MDD1963 - Friday, December 18, 2020 - link
So, in a nutshell, this is still just a better CPU, but still crippled with just over (barely) GT 1030-level integrated graphics...
Assimilator87 - Friday, December 18, 2020 - link
To everyone complaining about the benchmark resolutions/settings: Just double the results of the 1080p benchmarks and that's the ballpark 720p performance. I'm sure 1080p max was used in order to make sure there was a complete GPU bottleneck. That's the only way to compare the GPUs in relation to each other. Once you have that scale, you can extrapolate to other resolutions.
Ian, what happened to the Subor Z+ review? That would be such an incredibly interesting comparison point.
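For what it's worth, the "just double it" rule of thumb roughly tracks the pixel counts. This is a sketch of the arithmetic only, since real scaling also depends on CPU limits and per-frame costs that don't shrink with resolution:

```python
# Pixel-count ratio between 1080p and 720p.
ratio = (1920 * 1080) / (1280 * 720)
print(ratio)  # 2.25, so ~2x is a reasonable ballpark when fully GPU-bound
```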
McFly323 - Friday, December 18, 2020 - link
The best APU in the world is the PS5's AMD APU. But AMD will never release that for PC buyers, because that would murder the PC components market.
Oxford Guy - Friday, December 18, 2020 - link
Since these are OEM-only, I wouldn't expect to see them married to high-performance RAM.
Many are looking at this lineup from the point of view of the build-a-gaming-PC-myself enthusiast sector, but one can also look at it from the point of view of "How much does slow OEM RAM hobble these APUs?" Since OEMs often tout the performance of products that don't perform as well as they could or should (a thing helped along by companies that sell stealth watered-down versions of their products, sometimes with the same name attached), it's useful to have the information out there about how they will perform with baseline RAM.
However, given that 3200 has been cheap for a long time (I got 16 GB for $90 in 2016, as I recall), it would be good to always have the tests show both the slow RAM and something affordable like 3200 that offers quite a bit more performance.
One problem that a company like AMD faces in making CPUs like this is the possibility of them being used with slow RAM. The way around that is to engineer the CPUs to fail to run with slow RAM.
Oxford Guy - Friday, December 18, 2020 - link
"The way around that is to engineer the CPUs to fail to run with slow RAM."So, not doing that means the company is satisfied with the parts being hobbled by slow RAM, not just the OEM.
vol.2 - Saturday, December 19, 2020 - link
If they make iGPU performance "deliver," it will eat into the sales of dGPUs.
Valantar - Sunday, December 20, 2020 - link
It's great to see these reviewed! I bought a 4650G off a German eBay store a couple of months back, and I couldn't be happier with it for my HTPC. Sips power (I've never seen it exceed 110W at the wall) and performs admirably. With my Crucial Sport LT 3200C16 running happily at 3800C16 (1:1:1) (with near zero effort thanks to 1usmus' DRAM calc) and the iGPU at 2100, it delivers 60-75fps in Rocket League at the 1080p Quality preset, which is perfectly enjoyable. I understand AT's choice of running JEDEC max-spec DRAM, but for these chips in particular I think DRAM OC testing would be a good idea.
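As a side note on what "3800C16 (1:1:1)" works out to, here is the clock arithmetic; this is a sketch based on how DDR4 transfer rates and the Zen fabric clocks are commonly described, not on anything measured in this review:

```python
# DDR4 is double data rate, so the actual memory clock is half the transfer rate.
# A 1:1:1 setup means the Infinity Fabric (FCLK) and memory controller (UCLK)
# run at that same memory clock rather than at a divided ratio.
ddr4_transfer_rate = 3800          # MT/s, from the comment above
memclk = ddr4_transfer_rate / 2
print(f"MEMCLK = UCLK = FCLK = {memclk:.0f} MHz")  # 1900 MHz
```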
artifex - Monday, December 21, 2020 - link
I feel let down by AMD that they won't officially put their better APUs out in the retail chain, when most AM4 boards out there have video connectors and the associated hardware ready to support them. It's like a promise that can't be fulfilled.
tkSteveFOX - Monday, December 21, 2020 - link
The Vega architecture and the lack of DDR4X high-speed RAM make AMD's APUs just not worth it when you can get a 2600X and pair it with an RX 5500, a GTX 1650 or even an older 1050 Ti and get 30-60% more gaming performance.
With RDNA integrated, AMD could have blown away any Intel iGPU and lower-end Nvidia solutions.
These 4th-gen AMD desktop APUs are simply not worth it.
Brane2 - Wednesday, December 23, 2020 - link
Isn't that a bit late now, with the 5xxx series about to come out?
peevee - Wednesday, December 23, 2020 - link
The 5xxx parts are not APUs.
chaosys - Friday, December 25, 2020 - link
Please, Dr. Cutress, it is "customer" or "end user", "home user" or something more appropriate. I don't consume CPUs or GPUs. I don't eat or drink them, and afterwards they are gone.
Scubasausage - Monday, April 5, 2021 - link
Love your work as always, etc. But why not Civ turn time!? It's the only game I ever play on my laptop, and fps doesn't seem to mean much. I find the fps is highest whilst I'm waiting for a turn to process, which would tell me that the slower the CPU is at turn time, the higher the average fps. Besides, no one's pushing for higher fps in Civ; it's almost a virtual board game!
prateekprakash - Friday, July 30, 2021 - link
Greetings. I got hold of a 4350G, and I see it can't play YouTube HDR content at 720p and above without dropping frames. The iGPU in Task Manager shows 90%+ usage during playback; the video codec shows av01 and vp09. 1080p HDR YouTube playback is a stutterfest, and 8K HDR YouTube content moves like a slideshow, because the CPU gets loaded instead of the GPU.
Am I missing something, or does this APU not have the necessary codec decoding ability to play these YouTube videos?
A detailed HTPC investigation for video playback with these APUs would be much appreciated!