Surprisingly decent performance.
yep.
Decent? The TDP is twice that of the Intel offerings and it's still slower as a CPU. Mobile gaming might be decent, but we still don't know how the system scales down (it could lose performance faster than the TDP drops).
I vaguely remember reading how Intel TDP /= AMD TDP. The crux of the article claimed that Intel TDP represented average consumption, while AMD TDP represented the max theoretical consumption. Anyone familiar with the topic?
Well, looking at how GT3(e) is set up in Haswell, it seems likely that Intel's TDP figures are going to be close to maximum usage as well in any graphical application. The graphics part will push even CPU clocks down if TDP limits are reached there, so...
You are wrong marvee. This is an old topic on many forums and the question was settled a long time ago. AMD TDP is equal to Intel TDP because "TDP" means a precise thing for OEMs. Go to a simple Wiki page and get a clue.
TDP is Thermal Design Power. In other words it's not how much wattage it uses, it's a guideline that AMD and Intel publish for the system builders.
"it's not how much wattage it uses"
Unless the laws of thermodynamics have changed somehow, I'd expect TDP to put an upper bound on a CPU's power consumption. So, yes, it is about how much wattage it uses in the worst conceivable steady state. (Or isn't it?)
The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of heat generated by the CPU which the cooling system in a computer is required to dissipate in typical operation: http://en.wikipedia.org/wiki/Thermal_design_power So, no, it's not.
If you can't say anything nice...
TDP doesn't equal power consumption. It equals how much heat a cooling unit must dissipate for safe thermal levels of operation. While there is some correlation, as a higher TDP generally means higher power consumption, it's not a direct one.
I think it actually pretty much does equal top power draw, since energy in pretty much equals heat out. But do correct me if I don't understand the physics correctly. To me it simply seems like there is no mechanical work being done, so all the input power ends up as heat.
TDP means the maximum power that needs to be dissipated, but most CPUs/APUs are not going to be pushing max TDP all the time. My experience is that in CPU loads, Intel tends to be close to max TDP while AMD APUs often come in a bit lower, as the GPU has a lot of latent performance/power not being used. However, with AMD apparently focusing more on hitting higher Turbo Core clocks, that may no longer be the case -- at least on the 19W parts. Overall, for most users there won't be a sizable difference between a 15W Intel ULV and a 19W AMD ULV APU, particularly when we're discussing battery life. Neither part is likely to be anywhere near max TDP when unplugged (unless you're specifically trying to drain the battery as fast as possible -- or just running a 3D game I suppose).
Yes, which is why I said it equals top power draw ;-)
Yeah, my response was to this thread in general, not you specifically. :-)
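Since the thread keeps circling this point, here is a minimal toy model of what a TDP typically constrains: sustained power, not instantaneous power. All numbers are invented for illustration and this is not any vendor's actual boost algorithm, but it shows how a chip can burst above its rating in short benchmarks while long loads settle at the limit.

```python
# Toy model: TDP caps *sustained* power, not instantaneous power.
# A boost controller lets the chip burst above TDP until a rolling
# average of recent power reaches the limit. All numbers invented.

TDP = 19.0   # watts, sustained limit
BURST = 30.0 # watts, short-term boost power
BASE = 12.0  # watts, power at base clocks
TAU = 8.0    # seconds, averaging window

avg = BASE
for t in range(30):                        # one sample per second
    power = BURST if avg < TDP else BASE   # boost only while budget remains
    avg += (power - avg) / TAU             # exponentially weighted average
    print(f"t={t:2d}s  draw={power:4.1f}W  rolling avg={avg:5.2f}W")
```

Run it and the first several seconds sit at 30W while the rolling average climbs toward 19W; after that the controller oscillates around the TDP, which matches the observation above that most parts rarely pin max TDP for long.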
It's amazing that we live in a world where information is accessible on a whim to most people living in the western world, yet even on a website that caters to more educated people (or so I would think), people have problems understanding even the simplest concepts that enable them to expose themselves to this medium.
Energy is never lost, it's just used up in different ways. Essentially, if we had access to a superconductive material to replace the wires and a really efficient transistor, we would have a SoC whose TDP is zero watts. A chip does not move a thing; there is no mechanical energy involved, so all of the energy is wasted as heat. Why? Electricity, at its core, is a flow of charged particles through a medium. If this medium is copper and the particles are electrons, the only thing standing in the way of the electrons flowing is the copper atoms. The electrons will occasionally bump into the atoms, exchanging kinetic energy and making the atom in question move. As the atoms move faster (i.e. their kinetic energy increases), collisions become more likely to occur, and so they do. In other words, the conductor's resistance increases. What scale do we use to measure the movement of atoms? Temperature! Heat is literally the kinetic energy of the atoms in a piece of material. Thereby all of the energy that is used to power electronics just goes to waste as heat. Kind of.
Unless you live in a cold climate, then you get to use part of the energy as heat! :-)
I'm not sure who you are responding to. Nobody said energy is lost. The discussion was first about AMD TDP not being the same as Intel TDP, and then switched over to a discussion of TDP not actually meaning power draw, which by itself is true, but there obviously is a correlation, which several posters (yourself included, with a more physical explanation) talked about.
Compare performance per watt in gaming and Intel stops looking impressive. If you're buying a notebook with the FX chip then that should be what you care about.
The comparison is for CPUs in the same price range, not CPUs in the same TDP range, obviously.
So the performance is decent for the price, as gdansk correctly pointed out. It is not decent for the TDP, at least not compared to Intel's chips, which is what you are focusing on, and is not the metric that most people use when comparing processors.
"The comparison is for CPUs in the same price range, not CPUs in the same TDP range, obviously."
For a mobile platform? I couldn't disagree more. Power consumption directly affects battery life, and you either need to spend more on the battery (negating cost savings) or just live with less runtime.
You can buy a LOT of extra battery capacity for the premium Intel charges for its ULV CPUs (a whole laptop battery replacement can be had for as little as 50 euros).
Furthermore, most of the time, and most of the power, will be spent at idle, which is completely separate from the TDP. And again, TDP is not power draw; the two don't even have to be related. A TDP is quite often set for a whole RANGE of CPUs so OEMs can design one laptop to a single TDP and put a whole host of CPUs in it.
So without some actual power draw tests doing various tasks, this speculation is useless.
It's even a bigger joke when they test Intel chips with the worst possible iGPU (HD4400). If you have to compare 15W chips against a 35W chip, at least take the best of the bunch, i.e. one with an HD5000. Apple offers the MacBook Air for an incredibly cheap price while having high build quality. Tough for AMD to get into designs like that. When you have a processor that cuts corners, then your entire product has to cut corners, IMHO. I mean, I can pay 600€ for a cut-corner AMD notebook or I can spend 300€ more and be a happy camper.
Also, dedicating so much text to why SSDs are important doesn't really bode well for the product reviewed here...
Well, Tom's Hardware takes a 35W Intel chip to compare their Kaveri test system against, and the Kaveri still absolutely smokes the Intel competitor in gaming. What Jarred is getting at in his reviews, I think, is that it's no use for AMD to put out 19/17W chips if OEMs aren't going to bother making anything worth a damn with them inside.
The comment about OEMs hamstringing AMD laptops is quite true; so many AMD laptops come out with single-channel RAM and slow mechanical HDDs, but people attribute the low performance to the APU being slow, when for most purposes any new mainstream chip from Intel or AMD is more than enough.
It doesn't "absolutely smoke" the Intel competition:
1.) DOTA2 is actually faster on the Intel chip
2.) Frametime variance is better on the Intel chip throughout (although admittedly a moot point, since the FPS are abysmal anyway)
3.) It's an HD4600, so still not one of the better Intel iGPUs
If you take an Intel CPU with the HD5000 (that's the one with the expensive L4 cache, right?), doesn't that make it really, really expensive, so totally outside the market where AMD is putting these chips?
System 1: Kaveri, $500
System 2: Intel HD5000, $700
Consumers compare on price, not on performance.
No. The HD5000 is actually what's inside the MacBook Air, for example, so a 15W ULV part. For 35W you have the Iris 5100 (which is still without the "expensive" eDRAM). Only some of the 45W chips have the Iris Pro 5200.
I only have Apple as a reference, but their notebooks are of high quality, don't cut corners and are actually affordable, with the MacBook Air starting at 899€ and the 13" rMBP at 1299€. I would argue that an AMD-equipped notebook which doesn't cut corners is not going to be much cheaper. In the end the price difference will be a question of whether you want the more powerful Intel chip or not.
Also, I think AMD's claims of OpenCL performance are blown out of proportion. Their advertisement slides at the Kaveri launch actually had faked stats for Intel's iGPU OpenCL performance. I did my own tests and got much higher numbers. You can also check the PCMark and 3DMark websites to see that Intel was and is much better than what AMD wants you to believe. That is not to say that they are not better; I just think it needs to be put into better perspective if you really want to make the trade-off (better GPU for OpenCL, but worse CPU).
Your price comparison is off, though. Apple gets the highest end GPU chips from Intel, and then charges LESS than competitors. That Sony i5 was MORE expensive than the MBA or even the Mac Pro. Apple doesn't make cheap hardware, but they haven't had overpriced hardware in years (or no more than competitors).
But I agree in principle, it just isn't going to happen for AMD. The HP sleekbook was easily the best looking 'ultrabook' and it was only briefly available (not that it was good, but lots of poor laptops do better). AMD is used by the OEMs only to keep Intel honest - and that's why they launched on desktop first. Intel doesn't really care about desktop, so AMD wisely chose that first, where they can get some enthusiasts as well as a few modest OEM wins.
I have an AMD chip and it was a good budget choice for a basic PC. But the OEMs are closer to adopting ARM en masse (HP has joined Samsung with ARM chromebooks now) than AMD. I'd like to think HSA might turn it around, but I think at this point Intel's guns are bigger than AMD's and they have more of them. Perhaps Lenovo will keep going vertical and scoop them up to move to China. They already have red logos, after all.
1) That Dota2 result is so wide it seems like a result of driver non-optimization rather than the Intel GPU actually being better. 2) Like you said, it's a moot point. 3) The best Intel GPUs are found in chips that are way, way out of AMD's price range. A true apples-to-apples comparison is between affordable midrange chips and affordable midrange chips, and that's what Tom's Hardware did.
I agree on your first two points but beg to differ on the latter. Not everybody is shopping with a set amount of money; most usually look at what 100€ more or less will get them, or what they can get in a given thermal envelope. I would argue, for example, that the HD5000 will turn out faster in the 19/15W TDP envelope next to AMD's chips.
Also, I think most people are forgetting that AMD won't be competing with Haswell but with Broadwell, which is close to release.
Now, this is not to say it's a bad chip; I just don't get the hype that's being made. I think the trade-off of a worse CPU for a better GPU is not worth it for most customers. hUMA and HSA still have to show how powerful they could be (and I have my doubts they will ever get widespread attention), and people easily forget that Intel also supports OpenCL, so any software optimized for it will also run faster on Intel hardware. In the end AMD chips - in my eyes - still remain the budget choice (if at all), and this is why you probably won't see many notebooks that don't cut corners.
icrap and affordable? what a joke! lol. Don't forget the outdated low-res screen, no pen input, no touch screen, and still more expensive. Surface Pro is much better and much more affordable.
Exactly how are the 13" and 15" rMBP screens outdated and low-res?
Oh, and: not everybody shopping for a notebook needs a touch display or pen input. I'll even give you a reason why the 12" Surface 3 will not replace either notebooks or tablets: 750g.
The 1st gen iPad 3G weighed 730g and everyone said it was as light as a feather. Oh, and the MacBook still looks and weighs exactly the same since then.
Your point? They are light now. I owned the 1st gen iPad and it was way too heavy to comfortably hold in my hands to read. I have since switched to the iPad mini for its lightness and small form factor, something the Surface can't provide me with. So I still carry my notebook along to write stuff and use the iPad for media consumption. The Surface simply doesn't work for me, but your mileage may vary. Just don't make it sound like it's the holy grail that truly combines both tablets and notebooks, because it isn't. At least not for all.
Well, for one it's not an IPS screen, and two, it's a low-res panel.
The iPad mini with Retina? You need to check your facts.
The 35W AMD APU is giving you about 75% of the mobile gaming performance that you get with a 15W Intel CPU + 50W nVidia GPU; but over 200% of the 15W Intel APU performance. That's a very decent result in my opinion. Sure it's slower as a pure CPU, but I don't think it will hit its 35W TDP limit when the GPU part is dormant, either.
Another interesting comparison might have been to see how the FX-7600P fares against something like an i7-4558U, i.e. an Intel APU with a ~30W TDP and an Iris GPU.
While I agree with you perfectly on what you said, I want to bring up another point. The GT 750M is a Kepler part with 384 stream processors (2 SMX). We now have Maxwell on the market, and Maxwell is very good. With Maxwell you get 640 stream processors (5 SMM; 1 SMM is roughly 90% of an SMX) in roughly the same TDP as the GT 750M. Thus Nvidia+Intel is able to get a large increase in the number of calculation units in roughly the same form factor, as the quick math below shows.
So in sum, AMD is able to compete with Intel and Intel+Nvidia only on price and/or time to market. Intel and Intel+Nvidia can meet AMD on graphics efficiency and form factors, and Intel is faster in CPU tasks. Sadly, I wish AMD was doing better than they are; not because AMD is bad, but because I want more competition, and more competition is always good for the end user. The problem is that AMD may bring their B+ game often, but when you have Intel and Nvidia as competitors you need to bring your A+ game to win, for your competitors are just as talented as you are and they have far more design resources due to greater revenue and cash on hand, which means more money spent on engineers.
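For the record, here is the back-of-envelope arithmetic behind that Kepler-to-Maxwell comparison, using only the figures quoted in the comment above (unit counts and the ~90% per-unit estimate; actual throughput also depends on clocks, which this ignores):

```python
# GT 750M (Kepler): 2 SMX units. Maxwell part per the comment: 5 SMM units,
# each SMM worth ~90% of an SMX at a similar TDP. Clocks ignored.
kepler_units = 2
maxwell_units = 5
smm_vs_smx = 0.90

kepler_equiv = kepler_units * 1.0          # 2.0 SMX-equivalents
maxwell_equiv = maxwell_units * smm_vs_smx # 4.5 SMX-equivalents
print(f"Maxwell vs GT 750M shader throughput: ~{maxwell_equiv / kepler_equiv:.2f}x")
# -> ~2.25x the units in roughly the same power envelope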
Intel cheats with Turbo-Boost, though, being able to win 5-minute benchmarks because of Turbo-Boost, while also claiming long battery life because Turbo-Boost barely gets any use, so the chip is actually running at lower performance than advertised in the long term.
And AMD doesn't turbo? o.O
It turbos, but not as well as Intel.
And because Intel is able to implement a better turbo, they are somehow cheating? I mean, it is still an advantage for the consumer in the end. Race to idle and all.
Actually, Turbo sometimes helps with overall energy efficiency. Turbo enables the CPU to be pushed to the limit and finish a demanding task quickly so it can return to a lower power state afterwards. So you are temporarily increasing the instantaneous power to get better overall energy efficiency.
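A quick race-to-idle sketch makes that concrete. All of the power and performance numbers here are invented for illustration; the point is only that finishing a fixed task faster at higher power can cost less total energy once idle power is counted:

```python
# Race-to-idle: compare total energy over a fixed accounting window.
task_work = 100.0  # arbitrary work units
window = 20.0      # seconds we account for

def energy(active_power, perf, idle_power):
    t_active = task_work / perf        # seconds spent working
    t_idle = window - t_active         # seconds spent idling afterwards
    return active_power * t_active + idle_power * t_idle

# Slow and steady: 10W for the full 20s of work, no idle time.
print(energy(active_power=10.0, perf=5.0, idle_power=1.0))   # 200 J
# Turbo: 18W but twice the speed, then drop to 1W idle for the rest.
print(energy(active_power=18.0, perf=10.0, idle_power=1.0))  # 190 J
```

With these made-up numbers the turbo run draws 80% more power while active yet uses less energy overall, which is exactly the "finish quickly, then sleep" argument.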
The TDP of AMD's offering includes the much more powerful GPU (compared to Intel's HD4400 in the i7). For the benchmark results, the Intel was paired with a discrete nVidia card (750M), which also guzzles power NOT included in the 15W TDP of the i7.
I'm not familiar with what other parts are on-/off-die for these CPUs, but it's not fair to compare the power envelopes on a spec sheet like you're doing. The fair test would be to measure full-system power consumption of two comparable devices, which is sadly not possible at this time, as no Kaveri laptops have shipped yet.
Remember, power consumption increases exponentially with frequency, and this phenomenon is even more accentuated with Kaveri because of the process used. So what I'm saying is that the 19W APUs likely wouldn't be that much slower than the 35W one.
Maybe it's also true of frequency, but I believe you're looking for voltage. Power consumption at two different frequencies at the same voltage is much different than at two voltages at the same frequency. I could be wrong, though.
Buzzzzz, wrong. Power = C*f*(V^2), power is linear with respect to frequency and quadratic with respect to voltage. (C is effective capacitance). Google "Power CMOS Circuits".
Actually, the relation between power and frequency is linear, and the relation between power and voltage is quadratic: P = C*f*V^2. A higher TDP also allows the CPU to stay in a higher performance state for a larger amount of time. You can compare the performance between a 15 watt TDP and a 28 watt TDP part at the following link (the 13" rMBP has a 28W TDP chip and the Air has 15 or 17W): http://www.notebookcheck.net/Review-Apple-MacBook-...
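A short worked example of that formula helps reconcile the two views above: dynamic power is linear in frequency at fixed voltage, but higher clocks usually require higher voltage too, which is where the "worse than linear" intuition comes from. The capacitance value here is made up purely for illustration, not a real chip's:

```python
# P = C * f * V^2: linear in frequency, quadratic in voltage.
def dynamic_power(c_farads, freq_hz, volts):
    return c_farads * freq_hz * volts ** 2

C = 1e-9  # 1 nF effective switched capacitance (invented)
base = dynamic_power(C, 2.0e9, 1.0)          # 2 GHz at 1.00 V -> 2.0 W
print(dynamic_power(C, 3.0e9, 1.0) / base)   # +50% clock, same V: 1.5x power
print(dynamic_power(C, 3.0e9, 1.15) / base)  # +50% clock and +15% V: ~1.98x
```

Same 50% clock bump, but add the voltage increase that bump typically needs and the power nearly doubles.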
Close compared to the i7 ULV and GT 750. Often better than the i5. It is good against what will likely cost at least $100 to $200 more. Battery life that is 80% and weight that is 110% of those Intel machines might be a reasonable trade off saving $200.
On other sites (with slides) they're comparing the 19 watt part against the 15 watt part, not the 35 watt part.
Who cares that it's slower as a CPU? Besides the point that it's not even noticeably slower, have you forgotten that it's not a CPU at all, but rather an APU, and as an APU its performance is something to be praised, not denigrated? Intel fans have honestly become the worst fanboys out there now. Can't give credit where it's due and can only make negative comments, even in the light of positive results. Facepalm.
Dude, the mobile AMD Kaveri has a 35W TDP and is quad core, and the Intel i5 is dual core with a 35W or 47W TDP. What are you talking about?
Jarred.....Anand.....you are comparing a hot 35W SKU with a constrained 15W ultrabook SKU. This time there is no justification; the AnandTech lab is full of notebooks with a 37W Intel CPU on board, and I'm pretty sure there are notebooks with 28W U parts with a less constrained HD 5000 or Iris 5100 GPU. And why an i7 4500 (GT2) and not an i7 4550 (GT3)? This is a pretty biased article, nearly useless for customers.
Comparing $300 CPUs against $150 APUs is potentially just as bad. And you'd be surprised how many laptops we don't actually have; most come with high-end configurations, which really muddies the waters. I do have a 37W i7-4702MQ available where I can disable the dGPU and see how performance compares, but that's a $370 CPU and again just not anywhere near the price of the AMD offerings. (Base price on the laptop is $1500+.)
You're looking at tray cost; overall laptop costs will be comparable, as Intel has many more components on die. Maybe $150 off. I would have liked to see a comparable-wattage Intel CPU in there as well.
Yes, agreed with you. The perfect review would be with an i7 4550 or i5 4250, both GT3, with TDP-up enabled up to 21/22W. We have forgotten that Kaveri is 19W, but we need another 2/3W for the southbridge, which is on-package on Haswell. Looking at other reviews, under these conditions Intel and AMD are on average on par in GPU performance (depending on the game title), and Intel is at a huge advantage in CPU performance. The 19W (21/22W) TDP figure seems like an attempt to throw sand in customers' eyes. Let's wait for a review with real notebooks, because the TDP-up feature is widely utilized by Intel OEMs to get more performance from mobile Haswell.
Yeah, I tried to check it and found that the prototype was basically not tuned at all for battery life -- probably would have been under four hours for Internet surfing, which is way out of line with what we expect. Basically, idle power draw was around 16-17W, almost double what it should be, so the firmware and hardware wasn't tuned to go into low power states as far as I could tell. When we get shipping laptops, I suspect we'll see battery life competitive with Intel solutions, maybe even better. I figure 5-6W idle power draw or less isn't too far off these days.
Haswell is obviously the target, and AMD is claiming 11 hours vs. 9.2 hours for Kaveri vs. Richland (doing a low intensity eReader/Browser test), so that's at least a decent bump. But I've seen Trinity laptops with lousy battery life due to the OEM not spending the needed effort in that area, so really it's as much the laptop as it is the CPU/APU.
"maybe even better" i have some doubts, AMD has a process disasvantage that is a damnation in those uses where the cpu works at very low voltage or stay idle for a little. Web surfing is an example. Mullins has showed that AMD main weakness is there, being behind Qualcomm (TSMC) and Intel (in house) in idle power.
Comparing these chips to the Haswell mobile i7 quads would only show that AMD has stopped competing on CPU performance. The Cinebench scores tell that story: my old workhorse i5-520M from 4 years ago scores higher in both single- and multi-threaded than these chips. I think the graphics scores also make it apparent that memory bandwidth is a big issue for GCN, and even 2133 DDR3 doesn't cut it: 4x the GPU cores of Kabini, with only 2x the output. What AMD desperately needs is an OEM that will put their chips in a design that doesn't look and perform like a Black Friday doorbuster. Sadly, with Intel's ultrabooks all calling for 17W cooling systems (and Broadwell will be less!), there just aren't that many compelling (read: Apple-like fashionable) 35W designs for AMD to hope they get recycled into.
I think the next step AMD should make in APU progression is the addition of DRAM on die. It's more efficient than adding more GPU cores for performance and you have hUMA so the CPU cores could take advantage of it too.
What you are asking for is an Xbox One APU. DRAM on die is not a trivial choice. The amount of DRAM you'd want to add is not insignificant and thus the amount of die area consumed would be very significant. To use the inevitably limited amounts of on-die DRAM introduces tremendous complexities in software, as coders have to special case the use of the high-bandwidth on-die DRAM and manage its use very carefully. You can do that with consoles, not with PCs. And the XBox One APU is at different cost/power/etc. points than Kaveri.
There are other solution paths, but that isn't suitable for a comment to a comment in a review article.
It's not unlocked, but what would the point be? Let's just say my experience with overclocking laptops is that there's usually a reason 99% of laptops don't allow it. Huge notebooks with much beefier coolers can try, but even then we often get only marginal bumps in performance.
Underclocking the CPU and overclocking the GPU: that would be helpful. I've actually done this in a laptop that shares a cooler between a discrete GPU and the CPU, but I imagine it would be useful in an APU laptop too.
E.g., note that the CPU is not underclocked in this test, but I do normally underclock it to reduce fan noise: http://www.3dmark.com/3dm11/6039951
I looked straight for the single-threaded performance, and I see the 15W 4500U gets a >50% higher score than the 35W 7600P. Yes, graphics is good at that high TDP, but without competitive CPU performance I think it's hard to justify these SKUs for the target market. AMD really needs to stop this disastrous adoption of Bulldozer-based CPUs for good.
Three thoughts: 1) I'm not defending AMD's single-core performance, but the single-threaded Cinebench test shows a 33% lead for the Intel Core i5, which is not "greater than 50%". That said, I agree that AMD needs to close the gap on single-threaded CPU performance (and CPU performance-per-watt). This is as close as they've been in a long time, but the gap is still large. Their grand strategy has supposedly been to leverage the interaction of CPU+GPU to provide better performance, but when you are the minority player in the chip market, that's a tough hill to climb. How can you get software makers to write code for your APUs when they are only a fraction of the market *and* mostly found in the lowest-end consumer laptops?
2) Sure, multi-threaded CPU performance looks competitive here since AMD has thrown in more "cores" (purposeful use of quotation marks) for the same price as Intel, but that strategy can only go so far. At any time, Intel can choose to lower the price on its chips to match AMDs performance-per-dollar because Intel's chips use less silicon to produce the same CPU performance.
3) Anand is right that AMD needs to get these chips into nice laptops -- with lower-than-Intel prices -- and emphasize the gaming performance advantage. And those laptops had better have good battery life under normal usage. The presumed niche would be for an AMD laptop with reasonable gaming chops at several hundred dollars less than an Intel laptop with a discrete GPU.
That said, 35 watts is too high for the ultrabook form factor. Will AMDs graphics advantage remain this large when you move down to a 15 Watt APU? If the answer is no, then Anand's ideal doesn't really work. All you would get is a $50 cheaper AMD laptop with slower CPU performance and similar GPU performance!
[Note: I'm really rooting for AMD, but I'm going to remain cautious in my enthusiasm for these new chips until I see them in actual laptops. As they say, the proof is in the pudding.]
> I'm not defending AMD's single core performance, but the single-threaded cinebench test shows a 33% lead for the Intel Core-i5, which is not "greater than 50%".
Yes, but the Core i5 I specifically referred to shows only a 33% difference; I'm guessing you were looking at the i7. So we are both wrong and both right, and in agreement about the need for better single-threaded performance.
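One general caution when reading the percentage claims traded in this thread (here the two posters were simply looking at different chips, but the same trap comes up often): the size of a "lead" depends on which chip you pick as the baseline. A single 1.5x score gap is simultaneously a 50% lead and a 33% deficit:

```python
# Illustrative scores only: if Intel scores 1.5x AMD's result, both of
# the following statements are true at once.
intel, amd = 1.50, 1.00
print(f"Intel lead over AMD:  {(intel - amd) / amd:.0%}")    # 50%
print(f"AMD deficit vs Intel: {(intel - amd) / intel:.0%}")  # 33%
```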
Nice chip. That said, without battery life results this review does fall flat; I understand that they couldn't be done, though.
Other than that, you've touched on the main issue with AMD based laptops: nobody wants to make a good one.
Give me something like my old Sony Vaio Z12 with that FX-7600P in it, at a decent price, and you have a compelling product. I don't want ridiculous resolutions. I want one that will work well for games and desktop use at a nice size. The Z12 @ 13.1" 1600×900 offers that absolutely perfectly in my opinion.
"From what I could gather, AMD used "V" transistors in Trinity/Richland but has switched to "T" transistors for Kaveri, which explains the drop in maximum clock speed."
Perhaps, but the most important reason was the regression from PD-SOI to bulk.
Interesting claim there about the transistor "shapes." I've not heard of T and V shaped transistors... it appears to be the shape of the gate.
The GPU isn't good enough to game with and the CPU isn't good enough to do anything else with. You'll get half the battery life of the competition and it will get twice as hot. Abject failure.
Yes, you hit on the crux of the matter. The problem is still that, for all APUs, the iGPU performance is still in limbo: almost there but not quite for gaming, while CPU performance and power consumption trail Intel badly. Looking at the gaming tests, it looks like one will be limited to 768p, and even then COH (admittedly terribly optimized) is not playable, and we have no tests for demanding games like Watch Dogs, Metro: Last Light, or Crysis 3. So one will be stuck with either a crappy 768p screen or playing at non-native resolution for a lot of games. Not to mention, this mobile chip should have come out first instead of being so close to the Broadwell mobile launch.
You say "cpu performance and power consumption trail intel badly". There's no real data on the latter apart from TDP values (which, in themselves, aren't an accurate indicator), and the FX-7600P is certainly no slouch when viewing the CPU benchmarks in this preview. Let's not forget that large portion of the die dedicated to the GPU, either - just because it's there, doesn't mean it's being used. Toms had the FX-7600P doing well at GPGPU (it should) and PCMark, less well at physics scores in 3DMark (the competition was a high-end i7) and even worse in Sandra, but we know that doesn't always mean everything; after all, the Core 2 had low memory bandwidth due to having no IMC but was still easily the match of the Phenom II series.
This is a preview, so quite why people are expecting numerous tests on 2013/14 titles from a very brief testing period is beyond me. Patience, grasshopper!
But I suspect you can hit 1080p with an awful lot of 2007-2012 games, making mobile Kaveri + 1080p a great Humble Bundle mobile gaming solution. Now, if they combined it with FreeSync, you could get away with lower frame rates and play even more recent games.
The use case for a laptop is simple: you don't require graphics horsepower the majority of the time. Hence it makes more sense to use a CPU that uses less power (and performs better), and pair it with a discrete GPU that won't be drawing any power most of the time. Intel understands this and it's why they make money.
AMD, on the other hand, has dreamed up a mythical future reality where everyone who buys a laptop expects it to deliver decent GPU performance. But very few people buy laptops to play games on; they buy laptops to be productive while mobile, and faster CPUs that use less battery power are always going to win out.
It's becoming more of a glaring omission as time progresses that AMD doesn't have an SSD caching option in their SATA BIOS/drivers. AMD is targeting the cheaper end of notebooks, so I agree with the author: a pure SSD storage solution will not make the BOM, and because there is no AMD SSD caching solution, an 8/16/32GB SSD cache can't make the BOM either; a discrete cache is not supported, and hybrid HDDs unfortunately also command a price premium and don't have as good integration with Windows. Makes me wonder what the state of USB ReadyBoost is with Windows these days; things are very quiet on that front. Also, where is a cheap/free third-party utility to provide SSD/NVM caching? Do AMD laptop vendors have any budget/driver solution?
In my experience, SSD caching mostly serves to: 1) Inflate certain benchmark scores (PCMark being a major one). 2) Help laptops boot/resume faster.
Seriously, most other tasks that hit the storage hard -- installing an application, loading a bunch of apps at once, opening a browser with 30 active tabs -- are all only slightly faster than an HDD with SSD caching, whereas a pure SSD is substantially faster. The best SSD caching solution right now comes from Apple, and it's only good because it's 64GB or more; 24/32GB SSD caches just don't cut it in my experience.
The i7-4550U has a tray price of $426. It would be interesting to see that comparison, but what is the point, ultimately? Also, they probably couldn't get their hands on one.
While Kaveri seems like a good chip, it has two problems: its CPU performance is still very poor compared to Intel, and it is coming to mobile WAY too close to Broadwell, which will probably be slightly faster, if Intel's bragging of a 40% graphics improvement is correct. On top of that, AnandTech, please compare Kaveri to the 37 watt i5s; that's the real competition for this chip. Leave the 19 watt parts to duke it out with the 15 watt chips.
You can look at Tom's Hardware; they reviewed the FX-7600P vs. the i7-4702. Except for the CPU performance, everything else is not too shabby; decent enough, but still not up to my expectations. As we know, except for MSI, every single vendor will come out with shitty specs.
Considering that the Kaveri FX is a 35W TDP part while the i7 U + GT 750M combination has a 67W TDP, I'm quite impressed. This could have a lot of promise if OEMs would stop ruining everything.
What I want is an AMD FX-7600P in a 14/15" thin-and-light enclosure with a Full HD screen, 8GB DDR3 @ 2133MHz, and a 128GB/256GB SSD; I'd buy that straight away. Unfortunately, what we will see is a bulkier 15.6" screen at 1366x768, 4GB of DDR3 @ 1333MHz, a 5400RPM hard drive, and a GPU added in CrossFire or on its own which is only slightly more powerful than the APU.
Wonder if they will pair that up with a cheap 250X mobile version for some CrossFire gaming on a laptop. I bet that would beef up the graphics a fair bit. IMO the CPU part is pretty respectable, trading blows with the i7 and i5 for the most part, an area where AMD usually gets trounced.
Well, it does look a lot better than Richland, and price might be a big driver in making these interesting (I see no prices?). But 19W Kaveri versus 15W Haswell ULT: depending on the turbo performance of the 19W Kaveri part, the Haswell ULT looks to run anywhere from roughly as fast to a very sizeable 20-30% lead in CPU performance, and with the Kaveri GPU cut down to roughly half performance between the core and clock reductions, the Haswell ULT is likely to be on par graphically, +/-10%, in a lower-power-consumption package. And if the Kaveri has turbo issues or the GPU is thermally constrained in this package, it could be a much more significant lead.
The 15W Kaveri doesn't look like it would be on the performance map at all (<<50% of the performance of Haswell ULT). The standard-voltage parts actually look the most interesting, especially if the price is moderate. That would probably allow you to make a decent-performing machine (70-90% of Intel's CPU performance) with a very nice iGPU for a rather low price point (assuming the FX-7600P is very price-competitive with the Intel Haswell SV dual-core parts).
I guess I missed something with the Intel chips in regard to the TDP; my experience is vastly different. Most Intel chips, both desktop and mobile, don't seem to come close to hitting max TDP under full CPU load. Especially in the desktop space: I see something like my i5-3570 with a TDP of 77W hitting around a 35W delta between idle and full load based on wall-plug data; considering losses from the power supply and assuming a fairly low (CPU) idle number, that's only in the area of 30-40W total power consumption. That isn't simply under load; the processor is clocked up to 4.2GHz single-core and 4GHz all-four-cores turbo. That isn't a max-burn situation where every single itty bitty capacitor and transistor is loaded like Intel does in their TDP tests, but it is under 100% sustained CPU load across all 4 cores.
From what I have seen with Ivy Bridge testing (which ain't Haswell I'll admit) a fully loaded i5-3317u seems to only use in the area of 8-10w with both cores fully loaded max turbos 100% CPU load. Its the GPU that uses a huge whack of power, >12w under max load and turbos.
I'd assume Haswell is pretty close there, at least with the CPU, since the turbo core speed is the same, same lithography, very similar architecture. So I doubt a ULT Haswell is using more than 10w under max CPU load and turbos. Its the GPU that is going to be the power hog.
No idea what AMD is capable of on these, but we know they are using a less energy efficient process, so I'd assume/guess that their CPU is probably below the cap under max CPU load, but it might be much closer, maybe in the 12-15w range under max turbo, meaning GPU load is going to cause the CPU to scale pretty far back, even on the 19w chips.
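For anyone wanting to repeat the wall-plug estimate described a few comments up, the arithmetic is simple. The efficiency and idle figures below are illustrative guesses, not measurements of any particular system:

```python
# Wall-plug arithmetic: the idle-to-load delta at the wall, corrected for
# PSU efficiency, bounds the extra DC power the CPU is drawing under load.
wall_idle = 45.0       # W at the wall, system idle (invented)
wall_load = 95.0       # W at the wall, all cores loaded (invented)
psu_efficiency = 0.85  # assumed efficiency at these loads

dc_delta = (wall_load - wall_idle) * psu_efficiency
print(f"DC-side load delta: ~{dc_delta:.0f} W")
# Add an assumed ~5 W CPU idle draw for a ballpark total package power:
print(f"Estimated CPU package power: ~{dc_delta + 5:.0f} W")
```

The result only bounds the CPU's share, since other components (VRMs, RAM, fans) also ramp up under load, which is why such estimates tend to land well below the rated TDP.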
It seems like any benchmarks of the Ax-7xxxP parts should be against an ix-4xxxM, not an MQ, like over at Tom's, and the 7xxx parts should be against the ix-4xxxU, at approximately the closest price point. That leaves plenty to choose from in laptops on the market. Hopefully we can get that data so a real comparison can be made. I hope we are just waiting on OEMs to release their Kaveri laptops, in both 19W and 35W forms, to make it worthwhile?
Why are they comparing Kaveri to the 15W ULV i7? Simple, because it's a more fair comparison for multiple reasons:
A) That i7 is a dual core with hyperthreading (4 threads). Kaveri is a dual module with modular multithreading (4 threads). An AMD module is what you compare to a single Intel Core with hyperthreading. That is the fair comparison. It wouldn't be fair to compare a dual core vs a quad core now would it?
B) Although the Intel chip is still more expensive than the Kaveri, it is much more closely priced than the quad core version. Price ranges should always be taken into consideration when comparing offerings of like-capabilities from different manufacturers.
Each Steamroller core has two integer pipes, making four per module, as compared to Haswell's four per core. If only the module were exposed to Windows rather than each integer core, wouldn't there be a chance that single-threaded workloads would be significantly boosted?
Not with the current microarchitecture. The integer pipelines are statically partitioned between the cores on each module (unlike HyperThreading). It's a key aspect of what AMD calls CMT.
Yes, but I was thinking about changes to the front end. I suppose what I'm referring to is a regression but the integer cores are too weak on their own. Essentially, I was imagining what would be required to adopt a more SMT-style system and whether it'd be any sort of improvement.
Too bad the benchmarks were not done with 2133MHz RAM modules. Earlier Richland tests in a Hungarian magazine clearly showed that ALL games are memory-bandwidth limited and FPS values correlate 100% with RAM clocks: a 20% increase in RAM clock = a 20% increase in FPS. I'd suspect exactly the same behavior holds for Kaveri too, especially since it's even more powerful in ALU terms.
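The proportional scaling claimed above is what you would expect if the iGPU is bandwidth-bound, because peak theoretical DDR3 bandwidth is linear in the memory clock. A quick sketch of the standard bandwidth math (dual channel, 64-bit/8-byte bus per channel assumed):

```python
# Peak theoretical DDR3 bandwidth scales linearly with transfer rate,
# which is why a bandwidth-bound iGPU sees FPS track RAM speed so closely.
def ddr3_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    # transfers/s * bytes per transfer per channel * channel count
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

for speed in (1333, 1600, 1866, 2133):
    print(f"DDR3-{speed} dual channel: {ddr3_bandwidth_gbs(speed):5.1f} GB/s")
# DDR3-2133 gives ~34.1 GB/s vs ~25.6 GB/s for DDR3-1600, a 33% bump that
# roughly matches the FPS gains described in the comment above.
```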
The point is that a power supply may be expected to supply more than the TDP for significant portions of a second (more than enough to exhaust the input capacitors of the output switcher). Don't use TDP for electrical specs.
1366x768, and these are avg FPS I'm sure, as there's no mention of minimums, so don't even expect to play at this CRAP resolution without a GPU, as shown. Unfortunately, as these things have gotten faster, game engines have gotten more taxing. They really have gained nothing on discrete, which is why those sales are staying just where they are while PC sales were down 12% last year. You still need a card to do real gaming.
These chips are an amazing step for AMD. They have moved from i3-level CPU performance up to i5 level in some cases. People seem to neglect the potential of dual graphics in these laptops; with a GCN GPU, big gains could be seen. There would be less energy usage for more performance in games. The 19W FX looks like the best chip that they offer; AMD's previous ULV offerings have been dead awful. The OEMs are sure to ruin it though by putting them in inferior hardware.
Just think about this: considering that Kaveri is actually a pretty decent APU, especially for the mobile market, just imagine what Carrizo is going to be like. I honestly believe that Carrizo will be the first APU that will actually be worth considering for an enthusiast gaming laptop/ultrabook. When AMD moves to 20nm, we will finally have HD7870 performance in an iGPU, and that is when APUs will completely take over the market, as that puts out acceptable performance even for hardcore gamers.
Add to that an all-new x86 core built from the ground up by Jim Keller, mixed with a 16nm process by the time it comes out...and hell, we may even get what all AMD fans are dreaming of right now: an 8- or 16-core APU with the iGPU performance equivalent of an HD7970 or perhaps even an R9-290X.
The 19W FX part is nice. These make great chips for Linux laptops, since you get good GPU performance without having to deal with dual graphics and graphics switching, a horrendous affair on Linux (though improving). If they put this in a moderately well-built machine in the $600-800 range...
The tests are kind of meaningless. As others pointed out, AT compares AMD's 35W TDP part with Intel's 15W TDP designs. Moreover, those lower-power Intel CPUs are dual core while the Kaveri A10s are quad core. Bring in Intel's mobile quad-core i7, or at least a 35W Core i5 (dual core) part, and there'd be no comparison: AMD would be far behind in any bench that doesn't use the GPU.
After reading, I was a bit disappointed that the article didn't list the socket type. After doing some digging: those of us currently using the 5750M (Socket FS1r2) will NOT be able to swap in the new FX-7600P (Socket BGA/FP3).
Shame; guess I'll have to wait for MSI to get on the bandwagon so I can order a motherboard/CPU combo that fits in their universal 17" chassis.
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
125 Comments
Back to Article
gdansk - Wednesday, June 4, 2014 - link
Surprisingly decent performance.shing3232 - Wednesday, June 4, 2014 - link
yep.basroil - Wednesday, June 4, 2014 - link
Decent? The TDP is twice that of the Intel offerings and it's still slower as a CPU. Mobile gaming might be decent, but we still don't know how the system scales down (could lose performance faster than TDP)marvee - Wednesday, June 4, 2014 - link
I vaguely remember reading how Intel TDP /= AMD TDP. The crux of the article claimed that Intel TDP represented average consumption, while AMD TDP represented the max theoretical consumption. Anyone familiar with the topic?Vayra - Wednesday, June 4, 2014 - link
Well, looking at how GT3(e) is set up in Haswell, it seems likely that Intel's TDP measurements are going to be close to the maximum usage as well in any graphical application. The graphics parts will push even cpu usage down if TDP limits are reached there, so...Gondalf - Wednesday, June 4, 2014 - link
You are wrong marvee. This is an hold topic of many forums and the problem was solved long time ago. AMD TDP is equal to Intel TDP because "TDP" means a precise thing for OEMs. Go to a simple Wiki page and get a clue.formulav8 - Wednesday, June 4, 2014 - link
TDP is Thermal Design Power. In other words its not how much wattage it uses, its a guideline that AMD and Intel publish for the system builders.gngl - Thursday, June 5, 2014 - link
"its not how much wattage it uses"Unless the laws of thermodynamic have changed somehow, I'd expect TDP to put an upper bound on a CPU's power consumption. So, yes, it is about how much wattage it uses in the worst conceivable steady state. (Or isn't?)
kingpin888 - Thursday, June 5, 2014 - link
The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of heat generated by the CPU, which the cooling system in a computer is required to dissipate in typical operationhttp://en.wikipedia.org/wiki/Thermal_design_power
So , no its not.
silverblue - Wednesday, June 4, 2014 - link
If you can't say anything nice...xenol - Wednesday, June 4, 2014 - link
TDP doesn't equal power consumption. It equals how much heat a cooling unit must dissipate for safe thermal levels of operation. While there is some correlation, as more TDP generally means higher power consumption, it's not a direct one.Galatian - Wednesday, June 4, 2014 - link
I think it actually pretty much does equal top power draw, since energy in pretty much equals heat out. But do correct me if I don't understand the physics correctly. To me it simply seems like no work being done.JarredWalton - Wednesday, June 4, 2014 - link
TDP means the maximum power that needs to be dissipated, but most CPUs/APUs are not going to be pushing max TDP all the time. My experience is that in CPU loads, Intel tends to be close to max TDP while AMD APUs often come in a bit lower, as the GPU has a lot of latent performance/power not being used. However, with AMD apparently focusing more on hitting higher Turbo Core clocks, that may no longer be the case -- at least on the 19W parts. Overall, for most users there won't be a sizable difference between a 15W Intel ULV and a 19W AMD ULV APU, particularly when we're discussing battery life. Neither part is likely to be anywhere near max TDP when unplugged (unless you're specifically trying to drain the battery as fast as possible -- or just running a 3D game I suppose).Galatian - Wednesday, June 4, 2014 - link
Yes, which is why I said it equal top power draw ;-)JarredWalton - Wednesday, June 4, 2014 - link
Yeah, my response was to this thread in general, not you specifically. :-)nevertell - Wednesday, June 4, 2014 - link
It's amazing that we live in a world where information is accessible on a whim to most people living in the western world, yet even on a website that caters to more educated people (or so I would think), people have problems understanding even the simplest concepts that enable them to expose themselves to this medium. Energy is never lost, it's just used up in different ways. Essentially if we had access to a superconductive material to replace lines and a really efficient transistor, we would have a SoC that's TDP is zero watts. Say, a chip does not move a thing, there is no mechanical energy involved, all of the energy is wasted as heat. Why ? Electricity at it's core is flow of charged particles through a medium. If this medium is copper and the particles are electrons, the only thing standing in the way of the electrons flowing are the copper atoms. The electrons will occasionally bump into the atoms, exchanging kinetic energy, making the atom in question move. As the atoms move faster (i.e. their kinetic energy increases), collisions become more likely to occur, and so they do. In other words, the conductors resistance increases. What scale do we use to measure the movement of atoms ? Temperature! Heat is literally the average amount of kinetic energy of every atom of piece of thing has. Thereby all of the energy that is used to power electronics just goes to waste. Kind of.ol1bit - Wednesday, June 4, 2014 - link
Unless you live in a cold climate, then you get to use part of the energy as Heat! :-)Galatian - Thursday, June 5, 2014 - link
I'm not sure who you are responding too. Nobody said energy is lost. The discussion was first about AMD TDP not being the same as Intel TDP and ten switched over to a discussion of TDP not actually meaning power draw, which by itself is true, but there obviously is a correlation which a several posters (yourself included with a more physical explanation) talked about .johnny_boy - Saturday, June 7, 2014 - link
Compare performance per watt in gaming and Intel stops looking impressive. If you're buying a notebook with the FX chip then that should be what you care about.bji - Wednesday, June 4, 2014 - link
The comparison is for CPUs in the same price range, not CPUs in the same TDP range, obviously.So the performance is decent for the price, as gdansk correctly pointed out. It is not decent for the TDP, at least not compared to Intel's chips, which is what you are focusing on, and is not the metric that most people use when comparing processors.
Gigaplex - Wednesday, June 4, 2014 - link
"The comparison is for CPUs in the same price range, not CPUs in the same TDP range, obviously."For a mobile platform? I couldn't disagree more. Power consumption directly affects battery life, and you either need to spend more on the battery (negating cost savings) or just live with less runtime.
The_Countess - Sunday, June 15, 2014 - link
you can buy a LOT of extra battery capacity for the premium intel charges for its ULV CPU's. (a whole laptop battery replacement can be had for as little as 50 euro's.)furthermore most of the time, and most of the power, will be spent on idle, which is completely separate from the TDP. and again TDP is not power draw. the 2 don't even have to be related. a TDP is quit often set for a whole RANGE of CPU's so OEM's can make 1 laptop designed to one TDP and put a whole host of CPU's in it.
so without some actual power draw tests doing various tasks this speculation is useless.
Galatian - Wednesday, June 4, 2014 - link
It's even a bigger joke when they test Intel chips with the worst possible iGPU (HD4400). If you have to use 15W chips vs. a 35W chip in comparison at least take the best of the bunch aka one with a HD5000. Apple offers the MacBook Air for an incredible cheap price, while having a high builds quality. Though for AMD to get into designs like that. When you have a processor that cuts corners, than your entire product has to cut corners IMHO. I mean I can pay 600€ for a cut corner AMD notebook or I can spend 300€ and be a happy camper.Also dedicating so much text about why SSD are important doesn't really bode well for the product reviewed here...
hamoboy - Wednesday, June 4, 2014 - link
Well, Tom's Hardware takes a 35W Intel chip to compare their Kaveri test system to, and the Kaveri still absolutely smokes the Intel competitor in gaming. What Jarred is getting at in his reviews I think, is that it's no use for AMD to put out 19/17W chips if OEMs aren't going to bother making anything worth a damn with them inside.The comment about OEM's hamstringing AMD laptops is quite true, so many AMD laptops come out with single channel RAM and slow mechanical HDDs, but people attribute the low performance to the APU being slow, when for most purposes, any new mainstream chips coming from Intel or AMD are more than enough.
Galatian - Wednesday, June 4, 2014 - link
It doesn't "absolutely smoke" the Intel competition:1.) DOTA2 is actually faster on the Intel chip
2.) Frametime variance is better on the Intel chip through out (although admittedly probably moot point, since abysmal FPS anyway)
3.) It's a HD4600, so still not the better Intel iGPUs
Fergy - Wednesday, June 4, 2014 - link
If you take an Intel cpu with HD5000 (that is the one with the expensive L4 cache right?) doesn't that make it really really expensive so totally outside the market where AMD is putting these chips?System1: kaveri $500
System2: intel HD5000 $700
Consumers compare on price not on performance.
Galatian - Wednesday, June 4, 2014 - link
No. The HD5000 is actually what's Inside the MacBook Air for example, so a 15W ULV part. For 35W you have the Iris 5100 (which is still without the "expensive" eDRAM). Only some of the 45W chips have the Iris Pro 5200.I only have Apple as a reference but their notebooks are of high quality, don't cut corners and are actually affordable. The MacBook Air starting at 899€ and the 13" rMBP at 1299€. I would argue that an AMD equipped notebook which doesn't cut corners is not going to be much cheaper. In the end the price difference will be a question of weather you want the more powerful Intel chip or not.
Also I think AMDs claim of OpenCL performance I blown out of proportion. Their advertisement slides when Kaveri lunch actually had faked stats for Intels iGPU OpenCL performance. I did my own tests and received muh higher numbers. You can also check the PCMark and 3DMark websites to see that Intel was and is much better then what AMD wants you to believe. That is not to say that they are not better, I just think it needs to be put into better perspective if you really want to make the trade off (Better GPU for OpenCL but worse CPU)
nico_mach - Wednesday, June 4, 2014 - link
Your price comparison is off, though. Apple gets the highest end GPU chips from Intel, and then charges LESS than competitors. That Sony i5 was MORE expensive than the MBA or even the Mac Pro. Apple doesn't make cheap hardware, but they haven't had overpriced hardware in years (or no more than competitors).But I agree in principle, it just isn't going to happen for AMD. The HP sleekbook was easily the best looking 'ultrabook' and it was only briefly available (not that it was good, but lots of poor laptops do better). AMD is used by the OEMs only to keep Intel honest - and that's why they launched on desktop first. Intel doesn't really care about desktop, so AMD wisely chose that first, where they can get some enthusiasts as well as a few modest OEM wins.
I have an AMD chip and it was a good budget choice for a basic PC. But the OEMs are closer to adopting ARM en masse (HP has joined Samsung with ARM chromebooks now) than AMD. I'd like to think HSA might turn it around, but I think at this point Intel's guns are bigger than AMD's and they have more of them. Perhaps Lenovo will keep going vertical and scoop them up to move to China. They already have red logos, after all.
hamoboy - Wednesday, June 4, 2014 - link
1) That Dota2 result is so wide it seems like a result of driver non-optimization, than the Intel GPU actually being better.2) Like you said, it's a moot point.
3) The best Intel GPUs are found in chips that are way way out of AMD's price range. A true apples to apples comparison is between affordable midrange chips to affordable midrange chips, and that's what Tom's Hardware did.
Galatian - Thursday, June 5, 2014 - link
I agree on your first two points but beg to differ on the later. Not everybody is shopping with a set amount of money, most usually look what will 100€ more or less get me or what can I get in a given thermal envelope. I would argue for example that the HD5000 will turn out faster in the 19/15 W TDP envelope next to AMDs chips.Also I think most people are forgetting that AMD won't be competing with Haswell but with Broadwell which is close to be released.
Now this is not to say it's a bad chip, I just don't get the hype that's being made. I think the trade off worse CPU for better GPU is not worth it for most customers. HUMA and HSA still has to show how powerful it could be (and I have my doubts if it will ever get widespread attention) and people easily forget that Intel also supports OpenCL, so every software optimized for it will also run faster on Intel hardware.
In the end AMD chips - in my eyes - still remain the budget choice (if at all) and this is why you probably won't see many non cut corners notebooks.
beggerking@yahoo.com - Wednesday, June 4, 2014 - link
icrap and affordable? what a joke! loldon't forget outdated low res screen, no pen input, no touch screen, and still more expensive.
Surface Pro is much better and much more affordable.
Galatian - Wednesday, June 4, 2014 - link
Exactly how is the 13" and 15" rMBP screen outdated and low res?Oh and: not everybody shopping for a notebook needs a touch display nor pen input. I even give you a reason why the 12" Surface 3 will not replace either notebooks or tablets: 750g.
nerd1 - Wednesday, June 4, 2014 - link
1st gen iPad 3G weighed 730gr and everyone said it's as light as a feather.Oh, and macbook still looks and weighs exactly same since then.
Galatian - Wednesday, June 4, 2014 - link
Your point? They are light now. I owned the 1st gen iPad and it was way to heavy to comfortably hold in my hands to read. I since switched to the iPad mini for its lightness and small form factor, nothing the Surface can provide me with. So I still tuck along my notebook to write stuff and use the iPad for media consumption. The surface simply doesn't work for me, but your mileage might vary. Just don't make it sound it's the holy grail that truly combines both tablets and notebooks, because it isn't. At least not for all.Morawka - Wednesday, June 4, 2014 - link
well for one it's not a IPS Screen, and 2, it's a low res panelGalatian - Wednesday, June 4, 2014 - link
The iPad mini with retina? You need to check your facts.ShieTar - Wednesday, June 4, 2014 - link
The 35W AMD APU is giving you about 75% of the mobile gaming performance that you get with a 15W Intel CPU + 50W nVidia GPU; but over 200% of the 15W Intel APU performance. That's a very decent result in my opinion. Sure it's slower as a pure CPU, but I don't think it will hit its 35W TDP limit when the GPU part is dormant, either.Another interesting comparison might have been to see how FX-7600P fairs against something like an i7-4558U, i.e. an Intel APU with a ~30W TDP and an Iris GPU.
Roland00Address - Wednesday, June 4, 2014 - link
While I agree with you perfectly on what you said, I want to bring up another point. The gt750m is a kepler part with 384 stream processors (2 SMX). We now have maxwell on the market and maxwell is very good. With maxwell you get 640 stream processors (5 SMM, 1 SMM is roughly 90% of an SMX) in roughly the same tdp that as the gt750m. Thus nvidia+intel is able to get a large increase in number of calculation units in roughly the same form factor.So in sum AMD is able compete with intel and intel+nvidia only on price and/or time to market. Intel and Intel+Nvidia can meet AMD on graphic efficiency, form factors, and Intel is faster in cpu tasks. Sadly I wish AMD was doing better than they are, not because AMD is bad but I want more competition and more competition is always good for the end user. Problem is AMD may bring their B+ game often but when you have Intel and Nvidia as competitors you need to bring your A+ game to win for your competitors are just as talented as you are and they have far more design resources due to greater revenue and cash on hand which means more money spent on engineers.
Krysto - Wednesday, June 4, 2014 - link
Intel cheats with Turbo-Boost, though, being able to win 5-minute benchmarks because of Turbo-Boost, while also claiming long battery life because Turbo-Boost barely gets any use, so the chip is actually running at lower performance than advertised in the long term.Galatian - Wednesday, June 4, 2014 - link
And AMD doesn't turbo? o.ORoland00Address - Wednesday, June 4, 2014 - link
It turbos but not as well as IntelGalatian - Wednesday, June 4, 2014 - link
And because Intel is able to implement a better turbo they are somehow cheating? I mean it still is a advantage for the consumer in the end. Race to idle and all.zaza991988 - Thursday, June 5, 2014 - link
Actually sometimes Turbo helps with the overall energy efficiency. Turbo enable to push the CPU to the limit and finish the high demanding task quickly so it can return into a lower power state afterwards. So you are temporarily increasing the instantaneous power to get a better an overall energy efficiency.sspiff - Wednesday, June 4, 2014 - link
The TDP of AMD's offering includes the much more powerful GPU (compared to Intel's HD4400 in the i7). For the benchmark results, the Intel was paired with a discrete nVidia card (750M), which also guzzles power NOT included in the 15W TDP of the i7.I'm not familiar with what other parts are on-/off-die for these CPU's, but it's not fair to compare the power envelopes on a spec sheet like you're doing. The fair test would be to measure full-system power consumption of two comparable devices, which is sadly not possible at this time, as no Kaveri laptops have shipped yet.
Torashin - Wednesday, June 4, 2014 - link
Remember, power consumption increases exponentially with frequency, and this phenomenon is even more accentuated with Kaveri because of the process used. So what I'm saying is that the 19W APUs likely wouldn't be that much slower than the 35W one.
Drumsticks - Wednesday, June 4, 2014 - link
Maybe it's also true of frequency, but I believe you're looking for voltage. Power consumption at two different frequencies at the same voltage is much different than at two different voltages at the same frequency.

I could be wrong, though.
JumpingJack - Wednesday, June 4, 2014 - link
Buzzzzz, wrong. Power = C*f*V^2; power is linear with respect to frequency and quadratic with respect to voltage (C is the effective capacitance). Google "Power CMOS Circuits".
zaza991988 - Thursday, June 5, 2014 - link
Actually, the relation between power and frequency is linear, while the relation between power and voltage is quadratic: P = C*f*V^2. A higher TDP also allows the CPU to stay in a higher performance state for a larger amount of time. You can compare the performance between a 15W TDP and a 28W TDP chip at the following link (the rMBP 13 has a 28W TDP chip and the Air has 15 or 17W): http://www.notebookcheck.net/Review-Apple-MacBook-...
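To see the linear-vs-quadratic distinction numerically, a small sketch (the capacitance value is an arbitrary placeholder, not a real chip figure):

# Dynamic power P = C * f * V^2 for CMOS switching.
def dynamic_power(c_eff, freq_hz, volts):
    return c_eff * freq_hz * volts ** 2

C = 1e-9  # farads, placeholder effective switched capacitance
base = dynamic_power(C, 2.0e9, 1.00)
print(f"+25% frequency: {dynamic_power(C, 2.5e9, 1.00) / base:.2f}x power")  # 1.25x, linear
print(f"+25% voltage:   {dynamic_power(C, 2.0e9, 1.25) / base:.2f}x power")  # 1.56x, quadratic
# In practice higher clocks also demand higher voltage, so real power
# climbs much faster than the linear f term alone would suggest.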
eanazag - Wednesday, June 4, 2014 - link
Close compared to the i7 ULV and GT 750M, and often better than the i5. It is good against what will likely cost at least $100 to $200 more. Battery life that is 80% and weight that is 110% of those Intel machines might be a reasonable trade-off for saving $200.
Dan Ritchie - Wednesday, June 4, 2014 - link
On other sites (with slides) they're comparing the 19 watt part against the 15 watt part, not the 35 watt part.
MLSCrow - Wednesday, June 4, 2014 - link
Who cares that it's slower as a CPU? Besides the fact that it's not even noticeably slower, have you forgotten that it's not a CPU at all, but rather an APU? And as an APU, its performance is something to be praised, not denigrated. Intel fans have honestly become the worst fanboys out there now. They can't give credit where it's due and can only make negative comments, even in the light of positive results. Facepalm.
kingpin888 - Thursday, June 5, 2014 - link
Dude, mobile AMD Kaveri has a 35W TDP and is quad core, while the Intel i5 is dual core with a 35W or 47W TDP. What are you talking about??
Gondalf - Wednesday, June 4, 2014 - link
Jarred... Anand... you are comparing a hot 35W SKU with a constrained 15W ultrabook SKU. This time there is no justification: the AnandTech lab is full of notebooks with a 37W Intel CPU on board, and I'm pretty sure it has notebooks with 28W U parts carrying the less constrained HD 5000 or Iris 5100 GPUs. And why an i7-4500U (GT2) and not an i7-4550U (GT3)??????
This is a pretty biased article, nearly useless for customers.
JarredWalton - Wednesday, June 4, 2014 - link
Comparing $300 CPUs against $150 APUs is potentially just as bad. And you'd be surprised how many laptops we don't actually have; most come with high-end configurations, which really muddies the waters. I do have a 37W i7-4702MQ available where I can disable the dGPU and see how performance compares, but that's a $370 CPU and again nowhere near the price of the AMD offerings. (Base price on the laptop is $1500+.)
Morawka - Wednesday, June 4, 2014 - link
You're looking at tray cost. Overall laptop costs will be comparable; Intel has many more components on die. Maybe $150 off. I would have liked to see a comparable-wattage Intel CPU in there as well.
Gondalf - Thursday, June 5, 2014 - link
Yes, agreed with you. The perfect review would use an i7-4550U or i5-4250U, both GT3, with TDP-up enabled up to 21/22W. We forget that Kaveri is 19W, but Haswell needs another 2-3W for the southbridge, which is on-package.
Looking at other reviews, under those conditions Intel and AMD are on average at parity on the GPU side (depending on game title), and Intel has a huge advantage in CPU. The 19W (really 21/22W) TDP figure seems like an attempt to throw sand in customers' eyes.
Let's wait for a review with real notebooks, because the TDP-up feature is widely used by Intel OEMs to get more performance from mobile Haswell.
Thermogenic - Wednesday, June 4, 2014 - link
Can't wait until we get real laptop tests; tests without battery life on a mobile device aren't all that useful anymore.
JarredWalton - Wednesday, June 4, 2014 - link
Yeah, I tried to check it and found that the prototype was basically not tuned at all for battery life -- probably would have been under four hours for Internet surfing, which is way out of line with what we expect. Basically, idle power draw was around 16-17W, almost double what it should be, so the firmware and hardware weren't tuned to go into low power states as far as I could tell. When we get shipping laptops, I suspect we'll see battery life competitive with Intel solutions, maybe even better. I figure 5-6W idle power draw or less isn't too far off these days.
s44 - Wednesday, June 4, 2014 - link
Competitive with Haswell or with Ivy Bridge? Battery life really is everything for most laptop use.
JarredWalton - Wednesday, June 4, 2014 - link
Haswell is obviously the target, and AMD is claiming 11 hours for Kaveri vs. 9.2 hours for Richland (in a low-intensity eReader/browser test), so that's at least a decent bump. But I've seen Trinity laptops with lousy battery life due to the OEM not spending the needed effort in that area, so really it's as much the laptop as it is the CPU/APU.
Gondalf - Thursday, June 5, 2014 - link
"maybe even better" i have some doubts, AMD has a process disasvantage that is a damnation in those uses where the cpu works at very low voltage or stay idle for a little. Web surfing is an example.Mullins has showed that AMD main weakness is there, being behind Qualcomm (TSMC) and Intel (in house) in idle power.
Shivansps - Wednesday, June 4, 2014 - link
Wait, why are they comparing it to a 15W ULV i7? For the same price you can get an Acer Aspire V3-772G-9822, which is an i7 QM + 760M.
takeship - Wednesday, June 4, 2014 - link
Comparing these chips to the Haswell mobile i7 quads would only show that AMD has stopped competing on CPU performance. The Cinebench scores tell that story: my old workhorse i5-520M from 4 years ago scores higher in both single- and multi-threaded tests than these chips. I think the graphics scores also make it apparent that memory bandwidth is a big issue for GCN, and even DDR3-2133 doesn't cut it: 4x the GPU cores of Kabini, with only 2x the output. What AMD desperately needs is an OEM that will put their chips in a design that doesn't look and perform like a Black Friday doorbuster. Sadly, with Intel's ultrabooks all calling for 17W cooling systems (and Broadwell will need less!), there just aren't that many compelling (read: Apple-like fashionable) 35W designs for AMD to hope they get recycled into.
parkerm35 - Wednesday, June 4, 2014 - link
"a big issue for GCN, and even 2133ddr3 doesn't cut it"It was using 1866MHz ram.
jabber - Wednesday, June 4, 2014 - link
They probably gave up competing because 95% of customers stopped caring from 2006 onwards.
iTzSnypah - Wednesday, June 4, 2014 - link
I think the next step AMD should make in APU progression is the addition of DRAM on die. It's more efficient than adding more GPU cores for performance, and with hUMA the CPU cores could take advantage of it too.
Novaguy - Wednesday, June 4, 2014 - link
Except that 3000MHz DDR4 is coming out, so why invest time and R&D into on-die DRAM for the next chip?
CarrellK - Friday, June 6, 2014 - link
What you are asking for is an Xbox One APU. DRAM on die is not a trivial choice. The amount of DRAM you'd want to add is not insignificant, and thus the amount of die area consumed would be very significant. Using the inevitably limited amount of on-die DRAM introduces tremendous complexities in software, as coders have to special-case the use of the high-bandwidth on-die DRAM and manage it very carefully. You can do that with consoles, not with PCs. And the Xbox One APU is at different cost/power/etc. points than Kaveri.

There are other solution paths, but that isn't suitable for a comment on a comment in a review article.
Meaker10 - Wednesday, June 4, 2014 - link
Is it a faux "FX" chip or is it unlocked?
JarredWalton - Wednesday, June 4, 2014 - link
It's not unlocked, but what would the point be? Let's just say my experience with overclocking laptops is that there's usually a reason 99% of laptops don't allow it. Huge notebooks with much beefier coolers can try, but even then we often get only marginal bumps in performance.
Flunk - Wednesday, June 4, 2014 - link
Underclocking the CPU and overclocking the GPU: that would be helpful. I've actually done this in a laptop that shares a cooler between the discrete GPU and CPU, and I imagine it would be useful in an APU laptop too. E.g. (note that the CPU is not underclocked in this test, but I normally underclock it to reduce fan noise):
http://www.3dmark.com/3dm11/6039951
Meaker10 - Friday, June 6, 2014 - link
The GS60 with the M290X: with an A10-5750 it's a questionably balanced system; with an A8-5550M at 3.6-3.7GHz it offers a great deal of value.
Meaker10 - Friday, June 6, 2014 - link
Damn autocorrect: GX60, not GS60.
vladx - Wednesday, June 4, 2014 - link
There's a mistake on the 3rd page: it should be i5-4200U, not 5200U, AFAIK.
texasti89 - Wednesday, June 4, 2014 - link
I went straight to the single-threaded performance, and I see the 15W i7-4500U gets a >50% higher score than the 35W FX-7600P. Yes, the graphics are good at that high TDP, but without competitive CPU performance I think it's hard to justify these SKUs for the target market. AMD really needs to abandon this disastrous line of Bulldozer-based CPU cores for good.
TrackSmart - Wednesday, June 4, 2014 - link
Three thoughts:
1) I'm not defending AMD's single-core performance, but the single-threaded Cinebench test shows a 33% lead for the Intel Core i5, which is not "greater than 50%". That said, I agree that AMD needs to close the gap on single-threaded CPU performance (and CPU performance-per-watt). This is as close as they've been in a long time, but the gap is still large. Their grand strategy has supposedly been to leverage the interaction of CPU+GPU to provide better performance, but when you are the minority player in the chip market, that's a tough hill to climb. How can you get software makers to write code for your APUs when they are only a fraction of the market *and* mostly found in the lowest-end consumer laptops?
2) Sure, multi-threaded CPU performance looks competitive here since AMD has thrown in more "cores" (purposeful use of quotation marks) for the same price as Intel, but that strategy can only go so far. At any time, Intel can choose to lower the price on its chips to match AMD's performance-per-dollar, because Intel's chips use less silicon to produce the same CPU performance.
3) Anand is right that AMD needs to get these chips into nice laptops -- with lower-than-Intel prices -- and emphasize the gaming performance advantage. And those laptops had better have good battery life under normal usage. The presumed niche would be for an AMD laptop with reasonable gaming chops at several hundred dollars less than an Intel laptop with a discrete GPU.
That said, 35 watts is too high for the ultrabook form factor. Will AMD's graphics advantage remain this large when you move down to a 15-watt APU? If the answer is no, then Anand's ideal doesn't really work. All you would get is a $50-cheaper AMD laptop with slower CPU performance and similar GPU performance!
[Note: I'm really rooting for AMD, but I'm going to remain cautious in my enthusiasm for these new chips until I see them in actual laptops. As they say, the proof is in the pudding.]
Natfly - Wednesday, June 4, 2014 - link
> I'm not defending AMD's single core performance, but the single-threaded cinebench test shows a 33% lead for the Intel Core-i5, which is not "greater than 50%".1.33 / .87 = 1.53. 53% faster.
TrackSmart - Thursday, June 5, 2014 - link
Yes, but the Core i5 I specifically refer to shows only a 33% difference. I'm guessing you were looking at the i7. So we are both wrong and both right, and in agreement about the need for better single-threaded performance.
piroroadkill - Wednesday, June 4, 2014 - link
Nice chip. That said, without battery life results this review does fall flat; I understand that they couldn't be done, though.

Other than that, you've touched on the main issue with AMD-based laptops: nobody wants to make a good one.
Give me something like my old Sony Vaio Z12 with that FX-7600P in it, at a decent price, and you have a compelling product. I don't want ridiculous resolutions. I want one that will work well for games and desktop use at a nice size. The Z12 @ 13.1" 1600×900 offers that absolutely perfectly in my opinion.
Homeles - Wednesday, June 4, 2014 - link
"From what I could gather, AMD used "V" transistors in Trinity/Richland but has switched to "T" transistors for Kaveri, which explains the drop in maximum clock speed."Perhaps, but the most important reason was the regression from PD-SOI to bulk.
Interesting claim there about the transistor "shapes." I've not heard of T and V shaped transistors... it appears to be the shape of the gate.
rhx123 - Wednesday, June 4, 2014 - link
I am really confused about the Acer: according to ARK, the 4500U has no PCIe x16 lane output, so how is the 750M attached?
http://ark.intel.com/products/75460/Intel-Core-i7-...
TheinsanegamerN - Wednesday, June 4, 2014 - link
It has a single x4 link and dual x2 links. The 750M would be connected via the x4 link.
coburn_c - Wednesday, June 4, 2014 - link
The GPU isn't good enough to game with, and the CPU isn't good enough for anything else. You'll get half the battery life of the competition and it will get twice as hot. Abject failure.
frozentundra123456 - Wednesday, June 4, 2014 - link
Yes, you hit on the crux of the matter. The problem with all APUs is still that iGPU performance is in limbo: almost there but not quite for gaming, while CPU performance and power consumption trail Intel badly. Looking at the gaming tests, it appears one will be limited to 768p, and even then COH (admittedly terribly optimized) is not playable, and we have no tests for demanding games like Watch Dogs, Metro: Last Light, or Crysis 3. So one will be stuck with either a crappy 768p screen or playing at non-native resolution for a lot of games. Not to mention, this mobile chip should have come out first instead of arriving so close to the Broadwell mobile launch.
silverblue - Wednesday, June 4, 2014 - link
You say "cpu performance and power consumption trail intel badly". There's no real data on the latter apart from TDP values (which, in themselves, aren't an accurate indicator), and the FX-7600P is certainly no slouch when viewing the CPU benchmarks in this preview. Let's not forget that large portion of the die dedicated to the GPU, either - just because it's there, doesn't mean it's being used. Toms had the FX-7600P doing well at GPGPU (it should) and PCMark, less well at physics scores in 3DMark (the competition was a high-end i7) and even worse in Sandra, but we know that doesn't always mean everything; after all, the Core 2 had low memory bandwidth due to having no IMC but was still easily the match of the Phenom II series.This is a preview, so quite why people are expecting numerous tests on 2013/14 titles from a very brief testing period is beyond me. Patience, grasshopper!
Novaguy - Thursday, June 5, 2014 - link
But I suspect you can hit 1080p with an awful lot of 2007-2012 games, making mobile Kaveri + 1080p a great Humble Bundle mobile gaming solution. Now, if they combined it with FreeSync, you could get away with lower frame rates and play even more recent games.
The_Assimilator - Thursday, June 5, 2014 - link
The use case for a laptop is simple: you don't require graphics horsepower the majority of the time. Hence it makes more sense to use a CPU that uses less power (and performs better), and pair it with a discrete GPU that won't be drawing any power most of the time. Intel understands this, and it's why they make money.

AMD, on the other hand, have dreamed up a mythical future reality where everyone who buys a laptop expects it to deliver decent GPU performance. But very few people buy laptops to play games on; they buy laptops to be productive while mobile, and faster CPUs that use less battery power are always going to win out.
nemi2 - Wednesday, June 4, 2014 - link
It's becoming more of a glaring omission as time progresses that AMD doesn't have an SSD caching option in their SATA BIOS/drivers. AMD is targeting the cheaper end of notebooks, so I agree with the author: a pure SSD storage solution will not make the BOM, and because there is no AMD SSD caching solution an 8/16/32GB SSD cache can't make the BOM either; a discrete cache is not supported, and hybrid HDDs unfortunately also seem to command a price premium and don't have as good integration with Windows.

Makes me wonder what the state of USB "ReadyBoost" is with Windows these days; things are very quiet on that front. Also, where is a cheap/free 3rd-party utility to provide SSD/NVM caching? Do AMD laptop vendors have any budget/driver solution?
lmcd - Wednesday, June 4, 2014 - link
The SSD caching probably explains it -- thanks for posting (not an obvious conclusion to reach!).
JarredWalton - Wednesday, June 4, 2014 - link
SSD caches are really only good for two things:
1) Inflate certain benchmark scores (PCMark being a major one).
2) Help laptop boot/resume faster.
Seriously, most other tasks that hit the storage hard -- installing an application, loading a bunch of apps at once, opening a browser with 30 active tabs -- are all only slightly faster than an HDD with SSD caching, where a pure SSD is substantially faster. The best SSD caching solution right now comes from Apple, and it's only good because it's 64GB or more; 24/32GB SSD caches just don't cut it in my experience.
Shadowmaster625 - Wednesday, June 4, 2014 - link
The i7-4550U has a tray price of $426. It would be interesting to see that comparison, but what is the point, ultimately? Also, they probably couldn't get their hands on one.
TheinsanegamerN - Wednesday, June 4, 2014 - link
While Kaveri seems like a good chip, it has two problems. Its CPU performance is still very poor compared to Intel, and it is coming to mobile WAY too close to Broadwell, which will probably be slightly faster, if Intel's bragging of a 40% graphics improvement is correct. On top of that, AnandTech, please compare Kaveri to 37-watt i5s; that's the real competition for this chip. Leave the 19-watt part to duke it out with the 15-watt chips.
WorldMadness - Wednesday, June 4, 2014 - link
You can look at Tom's Hardware; they reviewed the FX-7600P vs. the i7-4702. Except for the CPU performance, everything else is not too shabby; decent enough, but still not up to my expectations, since as we know, aside from MSI every single vendor will come out with shitty specs.
silverblue - Thursday, June 5, 2014 - link
Yet there's still the unexplained matter of why the last two generations of GX70s performed so badly...
mga318 - Wednesday, June 4, 2014 - link
Considering that the Kaveri FX is a 35W TDP part while the i7 U + GT 750M combo has a 67W TDP, I'm quite impressed. This could have a lot of promise, if OEMs would stop ruining everything.
HappyHubris - Wednesday, June 4, 2014 - link
"For people that truly need compute performance, I suspect they're looking at much higher performance parts than an API"API=APU?
Tikcus9666 - Wednesday, June 4, 2014 - link
What I want is an AMD FX-7600P in a 14/15" thin-and-light enclosure with a Full HD screen, 8GB DDR3 @ 2133MHz, and a 128GB/256GB SSD; I'd buy it straight away. Unfortunately, what we will see is a bulkier 15.6" screen at 1366x768, 4GB of DDR3 @ 1333MHz, a 5400RPM hard drive, and a GPU added in CrossFire or on its own that is only slightly more powerful than the APU.
Tikcus9666 - Wednesday, June 4, 2014 - link
1333MHz (not being able to alter comments)
Roland00Address - Wednesday, June 4, 2014 - link
I hate dyslexia. The first time I read the first page, I read AMD MOBILE KAVERI SKUS as AMD MOBILE KAVERI SUCKS.
kyuu - Wednesday, June 4, 2014 - link
I did the same thing, and I don't even have dyslexia. I had to do a double-take and look at it carefully before realizing it was SKUS and not SUCKS.
CoronaL - Wednesday, June 4, 2014 - link
lol me too
max1001 - Wednesday, June 4, 2014 - link
I would love to see a nicely specced ultrabook with this APU at an $800-$1k price point that can do decent gaming.
AnnonymousCoward - Wednesday, June 4, 2014 - link
It'd be fun to throw a GTX 760 into the mix, just to see where we're at.
CoronaL - Wednesday, June 4, 2014 - link
Wonder if they will pair that up with a cheap 250X mobile version for some CrossFire gaming on a laptop. I bet that would beef up the graphics a fair bit. IMO the CPU part is pretty respectable, trading blows with the i7 and i5 for the most part, an area where AMD usually gets trounced.
azazel1024 - Wednesday, June 4, 2014 - link
Well, it does look a lot better than Richland, and price might be a big driver in making these interesting (I see no prices?). But 19W Kaveri versus 15W Haswell ULT: depending on the turbo performance of the 19W Kaveri part, the 15W Haswell ULT looks to run anywhere from roughly as fast to a very sizeable 20-30% lead in CPU performance, and with Kaveri's GPU cut down to roughly half performance between core and clock reductions, the Haswell ULT is likely to be on par graphically (+/-10%), in a lower-power package. And if Kaveri has turbo issues or the GPU is thermally constrained in this package, Intel could have a much more significant lead.

The 15W Kaveri doesn't look like it would be on the performance map at all (<<<50% of the performance of Haswell ULT). The standard-voltage parts actually look the most interesting, especially if the price is moderate. That would probably allow you to make a decent performing machine (70-90% of Intel's CPU performance) with a very nice iGPU for a rather low price point (assuming the FX-7600P is very price-competitive with the Intel Haswell SV dual-core parts).
azazel1024 - Wednesday, June 4, 2014 - link
I guess I missed something with the Intel chips in regard to TDP; my experience is vastly different. Most Intel chips, both desktop and mobile, don't seem to come close to hitting max TDP under full CPU load. Especially in the desktop space: something like my i5-3570 with a TDP of 77W shows around a 35W delta between idle and full load based on wall-plug data; considering losses from the power supply and assuming a fairly low (CPU) idle number, that's only in the area of 30-40W total power consumption. That isn't simply under load -- the processor is clocked up to 4.2GHz single-core and 4GHz all-four-cores turbo. It isn't a max-burn situation where every single itty bitty capacitor and transistor is loaded like Intel does in their TDP tests, but it is 100% sustained CPU load across all 4 cores.

From what I have seen with Ivy Bridge testing (which isn't Haswell, I'll admit), a fully loaded i5-3317U seems to only use in the area of 8-10W with both cores fully loaded at max turbo and 100% CPU load. It's the GPU that uses a huge whack of power, >12W under max load and turbo.
I'd assume Haswell is pretty close there, at least with the CPU, since the turbo core speed is the same, same lithography, very similar architecture. So I doubt a ULT Haswell is using more than 10W under max CPU load and turbo. It's the GPU that is going to be the power hog.
No idea what AMD is capable of on these, but we know they are using a less energy-efficient process, so I'd assume/guess that their CPU is probably below the cap under max CPU load, but it might be much closer, maybe in the 12-15W range under max turbo, meaning GPU load is going to cause the CPU to scale pretty far back, even on the 19W chips.
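For what it's worth, the wall-plug estimate above works out roughly like this (the PSU efficiency is assumed and the readings are hypothetical, just to show the arithmetic):

# Estimating the CPU's load power delta from wall-plug measurements.
PSU_EFFICIENCY = 0.85  # assumed, not measured

idle_wall_w = 45.0  # hypothetical wall draw at idle
load_wall_w = 80.0  # hypothetical wall draw, all cores loaded

# Wall delta scaled by PSU efficiency approximates the extra DC power the
# CPU draws under load (ignores VRM losses and fan ramp-up).
cpu_delta_w = (load_wall_w - idle_wall_w) * PSU_EFFICIENCY
print(f"Estimated CPU load delta: {cpu_delta_w:.1f} W")  # ~29.8 W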
worm - Wednesday, June 4, 2014 - link
It seems like any benchmarks of the Ax-7xxxP parts should be against an ix-4xxxM, not an MQ, as over at Tom's, and the 7xxx parts should be against the ix-4xxxU at approximately the closest price point. That leaves plenty to choose from among laptops on the market. Hopefully we can get that data so a real comparison can be made. I hope we are just waiting on OEMs to release their Kaveri laptops, in both 19W and 35W forms, to make it worthwhile.
JarredWalton - Wednesday, June 4, 2014 - link
I am actually testing the first dual-core standard-voltage Haswell laptop right now, so I'll have that data added to Bench in the next week.
meacupla - Wednesday, June 4, 2014 - link
Still waiting on the A6-7600K. I know it's a desktop CPU, but c'mon, it's been four months since February and I don't even see it in stock at Newegg.
MLSCrow - Wednesday, June 4, 2014 - link
Why are they comparing Kaveri to the 15W ULV i7? Simple: because it's a fairer comparison, for multiple reasons:
A) That i7 is a dual core with Hyper-Threading (4 threads). Kaveri is a dual module with modular multithreading (4 threads). An AMD module is what you compare to a single Intel core with Hyper-Threading; that is the fair comparison. It wouldn't be fair to compare a dual core vs. a quad core now, would it?
B) Although the Intel chip is still more expensive than the Kaveri, it is much more closely priced than the quad core version. Price ranges should always be taken into consideration when comparing offerings of like capabilities from different manufacturers.
silverblue - Thursday, June 5, 2014 - link
Each Steamroller core has two integer pipes, making four per module, as compared to Haswell's four per core. If only the module were exposed to Windows, rather than each integer core, wouldn't there be a chance that single-threaded workloads would be significantly boosted?
aicom - Thursday, June 5, 2014 - link
Not with the current microarchitecture. The integer pipelines are statically partitioned between the cores on each module (unlike Hyper-Threading). It's a key aspect of what AMD calls CMT.
silverblue - Thursday, June 5, 2014 - link
Yes, but I was thinking about changes to the front end. I suppose what I'm referring to is a regression, but the integer cores are too weak on their own. Essentially, I was imagining what would be required to adopt a more SMT-style system and whether it'd be any sort of improvement.
Meteorhead - Thursday, June 5, 2014 - link
Too bad the benchmarks were not done with 2133MHz RAM modules. Earlier Richland tests in a Hungarian magazine clearly showed that ALL games are memory-bandwidth limited and FPS values correlate 100% with RAM clocks: a 20% increase in RAM clock = a 20% increase in FPS. I'd suspect exactly the same behavior holds for Kaveri too, especially since it's even more powerful in ALU terms.
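For scale, here's the peak-bandwidth arithmetic (standard dual-channel DDR3 math; nothing here is measured on Kaveri itself):

# Peak bandwidth: transfers/s * 8 bytes per 64-bit channel * channel count.
def ddr3_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

for speed in (1866, 2133):
    print(f"DDR3-{speed}: {ddr3_bandwidth_gbs(speed):.1f} GB/s")
# DDR3-1866: 29.9 GB/s; DDR3-2133: 34.1 GB/s -- about 14% more, roughly the
# FPS gain you'd expect if a game were fully bandwidth-bound.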
orenc17 - Thursday, June 5, 2014 - link
Why haven't you compared it to Iris or Iris Pro?
MLSCrow - Friday, June 6, 2014 - link
They aren't comparing it to Iris or Iris Pro because...
A) Kaveri costs about half or a third of what an Intel chip with Iris Pro costs, and...
B) Iris Pro only comes on certain quad-core Intel chips in the 47W TDP range. There is no fair comparison to make.
silverblue - Saturday, June 7, 2014 - link
Point B just makes me think of The Terminator...
"Iris Pro CPU in the 47W range."
"Hey, just what ya see, pal."
wumpus - Thursday, June 5, 2014 - link
The point is that a power supply may be expected to supply more than the TDP for significant portions of a second (more than enough to exhaust the input capacitors of the output switcher). Don't use TDP for electrical specs.
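A toy hold-up calculation makes the point; the capacitance and voltage figures below are illustrative guesses, not any real board's specs:

# How long bulk input capacitors can cover a power excursion above what
# the supply sustains: E = 0.5 * C * (V_nom^2 - V_min^2), t = E / P_excess.
C_IN = 680e-6    # farads, assumed bulk input capacitance
V_NOM = 12.0     # nominal input rail (volts)
V_MIN = 11.4     # assumed minimum input the switcher tolerates
EXCESS_W = 15.0  # power drawn above the sustained rating

energy_j = 0.5 * C_IN * (V_NOM**2 - V_MIN**2)
print(f"Hold-up time: {energy_j / EXCESS_W * 1e3:.2f} ms")  # ~0.3 ms
# The caps cover only a fraction of a millisecond, so the supply itself must
# handle any excursion lasting a meaningful part of a second.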
MLSCrow - Friday, June 6, 2014 - link
Awesome name, btw. I used to play a game on my TI-99 back in the early '80s called "Hunt the Wumpus".
TheJian - Thursday, June 5, 2014 - link
1366x768, and these are average fps I'm sure, as there's no mention of minimums, so don't even expect to play at this CRAP resolution without a GPU, as shown. Unfortunately, as these chips have gotten faster, game engines have gotten more taxing. They really have gained nothing on discrete, which is why those sales are staying just where they are while PC sales were down 12% last year. You still need a card to do real gaming.
kirilmatt - Friday, June 6, 2014 - link
These chips are an amazing step for AMD. They have moved from i3-level CPU performance up to i5 level in some cases. People seem to neglect the potential of Dual Graphics in these laptops; with a GCN GPU, big gains could be seen, meaning less energy usage for more performance in games. The 19W FX looks like the best chip they offer. AMD's previous ULV offerings have been dead awful. The OEMs are sure to ruin it, though, by putting these in inferior hardware.
MLSCrow - Friday, June 6, 2014 - link
Just think about this: considering that Kaveri is actually a pretty decent APU, especially for the mobile market, just imagine what Carrizo is going to be like. I honestly believe that Carrizo will be the first APU actually worth considering for an enthusiast gaming laptop/ultrabook. When AMD moves to 20nm, we will finally have HD 7870 performance in an iGPU, and that is when APUs will completely take over the market, as that delivers acceptable performance even for hardcore gamers.

Add to that an all-new x86 core built from the ground up by Jim Keller, mixed with a 16nm process by the time it comes out... and hell, we may even get what all AMD fans are dreaming of right now: an 8- or 16-core APU with the iGPU performance equivalent of an HD 7970 or perhaps even an R9 290X.
johnny_boy - Saturday, June 7, 2014 - link
The 19W FX part is nice. These make great chips for Linux laptops, since you get good GPU performance without having to deal with dual graphics and graphics switching, a horrendous affair on Linux (though improving). If they put this in a moderately well-built machine in the $600-800...
UtilityMax - Sunday, June 8, 2014 - link
The tests are kind of meaningless. As others pointed out, AT compares AMD's 35W TDP part with Intel's 18W TDP designs. Moreover, those lower-power Intel CPUs are dual core while the Kaveri A10s are quad core. Bring in Intel's mobile quad-core i7, or at least a 35W Core i5 (dual core) part, and there'd be no comparison; AMD would be far behind in any bench that doesn't use the GPU.
Andrew H - Wednesday, August 13, 2014 - link
After reading, I was a bit disappointed that the article didn't list the socket type. After doing some digging: those of us currently using the 5750M (Socket FS1r2) will NOT be able to swap in the new FX-7600P (Socket BGA/FP3).

Shame. Guess I'll have to wait for MSI to get on the bandwagon so I can order a motherboard/CPU combo that fits in their universal 17" chassis.
Source: http://www.cpu-world.com/CPUs/Bulldozer/AMD-FX-Ser...