Intel probably didn't test with D3D12. In this API their iGPUs are painfully slow. I don't really know how well Fable Legends is optimized for Intel, but an Iris Pro 6200 is two times slower than a Carrizo with a 15-watt TDP.
Bigger issue for me is that they are talking as if i7-5775C Iris Pro represents the majority, or even some significant portion of the install base. The i7-6700K comes up short of AMD's IGP efforts, which are slotted in beneath their low end discrete cards, so the i7-5775C is the only chip in this article that may fit the statement. Most people getting a new computer are going skylake (6xxx series). There are some people upgrading existing Z97 boards with broadwell, but even if the entire DIY upgrade market went that route, that is a very small percentage of the install base. I have not seen a whole lot of broadwell in retail systems. This makes sense as people (and marketing departments) tend to gravitate to the "larger numbers" or "newness" of the skylake series. The brand new DDR4 is also an easy marketing differentiator. Given that the i7-5775C is a fair bit more expensive than the i7-6700K, I don't imagine that many of the uninformed would see any reason to pay more for an older processor. This is especially true when you take a look at the limitations of the platform and how many marketing bubbles the newer skylake platform gives you over the older broadwell platform. The informed crowd may see some of the benefits of the older chip despite the platform limitations, but they are often buying it to pair with discrete cards, so even if they do count towards the install base, they aren't always using them.
So the new mainstream is the mediocre of the previous era. And there I was, thinking society is improving.
Gaming at a resolution lower than the standard one, at below 60 FPS - that's not mainstream. That's entry level. Mainstream gaming would be at least 1080p60, high end at 2K120 and up.
Not that anyone expected Intel to be objective and realistic about the GPUs they keep cramming into chips rather than the extra cores or lower prices people actually need. An iGPU should only be in products up to the i5 tier.
Honestly, 720p is what the average person games at, and a lot of people don't run games maxed out. That's the enthusiast/high-end PC crowd. Don't project your requirements onto the rest of gamers.
That being said, I do agree those people have no idea what they're missing. Or just don't care enough. I run a 2600K @ 4.5GHz with a 390X on a 1440p 144Hz FreeSync monitor, so I wouldn't be able to go back to 720p or even 1080p, TBH.
Mainstream is not your average casual Angry Birds gamer. The mainstream is really the "average", the "midrange". That's Intel's idea - to prop up its entry-level products as average, but that is straight-out false advertising.
So Intel's very best GPU doesn't entirely suck at playing games at low resolutions and low framerates - but that ain't mainstream; that was the mainstream a long time ago.
"Mainstream is current thought that is widespread"
I highly doubt people playing mine sweeper are the bulk of gamers out there. The mainstream gamer is the midrange gamer - and intel's hardware is really entry level.
More than 60% of Steam users play on monitors at LESS than 1080p (http://store.steampowered.com/hwsurvey). "Mainstream" and "Midrange" do NOT mean the same thing. You can think they do, but they don't. I was a gamer on a very subpar/entry-level build for years because that's what I had. I have access to better now, but back then I turned the fidelity down and lowered the resolution if I had to - it's what we gamers on a budget did.
It's because of the 25% who own those cheap laptops. I would bet half of those are duplicate owners, who own a desktop machine at home and have a cheap gaming laptop they use for CS:GO or Dota.
Also, the China and Korea markets are on that same resolution.
From the Hardware survey:
1366 x 768 26.47%
Every other resolution below 1080p is at 1-2% or less
@jasonelmore: "Every other resolution below 1080p is at 1-2% or less"
I'm going to assume you meant everything other than resolutions at or below 1080P. Steam survey says you are pretty close. 96.37% are at 1080P or below. Of course 1080P makes up 35.15% of their survey, so if you don't include 1080P, you only end up with 61.22%.
@ddriver: "I highly doubt people playing mine sweeper are the bulk of gamers out there. The mainstream gamer is the midrange gamer - and intel's hardware is really entry level."
I would define the midrange gamer by the arithmetic mean. The mainstream gamer is probably best defined by the median, while the casual gamer is probably the mode.
If you want to define it based on computer specs, given that more than half of the users on Steam play on monitors at less than or equal to 1600x900, I'd say that is probably a more realistic mainstream resolution.
@haukionkannel: "Mainstream are people who plays mine-sweep and solitaire with their computer." I'm pretty sure that falls into the casual gamer category.
Of course, mainstream isn't the arithmetic mean. The mainstream gamer is probably best defined as the median, while the casual gamer is probably the mode.
Pure speculation (obviously mine) says that the mainstream group probably includes people who play games like Minecraft, The Sims, and perhaps Portal. There is a rather large MMO crowd, and franchises like Call of Duty do push the median up given the number of followers, but the number of Minesweeper- and Solitaire-only gamers is far greater. That said, many who own a computer don't play games on it at all, and some of the casual gaming crowd has migrated to their phones/tablets. As they drop off the list, the median edges upwards. In any case it is hard to say exactly where the median now lies, and it is probably safe to say that it is a moving target.
Mainstream would not be considered "average". If we talk statistics, there's mean, median, and mode. Mean is the average. Mode, which is the number that occurs most often, would more accurately describe the "mainstream". If you look at all gamers and over 50% of them play at 720P while the remaining people play above that, then 720P is the mainstream, while the average would be 1080P or over.
@zo9mm: " Mode, which is the number that occurs most often, would more accurately describe the "mainstream""
I tend to think of Mode as the casual gamer, though I'll accept that I'm not an authority. I also tend to look at it more from the type of games played than the hardware it is played on. From a hardware perspective, Mode may not be a good fit for any gamer set in this discussion.
In any case, if you define the mainstream as the Median, the Median resolution is less than the 1080P ddriver suggested. If we accept Steam surveys as representative, then 1600x900 is the Median resolution. Interestingly, 1920x1080 is the Mode resolution in Steam surveys with 35.15%, followed by 1366x768 with 26.47%. Surprisingly, 720P only has 1.33%, and only 3.96% of gamers in Steam's survey game at 720P or below. Perhaps Steam's survey isn't the most accurate, but it is easy to look up and the sample is large. Feel free to reference an alternate survey if you want to use it instead.
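To make the median/mode point concrete, here is a quick back-of-the-envelope Python sketch. The shares are the approximate Steam survey figures quoted in this thread, with everything that lacks an exact number lumped into catch-all buckets, so treat it as illustrative rather than authoritative:

```python
# Back-of-the-envelope sketch: weighted median and mode over display-resolution shares.
# Shares are the approximate Steam survey figures quoted in this thread; anything without
# an exact number is lumped into a catch-all bucket, so this is illustrative only.

shares = [
    # (vertical resolution of the bucket, share of users in %)
    (720,  3.96),   # 720p and below
    (768,  26.47),  # 1366x768
    (900,  30.79),  # other sub-1080p resolutions, e.g. 1600x900 (remainder of the 96.37% at <= 1080p)
    (1080, 35.15),  # 1920x1080
    (1440, 3.63),   # everything above 1080p (remainder)
]

def weighted_mode(buckets):
    """Bucket with the largest single share."""
    return max(buckets, key=lambda b: b[1])[0]

def weighted_median(buckets):
    """Bucket where the cumulative share first reaches 50%."""
    total = sum(share for _, share in buckets)
    running = 0.0
    for res, share in sorted(buckets):
        running += share
        if running >= total / 2:
            return res
    return buckets[-1][0]

print("mode bucket:  ", weighted_mode(shares))    # 1080 -> 1920x1080 is the single most common resolution
print("median bucket:", weighted_median(shares))  # 900  -> the "middle" Steam user sits at or below 1600x900
```

Run it and the mode comes out as 1920x1080 while the cumulative 50% mark lands in the 1600x900-and-below bucket, which is exactly the distinction being argued over above.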
I agree completely with you. Higher resolutions are nice, but I'm perfectly happy at 1024x768 or 1366x768 as it does a lot to remove the need to purchase a more powerful GPU and makes whatever GPU does end up performing graphics chores have an easier time of things. Plus, other visual effects can be turned up higher in exchange for the reduced resolution. It's an all around win to play at lower res which makes higher resolution a pretty unimportant thing. Sure it looks a little better, but there's a point of diminishing returns. For me, that point is rapidly reached above 1366x768.
@BrokenCrayons: "Higher resolutions are nice, but I'm perfectly happy at 1024x768 or 1366x768 as it does a lot to remove the need to purchase a more powerful GPU and makes whatever GPU does end up performing graphics chores have an easier time of things."
Thrilled to see you have enough sense to objectively evaluate your needs and set your requirements accordingly. My hat's off to you.
@BrokenCrayons: "Sure it looks a little better, but there's a point of diminishing returns. For me, that point is rapidly reached above 1366x768."
I think that depends on what games you play and how you play them. Taking shooters as an example, if you like to run and gun in a game with lots of tight corridors and not a lot of large open spaces, then there is no reason to up the resolution other than that it looks nicer, and 1024x768 is a nice cheap fit. If, however, you prefer to snipe at extreme range in games with large distances, then maybe resolution is extremely important to the point of giving up other visual effects, and 4K makes some sense. Obviously monitor size plays a role here as well; if the monitor is too small, your eyes lose the ability to resolve such a dense resolution seated 12-18 inches from the monitor. Most people fall somewhere in between these extremes. My games of choice and play style are such that 2560x1600 / 1440P is preferable, though I can perform well enough at 1920x1200 / 1080P. I also prefer 16:10 to the "cinema-oriented" 16:9 aspect ratio that is commonplace, but that is a topic for a different time.
Ah, my current setup mimics yours, just with a single step down in each part. I game with a 2500K @ 4.4GHz with a 390 on a 1080p 120Hz monitor. Glad to know that the concept of a powerful GPU with a sufficiently powerful CPU has such a long lifespan given the slowdown of CPU IPC improvements.
I plan on seeing if DirectX 12 will make my 2500K last even longer. It would be great to have a CPU be relevant 6-7 years after it was released.
I don't know what "mainstream" means to you.. but most players don't care about resolution/fps just look at the sales of PS4/Xbox One and the players who still plays on the last generation consoles..
@marcelobm: "I don't know what "mainstream" means to you.. but most players don't care about resolution/fps just look at the sales of PS4/Xbox One and the players who still plays on the last generation consoles."
I don't disagree, but given that Intel isn't powering those consoles and there is not path of upgrade that will put Intel graphics in your console, I'm pretty sure they were talking mainstream PC gamers.
Point about resolution/fps still stands though. Most just think it looks good or it looks like crap with the occasional, "that game gives me motion sickness". I've met a few people who could play a particular game until a PC version came out with sufficiently powerful hardware to hide the jitters in the game engine. Hasn't been as much of a problem more recently, though.
I think Intel's definition is mainstream of the computer industry, not mainstream of the gamer demographic. For someone not really into gaming who just wants to fire up some basic app-store games or Minecraft or something similar, Iris Pro graphics would probably be satisfactory if they don't know any better. 30fps is smooth enough to be playable and 720p is definitely a bare-minimum standard, and it is more than capable of doing that.
Of course anyone remotely into gaming wouldn't accept such low performance levels and would either go for a console, or a real gaming pc with a discrete GPU.
The title says "mainstream gamer" not "mainstream hardware". There is a difference.
A casual game player - that's who Intel is talking about, but that person is not a gamer. The gamer is a gaming enthusiast, who makes purchases with gaming in mind - gaming gear and whatnot.
Much like not every car driver is a racer. This is claiming your family van is good enough for mainstream racers.
No.. a mainstream gamer is someone that plays games and enjoys the "mainstream" games. CS:GO, COD, AC, DA and other games targeted at a large pool of people. They don't need cutting edge hardware. Enthusiast gaming is different and you're not understanding. Your racing analogy is also totally wrong. It's more like Intel saying the "average commuter" is ok with this more efficient car and then you talking about "Well it's not good enough for racing!"
It's a fallacy to say that people are stupid if they're playing at 30 fps. I've been playing a lot of really good indie titles on a three-year-old i5, and it's performed really well. I've honestly been more limited by RAM and drive space. I've even managed to play League of Legends without problem.
Some games, like first-person shooters, require a higher frame rate to achieve a good level of accuracy. Keep in mind most movies are about 24 fps.
There are different markets. A lot of people will enjoy the Intel graphics.
Think of it more this way: my wife plays games... she plays card games and other such simple games. When Win8 came out, her Core2Duo could no longer play these kinds of 'mainstream' games with a decent frame rate, so I dug around and found an unused GeForce 8600 GT, which brought frame rates back up. When she moved up to an Ivy Bridge i3, the iGPU was as good as or better than the dedicated GPU, so we were able to leave it out of the build. Fast forward 4 years and now my kids are starting to play games on her PC that are a little more demanding. So the question in my mind is whether I should upgrade her whole PC (really interested in something like a NUC or Brix), or just pick up another dGPU.
Essentially, Intel is saying to people like myself: don't bother with a dGPU, just upgrade the whole computer, because it will run faster, quieter, sip less energy, and take less space, while adding modern connectivity and having a GPU roughly as good as the dGPU you were considering in the first place.
The problem is that at the end of the day I am going to end up spending ~$75 per year on her computer. Do I spend just that $75 on a dGPU now? Or do I spend $400-600 and have a new system that is going to last a few years? I really have half a mind to spend more now and not touch it for a while, so this kind of messaging is working on at least one person.
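For what it's worth, the math behind that decision is easy to sketch. Here is a tiny Python example using placeholder prices and lifespans (my assumptions, not quotes from anywhere):

```python
# Back-of-the-envelope cost per useful year: "cheap dGPU now" vs. "whole new system".
# Prices and lifespans below are placeholder assumptions for illustration, not quotes.

def cost_per_year(price_usd, useful_years):
    """Spread an upfront purchase over the years it is expected to stay adequate."""
    return price_usd / useful_years

dgpu_stopgap = cost_per_year(75, 2)    # ~$75 low-end dGPU, assume it buys ~2 more years
new_system   = cost_per_year(500, 5)   # ~$500 NUC/Brix-class box, assume ~5 years of service

print(f"dGPU stopgap: ~${dgpu_stopgap:.0f} per year")   # ~$38 per year
print(f"new system:   ~${new_system:.0f} per year")     # ~$100 per year, but it refreshes the whole platform
```

On raw dollars per year the stopgap card wins; what the new system buys is the quieter, smaller, lower-power platform described above, which is exactly the trade-off Intel's messaging leans on.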
Your wife is not a mainstream gamer, she is not even a casual gamer, she is a non-gamer who plays casual games. Not every act of playing a game makes a gamer.
Anyway, who the hell buys $70 GPUs? It makes no sense whatsoever. I suspect most of the people who end up purchasing such products don't really do it deliberately; it is just people who know nothing about tech ending up with a completely pointless GPU shoved at them through some retailer's pre-assembled configuration.
I reckon it is entirely pointless purchasing GPUs under $150 - 99% of all consumer-grade CPUs come with an iGPU that will be about as good. Now, if you buy a PC with the intent to play games and you happen to be a poor guy, you will buy something in the $150-250 range, and that's what a mainstream gamer is: someone focused on gaming, making a purchase with gaming in mind.
The title doesn't say "mainstream GPUs" - it says "mainstream gamers" but really talks about casual gamers at best, and that's being generous, more appropriately it will be regular people who casually play games, not gamers.
If we are to generalize, then we could say a gamer is someone who plays actual board games, in that case, intel's GPUs are MORE THAN ENOUGH, because those people don't even need computers to play their games.
If you put your limit at discrete GPUs under $100 I'd probably agree, but you can get a 750ti or 260X for a touch over $100, and you can find 370s and 950s for under $150. These will all be substantially better than any integrated solution at the moment. Maybe a next gen Iris Pro with Lvl4 cache could rival a 750ti/260x, but you can bet they'll be a lot more expensive too.
I think there is a big disparity between the industry's perspective of "gamers" and the gaming community's perspective of "gamers"
You also might want to double-check that perspective of "if you happen to be a poor guy, you will buy something in the $150-250 range". That's very gamer-centric, elitist thinking. It's like saying "you must be poor if you're buying an Infiniti Q series instead of a Mercedes S-Class" (or insert your choice of midrange luxury car line as opposed to a premium, far more expensive car line).
I think mainstream is more people who play games like LoL / DotA / Hearthstone / CS:GO / Minecraft. These are the games my teenage nephew plays on my computers; even though my PCs are capable of 1080p high settings, he plays games that would actually run perfectly fine on an IGP.
The headline is a bit disingenuous - when Intel says "our IGPs", they mean the Iris Pro found on a tiny % of relatively expensive Intel machines.
Hence almost all Intel IGPs out there aren't equivalent to discrete GPUs, and the ones that are cost a lot. Then there's the drivers, which, despite improvements since the really bad old days, are still behind Nvidia and AMD for stability and support.
Completely agree. Also, from the headline I expected benchmarks comparing the whole family of Intel IGPs vs mainstream GPUs from AMD and Nvidia, not reused ones from previous reviews. To be honest, this is the kind of article that makes me lose a little bit of faith in technology journalists and/or sites... ambiguous headlines, lack of evidence, and that gut feeling of some undisclosed agreement with the manufacturer... Please keep our beloved AnandTech above that.
For the first time, I agree with a complaint. This really feels like a promotion for Intel, since AnandTech already mentioned that it is quite pointless to put the best graphics in expensive CPUs. Casual gamers usually use laptops. I can say this from experience, as I bought a cheap laptop with a 720p display last year with an i5-5200U specifically for Diablo 3 gaming, and even at the lowest settings it rarely reaches 50fps or more.
I don't think it's AT. I think it's the author, Anton.
He got hired here a couple months after he got fired from KitGuru for writing, in an article about AMD's then-upcoming DDR4 RAM, that people who buy AMD products don't care about performance, and base their purchasing decisions on how cool the thing looks.
He's gotten a little better about disguising his fanboyism, but it's still there, and if you're aware of it, the tone of his articles makes much more sense.
Yeah, I would at least have expected a comment from the editor that currently Intel has not even announced any Skylake GT4e models. We have no idea if they will even bring them to the desktop or when.
"Intel’s latest integrated graphics processor found in Skylake chips..."
I don't think that counts as "being found".
And I agree about the drivers: I really liked using my Ivy Bridge GPU to run Einstein@Home, but not any more with Skylake! An OpenCL driver bug leads to incorrect results, rendering the work pretty much useless. I reported this months ago to Intel, with details and even the source code... apparently they couldn't fix anything yet :/
I was thinking the same thing. First off, the Iris Pro editions of the CPUs cost significantly more, to the point of almost costing as much as a low-end dGPU. Secondly, no low-end or mainstream gamer builds their own PC, and OEMs always put in base-model CPUs rather than Iris Pro-enabled chips, so the mainstream doesn't even get Iris Pro. You need to examine who is making the purchases and what they do with them. Steam says it all. #1 most popular card: Intel G33/G31 Express, #2: Intel 82945G Express, #3: NVIDIA GeForce 6100, #4: Intel HD Graphics 3000, #5: NVIDIA GeForce 7025. The highest share for an Iris-enabled part is 0.45% for the Intel Iris Pro Graphics 5200, which is lower than the GeForce GTX 980 at 1.08%.
That's a good point you brought up: Scarcity of Iris Pro across Intel's product line. At this moment, I can't even buy a Skylake processor with their latest Iris Pro GPU. In the previous Broadwell generation, you could only find Iris Pro in the top-of-the-line Core i5 and i7 processors (socketed or soldered). Even then I had a heckuva time acquiring one (I test a lot of hardware).
It's pretty much marketing fluff - Intel puffing out its chest, saying they are relevant to gamers. There is a small possibility that Intel may be starting a marketing push to try to sell more Iris Pro-equipped CPUs, but unless they start including them in their lower-cost Core i3 and other less expensive processors, lotsa luck to them. Until then, the gaming segment belongs to the discrete GPU.
Intel's top end graphics really are more than enough for most gaming needs. In fact, even their lowest end modern GPUs in Cherry and Bay Trail chips are perfectly fine for older games and less demanding current games. I'd like to see improvements and most certainly migration of the full 128MB EDRAM package down the product stack, but I think it's more important to continue to focus efforts on reducing power consumption and heat output so moving parts in computers are unnecessary. We're already moving away from hard drives and kicking giant desktop PCs into the recycle bin in favor of mobile devices and much smaller desktop form factors. It's also time to do away with cooling fans. I've been relatively happy with Intel graphics since the GMA 950 came out and marginalized the importance of discrete graphics in my own computing. Sure, I've purchased or built the occasional desktop and fed it a GPU for fun, but it was never necessary and didn't really add a lot of value to my computing experience. So yes, I can see Intel's logic in this and their claims are pretty meaningful, but my preference is they spend more of their effort on making elegant, passively cooled processors and leave the double or triple digit TDP exclusively in server racks.
Your comment doesn't make sense in an "article" about gamers. Your comment leaves the impression that you don't game at all, so your comment really isn't relevant to the discussion.
It seems like you're trying to kick someone out of an "exclusive" tree house of boy gamers by reading between the lines to find a reality you want - one that makes my comment fall outside of a preconceived notion of what does or doesn't constitute someone who plays video games. I admit I'm sort of perplexed by why that's significant enough to even comment about, but there is an interesting subculture among people who play computer games heavily that I don't fully understand.
I have experienced that a good portion of the enthusiast gaming community has a problem with thoughts and commentary that don't toe some imaginary line. It's basically tribalism, or the "exclusive tree house" mentality as you've stated, and it really does make it difficult to have a discussion that doesn't cater to that mentality.
Context could be relevant, at least to contrast what gamers, older gamers, and regular users constitute. I happily ran a laptop with Sandy Bridge's HD 3000 powering a 2015-model 2560x1980 IPS widescreen plus the integrated 1366x768 panel, simultaneously. I play no games - mostly audio, office programs, and the occasional HD video/film - hence no stressful demand on the GPU, but nevertheless not a single perceived problem whatsoever from a 5-year-old Intel integrated GPU. I have to say that Intel drivers have been updated on a frequent basis, allegedly improving performance/quality.
This. Intel with their effective monopoly has jacked up prices on all their better chips to the point of absurdity. Even though Moore's Law has run far enough for Intel to stop making dual core chips, we end up with tons of machines with only two cores. The modern web browser runs a lot better with more cores -- and better graphics.
The market needs AMD to make a strong comeback. Evolution in x86 has slowed to a crawl and prices are through the roof for chips that should be very cheap.
Sure, Intel is showing off their high-end Iris Pro graphics performance, letting people believe they'll get good graphics until they buy a low-to-mid i3/i5 and get horrible GPU performance.
BTW, 720p benchmarks are ****; that is what you find on crappy laptop screens these days.
This would be nice if the chips that needed the higher-end integrated graphics actually had them. Very few notebooks ship with Iris Pro graphics because of the huge price premium that Intel charges for chips that integrate Iris Pro. Iris Pro on Intel's low-to-mid-range chips would be an interesting product at the right price. Right now, what they charge for the functionality makes it very unpopular.
Come to our restaurant and with our world renowned wines, eat cardboard and paper! 30 times tastier than 5 years ago, when our soup of the day was dog puke! Yeah, the increase is good, but the starting point was horrible.
The big problem is that Intel's high-end iGPUs are indeed practical for a mainstream market... but they don't put those iGPUs on mainstream parts, they put them on high-end parts. So mainstream gamers get lower-end iGPUs which aren't practical.
If Intel put the Iris Pro on every CPU from the i3 on up, then maybe they'd have a point.
They don't need to put it on every part; just offer a few variants with them. Non-gamers still buy the majority of CPUs; they just need enough to be able to watch cattube in whatever the latest encoding standard is without turning on the fan.
Unfortunately for Intel, the other half of why budget discrete GPUs sell so well is that you can upgrade them. $100 for a new GPU after 2-3 years is a lot cheaper and less hassle than $200+ for a new CPU and mobo; if the new IGP needs faster DRAM to hit its full performance, make that $300+. And a $300 GPU will blow any of Intel's IGPs out of the water.
If they sell parts with worse iGPUs, guess which ones will end up inside the cases of the "mainstream" users? They need to sell Iris Pro on every chip (maybe making an exception for Pentium ones) and offer processors with no iGPU as the cheap alternative and/or to reduce costs on enthusiast parts. I don't know of many people who would buy a K processor and use the iGPU anyway.
Anybody else notice which CPUs with the Iris graphics they were benchmarking? Yeah - the one that you -can't even freaking get-. The Broadwell based i7-5775c and the i5-5675c were both of such limited release that they might as well not even have been commercially available.
Combine the relative scarcity with a lack of production, and that brings us to the second problem - price. Yes, you can get these chips if you search, but they'll run you anywhere from $300-500 USD. For that, you could get an i5-4670k and a 750ti which would blow the Iris graphics out of the water and save $100 to boot.
I realize this article is not a full-blown review, but I would have expected AnandTech to at least add a few words (outside of the graphs) mentioning the option of getting a much cheaper CPU along with an affordable discrete GPU, with bonus points for adding such a configuration to the graphs.
I'm not sure Intel's (or AMD's, for that matter) compute performance really means that much here. The A10-7890K should have similar compute performance to the Iris Pro 580 but they're hardly going to perform on an equal footing in the benchmarks listed here.
One thing that has puzzled me for years is that all five top PC vendors disable the iGPU and replace it with a dirt-cheap dGPU that is, at best, marginally faster. At this bottom layer, I don't think a 20-30% graphics performance increase can justify the cost, power consumption, and noise associated with a dGPU. In one extreme case, Asus used an AMD dGPU to disable the AMD iGPU - and in that case the iGPU had 2X the performance of the dGPU.
I don't think I am the only person surprised by this seemingly stupid behavior from the PC vendors. When AMD started releasing "powerful" iGPUs, they thought the demand for low-end dGPUs would be greatly reduced. They thus put few resources into those dGPUs. They later admitted in an earnings conference call that they were shocked that NVDA sold so many 610/615/620 cards.
This strange phenomenon is more prominent in Asia.
That's just the result of marketing. Saying your computer has a 2GB discrete graphics card makes the average consumer think it's better than the computer sitting next to it that has Intel HD XXXX integrated graphics. A dedicated anything sounds better than integrated, even if it's not. It's the same reason you see those companies trying to sell $2000 gaming PCs with high-end i7s, 32GB of RAM, a giant (mechanical) drive, and then a GTX 950 or 960. Unless you are way into computer hardware, it's really difficult to tell what's good and what isn't, and the companies take advantage of that.
So Intel releases a few SKUs that are so expensive you may as well just buy a real GPU and save some money, and with that they are now claiming that their ENTIRE line of woefully underpowered GPUs are equivalent to discrete. O...K... pull the other one now.
All I got from this article was: "Yes, our iGPUs barely outperform outdated and really low end dGPUs that are manufactured in a process 2 to 3 steps older depending on how you count them. Hey... who are those guys coming from far away? Oh, crap, it's Pascal and Polaris!!! Run, Iris!!!".
Just so most of you know, the Iris 580 is supposed to be around 75% faster than the one in the i7 5775C and basically faster than a desktop Nvidia GTX 750 non-Ti/260 non-X
Those low-end GPUs are about to roughly double in performance with the node shrink, though, so Intel needs that just to keep pace(!). It might at least push NV/AMD to actually move die-shrunk parts down the stack reasonably fast (especially for mobile).
Okay Intel, if you are targeting consumers that buy low-end discrete cards, then why don't you sell an i3 CPU with Iris Pro for 70 bucks more than a normal i3? That makes a lot more sense than what they are doing. The kind of people who buy a $70 R7 240 aren't the same kind of people who spend $300 on a CPU.
On Amazon.com (US), the Intel Core i7-5775C can be bought for $402. The Core i7-6700K is $418. Both are ENTHUSIAST-grade CPUs and waaay overpriced. From the article, quoting Intel's VP Gregory Bryant: "For the mainstream and casual gamer, we have improved our Iris and Iris Pro graphics tremendously." I believe that says it all; no further comment is needed.
Mainstream nowadays is definitely 1080p at or close to 60fps. Practically everyone I know has this resolution and monitors are so cheap you can almost get them in cereal boxes. 1080p30 would be the entry level. You can get really cheap discrete cards that hit either performance level.
One thing I haven't had significant problems with since the 4500MHD is Intel's GPU drivers. The GMA 950 drivers were rough, but some of the graphical issues I ran into back then were related to incomplete implementation of various features in the actual hardware (probably intentionally left out due to a lack of real estate on the chipset's die which was where the GPU resided before being moved to the CPU package with Arrandale...I think..whatever the first HD graphics were anyhow). In any event, Intel's drivers are pretty good these days and certainly very usable. The HD 3000 and the Cherry Trail's iGPU have both behaved perfectly with all the games I throw at them.
From the graphs, it's easy to see Intel is hitting a brick wall of diminishing returns. Their GPU architecture is not scaling well from the 4770K to the 6700K, despite having 40% more EUs and ALUs.
It's time for them to start adding more cores, or do another architecture. Nvidia could wreck those numbers with the same GPU die space if they had access to the lithography process. Same with AMD.
Most of that wall is due to the graphics processor relying almost exclusively on shared system memory. eDRAM, per-EU caching, and other tricks only partly mitigate the fundamental problem Intel faces when competing with graphics cards that have GDDR5 or HBM that offers gobs of fast memory dedicated to the GPU's needs versus it being basically a free-for-all cage brawl between various other system components all contending for access to the computer's RAM. I'm genuinely surprised that Intel has come as far as it has while being memory-constrained.
You are probably right though. Once NV and AMD have access to a 14/16nm process, they'll close the transistor size gap Intel currently enjoys. I think you'll see the tables quickly turn back in favor of budget GPUs outclassing Intel's iGPUs unless Intel somehow changes the situation again with their strange voodoo.
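To put rough numbers on the shared-memory point above, here is a minimal Python sketch using nominal spec-sheet transfer rates; the configurations are assumptions chosen for illustration, and real-world throughput is lower than these theoretical peaks:

```python
# Peak theoretical memory bandwidth: shared system RAM vs. a budget card's dedicated GDDR5.
# Nominal spec-sheet rates only; the iGPU additionally has to share its pool with the CPU.

def bandwidth_gbs(transfer_mts, bus_width_bits, channels=1):
    """Transfer rate (MT/s) x bus width (bytes) x channel count -> GB/s."""
    return transfer_mts * (bus_width_bits / 8) * channels / 1000

print(f"DDR3-1600, dual channel  : {bandwidth_gbs(1600, 64, 2):5.1f} GB/s (shared with the CPU)")
print(f"DDR4-2133, dual channel  : {bandwidth_gbs(2133, 64, 2):5.1f} GB/s (shared with the CPU)")
print(f"GDDR5 @ 5.4 GT/s, 128-bit: {bandwidth_gbs(5400, 128):5.1f} GB/s (GPU-only, 750 Ti-class card)")
# -> roughly 25.6 / 34.1 / 86.4 GB/s. The 128MB eDRAM on Iris Pro (roughly 50 GB/s each way)
#    narrows the gap, but only for data that actually fits in it.
```

Even before any contention with the CPU, the budget card has several times the raw bandwidth to itself, which is part of why adding EUs alone stops paying off.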
I think the point everyone is missing here is that the Iris 540 or the Iris Pro 580 is EXTREMELY attractive when put inside ultrabooks and other slim-profile notebooks, since these smaller notebooks typically can't house a dGPU anyway due to heat/space constraints. So the whole debate about whether Iris is better or worse than some of the cheaper dGPUs is not even relevant in this scenario. I'd much rather have Iris than have nothing at all in my Surface Pro 4, for example.
Personally, I'm very excited for this tech. A popular example is the new Dell Skylake XPS 13 Iris refresh that is coming up next month. The laptop was already a smash hit, and with the inclusion of Iris it's all of a sudden a VERY attractive solution for a mobile gamer like myself. I'm a semi-casual gamer who only plays games like League of Legends and CS:GO, and Iris is more than capable of running those games at acceptable settings.
I was contemplating whether or not I should get the larger Dell XPS 15, which is equipped with the 960M, but now I'm having second thoughts. Yes, the 960M will blow Iris out of the water performance-wise, and we already established this. But the fact that I have the option to achieve decent performance in a much smaller package is really nice. Not all of us care to run Crysis-level graphics, nor do we have interest in playing games that look that nice.
Yea, it's nice and all. Sure, nice performance FROM THE HIGHEST-END SKUs. What most people have is GT1 and GT2, whose performance is fuck all compared to anything.
And about the desktop: where the fuck are the GT3e and GT4e desktop models at a reasonable price? I would love to buy them. The Broadwell C models don't count. Like, where's the Core i3 with Iris Graphics 540 (or 550)? That would sell like nuts. Or an i5 with Iris Pro Graphics 580. Or vice versa.
Intel is being all pompous - look at the benchmarks of our highest-end iGPU SKU, looky look look. But the truth is, NOBODY has them. Even if we look at laptops, on Geizhals.eu there are a whopping 22 laptops listed with Iris Pro graphics. Out of 2600+ listed that have an Intel iGPU.
When I got into PC gaming, it was basically just me trying to see if the family desktop PC had enough power to play the game in question. With an Athlon XP and similar-era complimentary GPU from Compaq, there were definitely games that I could not even play. Battlefield 2, for example, REQUIRED DirectX9 at the time. As in, the game wouldn't boot without DX9 support. TONS of the PCs at the time were not DX9 capable. Tons of my friends couldn't even afford any graphics card that was DX9 capable. One friend saved up for a GPU, thinking it would work, but Battlefield 2 still wouldn't even launch. Friends at school might encourage me to play a game with them, but it's hard to ask mom and dad not only for a game, but also some expensive gizmo that goes into the family computer, and doesn't provide any benefit other than playing that game. Not to mention how complex it gets when you find out your PSU can't push your new gizmo.
What's happening now is that integrated graphics in the "family desktop PC" are becoming so powerful that those days are probably behind us. With an i3 or i5 (or AMD APU) in a Dell or HP family computer you are automatically guaranteeing that gamers will be able to at least dip their toes into almost every modern game. Enough at least to play, become addicted to, and start to budget for a GPU that can run the game well.
I used to joke with my friends about how someday we would click Start -> All programs -> Accessories -> Games -> Battlefield 2 because it had unfathomably great graphics, and future computers would be so powerful they could play them as if they were minesweeper.
Honestly, with this level of horsepower in iGPUs, that day is here.
Let's remember that this comparison to AMD's iGPUs is not really an iGPU-vs-iGPU comparison but an ecosystem-vs-ecosystem one. Give the iGPUs the same amount and speed of memory and remove the API overhead, and the picture will look quite different. And that's without talking about AMD's slower CPU part.
I think the main problem with this article is the fact that the processor it features is not available at most retailers despite its "launch" in Q2/15. As far as I'm concerned, the 5775C is phantom technology - great on paper, but no one can actually buy one.
With the release of Skylake, it's unlikely the inventory of 5775Cs will increase, particularly because it would undermine the sales of the latest flagship processor.
It's remarkable Intel has the audacity to talk about Broadwell CPUs along with their Iris Pro integrated graphics when there are only two DT/socketed models available and Intel has never bothered to have any supply of these processors available in the Can/US market! Instead if one actually wants to buy one of these they have to get them from one of various importers (typically of questionable legitimacy) at a price of $100-150 more than their suggested price.
In other words they are touting the features of Iris Pro on Broadwell pretending like they've got these great/improved integrated graphics "now", when in reality they are talking about vapourware! In the meantime while Skylake iGPUs have improved over the previous generation as well, none of the mainstream Skylakes have Iris Pro either and at best are about half that speed.
Yeah Intel, it's great that you've "improved Iris graphics tremendously", but where are these products in reality? Nowhere! And LOL at talking about GT4e on Skylake. What on a rare handful of BGA/mobile i7 CPUs? Pathetic. It's basically just a bunch of nonsense to go on about how Iris Pro is so great when there's really no widely available or mainstream processors with the feature. Instead of making dumb press releases like this, why don't they put Iris Pro on regular i3s and i5s and *actually* make what they're saying a reality? No? Oh right marketing fluff is a much better idea.
According to Steam, 35% of gamers using Steam play in 1920x1080, and 29% in 3840x1080.
If "mainstream gamers" play at 720p, that means they're only 1.37%... And if you add the 1366x768 crowd at 26%, they are 29%.
So "mainstream" is no longer the majority, but the minority? Plus, let's be frank, gamers who tend to play at 1366x768 aren't on a desktop, because 768p monitors are rare and 1080p ones are super affordable. Which means they're on a laptop, because mainstream laptop manufacturers have been shipping most of their mainstream laptops with crappy screen resolutions for at least 10 years - and I'm not even talking about the glare of those horrible glossy screens!
So basically, Intel is trying to aim for 720p gaming, which means, laptop gaming, on a desktop CPU?
I would have rather wanted some extra cache, cores or a CPU without an iGPU to cut down the price instead of having to pay for this useless crap.
This type of iGPU should only be in Core i3 or any low voltage "Core iX-xxxxS" and ultra low voltage "Core iX-xxxxT" desktop CPU and laptop CPUs.
Full TDP Core i5 and i7 shouldn't have this iGPU, they should have more cores or should be cheaper.
Remember that scene from American Psycho? He tries to make a reservation at the hottest restaurant in the city. The only thing he gets in return is maniacal laughter. Hangs up phone.
zlatan - Thursday, January 14, 2016 - link
Intel probably didn't test with D3D12. In this API their iGPUs are painfully slow.I didn't really know that Fable Legends how optimized for Intel, but an Iris Pro 6200 is two times slower than a Carrizo with 15 watt TDP.
nathanddrews - Thursday, January 14, 2016 - link
Source for that benchmark?I suppose that yes, technically Iris Pro is on par with a dGPU that costs $50, but it's still "not great".
BurntMyBacon - Monday, January 18, 2016 - link
Bigger issue for me is that they are talking as if i7-5775C Iris Pro represents the majority, or even some significant portion of the install base. The i7-6700K comes up short of AMD's IGP efforts, which are slotted in beneath their low end discrete cards, so the i7-5775C is the only chip in this article that may fit the statement. Most people getting a new computer are going skylake (6xxx series). There are some people upgrading existing Z97 boards with broadwell, but even if the entire DIY upgrade market went that route, that is a very small percentage of the install base. I have not seen a whole lot of broadwell in retail systems. This makes sense as people (and marketing departments) tend to gravitate to the "larger numbers" or "newness" of the skylake series. The brand new DDR4 is also an easy marketing differentiator. Given that the i7-5775C is a fair bit more expensive than the i7-6700K, I don't imagine that many of the uninformed would see any reason to pay more for an older processor. This is especially true when you take a look at the limitations of the platform and how many marketing bubbles the newer skylake platform gives you over the older broadwell platform. The informed crowd may see some of the benefits of the older chip despite the platform limitations, but they are often buying it to pair with discrete cards, so even if they do count towards the install base, they aren't always using them.ddriver - Thursday, January 14, 2016 - link
So the new mainstream is the mediocre of the previous era. And there I was, thinking society is improving.Gaming at a resolution, lower than the standard one, at below 60 FPS - that's not mainstream. That's entry level. Mainstream gaming would be at least at 1080p60, high end at 2k120 and up.
Not that anyone expected Intel to be objective and realistic about the GPUs they keep cramming into chips rather than extra cores or lower prices - things people actually need. iGPU should only be in up to i5 products.
jragonsoul - Thursday, January 14, 2016 - link
Honestly for the average persons 720p is what they game on. and a lot of games don't run game maxed out. That'es the enthusiast/high end PC crowd. Don't project your requirements on to the rest of gamers.That being said I do agree those people have no idea what they're missing. Or just don't care enough. I run a 2600k @4.5ghz with a 390x on a 1440p 144hz freesync monitor so I wouldn't be able to go back to 720 or even 1080 TBH.
ddriver - Thursday, January 14, 2016 - link
Mainstream is not your average casual angry birds gamer. The mainstream is really the "average" the "midrange". That's intel's idea - to prop up its entry level products as average, but that is straight out false advertising.So intel's very best GPU doesn't entirely suck at playing games at low resolutions and low framerates - but that ain't mainstream, that was the mainstream a long time ago.
haukionkannel - Thursday, January 14, 2016 - link
Mainstream are people who plays mine-sweep and solitaire with their computer. Intel GPU is fast enough to do that!ddriver - Thursday, January 14, 2016 - link
"Mainstream is current thought that is widespread"I highly doubt people playing mine sweeper are the bulk of gamers out there. The mainstream gamer is the midrange gamer - and intel's hardware is really entry level.
jragonsoul - Thursday, January 14, 2016 - link
More than 60% of steam user play on monitors LESS than 1080p (http://store.steampowered.com/hwsurvey) . "Mainstream" and "Midrange" do NOT mean the same thing. You can think that does but but they don't. I was a gamer on a very subpar/entry level build years because that's what I had. I have access to better now but back then I turned the fidelity down and lowered the resolution if I had to, it's what us gamers on a budget did.jasonelmore - Friday, January 15, 2016 - link
its because of the 25% who own those cheap laptops. I would bet half of that are duplicate owners, who own a desktop machine at home, and have a cheap gaming laptop they use to CS GO or Dota.Also the China and Korea market is on that same resolution
From the Hardware survey:
1366 x 768 26.47%
Everything other resolution below 1080p is 1-2% or less
BurntMyBacon - Monday, January 18, 2016 - link
@jasonelmore: "Everything other resolution below 1080p is 1-2% or less"I'm going to assume you meant everything other than resolutions at or below 1080P. Steam survey says you are pretty close. 96.37% are at 1080P or below. Of course 1080P makes up 35.15% of their survey, so if you don't include 1080P, you only end up with 61.22%.
BurntMyBacon - Monday, January 18, 2016 - link
@ddriver: "I highly doubt people playing mine sweeper are the bulk of gamers out there. The mainstream gamer is the midrange gamer - and intel's hardware is really entry level."I would define the midrange gamer as the arithmetic mean. The mainstream gamer is probably best defined as the median where the casual gamer is probably the mode.
If you want to define it based on computer specs, given that more than half of the users on steam play on monitors less than or equal to 1600x900, I say that is probably a more realistic mainstream resolution.
BurntMyBacon - Monday, January 18, 2016 - link
@haukionkannel: "Mainstream are people who plays mine-sweep and solitaire with their computer."I'm pretty sure that falls into the casual gamer category.
Of course, mainstream isn't the arithmetic mean. The mainstream gamer is probably best defined as the median where the casual gamer is probably the mode.
Pure speculation (obviously mine) says that the mainstream group probably include people that play games like Minecraft, the SIMS, and perhaps Portal. There is a rather large MMO crowd and franchises like Call of Duty do push the median up given the number of followers, but the number of mine-sweeper and solitaire only gamers is far greater. That said, many who own a computer don't play games on it at all and some of the casual gaming crowd has migrated to their phones/tablets. As they drop off the list, the median edges upwards. In any case it is hard to say exactly where the median now lies and it is probably safe to say that it is a moving target.
zo9mm - Thursday, January 14, 2016 - link
Mainstream would not be considered "average". If we talk statistics, there's mean, median, and mode. Mean is the average. Mode, which is the number that occurs most often, would more accurately describe the "mainstream". If you look at all gamers, and over 50% of them play at 720P, while the remaining people play above that, 720P is the mainstream, while the average would be 1080P or over.BurntMyBacon - Monday, January 18, 2016 - link
@zo9mm: " Mode, which is the number that occurs most often, would more accurately describe the "mainstream""I tend to think of Mode as the casual gamer, though I'll accept that I'm not an authority. I also tend to look at it more from the type of games played than the hardware it is played on. From a hardware perspective, Mode may not be a good fit for any gamer set in this discussion.
In any case, if you define the mainstream as Median, the Median resolution is less than the 1080P ddriver suggested. If we accept steam surveys as representative, then 1600x900 is the median resolution. Interestingly, 1920x1080 is the Mode resolution in steam surveys with 35.15% followed by 1366x768 with 26.47%. Surprisingly, 720P only has 1.33%, and only 3.96% of gamers on steam's survey game at 720P or below. Perhaps steams survey isn't the most accurate, but it is easy to look up and statistically significant. Feel free to reference an alternate survey if you want to use it instead.
Thatguy97 - Thursday, January 14, 2016 - link
Hell ppl play cs go at 10 by 7 so they can see better. Not everyone cares about resolution.jragonsoul - Thursday, January 14, 2016 - link
Thank you! People act like to be mainstream you need a 1440p monitor with a high end GPU.BrokenCrayons - Friday, January 15, 2016 - link
I agree completely with you. Higher resolutions are nice, but I'm perfectly happy at 1024x768 or 1366x768 as it does a lot to remove the need to purchase a more powerful GPU and makes whatever GPU does end up performing graphics chores have an easier time of things. Plus, other visual effects can be turned up higher in exchange for the reduced resolution. It's an all around win to play at lower res which makes higher resolution a pretty unimportant thing. Sure it looks a little better, but there's a point of diminishing returns. For me, that point is rapidly reached above 1366x768.BurntMyBacon - Monday, January 18, 2016 - link
@BrokenCrayons: "Higher resolutions are nice, but I'm perfectly happy at 1024x768 or 1366x768 as it does a lot to remove the need to purchase a more powerful GPU and makes whatever GPU does end up performing graphics chores have an easier time of things."Thrilled to see you have enough sense to objectively evaluate your needs and set your requirements accordingly. My hat's off to you.
@BrokenCrayons: "Sure it looks a little better, but there's a point of diminishing returns. For me, that point is rapidly reached above 1366x768."
I think that depends on what games you play and how you play them. Taking shooters as an example, if you like to run and gun in a game with lots of tight corridors and not a lot of large open spaces, then there is no reason to up the resolution other than it looks nicer and 1024x768 is a nice cheap fit. If, however, you prefer to snipe at extreme range in games with large distances, then maybe resolution is extremely important to the point of giving up other visual effects and 4K makes some sense. Obviously monitor size plays a role here as well as if the monitor is too small, you eyes looks to ability to resolve such dense resolution seated 12 - 18 inches from the monitor. Most people fall somewhere in between these extremes. My games of choice and play style are such that 2560x1600 / 1440P is preferable, though I can perform well enough on 1920x1200 / 1080P. I also prefer 16x10 to the "Cinema oriented" 16x9 aspect ratio that is common place, but that is a topic for a different time.
mdriftmeyer - Thursday, January 14, 2016 - link
Mainstream/Casual gamers are on smartphones that run well past 1080p resolutions. They aren't playing on some entry level iSeries for < $900.pt2501 - Friday, January 15, 2016 - link
Ah my current setup mimics yours, just with a single step down in each part. I game with a 2500k@4.4 ghz with a 390 on a 1080p120hz. Glad to know that the concept of a powerful GPU with a sufficiently powerful CPU has such a long life span with the slow down of CPU IPC improvements.I plan on seeing if directx 12 will make my 2500k last even longer. Would be great have a CPU be relevant 6-7 years after it was released.
marcelobm - Thursday, January 14, 2016 - link
I don't know what "mainstream" means to you.. but most players don't care about resolution/fps just look at the sales of PS4/Xbox One and the players who still plays on the last generation consoles..BurntMyBacon - Monday, January 18, 2016 - link
@marcelobm: "I don't know what "mainstream" means to you.. but most players don't care about resolution/fps just look at the sales of PS4/Xbox One and the players who still plays on the last generation consoles."I don't disagree, but given that Intel isn't powering those consoles and there is not path of upgrade that will put Intel graphics in your console, I'm pretty sure they were talking mainstream PC gamers.
Point about resolution/fps still stands though. Most just think it looks good or it looks like crap with the occasional, "that game gives me motion sickness". I've met a few people who could play a particular game until a PC version came out with sufficiently powerful hardware to hide the jitters in the game engine. Hasn't been as much of a problem more recently, though.
kaesden - Thursday, January 14, 2016 - link
I think intel's definition is mainstream of the computer industry, not mainstream of the gamer demographic. For someone not really into gaming just looking to fire up some basic appstore games or minecraft or something similar, iris pro graphics would probably be satisfactory to someone who doesn't know any better. 30fps is smooth enough to be playable and 720p is definately a bare minimum standard, but it is more than capable of doing that.Of course anyone remotely into gaming wouldn't accept such low performance levels and would either go for a console, or a real gaming pc with a discrete GPU.
ddriver - Thursday, January 14, 2016 - link
The title says "mainstream gamer" not "mainstream hardware". There is a difference.A casual game player - that's what intel talks about, but that person is not a gamer, the gamer is a gaming enthusiast, who makes purchases with gaming in mind, gaming gear and whatnot.
Much like not every car driver is a racer. And claiming your family van is good enough for mainstream racers.
jragonsoul - Thursday, January 14, 2016 - link
No.. a mainstream gamer is someone that plays games and enjoys the "mainstream" games. CS:GO, COD, AC, DA and other games targeted at a large pool of people. They don't need cutting edge hardware. Enthusiast gaming is different and you're not understanding. Your racing analogy is also totally wrong. It's more like Intel saying the "average commuter" is ok with this more efficient car and then you talking about "Well it's not good enough for racing!"mkozakewich - Sunday, January 17, 2016 - link
It's a fallacy to say that people are stupid if they're playing at 30 fps. I've been playing a lot of really good indie titles on a three-year-old i5, and it's performed really well. I've honestly been more limited by RAM and drive space. I've even managed to play League of Legends without problem.Some games, like first-person shooters, require a higher frame rate to achieve a good level of accuracy. Keep in mind most movies are about 24 fps.
There are different markets. A lot of people will enjoy the Intel graphics.
CaedenV - Thursday, January 14, 2016 - link
think of it more this way:My wife plays games... she plays card games, and other such simple games. When win8 came out her Core2Duo could no longer play these kinds of 'mainstream' games with a decent frame rate, so I dug around and found a GT8600 GPU that wasn't in use which brought frame rates back up. When she moved up to an Ivy Bridge i3 the iGPU was as good or better than having the dedicated GPU, so we were able to leave it out of the build.
Fast forward 4 years and now my kids are starting to play games on her PC that are a little more demanding. So the question in my mind is if I should upgrade her whole PC (really interested in something like a NUC or Brix), or just pick up another dGPU.
Essentially, Intel is saying to people like myself: don't bother with a dGPU, just upgrade the whole computer because it will make the whole thing run faster, quiter, sip less energy, and take less space, while adding modern connectivity, while having a GPU level roughly as good as the dGPU you are considering in the first place.
The problem is that at the end of the day I am going to end up spending ~$75 per year on her computer. Do I spend just that $75 on a dGPU now? Or do I spend $4-600 and have a new system that is going to last a few years. I really have half a mind to spend more now and not touch it for a while, so this kind of messaging is working on at least one person.
CaedenV - Thursday, January 14, 2016 - link
Of course, the other option is that I just buy myself a GTX970 and put my current GTX570 in her computer... bu then I am spending even more money lol.ddriver - Thursday, January 14, 2016 - link
Your wife is not a mainstream gamer, she is not even a casual gamer, she is a non-gamer who plays casual games. Not every act of playing a game makes a gamer.Anyway, who the hell buys 70$ GPUs? It makes no sense whatsoever. I suspect most of the people who end up purchasing such products don't really do it deliberately, it is just people who know nothing about tech, ending up being shoved a completely pointless GPU through buying some retailer's assembled configuration.
I reckon it is entirely pointless purchasing GPUs under 150$, 99% of all consumer grade CPUs come with iGPU that will be about as good. Now if you buy a PC with the intent to play games and you happen to be a poor guy, you will buy something in the 150-250$ range, and that's what a mainstream gamer is. Someone focused on gaming, making a purchase with gaming in mind.
The title doesn't say "mainstream GPUs" - it says "mainstream gamers" - but it really talks about casual gamers at best, and that's being generous; more accurately, it's about regular people who casually play games, not gamers.
If we are to generalize, then we could say a gamer is someone who plays actual board games; in that case, Intel's GPUs are MORE THAN ENOUGH, because those people don't even need computers to play their games.
rhysiam - Thursday, January 14, 2016 - link
If you put your limit at discrete GPUs under $100 I'd probably agree, but you can get a 750 Ti or 260X for a touch over $100, and you can find 370s and 950s for under $150. These will all be substantially better than any integrated solution at the moment. Maybe a next-gen Iris Pro with L4 cache could rival a 750 Ti/260X, but you can bet it'll be a lot more expensive too.
Namisecond - Tuesday, January 19, 2016 - link
I think there is a big disparity between the industry's perspective of "gamers" and the gaming community's perspective of "gamers". You also might want to double-check that perspective of "if you happen to be a poor guy, you will buy something in the $150-250 range". That's very gamer-centric, elitist thinking. It's like saying "you must be poor if you're buying an Infiniti Q series instead of a Mercedes S-Class" (or insert your choice of midrange luxury car line as opposed to a premium, far more expensive car line).
Concillian - Thursday, January 14, 2016 - link
I think mainstream is more people who play games like LoL / DotA / Hearthstone / CS:GO / Minecraft. These are the games my teenage nephew plays on my computers; even though my PCs are capable of 1080p at high settings, he plays games that would actually run perfectly fine on an IGP.
Dribble - Thursday, January 14, 2016 - link
Headline is a bit disingenuous - when Intel says "our IGPs" they mean the Iris Pro found on a tiny percentage of relatively expensive Intel machines. Hence almost all Intel IGPs out there aren't equivalent to discrete GPUs, and the ones that are cost a lot. Then there are the drivers, which, despite improvements since the really bad old days, are still behind Nvidia and AMD for stability and support.
Winterblade - Thursday, January 14, 2016 - link
Completely agree. Also, from the headline I expected benchmarks comparing the whole family of Intel iGPUs vs. mainstream GPUs from AMD and NVIDIA, not reused ones from previous reviews. To be honest, this is the kind of article that makes me lose a little bit of faith in technology journalists and/or sites... ambiguous headlines, lack of evidence and that gut feeling of some undisclosed agreement with the manufacturer...
Please keep our beloved Anandtech above that.
zodiacfml - Thursday, January 14, 2016 - link
First time I agree with a complaint. This really feels like a promotion for Intel, since Anandtech already mentioned that it is quite pointless to put the best graphics in expensive CPUs. Casual gamers usually use laptops. I can say this from experience: I bought a cheap laptop with a 720p display last year with an i5-5200U specifically for Diablo 3 gaming, and even at the lowest settings it rarely reaches 50 fps or more in gameplay.
speely - Thursday, January 14, 2016 - link
I don't think it's AT. I think it's the author, Anton. He got hired here a couple of months after he got fired from KitGuru for writing, in an article about AMD's then-upcoming DDR4 RAM, that people who buy AMD products don't care about performance and base their purchasing decisions on how cool the thing looks.
He's gotten a little better about disguising his fanboyism, but it's still there, and if you're aware of it, the tone of his articles makes much more sense.
MrSpadge - Thursday, January 14, 2016 - link
Yeah, I would at least have expected a comment from the editor that Intel has not currently even announced any Skylake GT4e models. We have no idea if they will even bring them to the desktop, or when.
"Intel’s latest integrated graphics processor found in Skylake chips..."
I don't think that counts as "being found".
And I agree about the drivers: I really liked using my Ivy Bridge GPU to run Einstein@Home, but not anymore with Skylake! An OpenCL driver bug leads to incorrect results, rendering the work pretty much useless. I reported this to Intel months ago, with details, and even delivered the source code... apparently they couldn't fix anything yet :/
nandnandnand - Thursday, January 14, 2016 - link
That's right. GT4e/Iris 580 might be impressive... like a unicorn is impressive. I want GT4e/Iris 580 on the market and benchmarks compared to discrete cards and AMD's top integrated graphics.
ketacdx - Thursday, January 14, 2016 - link
I was thinking the same thing. First off, the Iris Pro editions of the CPUs cost significantly more, to the point of almost costing as much as a low-end dGPU. Secondly, no low-end or mainstream gamer builds their own PC, and OEMs always put in base-model CPUs rather than Iris Pro-enabled chips, so the mainstream doesn't even get Iris Pro. You need to examine who is making the purchases and what they do with them. Steam says it all. #1 most popular card: Intel G33/G31 Express, #2: Intel 82945G Express, #3: NVIDIA GeForce 6100, #4: Intel HD Graphics 3000, #5: NVIDIA GeForce 7025. The highest share for an Iris-enabled card is 0.45% for Intel Iris Pro Graphics 5200, which is lower than the GeForce GTX 980 at 1.08%.
Namisecond - Tuesday, January 19, 2016 - link
That's a good point you brought up: the scarcity of Iris Pro across Intel's product line. At this moment, I can't even buy a Skylake processor with their latest Iris Pro GPU. In the previous Broadwell generation, you could only find Iris Pro in the top-of-the-line Core i5 and i7 processors (socketed or soldered). Even then I had a heckuva time acquiring one (I test a lot of hardware). It's pretty much marketing fluff, Intel puffing out its chest saying they are relevant to gamers. There is a small possibility that Intel may be starting a marketing push to try and sell more Iris Pro-equipped CPUs, but unless they start including them in their lower-cost Core i3 and other less expensive processors, lotsa luck to them. Until then, the gaming segment belongs to the discrete GPU.
D. Lister - Thursday, January 14, 2016 - link
I love the details in this advertisement. When you're paying AT for promotion, dammit you're getting a bang for your buck. *thumbsup*
BrokenCrayons - Thursday, January 14, 2016 - link
Intel's top-end graphics really are more than enough for most gaming needs. In fact, even their lowest-end modern GPUs in Cherry Trail and Bay Trail chips are perfectly fine for older games and less demanding current games. I'd like to see improvements, and most certainly migration of the full 128 MB eDRAM package down the product stack, but I think it's more important to continue to focus efforts on reducing power consumption and heat output so that moving parts in computers are unnecessary. We're already moving away from hard drives and kicking giant desktop PCs into the recycle bin in favor of mobile devices and much smaller desktop form factors. It's also time to do away with cooling fans. I've been relatively happy with Intel graphics since the GMA 950 came out and marginalized the importance of discrete graphics in my own computing. Sure, I've purchased or built the occasional desktop and fed it a GPU for fun, but it was never necessary and didn't really add a lot of value to my computing experience. So yes, I can see Intel's logic in this and their claims are pretty meaningful, but my preference is that they spend more of their effort on making elegant, passively cooled processors and leave the double- or triple-digit TDPs exclusively in server racks.
fanofanand - Thursday, January 14, 2016 - link
Your comment doesn't make sense in an "article" about gamers. It leaves the impression that you don't game at all, so it really isn't relevant to the discussion.
BrokenCrayons - Friday, January 15, 2016 - link
It seems like you're trying to kick someone out of an "exclusive" tree house of boy gamers by reading between the lines to find the reality you want, one in which my comment falls outside a preconceived notion of what does or doesn't constitute someone who plays video games. I admit I'm somewhat perplexed as to why that's significant enough to even comment about, but there is an interesting subculture among people who play computer games heavily that I don't fully understand.
Namisecond - Tuesday, January 19, 2016 - link
I have experienced that a good portion of the enthusiast gaming community has a problem with thoughts and commentary that don't toe some imaginary line. It's basically tribalism, or the "exclusive tree house" mentality as you've stated, and it really does make it difficult to have a discussion that doesn't cater to that mentality.
Nexing - Sunday, January 17, 2016 - link
Context could be relevant, at least to contrast what gamers, older gamers and regular users constitute. I happily ran a laptop with Sandy Bridge's HD-3000 powering a 2015 2560x1980 IPS widescreen plus the integrated 1766x768 panel, simultaneously. I play no games - mostly audio, office programs and the occasional HD video/film - hence no stressful demand on the GPU, yet without a single perceived problem whatsoever from a 5-year-old Intel integrated GPU. Have to say that Intel drivers have been updated on a frequent basis, allegedly improving performance/quality.
Nexing - Sunday, January 17, 2016 - link
Should say: HD-3000 powering a 2015 2560x1080 IPS widescreen.
How about a dollar-for-dollar comparison? Cheap CPU + $99 GPU card vs. fancy Iris Pro.
10101010 - Monday, January 18, 2016 - link
This. Intel, with their effective monopoly, has jacked up prices on all their better chips to the point of absurdity. Even though Moore's Law has run far enough for Intel to stop making dual-core chips, we end up with tons of machines with only two cores. The modern web browser runs a lot better with more cores -- and better graphics. The market needs AMD to make a strong comeback. Evolution in x86 has slowed to a crawl, and prices are through the roof for chips that should be very cheap.
duploxxx - Thursday, January 14, 2016 - link
Sure, Intel is showing off their high-end Iris Pro graphics performance, letting people believe they'll get good graphics until they buy a low- to mid-range i3/i5 and get horrible GPU performance. BTW, 720p benchmarks are ****; that is what you find on crappy laptop screens these days.
testbug00 - Thursday, January 14, 2016 - link
"when you spend twice as much on our CPU with eDRAM than you would for an i3 and a cheap GPU we're almost as fast".Gee. Thanks. How about for the majority of the market where cost is the deciding factor.
Flunk - Thursday, January 14, 2016 - link
This would be nice if the chips that needed the higher-end integrated graphics actually had them. Very few notebooks ship with Iris Pro graphics because of the huge price premium that Intel charges for chips that integrate Iris Pro. Iris Pro on Intel's low- to mid-range chips would be an interesting product at the right price. Right now, what they charge for the functionality makes it very unpopular.
imaheadcase - Thursday, January 14, 2016 - link
Not sure about anyone else, but "30 times what it was 5 years ago" seems like a pretty bad rate of increase.
Doubling right about every year is not bad at all.
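A quick arithmetic check on those two readings of the same figure, taking the "30 times in 5 years" claim at face value:

# 30x growth over 5 years implies a yearly growth factor of 30^(1/5)
yearly_factor = 30 ** (1 / 5)
print(f"~{yearly_factor:.2f}x per year")  # ~1.97x, i.e. roughly doubling every year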
Gurbo - Thursday, January 14, 2016 - link
Come to our restaurant and, with our world-renowned wines, eat cardboard and paper! 30 times tastier than 5 years ago, when our soup of the day was dog puke! Yeah, the increase is good, but the starting point was horrible.
Guspaz - Thursday, January 14, 2016 - link
The big problem is that Intel's high-end iGPUs are indeed practical for a mainstream market... but they don't put those iGPUs on mainstream parts; they put them on high-end parts. So mainstream gamers get lower-end iGPUs, which aren't practical. If Intel put Iris Pro on every CPU from the i3 on up, then maybe they'd have a point.
DanNeely - Thursday, January 14, 2016 - link
They don't need to put it on every part; just offer a few variants with it. Non-gamers still buy the majority of CPUs; they just need enough to be able to watch cattube in whatever the latest encoding standard is without turning on the fan. Unfortunately for Intel, the other half of why budget discrete GPUs sell so well is that you can upgrade them. $100 for a new GPU after 2-3 years is a lot cheaper and less hassle than $200+ for a new CPU and mobo; if the new IGP needs faster DRAM to hit its full performance, make that $300+. A $300 GPU will blow any of Intel's IGPs out of the water.
Gurbo - Thursday, January 14, 2016 - link
If they sell parts with worse iGPUs, guess which ones will end up inside the cases of "mainstream" users? They need to sell Iris Pro on every chip (maybe making an exception for Pentiums) and offer processors with no iGPU as the cheap alternative and/or to reduce costs on enthusiast parts. I don't know of many people who would buy a K processor and use the iGPU anyway.
Anybody else notice which CPUs with the Iris graphics they were benchmarking? Yeah - the ones that you -can't even freaking get-. The Broadwell-based i7-5775C and i5-5675C were both of such limited release that they might as well not have been commercially available. Combine the relative scarcity with a lack of production, and that brings us to the second problem - price. Yes, you can get these chips if you search, but they'll run you anywhere from $300-500 USD. For that, you could get an i5-4670K and a 750 Ti, which would blow the Iris graphics out of the water and save $100 to boot.
webdoctors - Thursday, January 14, 2016 - link
Anandtech did put prices in the graph. Maybe this whole exercise is in graph comprehension and logic. Based on those graphs, it looks like buying an i3 for $120 and pairing it with a $150 750 Ti is the best bang for the buck; you'd be ahead in every respect.
This lineup makes no sense; the pricing on those "integrated" iGPUs is ridiculous.
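A minimal sketch of the comparison being made here, using the prices quoted in this comment and the roughly $370 Iris Pro street price mentioned elsewhere in the thread (treat all of them as approximate):

# Total platform cost: cheap CPU + discrete card vs. an Iris Pro CPU on its own
i3_plus_750ti = 120 + 150   # i3 (~$120) + GTX 750 Ti (~$150), figures from the comment above
iris_pro_cpu = 370          # approximate Iris Pro CPU street price cited later in the thread
print(f"i3 + 750 Ti: ${i3_plus_750ti}, Iris Pro CPU: ${iris_pro_cpu}, "
      f"difference: ${iris_pro_cpu - i3_plus_750ti}")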
Ktracho - Thursday, January 14, 2016 - link
I realize this article is not a full-blown review, but I would have expected Anandtech to at least add a few words (outside of the graphs) mentioning the option of getting a much cheaper CPU along with an affordable discrete GPU, with bonus points for adding such a configuration to the graphs.
fanofanand - Thursday, January 14, 2016 - link
I seem to have this same graph comprehension issue you speak of; I see wattage listed but not price. Where are you seeing that?
baobrain - Thursday, January 14, 2016 - link
LOL. Yeah, no.
silverblue - Thursday, January 14, 2016 - link
I'm not sure Intel's (or AMD's, for that matter) compute performance really means that much here. The A10-7890K should have similar compute performance to the Iris Pro 580, but they're hardly going to perform on an equal footing in the benchmarks listed here.
extide - Thursday, January 14, 2016 - link
Come on, where is at least a pipeline article about the A1100 launch today??
10101010 - Friday, January 15, 2016 - link
Anandtech is still waiting for Intel to write the article, so they can post it.
jjpcat@hotmail.com - Thursday, January 14, 2016 - link
One thing that has puzzled me for years is that all five of the top PC vendors disable the iGPU and replace it with a dirt-cheap dGPU which is, at best, marginally faster. At this bottom tier, I don't think a 20-30% graphics performance increase can justify the cost, power consumption, and noise associated with a dGPU. In one extreme case, Asus used an AMD dGPU to disable the AMD iGPU, even though in that case the iGPU has 2x the performance of the dGPU. I don't think I am the only person surprised by this seemingly stupid behavior from the PC vendors. When AMD started releasing "powerful" iGPUs, they thought the demand for low-end dGPUs would be greatly reduced, so they put little resource into those dGPUs. They later admitted in an earnings conference call that they were shocked that NVIDIA sold so many 610/615/620 cards.
This strange phenomenon is more prominent in Asia.
cfenton - Thursday, January 14, 2016 - link
That's just the result of marketing. Saying your computer has a 2GB discrete graphics card makes the average consumer think it's better than the computer sitting next to it that has Intel HD XXXX integrated graphics. A dedicated anything sounds better than integrated, even if it's not. It's the same reason you see those companies trying to sell $2000 gaming PCs with high-end i7s, 32GB of RAM, a giant (mechanical) drive, and then a GTX 950 or 960. Unless you are way into computer hardware, it's really difficult to tell what's good and what isn't, and the companies take advantage of that.
ToTTenTranz - Thursday, January 14, 2016 - link
I have to agree with most of the complaints here. Intel's point is moot if the price of their GT3e CPUs is the same as a lower-end CPU plus a much faster discrete GPU.
If they had a $150 Core i3 with a GT3e, they might have a point. Since their cheapest CPU with GT3e costs close to $300, their argument is ridiculous.
mdriftmeyer - Thursday, January 14, 2016 - link
I guess they think people have a hard time inserting a discrete GPU. Counting on the lazy factor.
Shadowmaster625 - Thursday, January 14, 2016 - link
So Intel releases a few SKUs that are so expensive you may as well just buy a real GPU and save some money, and with that they are now claiming that their ENTIRE line of woefully underpowered GPUs is equivalent to discrete. O...K... pull the other one now.
AdamF - Thursday, January 14, 2016 - link
This feels like sponsored content from Intel.
baobrain - Thursday, January 14, 2016 - link
Because it probably is.
10101010 - Friday, January 15, 2016 - link
Most everything on the big tech sites is sponsored content these days.
Gurbo - Thursday, January 14, 2016 - link
All I got from this article was: "Yes, our iGPUs barely outperform outdated and really low-end dGPUs that are manufactured on a process 2 to 3 steps older, depending on how you count them. Hey... who are those guys coming from far away? Oh, crap, it's Pascal and Polaris!!! Run, Iris!!!".
Cryio - Thursday, January 14, 2016 - link
Just so most of you know, the Iris 580 is supposed to be around 75% faster than the one in the i7-5775C, and basically faster than a desktop Nvidia GTX 750 non-Ti / 260 non-X.
Qwertilot - Thursday, January 14, 2016 - link
Those low-end GPUs are about to roughly double in performance with the node shrink though, so Intel needs that just to keep pace(!). It might at least push NV/AMD to actually put die-shrunk parts down the stack reasonably fast. (Especially for mobile.)
vcsg01 - Thursday, January 14, 2016 - link
Okay, Intel. If you are targeting consumers that buy low-end discrete cards, then why don't you sell an i3 CPU with Iris Pro for 70 bucks more than a normal i3? That makes a lot more sense than what they are doing. The kind of people that buy a $70 R7 240 aren't the same kind of people who spend $300 on a CPU.
pugster - Thursday, January 14, 2016 - link
Yeah, instead of buying one of their overpriced CPUs at around $370, you can buy a lower-end CPU and a better graphics card.
Gastec - Thursday, January 14, 2016 - link
On Amazon.com (US) the Intel Core i7-5775C can be bought for $402. The Core i7-6700K is $418. Both are ENTHUSIAST-grade CPUs and waaay overpriced. From the article, quoting Intel VP Gregory Bryant: “For the mainstream and casual gamer, we have improved our Iris and Iris Pro graphics tremendously,”
I believe that says it all; no further comment is needed.
iwod - Friday, January 15, 2016 - link
They are comparing their top-of-the-line CPU with the best graphics spec to a lower-end GPU. Do they even realize the cost of these Iris Pro CPUs?
r3loaded - Friday, January 15, 2016 - link
Mainstream nowadays is definitely 1080p at or close to 60 fps. Practically everyone I know has this resolution, and monitors are so cheap you can almost get them in cereal boxes. 1080p30 would be the entry level. You can get really cheap discrete cards that hit either performance level.
mobutu - Friday, January 15, 2016 - link
lol, Intel, in your dreams :) Also, should we mention the topic of drivers and their "gaming" quality? ;)
BrokenCrayons - Friday, January 15, 2016 - link
One thing I haven't had significant problems with since the 4500MHD is Intel's GPU drivers. The GMA 950 drivers were rough, but some of the graphical issues I ran into back then were related to incomplete implementation of various features in the actual hardware (probably intentionally left out due to a lack of real estate on the chipset's die, which was where the GPU resided before being moved to the CPU package with Arrandale... I think... whatever the first HD Graphics were, anyhow). In any event, Intel's drivers are pretty good these days and certainly very usable. The HD 3000 and Cherry Trail's iGPU have both behaved perfectly with all the games I throw at them.
nos024 - Friday, January 15, 2016 - link
Wow, what absolutely garbage logic and article. Also, EPIC FAIL: you have a $130 7870K completely destroying a $400 i7-6700K.
jasonelmore - Friday, January 15, 2016 - link
From the graphs, it's easy to see Intel is hitting a brick wall of diminishing returns. Their GPU architecture is not scaling well from the 4770K to the 6700K, despite having 40% more EUs and ALUs. It's time for them to start adding more cores or move to another architecture. Nvidia could wreck those numbers in the same GPU die space if they had access to the same lithography process. Same with AMD.
BrokenCrayons - Friday, January 15, 2016 - link
Most of that wall is due to the graphics processor relying almost exclusively on shared system memory. eDRAM, per-EU caching, and other tricks only partly mitigate the fundamental problem Intel faces when competing with graphics cards that have GDDR5 or HBM offering gobs of fast memory dedicated to the GPU's needs, versus what is basically a free-for-all cage brawl between various other system components all contending for access to the computer's RAM. I'm genuinely surprised that Intel has come as far as it has while being memory-constrained. You are probably right though. Once NV and AMD have access to a 14/16nm process, they'll close the transistor-size gap Intel currently enjoys. I think you'll see the tables quickly turn back in favor of budget GPUs outclassing Intel's iGPUs unless Intel somehow changes the situation again with their strange voodoo.
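For a sense of scale on that bandwidth gap, a rough calculation using typical published figures (dual-channel DDR3-1600 for the shared system memory and a GTX 750 Ti-class GDDR5 card; exact numbers vary by configuration):

# Peak theoretical bandwidth in GB/s = (bus width in bits / 8) * effective transfer rate in GT/s
def bandwidth_gbs(bus_bits, gtransfers_per_s):
    return bus_bits / 8 * gtransfers_per_s

shared_ram = bandwidth_gbs(128, 1.6)   # dual-channel DDR3-1600: ~25.6 GB/s, shared with the CPU
gddr5_card = bandwidth_gbs(128, 5.4)   # 128-bit GDDR5 at 5.4 GT/s (750 Ti-class): ~86.4 GB/s, dedicated
print(f"shared system RAM: ~{shared_ram:.1f} GB/s vs. dedicated GDDR5: ~{gddr5_card:.1f} GB/s")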
InternetLurker01 - Friday, January 15, 2016 - link
I think the point everyone is missing here is that the Iris 540 or the Iris Pro 580 is EXTREMELY attractive when it is put inside ultrabooks and other slim-profile notebooks, since these smaller notebooks typically can't house a dGPU anyway due to heat and space constraints. So the whole debate about whether Iris is better or worse than some of the cheaper dGPUs is not even relevant in this scenario. I'd much rather have Iris than nothing at all in my Surface Pro 4, for example. Personally, I'm very excited about this tech. A popular example is the new Dell Skylake XPS 13 Iris refresh that is coming up next month. The laptop was already a smash hit, and with the inclusion of Iris it's suddenly a VERY attractive solution for a mobile gamer like myself. I'm a semi-casual gamer that only plays games like League of Legends and CS:GO, and Iris is more than capable of running those games at acceptable settings.
I was contemplating whether or not I should get the larger Dell XPS 15, which is equipped with the 960M, but now I'm having second thoughts. Yes, the 960M will blow Iris out of the water performance-wise, and we've already established this. But the fact that I have options to achieve decent performance in a much smaller package is really nice. Not all of us care to run Crysis-level graphics, nor do we have an interest in playing games that look that nice.
lagittaja - Friday, January 15, 2016 - link
Yeah, it's nice and all. Sure, nice performance FROM THE HIGHEST-END SKUs. What most people have is GT1 and GT2, whose performance is fuck all compared to anything.
And about the desktop. Where the fuck are the GT3e and GT4e desktop models at a reasonable price? I would love to buy them. The Broadwell C models don't count.
Like, where's the Core i3 with Iris Graphics 540 (or 550)? That would sell like crazy. Or an i5 with Iris Pro Graphics 580. Or vice versa.
Intel is being all pompous: "look at the benchmarks of our highest-end iGPU SKU, looky look look." But the truth is, NOBODY has them.
Even if we look at laptops, on Geizhals.eu there are a whopping 22 laptops listed with Iris Pro graphics, out of 2600+ listed with an Intel iGPU.
LarsBars - Friday, January 15, 2016 - link
When I got into PC gaming, it was basically just me trying to see if the family desktop PC had enough power to play the game in question. With an Athlon XP and the similar-era GPU that came bundled from Compaq, there were definitely games that I could not even play. Battlefield 2, for example, REQUIRED DirectX 9 at the time. As in, the game wouldn't boot without DX9 support. TONS of the PCs at the time were not DX9-capable. Tons of my friends couldn't even afford any graphics card that was DX9-capable. One friend saved up for a GPU, thinking it would work, but Battlefield 2 still wouldn't even launch. Friends at school might encourage me to play a game with them, but it's hard to ask mom and dad not only for a game, but also for some expensive gizmo that goes into the family computer and doesn't provide any benefit other than playing that game. Not to mention how complex it gets when you find out your PSU can't push your new gizmo. What's happening now is that integrated graphics in the "family desktop PC" are becoming so powerful that those days are probably behind us. With an i3 or i5 (or AMD APU) in a Dell or HP family computer, you are automatically guaranteeing that gamers will be able to at least dip their toes into almost every modern game - enough at least to play, become addicted to, and start to budget for a GPU that can run the game well.
I used to joke with my friends about how someday we would click Start -> All Programs -> Accessories -> Games -> Battlefield 2, because it had unfathomably great graphics, and future computers would be so powerful they could run it as if it were Minesweeper.
Honestly, with this level of horsepower in iGPUs, that day is here.
Bring on the Zen / Polaris / HBM APUs!
junky77 - Monday, January 18, 2016 - link
Let's remember that this comparison to AMD's iGPUs is not really an iGPU vs. iGPU comparison but an ecosystem vs. ecosystem one. Give the iGPUs the same amount and speed of memory and remove the API overhead, and the picture will look quite different. And that's without talking about AMD's slower CPU part.
purefluke - Tuesday, January 19, 2016 - link
I think the main problem with this article is the fact that the processor it features is not available at most retailers despite its "launch" in Q2 2015. As far as I'm concerned, the 5775C is phantom technology - great on paper, but no one can actually buy one. With the release of Skylake, it's unlikely the inventory of 5775Cs will increase, particularly because it would undermine sales of the latest flagship processor.
ES_Revenge - Tuesday, January 19, 2016 - link
It's remarkable Intel has the audacity to talk about Broadwell CPUs and their Iris Pro integrated graphics when there are only two DT/socketed models available and Intel has never bothered to supply these processors to the Canadian/US market! Instead, if you actually want to buy one, you have to get it from one of various importers (typically of questionable legitimacy) at a price of $100-150 more than the suggested price. In other words, they are touting the features of Iris Pro on Broadwell, pretending they've got these great/improved integrated graphics "now", when in reality they are talking about vapourware! Meanwhile, while Skylake iGPUs have improved over the previous generation as well, none of the mainstream Skylake parts have Iris Pro either, and at best they are about half that speed.
Yeah, Intel, it's great that you've "improved Iris graphics tremendously", but where are these products in reality? Nowhere! And LOL at talking about GT4e on Skylake. What, on a rare handful of BGA/mobile i7 CPUs? Pathetic. It's basically just a bunch of nonsense to go on about how Iris Pro is so great when there are really no widely available or mainstream processors with the feature. Instead of making dumb press releases like this, why don't they put Iris Pro on regular i3s and i5s and *actually* make what they're saying a reality? No? Oh right, marketing fluff is a much better idea.
GuizmoPhil - Monday, February 8, 2016 - link
"Mainstream gamers" ? They are surely talking about Facebook flash based games gamers.Because when I take a look at the Steam statistics, sure doesn't look like any Steam gamers are mainstream. http://store.steampowered.com/hwsurvey/videocard/
According to Steam, 35% of gamers on Steam play at 1920x1080, and 29% at 3840x1080.
If "mainstream gamers" plays in 720, that means that they're only 1.37% ... And if you add the 13366x768 26%, they are 29%.
So "mainstream" is no longer the majority, but the minority? Plus, let'ss be frank, gamers who tend to play on 1366x768 aren't on a desktop, because 768p monitors are rare and 1080p are super affordable. Which means that they're on a laptop, because mainstream laptop manufacturer have been shipping most of their mainstream laptop with crappy screen resolutions for at least 10 years, and I'm not even talking about the glare of those horrible glossy screen!
So basically, Intel is aiming for 720p gaming, which means laptop gaming, on a desktop CPU?
I would rather have had some extra cache, more cores, or a CPU without an iGPU to cut down the price, instead of having to pay for this useless crap.
This type of iGPU should only be in Core i3 parts, low-voltage "Core iX-xxxxS" and ultra-low-voltage "Core iX-xxxxT" desktop CPUs, and laptop CPUs.
Full-TDP Core i5 and i7 parts shouldn't have this iGPU; they should have more cores or be cheaper.
Ommidiam - Sunday, February 14, 2016 - link
Remember that scene from American Psycho? He tries to make a reservation at the hottest restaurant in the city. The only thing he gets in return is maniacal laughter. Hangs up phone.