The ability to cram more and more transistors into a die has made it possible to put both the CPU and GPU on the same piece of silicon. Intel's GPUs have traditionally catered to entry-level consumers and have often been deemed good enough for basic HTPC use. AMD introduced their own CPU + GPU combination in the Llano series last year. While AMD does have a better GPU architecture in-house, they could not integrate their best possible GPU for fear of cannibalizing their mid-range GPU sales. The result was that Llano, while pretty decent for HTPC use, didn't excite us enough to recommend it wholeheartedly.

Today, Intel is taking on AMD's Llano with a revamped integrated GPU. We have traditionally not been kind to Intel in our HTPC reviews because of the lack of proper drivers and open source software support. Things took a turn for the better with Sandy Bridge. One of Intel's engineers took it upon himself to bring reliable hardware decoding support to Intel platforms with the QuickSync decoder.
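
For readers who want to verify that the QuickSync decode path is actually being exercised on their own setup, an ffmpeg build compiled with QSV support offers a quick sanity check from the command line. The sketch below is purely illustrative: the clip name is a placeholder, and this is not the decode chain we use in our testing.

```python
# Illustrative sketch only: decode a clip through ffmpeg's QuickSync (QSV)
# H.264 decoder and report timing. Requires an ffmpeg build with QSV enabled;
# "sample_1080p24.mkv" is a placeholder file name.
import subprocess

def qsv_decode_benchmark(path: str) -> str:
    """Decode the clip with h264_qsv, discard the frames, return ffmpeg's stats."""
    cmd = [
        "ffmpeg", "-benchmark",      # print utime/rtime when the run finishes
        "-hwaccel", "qsv",           # request QuickSync hardware acceleration
        "-c:v", "h264_qsv",          # use the QSV H.264 decoder for the input
        "-i", path,
        "-f", "null", "-",           # decode only; no output file is written
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.stderr             # ffmpeg writes progress and stats to stderr

if __name__ == "__main__":
    print(qsv_decode_benchmark("sample_1080p24.mkv"))
```

Comparing the reported decode time against a software-only run (drop the two QSV options) gives a rough sense of how much work is being offloaded to the GPU.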

As a tech journalist in the HTPC space, I spend quite a bit of time on forums such as Doom9 and AVSForum, where end users and developers interact with each other. The QuickSync developer's proactive engagement with end users was something previously sorely lacking on Intel's side. We have seen various driver issues get quashed over the last few releases, thanks to this new avenue of communication between Intel and consumers.

With Ivy Bridge, we are getting a brand new GPU with more capabilities. Given the recent driver development history, even advanced HTPC users could be pardoned for thinking that Ivy Bridge would make a discrete HTPC GPU redundant. Video post-processing quality is subjective, but that shouldn't prevent us from presenting pictorial results for readers to judge. One of the most talked-about issues with the Intel GPU for HTPC purposes is the lack of proper 23.976 Hz display refresh rate support. Does this get solved in Ivy Bridge?
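
For context, the arithmetic behind that complaint is straightforward: film content runs at 24000/1001 fps (~23.976), so any mismatch between that rate and the display's actual refresh rate forces the renderer to periodically repeat or drop a frame. A minimal sketch of the calculation, using illustrative refresh-rate values rather than measurements from our testbed:

```python
# Minimal sketch of the 23.976 Hz refresh rate problem. Film content runs at
# 24000/1001 fps; any mismatch with the display's refresh rate means the video
# renderer must repeat or drop a frame at regular intervals. The refresh rates
# below are illustrative values, not measurements from our testbed.
FILM_FPS = 24000 / 1001  # ~23.976024 fps

def seconds_between_glitches(refresh_hz: float) -> float:
    """Average seconds between repeated/dropped frames at a given refresh rate."""
    delta = abs(refresh_hz - FILM_FPS)
    return float("inf") if delta == 0 else 1.0 / delta  # an exact match never glitches

for refresh_hz in (24.000, 23.972, 23.976):
    print(f"{refresh_hz:.3f} Hz -> one repeated/dropped frame every "
          f"{seconds_between_glitches(refresh_hz):,.0f} seconds")
```

At a 24.000 Hz output the cadence break comes roughly every 42 seconds, which is why it is so visible during slow pans; an output within a few thousandths of a hertz of 24000/1001 stretches the interval to minutes or hours.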

In this review, we present our experience with Ivy Bridge as an HTPC platform, using a Core i7-3770K (with Intel HD Graphics 4000). In the first section, we tabulate our testbed setup and detail the tweaks made in the course of our testing. A description of our software setup and configuration is also provided. Following this, we have the results from the HQV 2.0 benchmark and some pictorial evidence of the capabilities of the GPU drivers. A small section devoted to custom refresh rates is followed by some decoding and rendering benchmarks. No HTPC solution is completely tested without a look at its network streaming capabilities (Adobe Flash and Microsoft Silverlight performance). In the final section, we cover miscellaneous aspects such as power consumption and then proceed to the final verdict.

Testbed and Software Setup
Comments

  • Exodite - Tuesday, April 24, 2012

    Anyone who says they can tell any difference between a 65% and 95% color gamut is a whiny bitch.

    See, I can play that game too!

    Even if I were to buy your "factual" argument, and I don't, I've clearly stated that I care nothing about the things you consider advantages.

    I sit facing the center of my display, brightness and gamma are turned down to minimum levels, and saturation is low. Measured power draw at the socket is 9W.

    It's a 2ms TN panel, obviously.

    All I want is more vertical space at a reasonable price, though a 120Hz display would be nice as well.

    My friend is running a 5ms 1080p eIPS display and between that and what I have I'd still pick my current display.

    End of the day it's personal preference, which I made abundantly clear in my first post.

    Though it seems displays, and IPS panels in general, are starting to attract the same amount of douchiness as the audiophile community.
  • Old_Fogie_Late_Bloomer - Tuesday, April 24, 2012

    Oh, I know I shouldn't--REALLY shouldn't--get involved in this. But you would have to be monochromatically colorblind in order to not see the difference between 65% and 95% color gamut.

    I'm not saying that the 95% gamut is better for everyone; in fact, unless the 95% monitor has a decent sRGB setting, the 65% monitor is probably better for most people. But to suggest that you have to be a hyper-sensitive "whiny b---h" to tell the difference between the two is to take an indefensible position.
  • Exodite - Tuesday, April 24, 2012

    Yeah, you shouldn't have gotten into this.

    Point being that whatever the difference is, I bet you the same can be said about latency.

    Besides, as I've said from the start it's about the things that you personally appreciate.

    My preferred settings absolutely destroy any kind of color fidelity anyway, and that doesn't even slightly matter as I don't work with professional imagery.

    But I can most definitely appreciate the difference between TN and even eIPS when it comes to gaming. And I consider the former superior.

    I don't /mind/ higher color fidelity or better viewing angles, I'm just sure as hell not going to pay any extra for it.
  • Old_Fogie_Late_Bloomer - Wednesday, April 25, 2012

    I agree completely that, as you say, "it's about the things you personally appreciate." If you have color settings you like that work on a TN monitor that you can stand to deal with for long periods of time without eye strain, I would never tell you that you should not use them because they don't conform to some arbitrary standard. Everybody's eyes and brain wiring are different, and there are plenty of reasons why people use computers that don't involve color accuracy.

    But as it happens, you picked a poor counterexample, because I defy you to put a Dell U2412M (~68% of aRGB) next to a U2410 set to aRGB mode (somewhere close to 100% of aRGB) and tell me you can't see a difference.

    For that matter, I challenge you to find me someone who literally can't see the difference between the two in terms of color reproduction. That person will have something seriously wrong with their color vision.
  • Exodite - Wednesday, April 25, 2012

    To be fair, the counterexample wasn't about being correct, because the poster I replied to wasn't, but rather about showing what an asshat argument he was making.

    That said it's about the frame of reference.

    Would you be able to tell the difference working with RAW images pulled from your DSLR or other high-quality imagery?

    Sure, side-by-side I have no doubt you would.

    Would you be able to tell the difference when viewing the desktop, a simple web form, or an editor where the only colors are black, white, two shades of blue, and grey?

    Especially once both displays are calibrated to the point I'm comfortable with them. (Cold hue, 0% brightness, low saturation, negative gamma, high contrast.)

    I dare say not.
  • DarkUltra - Monday, April 30, 2012

    I'd like to see a "blind" test on this. Is there a perceived difference between 6ms and 2ms? Blind as in the test subjects (nyahahaa) do not know which response time they are looking at.

    Test with both a 60Hz and a 120Hz display. I would guess the moving object, an explorer window for instance, would simply be easier to look at and appear less blurred as it moves across the screen. People used to fast-paced gaming on CRT monitors or "3D ready" 120Hz monitors would see more of a difference.
  • Origin32 - Saturday, April 28, 2012

    I really don't see any need for improvement in video resolution just yet. I myself have nearly perfect eyesight and can be extremely annoyed by artifacts, blocky compression, etc., but I find 720p to be detailed enough even for action movies that rely solely on special effects. In most movies 1080p appears too sharp to me; add to that the fact that most movies are already oversharpened and post-processed, plus the increased bitrate (and therefore file size) of 1080p, and I see more downside than upside to it.
    This all goes double for 4K video.

    That being said, I do still want 4K badly for gaming, viewing pictures, and reading text; there are tons of things it'll be useful for.
    But not for film, not for me.
  • Old_Fogie_Late_Bloomer - Monday, April 23, 2012

    Another advantage of a 4K screen (one that has at least 2160 vertical resolution) is that you could have alternating-line passive 3D at full 1080p resolution for each eye. I'm not an expert on how this all works, but it seems to me that the circular polarization layer is a sort of afterthought for the LCD manufacturing process, which is why vertical viewing angles are narrow (there's a gap between the pixels and the 3D polarizing layer).

    In my opinion, it would be pretty awesome if that layer were integrated into the panel in such a way that vertical viewing angles weren't an issue, and so that any monitor is basically a 3D monitor (especially high-quality IPS displays). But I don't really know how practical that is.
  • peterfares - Thursday, September 27, 2012

    a 2560x1600 monitor (available for years) has 1.975 times as many pixels as a 1920x1080 screen.

    4K would be even better, though!
  • nathanddrews - Monday, April 23, 2012

    4K is a very big deal for a couple reasons: pixel density and film transparency.

    From the perspective of pixel density, I happily point to the ASUS Transformer 1080p, iPad 3, and any 2560x 27" or 30" monitor. Once you go dense, you never go... back... Anyway, as great as 1080p is, as great as Blu-ray is, it could be so much better! I project 1080p at about 120" in my dedicated home theater - it looks great - but I will upgrade to 4K without hesitation.

    Which leads me to the concept of film transparency. While many modern movies are natively being shot in 4K using RED or similar digital cameras, the majority are still on good ol' 35mm film. 4K is considered by most professionals and enthusiasts to be the baseline for an excellent transfer of a 35mm source to the digital space - some argue 6K-8K is ideal. Factor in 65mm, 70mm, and IMAX and you want to scan your original negative in at least 8K to capture all the fine detail (as far as I know, no one is professionally scanning above 8K yet).

    Of course, recording on RED4K or scanning 35mm at 4K or 8K is a pointless venture if video filtering like noise reduction or edge enhancement is applied during the mastering or encoding process. Like smearing poop on a diamond.

    You can't bring up "normal" people when discussing the bleeding edge. The argument is moot. Those folks don't jump on board for any new technology until it hits the Walmart Black Friday ad.
