On-Package PCH, The First Single Chip Haswell

In 2010, with Clarkdale and Arrandale, Intel went from a 3-chip platform solution (CPU, IOH/MCH, ICH) down to a 2-chip platform (CPU + PCH). With Haswell, we see the first instantiation of a single-chip Core platform.

With the 8-series chipset, Intel moved from the 65nm process used for the 7-series chipset down to 32nm, skipping 45nm entirely. An older, less mobile-focused Intel would have kept its chipsets on the oldest economically sensible node possible, but these days things are different. The move to 32nm cuts TDP down considerably. Intel hasn’t publicly documented the power consumption of any of its ultra mobile chipsets, but going from QM77 to QM87 we see a 34% decrease in TDP.
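
As a quick sanity check of that figure, here’s a minimal sketch in Python; the 4.1W (QM77) and 2.7W (QM87) TDPs are assumed from Intel’s published chipset specs rather than stated here:

    # Assumed TDPs for the 7-series and 8-series mobile PCHs (not from this article)
    qm77_tdp_w = 4.1
    qm87_tdp_w = 2.7
    decrease = (qm77_tdp_w - qm87_tdp_w) / qm77_tdp_w
    print(f"TDP decrease: {decrease:.0%}")  # ~34%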

In Haswell desktop and standard voltage mobile parts, the 8-series chipset remains an off-chip solution in a discrete package. With Haswell ULT and ULX (U- and Y-series SKUs), the 8-series PCH (Platform Controller Hub) moves on-package. Since it’s on-package, the TDP of the PCH is included in the overall TDP of the processor.

Bringing the PCH on-package not only saves space on the motherboard, but it also reduces the power needed to communicate with the chip. Signals no longer have to travel off die, through the package, via traces on the motherboard to the PCH. Instead you get much lower power on-package communication.

Intel also changed the interface between the CPU and PCH, moving from DMI to a new on-package interface (OPI). Presumably OPI is designed for much lower power operation.

Although PCIe support remains on the PCH (six PCIe 2.0 lanes), there’s no external PCIe interface from the CPU itself. Any hope of pairing a meaningfully high-performance discrete GPU with Haswell ULT is dead. We didn’t see a ton of Ivy Bridge Ultrabooks with discrete GPUs, but the option simply won’t exist this time around. All of a sudden the creation of Intel’s 28W Haswell ULT with GT3 graphics makes a lot more sense. Haswell ULT also lacks native VGA support. Update: NVIDIA tells me that it fully supports running a dGPU off of an x4 connection to the PCH. It's not the ideal solution, but discrete GPUs will still technically be possible with Haswell ULT.
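
To put that x4-via-PCH option in perspective, here’s a rough back-of-the-envelope bandwidth comparison; the per-lane rates are standard PCIe 2.0/3.0 figures, not numbers from this article:

    # Approximate usable bandwidth per direction, per lane (GB/s)
    pcie2_lane_gbps = 0.5    # PCIe 2.0: 5 GT/s with 8b/10b encoding, ~500 MB/s
    pcie3_lane_gbps = 0.985  # PCIe 3.0: 8 GT/s with 128b/130b encoding, ~985 MB/s

    x4_via_pch   = 4 * pcie2_lane_gbps    # what a dGPU gets off Haswell ULT's PCH
    x16_from_cpu = 16 * pcie3_lane_gbps   # what desktop/standard voltage Haswell offers
    print(f"x4 PCIe 2.0 via PCH:  {x4_via_pch:.1f} GB/s")   # 2.0 GB/s
    print(f"x16 PCIe 3.0 off CPU: {x16_from_cpu:.1f} GB/s") # 15.8 GB/s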

Intel adds SDIO support, and USB 3.0 and 6Gbps SATA are both there as well, although with fewer maximum ports than the desktop PCH (up to 4 and 3, respectively). There’s also a lot more sharing of bandwidth between individual PCIe lanes and USB/SATA ports. These limits shouldn’t be an issue given the port/drive configuration of most Ultrabooks.

Comments

  • StealthGhost - Sunday, June 9, 2013 - link

    Is PCMark 8 Home less demanding than 1080p video? If not, it doesn't seem like the Asus can call itself an Ultrabook.

    Still, impressive gains in battery life. I hope this will carry over, to some extent at least, into the CPUs that will be in laptops like the MacBook Pros. Guess we will have to see when that review comes out =)
  • StealthGhost - Sunday, June 9, 2013 - link

    *Acer
  • Death666Angel - Sunday, June 9, 2013 - link

    As far as I know the Ultrabook specs don't define a minimum luminance for the display. So Anand's test isn't relevant for the Ultrabook spec.
  • axien86 - Sunday, June 9, 2013 - link

    Laptopmag just reviewed the Sony Vaio based on the i7-4500U, and they also noted that, performance-wise, Haswell offered marginal improvement over Ivy Bridge laptops, while battery life was improved.

    One important characteristic that most laptop users want to know about is heat and fan noise. Laptopmag found that the back of the Haswell i7-4500U-based Sony Vaio "reached a more troubling 110 degrees." They also found fan noise was definitely noticeable in a quiet room just running Word and a YouTube video.

    http://www.laptopmag.com/reviews/laptops/sony-vaio...

    The question is, why hasn't Anandtech run extensive temperature and noise characteristic analysis on Intel Haswell processors and your current laptop?
  • Kristian Vättö - Sunday, June 9, 2013 - link

    Likely because of a lack of time. As Anand said in the article, he got the unit while at Computex, so he basically did the review between meetings. I'm pretty sure he didn't have the equipment with him to do temperature or noise tests properly, but I'm sure this is something that will be tested in later reviews.

    Also, keep in mind that heat and fan noise are system-dependent. If the Vaio has poor cooling, then it will be hot and loud but that's not Haswell's fault (I'm not saying this is the case but we obviously need more reviews before any conclusions can be drawn).
  • mavere - Sunday, June 9, 2013 - link

    "The question is, why hasn't Anandtech run extensive temperature and noise characteristic analysis on Intel Haswell processors and your current laptop?"

    The WHr of battery that a laptop consumes is the exact same thing as its heat output.

    Anand is trying to test the Haswell platform, not the design deficiencies of any specific manufacturer. Sony's problems are its own.
  • n13L5 - Sunday, June 9, 2013 - link

    Cause they had to run their test in Taiwan with a borrowed machine?

    I was rather disturbed by the temperature of Sony's Vaio Pro, as well as the instability of the carbon case, which must be paper thin... And their keyboard backlight has the worst bleed I've ever seen, they should have a look at Lenovo and Samsung keyboards...

    Well, I'll be waiting for more tests and the ULT Haswell with the Iris 5100 GPU. Maybe that'll rack up a high enough temperature to fry a steak.
  • akbisw - Monday, June 10, 2013 - link

    Sony has terrible cooling systems in their laptops in general. The worst I have seen from any manufacturer. Even my old Acer before their "reiteration" had a better cooling design than current VAIO systems.
  • Dnann - Friday, June 14, 2013 - link

    He probably didn't have the time considering the circumstances...
  • DanNeely - Sunday, June 9, 2013 - link

    The fact that companies historically known for horrible battery life are getting good results out of Haswell with Intel's help is promising; but starting from such a horribly low baseline makes getting gains much easier. I'm much more interested in seeing what companies who've historically offered good battery run times will be able to do with Haswell.
