Late last month, Intel dropped by my office with a power engineer for a rare demonstration of its competitive position versus NVIDIA's Tegra 3 when it comes to power consumption. Like most companies in the mobile space, Intel doesn't rely solely on device-level power testing to determine battery life. To ensure that the CPU, GPU, memory controller and even NAND are all as power efficient as possible, these companies measure power consumption directly on a tablet or smartphone motherboard.

The process would be a piece of cake if you had measurement points already prepared on the board, but in most cases Intel (and its competitors) are taking apart a retail device and hunting for a way to measure CPU or GPU power. I described how it's done in the original article:

Measuring power at the battery gives you an idea of total platform power consumption including display, SoC, memory, network stack and everything else on the motherboard. This approach is useful for understanding how long a device will last on a single charge, but if you're a component vendor you typically care a little more about the specific power consumption of your competitors' components.

What follows is a good mixture of art and science. Intel's power engineers will take apart a competing device and probe whatever looks to be a power delivery or filtering circuit while running various workloads on the device itself. By correlating the type of workload to spikes in voltage in these circuits, you can figure out what components on a smartphone or tablet motherboard are likely responsible for delivering power to individual blocks of an SoC. Despite the high level of integration in modern mobile SoCs, the major players on the chip (e.g. CPU and GPU) tend to operate on their own independent voltage planes.
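The rail-attribution step described above can be sketched in code. This is a purely illustrative simulation (the sample rate, signal levels and noise figures are assumptions, not Intel's actual numbers): if a probed rail's voltage tracks the on/off schedule of a CPU-only workload, that rail very likely feeds the CPU block.

```python
import numpy as np

# Hypothetical sketch: attribute a probed rail to the CPU block by checking
# whether its sampled voltage tracks a CPU-only workload's on/off schedule.
fs = 1000                       # DAQ sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)    # ten seconds of samples
workload = (t % 2) < 1          # CPU burn loop alternating 1 s on / 1 s off

rng = np.random.default_rng(0)
# Simulated voltage drop on the probed rail: responds to the workload,
# plus measurement noise.
rail_mv = 5.0 * workload + rng.normal(0.0, 0.5, t.size)

# A rail that powers the CPU should correlate strongly with the schedule;
# an unrelated rail (e.g. the display's) would show correlation near zero.
corr = np.corrcoef(rail_mv, workload.astype(float))[0, 1]
print(f"correlation with CPU workload: {corr:.2f}")
```

In practice the engineers repeat this with GPU-only, memory-heavy and idle workloads until each candidate rail is matched to an SoC block.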

A basic LC filter

What usually happens is you'll find a standard LC filter (inductor + capacitor) supplying power to a block on the SoC. Once the right LC filter has been identified, all you need to do is lift the inductor, insert a very small resistor (2 - 20 mΩ) and measure the voltage drop across the resistor. With voltage and resistance values known, you can determine current and power. Using good external instruments (NI USB-6289) you can plot power over time and now get a good idea of the power consumption of individual IP blocks within an SoC.

Basic LC filter modified with an inline resistor
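The arithmetic behind the shunt-resistor measurement is simple Ohm's law. Here is a minimal sketch; the function name and the example values (a 10 mΩ shunt, a 1.1 V CPU rail) are illustrative, not taken from the article's data:

```python
def shunt_power(v_drop_mv, r_shunt_mohm, v_rail):
    """Current through, and power delivered past, a sense resistor.

    v_drop_mv:    measured voltage drop across the shunt (mV)
    r_shunt_mohm: shunt resistance (mOhm)
    v_rail:       supply rail voltage at the load (V)
    """
    i_amps = v_drop_mv / r_shunt_mohm   # I = V / R (mV over mOhm gives A)
    p_watts = i_amps * v_rail           # P = I * V_rail
    return i_amps, p_watts

# e.g. a 5 mV drop across a 10 mOhm shunt on a 1.1 V CPU rail
i, p = shunt_power(5.0, 10.0, 1.1)     # i = 0.5 A, p ~= 0.55 W
```

A data-acquisition unit like the NI USB-6289 simply performs this voltage measurement thousands of times per second, turning the calculation above into a power-over-time plot for the IP block behind the shunt.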

The previous article focused on an admittedly not-too-interesting comparison: Intel's Atom Z2760 (Clover Trail) versus NVIDIA's Tegra 3. After much pleading, Intel returned with two more tablets: a Dell XPS 10 using Qualcomm's APQ8060A SoC (dual-core 28nm Krait) and a Nexus 10 using Samsung's Exynos 5 Dual (dual-core 32nm Cortex A15). What was a walk in the park for Atom all of a sudden became much more challenging. Both of these SoCs are built on very modern, low-power manufacturing processes, and Intel no longer has a performance advantage over the Exynos 5.

Just like last time, I calibrated all displays to our usual 200 nits setting and kept the software and configurations as close to equal as possible. Both tablets were purchased at retail by Intel, but I verified their performance against our own samples/data and noticed no meaningful deviation. Since I don't have a Dell XPS 10 of my own, I compared performance to the Samsung ATIV Tab and confirmed that things were at least performing as they should.

We'll start with the Qualcomm based Dell XPS 10...

Modifying a Krait Platform: More Complicated
Comments Locked



  • djgandy - Friday, January 4, 2013 - link

    People care about battery life though. If you can run faster and go idle lower you can save more power.

    The next few years will be interesting, and once everyone is on the same process there will be fewer variables in determining who has the most efficient SoC.
  • DesktopMan - Friday, January 4, 2013 - link

    "and once everyone is on the same process"

    Intel will keep their fabs, so unless everybody else suddenly starts using theirs it doesn't look like this will ever happen. Even at the same transistor size there are large differences between fab methods.
  • jemima puddle-duck - Friday, January 4, 2013 - link

    Everyone cares about battery life, but it would take orders of magnitude of improvement for people to actually go out of their way and demand it.
  • Wolfpup - Friday, January 4, 2013 - link

    No it wouldn't. People want new devices all the time with far less.

    And Atom swaps in for ARM pretty easily on Android, and is actually a huge selling point on the Windows side, given it can just plain do a lot more than ARM.
  • DesktopMan - Friday, January 4, 2013 - link

    The same power tests during hardware based video playback would also be very useful. I'm disappointed in the playback time I get on the Nexus 10, and I'm not sure if I should blame the display, the SOC, or both.
  • djgandy - Friday, January 4, 2013 - link

    It's probably the display. Video decode usually shuts most things off except the video decoder. Anand has already done Video decode analysis in other articles.
  • jwcalla - Friday, January 4, 2013 - link

    You can check your battery usage meter to verify, but... in typical usage, the display takes up by far the largest swath of power. And in standby, it's the wi-fi and cell radios hitting the battery the most.

    So SoC power efficiency is important, but the SoC is rarely the top offender.
  • Drazick - Friday, January 4, 2013 - link

    Why don't you keep it updated?
  • iwod - Friday, January 4, 2013 - link

    I don't think anyone, or at least any AnandTech reader with some technical knowledge, has ever doubted what Intel is able to come up with: a low-power SoC with similar or even better performance in both respects. Give it time and Intel will get there. I don't think anyone should disagree with that.

    But I don't think that is Intel's problem at all. It is how they are going to sell this chip when Apple and Samsung are making their own for less than $20. Samsung owns nearly a majority of the Android market, which means there is zero chance of them using an Intel SoC, since they design AND manufacture their chips all by themselves. And while Samsung owns the top end of the market, the lower end is being filled by EVEN cheaper ARM SoCs.

    So while Intel may have the best SoC five years down the road, I just don't see how they fit into the smartphone market. (Tablets would be a different story, and there they should do alright....)
  • jemima puddle-duck - Friday, January 4, 2013 - link

    Exactly. Whilst I enjoy reading these articles, it sometimes feels like a "how many angels can dance on the head of a pin" argument. Everyone knows Intel will come up with the fastest processor eventually. But why are we always told to wait for the next generation? It's just PR. Enjoyable PR, but PR nonetheless.
