Yesterday Apple unveiled its third-generation iPad, simply called the new iPad, at an event in San Francisco. The form factor remains mostly unchanged with a 9.7-inch display; however, the new device is thicker at 9.4mm vs. 8.8mm for its predecessor. The added thickness was necessary to support the new iPad's 2048 x 1536 Retina Display.

Tablet Specification Comparison

|  | ASUS Transformer Pad Infinity | Apple's new iPad (2012) | Apple iPad 2 |
| --- | --- | --- | --- |
| Dimensions | 263 x 180.8 x 8.5mm | 241.2 x 185.7 x 9.4mm | 241.2 x 185.7 x 8.8mm |
| Display | 10.1-inch 1920 x 1200 Super IPS+ | 9.7-inch 2048 x 1536 IPS | 9.7-inch 1024 x 768 IPS |
| Weight (WiFi) | 586g | 652g | 601g |
| Weight (4G LTE) | 586g | 662g | 601g |
| Processor (WiFi) | 1.6GHz NVIDIA Tegra 3 T33 (4 x Cortex A9) | Apple A5X (2 x Cortex A9, PowerVR SGX543MP4) | 1GHz Apple A5 (2 x Cortex A9, PowerVR SGX543MP2) |
| Processor (4G LTE) | 1.5GHz Qualcomm Snapdragon S4 MSM8960 (2 x Krait) | Apple A5X (2 x Cortex A9, PowerVR SGX543MP4) | 1GHz Apple A5 (2 x Cortex A9, PowerVR SGX543MP2) |
| Connectivity | WiFi, optional 4G LTE | WiFi, optional 4G LTE | WiFi, optional 3G |
| Memory | 1GB | 1GB | 512MB |
| Storage | 16GB - 64GB | 16GB - 64GB | 16GB |
| Battery | 25Whr | 42.5Whr | 25Whr |
| Pricing | $599 - $799 (est.) | $499 - $829 | $399, $529 |

Driving the new display is Apple's A5X SoC. Apple hasn't been too specific about what's inside the A5X other than to say it features "quad-core graphics," though upon further prodding Apple did confirm that there are two CPU cores inside the SoC. It's safe to assume the A5X still uses a pair of Cortex A9s, now paired with a PowerVR SGX543MP4 instead of the SGX543MP2 used in the iPad 2. The chart below gives an indication of the performance Apple expects from the A5X's GPU vs. the A5's:

Apple ran the PowerVR SGX 543MP2 in its A5 SoC at around 250MHz, which puts it at 16 GFLOPS of peak theoretical compute horsepower. NVIDIA claims the GPU in Tegra 3 is clocked higher than Tegra 2's, which ran at around 300MHz. In practice, Tegra 3 GPU clocks range from 333MHz on the low end in smartphones to as high as 500MHz in tablets. If we assume a 333MHz GPU clock for Tegra 3, that puts NVIDIA at roughly 8 GFLOPS, which rationalizes the 2x advantage Apple claims in the chart above. The real world performance gap isn't anywhere near that large, of course - particularly if you run on a device with a ~500MHz GPU clock (12 GFLOPS):

GLBenchmark 2.1.1 - Egypt - Offscreen (720p)

GLBenchmark 2.1.1's Egypt offscreen test pegs the PowerVR SGX 543MP2's advantage over Tegra 3 at just over 30%, at least at 1280 x 720. Based on the raw FP numbers for a 500MHz Tegra 3 GPU vs. a 250MHz PowerVR SGX 543MP2 (12 vs. 16 GFLOPS), a ~30% performance advantage is what you'd expect from a mostly compute-limited workload. It's possible that the gap could grow at higher resolutions or with a different workload; for example, the older GLBenchmark PRO results show a 2x gap in graphics performance:

GLBenchmark 2.1.1 - PRO - Offscreen (720p)
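
To make the raw-FP arithmetic above explicit, here's a minimal sketch of the peak-compute math, assuming each MAD (multiply-add) counts as two FLOPs. These are theoretical peaks, not measured throughput:

```python
# Peak theoretical compute: MADs per clock * 2 FLOPs per MAD * clock speed
def peak_gflops(mads_per_clock, clock_mhz):
    return mads_per_clock * 2 * clock_mhz / 1000.0

a5 = peak_gflops(32, 250)             # PowerVR SGX543MP2 in the A5: 16.0 GFLOPS
tegra3_phone = peak_gflops(12, 333)   # Tegra 3 at smartphone clocks: ~8.0 GFLOPS
tegra3_tablet = peak_gflops(12, 500)  # Tegra 3 at tablet clocks: 12.0 GFLOPS

print(f"A5 vs. 333MHz Tegra 3: {a5 / tegra3_phone:.2f}x")   # ~2.0x, Apple's claimed gap
print(f"A5 vs. 500MHz Tegra 3: {a5 / tegra3_tablet:.2f}x")  # ~1.33x, the ~30% Egypt gap
```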

For most real world gaming workloads I do believe that the A5 is faster than Tegra 3, but the advantage is unlikely to be 2x at non-Retina Display resolutions. The same applies to the A5X vs. Tegra 3 comparison: I fully expect a significant performance gap at the same resolution, but I doubt it reaches 4x in a game.

Mobile SoC GPU Comparison

|  | Apple A4 | Apple A5 | Apple A5X | Tegra 3 (max) | Tegra 3 (min) | Intel Z2580 |
| --- | --- | --- | --- | --- | --- | --- |
| GPU | PowerVR SGX 535 | PowerVR SGX 543MP2 | PowerVR SGX 543MP4 | GeForce | GeForce | PowerVR SGX 544MP2 |
| MADs per Clock | 4 | 32 | 64 | 12 | 12 | 32 |
| Clock Speed | 250MHz | 250MHz | 250MHz | 500MHz | 333MHz | 533MHz |
| Peak Compute | 2.0 GFLOPS | 16.0 GFLOPS | 32.0 GFLOPS | 12.0 GFLOPS | 8.0 GFLOPS | 34.1 GFLOPS |
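
The Peak Compute row follows directly from the same MADs x 2 x clock arithmetic; a quick sketch to verify it, with the per-GPU entries taken straight from the table above:

```python
# (MADs per clock, clock in MHz) for each SoC GPU in the table
gpus = {
    "Apple A4 (SGX535)":       (4, 250),
    "Apple A5 (SGX543MP2)":    (32, 250),
    "Apple A5X (SGX543MP4)":   (64, 250),
    "Tegra 3 (max)":           (12, 500),
    "Tegra 3 (min)":           (12, 333),
    "Intel Z2580 (SGX544MP2)": (32, 533),
}

for name, (mads, mhz) in gpus.items():
    print(f"{name}: {mads * 2 * mhz / 1000:.1f} GFLOPS")
# Note A5X (32.0) vs. Z2580 (34.1): four slower cores vs. two faster ones
```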

The A5X doubles GPU execution resources compared to the A5. Imagination Technologies' PowerVR SGX 543 is modular: you can expand performance by simply increasing the "core" count. Apple's own performance claims tell us all we need to know about clock speed: if 2x the execution resources deliver 2x the performance of the A5, Apple hasn't changed the GPU clock of the A5X.
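
That inference can be sanity-checked with a back-of-the-envelope model. Assuming GPU throughput scales linearly with both core count and clock (an idealization; real workloads rarely scale perfectly), Apple's claimed 2x speedup pins the A5X's GPU clock to the A5's 250MHz:

```python
# If perf ~ cores * clock, the claimed speedup fixes the new clock speed
a5_clock_mhz = 250       # SGX543MP2 clock in the A5
claimed_speedup = 2.0    # Apple's claimed A5X-vs-A5 GPU performance
core_ratio = 4 / 2       # SGX543MP4 cores vs. SGX543MP2 cores

a5x_clock_mhz = a5_clock_mhz * claimed_speedup / core_ratio
print(a5x_clock_mhz)     # 250.0 MHz -> the GPU clock is unchanged
```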

Assuming perfect scaling, I'd expect around a 2.6x performance gain over Tegra 3 in GLBenchmark (Egypt) at 720p - the A5's ~1.3x advantage, doubled. Again, not 4x, but hardly insignificant either. It can take multiple generations of GPUs to deliver that sort of performance advantage at a similar price point. Granted, Apple has no problem eating the cost of a larger, more expensive die, but that doesn't change the fact that the GPU advantage Apple will hold thanks to the A5X is generational.

I'd also point out that the theoretical GPU performance of the A5X is essentially identical to what Intel is promising with its Atom Z2580 SoC (32 vs. 34.1 GFLOPS). Apple arrives there with four SGX 543 cores, while Intel gets there with two SGX 544 cores running at roughly 2x the frequency (533MHz vs. 250MHz).

With the new iPad's Retina Display delivering 4x the pixels of the iPad 2, a 2x increase in GPU horsepower isn't enough to maintain performance. If you remember back to our iPad 2 review, however, the PowerVR SGX 543MP2 was largely overkill for its 1024 x 768 display, so a full 4x increase in GPU horsepower likely wasn't necessary to deliver a similar experience in games. Also keep in mind that memory bandwidth limitations will keep many titles from running at the new iPad's native resolution; remember that it takes huge GPUs with hundreds of GB/s of memory bandwidth to deliver high frame rates on 3 - 4MP PC displays. I'd expect many games to render at lower resolutions and scale up to fit the panel.
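
For a rough sense of the numbers involved, the sketch below works out the 4x pixel figure and a naive framebuffer-bandwidth floor. Real bandwidth demands are far higher once texturing, overdraw, and compositing are counted:

```python
# Pixel counts for the two panels
retina_px = 2048 * 1536   # 3,145,728 pixels (~3.1MP)
ipad2_px = 1024 * 768     # 786,432 pixels (~0.8MP)
print(retina_px / ipad2_px)        # 4.0 -> exactly 4x the pixels

# Naive lower bound: writing one 32-bit framebuffer per frame at 60 fps
bytes_per_frame = retina_px * 4    # ~12.6MB per frame
print(bytes_per_frame * 60 / 1e9)  # ~0.75 GB/s, before textures or overdraw
```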

What About the Display?

Performance specs aside, the iPad's Retina Display does look amazing. The 1024 x 768 panel in the older models was simply getting long in the tooth, and the Retina Display ensures Apple won't need to increase screen resolution for a very long time. Apple also increased the panel's color gamut by 44%, but the increase in resolution alone is worth the upgrade for anyone who spends a lot of time reading on their iPad. The photos below give you an idea of just how sharp text and graphics are on the new display compared to its predecessor (iPad 2, left vs. new iPad, right):

The improvement is dramatic in these macro shots but I do believe that it's just as significant in normal use. 

Apple continues to invest heavily in the aspects of its devices that users interact with most frequently, and spending a significant amount of money on the display makes a lot of sense. Kudos to Apple for pushing the industry forward here. The only downside is that supply of these greater-than-HD panels is apparently very limited, as Apple is buying up most of the production from as many as three different panel vendors. It will be a while before we see Android tablets with comparable resolutions, although we will see 1920 x 1200 Android tablets shipping in the first half of this year.

  • medi01 - Saturday, March 10, 2012 - link

    Don't iZombie much, please.

I keep my phone and tablet at the same distance; I guess I "hold it the wrong way" in Hypnosteve's book.

The point of "retina" was that the pixel density is so high that pixels are indistinguishable to the human eye at some magical distance (viewing distance matters a lot here).

Indeed, by playing with distance one could reduce resolution yet still claim "it's retina." But then one could apply that "retina" buzzword to many pieces of older hardware.

Off-screen benchmarks show customers no practical results and are only deceiving. Nobody uses a CPU/GPU on its own; it's used with a particular screen resolution, and decoupling them is just a way to deceive.
  • doobydoo - Monday, March 12, 2012 - link

How far you personally hold your tablet away is irrelevant. The "retina" term isn't about you. It's about a typical user, with typical vision, holding the tablet at a typical distance, being unable to distinguish pixels.

    Typical users DO hold tablets further away, so it's perfectly logical.

    By 'Playing with the distance' you could indeed claim anything is retina - but that would make your claim incorrect because people don't hold the device at that distance, on average. The consensus amongst scientists and tech experts is that people DO hold tablets at the distance required to make this display retina.

Off-screen benchmarks eliminate both resolution and v-sync as factors (v-sync in on-screen benchmarks is the only reason the iPad 2 was slower in any GPU benchmarks - it caps FPS). As a result, you get an accurate comparison of GPU performance. The "practical results" you describe are a very difficult metric to calculate. While you would seemingly advocate a raw FPS metric, that fails to take resolution into account.

    For example, is 100 FPS at 10 x 10 resolution better than 60 FPS at 2000 x 1000? Of course not.

Whichever way you look at it, the new iPad has a GPU that is up to 4x faster than the fastest Android tablet's. It also has the highest resolution. Any games designed to run at that resolution will be tested to make sure they hit a playable FPS, so the "real world" performance will be both higher resolution and just as fast as on any Android tablet.

    You seem to be completely bitter and unable to admit Apple has the technological lead right now.
  • seanleeforever - Monday, March 12, 2012 - link

I didn't realize my two-year-old 1080p 65-inch TV was a "retina" display.
  • Michiel - Friday, March 9, 2012 - link

    Envy eats you alive. Go see a shrink !
  • medi01 - Saturday, March 10, 2012 - link

    Oh, sorry, I've forgotten it's a status thing.
People paying 20-50 Euros less for a Samsung Galaxy obviously cannot afford these über-revolutionary devices, hence they could only envy.
  • ripshank - Sunday, March 11, 2012 - link

    medi01: So sad. Your remarks only show your insecurity to the world.

    Relax, breathe and just let others enjoy their gadgets of choice rather than resorting to name calling and mockery. Realize these are friggin gadgets, not politics or religion. But from your comments, it's like Apple killed your family, took away your job and stole your wife.

    What is wrong with the world today when people get so worked up over an object?
  • medi01 - Sunday, March 11, 2012 - link

    Ad hominem, eh?

    There is nothing wrong with objecting to lies.

Reviewers "forgetting the iPhone in their pocket" for comparison photos where it would look pale, including NVIDIA's cherry-picked card vs. AMD's stock card at the marketing department's request, and "off-screen benchmarks" all over the place aren't simply bad - it stinks.
  • stsk - Monday, March 12, 2012 - link

    Seriously. Seek help.
  • doobydoo - Monday, March 12, 2012 - link

    1 - There is something wrong with objecting to lies INCORRECTLY. That's your own failing.

    2 - Ad hominem? I'll never understand why you Americans try to use that phrase all the time, as well as 'Straw man' - it not only makes you sound pretentious, trying to sound more intelligent than you are, it's also hypocritical:

    'Don't iZombie much, please.'

    Just say 'insults' - jeez.

    3 - Off-screen benchmarks are used by impartial review sites, as I explained above, because that is the only way to properly compare GPU performance. On-screen benchmarks have different resolutions and are limited by v-sync.

    4 - Claims of conspiracies on photos is just ridiculous.
  • Greg512 - Monday, March 12, 2012 - link

    "you Americans"

    Way to be a pretentious hypocrite.
