The Unlikely Legacy: AMD GFX8 Enters Its Fifth Year

With the release of Polaris 30 and the Radeon RX 590, something very interesting is taking place: AMD’s “GFX8” core graphics architecture is entering its fifth year. And in the process AMD is setting up GFX8 to be what’s likely the single longest-lived mainstream graphics architecture we've ever seen.

AMD’s first GFX8 GPU rolled out back in the summer of 2014 with the release of the Tonga GPU and the Radeon R9 285. And while somewhat unassuming and definitely underpromoted by AMD at the time – at a high level it was little more than a modernized Tahiti GPU, AMD’s first GCN GPU – the GFX8 graphics architecture has since become an unlikely staple of AMD’s GPU efforts. The cornerstone of both GCN 3 and GCN 4, it has been with us in the Radeon 200, 300, 400, and 500 series, and now is clearly set up to last well into 2019 (if not beyond) as part of Polaris 30 and the Radeon RX 590.

At this point, before I go too far down the rabbit hole, I should stop and clarify what GFX8 is, as I’m sure more than a few of you are asking “aren’t AMD’s GPU architectures named Graphics Core Next?” You would of course be correct, but this is where there’s a fine degree of architectural nuance that we often don’t get into, even in most AnandTech articles. And at this juncture, that nuance will be very helpful in explaining my amazement that GFX8, of all architectures, is primed to become the longest-lived graphics architecture.

Though we don’t hear about it from AMD as much these days as we did back in 2015/2016, a core part of AMD’s GPU strategy remains the idea of architectural blocks: the company separately develops the display controller, the memory controller, the core graphics processor, the geometry processor, and so on, such that these parts can be mixed and matched to a degree. This allows AMD’s semi-custom arm to offer a variety of options to customers – something successfully leveraged for the likes of the current-gen game consoles and the Intel-exclusive “Vega M” GPU – while also giving AMD the ability to upgrade its GPU designs in a piecemeal fashion.

This is something we saw in spades with the launch of the Polaris GPU family, where AMD even put out a very high-level slide listing the major parts of the GPU and the various bits they changed. And in fact, because it was so high level, that slide ended up overstating things in some cases: Polaris had a whole lot that was new to it, but it also borrowed a lot from earlier AMD architectures.

Each of AMD’s blocks has its own version numbering system, kept outside of the public eye and separate from how the company numbers the iterations of its Graphics Core Next architecture. If you dig through AMD’s developer tools and Linux kernel documentation long enough you’ll find all the parts, but this isn’t something that is meant to matter to consumers (or even most enthusiasts). Rather, AMD periodically bundles all the different blocks together and packages them as an iteration of Graphics Core Next, with any given version of GCN essentially setting a minimum standard for what version of a given block can be used.

AMD GFX IP Generations
GFX IP | GCN Version | Major Video Cards | Year Introduced
GFX6 | GCN 1 | Radeon HD 7970 | 2011
GFX7 | GCN 2 | Radeon R9 290X | 2013
GFX8 | GCN 3, GCN 4 (Polaris) | Radeon R9 285, Radeon R9 Fury X, Radeon RX 480/580/590 | 2014
GFX9 | GCN 5 (Vega) | Radeon RX Vega 64, Radeon Instinct MI60 | 2017
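For those curious where these internal generations actually surface, the closest public analogue is the set of processor targets in AMD’s compiler stack. Below is a small C++ lookup built on that idea; the target names (gfx802/gfx803/gfx810 for GFX8-class chips, gfx900/gfx906 for Vega) are drawn from LLVM’s AMDGPU backend documentation, and the exact groupings here should be treated as an illustrative sketch rather than an official AMD table.

```cpp
// Illustrative mapping of AMD's public GCN branding to the internal GFX IP
// levels and (assumed) LLVM/ROCm compiler target names. Not an official table.
#include <cstdio>

struct GfxGeneration {
    const char* gfx_ip;        // internal core-graphics IP level
    const char* gcn_marketing; // public "Graphics Core Next" branding
    const char* llvm_targets;  // compiler processor names (-mcpu=...)
    int         year;          // year the first product shipped
};

int main() {
    const GfxGeneration gens[] = {
        { "GFX6", "GCN 1",         "gfx600, gfx601",         2011 },
        { "GFX7", "GCN 2",         "gfx700, gfx701, gfx704", 2013 },
        { "GFX8", "GCN 3 / GCN 4", "gfx802, gfx803, gfx810", 2014 },
        { "GFX9", "GCN 5 (Vega)",  "gfx900, gfx906",         2017 },
    };

    for (const GfxGeneration& g : gens) {
        std::printf("%-5s (%s): first shipped %d, compiler targets: %s\n",
                    g.gfx_ip, g.gcn_marketing, g.year, g.llvm_targets);
    }
    return 0;
}
```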

The core of any GPU is of course its graphics and compute core – you won’t see AMD launch a new graphics core without also revving GCN to match – and this is where we get to GFX8. As you can probably guess from the name, GFX8 is the 8th iteration of AMD’s core graphics architecture. Meanwhile AMD also has GFX9, which is the heart of Vega. All of which is ultimately a longwinded way of saying that AMD has multiple core graphics architectures in flight at any given time, and that consequently AMD has tended to waver on how much they’re willing to promote a new graphics core, as they don’t want to undermine their existing products.

Anyhow, let’s talk about GFX8. Introduced in 2014, GFX8 is a bit of an oddity in that, in the consumer space, it’s something of a footnote. GFX7 already supported Direct3D feature level 12_0, so GFX8 didn’t bring anything new to the table in that regard. Instead the biggest updates were on the compute side of matters: GFX8 introduced new compute instructions and, while it wasn’t full-on Rapid Packed Math, support for 16-bit data types and associated instructions. That it was such a small update on the graphics side is why AMD was able to slip it into the Radeon 200 product stack late in its life, and then easily carry the Tonga GPU over into the 300 series as well. Similarly, GFX8 became the heart of the late-28nm Fiji GPU, AMD’s first High Bandwidth Memory product.
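To make the 16-bit angle a bit more concrete, here is a minimal C++ sketch that asks an OpenCL runtime whether the first GPU it finds advertises the cl_khr_fp16 extension – one common way 16-bit float support surfaces to software. Whether any particular driver actually exposes that extension on a GFX8 board is an assumption on my part, not something verified here.

```cpp
// Minimal sketch: query an OpenCL device's extension string and look for
// cl_khr_fp16. Assumes an OpenCL runtime is installed; link against OpenCL.
#define CL_TARGET_OPENCL_VERSION 120
#include <CL/cl.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    cl_platform_id platform;
    if (clGetPlatformIDs(1, &platform, nullptr) != CL_SUCCESS) return 1;

    cl_device_id device;
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr) != CL_SUCCESS) return 1;

    // Fetch the null-terminated extensions string in two calls: size, then data.
    size_t len = 0;
    if (clGetDeviceInfo(device, CL_DEVICE_EXTENSIONS, 0, nullptr, &len) != CL_SUCCESS || len == 0) return 1;
    std::vector<char> extensions(len);
    clGetDeviceInfo(device, CL_DEVICE_EXTENSIONS, len, extensions.data(), nullptr);

    const bool fp16 = std::strstr(extensions.data(), "cl_khr_fp16") != nullptr;
    std::printf("16-bit float support (cl_khr_fp16): %s\n", fp16 ? "yes" : "no");
    return 0;
}
```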


GFX8/GCN 3 Recap

Since 2014, GFX8 has gone on to live a long and productive life – a much longer one, at this point, than I was ever expecting. If anything I was expecting GFX8 to be short-lived; AMD was clearly building up to what would eventually become Vega. Instead GFX8 was carried into Polaris (GCN 4), something that AMD didn’t make very obvious at the time, and it has been a critical component of all Polaris GPUs since then. Including, of course, the new Polaris 30.

The end result is that if you wrote low-level shader code against Tonga’s ISA back in 2014, you can run it unchanged on Polaris 30 today. The use of GFX8 means that the entire span of products is ISA compatible, which is a remarkable development, since GPU ISAs are prone to changing every couple of years. GFX6 and GFX7 didn’t have this kind of shelf life, and it doesn’t look like GFX9 will have quite the same lifetime either. Even within NVIDIA’s stack, Maxwell would have needed to go another couple of years to keep pace.

Consequently, that GFX8 has lived for so long is a remarkable testament to AMD’s GPU design team; they built a solid, if unassuming, architecture, and it has stood the test of time. Technically it’s now on its second die shrink, having gone from 28nm to 14nm to 12nm, and it has even shown up in the oddest of places, like the not-quite-Vega “Vega M” GPU. If you had asked me back in 2014 or 2015 what architecture I thought would live the longest, GFX8 would not have been my answer.

The flip side, however, is that it does underscore AMD’s technical situation. This is a D3D feature level 12_0 part – meaning it lacks 12_1 features like conservative rasterization and rasterizer ordered views. That was fine back in 2014, but NVIDIA has been shipping 12_1 hardware since that same year, and AMD itself since 2017. So from one perspective, a brand-new Radeon RX 590 in 2018 is still lacking graphics features introduced by GPUs four years ago.
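To put that in developer terms, the gap shows up directly in Direct3D 12’s capability queries. The following is a minimal C++ sketch (Windows-only) that checks the two headline 12_1 features via ID3D12Device::CheckFeatureSupport; on a feature level 12_0 part like the RX 590 I would expect both to come back unsupported, though the exact result on any given driver is an assumption rather than something verified here.

```cpp
// Minimal sketch: query a D3D12 device for conservative rasterization and
// rasterizer ordered view (ROV) support. Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter at the D3D12 minimum feature level.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &opts, sizeof(opts))))
        return 1;

    std::printf("Conservative rasterization tier: %d\n",
                static_cast<int>(opts.ConservativeRasterizationTier));
    std::printf("Rasterizer ordered views:        %s\n",
                opts.ROVsSupported ? "supported" : "not supported");
    return 0;
}
```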

Ultimately however this is not a consumer concern, but more of a developer concern. The launch of a new feature level 12_0 GPU and card series – and one I expect will sell moderately well – means that the clock has been pushed back on developers being able to use 12_1 features as a baseline in their games. RX 590 cards are going to be around for a while even after their day is done, and developers will need to include support for them. All of which is going to make things very interesting once we reach the next generation of consoles, and multi-platform simplicity butts heads with PC compatibility.

Still, this goes to show that the only thing predictable about the GPU market is how unpredictable it is. We’re in uncharted territory here, and strangely enough it’s a core graphics architecture from 2014 that’s blazing the trail.

136 Comments

  • El Sama - Thursday, November 15, 2018

    To be honest I believe that the GTX 1070/Vega 56 is not that far away in price and should be considered as the minimum investment for a gamer in 2019.
  • Dragonstongue - Thursday, November 15, 2018

    over $600 for a single GPU V56, no thank you..even this 590 is likely to be ~440 or so in CAD, screw that noise.

    minimum for a gamer with deep pockets, maybe, but that is like the price of good cpu and motherboard (such as Ryzen 2700)
  • Cooe - Thursday, November 15, 2018

    Lol it's not really the rest of the world's fault the Canadian Dollar absolutely freaking sucks right now. Or AMD's for that matter.
  • Hrel - Thursday, November 15, 2018

    Man, I still have a hard 200 dollar cap on any single component. Kinda insane to imagine doubling that!

    I also don't give a shit about 3d, virtual anything or resolutions beyond 1080p. I mean ffs the human eye can't even tell the difference between 4k and 1080, why is ANYONE willing to pay for that?!

    In any case, 150 is my budget for my next GPU. Considering how old 1080p is that should be plenty.
  • igavus - Friday, November 16, 2018

    4k and 1080p look pretty different. No offence, but if you can't tell the difference, perhaps it's time to schedule a visit with an optometrist? Nevermind 4K, the rest of the world will look a lot better also if your eyes are okay :)
  • Great_Scott - Friday, November 16, 2018

    My eyes are fine. The sole advantage of 4K is not needing to run AA. That's about it.

    Anyone buying a card just so they can push a solid framerate on a 4K monitor is throwing money in the trash. Doubly so if they aren't 4->1 interpolating to play at 1K on a 4K monitor they needed for work (not gaming, since you don't need to game at 4K in the first place).
  • StevoLincolnite - Friday, November 16, 2018

    There is a big difference between 1080P and 4k... But that is entirely depending on how large the display is and how far you sit away from said display.

    Otherwise known as "Perceived Pixels Per Inch".

    With that in mind... I would opt for a 1440P panel with a higher refresh rate than 4k every day of the week.
  • wumpus - Saturday, November 17, 2018

    Depends on the monitor. I'd agree with you when people claim "the sweet spot of 4k monitors is 28 inches". Maybe the price is good, but my old eyes will never see it. I'm wondering if a 40" 4k TV will make more sense (the dot pitch will be lower than my 1080P, but I'd still likely notice lack of AA).

    Gaming (once you step up to the high end GPUs) should be more immersive, but the 2d benefits are probably bigger.
  • Targon - Saturday, November 17, 2018

    There are people who notice the differences, and those who do not. Back in the days of CRT monitors, most people would notice flicker with a 60Hz monitor, but wouldn't notice with 72Hz. I always found that 85Hz produced less eye strain.

    There is a huge difference between 1080p and 2160p in terms of quality, but many games are so focused on action that the developers don't bother putting in the effort to provide good quality textures in the first place. It isn't just about not needing AA as much as about a higher pixel density and quality with 4k. For non-gaming, being able to fit twice as much on the screen really helps.
  • PeachNCream - Friday, November 16, 2018

    I reached diminishing returns at 1366x768. The increase to 1080p offered an improvement in image quality mainly by reducing jagged lines, but it wasn't anything to get astonished about. Agreed that the difference between 1080p and 4K is marginal on smaller screens and certainly not worth the added demand on graphics power to push the additional pixels.
