28 Comments
QuantumPion - Tuesday, November 18, 2014 - link
Am I correct in understanding from the article that MFAA does anti-alias transparent textures, where MSAA does not? If this is the case, could additional performance be gained by using MFAA instead of MSAA+FXAA (particularly in BF4/DA:I)?

JarredWalton - Tuesday, November 18, 2014 - link
Yes, from my understanding (and looking at some screenshots), MFAA can help reduce jaggies in transparent textures. As for increased performance relative to MSAA+FXAA, I believe the performance hit from FXAA is quite small, so I'm not sure how much you'd gain by doing MFAA vs. MSAA+FXAA; however, as you can see from the charts, 2xMFAA is generally far less demanding than 4xMSAA.

rtho782 - Tuesday, November 18, 2014 - link
Aaaand they still haven't fixed DSR in combo with SLI and G-Sync.

I have a RoG Swift and 980 SLI. I can only use DSR if I disable SLI or use a different monitor. According to nVidia, support is coming in a later driver release.
Oh, and this new feature doesn't work in SLI either.
Starting to wonder why I've spent so much on nVidia products!
chizow - Tuesday, November 18, 2014 - link
Because you know that ROG Swift and G-Sync are worth it. ;)

But in all seriousness, you are basically at the apex and "Holy Trinity" of new Nvidia technologies right now; you should expect it to take some time for them to get everything working properly together.
silverblue - Wednesday, November 19, 2014 - link
...whilst the competition would get slammed for the very same.

FlushedBubblyJock - Thursday, November 20, 2014 - link
Yes, for years on end - and then still not fix it for a few more years, and then the vaporware would be abandoned.

With nVidia, however, one can be supremely confident the fix is on the way quickly, and meanwhile one can enjoy the plethora of many and varied additional features that AMD cards lack completely.

Then, as now: everything AMD complained about concerning the competition, back when AMD wore its self-proclaimed and false holier-than-thou halo, it has at the time or subsequently done in a much more vast and directly dark and corrupt manner - like the rampant and embarrassing direct re-branding, without even the memory, bus width, or clockspeed changes nVidia always incorporated to vary the core's output.

Yeah, there's a reason wasting money on verdetrol resulted in ever more embarrassment for AMD; it was a childish and irresponsible game far below the standard purported to be upheld by the "Gamers Manifesto" - ROFL
tuxfool - Tuesday, November 18, 2014 - link
Isn't AC:U a terrible game to use for benchmarks? It *is* very demanding, but it also uses a very unscalable engine, and its performance characteristics are hard to pin down due to inexplicable slowdowns, random NPC placement, etc.

Lerianis - Monday, November 24, 2014 - link
Yes, it is. Thanks for bringing up the issues I was going to raise. AC:U is VERY under-optimized for PC as well, so they would do better to use a game that is better optimized for PC to judge whether MFAA is a benefit or a loss.

tuxfool - Tuesday, November 18, 2014 - link
"when the game is in motion, seeing jaggies also becomes more difficult"

This is, possibly, a bit subjective. Aliasing crawl and flicker drive me up the wall; I dislike them immensely. They are most visible in scenes with a high dynamic range in lighting, and especially when in motion.
YazX_ - Tuesday, November 18, 2014 - link
So as far as I understood, you can enable MFAA in games by setting AA to 2x/4x MSAA while it is on in the NVIDIA Control Panel?

Tetracycloide - Friday, November 21, 2014 - link
I'm lost on this too. First it says it has to be implemented game by game, but then it says "one nice thing with MFAA is that it currently ties into the existing MSAA support in games."

barleyguy - Tuesday, November 18, 2014 - link
Does this specifically not work on a Maxwell 860M?mvitkun - Tuesday, November 18, 2014 - link
Yes, it will not work on a GTX 860M.

Any Maxwell prior to the GTX 970/980 (and their mobile variants) was GM10x (GM107/GM108), and features like VXGI and MFAA are made to work only with GM20x (GM204).
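For what it's worth, the GM10x-vs-GM20x rule above reduces to a simple chip-family lookup. This is only an illustrative sketch based on the chips named in this thread - there is no such NVIDIA API, and `MAXWELL_GEN` / `supports_mfaa` are made-up names:

```python
# Which Maxwell chips support MFAA/VXGI, per the first-gen (GM107/GM108)
# vs. second-gen (GM204) split described above. Illustrative lookup only;
# the names below are the ones mentioned in this thread.
MAXWELL_GEN = {
    "GM107": 1,  # GTX 750/750 Ti and various 8xxM parts
    "GM108": 1,  # low-end mobile parts
    "GM204": 2,  # GTX 970/980 (and 970M/980M)
}

def supports_mfaa(chip):
    """Second-generation Maxwell (GM20x) or newer is required."""
    return MAXWELL_GEN.get(chip, 0) >= 2

print(supports_mfaa("GM204"))  # True
print(supports_mfaa("GM107"))  # False
```

Any chip not in the table (older Kepler parts, for instance) falls through to "unsupported", which matches the behavior users were seeing with the demos.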
Samus - Tuesday, November 18, 2014 - link
How about GTX 750/750Ti?

JarredWalton - Tuesday, November 18, 2014 - link
Same story: you need hardware features (specifically, the AA sample patterns stored in RAM instead of ROM) in order to do MFAA. It would take a lot more time to come up with a clear verdict, but I do wonder how much of a difference there is between TXAA and MFAA -- if I didn't have a bunch of other stuff going, I would have investigated that in this piece, but as it was I spent the vast majority of the last 48 hours testing (and retesting) and capturing video of ACU and Crysis 3.

chizow - Tuesday, November 18, 2014 - link
Nice, I will have to check this out. How is it being implemented? Through a driver dropdown and whitelist/AA bits? The perf difference doesn't seem as negligible as Nvidia stated, but their "4xMSAA quality at a 2xMSAA perf hit" claim does seem to be pretty accurate. I would really like to see if this could be forced on games that don't support any MSAA at all and rely on FXAA or DSR, like GTA4 and FFXIV. Those games could really use any anti-aliasing available!

chizow - Tuesday, November 18, 2014 - link
Nvm, I forgot MFAA relies on the base MSAA sample, so the game has to support MSAA to apply MFAA on top of it.

eanazag - Tuesday, November 18, 2014 - link
Thank you for posting an Nvidia only article so that the conspiracy theorists can go away.

ol1bit - Wednesday, November 19, 2014 - link
The 970/980 are truly revolutionary: faster FPS and better jaggy-fighting ability with less of an FPS hit, all while drawing less power! I've had 6 or more Nvidia cards, 1 ATI and 1 AMD card, 2 3dfx cards, and 1 Matrox card. Overall Nvidia has the best drivers, so don't give up on them! Thanks for the update!

RussianSensation - Thursday, November 20, 2014 - link
"SLI and CrossFire Smoothness

This is an important topic when talking about SLI and CrossFire. In the past, AMD was the one in the hot seat with terrible CrossFire scaling, performance and smoothness. A lot has changed over the last couple of years, AMD now has a frame pacing technology as well as its new XDMA technology. These technologies combined have turned the tables and now CrossFire feels smoother than SLI. It still feels smoother even when compared to NVIDIA's new GTX 980 cards in SLI. While overclocked GTX 980 SLI is faster in framerate, the actual feeling and smoothness in games feels better on AMD Radeon R9 290X. AMD currently has the upper hand in this arena.
There also seems to be some games that scale better on AMD CrossFire. Watch Dogs is a prime example of a game that works great with AMD CrossFire, but for some reason is completely dropping the ball under SLI. In our previous evaluation we also found better CrossFire scaling in Alien: Isolation."
R9 290X CF = $600
980 SLI = $1100
For 4K or multi-monitor gaming, 290X CF is smoother despite costing nearly HALF as much!
Cards like the Sapphire Tri-X R9 290 are $250 with 4 games. The 970 and especially the 980 are now the overpriced cards and really need price drops.
darth415 - Thursday, November 20, 2014 - link
Why are you using Watch Dogs as an example when that game's engine is unoptimized and an idiot in general? And why are you comparing the 290X to the 980 when the 970 is its actual competition? 900 series cards scale at least 90%, and usually 95-100%, in SLI, which AMD cards cannot approach. As someone intimately familiar with things like frame times, unless something rather drastic has happened in the few weeks since the release of the 900 series, Nvidia is still the king of multi-GPU smoothness, though I haven't really been paying attention since then. I don't hate AMD cards, but CrossFire is simply inferior, and pricing-, stability-, and performance-wise the 970 is king in multi-GPU configurations. That said, I personally wouldn't put any modern 4 GB card in CF/SLI, because you are going to run out of VRAM in some situations before you can even use your potential GPU performance!

FlushedBubblyJock - Thursday, November 20, 2014 - link
Why even buy the hot-as-heck, patched-up, wanting AMD junk? Drivers are a pain, compatibility is a pain, and they always have ongoing game issues and a severe lack of features.

I suppose the only possible reason is being a simpleton and feeling like 1 or 2 or 10 percent on top of an already sufficient frame rate - and frame rate only, minus all the other issues - or a frame rate that is unplayable yet produces a 4K "win", would be reason enough.

So for the foolish, easily manipulated, and boring gamer who only stares blankly at his FRAPS counter (while purportedly paying attention to the game), it might be proper and just punishment.

Otherwise, forget it.

They are so far behind the nVidia curve and its new technologies that AMD is no longer desirable at all.

AMD's latest falter, the 285, is a 3-year-old sidestep, a failure, and completely unacceptable.

Their CPUs also show the same lack of engineering skill and software mastery.

Plus they've lost money nearly every quarter for going on the better part of a decade.

Forget the failures and the endless verdetrol-style promotion. It only works on the weak-minded and desperate.
FlushedBubblyJock - Wednesday, November 26, 2014 - link
Yes, whatever - after 2 years of horrible CF on $579+ release cards, it's about time.

However, everyone likes a really good deal, so maybe now is the time, since AMD has stretched the 290's life... So yeah, I check Newegg and yep, it's a $20 rebate, so $270 plus some paperwork and waiting - but 4 games too? Who wouldn't want that deal for an upgrade?

So I check and it's the space gold games - 3 + 1 Civ... OK, forget the Civ, but 3 others I can choose - great. So Newegg has the AMD gamers gold pic, but you can't read all the games - I mean, what the heck are they thinking?

So I google my way to AMD's list and guess what - the same tiny, tiny little DVD box pics - WITH NO GAMES LIST VISIBLE...

https://www.amd4u.com/radeonrewards/

Man, I'm telling you - I was ready to give AMD another chance, but the frustration they cause with their immense stupidity is not only undesirable, it's scary... It makes me wonder what other massive oversights I'll be running into, and how many times their idiocy (or covering up the cruddy games by not listing them and making their pics so small that blowing the browser up to 400% view still results in hits and misses) will cause me anger...

Just forget it, AMD! No polish, no common sense... The games you give away - well, hard to tell what they might be - wow, what great marketing skills!

OMG - I AM NOT BUYING NOW - IT WAS SO CLOSE BUT AMD MADE SURE TO BLOW IT AGAIN!
Daniel Egger - Wednesday, November 19, 2014 - link
Am I the only one who finds it more than just plain annoying that NVIDIA doesn't distinguish between Maxwell (v2) and Maxwell (v1)? For Christ's sake, every f'ing thing they've offered in the last two weeks says "You need Maxwell to run this", and all users think: great, that's what I have, let's try it out. When you actually try (like the Apollo demo, which only took 6(!) hours to download - no thanks to NVIDIA's damn slow servers), you're greeted with "You need Maxwell to run this application". I mean, WTF?!?!?!?

Lerianis - Monday, November 24, 2014 - link
Yeah, they should have something like Intel has, where it can scan your computer and tell you "You have the proper equipment to run this demo or tester!"

Of course, many of those things use Java (which is verboten, somewhat unjustly, with a lot of people), so......
SeanJ76 - Monday, November 24, 2014 - link
344.75 driver sucks!!! Crysis 3 performs way better with 344.65 driver.

Lerianis - Saturday, November 29, 2014 - link
Good drivers for those who have a graphics card that NVidia still supports, but I am a little torqued that they no longer support the 9800 series through 300 series graphics cards.

These drivers are only for the 400 series or newer, and there is no way to 'fool' the computer I am on into installing them. Even a manual installation attempt fails.
Revdarian - Tuesday, December 2, 2014 - link
Can I be a bit of an ass and point out that this "new technology" is actually ~10 years old and was pioneered on the Radeon X800 Pro? Sure, it has been refined a bit (vsync isn't a requirement nowadays), but the main issues that existed back then exist right now: it has a minimum performance level required to actually appreciate the effect, and it doesn't really work on most games.

http://techreport.com/review/6672/ati-radeon-x800-...
IMHO shader based AA is actually where they should be doing the lion's share of their research going forward.
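The alternating-sample-pattern idea behind both ATI's old Temporal AA and NVIDIA's MFAA, as discussed in this thread, can be illustrated with a toy 1D coverage calculation. This is a conceptual sketch only - NVIDIA's actual temporal resolve filter is not public - but it shows how two alternating 2-sample patterns, blended across frames, can match the edge coverage of a fixed 4-sample pattern at half the per-frame cost:

```python
# Toy 1D illustration of the alternating-sample-pattern idea behind
# ATI's Temporal AA and NVIDIA's MFAA. Conceptual sketch only; the
# real resolve filter's details are not public.

def coverage(edge, offsets):
    """Fraction of sample offsets falling on the 'inside' of an edge
    located at position `edge` within a pixel (0..1)."""
    return sum(1 for o in offsets if o < edge) / len(offsets)

edge = 0.40  # a geometric edge crosses this pixel at 40%

# Fixed 4x MSAA: four sample positions per pixel, every frame.
msaa4 = coverage(edge, [0.125, 0.375, 0.625, 0.875])

# MFAA-style: 2 samples per frame, the pattern alternates from frame
# to frame, and the resolve blends the two most recent frames.
frame_a = coverage(edge, [0.125, 0.625])
frame_b = coverage(edge, [0.375, 0.875])
mfaa = (frame_a + frame_b) / 2

print(msaa4, mfaa)  # same edge coverage from half the samples per frame
```

It also makes the old Temporal AA caveats from the X800 days concrete: the blend only approximates 4x when consecutive frames see the same edge, which is why low frame rates (and, back then, missing vsync) degrade the effect into visible flicker.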