Closing Thoughts

It took a while to get here, but if the proof of the pudding is in the eating, FreeSync tastes just as good as G-SYNC when it comes to adaptive refresh rates. Within the supported refresh rate range, I found nothing to complain about. Perhaps more importantly, while you’re not getting a “free” monitor upgrade, the current prices of the FreeSync displays are very close to what you’d pay for an equivalent display that doesn’t have adaptive sync. That’s great news, and with the major scaler manufacturers on board with adaptive sync, the price disparity should only shrink over time.

The short summary is that FreeSync works just as you’d expect, and at least in our limited testing so far there have been no problems. That isn’t to say, however, that FreeSync will work with every possible AMD setup right now. As noted last month, the initial FreeSync driver that AMD provided (Catalyst 15.3 Beta 1) only allows FreeSync to work with single-GPU configurations. Another driver, due next month, will add FreeSync support for CrossFire setups.

Besides the driver and a FreeSync display, you also need a GPU built on AMD’s GCN 1.1 or later architecture. The list at present consists of the R7 260/260X, R9 285, and R9 290/290X/295X2 discrete GPUs, as well as the Kaveri APUs: A6-7400K, A8-7600/7650K, and A10-7700K/7800/7850K. First-generation GCN 1.0 cards (HD 7950/7970, R9 280/280X, and similar) are not supported.
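In practice the compatibility check amounts to a lookup against a fixed list. Here’s a minimal sketch (the list simply mirrors the paragraph above; the function is illustrative, not anything in AMD’s driver):

```python
# Minimal sketch: FreeSync GPU compatibility as of Catalyst 15.3 Beta 1.
# The set mirrors the support list above; this is illustrative only.
FREESYNC_CAPABLE = {
    # GCN 1.1+ discrete GPUs
    "R7 260", "R7 260X", "R9 285", "R9 290", "R9 290X", "R9 295X2",
    # Kaveri APUs
    "A6-7400K", "A8-7600", "A8-7650K",
    "A10-7700K", "A10-7800", "A10-7850K",
}

def supports_freesync(gpu_name: str) -> bool:
    """True if the GPU appears in AMD's initial FreeSync support list."""
    return gpu_name in FREESYNC_CAPABLE

print(supports_freesync("R9 290X"))  # True
print(supports_freesync("HD 7970"))  # False: GCN 1.0 is not supported
```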

All is not sunshine and roses, however. Part of the problem with reviewing something like FreeSync is that we're inherently tied to the hardware we receive, in this case the LG 34UM67 display. With an R9 290X at the display's native resolution, the vast majority of games run at 48 FPS or above even at maximum detail settings, though of course there are exceptions. This means they look and feel smooth. But what happens with more demanding games or with lower-performance GPUs? If you're running without VSYNC, you get tearing below 48 FPS, while with VSYNC you get stuttering.
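To make that concrete, here's a minimal sketch of the behavior at various frame rates, assuming the 34UM67's 48-75Hz variable refresh window (illustrative Python, not actual driver or scaler code):

```python
# Illustrative sketch of adaptive sync behavior around the LG 34UM67's
# 48-75Hz variable refresh rate (VRR) window. Not actual driver/scaler code.

VRR_MIN_HZ = 48.0  # lower bound of the 34UM67's VRR window
VRR_MAX_HZ = 75.0  # upper bound of the 34UM67's VRR window

def panel_behavior(fps: float, vsync: bool) -> str:
    """Describe what the display does at a given game frame rate."""
    if VRR_MIN_HZ <= fps <= VRR_MAX_HZ:
        # Inside the window the refresh rate tracks the frame rate exactly,
        # so there is neither tearing nor stutter.
        return f"adaptive sync at {fps:.0f}Hz: smooth"
    if fps > VRR_MAX_HZ:
        # Above the window the panel runs at its fixed maximum refresh rate.
        return "above window: tearing (VSYNC off) or capped at 75 FPS (VSYNC on)"
    # Below the window FreeSync reverts to fixed-refresh behavior.
    return "below window: stutter (VSYNC on)" if vsync else "below window: tearing (VSYNC off)"

for fps in (30, 60, 90):
    print(f"{fps} FPS -> {panel_behavior(fps, vsync=True)}")
```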

Neither is ideal, but how much this impacts your experience will depend on the game and on the individual. G-SYNC handles dropping below the minimum FPS more gracefully than FreeSync, though if you're routinely falling below the minimum FreeSync refresh rate we'd argue that you should simply lower your settings. Mostly what you get with FreeSync/G-SYNC is the ability to have smooth gaming at 40-60 FPS, not just 60+ FPS.

Other sites are reporting ghosting on FreeSync displays, but that's not inherent to the technology. Rather, it's a display-specific problem (just as the amount of ghosting on normal LCDs is display-specific). Using higher quality panels and hardware designed to reduce or eliminate ghosting is the solution. The FreeSync displays so far appear not to have the same level of anti-ghosting as the currently available G-SYNC panels, which is unfortunate if true. (Note that we've only looked at the LG 34UM67, so we can't report on all the FreeSync displays.) Again, ghosting shouldn't be a FreeSync issue so much as a panel/scaler/firmware problem, so we'll hold off on further commentary until we get to the monitor reviews.

One final topic to address is something that has become more noticeable to me over the past few months. While G-SYNC/FreeSync can make a big difference when frame rates are in the 40-75 FPS range, the benefits beyond that point are a lot less clear. Take the 144Hz ASUS ROG Swift as an example. Even with G-SYNC disabled, the 144Hz refresh rate makes tearing rather difficult to spot, at least in my experience; at 144Hz any given tear line is on screen for at most about 7ms, versus roughly 17ms at 60Hz. LCD pixel response times are not instantaneous, and when you combine that with the way our eyes and brains process the world, I still think that, for all the hype, high refresh rates with VSYNC disabled get you 98% of the way to the goal of smooth gaming with no noticeable visual artifacts (at least for those of us without superhuman eyesight).

Overall, I’m impressed with what AMD has delivered so far with FreeSync. AMD gamers in particular will want to keep an eye on the new and upcoming FreeSync displays. They may not be the “must have” upgrade right now, but if you’re in the market and the price premium is less than $50, why not get FreeSync? For NVIDIA users, on the other hand, things just got more complicated. Assuming you haven’t already jumped on the G-SYNC train, there’s now the question of whether NVIDIA will support non-G-SYNC displays that implement DisplayPort’s Adaptive Sync technology. I have little doubt that NVIDIA can support FreeSync panels, but whether they will support them is far less certain. Given the current price premium on G-SYNC displays, it’s probably a good time to sit back and wait a few months to see how things develop.

There is one G-SYNC display that I’m still waiting to see, however: Acer’s 27” 1440p144 IPS (AHVA) XB270HU. It was teased at CES and could very well be the holy grail of displays. It’s scheduled to launch next month, with official pricing of $799 (some pre-orders are now online at higher prices). We might see a FreeSync variant of the XB270HU in the coming months as well, if not from Acer then likely from some other manufacturer. For those who work with images and video as well as play games, IPS/AHVA displays with G-SYNC or FreeSync support are definitely needed.

Wrapping up, if you haven’t upgraded your display in a while, now is a good time to take stock of the various options. IPS and other wide-viewing-angle displays have come down quite a bit in price, and there are overclockable 27” and 30” IPS displays that don’t cost much at all. Unfortunately, if you want a guaranteed high refresh rate, there’s a good chance you’re going to have to settle for TN. The new UltraWide LG displays with 75Hz IPS panels at least deliver a moderate improvement, though, and they now come with FreeSync as an added bonus.

Considering a good display can last 5+ years, making a larger investment isn’t a bad idea, but by the same token rushing into a new display isn’t advisable either as you don't want to end up stuck with a "lemon" or a dead technology. Take some time, read the reviews, and then find the display that you will be happy to use for the next half decade. At least by then we should have a better idea of which display technologies will stick around.

Comments

  • silverblue - Saturday, March 21, 2015 - link

    I can certainly let you off most of those, but third-party activities shouldn't count, so you can subtract 6 and 12. Additionally, 13 can be picked apart: the 295X2 showed that AMD can deliver a high-quality cooler, and I believe lumping the aesthetic qualities of a cooler in with heat and noise is a partial falsehood (admit it - you WILL have been thinking of metal versus plastic shrouds). I also don't agree with you on 11; at least, not if you go back past the 2XX generation, as AMD had more aggressive bundles back then. 8 is subjective, but NVIDIA usually gets the nod here.

    Also, some of your earlier items are proprietary tech, which I could always tease you about, as it's not as if they couldn't license any of this out. ;)

    I'll hand it to you and credit you with your dozen.
  • chizow - Saturday, March 21, 2015 - link

    And I thank you for not taking the typical dismissive approach of "Oh, I don't care about those features" that some on these forums might respond with.

    I would still disagree on 6 and 12, though; ultimately they are still part of Nvidia's ecosystem and end-user experience, and in many cases Nvidia provides the tools and support that enable these value-add features. Third-party tools, for example, specifically take advantage of Nvidia's NVAPI to access hardware features via the driver, and of Nvidia's very transparent XML settings to manipulate AA/SLI profile data. Similarly, every feature EVGA offers to end users has to be worth their effort and backed by Nvidia to make business sense for them.

    And on 13 I would absolutely disagree. We see the culmination of Nvidia's cooling technology in the Titan NVTTM cooler, which is awesome. Having to resort to a triple-slot, water-cooled solution for a high-end graphics card sets a terrible precedent imo, and it's a huge barrier to entry for many, as you need additional case mounting and clearance, which could be a problem if you already have a CPU CLC as many do. But that's just my opinion.

    AMD did make a good effort with their Gaming Evolved bundles and certainly offered better than Nvidia for a brief period, but it's pretty clear their marketing dollars dried up around the same time they cut that BF4 Mantle deal, and their current financial situation hasn't allowed them to offer anything compelling since. But I stand by that bullet point: Nvidia typically offers the more relevant and attractive game bundle at any given time.

    One last point in favor of Nvidia is Optimus. I don't use it at home as I have no interest in "gaming" laptops, but it is a huge benefit there. We do have it on powerful laptops at work, however, and the ability to "elevate" an application to the Nvidia dGPU on command is a huge benefit there as well.
  • anubis44 - Tuesday, March 24, 2015 - link

    @chizow:
    But hey kids, remember: after reading this 16-point PowerPoint presentation where he points out the superiority of nVidia using detailed arguments like "G-Sync" and "GRID" as strengths, chizow DOES NOT WORK FOR nVidia! He is not sitting in the marketing department in Santa Clara, California, with a group of other marketing mandarins running around grabbing factoids for him to type in as responses to chat forums. No way!

    Repeat after me: 'chizow does NOT work for nVidia.' He's just an ordinary, everyday psychopath who spends 18 hours a day at the keyboard responding to every single criticism of nVidia, no matter how trivial. But he does NOT work for nVidia! Perish the thought! He just does it out of his undying love for the green goblin.
  • chizow - Tuesday, March 24, 2015 - link

    But hey, remember, AMD fantards: there's no reason the overwhelming majority of the market prefers Nvidia; those 16 things I listed don't actually mean anything if you prefer a subpar product and don't demand better; and you can keep ignoring the obvious fact that one product supports more features and the other doesn't. Just keep accepting subpar products and listening to AMD fanboys like anubis44; don't give in to the reality the rest of us accept as fact.
  • sr1030nx - Saturday, March 21, 2015 - link

    Only if they were NVIDIA-branded speaker cables 😉
  • Darkito - Friday, March 20, 2015 - link

    False
  • Darkito - Friday, March 20, 2015 - link

    False; it's indistinguishable "within the supported refresh rate range," as per this review. What happens outside the VRR window, however, and especially under it, is incredibly different. With G-Sync, if you get 20 fps it'll actually duplicate frames and tune the monitor to 40Hz, which means smooth gaming even below the monitor's minimum refresh rate (well, as smooth as 20 fps can be); see the sketch after this comment. With FreeSync, it'll just fall back to VSYNC on or off, with all the stuttering or tearing that involves. That means that if your game ever falls below the VRR window on FreeSync, image quality falls apart dramatically. And according to PCPer, this isn't something AMD can fix with a driver update, because it requires the frame buffer and logic on the G-Sync module!

    http://www.pcper.com/reviews/Displays/AMD-FreeSync...

    Take note that the LG panel tested actually has a VRR window lower bound of 48Hz, so image quality starts falling apart if you dip below 48 fps, which is clearly unacceptable.
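
    A rough sketch of the frame-multiplication logic described above (the 30-144Hz window is an assumed example, e.g. the ROG Swift; the actual behavior lives in the G-Sync module's firmware and isn't public):

    ```python
    # Rough sketch of G-SYNC-style frame multiplication below the VRR window:
    # repeat each frame enough times that the panel stays inside its window.
    # The 30-144Hz bounds are an assumption, not NVIDIA's actual firmware logic.

    def effective_refresh(fps: float, vrr_min: float = 30.0, vrr_max: float = 144.0):
        """Return (panel refresh rate, repeats per frame) for a given frame rate."""
        repeats = 1
        while fps * repeats < vrr_min:
            repeats += 1  # redraw the previous frame one more time
        return min(fps * repeats, vrr_max), repeats

    print(effective_refresh(20))  # (40.0, 2): 20 fps shown twice -> 40Hz, no tearing
    print(effective_refresh(10))  # (30.0, 3): 10 fps shown three times -> 30Hz
    ```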
  • AdamW0611 - Sunday, March 22, 2015 - link

    Yeah, just like "R.I.P. DirectX, Mantle will rule the day," and now AMD is telling developers to ignore Mantle. G-Sync is great, and those of us who prefer drivers updated the same day games are released will stick with Nvidia, while months later AMD users will be crying that games still don't work right for them.
  • anubis44 - Tuesday, March 24, 2015 - link

    Company of Heroes 2 worked like shit for nVidia users for months after release, while my Radeon 7950 was pulling more FPS than a Titan. To this day, Radeons pull better and smoother FPS than equivalently priced nVidia cards in this, my favourite game. The GTX 970 is still behind the R9 290 today. Is that the 'same day' nVidia driver support you're referring to?
  • chizow - Tuesday, March 24, 2015 - link

    More BS from one of the biggest AMD fanboys on the planet, an AMD *CPU* fanboy no less. CoH2 ran faster on Nvidia hardware from day one, and it also runs much faster on Intel CPUs, so yeah, as usual, you're running the slower hardware in your favorite game simply because you're a huge AMD fanboy.

    http://www.techspot.com/review/689-company-of-hero...
    http://www.anandtech.com/show/8526/nvidia-geforce-...
