FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was the claim that there’s a performance hit with G-SYNC but none with FreeSync. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even then the difference was generally quite small – less than 3%, which is basically not something you would notice without capturing frame rates. AMD did some testing of its own, however, and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they show a 1.5% performance drop in one specific scenario compared to a 0.2% performance gain in another, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most titles – even those with built-in benchmarks that tend to be very consistent – will show minor differences between benchmark runs. So we picked three games with deterministic benchmarks and ran each three times with and without G-SYNC/FreeSync enabled. The games we selected are Alien Isolation, The Talos Principle, and Tomb Raider. Here are the average and minimum frame rates from those runs:

[Charts: Gaming Performance Comparison – average and minimum frame rates]

Except for a glitch when testing Alien Isolation at a custom resolution, our results don’t show much of a difference between enabling and disabling G-SYNC/FreeSync – and that’s what we want to see. While AMD showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.
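
For readers who want to sanity-check deltas like that 2.5% against their own runs, here's a minimal sketch (in Python, with placeholder numbers rather than our actual results) of the arithmetic involved: average the runs for each configuration, then compute the percentage difference between the two averages.

```python
# Minimal sketch: averaging three benchmark runs per configuration and
# computing the percent delta between them. The run values below are
# hypothetical placeholders, not our measured results.

def summarize(runs_off, runs_on):
    """Return (mean_off, mean_on, percent delta) for two sets of FPS results."""
    mean_off = sum(runs_off) / len(runs_off)
    mean_on = sum(runs_on) / len(runs_on)
    delta_pct = (mean_on - mean_off) / mean_off * 100.0
    return mean_off, mean_on, delta_pct

# Three runs each way (placeholder numbers)
runs_vrr_off = [56.5, 57.1, 56.8]   # G-SYNC/FreeSync disabled
runs_vrr_on  = [57.9, 58.2, 58.0]   # G-SYNC/FreeSync enabled

off_avg, on_avg, pct = summarize(runs_vrr_off, runs_vrr_on)
print(f"Off: {off_avg:.1f} FPS  On: {on_avg:.1f} FPS  Delta: {pct:+.1f}%")
```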

What about that custom resolution problem with G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz) the problem went away; we could never get Alien Isolation to run well with G-SYNC at our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the panel's native WQHD resolution the performance was virtually identical, so this only seems to affect custom resolutions and also appears to be game specific.

For those interested in a more detailed look at the individual runs (six total per game: three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly only the topmost line is visible, but that just proves the point: there’s little difference beyond the usual minor variations between benchmark runs. One of the games, Tomb Raider, does show a fair amount of run-to-run variation even at identical settings, though the average FPS is still quite consistent.
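
For anyone who wants to generate a similar frame-rate-over-time plot from their own captures, the short sketch below shows one way to convert a per-frame frame-time log into an FPS-versus-time series; the file name and one-value-per-line format are assumptions for illustration, not the tooling we used.

```python
# Sketch: turn a per-frame frame-time log (one value in milliseconds per line)
# into an FPS-over-time series and plot it. The file name and log format are
# assumptions for illustration only.
import matplotlib.pyplot as plt

def load_frametimes(path):
    """Read frame times in milliseconds, one per line."""
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def fps_over_time(frametimes_ms):
    """Return (elapsed seconds, instantaneous FPS) for each frame."""
    elapsed, times, fps = 0.0, [], []
    for ft in frametimes_ms:
        elapsed += ft / 1000.0
        times.append(elapsed)
        fps.append(1000.0 / ft)
    return times, fps

t, f = fps_over_time(load_frametimes("run1_frametimes.txt"))
plt.plot(t, f, label="Run 1")
plt.xlabel("Time (s)")
plt.ylabel("Frames per second")
plt.legend()
plt.show()
```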

Comments

  • willis936 - Thursday, March 19, 2015 - link

    I would like an actual look at the added input latency from these adaptive sync implementations. Nobody has even mentioned it, but there's a very real possibility that either the graphics TX or the monitor's scaler has to do enough thinking to cause a significant delay from when pixels come in to when they're displayed on the screen. Why isn't the first thing to be scrutinized the very issue these technologies seek to solve?
  • mutantmagnet - Thursday, March 19, 2015 - link

    Acer already posted the MSRP

    http://us.acer.com/ac/en/US/content/model/UM.HB0AA...

    $800
  • mutantmagnet - Thursday, March 19, 2015 - link

    I forgot to mention it's already on sale in Europe.
  • JarredWalton - Thursday, March 19, 2015 - link

    Google was failing me last night, though granted I haven't slept much in the past two days.
  • ezridah - Thursday, March 19, 2015 - link

    It's odd that on their product page they don't mention G-Sync or the refresh rate anywhere... It's like they don't want to sell it or something.
  • eanazag - Thursday, March 19, 2015 - link

    My monitors last longer than 5 years. Basically I keep them till they die. I have a 19" 1280x1024 on the shared home computer that I'm considering replacing. I'd be leaning towards monitors with neither technology, or FreeSync monitors.

    I'm currently sporting AMD GPUs, but I'm one of those who go back and forth between vendors, and I don't think that's as small a minority as was assumed. I bought two R9 290s last February. If I were buying right now, I'd be getting a GTX 970. I do like the GeForce Experience software. I'm still considering a GTX 750 Ti.

    I'm not totally sold on what AMD has in the market at the moment. I have a lot of heat concerns running them in Crossfire, and the power draw is higher than I'd like. The original 290 blowers sucked. I'd like to see quality blower cards again, like Nvidia's.
  • Dorek - Thursday, March 19, 2015 - link

    Wait, you didn't just say that you use two R9 290s on a 1280x1024 monitor, right?
  • medi03 - Thursday, March 19, 2015 - link

    I don't get how the 970 is better than the 290X. It is slower and more expensive:
    http://www.anandtech.com/show/8568/the-geforce-gtx...

    And total system power consumption is lower by about 20-25% (305W on the 970 vs. 365W on the 290X). No big deal.
  • JarredWalton - Thursday, March 19, 2015 - link

    It's not "better" but it is roughly equivalent. I've got benchmarks from over 20 games. Average for 290 X at 2560x1440 "Ultra" across those games is 57.4 FPS while the average for 970 is 56.8 FPS. Your link to Crysis: Warhead is one title where AMD wins, but I could counter with GRID 2/Autosport and Lord of the Fallen where NVIDIA wins. And of the two GPUs, 970 will overclock more than 290X if you want to do that.
  • TallestJon96 - Thursday, March 19, 2015 - link

    I'm an NVIDIA user, but I'm happy to see the proprietary G-SYNC get beat down. I've got a 1080p 144 Hz non-G-SYNC panel, so I won't be upgrading for 3-5 years, and hopefully 4K and FreeSync will both be standard by then.
