FreeSync vs. G-SYNC Performance

One item that piqued our interest during AMD’s presentation was the claim that there’s a performance hit with G-SYNC but none with FreeSync. NVIDIA has said as much in the past, though they also noted at the time that they were "working on eliminating the polling entirely," so things may have changed. Even so, the difference was generally quite small – less than 3%, or basically not something you would notice without capturing frame rates. AMD did some testing, however, and presented the following two slides:

It’s probably safe to say that AMD is splitting hairs when they show a 1.5% performance drop in one specific scenario against a 0.2% performance gain, but we wanted to see if we could corroborate their findings. Having tested plenty of games, we already know that most games – even those with built-in benchmarks that tend to be very consistent – will show minor differences between benchmark runs. So we picked three games with deterministic benchmarks – Alien Isolation, The Talos Principle, and Tomb Raider – and ran each three times with G-SYNC/FreeSync enabled and three times with it disabled. Here are the average and minimum frame rates from three runs:

[Chart: Gaming Performance Comparison – Average FPS]

[Chart: Gaming Performance Comparison – Minimum FPS]
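For those curious how the numbers in charts like these get aggregated, here’s a minimal sketch of the process: take each run’s frame time log, convert to FPS, average across the three runs, and compute the relative difference between the two configurations. The file names and log format (one frame time in milliseconds per line) are hypothetical assumptions, not the actual capture tool output.

```python
# Minimal sketch (not the actual test harness): aggregate average and minimum
# FPS across three benchmark runs and compute the percent difference between
# the sync-on and sync-off configurations. File names and the log format
# (one frame time in milliseconds per line) are hypothetical assumptions.
from statistics import mean

def load_frame_times_ms(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def run_fps_stats(frame_times_ms):
    fps = [1000.0 / ft for ft in frame_times_ms if ft > 0]
    return mean(fps), min(fps)

def summarize(paths):
    stats = [run_fps_stats(load_frame_times_ms(p)) for p in paths]
    avg_fps = mean(s[0] for s in stats)  # average FPS, averaged over runs
    min_fps = mean(s[1] for s in stats)  # minimum FPS, averaged over runs
    return avg_fps, min_fps

on_avg, on_min = summarize(["tombraider_sync_on_%d.txt" % i for i in range(1, 4)])
off_avg, off_min = summarize(["tombraider_sync_off_%d.txt" % i for i in range(1, 4)])
delta_pct = (on_avg - off_avg) / off_avg * 100.0
print("Avg FPS on/off: %.1f / %.1f (%+.1f%% with adaptive sync)" % (on_avg, off_avg, delta_pct))
print("Min FPS on/off: %.1f / %.1f" % (on_min, off_min))
```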

Except for a glitch when testing Alien Isolation using a custom resolution, our results basically don’t show much of a difference between enabling/disabling G-SYNC/FreeSync – and that’s what we want to see. While AMD showed a performance drop with Alien Isolation using G-SYNC, we weren’t able to reproduce that in our testing; in fact, we even measured a 2.5% performance increase with G-SYNC in Tomb Raider. But again, let’s be clear: 2.5% is not something you’ll notice in practice. FreeSync, meanwhile, shows results that are well within the margin of error.

What about that custom resolution problem with G-SYNC? We used the ASUS ROG Swift with the GTX 970, and we thought it might be useful to run the same resolution as the LG 34UM67 (2560x1080). Unfortunately, that didn’t work so well with Alien Isolation – frame rates plummeted with G-SYNC enabled for some reason. Tomb Raider had a similar issue at first, but when we created additional custom resolutions with multiple refresh rates (60/85/100/120/144 Hz) the problem went away; we couldn’t ever get Alien Isolation to run well with G-SYNC using our custom resolution, however. We’ve notified NVIDIA of the glitch, but note that when we tested Alien Isolation at the native WQHD setting the performance was virtually identical, so this only seems to affect custom resolutions and appears to be game specific.

For those interested in a more detailed look at the frame rates of the three runs (six total per game and setting, three with and three without G-SYNC/FreeSync), we’ve created a gallery of the frame rates over time. There’s so much overlap that mostly only the top line is visible, but that just proves the point: there’s little difference beyond the usual minor variations between benchmark runs. And in one of the games, Tomb Raider, even using the same settings shows a fair amount of variation between runs, though the average FPS is quite consistent.
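As a rough illustration of how overlaid frame-rate-over-time plots like those in the gallery can be generated, here’s a minimal sketch using matplotlib; again, the file names and log format are hypothetical assumptions rather than our actual tooling.

```python
# Minimal sketch: overlay FPS-over-time traces for all six runs of one game
# (three with adaptive sync enabled, three without), similar to the gallery plots.
# File names and the log format (one frame time in ms per line) are hypothetical.
import matplotlib.pyplot as plt

def fps_trace(path):
    with open(path) as f:
        times_ms = [float(line) for line in f if line.strip()]
    return [1000.0 / t for t in times_ms if t > 0]

fig, ax = plt.subplots(figsize=(10, 4))
for i in range(1, 4):
    ax.plot(fps_trace("tombraider_sync_on_%d.txt" % i), color="tab:green", alpha=0.6,
            label="Adaptive sync on" if i == 1 else None)
    ax.plot(fps_trace("tombraider_sync_off_%d.txt" % i), color="tab:gray", alpha=0.6,
            label="Adaptive sync off" if i == 1 else None)
ax.set_xlabel("Frame number")
ax.set_ylabel("FPS")
ax.set_title("FPS over time, three runs per setting")
ax.legend()
plt.tight_layout()
plt.show()
```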

350 Comments

  • Oxford Guy - Friday, March 20, 2015 - link

    "Now you want existing displays that are already assembled to be pulled apart and upgraded. That would likely cost more money than just selling the displays at a discount, as they weren't designed to be easily disassembled and upgraded."

    If that's the case... I wonder why that is? Could it be the blithe acceptance of ridiculous cases of planned obsolescence like this?

    Manufacturers piddle out increments of tech constantly to try to keep a carrot on a stick in front of consumers. Just like with games and their DLC nonsense, the new mindset is replace, replace, replace... design the product so it can't be upgraded. Fill up the landfills.

    Sorry, but my $800 panel isn't going to just wear out or be obsolete in short order. People who spent even more are likely to say the same thing. And, again, many of these products are still available for purchase right now. The industry is doing consumers a disservice enough by not having standards (incompatible competing G-Sync and FreeSync) but it's far worse to tell people they need to replace otherwise perfectly satisfactory equipment for a minor feature improvement.

    You say it's not feasible to make monitors that can be upgraded in a relatively minor way like this. I say it is. It's not like we're talking about installing DisplayPort into a panel that didn't have it or something along those lines. It's time for the monitor industry to stop spewing out tiny incremental changes and expecting wholesale replacement.

    This sort of product and the mindset that accompanies it is optional, not mandatory. Once upon a time things were designed to be upgradable. I suppose the next thing you'll fully endorse are motherboards with the CPUs, RAM, and everything else soldered on (which Apple likes to do) to replace DIY computing... Why not? Think of how much less trouble it will be for everyone.
  • Oxford Guy - Friday, March 20, 2015 - link

    "it's probable that G1 *couldn't* be properly upgraded to support TRIM" "since you were working at Intel's Client SSD department...oh, wait, you weren't." So, I assume I should use the same retort on you with your "probable", eh?
  • Oxford Guy - Friday, March 20, 2015 - link

    The other thing you're missing is that Intel never told consumers that TRIM could not be added with a firmware patch. It never provided anyone with an actual concrete justification. It just did what is typical for these companies and for publications like yours: told people to buy the latest shiny to "upgrade".
  • Gunbuster - Thursday, March 19, 2015 - link

    So G-Sync has been available to purchase for, what, a year now? And AMD comes to the table with something exactly the same. How impressive.

    Oh, and the Crossfire driver is the traditional "trust us, Coming Soon™".
  • chizow - Thursday, March 19, 2015 - link

    18 months later, and not exactly the same – still worse. But yes, we must give it to AMD; at least they brought something to the table this time.
  • Gigaplex - Friday, March 20, 2015 - link

    The troll is strong in this one. You keep repeating how this is technically worse than G-SYNC and have absolutely nothing to back it up. You claim forced V-SYNC is an issue with FreeSync, but it's the other way around – you can't turn V-SYNC off with G-SYNC, but you can with FreeSync. You don't address the fact that G-SYNC monitors need a proprietary scaler that doesn't have all the features of FreeSync-capable scalers (e.g. more input ports, OSD functionality). You accuse everyone who refutes your argument of AMD fanboy sentimentality, when you yourself are the obvious NVIDIA fanboy. No doubt you'll accuse me of being an AMD fanboy too. How wrong you are.
  • JarredWalton - Friday, March 20, 2015 - link

    Technically the G-SYNC scaler supports an OSD... the options are just more limited as there aren't multiple inputs to support, and I believe NVIDIA doesn't bother with supporting multiple *inaccurate* color modes -- just sRGB and hopefully close to the correct values.
  • chizow - Friday, March 20, 2015 - link

    Actually, you're wrong again: Vsync is always off. There is a frame cap enabled via the driver, but that is not Vsync, as the GPU is still controlling the frame rate.

    Meanwhile, FreeSync is still clearly tied to Vsync, which is somewhat surprising in its own right since AMD has historically had issues with driver-level Vsync.

    I've never once glossed over the fact that G-Sync requires a proprietary module, because I've clearly stated the price and tech are justified if it is a better solution – and as we saw yesterday, it clearly is.

    I've also acknowledged that multiple inputs and an OSD are bonus amenities, but they certainly don't outweigh these panels excelling at what they are purchased for. I have 2x U2410 companion panels with TONS of inputs for anything I need beyond gaming.
  • darkfalz - Thursday, March 19, 2015 - link

    I have to give it to AMD here – I was skeptical this could be accomplished without dedicated hardware to buffer the video frames on the display, but they've done it. I still wouldn't buy one of their power-hungry video cards, but it's good for AMD fans. This is good news for G-Sync owners too, as it should drive down the artificially inflated price (partly due to lack of competition, partly due to the early adoption premium). After fiddling around with triple buffering and triple buffering overrides for years (granted, less of a problem on DX10/11, as many modern engines seem to have some form of "free" triple buffering), it's good to get to perfect refresh rates. As a big emulation fan, with many arcade games using various refresh rates from 50 to 65 Hz, these displays are also great. Was input lag tested? AMD don't claim to have Vsync-off-like input lag reduction. This would be superb in a laptop where displaying every last frame is important (Optimus provides a sort of "free" triple buffering of its own, but it's not the smoothest and often requires you to set a 60 FPS frame cap).
  • darkfalz - Thursday, March 19, 2015 - link

    By G-Sync owners, I guess I mean NVIDIA fans / prospective G-Sync buyers. G-Sync owners (like me) have already paid the premium.
