      GSYNC vs. FreeSync: HyperMatrix's In-Depth Thoughts/Review

      Hey everyone. HyperMatrix here. I decided to make use of this website to share some information regarding GSYNC and FreeSync. There are tons of differing opinions all over the internet on various aspects of the two technologies. I'm here to share a bit of what I know, and a bit of what I think. As there are currently no FreeSync monitors out there, anyone who writes about FreeSync is merely speculating. I, too, have the power of speculation, and I intend to use it.




      WHAT IS IT: Adaptive Sync

      Adaptive Frame Synchronization sounds simple enough, and it makes you wonder why we're only seeing it now. NVIDIA actually drew up a pretty good graph demonstrating how it works. I don't have any fancy graphs, so I will try to explain it in words. Your standard LCD monitor has a 60Hz refresh rate. That means that every second, the image you see on the screen is updated 60 times. Or rather, every 16.67ms the display is refreshed with a new image. (For the purposes of this article, we're not going to get into pixel response time, which is how long it takes the pixels to actually transition to a new image once it has been sent to the monitor.)


      So every 16.67ms, your screen displays a new picture. The problem can be seen here. Below are 5 refreshes and the times at which they update your screen:


      0 Seconds - Refresh 1
      16.67ms - Refresh 2
      33.33ms - Refresh 3
      50ms - Refresh 4
      66.67ms - Refresh 5


      Now here is an example of what frame generation looks like (how long between each frame, based on your current FPS):


      0 seconds - Frame 1
      12ms - Frame 2
      30ms - Frame 3
      40ms - Frame 4
      53ms - Frame 5


      So what does this all mean when you try to match up when a frame is generated on your computer with when the display is updated to show that frame? Well, look here (note: the delay between when a frame is generated and the next refresh that can display it is shown between < and > below):


      0 seconds - Frame 1
      <0ms delay>
      0 Seconds - Refresh 1


      12ms - Frame 2
      <4.67ms delay>
      16.67ms - Refresh 2


      30ms - Frame 3
      <3.33ms delay>
      33.33ms - Refresh 3


      40ms - Frame 4
      <10ms delay>
      50ms - Refresh 4


      53ms - Frame 5
      <13.67ms delay>
      66.67ms - Refresh 5


      So what you can see is that there is no correlation between when your computer generates a frame and when your monitor updates the display with that frame. In fact, depending on how the timing works out, your display (what you're looking at) could be anywhere from 0.1ms to 16.6ms behind what your computer has generated. This directly adds to input lag. And when frame generation is mistimed or your fps fluctuates, the same frame can end up shown for 2 or even 3 refresh cycles, which takes away from smoothness in motion.
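      To make that mismatch concrete, here is a minimal sketch (in Python, using the example frame times above) of how long each frame sits around waiting for the next fixed 60Hz refresh. The frame timestamps are the made-up example numbers from this article, not measurements from any real game.

      import math

      REFRESH_HZ = 60
      REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ       # ~16.67ms between fixed refreshes

      frame_times_ms = [0, 12, 30, 40, 53]          # when the GPU finishes each frame

      for i, t in enumerate(frame_times_ms, start=1):
          # The frame is shown at the first refresh tick at or after it was generated.
          next_refresh_ms = math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
          delay_ms = next_refresh_ms - t
          print(f"Frame {i}: generated at {t}ms, shown at {next_refresh_ms:.2f}ms "
                f"(<{delay_ms:.2f}ms delay>)")

      # Prints the same delays as listed above: 0ms, 4.67ms, 3.33ms, 10ms, and 13.67ms.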


      Now...on to Adaptive Sync. It's really quite simple. Your monitor still has a maximum refresh rate, whether that be 60Hz, 120Hz, or 144Hz. But your monitor no longer needlessly updates every 16.67ms as was the case with the 60Hz monitor example above. With Adaptive Sync, the display waits for your computer to generate a frame, then refreshes with just that frame as soon as it arrives. Whether you are getting 30fps, 45fps, or 60fps, your monitor will update your screen instantly, as soon as a frame is generated! This eliminates screen tearing, for sure, but it also reduces input lag. You no longer end up with that variable 0.1ms to 16.6ms delay between frame generation and screen update. This...is huge.
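      And here is the same sketch adjusted for Adaptive Sync: the display refreshes the moment a frame is ready, only waiting if frames arrive faster than the panel's maximum refresh rate allows. (This is a simplification; real panels also have a minimum refresh rate, which I'm ignoring here.)

      MAX_REFRESH_HZ = 144
      MIN_INTERVAL_MS = 1000 / MAX_REFRESH_HZ       # ~6.94ms between refreshes at 144Hz

      frame_times_ms = [0, 12, 30, 40, 53]          # same example frames as before

      last_refresh_ms = float("-inf")
      for i, t in enumerate(frame_times_ms, start=1):
          # Refresh immediately, unless the previous refresh was too recent.
          refresh_ms = max(t, last_refresh_ms + MIN_INTERVAL_MS)
          print(f"Frame {i}: generated at {t}ms, shown at {refresh_ms:.2f}ms "
                f"(<{refresh_ms - t:.2f}ms delay>)")
          last_refresh_ms = refresh_ms

      # With these frame times, every delay is 0ms: the screen updates the moment a
      # frame is generated instead of waiting for the next fixed refresh tick.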






      GSYNC: What the heck is it?

      GSYNC, at its core, is simply Adaptive Sync. But it's also more. But only sometimes. Ok, here goes. In order to get your GPU and your monitor to play nice together, Nvidia had to create a module that they named GSYNC. But when you go to the trouble of creating your own monitor scaler (GSYNC), you ask yourself if there's anything else you can do with this opportunity. And in this case, the answer is that there's absolutely more you can do. Nvidia borrowed a bit from its Nvidia 3D Vision certified monitors. Those monitors had a feature called "Lightboost" that was intended for use in a 3D environment. With some driver hacks, some very clever people realized they could activate Lightboost to help reduce motion blur in 2D gaming. Nvidia also saw the benefit of this, and added support for an enhanced version of Lightboost, called ULMB (Ultra Low Motion Blur). ULMB is a strobed backlight that seeks to recreate some of the benefits of old CRT monitors. If you check out the BlurBusters website, Mark Rejhon has some great tests and articles showcasing the amazing benefits of ULMB.


      The downside to a strobing backlight is that it is most beneficial at higher refresh rates. Current monitors have a minimum 85Hz requirement to enable ULMB. This means that while the GSYNC module supports ULMB, if you have a 60Hz monitor, ULMB will not be active. A backlight strobing at 60Hz would actually be quite annoying for many people to look at. So this feature will be of greatest benefit on the 120Hz+ suite of monitors. It should be noted that due to the nature of strobing backlights, there is currently no way to activate ULMB at the same time as GSYNC. The general idea is that for games where your FPS is lower and fluctuating, you keep GSYNC on. For games where you have the GPU power to run at high FPS, nothing can beat ULMB. But the important thing to take away from this is that the GSYNC module doesn't just give you Adaptive Sync; it also gives you ULMB, and potentially support for Nvidia 3D Vision as well (such as with the ASUS ROG Swift).
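      To give a rough sense of the numbers involved (the actual ULMB pulse widths are set by Nvidia and vary per monitor, so the 1.5ms pulse below is purely an assumption for illustration), here is a small sketch of how strobe timing changes with refresh rate, and why low refresh rates are a problem for a strobed backlight:

      ULMB_MIN_HZ = 85                          # current monitors require at least 85Hz for ULMB

      def strobe_summary(refresh_hz, pulse_ms=1.5):
          # One backlight flash per refresh; the rest of the interval is dark.
          interval_ms = 1000 / refresh_hz
          dark_ms = interval_ms - pulse_ms
          allowed = refresh_hz >= ULMB_MIN_HZ
          return (f"{refresh_hz}Hz: flash every {interval_ms:.2f}ms, dark for "
                  f"{dark_ms:.2f}ms -> ULMB {'allowed' if allowed else 'blocked'}")

      for hz in (60, 85, 100, 120, 144):
          print(strobe_summary(hz))

      # At 60Hz the backlight sits dark for roughly 15ms between flashes, which is
      # slow enough that many people perceive flicker, hence the 85Hz minimum.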






      AMD FreeSync: So what's the difference?


      Nothing! Well...we don't know for sure. The basic functionality of AMD's FreeSync is just Adaptive Sync. However, it is absolutely imperative that I mention the following about FreeSync: FreeSync is NOT FREE. Calling it "Free"Sync was a brilliant bit of marketing by AMD. Just like the GSYNC module, FreeSync requires a custom ASIC scaler. The difference is that, unlike Nvidia, AMD is not charging a licensing fee to use it. And FreeSync doesn't include any stereoscopic 3D support or ULMB, which means the basic hardware should be cheaper to make than GSYNC and there is no licensing fee to add to the cost. Although that also means that for enthusiast-level gaming, your *standard* FreeSync monitor will fall short of what GSYNC offers.






      PRICE: So will FreeSync be cheaper?


      Probably...but likely not as cheap as you may think. There are a few issues. This gets into "business/economics" speculation. So here is what we know about FreeSync:


      Why it may be cheaper:
      - No Licensing Fee
      - Less expensive hardware due to fewer features
      - VESA DisplayPort Adaptive-Sync support (it is an OPTIONAL component of the DP spec. This is a very important fact)


      Why it may NOT be cheaper:
      - Basic integration doesn't cover ULMB support. Manufacturers can add ULMB on their own, but as a separate module, which adds to cost/complexity (Note: Nvidia currently works with manufacturers to integrate GSYNC)
      - Nvidia has no current plans to support FreeSync. Why is this important? Because it limits the market for FreeSync monitors, which is an important consideration for manufacturers
      - Intel does not (and currently cannot) support it, and there has been no communication from them about whether they will or will not support FreeSync in the future
      - AMD is only sitting at around 30% of the discrete GPU sales market


      My thoughts are as follows. Even with GSYNC monitors, the cost of the module is $100 or less for the bigger manufacturers. However, if you take a look at pricing, the difference between standard and GSYNC monitors is far more than $100. The reason for this is that in business, when you spend time on, invest in, and take a risk with a feature like GSYNC, you're thinking of how much more you can earn on that feature. So Nvidia tells ASUS "Hey...I've got this awesome module that will make your monitors rock" and ASUS thinks "Hmmm...if I pay an extra $100 per monitor to buy this module for enthusiasts, I need a return on my $100 investment as well. So let's up that price by another $100." This is common practice and is not some greedy move by Asus or anyone else.


      When making niche products, you take into account the cost of development and manufacture of a much smaller run. So when Asus decided to make the ROG Swift, they took the cost to develop the monitor, spread it across the expected number of sales, and voila. Now with GSYNC, they were making a product that technically 70% of new gamers could use (due to Nvidia's market share). So let's say the development of the monitor cost just $1,000,000 (completely arbitrary numbers here), and Asus estimated they would sell 25,000 units in the first year. That means they could distribute the cost of development across 25,000 units at $40 apiece. So the cost of GSYNC integration would be $100 for the module/license, $100 markup for having integrated this niche feature, and $40 for development. That's a $240 premium over the same monitor without GSYNC.

      Now if Asus were doing the same thing with FreeSync (which they are doing right now, but haven't released prices for yet), then you could say a $50 custom ASIC scaler, a $100 markup for the niche feature, and, based on the 70%/30% split between Nvidia/AMD market share, an expected 10,800 units sold within the first year (roughly the same addressable market scaled down to AMD's share), bringing the development cost distribution to $92 per monitor. So $50 + $100 + $92 = $242 premium over the same monitor without FreeSync.
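      Since those numbers are easy to lose track of, here is the same back-of-the-envelope math as a short sketch. Every figure in it is one of the arbitrary example numbers from above, not real pricing data.

      def niche_premium(module_cost, markup, dev_cost, expected_units):
          # Premium = hardware/license cost + manufacturer markup + development cost per unit.
          return module_cost + markup + dev_cost / expected_units

      DEV_COST = 1_000_000                      # completely arbitrary development cost

      gsync_premium = niche_premium(module_cost=100, markup=100,
                                    dev_cost=DEV_COST, expected_units=25_000)
      freesync_premium = niche_premium(module_cost=50, markup=100,
                                       dev_cost=DEV_COST, expected_units=10_800)

      print(f"GSYNC premium:    ${gsync_premium:.2f}")     # 100 + 100 + 40.00 = $240
      print(f"FreeSync premium: ${freesync_premium:.2f}")  # 50 + 100 + 92.59 = ~$242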


      The example above is important to note, because until Intel and Nvidia sign on to support DisplayPort Adaptive-Sync, all of these "FreeSync" monitors will just look like standard monitors to their users, and no one is going to pay a premium for a feature they can't use. Even if manufacturers decide to stick to a $50 markup instead of a $100 markup, that's still only a $50 price difference between the GSYNC and FreeSync versions. And if the FreeSync monitor also adds ULMB integration, then that $50 price difference is gone again.






      So...FreeSync is pointless?


      No. Absolutely not. There are still a lot of variables in play. Even if Nvidia doesn't sign on to DisplayPort Adaptive-Sync, Intel could. If Intel supports it, you're going to see FreeSync monitors come down in price a lot, as tons of people will be able to use them. That means easier returns on investment to recover development costs, and fewer risks associated with launching a FreeSync product. I could see FreeSync becoming a fairly standard feature in monitors if Intel supports it. But I can't see Nvidia giving in to FreeSync or giving up on GSYNC. Unless, of course, Intel does. And don't forget...if you're an AMD user, FreeSync is your only option, and vice versa with Nvidia users and GSYNC. Which actually makes it funny when you see people arguing or fighting over GSYNC vs. FreeSync, when neither side really has the choice to use the other technology anyway. We are all just taking what they hand us.




      CONCLUSION:


      Both FreeSync and GSYNC are going to bring a much-needed feature to LCD monitors: Adaptive Sync. And while AMD has won a major marketing battle with the naming of FreeSync, making people think it is going to be substantially cheaper than GSYNC when similarly equipped, that isn't the case. And with ULMB support baked in, GSYNC will be the better option for gaming, even if it turns out it may not be the best option for your wallet. Nvidia is betting that the same people paying big bucks for its more expensive GPUs will line up and pay the premium for GSYNC monitors, while AMD, with its value approach to CPUs, GPUs, and now monitors, is hoping the lower cost of implementation can help it steal some market share away from Nvidia.


      All I know for sure is...Adaptive Sync is amazing. Whatever package it comes in.




      Monitors of note:
      ASUS ROG Swift (GSYNC, 27", 144Hz, 1440p, TN)
      Acer XB270HU (GSYNC, 27", 144Hz, 1440p, IPS)
      Acer Predator XR341CK (GSYNC, 34" Ultrawide, 144Hz, 3440x1440, IPS)
      ASUS MG279Q (Adaptive Sync -unbranded FreeSync-, 27", 120Hz, 1440p, IPS)