Adaptive sync technologies like G-Sync and FreeSync have become a big deal in graphics and monitor technology. They offer a real solution to a real problem that has plagued many monitor owners. If you’ve ever experienced choppy images, stutters and odd visual effects, it could be down to your graphics card failing to keep in sync with your monitor.
If you’ve ever noticed parts of the image not quite lining up, especially during busy parts of a game, you are experiencing screen tearing. Many gamers don’t even notice it, or at least tolerate it, but once you become aware of it this effect can really detract from your experience.
Tearing occurs when your graphics card falls out of step with the monitor. A 60Hz monitor, for example, refreshes the screen 60 times a second, and to do that cleanly it needs a complete new frame from the graphics card for each refresh. When a new frame arrives part way through a refresh, the monitor ends up showing part of one frame at the top and part of another at the bottom, and you get a visible mismatch part way down the screen.
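To make the timing concrete, here’s a tiny illustrative Python sketch (not real driver code). It assumes a 60Hz monitor and a hypothetical GPU that finishes a frame every 13ms and swaps the displayed buffer the moment each frame is ready; any swap that lands mid-refresh shows up as a tear.

```python
# Illustrative sketch only (not real driver code) of why tearing happens.
# Assumptions: a 60Hz monitor and a GPU that finishes a frame every 13ms
# and swaps the displayed buffer the moment each frame is ready.

REFRESH_HZ = 60
SCANOUT = 1.0 / REFRESH_HZ            # the monitor redraws top-to-bottom every ~16.7ms
FRAME_TIME = 0.013                    # hypothetical GPU frame time (13ms, ~77fps)

swap_times = [i * FRAME_TIME for i in range(1, 40)]   # when each new frame is ready

for n in range(10):                   # look at the first ten refreshes
    start, end = n * SCANOUT, (n + 1) * SCANOUT
    # a buffer swap landing while the screen is mid-redraw produces a tear
    tears = [t for t in swap_times if start < t < end]
    status = f"tear at ~{tears[0] * 1000:.1f}ms" if tears else "clean"
    print(f"refresh {n}: {status}")
```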
Adaptive sync attempts to solve this problem by getting the graphics card to drive the process: the monitor adapts its refresh rate to stay in sync with the GPU’s ability to draw frames.
How is this different from V-Sync?
There is already an existing solution to this problem, and you may have seen the option in games you own. Vertical synchronisation, or V-Sync, stops the graphics card from sending a new frame until the monitor completes its refresh. This differs from adaptive sync in that it is still the monitor that drives the process.
V-Sync has some problems. It solves tearing, but at the cost of extra latency and a frame rate locked to the monitor’s refresh rate or a fixed fraction of it (60, 30, 20 fps and so on, on a 60Hz screen). If you’ve ever used V-Sync you may well have noticed the input lag that accompanies this solution.
When watching a video, which has a fixed number of frames per second, you might find frames are dropped, resulting in a juddering effect. Video games, on the other hand, generate frames in real time, so whilst V-Sync can smooth the output of these frames it can lead to what is known as “input lag”. The game feels slightly disconnected when you play it.
This is a real pain when playing something that requires fast reactions.
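A rough worked example shows where both the fixed steps and the extra lag come from. The render times below are assumed values purely for illustration; the point is that with V-Sync on, a frame that misses one refresh boundary has to sit and wait for the next.

```python
# Rough illustration of why V-Sync quantises the frame rate and adds lag
# on a 60Hz monitor. Render times are assumed values, not measurements.
import math

REFRESH_HZ = 60
INTERVAL = 1000.0 / REFRESH_HZ        # ~16.7ms between refreshes

for render_ms in (10, 17, 25, 40):    # hypothetical GPU render times
    # with V-Sync on, each frame occupies a whole number of refresh intervals
    intervals = math.ceil(render_ms / INTERVAL)
    shown_fps = REFRESH_HZ / intervals
    added_wait = intervals * INTERVAL - render_ms   # time spent waiting = extra lag
    print(f"{render_ms}ms render -> {shown_fps:.0f}fps on screen, "
          f"~{added_wait:.1f}ms spent waiting for the next refresh")
```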
Adaptive Synchronisation Technologies
Unfortunately for the consumer there are a number of different approaches being taken right now, each with its own pros and cons. We’ll give you an overview of the current situation and outline the strengths and weaknesses of each, so you’ll be in a better position to choose which horse to back! It seems quite likely that there will be some kind of convergence and standardisation in the future, but we’ll have to wait and see. I’ll keep this article updated as things develop.
VESA Adaptive-Sync
Adaptive-Sync is a standard administered by VESA. FreeSync is based upon this standard but adds certain requirements and tests that must be passed before the branding can be applied. It allows your monitor to sync its refresh rate to your graphics card while inside your monitor’s variable refresh rate (VRR) window.
It is compatible with select AMD graphics cards (GCN1.1 or later architecture) and certain Intel graphics chips. It is important to note that Nvidia cards do not support this standard although it hasn’t been ruled out as a future option. If this article hasn’t been updated for a while, perhaps things have changed!
Adaptive-Sync is an industry standard that enables technologies such as FreeSync.
FreeSync
FreeSync is AMD’s competing brand and features in their Radeon line of graphics cards. The difference here is that FreeSync is a free and open technology: it is part of VESA’s DisplayPort technology and does not incur a performance hit. It allows your monitor to sync its refresh rate to your graphics card, but only within the variable refresh rate (VRR) window of your monitor.

You’ll often see this referred to as the “FreeSync range”, but it isn’t always broadcast loudly on the spec sheets. It’s an important consideration when buying a FreeSync monitor, so do look up the information beforehand so you aren’t caught out. The bigger the range the better, but pay particular attention to the lower end: when the frame rate drops below this range you’ll fall back to V-Sync, and that’s not good. If you have a low-end card that will struggle to output frames above the lower end of your screen’s range then you’re not going to benefit.
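As a quick illustration of why that range matters, here’s a small hypothetical helper. The 48-144Hz window is an assumed figure rather than any particular monitor; swap in the range from your own spec sheet.

```python
# Hypothetical example of how a FreeSync monitor's VRR window behaves.
# The 48-144Hz range is an assumed figure, not a specific product.

def sync_mode(fps: float, vrr_min: float = 48, vrr_max: float = 144) -> str:
    """Describe how the monitor would handle a given frame rate."""
    if vrr_min <= fps <= vrr_max:
        return "inside the window: refresh rate follows the GPU"
    if fps > vrr_max:
        return "above the window: frames are capped or tear (or V-Sync kicks in)"
    return "below the window: falls back to V-Sync or tearing"

for fps in (30, 60, 120, 200):
    print(f"{fps:>3}fps -> {sync_mode(fps)}")
```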
Whilst FreeSync is built on the Adaptive-Sync standard, certain display quality thresholds have to be met before a manufacturer can use the branding. This is simply to ensure consumers get a certain standard of display when they buy FreeSync. You can read more about what separates FreeSync from Adaptive-Sync. It works over both DisplayPort and HDMI cables.
At this point I must confess that I invested in a FreeSync monitor. I was looking for a reasonably priced 144Hz 24″ 1080p screen and the cost of FreeSync vs G-Sync was enough to push me to the AMD solution. G-Sync screens are, at the time of writing, considerably more expensive, and whilst G-Sync is arguably the better solution the additional cost didn’t seem worth it to me. You may well feel differently.
Your further questions may well be dealt with in AMD’s FAQ.
FreeSync 2
In January 2017 AMD announced FreeSync 2. The original FreeSync standard has been widely adopted and can be found in monitors from over 20 manufacturers, in a multitude of different models, giving smooth, stutter-free gameplay to millions of gamers around the world. The purpose of the new technology is to tackle a number of other headaches that trouble gamers.
The first point to note is that FreeSync 2 isn’t exactly a replacement. You’ll still see FreeSync screens being released so think of this new development as a new set of standards that have to be met to allow you to place the FreeSync 2 badge on your hardware box. The new tech is designed to get the best images for HDR (high dynamic range) monitors with the lowest input lag.
To output images to HDR screens you typically need to do two passes to map colour tones to a scene. This technology will allow games to talk directly to AMD’s FreeSync 2 API to get the native characteristics of your monitor. With this information the game can avoid the second pass by mapping to your monitor in the first phase. You get the best image quality but also reduce rendering time, cutting lag.
In addition, the new standard makes low framerate compensation technology mandatory, which helps keep gameplay smooth when your GPU is struggling. This tech is available on the existing FreeSync standard but is not required there. It helps AMD keep up with G-Sync, which has never had a limitation at low frame rates.
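The idea behind low framerate compensation is straightforward: if the game drops below the bottom of the VRR window, show each frame two or more times so the effective refresh rate lands back inside it. Here’s a small sketch of that logic, again assuming a 48-144Hz window.

```python
# Sketch of the low framerate compensation (LFC) idea, assuming a 48-144Hz
# VRR window. When the frame rate drops below the window, each frame is
# repeated so the effective refresh rate lands back inside it.

def lfc_refresh(fps: float, vrr_min: float = 48, vrr_max: float = 144):
    """Return (repeat count, effective refresh rate) for a given frame rate."""
    if fps >= vrr_min:
        return 1, fps                            # already inside the window
    repeats = 1
    while fps * repeats < vrr_min and fps * (repeats + 1) <= vrr_max:
        repeats += 1
    return repeats, fps * repeats

for fps in (25, 35, 60):
    n, refresh = lfc_refresh(fps)
    print(f"{fps}fps -> show each frame {n}x, monitor runs at {refresh:.0f}Hz")
```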
FreeSync 2 monitors will be high-end affairs; it is likely that the cheaper end of the market will not chase full certification. AMD will also need to convince game developers to support the new technology, since it has to be baked into the game itself, although it is thought this shouldn’t be too onerous.
We’ll enter all the new FreeSync 2 compatible monitors into our database as they are released, we expect to see the first models in Q2 2017 along with the release of the new Vega range of graphics cards from AMD.
G-Sync
G-Sync is a technology from graphics card manufacturer Nvidia. It is a proprietary system, and to make use of it both your graphics card and monitor must contain the right hardware. Compatible graphics cards include the latest range from Nvidia and older cards like the entry-level GTX 950, but you’ll want to check your specific model number carefully to be sure. I would expect future generation cards to include the technology too, although there is always a chance Nvidia could adopt the VESA Adaptive-Sync standard.

G-Sync has a variable refresh rate window that spans the entire range of the monitor. So if you have a 144Hz screen you can be assured G-Sync has your back all the way from 1 to 144 frames per second. This is superior to the FreeSync system, which only operates within a limited window.
G-Sync works in both full-screen and windowed modes (although I hear FreeSync will allow this in the future) and can work with DisplayPort and second generation HDMI. If a monitor is G-Sync certified, that doesn’t just mean it contains the required chip; it also has to meet certain other standards, such as limited ghosting. Think of it as a seal of approval.
You can find out more in Nvidia’s FAQ.
Fast-Sync
The Pascal generation of Nvidia cards, the 10 series, includes Fast-Sync. This is a technology we’ve been waiting on for years! It’s another approach to eliminating screen tearing, but this time you don’t need any special tech in your monitor to make it work. You get tear-free output with V-Sync off, at the cost of only a small amount of input lag – nothing that should spoil your experience.
It only really kicks in for games where the frame rate massively exceeds the monitor’s refresh rate. Imagine playing a game like CS:GO at 300fps on a 60Hz screen. The GPU will continue to produce 300fps, but some technical wizardry ensures that only 60 complete frames are displayed on the monitor each second. This gives the effect of tear-free gaming but massively reduces the lag introduced by approaches like V-Sync, as the game engine is still ticking over at 300fps. Watch this video for more information.
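Here’s a simplified way to picture it; this is a sketch of the idea rather than Nvidia’s actual implementation, and the 300fps and 60Hz figures are taken from the example above.

```python
# Simplified illustration of the Fast Sync idea, not Nvidia's implementation.
# Assumptions: the game renders at a steady 300fps on a 60Hz monitor, and on
# each refresh only the most recently completed frame is scanned out.

GPU_FPS = 300        # assumed game frame rate
REFRESH_HZ = 60      # assumed monitor refresh rate

for n in range(1, 6):                               # first five refreshes
    frames_done = (n * GPU_FPS) // REFRESH_HZ       # frames finished by refresh n
    print(f"refresh {n}: GPU has completed {frames_done} frames; "
          f"only the latest (#{frames_done}) reaches the screen intact")
```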
This makes it great for competitive games with low system requirements (CS:GO, LOL, DOTA2, and the like).
Which can I use?
Not all these technologies are available on older cards, so do check carefully before making a purchase, but in general:
- AMD card or APU? You can use FreeSync, Adaptive-Sync, V-Sync
- Nvidia? G-Sync, V-Sync, Fast-Sync, and who knows, maybe Adaptive-Sync in the future
- Intel processor with iGPU? V-Sync and Adaptive-Sync if your CPU supports it
So which is the best out of G-Sync vs FreeSync?
At the time of writing, the G-Sync setup is slightly ahead in terms of technology and what it can do but there is, ironically, a performance cost to running with G-Sync – thought to be around 2%. FreeSync only operates within a specific window of refresh rates.
But FreeSync also has the distinct advantage of being more open and is now part of the VESA DisplayPort standard, which makes it cheaper! Historically it is often the more widely available, non-proprietary technology that becomes the standard. So we have to wonder if G-Sync will die out in a few years; after all, Nvidia is free to adopt the VESA Adaptive-Sync standard. It seems inconceivable that AMD would ever use G-Sync if it is going to add a premium price to their cards; they struggle to compete with Nvidia as it is. Not that Nvidia would license it anyway.
The upshot is that FreeSync monitors tend to be significantly cheaper, and this alone was enough to push me over to AMD when I last upgraded my monitors – even though I currently own an Nvidia graphics card! Which of course means my next graphics card will be a Radeon!
In a way I hope that the FreeSync system wins out so that we aren’t tied to one brand or another; it would be a shame to have our choice limited in this way. If I spend a large amount on a top-of-the-line gaming monitor then I want to keep it for a number of years to come, and I want to be able to choose the most cost-effective graphics card available at the time.
I suspect most people’s choice will be determined by the card they have in the system and how much they are willing to spend when they go to buy their next monitor. The two technologies essentially do the same thing so perhaps it is not something to get hung up about right now, but it is worth thinking through whether you want to be stuck on team green or team red for a few years to come.
Looking for a G-Sync or FreeSync monitor? Click through to see what’s available: