
FreeSync vs G-Sync



The FreeSync vs G-Sync debate has been raging for a while now: even the best gaming monitors are split in their support, and the rivalry between AMD's FreeSync and Nvidia's G-Sync pushes both manufacturers to add more and better features to their monitors and graphics cards.

Still, you can settle the G-Sync vs FreeSync question for yourself by working out what you want and what suits you best in terms of budget, needs, compatibility, and support, as well as the presence or absence of issues such as ghosting.

Before we can get into a FreeSync vs G-Sync discussion, though, we need to establish why the comparison exists in the first place. So let's lay a foundation: what is FreeSync, and what is G-Sync?

Why FreeSync And G-Sync Matter

Both FreeSync and G-Sync are branded implementations of the same underlying technology: adaptive sync, which matches your monitor's refresh behaviour to the output of your graphics card.

Adaptive sync synchronizes the screen's refresh rate to the rate at which the graphics card renders frames. Because each frame is rendered individually, that rate depends heavily on the card's processing power and fluctuates from moment to moment. Without synchronization, the card can send a new frame while the monitor is still drawing the previous one, so the screen ends up showing parts of two different frames at once.

The result is the visible seam known as a tearing artifact, and adaptive sync is the solution. Nvidia GPUs implement the technology as G-Sync and AMD cards as FreeSync; in both cases the monitor draws every frame completely before the card sends the next one.
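To make this concrete, here is a small toy simulation in Python (with made-up frame times and blanking window, not real driver or panel code) that contrasts a fixed 60Hz refresh with an adaptive one: at a fixed rate, frames routinely finish mid-scan and can tear, while an adaptive display simply waits for each completed frame.

# Toy model only: compare a fixed 60 Hz refresh with an adaptive refresh.
# The frame times and blanking window are assumptions made for illustration.
import random

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ   # seconds between fixed refreshes
BLANKING_WINDOW = 0.0005              # assumed vertical blanking interval (s)

def frame_completion_times(n, fps_low=45, fps_high=75, seed=1):
    """Completion time of each GPU frame at a fluctuating frame rate."""
    random.seed(seed)
    t, times = 0.0, []
    for _ in range(n):
        t += 1.0 / random.uniform(fps_low, fps_high)  # variable render time
        times.append(t)
    return times

frames = frame_completion_times(600)

# Fixed refresh: a frame finishing outside the blanking window arrives while
# the screen is mid-scan, so the display shows parts of two frames (a tear).
possible_tears = sum(1 for t in frames if (t % REFRESH_INTERVAL) > BLANKING_WINDOW)
print(f"Fixed 60 Hz: {possible_tears} of {len(frames)} frames could tear")

# Adaptive sync: the display starts its refresh only when a frame is ready,
# so every frame is drawn whole and tearing cannot happen.
print("Adaptive sync: 0 torn frames")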

G-Sync: Pros And Cons

G-Sync, first released in 2013, has a first-mover advantage. It built its popularity on adaptive sync technology, not only putting the importance of the idea in front of the public but also providing a smooth implementation of the concept.

Before G-Sync, monitor refresh rates were traditionally "ahead" of the GPU: the display was ready to refresh before the card could deliver a new frame. At best, this results in a constant game of catch-up between the display and the graphics card.

As mentioned before, tearing and its associated problems are thus eliminated. G-Sync achieves this by manipulating the VBI, the "vertical blanking interval": the gap between the monitor finishing one frame and starting the next. Enabling G-Sync essentially makes the display wait in that gap until the GPU has a complete frame ready, so no frame is sent before its time.
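As a rough sketch of that "hold off" behaviour (an illustration of the concept only, not Nvidia's actual driver logic, and with an assumed panel limit), the presentation step waits if a frame arrives faster than the panel's maximum refresh rate allows, and otherwise scans the frame out the moment it's ready:

# Concept sketch: the panel idles in its blanking interval until a frame is
# ready, and a frame that arrives too soon is held until the panel can refresh.
# MAX_REFRESH_HZ is an assumed panel limit, not a quote from any spec sheet.
import time

MAX_REFRESH_HZ = 144
MIN_FRAME_TIME = 1.0 / MAX_REFRESH_HZ  # the panel cannot refresh faster than this

def present(frame, last_scanout):
    """Hold the finished frame until the panel may refresh, then scan it out."""
    elapsed = time.perf_counter() - last_scanout
    if elapsed < MIN_FRAME_TIME:
        time.sleep(MIN_FRAME_TIME - elapsed)  # GPU holds off; panel stays blanked
    # ...scan-out would happen here, so the whole frame is drawn before the next
    return time.perf_counter()

last = time.perf_counter()
for frame_id in range(3):
    time.sleep(0.004)                 # pretend rendering took ~4 ms
    last = present(frame_id, last)
    print(f"frame {frame_id} scanned out at t={last:.4f}")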

The most relevant current version of the technology, G-Sync Ultimate, supports both HDR (high dynamic range) and 4K displays at 144Hz.
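For a sense of why those numbers are demanding, here is a quick back-of-the-envelope calculation of the raw pixel bandwidth at 4K, 144Hz, and 10-bit HDR colour (it ignores blanking overhead and compression, so treat it only as a rough figure):

# Rough bandwidth estimate for the 4K / 144 Hz / HDR figures quoted above.
# Raw pixel data only; real links add overhead and often use compression.
width, height = 3840, 2160   # 4K UHD
refresh_hz = 144
bits_per_pixel = 30          # 10 bits per colour channel for HDR

gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{gbps:.1f} Gbit/s of raw pixel data")  # roughly 35.8 Gbit/s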

However, G-Sync has caught flak for its proprietary nature as well as its steep prices: because both the monitor and the graphics card need to support G-Sync, the whole setup becomes a pricey premium package, often adding upwards of 200 dollars.

FreeSync: Pros And Cons

FreeSync appeared on the scene roughly two years after G-Sync. The name tells us a lot about why it endured despite being a late entrant: it's an open standard, "free" from the perspective of developers and manufacturers, meaning nobody has to pay royalties to AMD to use it in their equipment.

Because FreeSync builds on the adaptive sync capability baked into the DisplayPort 1.2a standard, any screen using that input – which means pretty much all of them except VGA, DVI, and similar legacy connections – can be made compatible with it.

All of this ultimately means that FreeSync is much, much cheaper than G-Sync and far more widely compatible (it can even work with certain Nvidia hardware in the hands of the right people).

Additionally, in more recent years, certain issues with FreeSync – too little power leaving pixels faint or blank, too much power causing ghosting – were addressed with the updated FreeSync 2 HDR, bringing it pretty much on par with G-Sync.
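That trade-off can be illustrated with a toy pixel-overdrive model (purely illustrative numbers, not any real panel's measured response): too weak a drive leaves the pixel short of its target so the old image lingers, while too strong a drive overshoots and produces inverse ghosting.

# Toy LCD overdrive model: purely illustrative, not measured panel behaviour.
def pixel_response(target, drive, steps=10, rate=0.35, start=0.0):
    """Step a pixel toward `target` while it is driven toward target * drive."""
    level = start
    for _ in range(steps):
        level += rate * (target * drive - level)  # simple first-order response
    return level

for drive in (0.8, 1.0, 1.3):  # too little, just right, too much
    final = pixel_response(target=1.0, drive=drive)
    print(f"drive {drive:.1f}: pixel settles near {final:.2f} (target 1.00)")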

Conclusion: AMD FreeSync vs Nvidia G-Sync

Ultimately, there's little question that G-Sync is superior on the hardware front… but is that enough to warrant what many see as a monumental difference in price? You decide: FreeSync-equipped devices are much cheaper than their G-Sync counterparts, for the reasons given above. Even on a monitor with the ghosting issue just described, power management can be tweaked to eliminate the imprecision that causes it.

About author

A finance major with a passion for all things tech, Uneeb loves to write about everything from hardware to games (his favorite genre being FPS). When not writing, he can be seen in his natural habitat reading, studying investments, or watching Formula 1.