Nvidia And AMD’s Nerdiest Fight Is The One That Matters Most

At CES earlier this month, Nvidia and AMD traded words, with Nvidia CEO Jensen Huang saying his GPUs would “crush” AMD’s newly announced Radeon VII. AMD CEO Lisa Su was more measured in her response, but she didn’t pass up the opportunity to take a dig at Nvidia, noting that when her company finally adopts ray tracing, people will actually know what it is.

But let’s set aside their back and forth on ray tracing and GPUs and home in on another tiff between the companies, this one about monitors. If you’re buying one for gaming in the next few months, you need to pay attention.

The two companies are having a minor slap fight over monitors that support variable refresh rate, or VRR. What is VRR? The display on a monitor, be it one for gaming, productivity, or even just a TV, is always refreshing. A 60Hz display refreshes the image 60 times a second, while a 144Hz display refreshes 144 times a second. When the display refreshes, it refreshes from top to bottom, something you can see if you slow the image waaaaaay down.

Your computer isn’t beholden to the refresh rate of your monitor, though. It might spit out 120 frames per second when your monitor only handles 60. This leads to screen tearing: your display starts drawing an image, realises the computer is already several frames ahead, and skips ahead to catch up, leaving what look like weird tears across the screen. Not the end of the world, but not attractive!
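
To make that mismatch concrete, here’s a minimal Python sketch of a 60Hz scanout being fed by a GPU rendering at a steady 120fps. The numbers and the helper are my own illustration, not anything from Nvidia or AMD.

```python
# A minimal sketch, assuming a 60Hz monitor and a GPU rendering a
# steady 120fps. Times are in milliseconds; the helper is illustrative.

REFRESH_MS = 1000 / 60   # one top-to-bottom refresh takes ~16.7ms
FRAME_MS = 1000 / 120    # the GPU finishes a new frame every ~8.3ms

def frames_during_one_refresh(scan_start_ms):
    """GPU frame numbers that arrive while a single refresh is still
    being drawn from top to bottom."""
    scan_end_ms = scan_start_ms + REFRESH_MS
    first = int(scan_start_ms // FRAME_MS)
    last = int(scan_end_ms // FRAME_MS)
    return list(range(first, last + 1))

# Multiple frames land inside one refresh, so the top of the screen can
# show an older frame than the bottom. That visible seam is the tear.
print(frames_during_one_refresh(0.0))  # [0, 1, 2]
```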

A monitor with a variable refresh rate is more nimble. It knows how many images the GPU is drawing every second and refreshes at the right rate to reduce tears, making things smoother. The technology in practice is really pretty, but AMD and Nvidia employ it differently. Nvidia’s G-Sync requires proprietary tech built into the display, in this case an Nvidia-created display scaler, which communicates quickly with the GPU and should theoretically provide a smoother experience.

It comes in two versions: regular G-Sync, which gives you good peak brightness, a decent backlight system, and solid colours, and G-Sync Ultimate HDR, which requires the display to have a higher peak brightness (1,000 nits), a better backlight system (full array), and a wider range of colours (the DCI-P3 colour gamut).

AMD’s FreeSync is based on VESA’s Adaptive Sync technology, a different kind of variable refresh rate tech that should work with more monitors and GPUs because no custom scaler is required. FreeSync is an offshoot of Adaptive Sync, much like Intel’s promised Adaptive Sync software will be. So all FreeSync monitors are Adaptive Sync monitors, but not all Adaptive Sync monitors support FreeSync. Because FreeSync is open source and doesn’t require special hardware beyond a DisplayPort 1.2a port, FreeSync monitors are a lot easier and cheaper to make than G-Sync ones.

Which is why, despite 74.18 per cent of Steam users gaming on Nvidia GPUs, the majority of monitors being made and sold continue to be FreeSync, not G-Sync. When I looked on Amazon on January 24, 2019, the second bestselling monitor supported FreeSync, while the bestselling G-Sync monitor was ranked down at 34, and NPD hardware analyst Stephen Baker told me that there were “3 to 4 times as many” FreeSync monitors selling versus G-Sync.

But according to Nvidia, those sales come at a significant cost to quality. Nvidia is so sure FreeSync sucks that it spent a good portion of its CES 2019 press conference ridiculing it. CEO Jensen Huang claimed only 12 out of 400 Adaptive Sync monitor models tested actually worked, and while he didn’t cite specifics, a big chunk of those Adaptive Sync monitors are FreeSync.

Later, in a smaller press conference I attended, he said, “most of the FreeSync monitors do not work.” He went on to claim, “They don’t even work with AMD’s graphics cards, because nobody tested it. And we think that is a terrible idea to let a customer buy something believing the promise of that product and have it not work.”

At a roundtable immediately following Nvidia’s press event, AMD CEO Lisa Su denied that claim. “I don’t believe we’ve seen that,” she said. And the next day AMD Director of Product Marketing Sasa Marinkovic went a step further, questioning whether Nvidia had actually tested every monitor it claimed to have tested. “Prove it,” he told me in an interview.

It sort of feels like everyone is pointing fingers at one another. So who do you trust? It’s probably safe to question both companies’ boldest claims. Remember, big for-profit corporations are not your friends. Huang, when asked about Nvidia’s decision to support Adaptive Sync, pointed to quality control issues with the current Adaptive Sync monitors on the market, while NPD’s Stephen Baker suggested the real reason is that Nvidia isn’t competitive enough in the monitor space with G-Sync alone.

Nvidia is getting creamed by AMD, which has a lot more monitors on the market and is working with a lot more companies to make them. Nvidia needs to compete on its rival’s level. “[I]t’s really about cost,” Baker told me. “People may argue about technology or whatever else, but as long as there’s a significant incremental cost to G-Sync, it’s gonna be a tough sell.”

So Nvidia’s gotten on AMD’s level, and its new G-Sync Compatible monitors toss out the cool Nvidia scaler and take a cue from AMD, supporting Adaptive Sync through software exclusively. Vijay Sharma, Product Manager at Nvidia and head of G-Sync for the company, told me to think of all the different standards as something like a family tree.

At the top is the concept of variable refresh rate. From there spring three types of technology that use VRR: G-Sync, with its custom scaler; Adaptive Sync, which relies on software and the DisplayPort 1.2a standard; and HDMI VRR, an HDMI version of variable refresh rate that’s slowly getting support in TVs from makers like Samsung.

These technologies then branch out. From G-Sync springs regular G-Sync and G-Sync Ultimate HDR. From Adaptive Sync springs FreeSync, FreeSync 2 HDR (more on that in a moment), Intel’s future Adaptive Sync support, and the new G-Sync Compatible standard. From HDMI VRR springs, well, not a lot: just FreeSync on a select group of Samsung TVs, and only when paired with an Xbox One X or Xbox One S.
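
Spelled out as a data structure, the tree looks something like this. The grouping below is just my reading of Sharma’s description; the names are the vendors’ own.

```python
# The VRR family tree described above, sketched as a nested Python dict.
VRR_FAMILY = {
    "variable refresh rate": {
        "G-Sync (custom Nvidia scaler)": [
            "G-Sync",
            "G-Sync Ultimate HDR",
        ],
        "Adaptive Sync (software + DisplayPort 1.2a)": [
            "FreeSync",
            "FreeSync 2 HDR",
            "Intel's future Adaptive Sync support",
            "G-Sync Compatible",
        ],
        "HDMI VRR": [
            "FreeSync over HDMI on select Samsung TVs (Xbox One S/X only)",
        ],
    },
}

# Print it as an indented tree.
for root, branches in VRR_FAMILY.items():
    print(root)
    for branch, leaves in branches.items():
        print("  " + branch)
        for leaf in leaves:
            print("    " + leaf)
```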

Back to the claims of these companies, particularly Nvidia’s repeated insinuation that AMD is just slapping its name on monitors that don’t work. Is that true?

No. What AMD is saying is that if you get a monitor and it says it works with FreeSync, then it will work with your AMD GPU and give you some kind of variable refresh rate. What kind can vary a lot. A monitor might only do FreeSync when it’s being asked to refresh between 60 and 120 times a second. It might not work below 60, which is when many gamers with cheaper cards would want VRR the most.
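
As a sketch of what that means in practice: given the VRR window a monitor actually supports, you can check whether a given frame rate benefits at all. The helper and the 60-120Hz window below are hypothetical, not from AMD’s spec.

```python
# Hypothetical check of whether a frame rate falls inside a monitor's
# supported VRR window. The 60-120Hz window mirrors the example above.

def vrr_helps(fps, low_hz, high_hz):
    """True if the monitor can sync its refresh rate to this frame rate."""
    return low_hz <= fps <= high_hz

print(vrr_helps(90, 60, 120))   # True: inside the window
print(vrr_helps(45, 60, 120))   # False: below 60fps, exactly where a
                                # cheaper card needs VRR the most
```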

The original open source FreeSync standard leaves a lot of wiggle room. Even worse, a lot of those FreeSync monitors ship with FreeSync turned off. Including the one on my desk! I didn’t even realise it had FreeSync disabled by default until I started working on this piece.

If trawling review sites and living in monitor spec sheets is unappealing to you, AMD points to the newer FreeSync 2 HDR, which sets a very specific list of guidelines that a monitor must meet before it can be labelled FreeSync 2 HDR. As with G-Sync Ultimate HDR, those guidelines include support for HDR and a wider colour gamut.

It also includes something called Low Framerate Compensation, or LFC. This is AMD’s way of guaranteeing gamers have a smooth gaming experience even when performance dips. Above I mentioned that not all VRR monitors are created equal, and in some, VRR only works for a limited range of refresh rates.
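
The trick commonly attributed to LFC is frame multiplication: when the frame rate dips below the monitor’s minimum VRR rate, each frame is shown two or more times so the effective refresh rate climbs back into the supported window. Here’s a rough sketch of that idea; it’s an illustration, not AMD’s actual driver logic.

```python
# A rough sketch of frame multiplication, assuming a VRR window given
# as a (low_hz, high_hz) pair. Not AMD's real implementation.

def lfc_refresh(fps, low_hz, high_hz):
    """Pick a refresh rate for this frame rate, multiplying frames when
    the frame rate dips below the monitor's minimum VRR rate."""
    if fps >= low_hz:
        return min(fps, high_hz)       # already in range, nothing to do
    for multiple in range(2, 10):      # show each frame 2x, 3x, ...
        if low_hz <= fps * multiple <= high_hz:
            return fps * multiple
    return None                        # window too narrow for LFC to help

print(lfc_refresh(30, 48, 144))   # 60: each frame is displayed twice
print(lfc_refresh(30, 40, 60))    # 60: doubled right to the ceiling
```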

For a monitor to qualify for FreeSync 2 HDR, it needs a sufficiently wide refresh rate range. According to AMD, when you divide the top refresh rate of a monitor by its minimum refresh rate, the result must equal 2.5 or higher. So a 144Hz display with a low of 48Hz would pass (144 divided by 48 is 3). A 144Hz display with a low of 60Hz would fail (144 divided by 60 is 2.4). Nvidia’s new G-Sync Compatible displays have a similar requirement, with a slightly lower cutoff of 2.4.
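
Worked through in code, the maths looks like this. I’m assuming the ratio is simply the top rate divided by the minimum rate, per AMD’s description above; the 2.5 and 2.4 cutoffs come straight from the two companies’ requirements.

```python
# Eligibility maths from the paragraph above: top refresh rate divided
# by minimum refresh rate, checked against each vendor's cutoff.

def passes(high_hz, low_hz, cutoff):
    return high_hz / low_hz >= cutoff

for low in (48, 60):
    ratio = 144 / low
    print(f"144Hz / {low}Hz = {ratio:.2f} -> "
          f"FreeSync 2 HDR: {'pass' if passes(144, low, 2.5) else 'fail'}, "
          f"G-Sync Compatible: {'pass' if passes(144, low, 2.4) else 'fail'}")

# 144/48 = 3.00 passes both bars; 144/60 = 2.40 misses AMD's 2.5
# requirement but squeaks past Nvidia's 2.4 cutoff.
```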

Unfortunately, display makers don’t just drop the range of refresh rates that VRR runs at onto the spec sheet. That would be too easy! So if you want to know what your monitor, or future monitor, is capable of, you’ll have to check out a chart of supported monitors. Nvidia has one handy for G-Sync Compatible displays here. AMD’s super searchable list of FreeSync monitors, including VRR range, can be found here. Using it, I learned there was a very good reason my 4K FreeSync monitor was so cheap: it only supports VRR from 40Hz to 60Hz.

If you’re planning to buy a monitor and don’t want to find yourself holding something that doesn’t do the cool promised thing well, then your safest bet is to look for monitors labelled as FreeSync 2 HDR or G-Sync Compatible. Otherwise, you should double check the monitor’s range in the links above. Remember, you want a monitor that supports VRR not just at the highest refresh rate, but at the lowest too.

And hopefully, Nvidia and AMD’s pissing match continues to benefit consumers. Right now it has highlighted the flaws in a lot of the VRR displays being sold. More discussion could lead to things like display manufacturers clearly noting the supported variable refresh rate range on spec sheets.

Or it could just make displays cheaper, something Lisa Su insinuated when asked about Nvidia’s adoption of Adaptive Sync. She wasn’t worried about the competition. “We think that just means that it’s better for gamers.”

