Part Of Nvidia’s Pitch: Games Can Get Better Looking Over Time


A shot of Wolfenstein II: The New Colossus using variable rate shading, where the GPU concentrates shading work on the parts of the image that are most prominent to the viewer. Image: Alex Walker/Kotaku

A hundred or so journalists, YouTubers and other tech media had just sat through about three hours of dense presentations. It was the middle of Nvidia's Editor's Day, essentially a day where various Nvidia executives broke down the architecture of their upcoming graphics cards in exhausting detail.

It was gruelling, particularly if you’re not a polymath. But when the crowd broke up a little, and we wandered into an adjacent room to mess with some tech demos in person, a couple of Australians started chatting about some of the techniques that the general gaming populace would start to see in the coming months.

And there’s one technique in particular that could have a particular impact.

There’s been a ton of discussion about ray tracing, both from the theoretical and practical implications once developers get more access to capable hardware and familiarity with the techniques.

But what might impact gamers a lot more, and might see much more use than ray tracing off the bat, is NGX. It's part of Nvidia's Turing platform that incorporates AI and deep learning for various uses. Examples given included cheat detection and facial and character animation, but the most illustrative use case was anti-aliasing.

The basic principle is that Nvidia would generate a reference image (or ground truth, as they called it) and then use their Turing-based supercomputers to train the AI model to replicate it at higher resolutions.
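
To make that concrete, here's a minimal sketch of the idea in PyTorch. This is just the general shape of supervised super-resolution training, not Nvidia's actual pipeline; the network, resolutions and data below are all hypothetical stand-ins.

```python
# A toy super-resolution setup: learn to map a cheap low-res render to the
# painstakingly rendered "ground truth" image. Illustrative only -- this is
# not Nvidia's DLSS network or training code.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a 2x larger image
        )

    def forward(self, x):
        return self.net(x)

model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()

# One training step. In the real thing, low_res would be a frame rendered at
# reduced resolution and ground_truth the reference image Nvidia described;
# here both are random tensors.
low_res = torch.rand(1, 3, 540, 960)         # e.g. a 960x540 render
ground_truth = torch.rand(1, 3, 1080, 1920)  # the 1080p reference

optimizer.zero_grad()
loss = loss_fn(model(low_res), ground_truth)
loss.backward()   # nudge the network towards reproducing the ground truth
optimizer.step()
```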

There’s multiple purposes for this — another example given was inpainting, where deep learning was used to touch up images or remove unsightly parts from photos. It’s much like the spot healing tool already in Photoshop, but substantially faster:

Where it gets cool from a gaming perspective is anti-aliasing (AA). Two of the more common higher-end AA techniques these days are multisample anti-aliasing (MSAA) and temporal anti-aliasing (TAA).

Both techniques come at a cost. They're more accurate than FXAA (an Nvidia-developed algorithm that's now one of the lower-end settings for resolving jaggies in games), but they hit the hardware in different ways. And they have their own problems, ranging from ghosting and flickering to blurry-looking images in static scenes.

This slide is the perfect example of the kind of detail that current high-end AA techniques — in this case, temporal anti-aliasing, one of the most hardware-intensive methods — don't quite nail.

Zoomed out, both images look pretty good. But when you zoom in on the finer detail, you’ll notice all sorts of blurred edges, artifacting, and details that are just wrong. Anything displaying text in the distance is almost guaranteed to be buggered, but it’s the kind of detail that deep learning is incredibly effective at rendering properly.

On the 1080 Ti, which runs on the older Pascal architecture, you can see the edges of the minigun are pretty flawed. They're not round, and in some cases it looks like part of the barrel has actually rotted away.

It happens because, as Nvidia's Tony Tamasi pointed out, temporal AA can get confused. The technique relies on data from previous frames to make the best estimate possible of what the image should look like going forward. But if something is moving fast, or you've got text far away in the background, you're going to run into issues.
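
A rough sketch of that failure mode, in Python: TAA keeps a running history of previous frames and blends each new frame into it, but wherever reprojection fails (fast motion, newly revealed geometry) the history has to be rejected or it smears. The blend factor and mask below are illustrative, not taken from any shipping implementation.

```python
# Why temporal AA "gets confused": it leans on history that can go stale.
import numpy as np

def taa_resolve(current, history, history_valid, alpha=0.1):
    """Blend the current frame with reprojected history.

    current:       this frame's aliased colour buffer
    history:       the accumulated result from previous frames
    history_valid: per-pixel mask, False where reprojection failed
    """
    blended = alpha * current + (1.0 - alpha) * history
    # Where the history can't be trusted, fall back to the raw current frame.
    # Reject too aggressively and the aliasing comes back; trust stale history
    # and you get ghosting and smeared detail, like that distant text.
    return np.where(history_valid, blended, current)

# Toy usage with random 4x4 greyscale "frames":
rng = np.random.default_rng(0)
current = rng.random((4, 4))
history = rng.random((4, 4))
history_valid = rng.random((4, 4)) > 0.2  # 20% of pixels failed reprojection

history = taa_resolve(current, history, history_valid)
```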

Having a neural network constantly reference an established image, particularly where there are fine, sharp details like lettering, helps to correct that problem. And the most important element: DLSS (deep learning super sampling, the name Nvidia has given this technique) isn't any more resource-intensive than what's currently available.
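
That cost claim makes sense when you consider what actually runs on the player's machine. The expensive part, the training, happens on Nvidia's hardware ahead of time; per frame, the game only does a single forward pass through the trained network. A self-contained sketch, with a hypothetical architecture and the weight-loading step stubbed out:

```python
# Inference sketch: at runtime, DLSS-style upscaling is one forward pass
# through an already-trained network. Architecture and weights are hypothetical.
import torch
import torch.nn as nn

scale = 2
model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
    nn.PixelShuffle(scale),
)
# model.load_state_dict(torch.load("upscaler_weights.pt"))  # hypothetical file
model.eval()

with torch.no_grad():                   # inference only, no gradient bookkeeping
    frame = torch.rand(1, 3, 540, 960)  # stand-in for the game's low-res render
    upscaled = model(frame)             # one forward pass per frame

print(upscaled.shape)  # torch.Size([1, 3, 1080, 1920])
```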

Put simply: if you’ve got an RTX GPU, and a game comes or is patched with support for DLSS, you’d be silly not to use it. But therein lies the problem.

A live demo of Wolfenstein II: The New Colossus using variable rate shading.

Only a handful of games will support DLSS by the end of the year, let alone by the time the full RTX range of cards is available (this week for the RTX 2080, next week for the RTX 2080 Ti). The list does include titles like Hellblade: Senua's Sacrifice, PUBG, Islands of Nyne, We Happy Few, Hitman 2 and Final Fantasy XV.

But for the most part, games will need to patch in DLSS support over the coming year. And some major titles using RTX tech won't support DLSS initially: Battlefield 5 will ship with ray tracing support but not DLSS, and Metro Exodus is in the same boat.

Related: All The Games That Will Use Nvidia's RTX Tech (So Far)
https://www.kotaku.com.au/2018/08/all-the-games-that-will-use-nvidias-rtx-tech-so-far/

However, that’s also where a neat little opportunity opens up.

For any of the NGX-based features to work, games will have to implement part of Nvidia's NGX API. Apart from users having to own RTX cards (the neural network runs on tensor core hardware that's only available on those cards), developers will have to work with Nvidia to patch in support at the driver level.

So, much like what happened with Nvidia's Ansel (although hopefully with a much greater adoption rate), what you'll end up seeing is new AA techniques popping up in games as support gets patched in.

Most users will get this through GeForce Experience (GFE), although the middleware isn't necessary for NGX to function. How often the patches will be rolled out outside of GFE is another matter. Nvidia have spent the last couple of years incentivising sign-ups for GFE, and more frequent driver updates are a logical way of doing that.

Precisely how much work it'll take to implement NGX in existing games wasn't explained, although Nvidia are working with engine makers (Unity, Epic and so on) to incorporate the NGX API at the engine level.

Of course, there’s a catch.

Nvidia explained in a follow-up Q&A that games that don't take advantage of ray tracing won't use the RT cores on the RTX GPUs. The RTX 20 series cards are still an upgrade on the 10 series hardware, but it's a chicken-and-egg scenario: without the hardware, studios have no incentive to develop for future-facing techniques like ray tracing. And without games that support the fancy new features, why fork out a premium to be an early adopter?

Someone has to bite the bullet to move the industry forward, though. And it's worth remembering that paying $1200 for an RTX 2080 only seems absurd if you paid $1200 for a Founders Edition GTX 1080 at launch, bought a GTX 1080 Ti after it launched, or upgraded your GPU at some point in the last few years. That price looks a lot different to someone who's been keeping an R9 390X going, or a creaking SLI rig from the Maxwell era.

The benefit for someone upgrading from that far back is liable to be substantial with or without ray tracing and the rest of the new features. It's those gamers, as well as the people with more money than sense, who are likely to be the first adopters.

And that crowd is going to be introduced to something neat: driver updates that, eventually, will make their games look better. There’s still plenty of ifs and buts about this, of course: while Nvidia has seeded a range of developers with RTX hardware to get started, most of the games that take advantage of the new toys won’t land until next year.

And then you’ve got the wealth of games already out, ones that RTX 2080 Ti and RTX 2080 owners will inevitably want to replay at 4K with the highest settings — when will those games take advantage of raytracing, AI-powered anti-aliasing, neural network enhanced upscaling?

But whether it’s curiosity, an eye for business or a natural development decision, some developers will jump on board. And the gamers who go on that path will get an intriguing experience over the next year. I suspect the second generation of adopters will be the ones who truly get the generational leap that’s being envisioned with Nvidia’s new architecture, but as before, new techniques and tools don’t move forward by themselves.

Nvidia aren’t alone, though. And despite the incremental and monumental changes planned and proposed, the most interesting question for all will probably be this: what will AMD do?

