Why Ray Tracing On Nvidia’s New GPUs Is So Exciting

Nvidia recently announced new high-end graphics cards with an RTX designation attached to the model numbers—that RTX indicates the card’s enhanced ray tracing abilities, which could bring a whole new level of realism to games. Here’s what you need to know about ray tracing and how it’s going to feature on the graphics cards of the future.

The name gives away what ray tracing actually is: determining the paths that rays of light take through a virtual environment, so that the environment looks as realistic as possible. Working out how light should fall in a scene requires a lot of computing power, and even more once objects and light sources start moving (as they tend to do in games and movies).

Ray tracing is nothing new. Coders have been experimenting with it since the very first days of computer graphics. What has changed over time is how realistic and detailed ray tracing can be, and how fast it can be computed, and with Nvidia’s new cards the technology is taking another jump forward.

How ray tracing works

Image: Nvidia

Ray tracing attempts to work out the path of light by imagining the eye looking at a scene and working backwards to the light source. The colour of each pixel must be figured out based on the objects in a scene, their relationship to each other, and the number, type, and position of the scene’s light sources—not a straightforward set of calculations at all.

The way that light falls on a black cloth is different to how it falls on a chrome sphere, of course, and ray tracing algorithms need to take into account all these properties for every object in view, and every light source hitting them. Light bounces and reflects too, making the process of figuring out the colours of a few million pixels even harder.
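To make the backwards-from-the-eye idea concrete, here is a minimal sketch in Python. Everything in it (the single-sphere scene, the point light, the function names) is invented for illustration: cast one ray per pixel, find the nearest surface it hits, then measure how directly that surface faces the light.

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def trace(origin, direction, center, radius, light):
    """Follow one eye ray backwards: find the surface it hits, then
    ask how directly that point faces the light source."""
    t = hit_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # background: the ray never reaches the object
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    to_light = [l - h for l, h in zip(light, hit)]
    length = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / length for v in to_light]
    return max(0.0, sum(n * l for n, l in zip(normal, to_light)))

# A ray aimed at a sphere lit from behind the camera is fully bright:
print(trace((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0, (0, 0, 0)))  # 1.0
# A ray aimed past the sphere sees only background:
print(trace((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0, (0, 0, 0)))  # 0.0
```

A real renderer repeats this for millions of pixels and lets each ray spawn further rays at every bounce, which is where the enormous computing cost comes from.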

As a result, it takes a huge amount of processing power. We’ve seen high quality ray traced scenes for years now, but it’s been largely restricted to static images or scenes that can be rendered a long time in advance—movie CGI has been able to deploy ray tracing a long time before video games because huge banks of computers can spend days calculating the physics of a scene.

Pixar even published a paper about how it used ray tracing in the movie Cars. "Adding ray tracing to the renderer enables many additional effects such as accurate reflections, detailed shadows, and ambient occlusion," explains Pixar. (Ambient occlusion is a technique used to better work out where light is blocked in a scene, and how that's represented on screen.)

Image: Nvidia

A lot of the impressive effects that ray tracing brings—reflections, shadows, refractions—already exist in computer games, but the details of these effects are mostly fudged or estimated by graphics artists. Moving to true, real-time ray tracing has been compared to moving from graphics painted by artists to graphics calculated by physics.

Up until now, games have relied solely on a rendering technique called rasterization. Instead of bombarding a scene with millions of rays of virtual light bouncing around in every direction, graphics processors calculate how the millions of triangles that make up 3D models should look when converted to pixels on a flat 2D image, one triangle at a time. Simulated light rays coming straight from a virtual light source influence the colour and brightness of those rendered pixels, but triangles can't affect each other, so shadows, indirect illumination, and reflections all have to be cleverly simulated.

On the best games of today, on the best hardware, it looks fantastic—but it’s still not quite like looking at the real world. Real-time ray tracing promises another step in that direction.

Image: Nvidia

Lighting is particularly hard to get right with rasterization: It’s treated more or less as moving in a straight line, brightening up the sides of objects closest to the light and casting a shadow on the other side, but lighting in the real world doesn’t quite work like that.
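That straight-line model boils down to a single dot product per pixel. The sketch below (a textbook Lambert cosine term, not any particular engine's code) shows both its appeal and its limit: it is cheap, but nothing in it knows whether another object sits between the surface and the light, so shadows and bounced light have to be faked separately.

```python
import math

def lambert_brightness(normal, light_dir):
    """Direct 'straight line' lighting as rasterizers approximate it:
    brightness is the cosine of the angle between the surface normal
    and the direction to the light, clamped so back-facing surfaces
    get zero. No occluders are consulted, hence no real shadows."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

# A surface facing the light head-on is fully lit...
print(lambert_brightness((0, 0, 1), (0, 0, 1)))  # 1.0
# ...one tilted 60 degrees away receives half the light...
print(lambert_brightness((0, 0, 1), (math.sqrt(3) / 2, 0, 0.5)))  # 0.5
# ...and one facing away gets nothing, even if bounced light would
# reach it in the real world.
print(lambert_brightness((0, 0, -1), (0, 0, 1)))  # 0.0
```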

Thanks to the engineers at Nvidia, Microsoft, and others, we’ve now gotten to the stage where ultra-realistic lighting can be mapped out without being pre-rendered. You should see the difference most clearly in complicated lighting setups, where there’s a row of frosted glass doors, or a stained glass window, or a waterfall.

Ray tracing on Nvidia cards

Image: Nvidia

With the switch to the RTX moniker from GTX, Nvidia is making a big noise about the ray tracing capabilities of its new series of cards. Those cards and the Turing GPUs contained within have been engineered especially to cope with the kinds of advanced, demanding calculations ray tracing requires.

If you think about how light travels through a scene—literally at the speed of light—and how shadows, reflections, and refractions are formed (such as when light passes through water), it’s perfectly understandable why consumer graphics cards have taken all this time to get to a stage where real-time ray tracing is possible.

Check out the Shadow of the Tomb Raider demo from Nvidia below, for example. We've seen lighting tricks like this before, but not in the same detail or with the same level of realism: Note how strong backlight is enough to turn objects and people into complete silhouettes, something that needs the power of ray tracing to work properly.

It’s something engineers and researchers have been working towards for half a century, both in computing and movie-making. The best CGI in films now looks like it’s part of the real world, reflections and refractions and all, and now Nvidia’s new cards promise to bring the same realism to games.

“Real-time ray tracing replaces a majority of the techniques used today in standard rendering with realistic optical calculations that replicate the way light behaves in the real world, delivering more lifelike images,” says Nvidia.

Another example shown below comes from an Unreal Engine demo of the technology published back in March. Look at the detail and the quality of the reflections in the scene—suddenly it’s barely distinguishable from live action (though this demo required a hugely powerful PC setup and some background rasterization to work).

The new cards actually split ray tracing up into two components—ray casting (tracking the paths of rays) and shading (determining the final appearance of objects). The RT cores on the new RTX cards are designed to speed up ray casting specifically, so performance increases and visual improvements will vary depending on the geometry of a scene and how much other work there is to do.
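The two stages the article describes can be sketched as two separate functions. This is a hypothetical illustration, not Nvidia's actual pipeline: the first stage is pure geometry (the kind of work RT cores are built to accelerate), while the second decides final appearance and would still run on the ordinary shader cores.

```python
import math

def cast(origin, direction, spheres):
    """Stage 1, ray casting: return (distance, index) of the closest
    sphere hit by a unit-length ray, or None if everything is missed."""
    best = None
    for i, (center, radius) in enumerate(spheres):
        oc = [o - c for o, c in zip(origin, center)]
        b = 2 * sum(d * o for d, o in zip(direction, oc))
        c = sum(o * o for o in oc) - radius * radius
        disc = b * b - 4 * c
        if disc < 0:
            continue  # this ray misses this sphere
        t = (-b - math.sqrt(disc)) / 2
        if t > 0 and (best is None or t < best[0]):
            best = (t, i)
    return best

def shade(hit, colours, background=(0, 0, 0)):
    """Stage 2, shading: turn a hit record into a final colour.
    Here it is just a flat lookup; a real shader would do far more."""
    return background if hit is None else colours[hit[1]]

# A red sphere in front of a blue one: the casting stage finds the
# nearer hit, and the shading stage colours it.
spheres = [((0, 0, 5), 1.0), ((0, 0, 9), 2.0)]
colours = [(255, 0, 0), (0, 0, 255)]
hit = cast((0, 0, 0), (0, 0, 1), spheres)
print(shade(hit, colours))  # (255, 0, 0)
```

Splitting the work this way is why the speed-up varies from scene to scene: a geometry-heavy scene leans on the casting stage, while one with complex materials still spends most of its time in shading.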

As the first consumer graphics cards to include dedicated hardware for ray tracing, the RTX 2000 series represents another milestone in real-time rendering. Of course shadows, reflections and other lighting tricks have been in games for years, but now they don't have to be fudged or approximated: they can be finely calculated.

It is still a nascent technology though, and games will have to be written specifically to take advantage of the RTX hardware. At the moment, the list of games supporting the new standards is relatively thin, but it should grow and grow over time. As we’ve said, the ray tracing revolution won’t start overnight, but it is beginning.