The new graphics API comes with new buzzwords. We’ll tell you what they mean and how they matter to your gaming experience.
Just when you think you’ve grasped all the jargon surrounding 3D graphics, new terms and technologies flood onto the market.
AMD has been aggressively shipping DirectX 11 GPUs in almost every price category, while cards based on Nvidia’s new GTX 470 and GTX 480 DX11 parts are finally becoming available. Meanwhile, Windows 7’s sales ramp has been extraordinary: it’s the fastest-selling Microsoft OS in history. Given that Windows 7 is what Vista should have been, it’s also arguable that DirectX 11 is what DX10 should have been.
When DirectX 10 games hit the streets, the new API gave users marginal improvements in image quality alongside huge performance decreases. The tiny gain in visual fidelity didn’t really make up for the performance hit. On the other hand, DirectX 11 brings users some very cool potential eye-candy improvements, but also promises better performance, even if you don’t have a DirectX 11 GPU.
Along with new graphics APIs come new buzzwords: tessellation, SSAO, HDAO and postprocessing. That last term is a catch-all for many small but cool effects made possible by today’s programmable graphics chips.
We’ll take a closer look at these buzzwords to dissect what they actually deliver, and discuss their performance impact on high-end AMD and Nvidia GPUs.
Tessellation essentially creates something from nothing, or more properly, more from less. Hardware tessellation, which is required by DirectX 11, means that the GPU can generate more triangles from existing geometry using the hardware tessellation engine that’s part of the graphics chip. Now, generating more triangles for a flat surface is pointless; after all, a flat square looks like a flat square, whether it’s two triangles or 2,000. What’s more interesting is generating more triangles for an actual 3D model. Let’s look at a simple example: the cobblestone surface from Microsoft’s DirectX developer’s kit.
In the top-right screen, we have a flat surface that looks somewhat more realistic by the application of a bump map. Bump maps trick you into thinking a flat polygon has depth by modelling the way light falls on a bumpy object (such as cobblestones). However, if you were to bring the camera level with the pavement surface, you’d realise it was actually a flat surface. If the geometry is tessellated, the cobblestones are actually 3D, as seen in the lower-right screen.
The tessellation in the cobblestone image is handled by a technique known as displacement mapping. A displacement map is just a special greyscale texture map in which different shades of grey define how much the geometry is displaced.
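To make the idea concrete, here’s a minimal sketch of what displacement mapping does. The data is invented for illustration (a three-vertex “pavement” rather than the SDK’s cobblestone mesh): each vertex is pushed along its normal by an amount read from a greyscale map, where 0.0 is black and 1.0 is white.

```python
# Minimal displacement-mapping sketch. The grid, heights, and scale factor
# are illustrative, not taken from the DirectX SDK sample.

def displace(vertices, normals, height_map, scale):
    """Offset each (x, y, z) vertex along its normal by map value * scale."""
    out = []
    for (x, y, z), (nx, ny, nz), h in zip(vertices, normals, height_map):
        out.append((x + nx * h * scale,
                    y + ny * h * scale,
                    z + nz * h * scale))
    return out

# A flat 'pavement' facing up the y-axis: whiter pixels become raised stones.
flat = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
up = [(0, 1, 0)] * 3
heights = [0.0, 1.0, 0.5]   # greyscale samples from the displacement map
print(displace(flat, up, heights, 0.2))   # the middle vertex rises the most
```

On real DX11 hardware this happens after the tessellator has subdivided the patch, so there are enough vertices for the height data to shape.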
Cobblestones are nice, but will we ever see differences in real games? Let’s look at the recently released Metro 2033 (below). The left image is the game with tessellation disabled; tessellation is enabled in the right image. Note how the object is more rounded in the second shot. The effect is somewhat subtle here, but the point stands: this is the beginning of the end of polygonal heads. Tessellation means that character heads will someday all be rounder.
In this scene from Metro 2033, you can see how tessellation makes it possible to create curved edges.
Yet another example of tessellation, from the DirectX SDK, shows a technique known as subdivision surfaces (below). The key idea in this technique is to start with a basic set of polygons, then divide them in ways that make sense for the object at hand. In this character model, we overlay the textures on top of the visible wireframe. You can see the additional geometry added in the right-side screen, as well as the more naturalistic, rounded features.
Other Uses for Tessellation
Tessellation is great for creating rounder heads and more realistic cobblestones. But it has other uses, too. Take water, for example. Instead of using pixel shaders to build better-looking water, just add more triangles… a lot more triangles, as in the case of the Nvidia Island demo.
Tessellation makes water appear more real in Nvidia’s Island demo.
In the new racing game Dirt 2, cars driving through water will throw up waves in the DirectX 11 version of the game, using hardware tessellation to generate hundreds of triangles to form the effect. In DX9 mode, you see some spray, but no waves, and the water puddle itself can be as few as two triangles.
Tessellation adds waves and ripples to a scene in Dirt 2.
Tessellation Going Forward
Tessellation offers the promise of better, more realistic-looking 3D objects, but it’s no panacea. As with any new technique, developers will have to be smart about implementing it. It’s easy to use tessellation to create objects that look wrong. On top of that, there’s the performance issue. While modern DirectX 11 GPUs have hardware tessellation engines, resources aren’t infinite. Turn up tessellation too much, and you’ll see a severe performance hit. Game developers will likely use the technology as part of sophisticated LoD (level of detail) schemes where close-up, important objects (characters) are tessellated heavily, while distant or unimportant objects are tessellated less, or not at all.
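A distance-based LoD scheme of this kind can be sketched in a few lines. The names, constants, and linear falloff here are illustrative; real engines compute factors per patch edge in a hull shader.

```python
# Hedged sketch of distance-based tessellation LoD: lots of subdivisions up
# close, none far away. All names and constants are invented for illustration.

def tessellation_factor(distance, max_factor=64, full_detail_at=5.0):
    """Return the number of subdivisions: high up close, 1 (none) far away."""
    factor = max_factor * full_detail_at / max(distance, full_detail_at)
    return max(1, round(factor))

# A nearby character versus a distant prop:
print(tessellation_factor(2.0))     # 64 -> fully tessellated
print(tessellation_factor(500.0))   # 1  -> left as raw geometry
```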
Transparency Antialiasing Not So Special Anymore
Better-quality antialiasing with transparent textures was heavily touted by both Nvidia and AMD just a couple of years ago. Nvidia called this transparency antialiasing, while AMD’s term was adaptive antialiasing. This is a classic case of a feature that improved image quality when it debuted, but isn’t really considered bleeding-edge these days.
The problem lies with the way transparency is handled in many games. Transparent objects are polygons with texture maps applied where some of the texture is transparent. Examples of this are chain-link fences, bare tree limbs, and overhead wires.
Adaptive antialiasing essentially smooths out the edges bordering on the transparent areas within those textures. Think of it as AA inside the polygon.
Without Transparency Antialiasing
With Transparency Antialiasing
Transparency, or adaptive, antialiasing works well when a game supports it (as seen in the screen above), but tessellation could provide a universal substitute.
For transparency AA to work, the game must test for alpha (the transparent part of the texture) and disable alpha blending (where the transparent texture is combined with a background colour to create a new colour). Alpha blending is sometimes used to create translucent (partially transparent) objects.
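The distinction can be sketched in a few lines. Assuming greyscale floats in place of colours and an illustrative 0.5 cut-off, alpha testing makes a binary keep-or-discard decision whose hard edge the AA hardware can then smooth, while alpha blending produces a mix with no edge to antialias:

```python
# Alpha test versus alpha blend, in miniature. Greyscale floats stand in for
# RGB colours; the 0.5 threshold is an illustrative choice.

ALPHA_THRESHOLD = 0.5

def alpha_test(texel_colour, texel_alpha, background):
    """Discard the fragment entirely if it is mostly transparent."""
    return texel_colour if texel_alpha >= ALPHA_THRESHOLD else background

def alpha_blend(texel_colour, texel_alpha, background):
    """Mix the texel with whatever is already behind it."""
    return texel_colour * texel_alpha + background * (1.0 - texel_alpha)

# A 30%-opaque dark texel over a white background:
print(alpha_test(0.0, 0.3, 1.0))    # fully discarded, background shows through
print(alpha_blend(0.0, 0.3, 1.0))   # partially mixed into a light grey
```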
Valve’s Source game engine does this, so if you enable adaptive antialiasing (AMD) or transparent antialiasing (Nvidia) in the graphics control panel, you’ll see the effect, as in the Left 4 Dead screenshot here.
However, many games use alpha blending and other approaches that prevent transparency AA from working. For example, enabling the feature has no effect at all in most games that use the Unreal game engine. Also, technologies like tessellation may eventually make transparency AA obsolete. If those bare tree limbs can be built with polygons representing the limbs themselves, those polygons can be antialiased with standard multisampling AA, and you don’t need to mess around with adaptive AA.
Achieving Great Realism Through Light and Shadows
It’s all about light. Without light, you can’t see. In 3D games, all lighting is created using mathematical cheats-approximations of how real-world lighting behaves. Some of the most interesting lighting effects lie in the absence of light: darkness and shadows.
Shadows have evolved from simplistic shadow maps-where the shadows all looked the same from any angle-to the more sophisticated techniques used in today’s games.
Variations on ambient occlusion are becoming increasingly popular. Ambient occlusion takes into account how light falls on objects to create shadows, and that the properties of light and shadows change over distance. Crysis was one of the first games to attempt a form of ambient occlusion, known as screen space ambient occlusion (SSAO). SSAO techniques try to determine where a point in the scene exists relative to other points, and the effect that light falling onto that point has on other parts of the scene. Objects have reflective properties, and may in turn bounce light to other parts of the scene, even those blocked from the direct light source.
Real-world objects tend to have crevices, wrinkles and depressions, which may not be directly lit by a light source (the sun, for example). But they aren’t dark, either; they pick up light being bounced off other parts of the environment, or even off a nearby surface of the same object that is in direct light. Previous games often ignored this, so crevices and depressions were either completely dark or looked as brightly lit as the other parts of the object.
Other types of ambient occlusion found in newer games include high-definition ambient occlusion (HDAO) and horizon-based ambient occlusion (HBAO). These are still variations on the same idea: that where a pixel exists relative to other pixels determines how light falls on it, how it bounces that light, and what type of light it is (direct or reflected).
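A heavily simplified sketch of that idea: for each pixel, sample nearby values from the depth buffer, and treat neighbours that sit significantly closer to the camera as occluders that darken the pixel. Real implementations sample a 3D hemisphere with clever kernels; the flat 2D offsets and the bias value here are invented for illustration.

```python
# Toy screen-space ambient occlusion: a pixel surrounded by nearer geometry
# (smaller depth values) receives less ambient light. Radius and bias are
# illustrative settings, not from any shipping implementation.

def ssao_factor(depth_buffer, x, y, radius=1, bias=0.05):
    """Return 0..1 ambient light for pixel (x, y): 1 = open, lower = occluded."""
    centre = depth_buffer[y][x]
    occluders, samples = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dx == 0 and dy == 0:
                continue
            sy, sx = y + dy, x + dx
            if 0 <= sy < len(depth_buffer) and 0 <= sx < len(depth_buffer[0]):
                samples += 1
                if depth_buffer[sy][sx] < centre - bias:
                    occluders += 1
    return 1.0 - occluders / samples if samples else 1.0

# A pixel at the bottom of a crevice (depth 1.0) ringed by nearer walls (0.5):
crevice = [[0.5, 0.5, 0.5],
           [0.5, 1.0, 0.5],
           [0.5, 0.5, 0.5]]
print(ssao_factor(crevice, 1, 1))   # fully occluded -> 0.0
print(ssao_factor(crevice, 0, 0))   # open wall top  -> 1.0
```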
Without Ambient Occlusion
With Ambient Occlusion
The circled areas in this DirectX 11 SDK example show how high-definition ambient occlusion (HDAO) produces more realistic details. Notice the increased depth, sharper lines, and greater shadowing.
Above is an HDAO sample from the DirectX SDK. Note the internal shadowing made possible in the lower screen by using this ambient occlusion technique. It is relatively subtle, but the overall scene seems more realistic when you’re running the application in full-screen mode. In the top shot, HDAO is disabled; the bottom one has HDAO enabled.
With DirectX 11, a new technique is emerging called contact hardening. If you think about how real-world shadows behave, you’ll realise that a shadow doesn’t look the same along its full length. Close to the object-say, at the base of a lamp post or tree-the line between shadow and light is sharply delineated (the “hard” in contact hardening). The farther you get from the object, the more diffuse the shadow becomes. That’s because farther away, light seeps into the shadow area from the surrounding environment. Contact hardening shadows using DirectX 11 graphics emulate this look. Right now, the only game using contact hardening shadows is STALKER: Call of Pripyat (below).
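The rule of thumb behind contact hardening can be sketched in one line, in the spirit of percentage-closer soft shadows: the blur radius of the shadow edge grows with the gap between the object casting the shadow (the blocker) and the surface receiving it. The function name, units, and numbers below are illustrative.

```python
# Sketch of the contact-hardening idea: penumbra width (and thus blur radius)
# scales with the receiver-to-blocker gap. Light size and depths are invented.

def penumbra_width(light_size, blocker_depth, receiver_depth):
    """Wider (softer) penumbra the farther the receiver is behind the blocker."""
    return light_size * (receiver_depth - blocker_depth) / blocker_depth

# Shadow of a lamp post: sharp at its base, soft farther away.
base = penumbra_width(light_size=2.0, blocker_depth=10.0, receiver_depth=10.1)
tip = penumbra_width(light_size=2.0, blocker_depth=10.0, receiver_depth=18.0)
print(base, tip)   # a small blur near the object, a much larger one far away
```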
In this DirectX 11 SDK sample, shadows have harder edges near the object and softer edges farther away, as in real life.
Making Graphical Magic After the Image Is Rendered
Postprocessing is where effects are applied to the 3D image after the frame is rendered. The term postprocessing comes from the film industry, where effects are added to movies after the movie is actually shot.
Postprocessing is really a catch-all for special effects that are generated, typically with shader programs, and aren’t necessarily part of an existing graphics API. Of course, the GPU itself needs to be programmable.
Adding effects to rendered frames first began to show up with DX9 games. We’ve seen increasing use of postprocessing effects in DX10 and now, DX11 titles. A wide variety of postprocessing effects are possible; examples include depth of field, heat distortion, wet distortion, bokeh, dynamic blur and film grain.
Some of these effects can be used to add realism to a scene. Heat distortion above a fire or hot desert sand is a good example of that. Other effects actually make the game less realistic, but more cinematic. Examples of these include depth of field, film grain, and bokeh effects. (Bokeh is the blurriness you see in out-of-focus areas of a photograph. The quality of the bokeh is one of the parameters used to rate the quality of a camera lens.)
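A postprocess really is just a function applied to the finished frame. As a minimal illustration, here is film grain in miniature, with greyscale floats standing in for full RGB pixels and an invented grain-strength setting:

```python
import random

# Toy film-grain postprocess: once the frame is fully rendered, perturb each
# pixel slightly and clamp to the displayable range. The strength value is an
# illustrative setting, not from any particular game.

def film_grain(frame, strength=0.08, seed=None):
    """Return a copy of the frame with bounded random noise added per pixel."""
    rng = random.Random(seed)
    return [[min(1.0, max(0.0, p + rng.uniform(-strength, strength)))
             for p in row] for row in frame]

frame = [[0.2, 0.5], [0.8, 1.0]]     # a 2x2 'rendered frame'
grainy = film_grain(frame, seed=42)  # seeded so the grain is repeatable
```

In a real game the same step runs as a pixel shader over the full-resolution frame, but the shape of the operation is identical: read the rendered image, write a tweaked one.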
In Just Cause 2, you’ll see bokeh effects when you use an Nvidia graphics card.
We’ve all seen these effects in movies and television, but rarely experience them in real life. Game developers add these effects to make their games seem more like big-screen movies. This makes sense in some games, like the recently released Just Cause 2, whose over-the-top action emulates big-budget action movies in its overall feel. If you have an Nvidia-based graphics card running with the PC version of Just Cause 2, you’ll see bokeh effects in action.
The use of bokeh helps to focus the player’s attention on whatever is nearby. Clever shader-program writing can give developers granular control over the effect, as we see in the AMD Ladybug depth-of-field demo (downloadable from the AMD developer website). This demo gives the user control over aperture settings, as if they were shooting with a camera. Opening up the aperture results in a soft, blurry background behind the sharply focused subject. Stopping the camera down brings the background into better focus.
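That aperture behaviour follows the thin-lens circle-of-confusion model from photography: a wider aperture (lower f-number) produces a larger blur circle for out-of-focus points. A hedged sketch, with the function name, millimetre units, and numbers all invented for illustration:

```python
# Thin-lens circle-of-confusion sketch, the formula behind camera-style
# depth-of-field controls. All names, units (mm) and values are illustrative.

def blur_radius(f_number, focal_length, focus_dist, subject_dist):
    """Circle-of-confusion diameter for a point at subject_dist (same units)."""
    aperture = focal_length / f_number   # wider aperture = lower f-number
    return (aperture * focal_length * abs(subject_dist - focus_dist)
            / (subject_dist * (focus_dist - focal_length)))

# A 50mm lens focused at 2m; how blurred is a background object at 10m?
print(blur_radius(1.8, 50.0, 2000.0, 10000.0))   # wide open: large blur
print(blur_radius(16.0, 50.0, 2000.0, 10000.0))  # stopped down: small blur
```

A depth-of-field shader uses a value like this per pixel to decide how large a blur kernel to apply.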
The game Borderlands (below) uses depth of field to focus your attention on whatever you’re aiming at with your weapon.
Programmable shaders took a big step forward with DirectX 11. Previously, if a programmer wanted to add multiple effects to a scene, the shader program became large and unwieldy. Now programmers can call shader subroutines as needed, allowing for more efficient shader writing and more efficient use of effects.
The Performance Impact
How does all this graphics goodness affect frame rates?
If we set the wayback machine to a couple of years ago, we’d relive the disappointment we all experienced with the first DirectX 10 games. Visual effects added only marginally to image quality, but the performance hit was huge. What’s the impact of these spiffy new DirectX 11 features? Also, what’s the impact of postprocessing effects? Obviously, adding more shader programs can impact performance, but how much?
We tested performance with hardware tessellation using the Unigine Heaven 2.0 synthetic benchmark (which uses a real game engine) and STALKER: Call of Pripyat. Call of Pripyat was also used to test performance with SSAO, HBAO, and HDAO.
The performance of Nvidia’s GTX 480’s tessellation engine looks pretty awesome relative to AMD’s part in a benchmark like Heaven, but as we can see from Call of Pripyat, the impact of tessellation on real games is less clear. There just aren’t enough titles yet that make heavy use of hardware tessellation to determine which GPU is superior. Subjective experiences differ; Metro 2033’s performance, for example, seems to give the edge to Nvidia, though we don’t have hard numbers to back this up. On the other hand, Aliens vs Predator is a smooth experience on both AMD and Nvidia’s latest cards.
Just Cause 2 supports two interesting GPU postprocessing features if you’re running an Nvidia-based card: bokeh and water simulation. Thus, we tested the GTX 480’s performance with and without those features using the game’s Concrete Jungle built-in benchmark.
As you can see, enabling these features incurs a performance cost. But that cost is a few percentage points, rather than the 75–80 per cent decrease we saw moving from DX9 to DX10.
In the past, comparisons regarding performance versus visual features revolved around antialiasing and anisotropic filtering. DirectX 10 added some new tricks to the game developer’s arsenal, but came with a severe performance penalty. DirectX 11’s new features can affect performance, but the new generation of graphics cards enables you to run with much better visual fidelity while maintaining reasonable performance.
It takes time for developers to take advantage of new features. The good news is that the uptake on DirectX 11–capable GPUs has been one of the most rapid in recent history. We are starting to see increasing use of capabilities that first began showing up with DirectX 9-finally. For example, it’s hard to find a current-generation game that doesn’t take advantage of the postprocessing effects made possible with programmable shaders. Developers continue to experiment with postprocessing effects, as we’ve seen with the bokeh setting in Just Cause 2. And features like film grain and depth of field are commonplace. Newer titles bring new effects, such as emulating colour filters seen in big-budget movies and TV shows.
Good tools will be the key to seeing new features take hold. One reason postprocessing has become so common is that graphics programmers have developed tools similar in concept to Photoshop filters, which allow artists to easily implement effects in the art pipeline. It will be some time before similar tools are readily available for newer, DX11-capable hardware.
Then there’s the multiplatform question. Larger game publishers are leery of pushing high-end, PC-exclusive features if they’re shipping big-budget titles across multiple platforms, including game consoles that may not support tessellation or other features. While the PC has made something of a comeback in the gaming arena, putting additional developer resources into PC-exclusive abilities is still something of an afterthought.
Still, we are seeing new games emerge that take full advantage of new graphics possibilities. Eastern Europe seems to be an emerging haven for bleeding-edge development of PC games, if the STALKER series, Cryostasis, and Metro 2033 are any indication. And even console-oriented titles, like Dirt 2, can be architected to take advantage of new APIs on PCs.
So if you have one of the new generation of DirectX 11 cards, turn up the eye candy and experiment. Your games can look better than ever.