Guide

What Is Ray Tracing? The Latest Gaming Buzzword Explained

Ray tracing, a new technology for creating stunning, lifelike gaming graphics, is coming to a gaming PC or console near you. To make it happen, you’ll need the right hardware. Should you wait for the technology to mature before jumping in, or should you buy now?

Have you noticed your eyes widening more often as you play lately? If so, it could be due to a new technique that’s quietly gaining traction in the gaming world. Ray tracing, which has recently become viable in real time with specialist hardware, brings the latest, most striking visual advancements to PC gaming.

Though ray tracing is most commonly associated with PC gaming, Sony’s PlayStation 5 and Microsoft’s Xbox Series X both have the requisite technology and a growing library of titles that support it. This article explains how ray tracing differs from standard rasterization, why it’s crucial for the future of gaming, and, of course, whether ray tracing should influence your next gaming PC (or gaming laptop) or console purchase.

What is ray tracing?

Simply put, ray tracing is a technique for making light in video games behave as it does in real life. It works by simulating real light rays, using an algorithm to trace the path a beam of light would travel in the physical world. Game designers can use this technology to make virtual light beams appear to bounce off objects, cast realistic shadows, and produce lifelike reflections.

The ray-tracing technique, first described in 1969, has been used in the film industry for years to recreate realistic lighting and shadows. Even today, though, it demands a significant amount of computational power.

“A game needs to compute each frame in 16 milliseconds to run at 60 frames per second or 120 frames per second,” explains Tony Tamasi, vice president of technical marketing at graphics card maker Nvidia. “A normal film frame is pre-rendered, and rendering a single frame can take eight, twelve, or twenty-four hours.”

This renewed interest in ray tracing comes as home gaming technology approaches the point where it can process lighting effects in real time. The graphics chips in the next generation of gaming PCs and consoles should display ray-traced scenes on the fly. When that happens, it could cause a tectonic shift in gaming visuals.

The basics of ray tracing

At its heart, ray tracing is a highly effective technique for illuminating a computer-generated scene. The concept isn’t new, though; what is new is the computing muscle needed to pull it off effectively.

Imagine throwing a light ray at an object and tracing how it bounces off the surface, similar to pointing a flashlight into a dark room. Then imagine firing many rays and analyzing the ones that return (and those that don’t) to determine how the scene should appear. Rays that fail to return, for example, were most likely obstructed by an object, producing a shadow. (Thinking about it in terms of how radar works is an excellent place to start.)
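That shadow test can be sketched in a few lines of code. Below is a minimal, illustrative Python example, not any real engine’s implementation: the sphere “blocker” and all coordinates are invented for the demo, and the intersection test is the standard ray-sphere quadratic.

```python
import math

def blocker_hit(origin, direction, center, radius):
    """Smallest t in (0, 1) where origin + t*direction enters the sphere, else None."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                       # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if 1e-6 < t < 1.0 else None  # t in (0, 1): hit between point and light

def in_shadow(point, light_pos, blocker_center, blocker_radius):
    """Fire a shadow ray from a surface point toward the light; if a blocker
    interrupts it before the light, the point is in shadow."""
    direction = [l - p for l, p in zip(light_pos, point)]
    return blocker_hit(point, direction, blocker_center, blocker_radius) is not None

# A unit sphere at the origin sits between the point and the light: shadow.
print(in_shadow([0, 0, -5], [0, 0, 5], [0, 0, 0], 1.0))  # True
# Move the blocker aside and the light becomes visible again.
print(in_shadow([0, 0, -5], [0, 0, 5], [3, 0, 0], 1.0))  # False
```

A real renderer fires a shadow ray like this toward every light from every visible surface point, millions of tests per frame.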

This simple explanation demonstrates how ray tracing resembles real-world lighting: your brain builds what you’re seeing from the light that reaches your eye. Ray tracing has been used in animated films for decades, and enormous advances in rendering have been made since Pixar popularized computer animation with Toy Story in 1995.

Video games, however, have relied on a different approach, rasterization, for generating 3D worlds for about as long as the film industry has used ray tracing. Before we go into the reasons for this, let’s distinguish between ray tracing and rasterization.

The fundamentals: Ray tracing versus rasterization

Rasterization is an object-based method of rendering a scene. Each object is painted with color, and then logic is used to show only the pixels nearest to the eye. Ray tracing, on the other hand, colors the pixels first and associates them with objects later. Doesn’t that explain everything?

Well, not really. So consider it this way: to achieve realistic graphics, rasterization requires specific techniques and tuning. A game’s rendering pipeline, for example, might be tuned and optimized to apply a particular effect, such as having the pixels on an object follow a specific pattern. This kind of logic will, of course, differ from object to object and scene to scene. Exploiting it takes work on the developer’s part, but it can pay off in efficiency, since the computer may be able to draw a complex scene with proportionately less processing power.

Because it is based on shooting light rays, ray tracing is a more general approach than rasterization, and its methods for achieving visual effects depend on how those rays are used. Softer shadows and reflections, for example, require more rays, whereas motion and blurring effects may require varying the rays’ timing or origin.

Overall, rasterization and ray tracing can produce the same effect (or, at least, close to it). Let’s look at why one would be preferred over the other.
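The difference boils down to loop order, which a toy example can show. The Python sketch below is purely illustrative: the two-object “scene,” its pixel sets, and depths are all invented. Rasterization loops over objects and keeps the nearest per pixel (a z-buffer), while ray tracing loops over pixels and finds the nearest object along each ray; both produce the same picture here.

```python
# Toy "scene": each object knows which pixels it covers and its depth.
scene = [
    {"color": "red",  "depth": 2.0, "pixels": {(0, 0), (0, 1)}},
    {"color": "blue", "depth": 1.0, "pixels": {(0, 1), (1, 1)}},
]
width, height = 2, 2

def rasterize(scene):
    """Object-first: loop over objects, keep the nearest color per pixel."""
    depth = {}   # pixel -> nearest depth seen so far (a z-buffer)
    image = {}
    for obj in scene:
        for px in obj["pixels"]:
            if obj["depth"] < depth.get(px, float("inf")):
                depth[px] = obj["depth"]
                image[px] = obj["color"]
    return image

def ray_trace(scene):
    """Pixel-first: loop over pixels, 'fire a ray' and keep the nearest hit."""
    image = {}
    for x in range(width):
        for y in range(height):
            hits = [o for o in scene if (x, y) in o["pixels"]]
            if hits:
                image[(x, y)] = min(hits, key=lambda o: o["depth"])["color"]
    return image

print(rasterize(scene) == ray_trace(scene))  # True: same image, different loop order
```

The object-first loop lets rasterization apply per-object shortcuts; the pixel-first loop is what makes ray tracing’s effects general but expensive.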

Mainstream gaming through ray tracing

Rasterization gained its position in video games decades ago because the hardware required to produce it was available to the general public, unlike that needed for ray tracing. This is still largely true: gaming graphics cards are, and will continue to be, optimized for rasterization for many years to come.

The GeForce RTX 2080 launched Nvidia’s GeForce RTX desktop card series in 2018, bringing ray tracing into mainstream gaming for the first time. In 2020, Nvidia released its second-generation GeForce RTX 3000 series cards (including the GeForce RTX 3080), and AMD swiftly followed with the Radeon RX 6000 series.

In short, ray tracing took so long to join the gaming landscape because the computational power required to pull it off was unaffordable at prices that would allow widespread adoption. Granted, the entry price is still high; neither AMD nor Nvidia currently sells a low-cost desktop graphics card with hardware ray tracing. Because of the strong demand for (and poor availability of) video cards, the GeForce RTX 2060 is now the “entry-level” card capable of ray tracing in hardware. It was released in 2019 at a not-so-budget $349 and now sells for far more than that from most retailers.

Visual improvements with ray tracing

It’s vital to remember that ray tracing is still a newcomer to game visuals, because rendering a complete game with real-time ray tracing is still well beyond today’s hardware capabilities. In games that support it, ray tracing is used for just a few effects, primarily those relating to shadows and lighting, while everything else is still rasterized.

Let’s start with a glossary of terms. Nvidia’s RTX-branded cards, such as the GeForce RTX 2060 or RTX 3080, use a proprietary graphics-rendering solution that Nvidia dubs “RTX.” It can draw light paths in the game engine using DirectX 12 and, more specifically, its DirectX Raytracing (DXR) API.

Meanwhile, DXR is a ray-tracing API that can work independently of or in tandem with Nvidia’s hardware. For example, Crytek showed a demo of its CryEngine a few years back that performed ray-traced reflections on an AMD Radeon RX 5000 series card (a GPU without RT cores inside), though the performance was disappointing. Run the identical demo on an AMD Radeon RX 6000 series card with built-in RT cores, and the DXR scene would be processed substantially faster.

What makes ray tracing so potentially—ahem—game-changing?

Virtual photons

Games without ray tracing rely on static, “baked-in” lighting: light sources are placed in an environment so that light falls across any given view in a predetermined way. And because virtual models such as NPCs and objects carry no information about other models, the GPU must approximate light behavior during the rendering process. Only light emitted from a stationary source can be reflected by surface textures to simulate shininess. Consider the following comparison of reflections in Grand Theft Auto V.

Overall, advances in GPUs have helped this process appear more realistic over time. However, games still fall short of photorealistic real-world reflections, refractions, and overall illumination. To achieve that, the GPU must be able to trace virtual light rays.

In the real world, visible light is the small slice of the electromagnetic spectrum that the human eye can perceive. It consists of photons, which have the properties of both particles and waves; photons have no real size or shape and can only be created or destroyed.

Light, then, can be thought of as a stream of photons: the more photons there are, the brighter the perceived light. Photons bouncing off a surface cause reflection. When photons traveling in a straight line pass through a transparent medium, their path is bent, or refracted. Photons that are destroyed can be thought of as absorbed.
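The reflection and refraction just described come down to two short vector formulas: the mirror-reflection formula r = d − 2(d·n)n and Snell’s law. Here is an illustrative Python sketch of that textbook math (directions and normals are assumed to be unit vectors; this is not any particular engine’s code):

```python
import math

def reflect(d, n):
    """Reflect direction d about unit surface normal n: r = d - 2(d.n)n."""
    dot = sum(a * b for a, b in zip(d, n))
    return [a - 2 * dot * b for a, b in zip(d, n)]

def refract(d, n, n1, n2):
    """Bend a unit direction d crossing media (Snell's law: n1 sin t1 = n2 sin t2).

    Returns None on total internal reflection."""
    cos_i = -sum(a * b for a, b in zip(d, n))
    ratio = n1 / n2
    k = 1 - ratio * ratio * (1 - cos_i * cos_i)
    if k < 0:
        return None  # total internal reflection: no transmitted ray
    return [ratio * a + (ratio * cos_i - math.sqrt(k)) * b for a, b in zip(d, n)]

# A ray going straight down bounces straight back up off a floor (normal +y).
print(reflect([0, -1, 0], [0, 1, 0]))            # [0, 1, 0]
# Entering glass (index ~1.5) from air bends the ray toward the normal.
print(refract([0.6, -0.8, 0.0], [0, 1, 0], 1.0, 1.5))
```

Every bounce of every virtual photon a GPU traces is, at bottom, one of these two little calculations plus an intersection test.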

Ray tracing in video games simulates this real-world behavior, tracking millions of virtual photons to trace the passage of simulated light. The more virtual photons the GPU calculates, reflecting, refracting, and scattering off surfaces, the more realistic the lighting appears.

This isn’t a brand-new method. Ray tracing has been used in CGI for decades, though in the early days the process required farms of computers to make an entire movie, as a single frame could take hours or even days to render. Ray-traced graphics can now be approximated in real time on home PCs, thanks to hardware acceleration and sophisticated lighting methods that keep the number of rays to a minimum.

Here’s the real eye-opener: scenes in CGI animation are frequently “shot” from many angles, just like in any movie or TV show. For each frame, you can move a camera to catch the action, zoom in or out, or pan across an entire region. To simulate movement, everything must be adjusted frame by frame, much like traditional animation. Put all of the footage together, and you have a story that flows.

In games, by contrast, a single camera is continually moving and changing viewpoint, especially in fast-paced titles. In both CGI and ray-traced games, the GPU must calculate not only how light reflects and refracts in any given scene but also how it is captured by the lens, your perspective. For a single PC or console, that’s a massive amount of processing work.

Unfortunately, we still don’t have consumer-level PCs that can truly render fully ray-traced graphics at high frame rates. Instead, we now have hardware that can cheat effectively.

How is it different from what we’ve seen before?

When you look at how light works in today’s video games, it appears that all of the elements are present: reflections, shadows, bloom, and lens flare, to name a few. All of this, however, is a clever ruse. Light effects can be pre-rendered (even with ray tracing), but they’re baked into the image, canned animations that play out the same way every time. Although these effects appear convincing, they are, in fact, static.

“The problem with that is that it’s completely static,” Tamasi says. “Unless you render in real-time, the lighting is just going to be wrong.”

If a player alters the environment by, for instance, busting a hole in a wall, the light in the scene will not adjust to let light pass through the hole unless the creators have explicitly prepared for it. With real-time ray tracing, the light would adapt automatically.

How does ray tracing work?

In real life, light comes to you. Waves of countless photons emerge from a light source, bounce across and through various surfaces, and finally reach your eye. Your brain then stitches these many light rays together into a single image.

Ray tracing works the same way, except that everything moves in the opposite direction. Ray-traced light starts at the viewer (or, more precisely, at the camera lens) and moves outward, plotting a path that bounces across multiple objects, sometimes picking up their color and reflective properties, until the software determines the appropriate light source(s) for that particular ray. Simulating vision in reverse like this is far more efficient for a computer than following every ray from the light source, because only the light paths visible to the viewer need to be drawn. It takes a lot less computing power to show what’s right in front of you than to render the rays emitted from every light source in a scene.
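Tracing from the camera outward starts by generating exactly one ray per pixel. The Python sketch below illustrates that first step under simple assumptions (a pinhole camera at the origin looking down the negative z-axis; the field-of-view mapping is standard textbook math, not a specific engine’s):

```python
import math

def camera_rays(width, height, fov_degrees=90):
    """Yield one ray per pixel: (pixel, unit direction) from a pinhole camera
    at the origin looking down -z."""
    aspect = width / height
    scale = math.tan(math.radians(fov_degrees) / 2)
    for y in range(height):
        for x in range(width):
            # Map pixel centers to [-1, 1] on the image plane.
            px = (2 * (x + 0.5) / width - 1) * aspect * scale
            py = (1 - 2 * (y + 0.5) / height) * scale
            length = math.sqrt(px * px + py * py + 1)
            yield (x, y), (px / length, py / length, -1 / length)

# One ray per pixel: a 4x3 image needs exactly 12 rays, and none at all
# for light sources the camera can't see.
print(sum(1 for _ in camera_rays(4, 3)))  # 12
```

This is why the backward approach is cheap: the ray count scales with the number of pixels on your screen, not with how many photons the scene’s lights emit.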

That isn’t to suggest it’s simple. “Every second, thousands of billions of photons penetrate your eye,” explains Christensen of the NCSA. “That’s a lot more calculations per second than a computer can accomplish… To even begin to make something look genuine, a lot of optimization, efficiency, and hacking is required.”

Nvidia’s solution is to trace only a handful of essential rays, then use machine learning algorithms to fill in the gaps and smooth everything out. “Denoising” is the term for this procedure.

“Rather than shooting hundreds or thousands of rays per pixel, we’ll actually shoot a few or maybe a few dozen,” Tamasi says. “So we use different classes of denoisers to assemble the final image.”
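That idea, shoot only a few noisy samples per pixel and then smooth the result, can be shown with a toy example. In the illustrative Python sketch below, random numbers stand in for traced rays against a uniformly gray scene, and a simple box filter stands in for Nvidia’s machine-learning denoisers (which are far more sophisticated):

```python
import random

def noisy_render(width, height, samples_per_pixel=4):
    """Estimate each pixel's brightness from only a few random samples.

    Each sample is a noisy stand-in for one traced ray's contribution; the
    'true' scene is a flat gray field of brightness 0.5."""
    return [[sum(random.random() for _ in range(samples_per_pixel)) / samples_per_pixel
             for _ in range(width)] for _ in range(height)]

def denoise(image):
    """Smooth the noisy estimate by averaging each pixel with its neighbors
    (a plain box filter standing in for a learned denoiser)."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            patch = [image[j][i]
                     for j in range(max(0, y - 1), min(h, y + 2))
                     for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(patch) / len(patch)
    return out

def spread(img):
    """Difference between the brightest and darkest pixel: a crude noise measure."""
    vals = [v for row in img for v in row]
    return max(vals) - min(vals)

random.seed(0)
raw = noisy_render(16, 16)
smooth = denoise(raw)
# The denoised image should vary far less around the true value of 0.5.
print(spread(smooth) < spread(raw))
```

The trade-off is the same one real denoisers face: smoothing suppresses noise but can blur genuine detail, which is why Nvidia trains neural networks for the job instead of using a fixed filter.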

When is it coming?

Ray tracing in real-time is already possible—sort of. It’s accessible in select current games, including Battlefield V, Metro Exodus, and Shadow of the Tomb Raider, as well as upcoming titles like Cyberpunk 2077 and Wolfenstein: Youngblood, if you have a PC capable of handling it.

Nvidia introduced ray-tracing capabilities with its RTX graphics card line last year, and your PC requires one of these cards to benefit fully from the technology. Current consoles, such as the Xbox One and PlayStation 4, lack the necessary hardware.

For those unwilling or unable to pay between $350 and $1,500 for a graphics card, ray tracing will also be enabled by the next generation of game consoles: the PlayStation 5 and Microsoft’s strangely named Xbox One replacement, Project Scarlett.

While the prospect is fantastic, it will still be a few years before the technology becomes commonplace. Real-time ray tracing is still in its infancy, and it has proven to be a bit inconsistent. Developers and designers will have to keep up as the hardware improves.

“It’s a new tool in the toolbox,” Tamasi says. “We have to learn to use that new tool properly. There’s going to be a whole new class of techniques that people develop.”

Ray tracing is a lighting technique that brings an extra level of realism to games. It simulates how light reflects and refracts in the real world, creating a more convincing atmosphere than the static lighting generally seen in more traditional games. But what exactly is ray tracing, and how does it work?

Ray tracing can improve immersion with a strong graphics card, but not all GPUs can support it. Read on to see if ray tracing is necessary for your gaming experience and if spending hundreds of dollars on a new GPU is justified.

Let’s get real

Because of its underlying resemblance to real life, ray tracing is an extraordinarily realistic 3D rendering technique that can make even blocky games like Minecraft look photorealistic in the right circumstances. There’s just one problem: simulating it is exceedingly difficult. Recreating how light works in the real world is resource-intensive and takes a great deal of computational power.

Because of this, existing ray-tracing options in games, such as Nvidia’s RTX-driven ray tracing, aren’t entirely true to life. They aren’t genuine ray tracing, which would simulate every single ray of light. Instead, the GPU “cheats,” employing numerous clever approximations to produce a visual effect similar to the real thing but not as heavy on the hardware. This will almost certainly change in future GPU generations, but for now it’s a sensible compromise.

Most ray-tracing games today combine classic lighting techniques, such as rasterization, with ray tracing on specific surfaces like reflective puddles and metalwork. Battlefield V is an excellent example: you can see troops reflected in water, terrain reflected on airplanes, and explosions mirrored in a car’s paint. Modern 3D engines can display reflections without ray tracing, but not at the level of detail shown in games like Battlefield V when ray tracing is enabled.

Ray tracing can also make shadows look more dynamic and realistic. Ray-traced lighting can create considerably more realistic shadows in dark and bright environments, with softer edges and higher definition, as seen in Shadow of the Tomb Raider. That look is hard to achieve without ray tracing: developers can only fake it with carefully controlled, preset, static light sources. Setting up all of those “stage lights” takes a lot of time and work, and even then the effect isn’t perfect.
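Those soft edges come from treating the light as an area rather than a point: fire many shadow rays at jittered spots across the light and count how many get through. The Python sketch below is a hedged illustration with an invented unit-sphere blocker and square area light, not production renderer code:

```python
import random

def segment_blocked(p, q, center, radius):
    """Does the straight segment from p to q pass through a sphere (the blocker)?"""
    d = [qi - pi for pi, qi in zip(p, q)]
    oc = [pi - ci for pi, ci in zip(p, center)]
    a = sum(x * x for x in d)
    b = 2 * sum(x * y for x, y in zip(oc, d))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return False                # segment misses the sphere
    t = (-b - disc ** 0.5) / (2 * a)
    return 1e-6 < t < 1.0           # hit lies between the point and the light

def light_visibility(point, light_center, light_size, blocker, samples=256):
    """Fraction of a square area light visible from a point:
    1 = fully lit, 0 = hard shadow, in between = soft penumbra."""
    random.seed(1)                  # deterministic jitter for the demo
    visible = 0
    for _ in range(samples):
        # Jitter a sample position across the square area light.
        lp = [light_center[0] + random.uniform(-light_size, light_size),
              light_center[1] + random.uniform(-light_size, light_size),
              light_center[2]]
        if not segment_blocked(point, lp, blocker, 1.0):  # unit-sphere blocker
            visible += 1
    return visible / samples

blocker = [0.0, 0.0, 0.0]  # unit sphere sitting between the floor and the light
print(light_visibility([0, 0, -5], [0, 0, 5], 0.1, blocker))    # ~0: deep shadow
print(light_visibility([1.1, 0, -5], [0, 0, 5], 2.0, blocker))  # partial: soft edge
```

The more samples per point, the smoother the penumbra, which is exactly why soft shadows cost so many more rays than hard ones.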

Some games go all out with ray tracing for global illumination, basically ray-tracing the entire scene. However, this is the most computationally expensive solution and requires the most powerful graphics cards to run effectively. Metro Exodus uses this technique, though the implementation isn’t perfect.

Half-measures, such as ray tracing only shadows or reflective surfaces, are popular as a result. Other games use Nvidia technologies like denoising and Deep Learning Super Sampling (DLSS) to boost performance and hide the visual flaws caused by casting fewer rays than a properly ray-traced scene would require. Fully ray-traced scenes remain the preserve of pre-rendered images and video, where powerful servers can spend days generating a single frame.

What about AMD?

AMD struggled to deliver hardware-accelerated ray tracing in recent years, but everything changed with the release of the RX 6800, RX 6800 XT, and RX 6900 XT. These new cards enable DirectX 12 ray tracing and deliver fantastic performance, even if AMD isn’t entirely on par with Nvidia when it comes to ray tracing (read our RX 6800 XT versus RTX 3080 and RX 6900 XT versus RTX 3090 comparisons for more).

That’s not surprising, given that AMD’s RX 6000 cards use the Big Navi architecture, which is very much a first generation of ray-tracing acceleration. It’s the same architecture that powers the PlayStation 5 and Xbox Series X graphics, though targeted at a lower performance level than Nvidia’s top-tier cards. Because ray tracing is such a headline feature on next-gen consoles, we anticipate improved support and optimizations in the future; AMD’s FidelityFX Super Resolution (FSR) is also expected shortly for gaming PCs and the latest Xbox consoles.

How can you see ray tracing at home?

You’ll need a current, and costly, graphics card to see ray tracing at home. Only Nvidia’s RTX 20-series and 30-series GPUs and AMD’s RX 6000-series GPUs support hardware-accelerated ray tracing. The GTX 10- and 16-series cards support ray tracing in software, but they lack the RT cores to make it usable. RX 6000 and RTX 30-series cards are projected to be out of stock through 2021, while RTX 20-series cards have reached end of life, meaning retailer stock is rapidly depleting. There aren’t many options in late 2020, but things should change in the next few months. The rest is straightforward once you’ve obtained a graphics card.

If you plan to play at resolutions higher than 1080p and frame rates of 60 FPS or more, you should invest in a high-end graphics card. The RTX 3080 and RX 6800 XT are the best 4K cards, but if you’re willing to go to 1440p in some games, you can get by with an RTX 3070 or RX 6800.

A small number of games support ray tracing, but the number is expanding. Early RTX titles like Battlefield V, Shadow of the Tomb Raider, and Metro Exodus are excellent examples of ray tracing. Control and MechWarrior 5: Mercenaries are two more recent games that show it off well. Stay in the Light is an indie horror game built around ray-traced shadows and reflections, and the updated Quake II RTX is another fantastic example.

Although ray-tracing games are still relatively few, the list continues to grow. With the PS5 and Xbox Series X promoting ray tracing, competitors will soon follow. Watch Dogs: Legion, for example, implements ray tracing, something its multiplatform predecessor Watch Dogs 2 did not.

Conclusion

Ray tracing functionality in games is currently divisive because it must be implemented separately for AMD and Nvidia systems. More games support Nvidia GPUs because, until 2020, Nvidia was the only manufacturer of ray-tracing-capable graphics cards, although more titles are beginning to support both. Cyberpunk 2077, Dirt 5, Godfall, and World of Warcraft: Shadowlands are examples of the latter.