What Is Ray Tracing? The Latest Gaming Buzzword Explained

Ray tracing, a new technology for creating stunning, lifelike gaming graphics, is coming to a gaming PC or console near you. Making it happen requires the right hardware: should you wait for the technology to mature before jumping in, or should you jump in now?

Have you noticed your eyes widening more often lately as you play? If so, it could be due to a technique that’s quietly gaining traction in the gaming world. Ray tracing, which has recently become viable in real time on specialized hardware, delivers some of the most impressive visual advances in PC gaming.

Though ray tracing is most commonly associated with PC gaming, Sony’s PlayStation 5 and Microsoft’s Xbox Series X both have the requisite technology and a growing library of titles that support it. This article explains how ray tracing differs from standard rasterization, why it’s crucial for the future of gaming, and, of course, whether ray tracing should influence your next gaming PC (or gaming laptop) or console purchase.

What is ray tracing?

Simply put, ray tracing is a technique for making light in video games behave the way it does in real life. It works by using an algorithm to replicate real light rays, tracing the path a beam of light travels through the natural environment. Game designers can use this technology to make virtual light beams appear to bounce off objects, cast realistic shadows, and produce convincing reflections.

The ray tracing technique, first developed in 1969, has been used in the film industry for years to recreate realistic lighting and shadows. Even today, though, it demands a significant amount of computational power.

“A game needs to compute each frame in 16 milliseconds to run at 60 frames per second or 120 frames per second,” explains Tony Tamasi, vice president of technical marketing at graphics card maker Nvidia. “A normal film frame is pre-rendered, and rendering a single frame can take eight, twelve, or twenty-four hours.”
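Tamasi’s frame-time budget follows from simple arithmetic: at a target of N frames per second, each frame gets 1000/N milliseconds. A quick sketch in Python (the function name is ours, purely for illustration):

```python
# Each frame must finish within 1000 / fps milliseconds to hold
# a steady frame rate -- the budget Tamasi describes.
def frame_budget_ms(fps: int) -> float:
    return 1000.0 / fps

print(round(frame_budget_ms(60), 1))   # roughly 16.7 ms per frame
print(round(frame_budget_ms(120), 1))  # roughly 8.3 ms per frame
```

Compare that with a pre-rendered film frame, which can take hours: real-time ray tracing has to fit the same work into a sliver of that time.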

This renewed interest in ray tracing comes as home gaming hardware approaches the point where it can process lighting effects in real time. The graphics chips in the next generation of gaming PCs and consoles should be able to display ray-traced scenes on the fly. When that happens, it could cause a tectonic shift in gaming graphics.

The basics of ray tracing

Ray tracing is an effective technique for illuminating a computer-generated scene. The concept isn’t new, though; what’s new is the computing muscle needed to pull it off at interactive speeds.

Imagine throwing a light ray at an object and tracing how it bounces off the surface, similar to pointing a flashlight into a dark room. Then consider firing many beams and analyzing the ones that return (and those that don’t) to determine how the scene should appear. For example, rays that failed to return were most likely obstructed by an object, resulting in a shadow. (Thinking about it in terms of how radar works is an excellent place to start.)
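The shadow logic described above can be sketched in a few lines of Python. This is a toy illustration, not a production renderer: `in_shadow` casts a single “shadow ray” from a surface point toward the light and checks whether a sphere blocks it (the function names and scene are invented for the example):

```python
import math

def hits_sphere(origin, direction, center, radius, max_t):
    """True if the ray origin + t*direction (0 < t < max_t) hits the sphere."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c          # standard ray-sphere quadratic
    if disc < 0:
        return False                   # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return 0.0 < t < max_t             # hit must lie between point and light

def in_shadow(point, light, blocker_center, blocker_radius):
    """Cast a ray from the surface point toward the light; if something
    blocks it before it reaches the light, the point is in shadow."""
    direction = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(d * d for d in direction))
    direction = [d / dist for d in direction]
    return hits_sphere(point, direction, blocker_center, blocker_radius, dist)

# A sphere sitting between the point and the light blocks the ray:
print(in_shadow((0, 0, 0), (0, 10, 0), (0, 5, 0), 1.0))   # True
# Move the blocker aside and the light reaches the point:
print(in_shadow((0, 0, 0), (0, 10, 0), (5, 5, 0), 1.0))   # False
```

A ray that never reaches the light means the point sits in shadow, exactly as in the flashlight analogy.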

This simple explanation demonstrates how ray tracing resembles real-world lighting: your brain builds what you see from the light that reaches your eye. Ray tracing has been used in film rendering for decades, and Pixar, whose Toy Story popularized computer animation in 1995, has driven enormous advances in rendering since then.

However, video games have relied on a different approach, rasterization, to generate 3D worlds for about as long as the film industry has used ray tracing. Before we get into the reasons for this, let’s distinguish between ray tracing and rasterization.

The fundamentals: Ray tracing versus rasterization

Rasterization is an object-based method of rendering a scene: each object is painted with color, and then logic determines which pixels are nearest the eye and should be shown. Ray tracing, by contrast, colors the pixels first and associates them with objects afterward. Doesn’t that explain everything?

Well, not really, so consider it this way. To achieve realistic graphics, rasterization requires specific techniques and tuning. A game’s rendering pipeline, for example, might be optimized to apply a particular effect, such as having the pixels on an object follow a specific pattern. This kind of logic will, of course, differ from object to object and scene to scene. Exploiting it takes work from the developer, but it can pay off in efficiency, since the computer may be able to draw a complex scene with proportionately less processing power.
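The “show only the pixels nearest the eye” step of rasterization is classically handled by a depth buffer (z-buffer). A toy sketch, with invented names, of how hidden fragments get discarded:

```python
# A toy depth buffer: for each pixel, keep only the fragment nearest
# the eye (smallest depth). This is rasterization's core trick for
# hidden-surface removal.
WIDTH, HEIGHT = 4, 4
depth = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
color = [[None] * WIDTH for _ in range(HEIGHT)]

def draw_fragment(x, y, z, c):
    if z < depth[y][x]:          # nearer than what's already there?
        depth[y][x] = z
        color[y][x] = c          # this fragment wins the pixel

draw_fragment(1, 1, 5.0, "red")    # far object drawn first
draw_fragment(1, 1, 2.0, "blue")   # nearer object replaces it
draw_fragment(1, 1, 9.0, "green")  # farther fragment is discarded
print(color[1][1])  # blue
```

Note that the depth test says nothing about where light came from; that is exactly the gap ray tracing fills.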

Because it is based on shooting light rays, ray tracing is a more general approach than rasterization. Methods for achieving visual effects therefore depend on how those rays are used: softer shadows and reflections, for example, require more rays, while motion and blurring effects may require varying a ray’s timing or origin.

Overall, rasterization and ray tracing can produce the same effect (or, at least, close to it). Let’s look at why one would be preferred over the other.

Mainstream gaming through ray tracing

Rasterization gained its position in video games decades ago because the hardware required to run it was within reach of the general public, unlike that needed for ray tracing. This is still largely true; gaming graphics cards are, and will continue to be, optimized for rasterization for many years to come.

The GeForce RTX 2080 launched Nvidia’s GeForce RTX desktop card series in 2018, bringing ray tracing into mainstream gaming for the first time. In 2020, Nvidia released its second-generation GeForce RTX 3000 series cards (including the GeForce RTX 3080), and AMD swiftly followed with the Radeon RX 6000 series.

In short, ray tracing took so long to join the gaming landscape because the computational power required to pull it off was unaffordable at prices that would allow widespread adoption. Granted, the entry price is still high; neither AMD nor Nvidia currently sells a low-cost desktop graphics card with hardware ray tracing. Because of the strong demand for (and poor availability of) video cards, the GeForce RTX 2060 is now the “entry-level” card capable of ray tracing in hardware. It was released in 2019 at a not-so-budget $349 and now sells for far more than that from most retailers.

Visual improvements with ray tracing

It’s vital to remember that ray tracing is still a newcomer to game visuals. That’s because rendering a complete game with real-time ray tracing is still well beyond today’s hardware capabilities. Games that support ray tracing mainly use it for a few effects, primarily those relating to shadows and lighting, while everything else is still rasterized.

Let’s start with a glossary of terms. Nvidia’s RTX-branded cards, such as the GeForce RTX 2060 or RTX 3080, use a proprietary graphics-rendering solution that Nvidia dubs “RTX.” This method draws light paths in the game engine using DirectX 12 and, more specifically, its DirectX Raytracing API (DXR).

Meanwhile, DXR is a ray-tracing API that can work independently of or in tandem with Nvidia’s hardware. A few years back, for example, Crytek showed us a demo of its own CryEngine performing ray-traced reflections on an AMD Radeon RX 5000 series card (a GPU without RT cores inside), though the performance was disappointing. Run the identical demo on an AMD Radeon RX 6000 series card with built-in RT cores, and the DXR scene would be processed substantially faster.

What makes ray tracing so potentially—ahem—game-changing?

Virtual photons

In the realm of gaming, non-ray-traced titles rely on pre-set lighting, with light sources strategically positioned to illuminate scenes evenly. Because the GPU has no real knowledge of how light travels between the models in a scene, it must approximate light interactions during rendering, which limits realistic reflections and refractions.

However, the advent of ray tracing transforms this landscape. By simulating the behavior of light using millions of virtual photons, GPUs can meticulously trace light paths, enhancing reflections, refractions, and overall illumination. This technique mirrors the real-world behavior of light, offering unparalleled realism.

At its core, light consists of photons, possessing dual characteristics of particles and waves. They bounce off surfaces, causing reflections, and refract when passing through transparent mediums. Ray tracing meticulously tracks these photons, simulating their journey to recreate lifelike visuals.

While ray tracing isn’t novel, its integration into real-time graphics is revolutionary. Historically confined to CGI, where rendering a single frame could take hours or days, modern hardware accelerates this process, enabling real-time rendering on home PCs.

Yet, the challenge persists. Games often require dynamic perspectives, necessitating constant recalculations of light interactions to match changing viewpoints. This places immense processing demands on GPUs, hindering widespread adoption on consumer-level PCs.

Although consumer PCs struggle to render ray-traced graphics at high frame rates, innovative hardware solutions offer a compromise. By optimizing resource allocation and employing clever techniques, modern GPUs effectively simulate ray-traced effects, bridging the gap between performance and realism in gaming graphics.

How is it different from what we’ve seen before?

When you look at how light works in today’s video games, it appears that all of the elements are present: reflections, shadows, bloom, and lens flare, to name a few. All of this, however, is a clever ruse. Light effects can be pre-rendered (even with ray tracing), but they’re baked into the image: packaged animations that play out the same way every time. Although these effects look convincing, they are, in fact, static.

“The problem with that is that it’s completely static,” Tamasi says. “Unless you render in real-time, the lighting is just going to be wrong.”

If a player alters the environment by, for instance, busting a hole in a wall, the light in the scene will not adjust to pass through the hole unless the creators have explicitly prepared for it. With real-time ray tracing, the light would adapt automatically.

How does ray tracing work?

In real life, light comes to you. Waves of countless tiny photons emerge from a light source, bounce off of and pass through various surfaces, and finally strike your eye. Your brain then stitches these different light rays together into a single image.

Ray tracing works the same way, except that everything moves in the opposite direction. Ray-traced light starts at the viewer (or, more precisely, at the camera lens) and moves outward, plotting a path that bounces off multiple objects, sometimes taking on their color and reflective properties, until the software determines the appropriate light source(s) for that particular ray. Simulating vision in reverse this way is far more efficient for a computer than following the glow outward from the light source. After all, only those light paths visible to the viewer need to be drawn, and it takes far less computing power to show what’s right in front of you than to render the rays emitted from every light source in a scene.
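Tracing from the camera outward starts with generating one “primary” ray per pixel. A minimal sketch of that first step, using the standard pinhole-camera mapping (the function name is ours):

```python
import math

# For each pixel, fire one ray from the camera through the image
# plane -- tracing "vision in reverse," as described above.
def primary_ray(x, y, width, height, fov_deg=90.0):
    """Unit direction of the camera ray through pixel (x, y),
    with the camera at the origin looking down -z."""
    aspect = width / height
    scale = math.tan(math.radians(fov_deg) / 2)
    # Map pixel coordinates to [-1, 1] on the image plane.
    px = (2 * (x + 0.5) / width - 1) * aspect * scale
    py = (1 - 2 * (y + 0.5) / height) * scale
    length = math.sqrt(px * px + py * py + 1)
    return (px / length, py / length, -1 / length)

# The center pixel of a 640x480 image looks almost straight ahead:
print(primary_ray(320, 240, 640, 480))
```

Each of these rays is then tested against the scene for intersections and bounced further until it finds a light source, which is where the real computational cost lies.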

That isn’t to suggest it’s simple. “Every second, thousands of billions of photons penetrate your eye,” explains Christensen of the NCSA. “That’s a lot more calculations per second than a computer can accomplish… To even begin to make something look genuine, a lot of optimization, efficiency, and hacking is required.”

Nvidia’s solution is to trace only a handful of essential rays, then use machine learning algorithms to fill in the gaps and smooth everything out. “Denoising” is the term for this procedure.

“Rather than shooting hundreds or thousands of rays per pixel, we’ll actually shoot a few or maybe a few dozen,” Tamasi says. “So we use different classes of denoisers to assemble the final image.”
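The idea of shooting only a few rays and then smoothing the result can be illustrated crudely: average each noisy pixel with its neighbors. Real denoisers, including Nvidia’s, are far more sophisticated and often machine-learned; this simple box filter (with invented names and a made-up scene) is just a stand-in for the concept:

```python
import random

# Few rays per pixel -> noisy brightness estimates. Averaging each
# pixel with its neighbors (a box filter) smooths that noise; real
# denoisers are far more sophisticated, often ML-based.
random.seed(0)

def noisy_render(width, true_value=0.5, spread=0.2):
    """One ray per pixel: the true brightness plus random noise."""
    return [min(1.0, max(0.0, true_value + random.uniform(-spread, spread)))
            for _ in range(width)]

def box_denoise(row, radius=2):
    out = []
    for i in range(len(row)):
        window = row[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def spread_of(vals):
    return max(vals) - min(vals)

noisy = noisy_render(50)
smooth = box_denoise(noisy)
# The smoothed row varies far less around the true brightness:
print(spread_of(noisy) > spread_of(smooth))  # True
```

The trade-off is the one Tamasi describes: a few dozen rays plus a good denoiser can stand in for the hundreds or thousands of rays per pixel that offline renderers use.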

When is it coming?

Ray tracing in real-time is already possible—sort of. It’s accessible in select current games, including Battlefield V, Metro Exodus, and Shadow of the Tomb Raider, as well as upcoming titles like Cyberpunk 2077 and Wolfenstein: Youngblood, if you have a PC capable of handling it.

Nvidia introduced ray-tracing capabilities with its RTX graphics card line last year, and your PC needs one of these cards to take full advantage of the technology. Current consoles, such as the Xbox One and PlayStation 4, lack the necessary hardware.

For those unwilling or unable to pay between $350 and $1,500 for a graphics card, ray tracing will be enabled by the next generation of game consoles: the PlayStation 5 and Microsoft’s strangely named Xbox One successor, Project Scarlett.

While the prospect is exciting, it will still be a few years before the technology becomes commonplace. Real-time ray tracing is still in its infancy, and it has proven to be a bit inconsistent. Developers and designers will have to keep up as the hardware improves.

“It’s a new tool in the toolbox,” Tamasi says. “We have to learn to use that new tool properly. There’s going to be a whole new class of techniques that people develop.”

Ray tracing is a lighting technique that brings an extra level of realism to games. It simulates how light reflects and refracts in the real world, creating a more convincing atmosphere than the static lighting typically seen in more traditional games. But what exactly is ray tracing? And how does it function?

Ray tracing can improve immersion with a strong graphics card, but not all GPUs can support it. Read on to see if ray tracing is necessary for your gaming experience and if spending hundreds of dollars on a new GPU is justified.

Let’s get real

Ray tracing stands as a pinnacle of 3D rendering, offering a level of realism that can transform even the simplest games into visually stunning experiences. However, its computational demands make it a significant challenge to simulate.

Current implementations, like Nvidia’s RTX-driven ray tracing, employ clever approximations rather than true-to-life simulation, trading off accuracy for performance. This compromise is necessary for now, though future GPU advancements promise more authentic experiences.

Games like Battlefield V blend traditional rasterization with ray tracing for specific surfaces, showcasing detailed reflections that elevate visual fidelity. Ray tracing also revolutionizes shadow rendering, offering softer, more realistic shadows in varying light conditions, exemplified in titles like Shadow of the Tomb Raider.

While some games opt for full-scene ray tracing, such as Metro Exodus, it remains the most resource-intensive approach, necessitating powerful hardware for optimal performance. To mitigate computational strain, many games adopt partial ray tracing for specific elements like shadows or reflections, often leveraging Nvidia technologies for performance enhancements.

Despite these advancements, achieving perfection in ray-traced visuals remains elusive, requiring meticulous setup and significant development effort. As such, practical compromises and technological innovations like denoising and Deep Learning Super Sampling (DLSS) help optimize performance without sacrificing visual quality.

In the realm of pre-rendered content, where computational resources are less constrained, these techniques shine, offering breathtaking visuals that push the boundaries of realism.

What about AMD?

AMD struggled to deliver hardware-accelerated ray tracing for years, but that changed with the release of the RX 6800, RX 6800 XT, and RX 6900 XT. These new cards enable DirectX 12 ray tracing and deliver fantastic performance, even if AMD isn’t entirely on par with Nvidia when it comes to ray tracing (read our RX 6800 XT versus RTX 3080 and RX 6900 XT versus RTX 3090 comparisons for more).

That’s not surprising, given that AMD’s RX 6000 cards use the Big Navi (RDNA 2) architecture, which is very much a first generation of ray-tracing acceleration. It’s the same architecture that powers PlayStation 5 and Xbox Series X graphics, though targeted at a lower performance level than Nvidia’s top-tier cards. Because ray tracing is such a prominent feature on next-gen consoles, we anticipate improved support and optimizations in the future, and AMD FidelityFX Super Resolution (FSR) is due shortly for gaming PCs and the latest Xbox consoles.

How can you see ray tracing at home?

You’ll need a current — and costly — graphics card to see ray tracing at home. Only Nvidia’s RTX 20-series and 30-series GPUs and AMD’s RX 6000-series GPUs support hardware-accelerated ray tracing. The GTX 10- and 16-series cards can run ray tracing in software, but they lack the RT cores to make it usable. RX 6000 and RTX 30-series cards are projected to be out of stock through 2021, while RTX 20-series cards have reached end of life, meaning retailer stock is rapidly depleting. There aren’t many options in late 2020, but things should change in the next few months. The rest is straightforward once you’ve obtained a graphics card.

If you plan to play at resolutions higher than 1080p and frame rates of 60 FPS or more, you should invest in a high-end graphics card. The RTX 3080 and RX 6800 XT are the best 4K cards, but if you’re willing to go to 1440p in some games, you can get by with an RTX 3070 or RX 6800.

A small number of games support ray tracing, but that number is expanding. Early RTX showcases like Battlefield V, Shadow of the Tomb Raider, and Metro Exodus are excellent examples of ray tracing. Control and MechWarrior 5: Mercenaries are two more recent games that look promising. Stay in the Light is an indie horror game built around ray-traced shadows and reflections. Another fantastic example is the updated Quake II RTX with ray tracing.

Although there are still relatively few such games on the market, the ecosystem continues to grow, and with the PS5 and Xbox Series X promoting ray tracing, competitors will soon follow. Compare the multiplatform Watch Dogs 2 with Watch Dogs: Legion: only the latter implements ray tracing.


Ray tracing support in games has been fragmented, since it can require separate implementation for AMD and Nvidia systems. Nvidia gained an early edge in game support because it was the only vendor shipping ray-tracing-capable graphics cards until 2020. An increasing number of games, however, now cater to both AMD and Nvidia GPUs, including notable titles such as Cyberpunk 2077, Dirt 5, Godfall, and World of Warcraft: Shadowlands.