Ray tracing is the latest graphical technique for achieving a much improved, more realistic graphics experience. While it is most commonly referenced in PC gaming, the Xbox Series X and PlayStation 5 also have hardware capable of delivering such visuals. This article looks at how ray tracing differs from traditional methods and why it's important for the future of graphics.
What is Ray Tracing?
Ray tracing is a technique that makes light in graphical renders behave as it does in real life. It works by simulating actual rays of light, using an algorithm to trace the path a beam of light would take in the physical world. Using this technique, game designers and visual effects artists can make virtual rays of light appear to bounce off objects, cast realistic shadows and create lifelike reflections.
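At its core, the algorithm fires a ray from the camera into the scene and tests what it hits. Below is a minimal, illustrative sketch (not taken from any particular engine) of the ray-sphere intersection test that sits at the heart of most ray tracers; the function name and scene are hypothetical:

```python
import math

def ray_sphere_hit(origin, direction, centre, radius):
    """Return the distance along the ray to the nearest intersection
    with a sphere, or None if the ray misses it entirely."""
    # Vector from the ray origin to the sphere centre
    oc = [o - c for o, c in zip(origin, centre)]
    # Quadratic coefficients for |origin + t*direction - centre|^2 = radius^2
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray never touches the sphere
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # only hits in front of the origin count

# A ray fired straight down the z-axis at a sphere 5 units away
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A real tracer runs this test (or an equivalent for triangles) against every object, for every pixel, then spawns further rays for shadows and reflections, which is where the enormous computational cost comes from.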
First conceptualised in 1969, ray tracing has been used for years to simulate realistic lighting and shadows in the film industry, an industry that could afford to wait for a render to finish. Gaming, on the other hand, has had to wait for the computing power required to produce such a render in real time.
“A game needs to run at 60 frames per second or 120 frames per second, so it needs to compute each frame in 16 milliseconds,” says Tony Tamasi, vice president of technical marketing at graphics card developer Nvidia.
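Tamasi's figure comes straight from the frame budget: at 60 frames per second, each frame gets 1000/60, or roughly 16.7 milliseconds. A quick sanity check:

```python
def frame_budget_ms(fps):
    """Milliseconds available to render a single frame at a given frame rate."""
    return 1000.0 / fps

print(round(frame_budget_ms(60), 1))   # 16.7 ms per frame at 60 fps
print(round(frame_budget_ms(120), 1))  # 8.3 ms per frame at 120 fps
```

At 120 fps the budget halves again, which is why real-time ray tracing demanded dedicated hardware.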
To understand how ray tracing works, we first need to take a step back and look at how games previously rendered light, and at what it took to make the ray tracing experience possible.
Games without ray tracing rely on static lighting. Developers place light sources within an environment that emit light evenly across any given view. Virtual models such as NPCs and objects don't hold any information about other models, so the GPU must calculate light behaviour during the rendering process, which drastically increases the computational requirements. Surface textures can reflect light to mimic shininess, but only light emitted from a static source. Take the comparison of reflections in GTA V below:
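Traditional static lighting of this kind boils down to a simple per-surface calculation. As a hedged sketch (the function below is hypothetical, but it follows Lambert's cosine law, the standard diffuse model pre-ray-tracing pipelines evaluate), brightness is just the light's intensity scaled by how directly it strikes the surface:

```python
import math

def lambert_brightness(surface_normal, light_dir, intensity):
    """Diffuse brightness from a static light source: intensity scaled by
    the cosine of the angle between the surface normal and the light."""
    def norm(v):
        length = math.sqrt(sum(x * x for x in v))
        return [x / length for x in v]
    n, l = norm(surface_normal), norm(light_dir)
    # Dot product of unit vectors = cos(angle); clamp at zero so
    # surfaces facing away from the light stay dark
    return intensity * max(0.0, sum(a * b for a, b in zip(n, l)))

# Light shining straight down onto a floor that faces straight up
print(lambert_brightness((0, 1, 0), (0, 1, 0), 1.0))  # full brightness: 1.0
```

Note what's missing: nothing here knows about other objects, so shadows, bounced light and reflections all have to be faked separately.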
Ray tracing in games attempts to emulate the way light exists in the real world. It traces the path of simulated light by tracking millions of virtual photons. The brighter the light, the more virtual photons the GPU must calculate, and the more surfaces it will reflect and refract from.
It's also possible to use ray tracing in sound design, as Mark Cerny suggests, particularly if you're looking for a faster, cleaner solution than more traditional methods provide. If you treat sound waves as much smaller rays, you can model them in much the same way as light, drawing them from the source to the listener and judging where they interact with objects in the environment. The difficulty is that sound waves are generally much larger, with wavelengths of up to ten metres or more, whereas the wavelength of light is measured in nanometres, so modelling sound as rays will inevitably cause inaccuracies that must be dealt with.
It is likely that you first heard of ray tracing (despite having seen it in films for years) when Nvidia started touting the ray tracing capabilities of its RTX 20 series graphics cards. Nvidia made a tremendous amount of noise about how its RT cores would enable the next generation of GPUs to bring real-time ray tracing to video games for the first time, albeit with an extremely high price tag. It was nevertheless an incredible technical achievement, allowing modern gaming PCs to do in real time what took those Hollywood studios several orders of magnitude longer.
Reception of those 20-series cards has been mixed, and sales have been tepid, but perhaps more importantly, Nvidia’s dominance of the ‘ray tracing in games’ narrative has begun to slip. There have been a number of stumbling blocks, the first and most important of which is how few games currently support ray tracing and how, even in those titles that do, it doesn’t make a glaring, immediately noticeable impact on graphics and presentation. This doesn’t come as a huge surprise, of course – most new graphics technologies, like the recent HDR renaissance, take some time to be properly rolled out and implemented – but it does look as though Nvidia was a bit too far ahead of the curve, and it has begun losing its ray tracing preeminence in the interim.
Lighting The Way Forward
All of this isn’t great news for Nvidia, at least in the short term, but it is great news for ray tracing enthusiasts. Broader hardware support means that doing the work to build ray tracing tech into games will look much more appealing to developers, because there will be an audience able to appreciate the results. And even Nvidia stands to benefit: as ray tracing becomes more ubiquitous, so will sales of its RTX hardware, especially if the company is able to bring prices down to accelerate mainstream adoption.
It’s also good news for gamers at large. Ray tracing may not be making huge waves in a practical sense now, in large part because current support feels a bit rushed or tacked on, but as we see games built from the outset with ray tracing in mind, the final products will start looking a lot more impressive. In countless demos, first from Nvidia and now from CryEngine and Unity (game engines that have recently incorporated ray tracing tools), we’ve seen the potential of ray tracing and, properly implemented, it’s as stunning as the marketing would have you believe.
The takeaway is that ray tracing is more like HDR than 3D. It’s not a gimmicky, flash-in-the-pan technology that will fail to gain a foothold and exit the conversation within a year. It really is an important part of the future of games, of ensuring that the next generation of games looks closer to reality than ever before, and being able to deliver it in real time really is a stunning innovation. It’s an inevitability, and the main question around ray tracing is less ‘if’ than ‘when’.