
Real-time ray-tracing has tremendous potential to create more lifelike visuals for games, as well as to help film and TV studios speed up the creation of such visuals for their content. But realizing that potential will be a gradual process.

That was one of the key takeaways from an insightful panel discussion on ray-tracing, which attempts to simulate real-life lighting, held at Nvidia's (NVDA) GTC developer conference on Tuesday. The panel consisted of Kevin Margo, a creative director at Nvidia; engineers at companies such as visual effects firm Weta Digital and Disney's (DIS) Pixar division; and executives at several rendering software firms.

Nvidia has made the ability to play games supporting real-time ray tracing one of the key selling points for its recently-launched GeForce RTX gaming GPU line. Likewise, the ability to render ray-traced content in real-time -- as opposed to rendering it far more slowly via conventional "offline" approaches -- is a key selling point for its new Quadro RTX workstation GPU line. Both GPU families rely on specialized processing cores (they're known as RT cores) to render ray-traced content.
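At its core, ray tracing works by firing rays from a virtual camera into a scene and testing what geometry each ray hits; the specialized RT cores mentioned above accelerate exactly these intersection tests. As a rough illustration only (not Nvidia's implementation), the fundamental operation can be sketched as a ray-sphere intersection test:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss."""
    # Vector from the sphere's center to the ray's origin.
    oc = [o - c for o, c in zip(origin, center)]
    # Coefficients of the quadratic |origin + t*direction - center|^2 = radius^2.
    a = sum(d * d for d in direction)
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t >= 0 else None  # ignore hits behind the camera

# A ray from the origin along +z toward a unit sphere centered at (0, 0, 5):
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # hits at t = 4.0
```

A renderer performs billions of such tests per frame (against triangles rather than spheres, organized in acceleration structures), which is why dedicated hardware for them matters.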

At the start of the panel, a series of impressive demos featuring ray-traced content were shown (Nvidia asked that no photos be taken). Luca Fascione, Weta's senior head of technology and research, showed a ray-traced scene from War for the Planet of the Apes that featured a very lifelike CGI ape, and contrasted it with a less impressive CGI ape created with traditional graphics rendering. Margo showed a movie-like clip rendered in real-time (it was also played during Nvidia CEO Jensen Huang's Monday keynote address) in which a character in an Iron Man-like suit fires projectiles, while Vlado Koylazov, CTO of image-rendering software firm Chaos Group, showed content rendered in real-time using Autodesk's (ADSK) popular Maya 3D animation software and a single RTX 2080 graphics card.


Though Hollywood is no stranger to creating ray-traced content, multiple panelists pointed out that using Nvidia's RTX GPUs yields major performance improvements in rendering times. Panos Zompolas, CTO of rendering software firm Redshift, suggested his firm has seen gains ranging from 20% to 30% to as much as 100% to 200%, depending on the content. Meanwhile, Jules Urbach, CEO of rendering software firm Otoy, showed a demo involving two RTX 2080 cards and talked of gains ranging from 2x to 7x.

What's more interesting than the performance gains, Zompolas added, is that "hardware-accelerated ray-tracing is blurring the line between offline rendering and real-time rendering." Real-time rendering has historically come at the cost of low image quality, while offline rendering has required anywhere from seconds to hours to create a single, high-quality frame. But hardware-accelerated ray-tracing has the potential to produce quality visuals in real-time or close to it.
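The gap Zompolas describes can be put in rough numbers. A real-time renderer targeting 30 frames per second has about 33 milliseconds per frame, while a film-quality offline frame can take hours; the figures below are illustrative assumptions, not numbers from the panel:

```python
# Frame-time budgets: real-time vs. offline rendering (illustrative figures).
fps = 30
realtime_budget_ms = 1000 / fps               # ~33 ms per frame for real-time
offline_frame_hours = 2                       # assumed offline render time per frame
offline_frame_ms = offline_frame_hours * 3600 * 1000

# The speed gap hardware-accelerated ray-tracing is chipping away at:
gap = offline_frame_ms / realtime_budget_ms
print(f"real-time budget: {realtime_budget_ms:.1f} ms/frame")
print(f"offline frame:    {offline_frame_ms:,.0f} ms/frame")
print(f"gap to close:     {gap:,.0f}x")
```

Even partial closure of a gap that large explains why "close to real-time" rendering is treated as a milestone in its own right.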

One day after Nvidia unveiled its Omniverse collaboration software for 3D content creation, Margo noted that rendering in real-time can make it easier for different teams (including ones not involved with rendering) to work on content creation in real-time.


At the same time, panelists admitted that there are still limitations to real-time rendering, and that offline rendering will still be used to create some content for a long time. It was noted that real-time ray-tracing requires the UV-mapping of content, and that this is something artists don't always want to be troubled by. And Koylazov pointed out that according to a calculation done by his firm, machines will need to become 100 times faster in order to render some content in real-time, and that (given the pace of GPU performance advances) it will take 30 years to get there.
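Koylazov's 100x-in-30-years estimate implies an assumed annual rate of GPU improvement, which can be backed out with a little arithmetic (the alternative rates below are hypothetical, added here for comparison):

```python
import math

target_speedup = 100.0
years_quoted = 30.0

# The annual improvement rate implied by "100x faster in 30 years":
implied_rate = target_speedup ** (1.0 / years_quoted)
print(f"implied annual speedup: {implied_rate:.3f}x")  # about 1.17x per year

# Faster (hypothetical) cadences shorten the wait considerably:
for rate in (1.17, 1.5, 2.0):
    years = math.log(target_speedup) / math.log(rate)
    print(f"{rate:.2f}x/year -> {years:.1f} years to reach 100x")
```

This framing also makes Urbach's counterpoint below concrete: cloud scale-out and AI denoising don't have to wait for per-chip gains to compound.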

Meanwhile, both Fascione and Adrien Herubel, a principal software engineer at Autodesk, observed that the amount of content being created is steadily growing, which in turn makes real-time rendering all the more challenging. "As the machines have improved...our ability to create more content...has enormously improved," Fascione said. "In 30 years, we'll have much larger scenes."

"There [are] definitely significant portions of the industry [that are] dedicated to keeping rendering times high," Herubel quipped, producing laughter from the audience.

On the flip side, Urbach pointed out that cloud infrastructures providing access to massive computing resources could help achieve a 100x performance gain sooner than expected, as could AI/deep learning algorithms (accelerated on Nvidia's RTX GPUs via Tensor Cores, and supported by Nvidia in games through a technology called deep learning super sampling, or DLSS), and architectural advances for GPUs.

He added that in the gaming world, where content naturally has to be rendered in real-time, the Unreal and Unity game engines have been making "really enormous strides" with regard to ray-tracing support. He noted that (in line with Zompolas' remarks) performance advances are opening up a "tantalizing middle ground" where content rendered in real-time can come close to cinematic quality. On Monday, Unity announced that it will provide early access to a version of its game engine that supports ray-tracing on April 4.

Likewise, Max Liani, a senior lead software engineer at Pixar, suggested that in the content creation world, hardware advances can create "a middle spot" between real-time and traditional offline rendering, with artists able to more quickly make decisions related to their content even if it isn't rendered in real time. Simply rendering content at one frame per second is "plenty enough" to make a business decision, he added.

The big picture is that much like virtual reality, autonomous driving and other nascent technologies, there might not be a "Eureka moment" when real-time ray-tracing suddenly changes everything about how games and studio content are created. Instead, we might see steady advances that gradually yield better game visuals and improved content production times.
