This year’s virtual edition of SIGGRAPH, the premier computer graphics conference, formally ended yesterday after a few days of announcements and sessions.
At SIGGRAPH 2021, NVIDIA presented some of its latest research on advancing real-time graphics. Arguably the most interesting demonstration was Neural Radiance Caching, a new technique designed specifically for path-traced global illumination.
Neural Radiance Caching combines RTX’s neural network acceleration hardware (NVIDIA Tensor Cores) and ray tracing hardware (NVIDIA RT Cores) to create a system capable of fully dynamic global illumination that works with all kinds of materials, be they diffuse, glossy, or volumetric. It handles fine-scale textures such as albedo, roughness, or bump maps, and scales to large, outdoor environments while requiring neither auxiliary data structures nor scene parameterizations.
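Conceptually, the cache is a small neural network that maps a shading point’s attributes (position, view direction, surface parameters) to outgoing radiance, which lets a path tracer terminate paths early by querying the cache instead of tracing further indirect bounces. The following is a minimal sketch of that idea, assuming a toy two-hidden-layer MLP in NumPy; all names, sizes, and inputs are illustrative and not NVIDIA’s actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "radiance cache": a tiny 2-hidden-layer MLP, 64 units wide.
# Input features: position (3) + view direction (3) + roughness (1) = 7.
W1 = rng.standard_normal((7, 64)) * 0.1
W2 = rng.standard_normal((64, 64)) * 0.1
W3 = rng.standard_normal((64, 3)) * 0.1  # output: RGB radiance

def query_cache(position, direction, roughness):
    """Evaluate the cached outgoing radiance at a path vertex."""
    x = np.concatenate([position, direction, [roughness]])
    h = np.maximum(0.0, x @ W1)   # ReLU
    h = np.maximum(0.0, h @ W2)
    return h @ W3                 # predicted RGB radiance

# A path tracer would call this at a vertex past some heuristic path
# length, instead of tracing more indirect bounces.
radiance = query_cache(np.array([0.1, 0.5, -2.0]),
                       np.array([0.0, 0.0, 1.0]), 0.3)
print(radiance.shape)  # (3,)
```

The point of keeping the network this small is that a single query costs only a few matrix multiplies, cheap enough to issue at millions of shading points per frame.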
Combined with NVIDIA’s state-of-the-art direct lighting algorithm, ReSTIR, Neural Radiance Caching can improve the rendering efficiency of global illumination by up to a factor of 100, i.e., two orders of magnitude.
At the heart of the technology is a single tiny neural network whose custom implementation runs up to 9x faster than an equivalent TensorFlow v2.5.0 implementation. Its speed makes it possible to train the network live during gameplay in order to keep up with arbitrary dynamic content. On an NVIDIA RTX 3090 graphics card, Neural Radiance Caching can provide over 1 billion global illumination queries per second.
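The live-training idea can be illustrated with a bare-bones online loop: each frame, a handful of longer “training” paths supply target radiance values, and the cache takes one gradient step toward them so it tracks dynamic content. Below is a minimal sketch using a linear model and plain SGD; the frame loop, batch size, loss, and learning rate are all illustrative assumptions, not details from NVIDIA’s paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear "cache": maps a 7-feature shading point to RGB radiance.
W = np.zeros((7, 3))
lr = 0.05

def train_step(features, target_radiance):
    """One SGD step on the MSE between cache prediction and targets."""
    global W
    pred = features @ W
    grad = features.T @ (pred - target_radiance) / len(features)
    W -= lr * grad
    return np.mean((pred - target_radiance) ** 2)

# Ground-truth mapping the "scene" currently follows.
W_true = rng.standard_normal((7, 3))

losses = []
for frame in range(200):                    # simulate 200 frames
    feats = rng.standard_normal((64, 7))    # 64 training vertices/frame
    targets = feats @ W_true                # radiance from longer paths
    losses.append(train_step(feats, targets))

print(losses[0] > losses[-1])  # True: the cache adapts over frames
```

If the scene changes (in this sketch, if `W_true` were swapped mid-loop), the same update rule simply starts tracking the new targets, which is the appeal of training during gameplay rather than precomputing.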
NVIDIA shared the CUDA source code for Neural Radiance Caching so that developers and researchers may further experiment with it. There’s also a detailed technical paper available for perusal if you want to learn the math behind this technology.
It’s not all about lighting, however. NVIDIA’s graphics researchers have also developed neural reflectance field textures, or NeRF-Tex for short, which aim to model complex materials such as fur, fabric, or grass more accurately.
As the researchers describe it: “Instead of using classical graphics primitives to model the structure, we propose to employ a versatile volumetric primitive represented by a neural reflectance field (NeRF-Tex), which jointly models the geometry of the material and its response to lighting. The NeRF-Tex primitive can be instantiated over a base mesh to ‘texture’ it with the desired meso- and microscale appearance. We condition the reflectance field on user-defined parameters that control the appearance. A single NeRF texture thus captures an entire space of reflectance fields rather than one specific structure. This increases the gamut of appearances that can be modeled and provides a solution for combating repetitive texturing artifacts.”
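In code terms, a NeRF-Tex primitive can be thought of as a learned function from a 3D point, a direction, and user-set appearance parameters to a density and a reflectance value; varying the parameters sweeps through a whole family of materials captured by a single network. A hypothetical interface sketch follows; the network, parameter names, and shapes are assumptions for illustration, not the paper’s API:

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.standard_normal((9, 4)) * 0.1  # stand-in for a trained network

def nerf_tex(point, direction, params):
    """Toy conditioned reflectance field:
    (x, d, user params) -> (density, rgb)."""
    x = np.concatenate([point, direction, params])  # 3 + 3 + 3 = 9
    out = np.tanh(x @ W)
    density = np.exp(out[0])         # nonnegative volume density
    rgb = 0.5 * (out[1:] + 1.0)      # reflectance mapped into [0, 1]
    return density, rgb

# The same network "textures" a base mesh by being instantiated in a
# thin shell over each surface patch; the params vector (e.g. fur
# length or fiber stiffness; purely illustrative) selects one
# appearance out of the family the texture captures.
density, rgb = nerf_tex(np.array([0.0, 0.1, 0.0]),
                        np.array([0.0, 1.0, 0.0]),
                        np.array([0.5, 0.2, 0.8]))
print(density > 0, rgb.shape)  # True (3,)
```

Because the conditioning parameters can vary per instance, nearby patches need not look identical, which is how a single texture avoids the repetitive tiling artifacts the abstract mentions.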
The related technical paper can be found here. For a visual demonstration of NVIDIA’s latest graphics research projects, including Neural Radiance Caching and NeRF-Tex, check out the footage below.