NVIDIA announced the GeForce RTX series of cards, headlined by the RTX 2080 Ti, all featuring their new Turing architecture. Turing supports real-time ray tracing, which fundamentally changes how artists will approach rendering going forward. With both Autodesk and Pixar announcing GPU rendering software, the question of whether the industry is heading this way is now settled. The only factors that will affect the dominance and roadmap of GPU compute and rendering at scale are memory and price.
Game Engine Rendering
Both Unreal and Unity are pushing real-time ray tracing into VFX with add-ons for Alembic and USD pipelines. Using game engines for rendering marks a fundamental shift in how we approach CG content for children's TV and long-form titles. Post-production time in already fast-turnaround environments can now be dramatically reduced, with more effort focussed on ideation and creativity than ever before. We can expect a significant leap forward in what we see on the small screen, closing the already narrowing gap between TV and film VFX.
Pixar demonstrated the latest incarnation of RenderMan XPU at the RenderMan Art & Science Fair. Its mind-blowing performance was brought to bear on film-quality assets, digitally modelled to an exceedingly high standard. The demo featured a scene from Coco in which the camera pulls away from the Land of the Dead train station to reveal the sprawling metropolis. This was a showcase of performance at scale, and it showed what can be achieved with a GPU workflow. Soon we'll need a DOP to direct lights in real time on virtual sets.
Autodesk showed off Arnold for GPU. Using the new NVIDIA Quadro RTX 6000, a film-level Spider-Man asset was rendered in the blink of an eye. This is Autodesk's first foray into GPU rendering, giving more options to studios reaping the benefits of lightning-fast speeds for high-end film and TV. Arnold is widely used by post-production houses all over the world, and these impressive advancements will allow for new workflows and approaches in look development and real-time feedback. They will also radically cut turnaround on approvals and re-renders, creating the opportunity to add more detail to lighting, models, and textures.
Chaos Group announced Project Lavina, a demo that ran a GPU ray-traced render at 25fps. You can check out the clip in full below.
The demo poses a couple of questions: at what point does performance reach the state where OpenGL and DirectX become the bottleneck rather than the solution? And in five to ten years' time, will we see V-Ray become the standard approach, much as Arnold overtook Pixar's Reyes, forcing Pixar to rethink and develop the RIS architecture we use today?
When Reyes was first developed by Pixar, it was designed to work around the limits of the technology available at the time. Brickmaps, SL, RIB files, shader writing, and other now-arcane tricks all existed to get around the memory and performance limits imposed by the hardware of the past. Arnold swept those tricks away with its introduction of a production-ready path tracer, moving the world from complexity to ease of use and letting the hardware take the strain. The same analogy could be applied to DirectX and OpenGL, with their lookup workarounds for animation, lighting, and baked textures used to squeeze the last drop of performance from the GPUs of the past. With GPUs now delivering real-time ray tracing at 25fps, could history repeat itself, so that we abandon the old methods for real-time spectral path tracing on GPUs pushing upwards of 60fps? In that world, render farms and engines become a thing of the past.
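The shift Arnold championed, trading baked lookups for brute-force Monte Carlo integration of light transport, is easy to sketch. The toy below (an illustration of the principle, not Arnold's implementation) estimates the hemisphere irradiance integral of cos(theta), whose analytic value is pi; a path tracer is essentially this estimator applied at every bounce, which is why more samples and faster hardware, rather than cleverer tricks, buy you image quality:

```python
import math
import random

def sample_hemisphere(rng):
    # Uniformly sample a direction on the unit hemisphere (z >= 0).
    # By Archimedes' hat-box theorem, a uniform z gives uniform solid angle.
    z = rng.random()                      # cos(theta), uniform in [0, 1)
    phi = 2.0 * math.pi * rng.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_irradiance(n_samples, seed=0):
    """Monte Carlo estimate of the integral of cos(theta) over the
    hemisphere (analytic answer: pi), assuming constant incoming radiance."""
    rng = random.Random(seed)
    pdf = 1.0 / (2.0 * math.pi)           # uniform hemisphere pdf
    total = 0.0
    for _ in range(n_samples):
        _, _, cos_theta = sample_hemisphere(rng)
        total += cos_theta / pdf          # unbiased: divide by the pdf
    return total / n_samples
```

The estimate is noisy at low sample counts and converges on pi as samples increase, the same noise-versus-time trade-off you see in any progressive path-traced viewport.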
Moving on from GPUs, Pixar's open source Universal Scene Description (USD) file format could be seen everywhere during the show. USD is huge - unifying the great quantities of data produced when creating 3D scenes into one robust, universal file format. This removes the need for multiple interchange formats when producing 3D assets and streamlines studio pipelines. The whole industry seems to be embracing USD for interchange and general good practice.
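To give a flavour of the format, USD scenes can be authored as human-readable ASCII (.usda) as well as binary. A minimal hand-written example, a single sphere prim under a transform, looks like this:

```usda
#usda 1.0
(
    defaultPrim = "World"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 2.0
    }
}
```

In production the same file format carries geometry, shading, animation, and layered overrides, which is what lets one asset flow between departments without conversion.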
Lightfields and VR
Intel ran the PresenZ video on the StarVR headset. The combination was awe-inspiring and demonstrated the benefits of lightfields over VR created in a game engine. Because the content was rendered with V-Ray, the lighting quality surpassed the cheats and lookups of real-time engines, and the difference from traditional VR setups, which rely on workarounds to simulate lighting effects, is plain to see. Lightfields are exciting and offer many possibilities for the world of post production as well as games.
SIGGRAPH 2018 brought us an exceptional amount of new technology, from advancements in GPU rendering to VR and lightfields. With NVIDIA introducing support for real-time GPU ray tracing, some may think the days of workarounds and baking are over. If I can just hit a button and get a fully unbiased, ray-traced VR experience, including bounced light and interaction in real time, then we are truly entering a new era of rendering.