VR, RT, and AI: A SIGGRAPH 2019 Recap

Andrew Garrard

Senior Software Engineer, Samsung Electronics

SIGGRAPH, inaugurated in 1974, is the largest general-purpose computer graphics (“…and interactive techniques”) technical conference in the world. Since 1999 I’ve attended whenever I can. There’s a lot of focus on off-line production rendering and desktop graphics, and I find that the latest research, techniques, and technology eventually percolate down to real-time and mobile devices. It’s also inspiring to see advances in the state of the art. If you see a talk like “Fractional Gaussian Fields for Modeling and Rendering of Spatially-Correlated Media” and think “ooh!”, SIGGRAPH is the conference for you.

[caption id="attachment_11480" align="aligncenter" width="699"] Pixel gets ignored by an attendee at the South Hall entrance[/caption]

This year the conference returned to Los Angeles. It was the largest since 2012, with 180 exhibitors and 18,700 attendees. No longer bigger than GDC, and less than half the size of my first SIGGRAPH, but still a reassuring upward trend. Several sessions were heavily over-subscribed – notably the always-popular “Advances in Real-Time Rendering” course (with a “…for games” suffix the last couple of years), and the NVIDIA “Deep Learning for Content Creation and Real-Time Rendering” session. Packed sessions were made more uncomfortable by poor air conditioning, so I was glad to have been acclimatized by the brief heatwave in the UK.

Mobile sessions were relatively sparse this year – although Google had several interesting papers, mostly in the area of computational photography. Unusually, Apple presented vendor sessions on RealityKit and Metal. Virtual reality remained popular, with the now-annual VR Theatre and several examples in the Emerging Technologies area. While neither phone vendors such as Samsung nor mobile GPU vendors like Arm and Qualcomm had exhibitor booths this year, engineers were present in force. In contrast, game engine titans Unity and Epic were highly visible on the show floor, as were the perennial motion capture rigs – which are becoming more effective, real-time and commoditized for game developer use.

The annual Real Time Live session goes from strength to strength. Notable this year was an encore appearance by Level Ex, demonstrating real-time ray tracing on mobile, although not necessarily rendering what you might expect. I was pleased to see both of the awards go to the very cool AI technology GauGAN, co-developed by Chris Hebert, a Samsung UK alumnus (see image below).

[caption id="attachment_11483" align="aligncenter" width="699"] GauGAN (draw the image on the left, get the image on the right, AI is scary)[/caption]

There is usually a hot research topic at each SIGGRAPH which dominates the technical sessions. Unsurprisingly, after last year’s announcement of DXR and reveal of the NVIDIA Turing architecture, there was a heavy focus on ray tracing; one could spend almost the entire conference in ray-tracing sessions and still miss some. Dedicated ray tracing hardware has expanded the number of game engines using true ray casting, which in turn improves developer productivity. Although increasing GPU speed has allowed selective ray-traced effects for some years now, research is currently very active. The importance of appropriate sampling and ray types in practical engines was stressed to non-expert developers, and there was a public claim that ray tracing would be ubiquitous within five years. Still, even with a Titan RTX, current ray-traced scenes were often visibly noisy. Technology has advanced, but not infinitely.
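To put a number on that noisiness: a path tracer estimates each pixel as a Monte Carlo average over its rays, so the error only falls with the square root of the sample count, and halving the noise takes four times the rays. The C++ sketch below is purely my own toy illustration (the integrand, constants, and function names are invented for the example, not taken from any talk): it averages a simple analytic integral at several samples-per-pixel counts and prints how slowly the error shrinks.

```cpp
// Toy illustration of why ray-traced images stay noisy at low sample counts:
// a Monte Carlo pixel estimate has standard error proportional to 1/sqrt(N).
#include <cmath>
#include <cstdio>
#include <random>

const double kPi = 3.14159265358979323846;

// Stand-in for "trace one random ray and return its radiance contribution".
// We integrate cos(theta) over [0, pi/2], whose exact value is 1, so the
// estimation error is easy to measure.
double sample_radiance(std::mt19937& rng) {
    std::uniform_real_distribution<double> uniform(0.0, 1.0);
    double theta = uniform(rng) * kPi / 2.0;  // uniform random incident angle
    return std::cos(theta) * (kPi / 2.0);     // divide by the uniform pdf (2/pi)
}

// One "pixel": the average of spp independent ray samples.
double estimate_pixel(int spp, std::mt19937& rng) {
    double sum = 0.0;
    for (int i = 0; i < spp; ++i) sum += sample_radiance(rng);
    return sum / spp;
}

int main() {
    std::mt19937 rng(42);
    const double reference = 1.0;  // analytic value of the toy integral
    const int spp_values[] = {1, 4, 16, 64, 256};
    for (int spp : spp_values) {
        const int pixels = 10000;  // average over many pixels to expose the trend
        double total_error = 0.0;
        for (int p = 0; p < pixels; ++p)
            total_error += std::fabs(estimate_pixel(spp, rng) - reference);
        std::printf("%4d spp  ->  mean error %.4f\n", spp, total_error / pixels);
    }
}
```

Quadrupling the ray budget only halves the error, which is why even a Titan RTX leaves visible grain behind and why the denoising work below matters so much.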

While noise reduction had featured in previous years, there was a big step forward this time: blue noise, temporal AA, and especially machine-learning denoising networks. Some of the denoising networks can achieve remarkable results from a minimal number of samples; others are in active use by film studios to improve preview times. Currently, the most effective techniques are computationally costly, but EA discussed how best to combine real-time approaches in the PICA PICA demo, and NVIDIA discussed denoising in Quake II RTX. We are some way from mobile ray tracing being viable for general use, but the results achieved with such limited ray throughput are starting to convince me that it should be on everyone’s radar, especially with machine learning hardware taking off. I think we’ll be busy with the rasterizer for a while yet, though!
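As a rough sketch of the simplest of those real-time approaches, the snippet below blends each new, noisy ray-traced frame into a per-pixel history buffer with an exponential moving average, so the effective sample count grows over time. This is only the core idea under my own simplifications (single channel, no reprojection, no history clamping, arbitrary example blend weight), not the PICA PICA or Quake II RTX implementation.

```cpp
// Minimal temporal-accumulation sketch: blend each frame's noisy ray-traced
// result into a running history buffer so noise averages out across frames.
// Real engines add motion-vector reprojection, history clamping, and spatial
// filtering on top; this shows only the core exponential moving average.
#include <cstddef>
#include <vector>

struct AccumulationBuffer {
    std::vector<float> history;  // one value per pixel (single channel here)
    float alpha;                 // blend weight for the incoming frame

    explicit AccumulationBuffer(std::size_t pixel_count, float blend = 0.1f)
        : history(pixel_count, 0.0f), alpha(blend) {}

    // Low alpha keeps more history (less noise, more ghosting under motion);
    // high alpha reacts faster to change but keeps more of the noise.
    void accumulate(const std::vector<float>& noisy_frame) {
        for (std::size_t i = 0; i < history.size(); ++i)
            history[i] = (1.0f - alpha) * history[i] + alpha * noisy_frame[i];
    }
};

int main() {
    AccumulationBuffer buffer(4);  // a four-pixel "image" for illustration
    std::vector<float> noisy_frame = {1.0f, 0.0f, 0.5f, 0.25f};
    for (int frame = 0; frame < 60; ++frame)
        buffer.accumulate(noisy_frame);  // history converges toward the input
}
```

The blend weight is the whole trade-off in miniature: keep more history and the image is smooth but ghosts when things move, keep less and the noise comes back, which is part of what makes the machine-learning denoisers so appealing.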

[caption id="attachment_11481" align="aligncenter" width="699"] Exhibition floor and Geek Bar[/caption]

It wasn’t all ray tracing. Motion capture, learned character motion, and simulating real-world surfaces all got their time, along with advances in fluid and collision modeling. SIGGRAPH typically runs several parallel tracks and usually manages to have at least two talks you’d like to see at any given moment, not counting the customary papers that are cool but that you’ll never need to use. Fortunately, the technical papers fast-forward session was available to provide an overview of everything (with the traditional damage to the Stanford bunny and armadillo models), although the reduced presentation times this year made it rather more serious than usual, which was a shame.

Usually, the Computer Animation Festival lightens things up a bit. Alongside the showcase of major studio work, there were some examples of the traditional dark humor: Wild Love and Stuffed, as well as the inappropriate comedy of Kinky Kitchen. This year, though, the organizers seem to have felt the need for a message in many of the pieces. Igor Coric’s “Passage,” The Stained Club, Hedgehog, and The Tree were all moving, but hardly uplifting, and I left slightly shell-shocked. Even if it was more political than usual, the festival did at least maintain the tradition of something weird.

[caption id="attachment_11505" align="aligncenter" width="700"] Cap’s Shield – Marvel Studios (Production Gallery)[/caption]

Victoria Alonso, executive VP of production at Marvel Studios, gave this year’s keynote. She endeared herself to the audience by saying she was a long-time SIGGRAPH attendee, and further won over a subset of the crowd by saying she wanted to keep film production local to Los Angeles. Unfortunately, I was a bit less enamoured of her admitted dislike of comics and her fixation on the record earnings of Avengers: Endgame – but I suppose film producers don’t have to be geeks like me! Several Marvel costumes did make it to the Production Gallery for the second year running, which was cool to see.

In contrast, I had the privilege to catch Katie Bouman, of the Event Horizon Telescope team, giving her “Imaging a Black Hole with the Event Horizon Telescope” talk. The audience of image-processing and computer-vision professionals really appreciated this session.

Finally, the Khronos Group gave a well-attended round-up of the state of their APIs. Their after-party had an unusual number of new faces – apparently due to the launch of the group’s “3D Commerce” initiative.

[caption id="attachment_11491" align="aligncenter" width="700"] The sun sets on another SIGGRAPH[/caption]

And so another SIGGRAPH came to an end. Next year SIGGRAPH hits Washington DC, which at least means my flights will be cheaper and shorter. I hope to see you there!
