In their latest project, the creators at Los Angeles-based company Impossible Objects were tasked with depicting an epic battle between characters from the upcoming video game, Diablo Immortal. But the showdown had to take place on the surface of a Google Pixel phone, set in the living room of a live actor.
Working with advertising agency Omelet, the team at Impossible Objects brought this vision to life using accelerated virtual production workflows to blend visual effects with live action. Using Epic Games’ Unreal Engine and NVIDIA RTX A6000-powered Dell Precision 7920 workstations, the team created all the stunning cinematics and graphics, from high-fidelity textures and reflections to realistic camera movement and lighting.
These advanced technologies helped the artists make instant creative decisions, because they could view high-quality virtual imagery rendered in real time.
“We can build larger, photorealistic worlds and not worry about relying on outdated creative workflows,” said Joe Sill, founder of Impossible Objects. “With Unreal Engine and NVIDIA RTX-powered Dell Precision workstations, we brought these Diablo Immortal characters to life.”
Real-Time Technologies Deliver Impossible Results
Previously, to tackle a project like this, the Impossible Objects team would look at concept art and storyboards to get an idea of what the visuals were supposed to look like. But with virtual production, the creators can work in a nonlinear way, bridging the gap between imagination and the final high-resolution images faster than before.
For the Diablo Immortal project, Impossible Objects used Unreal Engine for previsualization, where the artists were able to make creative, intentional decisions because they were experiencing high-fidelity images in real time. Moreover, the previsualization happened simultaneously with the virtual art department and layout phases.
The team used NVIDIA RTX A6000-powered Dell Precision 7920 workstations — an advanced combination that allowed the artists to enhance the virtual production and creative workflows. The RTX A6000 GPU delivers 48 gigabytes of VRAM, a crucial spec when offline rendering in Unreal Engine. With more GPU memory, the team had room for more geometry and higher resolution textures.
“Rendering would not have been possible without the RTX A6000 — we maxed out on its 48 gigs of memory, using all that room for textures, environments and geometry,” said Luc Delamare, head of Technology at Impossible Objects. “We could throw anything at the GPU, and we’d still have plenty of performance for real-time workflows.”
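To get a feel for how quickly 48 gigabytes fills up, here is a rough back-of-the-envelope sketch; the asset counts and texture resolutions below are illustrative assumptions, not figures from the project:

```python
# Back-of-the-envelope estimate of GPU memory consumed by uncompressed textures.
# All asset counts and resolutions are hypothetical examples, not project data.

def texture_bytes(width, height, channels=4, bytes_per_channel=1, mipmaps=True):
    """Approximate VRAM for one uncompressed texture; a full mip chain adds ~1/3."""
    base = width * height * channels * bytes_per_channel
    return int(base * 4 / 3) if mipmaps else base

GIB = 1024 ** 3

# Hypothetical asset mix for a dense cinematic environment.
budget = [
    ("8K environment textures", 40, texture_bytes(8192, 8192)),
    ("4K character textures", 120, texture_bytes(4096, 4096)),
    ("2K prop textures", 400, texture_bytes(2048, 2048)),
]

total = 0
for name, count, per_tex in budget:
    subtotal = count * per_tex
    total += subtotal
    print(f"{name}: {count} x {per_tex / GIB:.2f} GiB = {subtotal / GIB:.1f} GiB")

print(f"Uncompressed texture total: {total / GIB:.1f} GiB of a 48 GiB frame buffer")
```

In practice Unreal Engine compresses and streams textures, so real usage differs, but the arithmetic shows why the extra VRAM headroom matters once geometry, lighting data and other scene assets are layered on top.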
Typically, this project would have taken up to six months to complete. But the nonlinear approach enabled by the real-time pipeline allowed Impossible Objects to cut the production time in half.
The video game characters in the commercial were prebuilt and provided by Blizzard. Impossible Objects used Autodesk Maya to up-res the characters and scale them to perform better in a cinematic setting.
The team often toggled between compositing software, Autodesk Maya and Unreal Engine as they ported animation back and forth between the applications. And as the project started to get bigger, Impossible Objects turned to another solution: NVIDIA Deep Learning Super Sampling (DLSS), an AI rendering technology that uses a neural network to boost frame rates and produce sharp images.
“NVIDIA DLSS was incredibly important, as we were able to use it in the real-time workflow, even with characters that had high polygon counts,” said Delamare. “This solution became really helpful, especially as the project started to get denser and denser.”
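DLSS boosts frame rates by rendering each frame at a lower internal resolution and then reconstructing it to full resolution with a neural network. The sketch below shows how much shading work that saves; the 4K output resolution is just an example, and the per-axis scale factors are the commonly documented DLSS quality-mode values:

```python
# Pixels shaded per frame at native resolution vs. DLSS internal render resolutions.
# The 4K output target is a generic example; scale factors are standard DLSS mode values.

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K UHD output

dlss_modes = {
    "Native (no DLSS)": 1.0,
    "DLSS Quality": 1 / 1.5,        # ~67% of output resolution per axis
    "DLSS Balanced": 0.58,          # ~58% per axis
    "DLSS Performance": 0.5,        # 50% per axis
    "DLSS Ultra Performance": 1 / 3,
}

native_pixels = OUTPUT_W * OUTPUT_H
for mode, scale in dlss_modes.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    print(f"{mode:24} renders {w}x{h} -> {w * h / native_pixels:6.1%} of native shading work")
```

Because only a fraction of the pixels are shaded before reconstruction, dense, high-polygon characters can stay in the real-time viewport without the frame rate collapsing.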
At the animation stage, Unreal Engine and NVIDIA RTX allowed the team to simultaneously update cinematography and lighting in real time. The end result was fewer department handoffs, which saved time and kept creative communication efficient.
With all of these advanced technologies combined, Impossible Objects had the power to create a more efficient, iterative process — one that allowed the team to ditch linear pipelines and instead take on a much more creative, collaborative workflow.
To learn more about the project, see the full article on NVIDIA's Blog.