How Real-Time Rendering From Game Engines Is Altering The VFX Landscape

Source: https://www.starwars.com/news/we-set-the-bar-so-high-doug-chiang-on-designing-rogue-one

While films and video games have both explored and exploited visual effects (VFX) technology for years, the two media have achieved similar results through disparate paths. Although technologies have been shared, their usage has differed between the platforms. Lately, though, games have grown more and more visually stunning, and VFX-heavy movies more and more immersive, and these two worlds have finally begun to overlap.

That overlap has taken the form of real-time rendering. Gamers will be intimately familiar with how this works: you enter a new arena, scene, map, or location, start moving around, and everything is simply there, from buildings to objects to characters. Depending on the game and the player's perspective, many of these elements react to specific interactions too, and we don't think twice about how all of this is being displayed in real time. Now this same technology is coming into film production, and this turn of events has one catalyst: Epic Games' Unreal Engine.

Unreal Engine and how it all began

Typically, games aren't anywhere near as visually complex as films, because the computing power required to render everything we see in real life, in real time, would be enormous. Beyond that, the man-hours required to fine-tune basics like lighting, depth of field, and compositing would be off the charts and unfeasible for game makers. Epic Games changed all that with the launch of the Unreal Engine. While game developers have used Unreal Engine since its launch to achieve higher-quality rendering in real time, filmmakers are catching on, too.
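To make the distinction concrete, here is a minimal illustrative sketch (plain Python, not Unreal Engine code, and the function names are our own): the practical difference between offline and real-time rendering comes down to the time budget each frame is allowed.

```python
def frame_budget_ms(target_fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / target_fps

def is_real_time(render_time_ms: float, target_fps: float = 60.0) -> bool:
    """A renderer counts as 'real-time' only if every frame fits the budget."""
    return render_time_ms <= frame_budget_ms(target_fps)

# A game engine frame at 60 fps has roughly 16.7 ms to do everything:
# culling, lighting, shading, post-processing, compositing.
print(round(frame_budget_ms(60.0), 1))   # 16.7

# An offline film render can spend hours per frame on a render farm;
# that is fine for pre-rendered VFX but impossible for interactive playback.
print(is_real_time(12.0))                # True  (fits a 60 fps budget)
print(is_real_time(3_600_000.0))         # False (one hour per frame)
```

The fps targets and timings here are illustrative; the point is simply that a game engine must hit its frame budget every time, which is exactly why filmmakers can now see shots on screen as they happen instead of waiting for overnight renders.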

  • The first example of this was a short called The Human Race, which mixed live-action footage with a pair of CG cars rendered in real time, blending the two so seamlessly that it took viewers' breath away. See for yourself.
  • As for the cars, they are another victory for Unreal and real-time rendering, and they harness the powers of a vehicle called The Blackbird. The Blackbird is a running, driving electric car whose length, dimensions, and performance can all be changed to mimic whatever car you want, without the need for the actual vehicle. It combines some breathtaking tech, including LIDAR, sensors, and cameras, to recreate reflections, lighting changes, and more. We'll let the makers of the car explain it in this interesting video from Top Gear:

Rogue One and K-2SO

Moving away from the automotive side of things and onto a full-fledged film set, Rogue One: A Star Wars Story used real-time rendering to create what is probably the film's most iconic character, K-2SO. The droid had some great scenes, stole the hearts of millions of Star Wars fans, and was rendered in real time using Unreal.

An article on Polygon sheds more light on how it all happened, summarizing what John Knoll, VFX Supervisor for the movie, said in his GDC 2017 keynote speech: “Knoll spoke at length about how Epic’s engine allowed the ILM team to render the beloved, sarcastic droid K-2SO in real time, bypassing the pre-rendering process. As a result, the team was able to see K-2SO on the screen during a specific scene instead of having to implement the rendering and editing after the fact. Knoll explained that achieving final pixels on screen helped with the production of Rogue One — and it marked the first time the studio was able to work with CGI in the moment.”

Here are some highlights from the droid's appearances in the film so you can see the final product:

Real-time Rendering in Ready Player One

Unreal Engine and real-time rendering were also used to bring the sets of the dystopian sci-fi film Ready Player One to life. The Spielberg-directed movie gained widespread recognition for its cutting-edge graphics, particularly its depiction of the virtual world, the OASIS.

As this Studio Daily piece reports, the legendary director himself said this was one of his most challenging projects:

“The layers we had to achieve to put the OASIS on screen made it one of the most complicated things I’ve ever done,” said Spielberg. “There was motion-capture, live action, computer animation … It was really like making four movies at the same time.”

He used a combination of motion capture, virtual reality, and real-time rendering to create what is essentially an entire new world within the real one, and to make it look complex, immersive, and believable. As for the result, take a look and make up your own mind:

Expect more and more projects to harness the capabilities of Unreal Engine and shift to real-time rendering to make VFX more spectacular in the coming years. This technology will only grow with time, to the delight of both filmmakers and film fans.

Need help with specialized and highly focused VFX solutions for a film, TV or web series project? Get in touch with us at Toolbox Studio, a pioneer in the world of visual effects outsourcing.

CONTACT US NOW