‘A marriage of film and game artists’: Why filmmakers are turning to Unreal Engine and virtual production


When you imagine the production process of a Hollywood movie, you probably don’t think of drawing tablets and VR headsets.

However, between pandemic-related pressures and opportunities to streamline production with new hardware, movie sets are increasingly incorporating the latest technology in exciting and innovative ways. 

At the forefront of this change is Felix Jorge, CEO and co-founder of Narwhal Studios. Jorge’s résumé includes some impressive career highlights, from Disney’s live-action Jungle Book to The Mandalorian, and Ant-Man and the Wasp: Quantumania.

Starting his career as a visualization artist designing shots for movies and games, Jorge later co-founded Narwhal Studios, which uses game engines to assist in the filmmaking process. This method is known as virtual production, and it’s already being used in Narwhal Studios’ work on some of the biggest Hollywood movies and shows we know and love, including Ahsoka and The Book of Boba Fett.

In an interview with TechRadar, Jorge explained how this combination of next-generation hardware and software comes together to revolutionize the production process. 

Among other projects, Felix Jorge has worked on the virtual production for The Mandalorian (Image credit: Disney)

Unreal potential 

Much like Jorge himself, virtual production has roots in visualization. He explains: “It came from the need to give directors a process to design their shots early on in production. Let’s say you are going to make a VFX movie; before investing in all those extremely expensive shots, you would hire a smaller team and design with them.”

So, instead of going out to physical locations to find shots, working out how to make the most of one physical space and telling a story rooted in the scenery available, Jorge explains that virtual production allows production designers and art directors to “build the world and find shots in the world they’re creating”.

To do this, the Virtual Art Department will build a quick world in Unreal Engine, which creatives can then explore to find their shots, just as they would in the real world. Jorge explains that there are even specific tools for filmmakers that allow them to navigate a set just as they would in real life: “So, if you’re a camera person, all your cameras will be represented in the engine and you can place them in a VR headset or with your phone.”
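
To give a rough sense of what “cameras represented in the engine” can look like under the hood, here’s a minimal sketch using Unreal Engine’s editor Python scripting: it simply spawns a CineCameraActor into the open level so a shot can be framed and reviewed. The camera label and coordinates are placeholder values for illustration, not anything from Narwhal’s actual pipeline.

```python
import unreal

# Illustrative sketch only: spawn a virtual scout camera in the open level
# so a shot can be framed and reviewed. Values below are placeholders.
location = unreal.Vector(0.0, 0.0, 180.0)   # roughly eye height, in centimetres
rotation = unreal.Rotator(0.0, 0.0, 90.0)   # roll, pitch, yaw in degrees

camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, location, rotation
)
camera.set_actor_label("ScoutCam_A")        # hypothetical name for this example
```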

Beyond visualization, virtual production is spreading to other production processes, including in-camera VFX, which Jorge says was pioneered in the first three seasons of The Mandalorian. “It allows us to take those game engine assets and use Unreal to put them up on screens and shoot directly into it,” he says. “So now, we have real actors pretty much in the game engine.

“That’s an interaction that never existed before, and I think that’s one of the most impactful processes. It’s revolutionizing the process of the film industry.”

If this is all sounding a bit Metaverse-adjacent, well, that’s because it is, says Jorge. “You know, it’s funny because it is [like the metaverse]. What’s beautiful about the process is that it allows someone who already has a pre-established process from the past to still do what they’re excellent at and not have to change it.”

Of course, having things exist in the digital realm does grant certain new benefits, especially when it comes to designing the world in real time. Jorge explains that, using VR headsets, the director can instruct the crew, even down to changing the time of day on set so that everyone can see the changes all at once.
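
As a concrete flavour of that kind of real-time change (an illustration, not something described in the interview), a small editor-scripting snippet like the one below could swing a level’s sun to a new time of day, assuming the set’s sun is a standard DirectionalLight actor:

```python
import unreal

# Illustrative sketch only: nudge the level's sun so the whole crew sees a
# new time of day at once. Assumes the sun is a DirectionalLight actor;
# the rotation values are placeholders.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if isinstance(actor, unreal.DirectionalLight):
        # A pitch of -15 degrees puts the sun low in the sky, like late afternoon.
        actor.set_actor_rotation(unreal.Rotator(0.0, -15.0, 45.0), False)
        break
```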

While virtual production was starting to gain a foothold on movie sets before the pandemic, Jorge says Covid-19 allowed the process to shift to a fully remote and virtual one. “One of the first movies we did during Covid was Ant-Man and the Wasp: Quantumania. That production was in London, then they had someone in Hawaii, someone in LA, and multiple local people, and they all wanted to stand in the world together and find their shots.”

“We would walk them through how to engage with the process, and how to create your film virtually, and then all they had to do was show up to the computer. It allowed them to have a process that’s very familiar and had a smaller footprint, but also it allows them to retain a lot of creative control.”

Breaking it down 

So, how does it all come together? Jorge explains that the process starts with the conception stage, where his team works with filmmakers to break down the script and identify which scenes are good candidates for virtual production or VFX, and which are better suited to a physical location shoot.

Once he knows the scope of the virtual production, his team begins building the set virtually. Depending on the location, he says, there might be assets online to help pad out the foundation. “If you have a city set, that’s common, and if we can get all the assets online, we’re going to do that. We’ll change them and we’ll make them unique, but [it means] we don’t have to start from scratch. It’s cool because it’s a marriage of film and game artists.”

However, for movies like Ant-Man and the Wasp: Quantumania, some environments need a little more inventiveness. “It’s such a trippy world. In that movie, we had close to 15 artists helping the key creatives design the world. 

“We were working with the director on finding the action shots and the hero areas, working with the production designer designing the world itself. He had an effects artist to help with the crazy clouds, and different artists were working a lot with the VFX supervisor as well, because there was a marriage there between the art and the effects.”

Then, every week, Jorge says the team would critique the world that was being built together, and select hero areas so creative budgets weren’t being wasted. “This is the major difference between real-time and older processes for film,” he explains. “With a game engine, you can see things in real time, you can rotate them, you can apply your craft.”

While concerns remain about the impact of some newer technologies like AI on original TV shows and movies, Unreal Engine as a platform for collaboration shows a more positive relationship between hardware, software, and art – and potentially one that could benefit filmmakers of all experience levels. It might also be the first sensible use of the metaverse we’ve seen in practice so far.
