
The Future of Cinema is Video Games, Just Not the Way You Think

source link: https://newanddigital.medium.com/the-future-of-cinema-is-video-games-just-not-the-way-you-think-b6a664099b76


Up close with a new technology at the National Association of Broadcasters Convention


The Batman — filmed with responsive LED monitor walls instead of green screen to produce a realistic Gotham City (source: WarnerBros)

“In the next decade, the future of cinema will incorporate video game technology,” said Paul Graff, visual effects supervisor of Boardwalk Empire, Stranger Things and The Wolf of Wall Street. Graff said this to my 2011 cohort at the Academy of Television Arts and Sciences Faculty Seminar. He had just shown us the VFX he used in season 1 of Boardwalk Empire and explained the advances in computer graphic design. His hybrid computer effects incorporated much of what we’d later see in the Marvel Cinematic Universe: huge green screens, motion tracking, camera mapping, and a crew of talented CG artists. Graff said it was only a matter of time before video game engines worked alongside CG teams, directors, and cinematographers.

A decade later, at the National Association of Broadcasters (NAB) Convention, the product in focus was the technology used in the Disney+ series The Mandalorian, in Rogue One, and in The Batman: virtual sets combined with game engines to create responsive, immersive worlds for the viewer.

For almost a century, computers had very little to do with filmmaking. From the Emerald City in The Wizard of Oz to the massive landscapes and space scenes in Star Wars, backgrounds were created using matte painting, an art of rendering huge perspective paintings on large pieces of glass. The painted glass was backlit, and on-screen talent was filmed in front of the background using perspective filming techniques.


Matte painting with compositing cut out for filming in perspective from Star Wars: The Empire Strikes Back

In the 1980s and 90s, the technique of compositing (merging multiple image layers) introduced optical effects. Optical effects later gave way to computer-generated imagery (CGI), enabling directors to build any world they could imagine. Unlike matte paintings, CGI backgrounds could move. The downside was that actors had to work with huge green screens and their imagination, placing a lot of work on the director to envision perspective, angles, and action. Unfortunately for the filming process, the CGI background and scenery existed only in a computer, to be added later in postproduction.


Avengers on their way to assemble in front of a massive green screen

Then, in 2020, the cinematographers and directors behind the Lucasfilm Disney+ series The Mandalorian used video game technologies to create immersive spaces, combining high-resolution, durable monitors with game engines that produced background scenes in real time. At this year’s NAB Convention, the first in-person event since 2019, this monitor and game engine technology was one of the highlights of the show. The technology incorporates two parts: the screens and the game engines.

The screens

Multiple vendors displayed variations of modular snap-together screens: flexible, flat, high-resolution LED monitors mounted on panels that clip together without any visible gap. Screens can be assembled to any size and any curve. The screens are rated by their pixel resolution (smaller pixels, higher resolution) and by their ability to be calibrated to the camera. That means you can lower the luminance of the monitors to the point that they display intense, high-resolution color with very little light, a feature that pleases directors of photography, who can literally tune the brightness of the background.
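As a rough illustration of how those pixel specs translate into a finished wall, here is a minimal Python sketch. The 500 mm panel size and 2.6 mm pixel pitch (the spacing between pixel centers, which is what "smaller pixels, higher resolution" refers to) are illustrative assumptions, not figures from any vendor at the show:

```python
import math

def wall_specs(wall_w_m, wall_h_m, panel_mm=500, pitch_mm=2.6):
    """Estimate panel count and total pixel resolution for an LED wall.

    panel_mm: edge length of one square LED panel (illustrative)
    pitch_mm: pixel pitch, the spacing between pixel centers --
              a smaller pitch means smaller pixels, higher resolution
    """
    cols = math.ceil(wall_w_m * 1000 / panel_mm)   # panels across
    rows = math.ceil(wall_h_m * 1000 / panel_mm)   # panels high
    px_per_edge = round(panel_mm / pitch_mm)       # pixels along one panel edge
    return cols * rows, (cols * px_per_edge, rows * px_per_edge)

# A 20 m x 6 m wall at 2.6 mm pitch: 480 panels, 7680 x 2304 pixels
panels, (w_px, h_px) = wall_specs(20, 6)
```

Because the panels snap together in a plain grid, scaling a wall up is just adding rows and columns; the cost and pixel count grow linearly with area.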

Taylorleds is one of the many modular screen companies at NAB; the center image shows the panel up close

In terms of cost, most companies can outfit a TV wall the size of a billboard for under $500,000 (relatively cheap in moviemaking). The screens can be taken apart after each shoot, rebuilt, and reused over and over. Actors can see what is on the screen and react in real space, and the screens can be scaled to entire walls and wrapped around the ceiling and floor.


Vū is one of the largest immersive screen companies — this assembled screen is 150 feet wide

The Game Engine

When you wear virtual reality goggles, you are putting two tiny monitors very close to your face, with convex lenses that let you see panoramically. Each VR screen uses a “differential” to move the video in sync with your head movement as motion capture cameras track you. The differential is similar to the way your car spins one wheel faster and one slower in a turn. In VR, the differential is coded in for immersive realism and to keep you from getting nauseated. Game engines built for immersive design are coded for both differential and camera angle.
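The simplest piece of that differential, giving each eye its own slightly offset viewpoint, can be sketched in a few lines of Python. The function name, the fixed world-up vector, and the 64 mm interpupillary distance are all illustrative assumptions, not any engine's actual API:

```python
def eye_positions(head, forward, ipd=0.064):
    """Offset the head position sideways by half the interpupillary
    distance (IPD) to get each eye's viewpoint; the two slightly
    different views are the differential that creates stereo depth."""
    fx, fy, fz = forward
    # right vector = forward x world-up (0, 1, 0)
    rx, ry, rz = -fz, 0.0, fx
    norm = (rx * rx + ry * ry + rz * rz) ** 0.5
    rx, ry, rz = rx / norm, ry / norm, rz / norm
    half = ipd / 2
    left = (head[0] - rx * half, head[1] - ry * half, head[2] - rz * half)
    right = (head[0] + rx * half, head[1] + ry * half, head[2] + rz * half)
    return left, right
```

A real headset re-evaluates this every frame from the tracked head pose, which is why lag between head motion and the rendered view is what makes people queasy.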

Game engines excel at real-time rendering of objects and environments. On the sets of The Mandalorian, the camera (rather than the actor) is outfitted with a motion-capture locator, and the background moves behind the subject on screen. Unlike immovable matte paintings, which required that the camera barely move, the new technology moves the background with the camera so the scene bends with proper perspective. See the following example:


The inset displays the Unreal Engine camera moving in sync with the physical camera

The subjects in front of the background can now see what’s going on behind them. On-screen talent can now be in the CGI virtual space in real time, able to personally view what the audience sees on screen.
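The core of the camera-tracking loop can be sketched in Python rather than any real engine API: each frame, the tracked pose of the physical camera is copied onto the virtual camera (plus an offset locating the stage in the virtual world), and the engine re-renders the wall from that pose. All names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in meters
    rotation: tuple  # (pitch, yaw, roll) in degrees

def sync_virtual_camera(tracked: Pose, stage_offset=(0.0, 0.0, 0.0)) -> Pose:
    """Copy the physical camera's tracked pose onto the virtual camera.
    The engine re-renders the wall from this pose every frame, so the
    background shifts and bends with correct perspective as the real
    camera moves. stage_offset places the physical stage in the
    virtual world."""
    x, y, z = tracked.position
    ox, oy, oz = stage_offset
    return Pose((x + ox, y + oy, z + oz), tracked.rotation)
```

Changing stage_offset between takes is, in effect, how a crew "teleports" the same physical stage to a different spot in the virtual environment.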


A full monitor set, including monitor floor, in action from the company OptiTrack

The Future

“Walls, Lydia, remember; crystal walls, that’s all they are. Oh, they look real, I must admit — Africa in your parlor — but it’s all dimensional, superreactionary, supersensitive color film and mental tape film behind glass screens. It’s all odorophonics and sonics, Lydia.” (The Veldt, Ray Bradbury, 1950)

As resolution, durability, and game engine quality and speed improve, these sets can recreate the scale and perspective of the matte painting with the visual accuracy of a real scene. Monitor sets enable a nearly infinite amount of customization, limited only by the creator’s imagination. Entire virtual environments can be created, and on-screen talent can be immersed more than ever before.

Back in 1950, Ray Bradbury imagined this technology in his sci-fi short story “The Veldt” and George Lucas calls the new technology “the stage of the future.” While we aren’t yet combining monitors with smell-o-vision and holograms, we probably aren’t that far away.

As resolution increases and game engine processing further develops, we may be able to build immersive virtual rooms in our own house. From what I saw at this year’s NAB, it’s only the beginning of virtual world building in physical space. Where we’re going, we won’t need goggles.

