Augmented Reality for Movie Productions (II)

Part II (For part I see here)

This week, as promised, I'm posting the interview I did a few weeks ago with the CEO of Lightcraft Technology, Eliot Mack. Eliot and I talked about their approach to using AR technology to support movie productions on set with a live preview. It brings the previously mentioned previs and post-production processes together – live on set. Their system Previzion carries the tagline “Visual Effects to Go”. It’s a combination of high-precision camera tracking, sophisticated rendering, and VFX-quality keying that makes it ideal for on-set compositing of virtual backgrounds and CGI characters. They’ve been working on the science-fiction TV series “V”, doing live previs and post-production!

(C) Lightcraft Technology

Hi Eliot. Thanks for taking the time. I’ve introduced previs to my readers and read about your system. But to kick things off from the other direction: is Augmented Reality a known term in previs, and do you consider your work to be AR in some way?

Hi Tobias. Thanks for having me. In the previsualization community, the real-time combination of virtual and live-action worlds goes by quite a few names: ‘on set live preview’, ‘stage vis’, ‘live compositing’, and ‘Simulcam’ are a few of them. It’s the same basic concept.

In a similar vein, ‘Augmented Reality’ is a fairly broad term, but in practice it usually means through-the-lens recognition of fiducial targets and features, and overlaying imagery on those targets.

Previzion definitely shares some technical origins with AR work in fiducial recognition and CG/live action graphics, but has a different development focus. For example, the target recognition camera is separate from the scene camera, as the targets can’t be visible in the scene.

We ended up driving the technology into the specific realm of production level VFX work by incorporating high accuracy lens calibration, custom built inertial sensors, photorealistic real time rendering, VFX quality keying, and complete data logging to create a system to handle the very large scale problems that are encountered in the production world.
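If you’re curious what the fiducial recognition Eliot mentions looks like in code, here’s a tiny sketch in Python using OpenCV’s ArUco markers. To be clear, this is my own stand-in for illustration – Previzion uses its own targets, sensors, and calibration, not this library:

```python
import cv2

# Stand-in illustration (requires opencv-contrib-python, classic aruco API).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

cap = cv2.VideoCapture(0)  # pretend this is the upward-facing tracking camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Find fiducial targets in the current frame
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        # In a full AR/previs system you would now estimate the camera pose
        # from these corners and render CG from that exact viewpoint.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("fiducial tracking", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```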

(C) Lightcraft Technology

I introduced readers to the previs concept earlier. When do customers use previs live on location?

We have two main areas: feature film/episodic previsualization, where we show a real-time preview and record the motion data for use later in post production, and in-camera finishing, where the keyer and photorealistic background rendering are of high enough quality to be used as the final product.

We get used anywhere CG is mixed with live action (greenscreen, scene extensions, animated characters, vehicles, driving plates, etc.) and the speed and quality have to be very high.
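Since keying keeps coming up, here’s a toy example of what a keyer does at its core – a deliberately crude green-difference matte in Python/NumPy, nothing like the production-grade keyer in Previzion:

```python
import numpy as np

def green_difference_matte(rgb):
    """Crude keyer: a pixel is 'greenscreen' to the extent its green
    channel exceeds its other channels. Production keyers add spill
    suppression, edge blending, and much more."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    alpha = 1.0 - np.clip(g - np.maximum(r, b), 0.0, 1.0)
    return alpha  # 1.0 = keep foreground, 0.0 = reveal the CG background

# toy 1x2 image: a skin-tone pixel next to a greenscreen pixel
frame = np.array([[[0.8, 0.6, 0.5],    # actor
                   [0.1, 0.9, 0.1]]])  # green screen
print(green_difference_matte(frame))   # -> [[1.0, 0.2]]
```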

Eliot gave me permission to blog this video showing their work in action:

Your company worked on the production of the TV series “V”, where you did on-set previs with your system. Could you briefly describe the setup we see in the video and how it works?

The video above gives a pretty good overview. The tracking sensors are mounted on the camera, the lens motors are connected to the lens, and the optical fiducial targets are mounted on the ceiling. The Previzion system brings all of the raw tracking, video, and lens data together, renders the matching 3D background, composites it, and sends it out to the preview monitors with only a five-frame delay.
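To make that data flow a bit more concrete for the programmers among you, here’s a minimal sketch of such a fixed-latency preview loop in Python. All the names and the threading model are my own placeholders, not Lightcraft’s architecture – only the five-frame figure comes from Eliot:

```python
from collections import deque
from concurrent.futures import ThreadPoolExecutor

DELAY = 5  # the five-frame preview latency mentioned above

# Placeholder subsystems -- the names and logic are mine, for illustration:
def solve_camera_pose(sensor_data, lens_data):
    return (sensor_data, lens_data)           # fuse tracking + lens encoders

def render_background(pose):
    return f"background rendered for {pose}"  # real-time 3D render

def key_and_composite(live_frame, background):
    return (live_frame, background)           # greenscreen key over the render

renderer = ThreadPoolExecutor(max_workers=1)
in_flight = deque()  # live frames waiting for their background to finish

def on_new_frame(live_frame, sensor_data, lens_data):
    """Buffer live video for a few frames so each frame meets the
    CG background that was rendered from its own tracking data."""
    pose = solve_camera_pose(sensor_data, lens_data)
    in_flight.append((live_frame, renderer.submit(render_background, pose)))
    if len(in_flight) < DELAY:
        return None                           # pipeline still filling up
    frame, bg = in_flight.popleft()           # oldest buffered live frame
    return key_and_composite(frame, bg.result())

# feed it a few frames: the first calls return None, then composites flow
for n in range(8):
    print(n, on_new_frame(f"frame {n}", sensor_data=n, lens_data=n))
```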

Does it also work outdoors or only on stages?

For outdoor shooting, you can either mount the optical targets sideways and point the tracking camera at them, or use encoded Technocranes, jibs, and dollies. We’ve done a complete integration with the General Lift encoding system, and adding other encoder systems is straightforward.

How long are the setup times?

Setting up the optical targets on a stage takes about a day. After the stage is calibrated, daily on-set operations aren’t much affected by the system, apart from the presence of the live preview.

Likewise, measuring an encoded crane takes about an hour on a pre-production tech day, and the daily setup consists of powering on the crane encoders while the crane is in its default position, then making two measurements to locate the crane in the virtual space. It’s quite rapid.
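For the geometry-minded: two point measurements are exactly enough to pin down a rig’s position and heading on a flat floor. Here’s a minimal 2D rigid-transform fit in Python illustrating the idea – my own sketch, not Lightcraft’s calibration routine:

```python
import numpy as np

def locate_rig(p1_rig, p2_rig, p1_stage, p2_stage):
    """Fit the 2D rigid transform (yaw + floor position) that maps the
    rig's own coordinates onto stage coordinates, from two measured points."""
    d_rig = np.subtract(p2_rig, p1_rig)
    d_stage = np.subtract(p2_stage, p1_stage)
    # heading difference between the two coordinate frames
    yaw = np.arctan2(d_stage[1], d_stage[0]) - np.arctan2(d_rig[1], d_rig[0])
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    t = np.asarray(p1_stage) - R @ np.asarray(p1_rig)  # rig origin on stage
    return yaw, t

# Example: the rig reports a mark at (1, 0) and (2, 0); surveyed on the
# stage, the same marks sit at (0, 1) and (0, 2) -> the rig is turned 90°.
yaw, t = locate_rig((1, 0), (2, 0), (0, 1), (0, 2))
print(np.degrees(yaw), t)  # ~90.0, ~[0. 0.]
```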

Having set up all the hardware – you mentioned earlier that you also do live compositing, e.g. with 3D set extensions, that can be broadcast as is. With Augmented Reality tracking we often experience inaccuracies or tracking errors. Is the accuracy solid enough for live compositing?

The accuracy is good enough for live compositing in many cases. The floor contact is not subpixel accurate (it slides by a couple of pixels), but if you have a practical floor and can blend the CGI seam into the practical floor, it’s fine. We’ve already been used to final some on-air children’s TV shows and infomercials.

Sounds great. So, let’s talk about the concept. What is the biggest advantage, and for whom on set? How does the system help to visualize ideas and decisions?

When we started building Previzion, I thought the main benefit would be a big cost and time savings in post production, by automating much of the tracking process. […] The main economic benefit is to the producer, as automating the tracking enables them to use hundreds of shots per week in episodic TV (like “V”). The automation definitely happened, but what surprised me was how much the live preview opened up the on-set creative process:

The main creative benefit is to the director and DP, as they can see the whole shot come together and maintain creative control over what the show looks like. Without a preview, people tend to be conservative on greenscreen shoots, as they are worried about having someone run into a (virtual) wall.

When the camera operators had a live composited feed coming over the monitors, they could suddenly operate a virtual shot exactly like a normal live action shot. They could plan camera moves with the entire composition in frame, sweep around virtual geometry to maximize the dramatic impact of the shot, and make more powerful images.

The live preview simply transforms the VFX shooting process by bringing instant visual feedback.

…continuing with instant visual feedback: what is missing from the AR approach today, and what do you want to integrate into the system in the future, now that you have collected a lot of experience with “V” and other projects?

We are continuously evolving the systems in the direction of better rendering, keying, tracking, stage usability, asset management, and data recording.

We are already seeing several customers using the system to finish shots in real time, as the keying and photorealistic background rendering are quite good.

We’ll be pushing the envelope of what is possible in real time very hard, as the cost and time savings made possible by finishing on set are simply enormous.

We’re also doing a lot of work in stereoscopic rig tracking and metadata handling, and working toward live stereoscopic preview. The optical lens calibration system that we invented is a natural fit for the precise optical matching required in stereo VFX.

Since we build the entire real time pipeline, we can work toward incorporating advanced VFX tools like 4:4:4 log space camera data input, HDRI lighting and rendering, scene referred linear compositing, color grading, and device corrected output color space transforms for realistic color previews. These are all standard tools in high end feature film VFX, but with our unique software and hardware architecture we can make them work in real time, which makes those techniques viable in a huge range of productions.
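If “scene referred linear compositing” sounds abstract, here’s a toy Python/NumPy pipeline showing the idea: decode log footage to linear light, composite there, then apply a display transform. The curves are simplified stand-ins of my own, not any real camera’s log encoding or a calibrated output transform:

```python
import numpy as np

def log_to_linear(x):
    # toy log decode: pretend the camera stored log2 exposure around
    # mid-grey (code 0.5 -> 0.18); real cameras use their own curves
    return 0.18 * 2.0 ** (10.0 * (x - 0.5))

def linear_to_display(x):
    # toy display transform: clip and apply an sRGB-like 1/2.4 gamma
    return np.clip(x, 0.0, 1.0) ** (1.0 / 2.4)

def over(fg, alpha, bg):
    # alpha-over done on linear light -- mixing in log or gamma space
    # would tint and darken the blended edges
    return fg * alpha + bg * (1.0 - alpha)

log_plate = np.array([[0.50, 0.65]])   # log-encoded camera foreground
key_alpha = np.array([[1.00, 0.40]])   # matte from the keyer
cg_render = np.array([[0.10, 0.10]])   # scene-linear CG background

comp = over(log_to_linear(log_plate), key_alpha, cg_render)
print(linear_to_display(comp))         # what the preview monitor shows
```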

Talking about HDR lighting and high-end visual effects… isn’t this too much for real time today?

At a certain level of scene complexity, of course, it becomes something you have to render in post. Since the Previzion real time VFX pipeline is set up to mirror a post production VFX pipeline, the settings that were used on set to generate a director-approved look can be used as a perfect jumping-off point for the post process, without losing data or time in the process.

So you exchange lower-quality placeholders for higher quality later on. Are you aiming more and more at live photorealism, or what is the most important feature today?

Photorealism has obviously been critical in our in-camera finishing work. What has been more surprising is the desire for sophisticated rigging, models, and lighting in the on set feature film preview world. Everyone wants to see what the movie will actually look like as soon as possible.

Much of the virtual background 3D design work that traditionally happened in post production is being pushed into preproduction in the previsualization stage. This is a good thing, as it makes all the departments communicate before and during production instead of finding problems when it is too late in post to do anything about it.

To conclude in a nutshell: what’s the great thing about your system and AR in movie production?

The benefit to post production is that Previzion takes care of a lot of the boring work – tracking, data recording, lens matching, etc. – and helps make sure that elements shot on the stage match the background lighting, have good greenscreen lighting, etc. It basically removes most of the unpleasant surprises that can cause post budgets to skyrocket.

Great! So, thanks a lot for all this insight, Eliot!

Hope you’ve enjoyed the interview. Please let me know in the comments or on Facebook if you’d like to see more of this format (or even have an interesting interview topic for me). :-) Cheers!
