
3D Tracking, Stabilization, and Effects Workflow

The handling of any 360 VR shot is of course determined by what you want to do with it. Scripts in the "360 VR" submenu will assist you. Here are the general steps involved:

Open the shot, marking it with a 360 VR mode of "Present."

Do automatic and/or supervised tracking of the shot, producing 2D tracker paths for features in the footage.

Solve the shot to produce a 3D camera path and orientation, plus 3D locations for all the tracked features.

Set up a coordinate system using the 3D locations to orient the overall scene: "Which way is up?!" (The first sketch after these steps illustrates the underlying idea.)

Run the Stabilize from Camera Path script to reorient and stabilize the original footage, with additional options for path-aligned cameras. (The second sketch after these steps shows the per-frame reorientation idea.)

If desired, create secondary linear shots to facilitate adding 3D objects to the shot using rendering applications that are not 360-VR-capable. (The third sketch after these steps shows the projection involved.)

Export to other applications, for example so the stabilization can be applied in After Effects, or so 3D objects can be added to the footage.
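For the coordinate-system step, the following Python/NumPy sketch shows one way to answer "which way is up" mathematically: fit a plane to solved tracker positions that are known to lie on the ground, then build the rotation that maps the plane's normal onto the world up axis. This is illustrative only; the tracker coordinates are invented, the choice of +Z as "up" is an assumption, and SynthEyes' own coordinate-system tools perform the equivalent alignment interactively rather than through code like this.

```python
# Illustrative sketch (not SynthEyes code): level a solved scene by rotating
# the best-fit ground plane's normal onto the world up axis (+Z assumed here).
import numpy as np

def up_alignment_rotation(ground_points, up=np.array([0.0, 0.0, 1.0])):
    """Rotation matrix mapping the best-fit plane normal of the points onto 'up'."""
    pts = np.asarray(ground_points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The smallest singular vector of the centered points is the plane normal.
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    if np.dot(normal, up) < 0:            # pick the normal that points "upward"
        normal = -normal
    v = np.cross(normal, up)              # rotation axis (unnormalized)
    s = np.linalg.norm(v)
    c = np.dot(normal, up)
    if s < 1e-12:                         # already aligned
        return np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    # Rodrigues' formula for the rotation taking 'normal' to 'up'.
    return np.eye(3) + vx + vx @ vx * ((1 - c) / s**2)

# Hypothetical tracker positions lying on a tilted ground plane:
pts = np.array([[0, 0, 0], [1, 0, 0.1], [0, 1, 0.1], [1, 1, 0.2], [2, 1, 0.3]], float)
R = up_alignment_rotation(pts)
aligned = pts @ R.T                       # apply R to each point (row vectors)
print(np.round(aligned[:, 2], 6))         # heights now ~equal: the ground is level
```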
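The second sketch illustrates the idea behind per-frame reorientation of a lat/long (equirectangular) image: each output pixel's view direction is rotated by the inverse of the solved camera rotation for that frame, and the source frame is resampled along the rotated direction. It is a minimal nearest-neighbour version for explanation only; the camera-to-world rotation convention and the test rotation are assumptions, and this is not the implementation used by the Stabilize from Camera Path script.

```python
# Illustrative sketch (not the Stabilize from Camera Path implementation):
# remove a solved per-frame camera rotation R from an equirectangular frame
# by resampling each output pixel along the inversely rotated view direction.
import numpy as np

def reorient_latlong(frame, R):
    """Resample an equirectangular frame (H x W x C) with camera rotation R removed."""
    h, w = frame.shape[:2]
    # Longitude/latitude at the centre of every output pixel.
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Unit view directions for the stabilized output (z up convention assumed).
    d = np.stack([np.cos(lat) * np.cos(lon),
                  np.cos(lat) * np.sin(lon),
                  np.sin(lat)], axis=-1)
    # Rotate the directions back into the original (shaky) camera frame.
    d_src = d @ R                         # row vectors: same as R.T applied per pixel
    src_lon = np.arctan2(d_src[..., 1], d_src[..., 0])
    src_lat = np.arcsin(np.clip(d_src[..., 2], -1.0, 1.0))
    # Map back to source pixel coordinates and sample (nearest neighbour).
    x = ((src_lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    y = ((np.pi / 2 - src_lat) / np.pi * h).astype(int).clip(0, h - 1)
    return frame[y, x]

# Tiny synthetic test: undo an assumed 30 degree yaw on a random frame.
a = np.radians(30)
R = np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])
print(reorient_latlong(np.random.rand(90, 180, 3), R).shape)   # (90, 180, 3)
```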
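The third sketch shows the projection behind a secondary linear shot: a virtual pinhole camera's rays are converted to longitude/latitude and used to sample the equirectangular frame, producing a conventional rectilinear image that ordinary 3D packages can match. The field of view, output size, axis conventions, and rotation handling are arbitrary choices for this example, not values or behaviour taken from SynthEyes.

```python
# Illustrative sketch (not SynthEyes code): render a rectilinear (pinhole) view
# from an equirectangular frame, the idea behind a secondary "linear" shot.
import numpy as np

def latlong_to_pinhole(frame, out_w=640, out_h=360, hfov_deg=90.0, R=np.eye(3)):
    """Render an out_h x out_w pinhole view of an equirectangular frame (H x W x C)."""
    h, w = frame.shape[:2]
    f = (out_w / 2) / np.tan(np.radians(hfov_deg) / 2)   # focal length in pixels
    # Camera-space ray per output pixel (x right, y down, z forward, assumed).
    xs = np.arange(out_w) - (out_w - 1) / 2
    ys = np.arange(out_h) - (out_h - 1) / 2
    xs, ys = np.meshgrid(xs, ys)
    d = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)
    d /= np.linalg.norm(d, axis=-1, keepdims=True)
    d = d @ R.T                                          # orient the virtual camera
    # Convert rays to longitude/latitude and sample the source frame.
    lon = np.arctan2(d[..., 0], d[..., 2])               # zero at the image centre
    lat = np.arcsin(np.clip(-d[..., 1], -1.0, 1.0))      # positive latitude is up
    x = ((lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    y = ((np.pi / 2 - lat) / np.pi * h).astype(int).clip(0, h - 1)
    return frame[y, x]

print(latlong_to_pinhole(np.random.rand(180, 360, 3)).shape)   # (360, 640, 3)
```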

It's important to note that 360 VR solves, whether done natively or via linearization, typically have much higher final errors than conventional shots, due to 1) uncorrected residual distortion in the stitching; 2) synchronization errors, because the cameras usually aren't genlocked; and 3) the rolling-shutter effect in the small CMOS cameras commonly used in VR rigs.

While SynthEyes will let you do an amazing job stabilizing the footage by reorienting it on each frame, it cannot repair the image damage done by problems with stitching, unsynchronized cameras, or the rolling shutter effect.
