Stabilization is essential for safely viewing 360VR footage from moving cameras. This introductory tutorial shows the stabilization of a 360VR sequence from an ultra-light aircraft (courtesy 360Rize), and the export of that stabilization to After Effects. This demo was presented at the 2017 NAB Show.
2017-04-28. 10:48 long.
Shows how to solve long shots (regular or 360VR) by processing them as individual pieces, then combining the results. The tutorial shows an almost-19-minute (33,659 frame) 360VR shot being tracked, solved, and stabilized. Input footage is at https://youtu.be/dtaTAJcGCXI and the result at https://youtu.be/MH6ZsWR8Zzc
2017-05-18. 21:00 long.
This is a 360VR tour of the interpretive trail at Black Rock Sanctuary, Chester County, PA. SynthEyes was used to stabilize the original Ricoh Theta S footage (https://youtu.be/dtaTAJcGCXI) using the method described in https://youtu.be/lViQjXzqcW8 The virtual-world/real-world lock is fixed throughout the shot, which is the hallmark of advanced SynthEyes stabilization. As a result, the shot is very watchable. See 2:50, 5:55, 8:40, 10:45, and also 13:55, 15:45, 17:25 for trickier spots. Of course, to be a good tour we really need a higher-resolution camera! The remaining artifacts in the footage are up-and-down motion due to walking, some unfixed tracking issues, and most significantly rolling shutter ("jello") in the two CMOS camera sensors, which manifests itself in spots where a portion of the shot moves while the rest remains stationary. So make that a higher-resolution global-shutter camera!
2017-05-18. 18:44 long.
This is a 360VR tour of the interpretive trail at Black Rock Sanctuary, Chester County, PA, as shot on a Ricoh Theta S camera (~2K with only accelerometer-based in-camera stabilization). See 2:50, 5:55, 8:40, 10:45, and also 13:55, 15:45, 17:25. This is the raw footage for our Solving Long Shots tutorial (https://youtu.be/lViQjXzqcW8). The resulting stabilized, and much more watchable, version of this video can be found at https://youtu.be/MH6ZsWR8Zzc You can open both in separate tabs and compare the two.
2017-05-18. 18:44 long.
Shows how to set the viewing direction at the beginning of a 360VR movie, by panning the entire shot in the 3D environment.
2017-05-02. 3:32 long.
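For readers who want the underlying idea: a pure yaw (pan) of a 360VR shot is equivalent to shifting the equirectangular frame horizontally with wraparound, since the horizontal axis of an equirectangular image spans the full 360 degrees of longitude. This is an illustrative sketch of that principle in NumPy, not SynthEyes' implementation; the function name is our own.

```python
import numpy as np

def pan_equirectangular(frame: np.ndarray, yaw_degrees: float) -> np.ndarray:
    """Rotate the viewing direction of an equirectangular frame about the
    vertical axis by shifting pixels horizontally with wraparound.
    A yaw of +90 degrees shifts the image by a quarter of its width."""
    width = frame.shape[1]
    shift = int(round(yaw_degrees / 360.0 * width))
    return np.roll(frame, -shift, axis=1)

# Example: a tiny 4-pixel-wide "frame"; a 90-degree yaw moves column 1 to column 0.
frame = np.array([[0, 1, 2, 3]])
print(pan_equirectangular(frame, 90.0))  # [[1 2 3 0]]
```

Note that only yaw reduces to a pixel shift; tilting or rolling a 360VR shot requires full spherical resampling, which is why it is done in the 3D environment.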
These tutorials show different methods to align the horizon, i.e. the up direction, of a 360VR shot. It's a good idea to have at least a passing familiarity with all of them: even if you don't use a particular method, the tutorials show techniques for working with 360VR shots as a whole, and provide more insight into what you're doing.
Shows techniques for native 360VR solving, including how to set up autotracking and rough rotoscoping. In many cases, this method will replace the linearization-based 360VR handling discussed in earlier tutorials. Footage courtesy 360Rize.
2017-02-15. 9:39 long.
Shows a simpler form of filtering-based stabilization for long 360VR shots: break them into pieces, stabilize the pieces, then recombine them. (See https://youtu.be/lViQjXzqcW8 for the better, more rigid, method.) Shows the resulting stabilization data being exported to After Effects, in this case for use with the SkyBox plugin. A SynthEyes-provided plugin can also be used; data can also be exported to HitFilm.
2017-04-13. 25:23 long.
Shows how to change the aim direction of the linearized camera view, so it can continue to follow the best trackable area, as well as how to solve just a portion of a shot, then work through the rest, for early detection of problems.
2016-09-13. 11:31 long.
Shows how to quickly delete spurious trackers after auto-tracking using a slightly subtle feature of the tracker panel's Delete button. This is especially useful for 360VR shots, which may have spurious trackers on the camera mount or out in the sky.
2017-02-15. 1:09 long.
Shows how you can export the 360VR stabilization that you do in SynthEyes to After Effects, so that you can do all your 2D compositing work in After Effects without having to generate stabilized images in SynthEyes. Requires a plugin, available from the customer-only portion of the website (build 1608) or included within SynthEyes (after 1608). Disabled in the SynthEyes demo. Footage courtesy of 360Rize.com
2016-08-26. 4:56 long.
Here we show the simpler version of 360VR stabilization, which doesn't use a full 3D solve of the scene, for shots where an absolute world orientation is not required. Suitable for short shots using Peg mode, or long shots using low-pass-filtering-based stabilization. Footage courtesy 360Rize.com
2016-08-24. 12:05 long.
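The core idea behind filtering-based stabilization is to low-pass filter the camera's orientation channels, keeping the slow, intentional motion while removing high-frequency jitter; the per-frame correction is the difference between the smoothed and raw orientation. This sketch shows that principle on a single angle channel with a moving average; it is a simplification (real camera orientations should be filtered as quaternions, not as independent Euler channels) and is not SynthEyes' actual algorithm.

```python
import numpy as np

def lowpass_stabilize(angles: np.ndarray, window: int = 15) -> np.ndarray:
    """Given per-frame camera angles (e.g. yaw in degrees), return the
    per-frame corrective rotation that cancels high-frequency jitter
    while preserving the slow, intentional camera motion."""
    kernel = np.ones(window) / window
    # Pad with edge values so the smoothed track has the same length.
    padded = np.pad(angles, (window // 2, window - 1 - window // 2), mode="edge")
    smoothed = np.convolve(padded, kernel, mode="valid")
    return smoothed - angles  # apply this rotation each frame to stabilize
```

A steady or smoothly moving camera yields a near-zero correction, so the shot is left alone; only the jitter is removed. This is why the method needs no 3D solve, but also why it cannot provide an absolute world orientation.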
Shows how the Create Spherical Screen script can be used to export 360VR shots to downstream applications, including the effect of 360VR stabilization in SynthEyes. Exports to Blackmagic Design's Fusion. Full use of the 360VR result in Fusion requires a 360VR Fusion camera, perhaps Domemaster from Andrew Hazelden. Footage courtesy 360Rize.com
2016-08-24. 9:49 long.
Demonstrates a way to export normal or 360VR stabilization from SynthEyes to other apps for compositing, so that the actual image manipulation is performed there. This can be done using animated image distortion maps to describe the effect of the stabilization. Here we show it with Blackmagic Design's Fusion. Footage courtesy 360Rize.com
2016-08-24. 11:22 long.
Gives a quick run-through of the spherical screen now built into the perspective view when it is handling 360VR shots. This eliminates the need to run the Create Spherical Screen script for the purpose of viewing (it's still useful for exporting). Footage courtesy 360Rize.com
2016-08-24. 3:34 long.
This is an introduction to what we're going to be doing in this 12-part tutorial: tracking and stabilizing an example shot from 360 Heros, inserting a variety of objects into it (including adding shadows), and finally compositing the whole thing together in After Effects. Footage used with permission of 360 Heros. 3D models licensed from TurboSquid.
2016-05-09. 5:49 long.
This key section shows how to 3D-track the 360VR shot by creating and tracking a normal linear perspective version of it; SynthEyes has the tools built in to do that quickly and easily, and to convert back and forth as needed. The result is a nice world-stabilized shot.
2016-05-09. 10:07 long.
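The conversion between 360VR and a linear perspective view rests on standard projection math: for each pixel of the desired linear view, compute its 3D ray through a pinhole camera, convert that ray to longitude/latitude, and sample the equirectangular image there. This sketch computes that sampling grid; it is an illustration of the general technique under our own conventions (camera looking down +Z, yaw about the vertical axis), not SynthEyes' internal code.

```python
import numpy as np

def perspective_rays(width, height, fov_degrees, yaw_degrees=0.0):
    """Build the sampling grid that extracts a linear (rectilinear)
    perspective view from an equirectangular 360 frame: for each output
    pixel, compute the 3D ray, then the longitude/latitude that index
    into the equirectangular image."""
    f = (width / 2) / np.tan(np.radians(fov_degrees) / 2)  # focal length, pixels
    xs = np.arange(width) - (width - 1) / 2
    ys = np.arange(height) - (height - 1) / 2
    x, y = np.meshgrid(xs, ys)
    # Ray directions for a pinhole camera looking down +Z.
    lon = np.arctan2(x, f) + np.radians(yaw_degrees)  # longitude of each ray
    lat = np.arctan2(-y, np.hypot(x, f))              # latitude of each ray
    return lon, lat  # equirect pixels: u = (lon/2pi + 0.5)*W, v = (0.5 - lat/pi)*H
```

The inverse mapping (linear back to 360VR) uses the same relations in reverse, which is what makes round-tripping between the two representations straightforward.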
The initial 3D track creates trackers only in the area that the generated linear camera is looking at. We'd like to have more trackers throughout the 3D environment to use for reference when inserting objects; here we show how to get them using the Add Many dialog.
2016-05-09. 4:47 long.
Here we show how to generate more zero-weighted trackers throughout the entire shot by auto-tracking it. This may necessitate some garbage-matte roto work to mask out the camera platform, but it can generate a nice distribution throughout the shot without much thought.
2016-05-09. 9:40 long.
Normally 360 VR shots are world-stabilized in SynthEyes, so that the virtual and real worlds are aligned the same way throughout the shot and the viewer can easily understand the relationship. Here we show how to align the "forward" direction with the camera's path instead, if that is desired as an artistic choice, especially for dynamic shots.
2016-05-09. 6:43 long.
Shows how to set up the perspective view for 360 VR shots, then inserts a building into the shot. Shows how to determine the lighting direction or placement. You can generate a quick 360 VR test movie right out of SynthEyes at this point!
2016-05-09. 13:07 long.
We show that you can export to Blender and use its built-in 360 degree camera to render inserts (using the Cycles renderer). This approach can be used with other applications that have a 360 VR camera, even if the exporter doesn't support it, by changing a regular camera to 360 VR.
2016-05-09. 10:46 long.
Shows how to render inserted objects using any rendering application, even if it does not have a 360 VR camera built-in. A script creates a normal perspective camera that precisely follows the inserted object; that scene is exported and rendered. SynthEyes then converts the images back to 360 VR. Despite the somewhat more complex process, the tight render view results in a quick overall render time.
2016-05-09. 10:55 long.
Shows the insertion of a floating hot-air balloon. Because it comes close to the camera, it is particularly subject to jitter in the camera path. Some filtering is used to smooth that out.
2016-05-09. 22:30 long.
The inserted building needs to cast a shadow onto a sloping hillside below it. This part shows the creation of a shadow-catching object textured with that shadow, so it can be rendered as its own layer.
2016-05-09. 7:52 long.
Because 360 VR cameras are composites of multiple (CMOS) cameras, the images aren't as solid as for normal 3D tracking. While we need the entire image to generate 3D information, individual sections can shift relative to the overall solve. Here we use a script to anchor an insert as much as possible to its local image.
2016-05-09. 6:36 long.
Here we show the final assembly of the 360 VR video, using After Effects. Contains a small number of compositing tweaks, such as adjusting the levels, blur, and opacity.
2016-05-09. 11:12 long.
This is the 360 VR tutorial's output without the tagging that tells YouTube it is a 360 VR video. You can use this to look at the final results as a flat 2D video.
2016-05-09. 0:14 long.
This is the 360 VR tutorial's final output, tagged as a 360 VR shot to YouTube, so that you can view it with Google Cardboard or other compatible 360 VR viewers.
2016-05-09. 0:14 long.
"SynthEyes easily is the best camera match mover and object tracker out there." - Matthew Merkovich