Exciting news! SynthEyes has joined Boris FX and its award-winning VFX product family. Find out what this means for you.
Discusses shots where the trackers are all on a single plane.
2007-09-23. 5:15 long.★★
Discusses shots that may seem to be amenable to a 3-D solution but are really tripod-type shots.
2008-08-07. 3:54 long.★★
Includes using the auto-place button, tracker cleanup, and export to Cinema 4D. The export process is basically the same for other applications.
2011-10-21. 4:58 long.
2010-11-17. 2:19 long.
Part Two shows a 3D card being added to the scene from Part One, and a texture being extracted for the card. A lamppost gets in the way, and a cylinder is used as blocking geometry to avoid artifacts.
2010-11-17. 4:57 long.
Part Three adds a second card containing an existing texture, then shows the scene being exported from SynthEyes and opened in AfterEffects.
2010-11-17. 3:53 long.
Includes a little automatic and supervised tracking, cleanup, coordinate system setup, and object insertion.
2008-08-05. 5:03 long.
Non-computer look at what's needed to determine depth information in a shot. Helps you understand what is needed for a successful match-move.
2008-07-11. 6:52 long.★★★
Shows some of the viewport and control panel features in SynthEyes (2008).
2008-07-10. 3:00 long.
Shows the snapping behavior of the symmetric and spot trackers in the mini-track-view, camera view, and SimulTrack view, and the use of ALT/command to suppress snapping.
2011-12-08. 3:02 long.
Shows using the search-from-solved capability of the supervised tracker to easily address situations where the tracker is moving off-screen, then back on. Uses zero-weighted-trackers during search, though that isn't strictly necessary.
2011-10-25. 7:31 long.
Shows basic supervised tracking techniques and takes advantage of the "Search from Solved" tracker mode option.
2011-10-25. 6:12 long.
Offset tracking is a new feature in SynthEyes 2011 that allows difficult tracks to be performed using nearby easier tracks as a base. This is most useful when a track must be created for a corner moving over a rapidly-varying background. A tracker in the interior can serve as the base. This tutorial also shows using the Hi-pass filter option in the image preprocessor to make tracking easier on something changing rapidly in brightness.
2010-11-23. 6:47 long.★
This tutorial shows two different new methods in SynthEyes 2011 for evaluating trackers: 1) tracker radar, and 2) tracker error coloring.
2010-11-17. 5:02 long.
SynthEyes features the new SimulTrack window, which is a visualization tool for evaluating tracking. We introduce each of the modes of the window: single-tracker, rows, and grid, and offer a quick look at some of the features of this window type. Separate tutorials will examine a new supervised tracking workflow made possible by SimulTrack, and its use in stereo tracking.
2010-11-17. 6:53 long.★
This video shows some of the more advanced workflows possible using the SimulTrack window in SynthEyes, as well as additional special features. Familiarity with supervised tracking in SynthEyes is required; see /content/closup.htm for background from earlier versions.
2010-11-29. 6:39 long.
Naturally, we'd all like the computed tracker and camera paths to be as smooth as possible, but different kinds of system noise make that impossible. A separate tutorial explains many of those causes, but this tutorial focuses on a new tracking approach that minimizes the amount of jitter, fine-tuning the automatic tracker paths using the supervised tracker with little effort.
2007-11-16. 6:28 long.
This is the output from the "Mixed Tracking exported to AfterEffects" tutorial.
2007-02-08. 0:11 long.
This Quicktime movie introduces the tracker cleanup dialog, used to quickly identify and remove marginal trackers.
2008-07-10. 5:54 long.★★
SynthEyes users must be alert for actors and other situations where some portions of the imagery are moving independently of the main portion of the shot being tracked: all the trackers used must be rigidly positioned with respect to the rest of the scene.
2009-07-28. 6:32 long.
Shots from sports stadiums are often subjects of motion graphics projects and frequently result in problems for new users because they are usually tripod shots. This three-part tutorial shows what happens to the unwary, and then shows two different ways to address the shot successfully.
2009-07-27. 18:06 long.
This tutorial is a completely real-world show and tell of what coordinate system alignment is about. No computers at all! We kid you not. It is silly, stupid, and hopefully clever enough to help you visualize what is going on during coordinate system alignment.
2008-02-07. 10:22 long.★★★
Shows tips and techniques for using the auto-place button found in SynthEyes 2011 build 1008 and later. The Place button picks coordinate systems using a fairly advanced scheme while providing opportunities for efficient artist control. Be sure to jump up to 720p full-screen to watch this video.
2011-10-21. 10:00 long.★★
Distance constraints give you control over the distance between the camera and origin, or camera and moving object in an object-tracking setup. That distance is exactly where jitter often appears in low-perspective shots, so the distance constraint gives you a direct way to control it.
2010-11-18. 6:31 long.
This tutorial will advance your understanding of the coordinate system setup process, and show you how to interpret the Constrained Points view, which is always the first place to look when trying to understand or diagnose the coordinate system setup of a scene. The initial portion of the tutorial shows a shot being manually aligned. We show this process to help you understand what SynthEyes is doing with your coordinate system setup information---not telling you that you should be manually aligning all your shots.
2007-12-07. 22:51 long.★★
Discusses the situation where you have used the Coords (Summary Panel) or *3 (Coordinate System Panel) buttons to set up a ground plane, but you want to use two other trackers to set the overall size of the scene, because you have an existing ground-truth distance between them.
2008-07-10. 2:55 long.
SynthEyes lets you change the coordinate axis setting whenever you want, from Z-Up (3ds Max), to Y-Up (Maya), to Y-Up-Left (LightWave). This tutorial takes a further look.
2007-10-27. 3:30 long.
This tutorial shows the single-frame alignment system being used to camera-match a single digital still.
2007-04-26. 5:20 long.★★
If you are given the distance between the camera and a trackable object in your scene, you can scale your SynthEyes scene to match. In addition to the method shown here, you can use a SynthEyes phase to do it automatically.
2007-11-02. 5:27 long.
This example shows how to manually align a scene within SynthEyes, without applying any alignment constraints to the points. This can be useful when there isn't an obvious geometric choice, and to handle tripod-mode shots.
2012-10-23. 6:51 long.
Discusses tripod-type (nodal) shots. Notice how all the solved trackers are the same distance from the camera: in tripod mode, the distances cannot be determined, but a 3-D insert can still be performed.
2012-10-22. 9:27 long.★★
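The depth ambiguity behind tripod-mode shots can be seen directly in the projection math. Here is a minimal Python sketch (illustrative numbers, not anything from SynthEyes itself) showing that under pure camera rotation, points at different distances along the same ray project to exactly the same image location, so a tripod solve cannot recover depth:

```python
import numpy as np

def project(point, R):
    """Project a 3-D point through a rotating pinhole camera at the origin."""
    p = R @ point            # rotate the world point into the camera frame
    return p[:2] / p[2]      # perspective divide onto the image plane

# A 10-degree pan (rotation about the Y axis)
th = np.radians(10.0)
R = np.array([[ np.cos(th), 0.0, np.sin(th)],
              [ 0.0,        1.0, 0.0       ],
              [-np.sin(th), 0.0, np.cos(th)]])

near = np.array([1.0, 2.0, 5.0])   # a feature 5 units out
far  = near * 3.0                  # same ray, three times farther away

# Both points land on the same image location for ANY rotation, because the
# scale factor cancels in the perspective divide.
assert np.allclose(project(near, R), project(far, R))
```

This is why every solved tracker in a tripod shot ends up at the same (arbitrary) distance from the camera.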
Shows how to use the camera path locks to set up the overall scaling of the coordinate system, if you know the height of the camera above the ground plane. You can also use the Camera Height phase for this purpose.
2008-08-04. 4:40 long.
Suppose the camera is on a straight dolly track, and you have measured how far the camera travelled along the track (lens position to lens position). You can use this measurement to set up the scaling of the overall coordinate system. You can also use the Camera Travel Phase to do this automatically.
2008-08-04. 5:05 long.
Introduction to the lens grid auto-calibration system in SynthEyes, which analyzes footage of a lens grid pattern to produce accurate lens preset information.
2011-12-07. 8:20 long.
Shows how to use SynthEyes's lens calibration system to calibrate a lens, and apply that calibration to a shot. Shows the before and after of an autotrack of the scene.
2011-12-07. 5:23 long.
Tutorial shows the manual setup of a SynthEyes Undistort node in AfterEffects, to match the distortion and scale computed by a solve followed by using the 1-pass option on the Lens Workflow button (SynthEyes Summary panel). With this setup, you can use AfterEffects to undistort your footage, rather than the SynthEyes image preprocessor.
2012-01-05. 4:30 long.
Shows the two-pass workflow, where a lens distortion calculated in SynthEyes is used to configure both SynthEyes Undistort and SynthEyes Redistort effects in AfterEffects. With these nodes and this workflow, you can use AfterEffects (instead of SynthEyes) to convert your distorted footage to clean footage for tracking and 3D work, generate CGI effects that match the clean footage, then use the Redistort node to composite the CGI effects back over the *original* footage, so that you can deliver higher-quality footage.
2012-01-05. 9:12 long.
Shows the use of SynthEyes's "Advanced Distortion" effect (pixel bender) in AfterEffects. This effect is able to reproduce the complex distortion types, including off-centering and high-order (fisheye) distortion computed by SynthEyes's lens-grid-based calibration tool.
2012-01-05. 12:05 long.
Quick review to complement the other AE/SE pixel-bender tutorials. SynthEyes Undistort nodes are automatically generated as part of an export of the 3D scene from SynthEyes to AfterEffects. This tutorial shows that in action and what the result is.
2012-01-05. 3:04 long.
Shows lens distortion workflow to produce undistorted images.
2009-07-16. 5:16 long.
Shows lens distortion workflow based on re-distorting and compositing with the original image.
2009-07-16. 5:49 long.
Advanced tutorial looking behind the scenes at lens centering and distortion.
2009-07-17. 14:55 long.◆◆◆
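For readers curious what the "behind the scenes" math of radial distortion looks like, here is a hedged sketch of the classic cubic radial model often used to illustrate undistort/redistort pairs. This model and the function names are illustrative assumptions only; SynthEyes's actual parameterization (including the off-center and high-order fisheye modes covered in the calibration tutorials) is more sophisticated:

```python
def distort(u, v, k, cx=0.0, cy=0.0):
    """Apply a simple radial distortion r' = r * (1 + k*r^2) about (cx, cy).

    (u, v) are normalized image coordinates.  Illustrative model only --
    not necessarily SynthEyes's exact formula.
    """
    du, dv = u - cx, v - cy
    r2 = du * du + dv * dv
    s = 1.0 + k * r2
    return cx + du * s, cy + dv * s

def undistort(u, v, k, cx=0.0, cy=0.0, iters=10):
    """Invert the radial model by fixed-point iteration."""
    du, dv = u - cx, v - cy
    x, y = du, dv
    for _ in range(iters):
        r2 = x * x + y * y
        x, y = du / (1.0 + k * r2), dv / (1.0 + k * r2)
    return cx + x, cy + y
```

The two functions form the round trip used in the two-pass workflow: undistort for tracking and CGI, then redistort to composite back over the original plate.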
This tutorial demonstrates SynthEyes's stabilization system running in filter mode, which is used for long "traveling" shots. In this case, the source footage is from a Stickypod stuck on the front windshield of a car and then driven down the road.
2007-07-10. 3:18 long.★★
Peg-mode stabilization: camera orbiting a target.
2007-04-06. 6:32 long.★★
This tutorial demonstrates a complete workflow, tracking and stabilizing a shot in SynthEyes, saving the images, exporting for Bentley MicroStation, and then in MicroStation, importing the MSA file, configuring MicroStation, and dropping in a small external model. After the model is completed and textures set up in MicroStation, the next step would be to render (record) the animation.
2007-05-07. 7:17 long.
This is an introduction to texture extraction in SynthEyes, using Add Card on the Perspective Window. Uses the automatic coordinate placement tool, then tweaks it a little. Similar to "Pass 2: Quick Pass through SynthEyes".
2012-01-18. 7:26 long.★★
Follow-on to "Texture Extraction Introduction." Shows a texture being extracted for a more complex piece of mesh geometry, a box primitive. Since the result wastes space in the texture map, the box is then cut down to size, using the perspective window's mesh operations, and the texture map re-run, producing a more space-efficient texture map (don't understand? watch the video!).
2012-01-18. 7:20 long.
Follow-on to "Texture Extraction Introduction". Shows using a roto mask to block out a tree, including setting up the roto mask by importing tracker coordinates. Shows what happens when you block out too much!
2012-01-18. 4:51 long.
Shows how to create clean plates from a shot, including removing a moving car and the effect of black borders around the edges of the image.
2010-12-05. 10:58 long.
This second section of the tutorial on creating clean plates shows how to create a texture for the moving car as well.
2010-12-05. 9:18 long.
This extra bonus section shows how to adjust the image texture resolution so that the pixels come out square, using a small script. (script is not in current build but coming soon)
2010-12-05. 5:44 long.
This is part one of a substantial tutorial showing a panoramic backdrop being created, ranging from auto-tracking, to creating a portion of a cylinder to extract the texture for, to extracting and examining the texture. This first portion covers up until the cylinder has been created and edited.
2010-12-03. 9:59 long.
This is part two of a substantial tutorial showing a panoramic backdrop being created, ranging from auto-tracking, to creating a portion of a cylinder to extract the texture for, to extracting and examining the texture. This second portion covers the actual texture extraction.
2010-12-03. 6:23 long.
This extra bonus section shows how to adjust the image texture resolution so that the pixels come out square, using a small script. (script is not in current build but coming soon)
2010-12-04. 4:03 long.
The tutorial shows a way to set the relative scale when a scene has both a tracked camera and a tracked object --- and you don't have on-set scale information to use to set each scale exactly.
2011-12-07. 4:48 long.
See the tutorial that shows the rendered output from this.
2011-10-27. 26:56 long.★★
Rendered output from Tracking a Flat Tablet
2011-10-28. 0:08 long.
This tutorial shows an auto track of a shot with a moving object and camera. The camera is a tripod-type nodal motion, while a full 3-D solve is obtained for the moving object, and a coordinate system is set up and an object inserted into the coordinate frame of the moving object.
2010-12-08. 15:13 long.
This tutorial demonstrates how to do object tracking, including simultaneously doing a tripod-mode track on the camera. This is a common scenario for shots of vehicles driving, sailing, or flying past the camera.
2007-07-30. 12:32 long.
This tutorial demonstrates how to match-move a mesh, so that the mesh exactly matches its motion in the shot. Commonly used for head tracking from a single camera, it is applicable for any kind of moving-object shot where you have an existing mesh for the object being tracked. Best of all, this technique works even when the object moves little.
2007-07-30. 9:16 long.
Shows a number of features to support stereo tracking in SynthEyes.
2010-11-29. 7:35 long.
This tutorial shows the basic automatic tracking and solving of a stereo shot, and introduces the Stereo Geometry panel, which controls the relationship between the two cameras. Familiarity with general monocular SynthEyes tracking is required.
2009-07-18. 8:19 long.
This tutorial shows more behind-the-scenes details of a stereo tracking setup, and how to set up for and perform stereo supervised tracking. Familiarity with usual monocular supervised tracking in SynthEyes is required.
2009-07-20. 7:35 long.
Stereoscopic shots do not normally require that the camera physically translate, since there are already two camera views. That is only the case, however, if there are many trackable features relatively near the camera rig. If all the trackable features are far away, the same situation arises as with monocular cameras on a tripod: no depth information is available. This tutorial introduces such shots, showing how to track them. The tutorial also briefly reviews several methods of locating problematic trackers in stereoscopic shots.
2009-07-23. 7:18 long.
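The "far features, no depth" situation above follows from the standard rectified-stereo textbook relation (an illustration of the geometry, not SynthEyes internals): depth = focal length × baseline ÷ disparity. As features recede, their disparity shrinks toward zero and depth becomes indeterminate, exactly like a monocular tripod shot:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic rectified-stereo relation: Z = f * B / d.

    focal_px:     focal length in pixels
    baseline_m:   interocular distance in meters
    disparity_px: horizontal feature offset between the two eyes, in pixels
    """
    return focal_px * baseline_m / disparity_px

# Illustrative rig: f = 1500 px, 65 mm interocular baseline
f, B = 1500.0, 0.065
near_depth = depth_from_disparity(f, B, 20.0)   # 20 px disparity -> 4.875 m
far_depth  = depth_from_disparity(f, B, 0.5)    # 0.5 px disparity -> 195 m
# As disparity approaches zero (distant features), tiny tracking errors
# swamp the depth estimate -- the tripod-like degeneracy described above.
```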
This tutorial shows rigid-body stereo object tracking and solving using SynthEyes's automatic tracking systems, in particular, setting up some animated roto-splines to control what trackers are associated with what objects. More frequently, object tracking shots are handled using supervised tracking, but this example shows how suitable shots can be tracked automatically.
2009-09-11. 12:47 long.
Since a stereoscopic shot has two cameras, any stereo shot is already a motion capture setup. While a standard moving-object track requires a half-dozen or more trackers on a rigid body, a stereoscopic motion capture setup works with any number of tracker pairs, as few as one, and they can all move independently on a flexible object or separate objects. This tutorial shows a motion-capture setup of a stereo shot.
2009-07-22. 5:51 long.
SynthEyes can display anaglyph red/blue (etc.) stereoscopic views from the perspective window of both stereoscopic and regular monocular shots. The tutorial shows setup and gives hints.
2009-07-29. 4:00 long.
Even if the cameras are mechanically aligned on set, additional alignment will be required in post-production unless you use a very high quality rig. This tutorial shows a simple way to set up SynthEyes to do this manually; not only does it get the job done, but it is rather informative as well.
2009-07-31. 8:08 long.◆◆
This tutorial shows how to use a SynthEyes script to quickly and accurately align the two halves of a stereo shot. The script automates the process described in the prior manual alignment tutorial, so you should watch that tutorial first.
2009-09-11. 5:00 long.◆◆
This tutorial shows the use of two scripts that convert 2-D tracking data into 3-D motion-capture-style paths that can be exported to 3D applications (which generally don't know anything about 2D trackers).
2011-10-26. 7:47 long.
This tutorial shows a small scene being tracked, exported to Modo, then loaded in Modo.
2009-07-31. 3:19 long.
SynthEyes can export to Motion 3 as a 3-D scene. Not only can cameras and trackers be exported, but planes as well, which appear as Drop Zones in Motion. A special script in SynthEyes adjusts the aspect-ratio of planes in SynthEyes to simplify handling in Motion.
2008-08-06. 5:06 long.
This tutorial shows a shot being auto-tracked, a coordinate system being set up using some supervised trackers, the scene exported to After Effects, and a logo being burned in using AE's 3-D workspace. You can also see the final movie at /content/alienzSmaller.mov (2 MB).
2007-02-08. 7:45 long.
This Quicktime movie shows a shot being tracked then exported to, and opened in, Lightwave.
2006-10-07. 2:17 long.
The projection screen generator creates physical geometry in the scene to hold a background image, which can make some setup and scene-building tasks easier --- especially for green-screen shots. For green screen shots, SynthEyes can leave the keyed portion of the image transparent. This works for internally-keyed shots, or for shots with externally-generated alpha channels.
2010-11-19. 3:27 long.
Shows a supervised camera and object shot, primarily the image preprocessor setup to simplify supervised tracking on the box, and a way to adjust the coordinate system setup of the box to match the rest of the scene.
2008-08-05. 4:06 long.
The hold-mode feature of SynthEyes 2008 enables you to match-move shots where the camera translates during part of the shot but only rotates during another part. In essence, these shots contain both a regular section and a tripod-mode section. With the hold-mode feature, you can easily solve complex combinations of these motions.
2008-07-10. 5:45 long.◆
Sometimes you need to combine two separate tracks together exactly. Most commonly, this involves a shot consisting of normal sections and tripod sections, but other situations might include a shot with a major occlusion in the middle, or as a starting point for a shot that transitions from a wide shot to a full-frame moving object (still tricky). In the past, you needed to assemble the tracks in your favorite animation package, but it is quicker and easier to do it in SynthEyes. Largely but not always superseded by Hold Mode!
2007-12-09. 9:25 long.
You can use the image preprocessor to create motion from stills, the "Ken Burns" effect. This tutorial shows a simple example of that. Doing it in SynthEyes is different than doing a 2D image zoom and pan, because, with the proper lens field of view, it recalculates the correct perspective shift, as if the camera were on a tripod, instead of the 2-D image-being-slid-around look.
2009-01-08. 5:45 long.
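The difference between a true nodal pan and a 2-D slide comes down to one nonlinear mapping. In focal-length-normalized image coordinates, panning the camera by an angle theta moves an image point x to tan(atan(x) - theta), so points near the frame edge shift by a different amount than points at the center. A small illustrative sketch (values and names are ours, not SynthEyes's):

```python
import math

def nodal_pan(x, theta):
    """New image x-coordinate (in focal-length units) after panning a
    nodal camera by theta radians."""
    return math.tan(math.atan(x) - theta)

theta = math.radians(5.0)
shift_center = 0.0 - nodal_pan(0.0, theta)   # how far the frame center moves
shift_edge   = 0.5 - nodal_pan(0.5, theta)   # how far a near-edge point moves
# The shifts differ: a real pan is a perspective warp, not a uniform
# 2-D slide (which would move every pixel by the same amount).
```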
This tutorial demonstrates the SynthEyes Add-Many and Coalesce Trackers capabilities. Here, we use them to build a denser-than-usual tracker mesh to use as a ground mesh upon which to add objects.
2007-04-05. 4:45 long.★
This Quicktime movie shows SynthEyes's green-screen system at work. It allows SynthEyes to track only the green screen, and ignore features on the actors moving in front of the screen, so that they don't have to be masked out from tracking. Shows typical issues and solutions handling these shots.
2006-08-28. 3:33 long.★
This tutorial shows the SynthEyes 2007 zoom-filtering script in action on a shot that contains a zoom in the middle of the shot. You'd like to have the stability of a non-zoom shot during the non-zooming portion, and accommodate the zoom during that time. The script lets you get that. See also the Flatten FOV phase.
2006-09-28. 3:50 long.
This tutorial shows the SynthEyes's Image Preparation subsystem being used to dramatically reduce the RAM usage of a shot, including setting up presets. Note: the layout of the image preprocessor window has been changed, but the overall effect remains the same.
2006-09-29. 5:44 long.
This Quicktime tutorial shows how to use SynthEyes's Batcher to process a collection of shots without further intervention. Use the batcher to do the time-consuming tracking, then come back for final checks and to set up a coordinate system. Or use it for stabilizing a group of shots.
2006-10-09. 4:53 long.
A common "Mr. Fix-It"! Changing a tracked camera from a camera track to an object track.
2012-10-17. 3:11 long.
Find the direction to a distant (sun) light, so you can cast matching shadows in your 3-D app.
2012-10-21. 6:14 long.★
Shows how to convert from field of view to focal length and back using plate width, and how to use field of view and focal length to compute the back plate width of your camera.
2007-11-09. 7:05 long.◆◆◆
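The conversions this tutorial covers are simple trigonometry on the horizontal field of view, focal length, and back plate width. A minimal sketch (function names and the 36 mm plate width are our illustrative choices):

```python
import math

def fov_to_focal(fov_deg, plate_width_mm):
    """Focal length (mm) from horizontal field of view and back plate width."""
    return plate_width_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

def focal_to_fov(focal_mm, plate_width_mm):
    """Horizontal field of view (degrees) from focal length and plate width."""
    return math.degrees(2.0 * math.atan(plate_width_mm / (2.0 * focal_mm)))

def back_plate_width(focal_mm, fov_deg):
    """Back plate width (mm) from focal length and field of view."""
    return 2.0 * focal_mm * math.tan(math.radians(fov_deg) / 2.0)

# Example: a 36 mm-wide plate with a 50 mm lens gives roughly a 39.6 degree
# horizontal field of view, and the conversions round-trip exactly.
fov = focal_to_fov(50.0, 36.0)
f   = fov_to_focal(fov, 36.0)
```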
A quick introduction to SynthEyes's file auto-save system.
2011-12-07. 3:13 long.
This tutorial teaches you about the SynthEyes keyboard accelerator system. Along the way, you'll configure your SynthEyes so that the W, E, and R keys select the Move, Rotate, and Scale mouse modes, following Maya's convention. This can be a handy time-saver for positioning objects in the 3D environment.
2011-10-27. 5:46 long.
Using Script Bars to quickly access commonly-used functions.
2009-07-17. 2:11 long.★
The tutorial shows using the "Random Tracker Colors" script after an autotrack to help find problems in the generated trackers --- it is a lot easier to see what the trackers are doing if they are all different colors.
2012-01-02. 3:49 long.
How to locate bad trackers by working in the dark!
2007-04-07. 2:00 long.★
Tracker cleanup of an example shot of a warehouse: jumping and far trackers.
2007-04-07. 5:04 long.
What to look for if things don't line up after export from SynthEyes to your other applications.
2007-07-12. 6:25 long.★
Part One: how to register SynthEyes and install your temporary authorization.
2006-09-29. 5:28 long.★★
Part Two: how to install your permanent authorization and set up Customer Care, once your permanent authorization arrives.
2006-09-29. 3:08 long.★★
Introduction to the "Matrix" object in SynthEyes (1209+). This is a space-filling grid of small markers that can be used to visually assess the tracking of a shot.
2012-10-05. 4:08 long.
Shows how to configure the SynthEyes (1209+) multiple-export control panel, so that a single menu operation can rapidly run any number of exporters. For example, you might deliver 3ds Max, Maya, and Cinema 4D versions of your shots in one operation.
2012-10-05. 5:30 long.
This tutorial shows the "Set Horizon", "Slide into Position", and "Tracker/Tracker Distance" phases in SynthEyes (1209+) to set up a coordinate system (rotation, translation, and scale). Phases offer a completely different way to set up a coordinate system than the usual tracker constraints.
2012-10-05. 7:28 long.★
If you are certain that a dolly track was perfectly straight during shooting, and want the camera path to be exactly straight, this tutorial shows how to do it, using the SynthEyes (1209+) "Linearize Path" phase. The tutorial also shows how to use the graph editor to look at the output of each phase in a phase pipeline.
2012-10-05. 13:22 long.◆◆
Shows another example of coordinate system setup in SynthEyes (1209+) using phases. Here we use (autoplace,) Set Heading, Slide into Position, and Camera Height to adjust the coordinate system. We match the horizon line as part of this.
2012-10-05. 10:02 long.
Shows a phase pipeline for a zoom shot with two crash zoom sections, ie three sections where the zoom is stationary. The phase pipeline exactly flattens the zoom FOV track during the flat section, to eliminate any and all lens jitter.
2012-10-05. 8:33 long.◆◆
The tutorial runs through different things you can do in the SynthEyes (1209+) phase editor view: how to create phases, move them around, align them, set the root phase, and interpret the resulting display. It also covers running phases, copy/paste, and the phase library for moving configurations from scene to scene.
2012-10-05. 9:09 long.★★
Discusses rolling shutter using an example shot that is tracked, initially without rolling shutter processing, to yield a rather high error. Then, SynthEyes (1209+)'s rolling shutter processing is turned on, dramatically reducing the pixel errors, and in the process showing just how much of an impact rolling shutter has.
2012-10-05. 9:28 long.★
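Why does rolling shutter inflate pixel error so much? Each sensor row is sampled at a slightly different time, so a feature moving in the image appears displaced in proportion to how far down the frame it sits. A deliberately simplified sketch of that skew (constant image-space velocity and top-to-bottom readout are our assumptions; SynthEyes's actual processing is far more sophisticated):

```python
def rolling_shutter_offset(row, image_height, readout_time_s, velocity_px_s):
    """Approximate horizontal displacement of a feature due to rolling
    shutter, assuming rows are read top to bottom over readout_time_s
    seconds and the image moves at a constant velocity."""
    t = (row / image_height) * readout_time_s   # when this row was sampled
    return velocity_px_s * t

# A 1080-row CMOS sensor read out over ~30 ms while the camera pans at
# 500 px/s: the top of the frame is unshifted, but the bottom is skewed
# by roughly 15 pixels -- easily enough to wreck a solve if uncorrected.
top    = rolling_shutter_offset(0,    1080, 0.030, 500.0)
bottom = rolling_shutter_offset(1079, 1080, 0.030, 500.0)
```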
The tutorial shows how to handle survey shots --- collections of still images from one or more still cameras --- that have been taken to facilitate a better 3-D reconstruction of the set than can be obtained from principal photography. The SynthEyes (1209+) survey shot processing helps you assemble an IFL, quickly track it, and also modifies the solver algorithms to accommodate this kind of shot.
2012-10-05. 11:40 long.★
Shows how the SynthEyes (1209+) "rooms" features can help customize your workspace to your job, giving you a quick way to select a control panel, view configuration, and even start a dialog.
2012-10-05. 4:32 long.★★
The tutorial discusses the use of the solver controls, such as Slow but Sure, the motion hint, and the begin/end frames.
2012-10-23. 5:29 long.★
Shows additional methods to set up a coordinate system: snapping the perspective-view grid to trackers, then making it the ground plane; and making a mesh object the ground plane. For both cases, we show using a distance constraint to set up the overall scene scaling.
2012-10-24. 10:14 long.◆
The Community Translation Project aims to allow SynthEyes users to share translated versions of the user interface. This tutorial shows a new translation being created, edited, and installed.
2012-11-02. 9:22 long.
This tutorial uses a long shot to run through a whole sequence of activities, starting with auto-tracking, adding more trackers, mesh-building, and shadow catching. You'll probably see some capabilities you weren't aware of. Source video from https://www.artbeats.com, shot named B017-C081.
2012-11-07. 46:37 long.◆◆★★
Inverted perspective arises when a shot has too little perspective, resulting in two possible solves that are both mathematically correct. This tutorial demonstrates this visually, then shows a shot exhibiting the problem and shows how to select the desired solution. The example shot is a mixed tripod plus object shot.
2012-11-07. 15:35 long.◆◆
The tutorial shows how to set up a shot and track it in two camera views at once, including using a dual SimulTrack configuration.
2012-11-29. 11:03 long.★★
This talky tutorial describes the disk caching system in SynthEyes, which can improve overall workflow on large shots, especially stereo, and especially when you have a fast SSD or RAID drive. Disk caching is easy to turn on and invisible to use, but be sure to watch this tutorial and check out the manual for best results!
2012-12-02. 22:50 long.◆
Talks about the dynamic projection screen generator in SynthEyes's perspective view. The projection screen shows an undistorted version of the original footage and can show a keyed version of the original shot. It adapts immediately to changes in field of view and distortion. The screen can be locked to the location of a tracker (or extra) in the 3D environment. Footage from Hollywood Cameraworks, https://www.hollywoodcamerawork.us/greenscreenplates.html
2013-08-07. 9:57 long.★
Shows how you can use real SynthEyes planes to act as clipping planes when working on vertex point clouds, ie from tracker positions or lidar data. Lidar scan from https://kos.informatik.uni-osnabrueck.de/3Dscans/ by Dorit Borrmann, Jan Elseberg, HamidReza Houshiar, and Andreas Nuchter from Jacobs University Bremen GmbH, Germany.
2013-08-09. 6:06 long.
An introduction to 3-D planar tracking, with a 3-D export to After Effects.
2013-11-17. 9:51 long.
Shows a 3-D planar tracker being exported to After Effects as an animated corner pin. The 2-D export can be used for 2-D planar trackers as well. (2-D planar trackers cannot be exported into 3-D applications.)
2013-11-17. 2:50 long.
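Under the hood, a corner pin is a homography: the 3x3 projective transform uniquely determined by the four corner correspondences. SynthEyes's exporter computes this for you; the sketch below merely illustrates the math, using the standard direct linear transform (all names and numbers here are illustrative):

```python
import numpy as np

def corner_pin_homography(src, dst):
    """Solve for the 3x3 homography mapping four source corners to four
    destination corners via the direct linear transform (DLT).
    src, dst: sequences of four (x, y) pairs."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1,  0,  0,  0, u * x, u * y, u])
        A.append([ 0,  0,  0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A))
    H = Vt[-1].reshape(3, 3)       # null-space vector = homography entries
    return H / H[2, 2]             # normalize so the corner maps are exact

def apply_h(H, pt):
    """Apply a homography to a 2-D point (with perspective divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Pin the unit square onto an arbitrary quadrilateral:
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 12), (200, 30), (190, 220), (20, 200)]
H = corner_pin_homography(src, dst)
```

Each recovered corner maps exactly onto its destination, which is what an animated corner pin does frame by frame.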
Shows the use of additional elements of the 3-D planar tracking feature set, including layering two planar trackers, channel selection and maintaining the same camera field of view.
2013-11-17. 12:03 long.
To help align 3-D planar trackers with circular features in the 3-D environment, the 3-D planar tracking feature set includes the automatic generation of a circular mask. This tutorial demonstrates that. Footage courtesy of Dan Lanz, Impulse FX https://www.impulse-fx.com
2013-11-17. 4:26 long.
When a scene contains multiple 3-D planar trackers, their fields of view must all agree with the actual camera field of view in order for everything to match up. This tutorial shows how to do this, and shows using the Export Preparation script twice to configure a 1 moving-camera + 2 moving-object planar setup, then shows it exported to Cinema 4D.
2013-11-17. 11:34 long.
Shows how to add a 3-D planar tracker to a scene tracked and solved with regular trackers, and how to convert a 2- or 3-D planar tracker into multiple regular trackers to supply more information to a regular 3-D solve.
2013-11-18. 11:03 long.
Shows some of the features of the planar tracker's in-plane masking system, used to control which portion of the rectangular tracker is actually used for tracking.... useful for trackers of different shapes, or to exclude some areas, as shown here. Includes 2-D AE export. Source footage courtesy Hollywood Camera Work, available at https://www.hollywoodcamerawork.us/trackingplates.html. Statue of Liberty footage courtesy of Aerial Exposures, https://www.aerialexposures.com/
2013-11-19. 10:48 long.
Shows 3-D planar tracking using an animated garbage matte set up in SynthEyes's roto panel. Shows some manipulations of the search rectangle and region size. Footage courtesy of Hollywood Camera Work, available at https://www.hollywoodcamerawork.us/trackingplates.html
2013-11-20. 13:42 long.
Uses two planar trackers to track a foreground actress, to serve as moving masks for the primary 3-D planar tracker. Here, the foreground trackers are 2-D planar trackers, giving an introduction to using SynthEyes's 2-D planar tracking modes as well. Footage courtesy of Hollywood Camera Work, available at https://www.hollywoodcamerawork.us/trackingplates.html
2013-11-21. 10:50 long.◆
SynthEyes 1407 (https://www.ssontech.com) introduces quite a few new preferences to support auto-save and -increment operations, so that you can configure file versioning behavior. File/Save a Copy and File/Save Next Version are also new. This is a run-through of the new options.
2014-07-24. 7:28 long.
As of SynthEyes 1407, you can add "notes" to your camera views, as a way to communicate between tracking artists and supervisors, or just to remind you what you were doing before a long weekend.
2014-07-24. 5:10 long.
The SynthEyes Instructible Assistant, i.e. Synthia, enables chat-style natural language control. Unlike simpler assistants such as Siri or Google Now, Synthia responds to specific instructions, simple or complex, and can be instructed by the user in English to add additional functionality. Cloud communications make possible rapid improvement based on users' experiences. This tutorial shows the tip of the iceberg; for more information see the Synthia Manual in the SynthEyes product or demo version. It's fun and you'll learn a lot.
2014-07-24. 12:21 long.★★
This shows the effect of rolling shutter when the camera is subject to vibration --- here due to shooting hand-held in a helicopter. This is shot with a CMOS HV-20 camera, though it would be the same with any CMOS camera. Footage shot the same way with a CCD camera requires stabilization to be truly watchable, but each individual image is fine and the stabilized result is excellent.
2012-09-05. 0:05 long.
This classroom-length tutorial runs through many of the features used in supervised tracking to produce maximum accuracy. Beginners and experts alike will find helpful new features and better understand the purpose behind them.
2014-11-12. 40:07 long.★★★
Shows how to use SynthEyes's animated reference crosshairs, which can help supervised tracking when there are nearby linear features. Also has a little offset tracking. (SynthEyes 1411 or later)
2014-11-13. 4:49 long.
You can put English-language Synthia commands onto a toolbar, so that you can use them for quick and helpful workflow automation. To do that, you put commands (such as "make the selected trackers orange") into small text files. This tutorial shows you how, with examples that quickly manipulate trackers that are grouped by color.
2014-11-13. 5:56 long.
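As a minimal sketch of the text-file step described above (the folder SynthEyes scans for toolbar scripts and the file naming used here are assumptions; see the tutorial for the real locations):

```python
from pathlib import Path

# Hedged sketch: write one English-language Synthia command into a small
# text file, which the tutorial shows being attached to a toolbar button.
def write_synthia_toolbar_command(folder, name, command):
    """Create folder if needed and write the command as name.txt."""
    folder = Path(folder)
    folder.mkdir(parents=True, exist_ok=True)
    path = folder / (name + ".txt")
    path.write_text(command + "\n", encoding="utf-8")
    return path

# Example from the tutorial's own command text:
# write_synthia_toolbar_command("toolbar-scripts", "orange",
#                               "make the selected trackers orange")
```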
Object misplacement is often the hidden source of sliding in inserted objects. Users intrinsically understand that the tracks must be correct, but frequently fail to appreciate the importance of proper object placement relative to the trackers. This tutorial tries to correct that. BTW, the sliding demonstrated in the initial example occurred because the pyramid was intentionally located a bit below field level. Objects *above* the right location are more easily detected, so objects that are too low are a common mistake.
2014-11-13. 14:32 long.
The error curve mini-view shows the error of one or more trackers over the duration of a shot. This tutorial runs through the various modes of the display. SynthEyes 1502+. Imagery: See GoingThruPhase.zip in the Example files.
2015-01-31. 11:15 long.
This extended tutorial shows a 3-D scene being exported from SynthEyes to Fusion. There's a quick run-through of the shot being auto-tracked in SynthEyes, trackers cleaned up generically, and two meshes and a planar tracker being created. Once in Fusion, the tutorial runs through the different nodes and points out the lens distortion setup, then shows how to reconfigure for a true compositing-based insert. For more information, see "Fusion" in the "Exporting to Your Animation Package" section of the SynthEyes user manual. Source images: soccer.zip in the Example Files area. See also the "Softening Edges of Inserts" tutorial.
2015-01-31. 17:27 long.
The tutorial shows 3 planar trackers being exported from SynthEyes to Fusion for a sign replacement-type usage, and runs through some details of the flow in Fusion. The tutorial continues the "Multiple 3-D Planar Trackers" tutorial. SynthEyes 1502+.
2015-01-31. 5:35 long.
REVISED! This tutorial shows offset tracking being used for temporary tracking when the primary feature is occluded. It shows a complete sequence of a basic offset-tracked section, a second section with a different temporary feature, shows what happens when the tracking direction is changed, and shows different reference features being used during the same occlusion. New version for SynthEyes 1502+. See Walkway.zip in the Download Examples.
2015-01-31. 14:32 long.★★
When 3-D objects are overlaid onto background images during compositing, the edge is often unnaturally sharp. This tutorial shows one method to soften that edge, working exclusively in Fusion on the result of the Fusion 3-D Comp Export tutorial.
2015-01-31. 6:34 long.
When you work with a longer main shot and a shorter reference shot, it can be convenient to scrub them independently to different frame numbers, to facilitate linking the trackers from shot to shot. This tutorial shows how to do that, using a camera + perspective view.
2015-02-02. 2:41 long.
Shows a 3-level hierarchy being set up for the base, side, and top of a pizza box using SynthEyes's Geometric Hierarchy (GeoH) tracking features, then tracked.
2015-08-24. 25:01 long.
This is a basic introduction to geometric hierarchy tracking, so basic that it skips the hierarchy part altogether. Shows the tracking of a truck.
2015-08-24. 10:28 long.
Here we show some more features of the GeoH tracker, tracking a can of oats in a 4K QHD shot. Includes some simple re-keying and glitch fixing.
2015-08-24. 15:05 long.
This tutorial runs through a variety of features associated with deforming meshes in SynthEyes, as part of the Geometric Hierarchy tracking system, which can deform meshes to match the incoming shot. Covers different lasso and paint creation modes, and using an external image editor for setup. Describes constraints on overlap of children, parents, and siblings.
2015-08-24. 24:49 long.
Run-through of the display and mouse actions of the hierarchy view introduced in SynthEyes 1508.
2015-08-24. 10:34 long.★★★
The tutorial shows how to use a preliminary tripod track to essentially stabilize a shot for the benefit of the GeoH tracker, so the GeoH tracker knows better where to look. This is good for bouncy shots and also for "nearly-a-tripod" shots that are between nodal and regular 3-D shots.
2015-08-24. 10:59 long.
Shows a geometric hierarchy track within a regular 3D camera track, to track a moving object or maybe add some secondary tracking/animation to something. The key point is to get the object at the right distance/location in the scene, before starting tracking.
2015-08-24. 11:55 long.
In hybrid geometric hierarchy tracking, normal supervised trackers provide the raw tracking needed to power geometric hierarchy tracking, allowing you to produce object tracks, including hierarchical tracks, even for thin objects and very difficult images that aren't suitable for the more regular area-based geometric hierarchy tracking. A note on the track in this tutorial: you do see some jumping as the middle object switches back and forth between alternate solutions---the modeling needs to be more accurate.
2015-08-24. 12:41 long.
The tutorial shows the full 3D trajectory of a ball bouncing across a tennis court being created via hybrid Geometric Hierarchy (GeoH) Tracking, with the help of some supervised tracking and single-frame alignment. The point isn't to learn to do *this* particular task, but to see that you can use hybrid GeoH tracking to pull 3D information out of situations where you wouldn't otherwise be able to do so.
2015-08-24. 16:06 long.
GeoH tracking can be combined with regular moving object tracking, especially when there's no preexisting model, but some secondary tracking is required, or to be able to work with an unknown or zooming lens field of view. Starting with already-tracked supervised trackers, a moving object track is accomplished, then GeoH tracking added to accommodate the opening of the box. Then two additional levels of GeoH tracking are added to deform the corners of the lid to match up better, using a two-channel gradient.
2015-08-24. 18:38 long.
This tutorial shows a geometric-hierarchy-tracked scene being exported to Blender, in particular the pizza-box scene from earlier tutorials. It uses the auto-run capability of the export, and shows the resulting point-cache folder.
2015-08-24. 4:52 long.
Geometric Hierarchy (GeoH) Tracking can combine geometry tracking, mesh deformation, hierarchy, supervised tracking and more as a creative tracking and problem-solving tool. The tutorial works through an example overlaying a GeoH secondary track on a regular object solve. This tutorial is about what can be done, not button-by-button breakdowns: see the other tutorials for that.
2015-08-24. 17:27 long.★★★
You can use a 3D mesh as a reference for a camera-only match-move, for example to add a new building to an exact location in an existing environment. Here we use a flat plane textured with images from Google Maps as a reference (in lieu of a site plan).
2015-08-27. 23:21 long.★★★
Runs through some details of GPU- and software-based reading of RED Movie files in SynthEyes. If your GPU is suitable, it can provide 10x faster reading than software-based reading (which is limited to single-threading by the RED SDK). The best way to find out if your GPU is suitable is to try it! NOTE: GPU-based file reading was introduced in SynthEyes 1511.
2015-11-16. 8:55 long.
The mesh-deduplication features let you cut down the size of SNI files that repeatedly store the same large meshes. This tutorial runs through a sequence of scenarios, showing the intended uses of the de-duplication modes. For more details, see the section "Mesh De-Duplication" in the manual.
2015-11-16. 11:47 long.◆◆
The smudge tool allows you to edit a mesh to better match shot imagery, typically after the mesh has been match-moved to the shot. You can use it to refine the boundary or interior of objects. This tutorial runs through various smudge modes and controls.
2015-11-17. 8:56 long.◆◆
Shows using a photographic survey to create a 3D model of a small springhouse (building) out of geometric primitives. A number of small features are used to do this; it's a way for users to gain insight into how those features can be used, including survey shots and "lock z-drop", the # segment tool, smudge, edit pivots, etc. Note that with more source images, we might build the model directly from the trackers, instead of from primitives.
2015-11-20. 37:38 long.
Shows a tripod (nodal) shot being aligned in 3D by matching the imagery to a known mesh in the shot (in this case, the springhouse from the "Springhouse Modeling" tutorial), using the perspective view's Pinning Tool in Pin Scene mode.
2015-11-20. 9:57 long.
Obsolete: Use SynthEyes to solve the calibration shot, calculating rolling shutter. This is a run-through of how to set up Python to talk to SynthEyes, and vice versa, based on the material in the Configuring Python and SynthEyes section of the SyPy: Python Reference Manual, available in versions AFTER SynthEyes 1511, or from the Customer-Only area before that. Here's a recap of the folder names on all OSes: Windows: C:\Program Files\Andersson Technologies LLC\SynthEyes\SyPy into C:\Python27\Lib\site-packages; Mac OS X: /Applications/SynthEyes/SyPy into /Library/Python/2.6/site-packages; Linux: /opt/SynthEyes/SyPy into /usr/lib/python2.6/site-packages. Note that the folder names may need to be modified if you have other Python versions on your machine, or to install into the Python of another application.
2016-01-08. 8:25 long.
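The per-OS folder recap above can be sketched as a small lookup table. The paths are exactly the (Python 2-era) ones quoted in the text; adjust them for your own installation:

```python
import sys

# Source and destination folders for SyPy, as recapped in the text above.
SYPY_LOCATIONS = {
    "win32":  (r"C:\Program Files\Andersson Technologies LLC\SynthEyes\SyPy",
               r"C:\Python27\Lib\site-packages"),
    "darwin": ("/Applications/SynthEyes/SyPy",
               "/Library/Python/2.6/site-packages"),
    "linux":  ("/opt/SynthEyes/SyPy",
               "/usr/lib/python2.6/site-packages"),
}

def sypy_paths(platform=sys.platform):
    """Return the (source, destination) folder pair for this OS."""
    key = "linux" if platform.startswith("linux") else platform
    return SYPY_LOCATIONS[key]
```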
Obsolete: Use SynthEyes to solve the calibration shot, calculating rolling shutter. Shows the preparations to run the Rolling Shutter Analysis script: shooting, initial setup, auto-tracking, and tracker cleanup. This tutorial complements the separate overview tutorial, which talks about rolling shutter issues and shows results, and the python setup tutorial, which is necessary to be able to run the script. Requires SynthEyes versions AFTER 1511, or that you download the script and related materials from the Customer-Only area.
2016-01-08. 15:25 long.
Shows how to precisely measure the amount of rolling shutter distortion created by a given camera (in a given shooting mode). This is a valuable performance measure of a camera, which is otherwise hard to obtain. This particular tutorial is an overview of rolling shutter and shows the method in operation and some results. This tutorial isn't intended to be particularly software-specific, per se. Other related tutorials show more detailed button-pushing steps, and the (python) setup required to use this method. Note that SynthEyes versions AFTER 1511 are required, or you can download the script and related material from the customer-only area of the website.
2016-01-08. 10:13 long.★★★
Shows a shot being tracked and solved in SynthEyes, exported to After Effects, and a simple 3-D floating-text layer added. Intentionally, this tutorial does not address lens distortion; see the additional tutorials for that. Shot credit: Artbeats
2016-01-14. 8:17 long.
Shows a shot with lens distortion being tracked and solved in SynthEyes, then the one-pass lens workflow being used and the scene exported to After Effects. The one-pass workflow produces shots with no lens distortion (i.e. it has been removed) for delivery to the client. Includes adding a small flag in a 3-D layer to the shot. Shot credit: Arnie Itzkowitz, Aerial Exposures.
2016-01-15. 9:04 long.
Shows the two-pass lens distortion workflow, which allows you to composite 3-D effects over the original footage with distortion. Here, a shot is tracked and 3-D solved, then a planar tracker added before proceeding with the lens workflow and export. Shows the undistorted and redistorted comps within After Effects. Shot credit: Arnie Itzkowitz, Aerial Exposures.
2016-01-16. 14:28 long.
This is an introduction to what we're going to be doing in this 12-part tutorial: tracking and stabilizing an example shot from 360 Heros. We then insert a variety of objects into it, including adding shadows, and finally at the end we'll composite the whole thing together in After Effects. Footage used with permission of 360 Heros. 3D Models licensed from Turbosquid.
2016-05-09. 5:49 long.
This key section shows how to 3D-track the 360VR shot by creating and tracking a normal linear perspective version of it; SynthEyes has the tools built in to do that quickly and easily, and to convert back and forth as needed. The result is a nice world-stabilized shot.
2016-05-09. 10:07 long.
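The linearization rests on the standard equirectangular mapping between pixels and view directions. Here is a minimal sketch of that mapping; the axis conventions (x right, y up, z forward) are illustrative assumptions, not necessarily SynthEyes's:

```python
import math

# Map an equirectangular pixel to a unit view direction: longitude spans
# the image width (-pi..pi), latitude spans the height (-pi/2..pi/2).
def equirect_pixel_to_direction(px, py, width, height):
    lon = (px / width - 0.5) * 2.0 * math.pi
    lat = (0.5 - py / height) * math.pi
    return (math.cos(lat) * math.sin(lon),   # x (right)
            math.sin(lat),                   # y (up)
            math.cos(lat) * math.cos(lon))   # z (forward)
```

A linear perspective view is then generated by sampling the 360VR frame along the directions inside the virtual camera's frustum.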
The initial 3D track creates trackers only in the area that the generated linear camera is looking at. We'd like to have more trackers throughout the 3D environment to use for reference when inserting objects; here we show how to get them using the Add Many dialog.
2016-05-09. 4:47 long.
Here we show how to generate more zero-weighted trackers throughout the entire shot by auto-tracking it. This may necessitate some garbage-matte roto work to mask out the camera platform, but it can generate a nice distribution throughout the shot without much thought.
2016-05-09. 9:40 long.
Normally 360 VR shots are world-stabilized in SynthEyes, so that the virtual and real worlds stay aligned the same way throughout the shot and the viewer can easily understand the relationship. Here we show how to align the "forward" direction with the camera's path instead, if that is desired as an artistic choice, especially for dynamic shots.
2016-05-09. 6:43 long.
Shows how to set up the perspective view for 360 VR shots, then inserts a building into the shot. Shows how to determine the lighting direction or placement. You can generate a quick 360 VR test movie right out of SynthEyes at this point!
2016-05-09. 13:07 long.
We show that you can export to Blender and use its built-in 360 degree camera to render inserts (using the Cycles renderer). This approach can be used with other applications that have a 360 VR camera, even if the exporter doesn't support it, by changing a regular camera to 360 VR.
2016-05-09. 10:46 long.
Shows how to render inserted objects using any rendering application, even if it does not have a 360 VR camera built-in. A script creates a normal perspective camera that precisely follows the inserted object; that scene is exported and rendered. SynthEyes then converts the images back to 360 VR. Despite the somewhat more complex process, the tight render view results in a quick overall render time.
2016-05-09. 10:55 long.
Shows the insertion of a floating hot-air balloon. Because it comes close to the camera, it is particularly subject to jitter in the camera path. Some filtering is used to smooth that out.
2016-05-09. 22:30 long.
The inserted building needs to cast a shadow onto a sloping hillside below it. This part shows the creation of a shadow-catching object textured with that shadow, so it can be rendered as its own layer.
2016-05-09. 7:52 long.
Because 360 VR cameras are composites of multiple (CMOS) cameras, the images aren't as solid as for normal 3D tracking. While we need the entire image to generate 3D information, individual sections can shift relative to the overall solve. Here we use a script to anchor an insert as much as possible to its local image.
2016-05-09. 6:36 long.
Here we show the final assembly of the 360 VR video, using After Effects. Contains a small number of compositing tweaks, such as adjusting the levels, blur, and opacity.
2016-05-09. 11:12 long.
This is the 360 VR tutorial's output without the tagging that tells YouTube that it is a 360 VR video. You can use this to view the final results in 2D.
2016-05-09. 0:14 long.
This is the 360 VR tutorial's final output, tagged as a 360 VR shot to YouTube, so that you can view it with Google Cardboard or other compatible 360 VR viewers.
2016-05-09. 0:14 long.
Shows the updated perspective view locking controls and preferences, which allow direct control over which camera a given perspective view is locked to. SynthEyes 1608+
2016-08-24. 7:29 long.
Here we show the simpler version of 360VR stabilization, which doesn't use a full 3D solve of the scene, for shots where an absolute world orientation is not required. Suitable for short shots using Peg mode, or long shots using low-pass-filtering-based stabilization. Footage courtesy 360Rize.com
2016-08-24. 12:05 long.
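As a rough illustration of the low-pass-filtering idea, here is a minimal 1-D sketch: the filtered path keeps the slow, intentional motion, and the per-frame correction is the difference between the raw and smoothed values. Real 360VR stabilization filters full 3-D rotations; the simple exponential filter and its `alpha` constant here are illustrative assumptions only.

```python
# Exponential low-pass filter over a sampled 1-D pan angle (one per frame).
def low_pass(samples, alpha=0.1):
    out, state = [], samples[0]
    for s in samples:
        state += alpha * (s - state)   # pull toward the raw sample
        out.append(state)
    return out

# Per-frame stabilizing correction: subtract the smoothed path from the raw.
def corrections(samples, alpha=0.1):
    return [raw - smooth
            for raw, smooth in zip(samples, low_pass(samples, alpha))]
```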
Shows how the Create Spherical Screen script can be used to export 360VR shots to downstream applications, including the effect of 360VR stabilization in SynthEyes. Exports to Blackmagic Design's Fusion. Full use of the 360VR result in Fusion requires a 360VR Fusion camera, perhaps Domemaster from Andrew Hazelden. Footage courtesy 360Rize.com
2016-08-24. 9:49 long.
Demonstrates a way to export normal or 360VR stabilization from SynthEyes to other apps for compositing, so that the actual image manipulation is performed there. This can be done using animated image distortion maps to describe the effect of the stabilization. Here we show it with Blackmagic Design's Fusion. Footage courtesy 360Rize.com
2016-08-24. 11:22 long.
Gives a quick run-through of the spherical screen now built into the perspective view when it is handling 360VR shots. This eliminates the need to run the Create Spherical Screen script for the purpose of viewing (it's still useful for exporting). Footage courtesy 360Rize.com
2016-08-24. 3:34 long.
Shows the operation of the newly improved Deglitch tool in the graph editor: it is a bit more subtle in determining what it should do. It also has a nice additional piece of functionality: splitting a tracker via shift-click when the tracker has jumped from one feature to another and both parts are usable individually.
2016-08-24. 12:11 long.
SynthEyes can measure the average illumination level of a tracker over time, and store that on the tracker, a light, or a mesh. This is useful for handling shots where the lighting varies rapidly during the shot, for example flickering lights, explosions, etc.
2016-08-24. 21:56 long.
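As a toy illustration of the idea (not SynthEyes's implementation), averaging the pixel values in a tracker's patch on each frame and normalizing to the first frame yields a per-frame illumination curve that can drive a light or mesh brightness:

```python
# patches: one small 2-D list of gray values per frame, e.g. the pixels
# under a tracker. Returns relative illumination, frame 0 == 1.0.
def illumination_curve(patches):
    def mean(patch):
        vals = [v for row in patch for v in row]
        return sum(vals) / len(vals)
    base = mean(patches[0])
    return [mean(p) / base for p in patches]
```

For a flickering light, the curve dips on the dark frames, which is exactly the variation that needs to be reproduced on inserted objects.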
The Number Zone shows the value of each displayed channel on the current frame. This tutorial shows how to turn it on and off, change the sizing, and even change the displayed numbers!
2016-08-24. 2:30 long.
Provides an overview of the system for saving and restoring window placements and settings as windows, and SynthEyes itself, open and close. Points out some of the hazards with that, and some preferences you can use to avoid them.
2016-08-24. 15:52 long.
Shows how you can export the 360VR stabilization that you do in SynthEyes to AfterEffects, so that you can do all your 2D compositing work in AfterEffects, without having to generate stabilized images in SynthEyes. Requires a plugin from the Customer-Only portion of the website (1608) or included within SynthEyes (after 1608). Disabled for SynthEyes Demo. Footage courtesy of 360Rize.com
2016-08-26. 4:56 long.
When a car or plane crashes today, it is often observed by surveillance or witness video cameras. Analyzing that video to determine the path of the vehicles can help identify the cause. This two-part tutorial discusses techniques for doing that using SynthEyes, with example footage of an aircraft taxiing. This first section shows the tracking of a nearly stationary camera to remove the effect of camera vibration.
2016-08-31. 25:40 long.
The second section shows the plane being tracked, the tracks being examined, and the path filtered. Each task shown is an introduction; other tutorials show additional techniques and more detail. Footage is available in the Downloads section (AirSynth.zip), though unfortunately the 3D model cannot be distributed.
2016-08-31. 25:50 long.
Shows how to change the aim direction of the linearized camera view, so it can continue to follow the best trackable area, as well as how to solve just a portion of a shot, then work through the rest, for early detection of problems.
2016-09-13. 11:31 long.
Shows how to quickly delete spurious trackers after auto-tracking using a slightly subtle feature of the tracker panel's Delete button. This is especially useful for 360VR shots, which may have spurious trackers on the camera mount or out in the sky.
2017-02-15. 1:09 long.
Shows how to calibrate a fisheye lens from live shots that contain a number of straight-line features. While the technique is generally useful, even for non-fisheye lenses, this particular tutorial shows the additional steps required for "full-circle" fisheye lenses, where the image is vignetted and does not extend to the edge of the rectangular image plane.
2017-02-15. 20:31 long.
Shows how to use the By Hand button to animate a smooth splined tracker path for an occluded tracker, which is especially useful for object tracking where there are often relatively few trackers.
2017-02-15. 11:23 long.
Walks through various options of the Lens Master Calibration script while calibrating a somewhat distorted lens (Canon 24-70mm L at 24mm), especially the results of the different workflow selections.
2017-02-15. 13:57 long.
Shows techniques for native 360VR solving, including how to set up autotracking and rough rotoscoping. In many cases, this method will replace the linearization-based 360VR handling discussed in earlier tutorials. Footage courtesy 360Rize.
2017-02-15. 9:39 long.
Shows how to determine the optical center by intentionally causing vignetting. This can be done easily and accurately in the field using a nesting collection of filter step-down rings. This technique is especially useful when the lens has little distortion.
2017-02-15. 6:38 long.
Shows how to do lens calibration using a novel random-dot method that generates a very dense data set for accurate calibration, including a full 360 degree scan that guarantees an accurate field of view. This technique is intended for reference calibrations to examine equipment and techniques carefully, not for calibrating shoots out in the field.
2017-02-15. 30:39 long.
Shows the extensive capabilities of the export from SynthEyes to HitFilm Pro, including 3D meshes, planes and planar trackers, and lens distortion workflows. Uses the GoingThruPhase.zip download.
2017-04-11. 22:49 long.
Shows a simpler form of filtering-based stabilization for long 360VR shots by breaking them into pieces, stabilizing the pieces, then recombining them. (See https://youtu.be/lViQjXzqcW8 for the better, more rigid, method.) Shows the resulting stabilization data being exported to AfterEffects, in this case for the use of SkyBox's plugin. A SynthEyes-provided plugin can also be used; data can also be exported to HitFilm.
2017-04-13. 25:23 long.
Stabilization is essential for safely viewing 360VR footage from moving cameras. This introductory tutorial shows the stabilization of a 360VR sequence from an ultra-light aircraft (courtesy 360Rize), and the export of that stabilization to After Effects. This demo was presented at the 2017 NAB Show.
2017-04-28. 10:48 long.
Shows how to align the horizon/up direction of a 360VR shot by literally rotating the entire image sphere around. Not the simplest way, but gives good physical insight for whatever method you use.
2017-05-02. 6:02 long.
Shows how to align the horizon/up direction of a 360VR shot using the horizon line in the perspective view as a reference. Shows using the perspective view to look around in a 360VR shot as well.
2017-05-02. 5:19 long.
Shows how to align the up direction/horizon line of a shot where the camera is at a fixed height over the ground plane for the duration of the shot, typically a camera on a vehicle rolling over level ground.
2017-05-02. 5:32 long.
Shows how to align the up direction/horizon line of a 360VR movie by using 3 trackers located on level ground. This is a simple example of Coordinate System Setup in SynthEyes.
2017-05-02. 10:14 long.
Shows how to set the viewing direction at the beginning of a 360VR movie, by panning the entire shot in the 3D environment.
2017-05-02. 3:32 long.
Quick recap of how to do horizon line alignment using a single level "golden" frame in the original shot. Use the "Level Scene from Frame" script to do this quickly.
2017-05-02. 1:47 long.
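The leveling performed here can be sketched as finding the rotation that takes the measured "up" direction on the golden frame to the world up axis. This Rodrigues-based version is a generic illustration under assumed axis conventions (y up), not the script's actual code:

```python
import math

def rodrigues(axis, c, s):
    """Rotation matrix about unit `axis`, given cos and sin of the angle."""
    x, y, z = axis
    C = 1.0 - c
    return [[c + x*x*C,    x*y*C - z*s,  x*z*C + y*s],
            [y*x*C + z*s,  c + y*y*C,    y*z*C - x*s],
            [z*x*C - y*s,  z*y*C + x*s,  c + z*z*C]]

def level_rotation(up):
    """Rotation taking unit vector `up` to world up (0, 1, 0)."""
    ux, uy, uz = up
    ax, az = -uz, ux                 # rotation axis = up x (0, 1, 0)
    s = math.hypot(ax, az)           # sin of the tilt angle
    if s < 1e-12:                    # already level (or exactly flipped)
        return [[1.0, 0, 0], [0, 1.0, 0], [0, 0, 1.0]]
    return rodrigues((ax/s, 0.0, az/s), uy, s)

def apply(m, v):
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))
```

Applying the same rotation to the whole scene (cameras, trackers, meshes) levels it without changing any relative geometry.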
Shows how to solve long shots---regular or 360VR---by processing them as individual pieces, then combining the results. The tutorial shows an almost 19 minute (33,659 frame) 360VR shot being tracked, solved, and stabilized. Input footage is at https://youtu.be/dtaTAJcGCXI and the result at https://youtu.be/MH6ZsWR8Zzc
2017-05-18. 21:00 long.
This is a 360VR tour of the interpretive trail at Black Rock Sanctuary, Chester County, PA, as shot on a Ricoh Theta S camera (~2K with only accelerometer-based in-camera stabilization). See: 2:50, 5:55, 8:40, 10:45 also 13:55, 15:45, 17:25. This is the raw footage for our Solving Long Shots tutorial (https://youtu.be/lViQjXzqcW8). The resulting stabilized, and much more watchable, version of this video can be found at https://youtu.be/MH6ZsWR8Zzc You can open both in separate tabs, and compare the two.
2017-05-18. 18:44 long.
This is a 360VR tour of the interpretive trail at Black Rock Sanctuary, Chester County, PA. SynthEyes was used to stabilize the original Ricoh Theta S footage (https://youtu.be/dtaTAJcGCXI) using the method described in https://youtu.be/lViQjXzqcW8 The virtual-world/real-world lock is fixed throughout the shot, which is the hallmark of advanced SynthEyes stabilization. As a result, the shot is very watchable. See 2:50, 5:55, 8:40, 10:45 also 13:55, 15:45, 17:25 for trickier spots. Of course, to be a good tour we really need a higher-resolution camera! Remaining artifacts in the footage: up-and-down motion due to walking, some unfixed tracking issues, and most significantly rolling shutter (jello) in the two CMOS camera sensors, which manifests itself in spots where a portion of the shot moves while the rest remains stationary. So make that a higher-resolution global shutter camera!
2017-05-18. 18:44 long.
Shows coordinate system setup including scaling, shadow-object creation, export to Fusion, then a walkthrough of the Fusion nodes created, including (composite) 360VR camera, stabilizer path, and viewer.
2017-06-16. 17:30 long.
Input to the "Exporting 360VR to Fusion" tutorial. Shot on a Ricoh Theta S, which has some built-in stabilization of its own, in Valley Forge, PA.
2017-06-16. 0:32 long.
Output of the "Exporting 360VR to Fusion" tutorial. This version includes the point cloud, as shown in the Fusion viewports, which allows the tracking to be seen. See the Cloudless version of this file to see what an actual viewer would be watching.
2017-06-16. 0:23 long.
This is the output of the "Exporting 360VR to Fusion" tutorial, but without the point cloud. This is a bit less interesting but would be the actual output.
2017-06-16. 0:23 long.
SynthEyes can export 3D meshes to After Effects, automatically setting up a CINEWARE connection to Cinema 4D. Cinema 4D handles the 3D animation and rendering that will all match up for compositing in After Effects, using the tracking data produced by SynthEyes.
2017-08-31. 9:17 long.
Shows a 360VR shot in SynthEyes being exported to the After Effects 3D environment, including the integrated export to Cinema 4D to handle the 3D meshes in the SynthEyes scene. Note that for AE to do 360VR rendering, you must have installed the cube map converter included with SynthEyes. Insta360 Pro footage. 360VR source and output posted separately.
2017-09-15. 14:52 long.
You can export from SynthEyes to After Effects's 3D environment to work on 360 VR shots for logo and screen insertions and other 2.5D compositing effects just within AE. You do NOT have to use Cinema 4D. This shows two ways to set up flat inserts from SynthEyes, or you can do it directly in After Effects's 3D environment. Insta360 Pro footage. 360VR source and output posted separately.
2017-09-16. 11:15 long.
Output of the "Exporting 360VR to After Effects's 3D Environment, with Cinema 4D" tutorial. Original: heavy-duty Benro monopod-mounted Insta360 Pro footage at 4K.
2017-09-16. 0:12 long.
Output of the "Exporting 360VR to After Effects's 3D Environment, for 2.5 D" tutorial. This result shows the mesh planes, as positioned in SynthEyes, with the tip of the right foot at the tracked point. The images do not continuously reorient to the camera. Original: heavy-duty Benro monopod-mounted Insta360 Pro footage at 4K.
2017-09-16. 0:12 long.
Output of the "Exporting 360VR to After Effects's 3D Environment, for 2.5 D" tutorial. This result shows the tracker planes, which continuously orient to the camera, with an anchor point BELOW and between the feet, ie Bottom Center in the roto'd image. Original: heavy-duty Benro monopod-mounted Insta360 Pro footage at 4K.
2017-09-16. 0:12 long.
This imagery is used as input for several tutorials. Shot on an Insta360 Pro mounted on a heavy-duty Benro monopod.
2017-09-16. 0:12 long.
Shows how to use the Find Erratic Trackers tool to help locate bad trackers *before* solving a scene. Such trackers commonly arise in aerial helicopter and drone shots, due to vehicles and people moving. Discusses controls and limitations.
2018-06-27. 18:38 long.
SynthEyes 1806 has new algorithms and controls to help solve large shots, making much longer shots now feasible to solve in one piece. Shows solving a 17,700 frame 360VR shot with 7780 trackers, setting it up and working through the solve.
2018-06-27. 43:04 long.
Shows how to solve shots with more distortion, such as from action cams and drones, using the cubic and quartic solving in SynthEyes 1806. Discusses how to monitor overfitting using canary zero-weighted trackers. Finally, shows exporting the scene to After Effects using the AE distortion plugin, which is updated for 1806.
2018-06-27. 23:06 long.
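As a hedged illustration of what the higher-order terms add (this is a generic polynomial radial distortion model, not necessarily SynthEyes's exact formulation), the cubic and quartic coefficients give extra degrees of freedom beyond the classic quadratic term:

```python
# Generic radial distortion sketch: r' = r * (1 + k2*r^2 + k3*r^3 + k4*r^4)
# applied to normalized image coordinates (x, y) about the optical center.
# The coefficient names k2/k3/k4 are illustrative assumptions.
def distort(x, y, k2, k3=0.0, k4=0.0):
    r2 = x * x + y * y
    r = r2 ** 0.5
    f = 1.0 + k2 * r2 + k3 * r2 * r + k4 * r2 * r2
    return x * f, y * f
```

Overfitting shows up when the extra terms fit the solved trackers well but predict the held-out "canary" zero-weighted trackers poorly.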
When a tripod shot doesn't contain enough significant information, it is impossible to determine a field of view from it. When that happens, the solve will produce indeterminate results in both 1806 and 1709, although you may think that 1709's results are good. This video addresses customer concerns that 1806 isn't handling these shots properly, showing what needs to be done in ALL versions: switch to Known lens mode. It shows how the Fuzz First Path control (Tripod Fuzz in 1809+) can be used, and gives some more insight into the solver in general.
2018-08-31. 6:12 long.
SynthEyes 1809+ contains special processing to easily and correctly match-move tripod shots where there is only a single valid tracker during part or all of the shot.
2018-09-09. 7:51 long.
The rig lets you export a shot's stabilization to 3D animation and compositing packages, so stabilized images can be generated there instead of being written by SynthEyes. The tutorial shows how rigs are created for various cases, including exports to After Effects, Fusion, and Blender in the process. Requires updated scripts!
2018-11-02. 39:03 long.
This tutorial introduces the ViewShift system of SynthEyes, which leverages camera tracking data and 3D models to change the viewpoint of imagery to accomplish object removal, split-take combination, and animated texture map creation. Here we walk through the control panel on an example, removing a car from a shot under control of an animated spline. We also show illumination level compensation for higher-quality inserts. Imagery download: CarVS.zip
2019-03-25. 22:21 long.
Here we show the use of 3D meshes to determine the region to shift, instead of animated splines. We show a number of problems that come up, and techniques to address them. Imagery download: GoingThruPhase.zip
2019-03-25. 21:13 long.
This is a quick run-through of using ViewShift to generate an animated texture map. It also shows how to apply it for visualization inside SynthEyes. Imagery download: Tower.zip
2019-03-25. 5:35 long.
ViewShift can work from a manually-cleaned plate as the source shot, not only from an entire shot. This tutorial shows some small scripts that accelerate setup when the clean plate is one of the frames in the main shot.
2019-05-10. 8:00 long.
SynthEyes 2106 now offers tooltips and, optionally, menus, script names, and user interface elements in 25 languages. It also supports international Unicode UTF-8 characters in file, tracker, camera, mesh, etc. names, as well as in notes and file descriptions.
2021-06-03. 7:30 long.
Shows how you can use the AprilTags tracking capability of SynthEyes for object tracking, illustrated by tracking two tags taped to a hand-held phone. Source and result footage is available on our website in the Downloads | Example Files area.
2021-05-21. 5:37 long.
Shows a box with 3 AprilTags being tracked using Corner Tracker mode, combined as a single moving-object track. Bonus: a coordinate system setup along the box's diagonal, resulting in the box becoming the hilt of a sword.
2021-05-26. 10:34 long.
Shows how to align a shot to a reference shot, using 3 or more AprilTags in the scene for easy alignment. This example uses an outdoor scene from a Wiral cable rig, though AprilTags may be even more useful indoors and in green-screen shots.
2021-05-30. 8:59 long.
SynthEyes 2204 has many improvements to increase the usability of flexes, which are curves in 3-D space, for modeling and other tasks. This tutorial shows how you can use them.
2022-03-23. 4:02 long.
Gives an overview of new features in SynthEyes 2204 to look out for, PLUS talks about the new SyFlo2 license manager and the two new managed license types. Important: you'll need to upgrade everything to 2204 and SyFlo2 at once!
2022-03-19. 10:38 long.
SynthEyes 2204 now offers Apple ProRes codec support in Windows and Linux; previously it was available only in macOS. The tutorial runs through features of the ProRes support, including how to write ProRes files with colorimetry matching the original. A new feature uses presets to ensure Save Sequence always writes with the correct colors.
2022-03-21. 6:01 long.
Rectify Grid is a new capability to "just fix" arbitrarily distorted lens grids, producing STmap image distortion maps. In addition to describing that, the tutorial discusses creating and shooting lens grids, regardless of what software will process them, in hopes of addressing the plague of disastrous lens grids that customers submit for analysis. Crucially, lens grids must reach the image's edges!
2022-03-22. 7:28 long.
SynthEyes 2204 now features frame number, timestamp, and timecode burn-in to Camera and Perspective Views, and Save Sequence and Preview Movie output. This tutorial shows the controls and various details.
2022-03-22. 3:46 long.
In some situations, you need to UNsolve parts of a scene: when a solve goes wrong, or when the solver detects an error and requires it after you've deleted too many trackers. A new graph editor channel, #Solved, shows the number of solved trackers on each frame, providing guidance and insight into the solver's operation.
2022-03-28. 6:03 long.
SynthEyes 2204 introduces a powerful new exporter for the Universal Scene Description (USD) file format, which may eventually supersede the Filmbox (FBX) and Alembic (ABC) file formats. We show its operation, plus a number of helpful tricks, including using the USD Toolset for viewing and format conversion.
2022-03-30. 7:55 long.
Walks through the theory and operation of roto-masking features in SynthEyes 2210: creating splines, including spline setup via trackers; assigning blips via splines; visualizing and outputting mattes; and working with imported mattes.
2022-10-11. 19:17 long.
Discusses anamorphic distance, an optical feature of anamorphic lenses with significant, but little understood, impact on matchmoving and graphics, which can explain a number of their unusual properties. Shows how SynthEyes can calculate it, how it can be compensated for, what that looks like, and how it can be handled in downstream applications.
2023-04-14. 14:14 long.
Shows how SynthEyes's lens parameter solving modes can be used to determine animated lens distortion parameters on a shot with a large zoom. "Animate on keys" mode allows a tailored distortion curve, preventing the animated distortion parameters from introducing jitter into the solved shot.
2023-04-14. 7:01 long.
Shows a rack focus shot of a lens grid being analyzed using "animate by frames" parameter solving mode to examine the horizontal and vertical scaling changes produced by lens focus breathing. This process can be used on simple but frequently encountered shots, such as a hand-held but static shot of an actor with a change in focus.
2023-04-14. 11:13 long.
This follow-on focus breathing tutorial looks at a somewhat more realistic shot, using the "Animate on Keys" lens parameter solving mode to control parameter jitter when there are fewer trackers available.
2023-04-14. 12:15 long.
This extensive, detailed tutorial shows an anamorphic shot being locked to a lidar scan. The resulting solve is used as a starting point for several additional tutorials. This process can also be used for non-anamorphic shots, for example matching architectural models to drone shots (using a standard radial lens model).
2023-04-14. 24:53 long.
Runs through the many new features of the Nuke exporter in SynthEyes 2304, including distortion export to Nuke Lens Distortion nodes (not STmaps) using overscan rendering if the Lens Workflow script has not been run; supporting both 1- and 2-pass workflows if it has. Other features include projection screens, auto-run, paste-to-clipboard, and New 3-D support.
2023-04-14. 11:20 long.
This tutorial shows what the field of view produced by SynthEyes corresponds to, and introduces a new script that produces numbers comparable to lens barrel values and spec sheet numbers---and also the sensor size value required for any of those numbers to have plausible accuracy.
2023-04-14. 6:24 long.
"SynthEyes easily is the best camera match mover and object tracker out there." -- Matthew Merkovich