Determining True Field of View and Focal Length

This tutorial shows what the field of view produced by SynthEyes corresponds to, and introduces a new script that produces numbers comparable to lens barrel values and spec sheet numbers, as well as the sensor size value required for any of those numbers to have plausible accuracy.

Script for Search Engines:

Hey, this is Russ Andersson. Often, people want to compare their tracking results to the focal length shown on the lens barrel. That has always been tricky, for several reasons. First, the lens barrel value is only approximate, give or take several percent. Second, it depends on the lens focus distance and, to a lesser extent, the aperture setting. Third, the relationship between the field of view, as determined by the tracking software, and the focal length depends on the camera's sensor size; focal length is only "half a number" in that regard. You have to dig into the camera specifications to find the sensor size, and the raw sensor size isn't quite the right number: you need to know the size of the part of the sensor that produced your image, which is often a cropped portion of the real sensor. Sometimes a vendor will provide that, but accurate numbers are still hard to come by. Despite all this, people still persist in wanting to make this comparison. And if you look carefully, there's a fourth factor to consider when using match-moving software: the definition of the field of view produced by SynthEyes.

As an example, we're going to take a look at an underwater GoPro shot, which has quite a bit of distortion. We'll start with a quick and dirty solve of the shot, beginning by increasing the number of trackers to 300. We'll run the auto-tracker, and on the solver panel we'll switch to the radial-plus-squash distortion model and compute just the quadratic distortion to start with. Once we've got our initial solve, we can turn on a few more of those values and refine the solve. We get a reasonable solve from this without even looking at the trackers to clean them up.

The solver produces this 84-degree field of view, and the question is: well, what does that correspond to? If we go to the perspective view (I'm just going to right-click and pan out a little bit), you'll see the image here, dynamically undistorted by the perspective view. The field of view of the image itself varies depending on where you are in the image: across the top here, the horizontal field of view reaches all the way out there, and similarly down here, but across the middle it's not as big. Same thing in the vertical direction: a smaller field of view in the middle, and a larger vertical field of view toward the edges. So the solver is reporting the field of view of this pink rectangle here, which is the nominal undistorted image that the solver considers.

Now if I go and run the lens workflow (we'll use Lens Workflow #2), this is the image being produced by the image preprocessor, and the Lens Workflow script has adjusted the field of view to a larger value, reflecting the fact that the image has been squished down to fit, if you like, so that all parts of the image are visible. The solver always produces just this one number, corresponding to the output of the image preprocessor or the solver itself; it always stays self-consistent between the image and that value, and that's what you need for match-moving purposes. But the sensor size value corresponds to the original image, not this modified bounding box.

So to take all of that into account, we need to run the True Field of View script. It looks at the situation and produces an actual horizontal image field of view, measured across the middle here; a vertical image field of view, here; and a diagonal image field of view.
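For reference, here is a minimal Python sketch of the underlying pinhole-lens relationship between focal length, the size of the sensor region that produced the image, and the resulting field of view. The function names and numbers are made up for illustration (they are not taken from the shot in the video); the point is simply that the same focal length yields different field-of-view values depending on which sensor or crop width you pair it with.

```python
import math

def fov_degrees(focal_length_mm, sensor_side_mm):
    # Pinhole-lens field of view across one side (width, height, or diagonal)
    # of the sensor region that actually produced the image.
    return 2.0 * math.degrees(math.atan(sensor_side_mm / (2.0 * focal_length_mm)))

def focal_length_mm(fov_deg, sensor_side_mm):
    # Inverse: the focal length implied by a field of view and a sensor side.
    return sensor_side_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

# Illustrative (made-up) numbers: the same 3 mm lens paired with two
# different assumed widths gives two different horizontal FOVs, which is
# why a focal length by itself is only "half a number".
print(fov_degrees(3.0, 6.17))   # full sensor width    -> about 91.6 degrees
print(fov_degrees(3.0, 5.37))   # cropped active width -> about 83.7 degrees
```

The same formula applied to the height or diagonal of that sensor region gives the vertical and diagonal fields of view.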
To go with those field-of-view numbers, there's a true focal length value; it doesn't depend on which of those numbers you start from, but it is all based on the actual sensor width. So I get this same set of numbers regardless of what happens: if I undo the lens workflow and run the script again, I still get the same set of numbers, and it's the same set I'd get if I ran Lens Workflow #1 as well. I hope this gives you some more insight into what the various numbers mean, and what it takes to get values you can compare to spec-sheet values. SynthEyes produces precise, self-consistent numbers that allow you to insert new 3-D elements at a pixel-by-pixel level. But that's different from absolute accuracy: whether it's output from SynthEyes or a value from a spec sheet, the numbers are only estimates. Take care, and thanks for watching.
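To illustrate why that invariance holds, here is a small sketch using the general pinhole relationship with made-up numbers (not SynthEyes's internal computation): when the lens workflow pads the image out to a larger bounding box, the reported field of view grows, but the effective sensor width covered by that padded frame grows by the same factor, so converting back with the matching width lands on the same focal length.

```python
import math

def focal_from_fov(fov_deg, effective_width_mm):
    # Focal length implied by a horizontal FOV and the width of the sensor
    # region that the image in question actually covers.
    return effective_width_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

# Made-up illustration: the nominal undistorted frame vs. a padded frame
# that is 1.25x wider.  The padded frame reports a larger FOV, but it also
# covers a 1.25x wider effective sensor region, so the focal length matches.
nominal = focal_from_fov(84.0, 5.4)     # nominal undistorted frame
padded  = focal_from_fov(96.76, 6.75)   # padded frame, 1.25x wider

print(round(nominal, 2), round(padded, 2))   # both come out to about 3.0 mm
```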
