Stereoscopic (3D) Filmmaking

by Rogers Jayzee

Anaglyph 3D is a type of stereo image that combines a left and a right picture into a single image, which the viewer then separates back into left and right images using colored filters placed over the eyes. The most common filtering colors used for Anaglyph 3D glasses are red and cyan.
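To make the filtering idea concrete, here is a minimal sketch in Python using NumPy. The `make_anaglyph` function and the tiny synthetic stereo pair are my own illustration, not from the article: a red/cyan anaglyph can be built by taking the red channel from the left image and the green and blue channels from the right image, which is what the red and cyan filters then undo for each eye.

```python
import numpy as np

def make_anaglyph(left, right):
    """Combine a left/right RGB pair (H x W x 3 uint8 arrays) into a
    red/cyan anaglyph: red channel from the left image, green and
    blue channels from the right image."""
    anaglyph = right.copy()
    anaglyph[..., 0] = left[..., 0]  # red comes from the left eye's view
    return anaglyph

# Tiny synthetic stereo pair: the right view is the left view shifted
# horizontally, mimicking the parallax between two cameras.
left = np.zeros((4, 6, 3), dtype=np.uint8)
left[:, 1:3] = 255                # a white bar in the left view
right = np.roll(left, 1, axis=1)  # same bar, shifted one pixel

combined = make_anaglyph(left, right)
```

A red filter passes only the red channel (the left view) to one eye, and a cyan filter passes only the green and blue channels (the right view) to the other, which is how the single combined image is separated back into two.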

Briefly, triple-flash polarized 3D (RealD 3D) is the popular new delivery technique pioneered by RealD Inc. Like Anaglyph 3D, it uses special filters over the viewer's eyes to separate the left and right images, except that these filters separate different polarities of light instead of color.

The other main difference is that RealD images are not combined into a single image; each image completely maintains its integrity as a whole image. The left and right images are shown sequentially to the viewer at such a high rate that, without the special polarized glasses, they look like a single blurry image. When you put the RealD polarized glasses on, the images are filtered in such a way that the left eye sees only the left image being projected and the right eye sees only the right image.

The demonstration I will show is for Anaglyph 3D images. However, it should be noted that regardless of how the images are delivered (anaglyph, RealD, View-Master, etc.), the same principles apply to the photography of the images. What you do with the left and right images once photographed is up to you.

It is a commonly held belief that you must place your cameras 2.5 inches apart if you want to create good stereo images. This idea was born from the fact that human eyes are approximately 2.5 inches apart, so, to recreate a 3D view that matches our own way of seeing, we must place the cameras 2.5 inches apart.

The truth is that there is no such rule. It is a myth believed only by people who have a limited understanding of the process.

It’s kind of similar to the way many people believe that bodybuilders’ muscles will turn to fat if they stop working out. It just isn’t so.

The only time you might want to say "I must have the cameras 2.5 inches apart" is if you were, for example, photographing a man who is 6 feet tall. His feet are at the bottom of the frame, his head is near the top, and you plan on projecting the images so that the man appears 6 feet tall on the screen. In other words, you are trying to replicate the image accurately in all three dimensions.

I know, that sounds like exactly what you want to do, and it seems like I just proved the point for needing to have the cameras 2.5 inches apart. But here's the thing: we as filmmakers are not going to shoot only master shots of 6-foot-tall people from head to toe, and we are not going to show them on a screen that accurately keeps their height at 6 feet. We shoot everything, from close-ups of a bug to wide panoramics of mountains in the distance. We do not maintain scale within our frame. A bug on the back of someone's hand could fill the screen, or an elephant could be a speck in the corner of the screen.
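The scale argument can be put in rough numbers. The sketch below is my own illustration of the reasoning, not a formula from the article (real interaxial choices also depend on lens, screen size, and convergence): if the camera separation should scale with the subject the way 2.5 inches relates to a 6-foot person, then a fixed 2.5 inches is far too wide for the bug and far too narrow for the mountains.

```python
# Illustrative rule-of-thumb arithmetic (an assumption for this sketch,
# not a production stereography formula).
EYE_SEPARATION_IN = 2.5   # human interocular distance, inches
HUMAN_HEIGHT_IN = 72.0    # the 6-foot man from the example

def scaled_interaxial(subject_size_in):
    """Scale the camera separation by the ratio of the subject's size
    to the 6-foot reference figure."""
    return EYE_SEPARATION_IN * (subject_size_in / HUMAN_HEIGHT_IN)

scaled_interaxial(72.0)      # the 6-foot man: 2.5 inches, as expected
scaled_interaxial(0.5)       # a half-inch bug filling the frame: a tiny fraction of an inch
scaled_interaxial(120000.0)  # distant mountains: vastly wider than 2.5 inches
```

The point is not the specific numbers but that the appropriate separation swings across orders of magnitude with the shot, which is why no single figure can serve as a rule.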

This is why there is no value in limiting our camera setup to 2.5 inches.