Queen Elizabeth 3D

Me, Mark Bone, speaking with the Queen after our 3D shoot
Back during the summer I had the honour of working on a live 3D presentation for Her Majesty, Queen Elizabeth. The Queen made a royal visit to Canada this summer, during which she spent half a day at Pinewood Studios in Toronto. In honour of her visit, a live dance inspired by the Jai Ho dance sequence from Slumdog Millionaire was performed. Genie Award-winning and Academy Award-nominated Indian-born Canadian film director and screenwriter Deepa Mehta directed the dance with a War of 1812 twist. The event was hosted and coordinated by the SIRT Centre (Screen Industries Research and Training) in association with Sheridan College. I had the pleasure of working with cinematographer Doug Koch to determine the 3D parameters of the project. It was one of his first 3D projects, but Koch quickly picked up on the nuances of stereoscopic cinematography. Although there was no official stereographer on set, John Harper, a convergence puller and rig tech out of the Vince Pace camp, and I took it upon ourselves to control the 3D parameters of the rig. In the days leading up to the event I used Lenny Lipton's math to determine my positive and negative parallax values.

Lenny Lipton's Stereoscopic Convergence Math
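
For anyone curious what that math looks like in practice, here is a rough Python sketch of the Lipton-style screen-parallax relationship I was working from. The lens, sensor, screen and distance values below are illustrative placeholders, not the numbers from the shoot.

```python
# A rough sketch of the Lipton-style parallax check, with made-up values.

def screen_parallax_mm(interaxial_mm, focal_mm, sensor_width_mm,
                       screen_width_mm, convergence_m, subject_m):
    """Approximate on-screen parallax for a subject at `subject_m` metres.

    Positive result = positive parallax (behind the screen plane),
    negative result = negative parallax (in front of the screen).
    """
    magnification = screen_width_mm / sensor_width_mm
    return (magnification * focal_mm * interaxial_mm
            * (1.0 / (convergence_m * 1000.0) - 1.0 / (subject_m * 1000.0)))


if __name__ == "__main__":
    # Assumed example: a wide lens, 2/3" sensor, 10 m wide screen.
    p_near = screen_parallax_mm(38, 8, 9.6, 10000, convergence_m=4, subject_m=2)
    p_far = screen_parallax_mm(38, 8, 9.6, 10000, convergence_m=4, subject_m=30)
    print(f"near subject: {p_near:.1f} mm (negative = in front of the screen)")
    print(f"far subject:  {p_far:.1f} mm (positive = behind the screen)")
```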

The 1812 3D presentation for the Queen was captured on the Stereo Tango 3D rig with two Sony HDC-P1 cameras on a 30-foot technocrane. The Tango mirror rig, designed by Sebastien Laffoux, was equipped with an Arri remote follow focus system modified to allow remote, on-the-fly convergence and interaxial distance adjustment, as well as control of the left camera's tilt. This was very convenient for us as the convergence and I/O pullers, since it gave us the ability to change the cameras' interaxial distance and convergence point while the rig was hoisted up into the air on the technocrane.

The left and right camera feeds were sent from the Tango rig to the Sony MPE-200 processor over a pair of SDI cables. The 3D depth budget was decided ahead of time based on the size of the screen the images would be projected onto. With the screen size known, the depth bracket was set: the positive parallax limit (the divergence, or wall-eye, limit) was set to 0.8% of screen width, and the negative parallax limit to 2%. This allowed for comfortable 3D that wasn't overly conservative. The Sony MPE-200 allows intuitive monitoring of these values, displaying a waveform that depicts how much of the image is in negative parallax and how much is in positive parallax. The operator can simply watch this waveform to determine whether the image is within the depth budget. There are also other visual cues within the MPE-200's interface that notify the operator whether these limits are being adhered to.
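
As a sanity check on those numbers, the budget works out to simple fractions of screen width. Here is a minimal sketch, assuming a 10 m wide projection screen; the actual screen width at the event isn't something I have in front of me, so treat it as an example.

```python
# A minimal sketch of the depth-budget bookkeeping described above.

SCREEN_WIDTH_M = 10.0    # assumed projection screen width, not the real figure
POSITIVE_LIMIT = 0.008   # 0.8% of screen width (behind the screen)
NEGATIVE_LIMIT = 0.020   # 2% of screen width (in front of the screen)

def within_depth_budget(parallax_m):
    """Return True if a signed on-screen parallax value (metres) fits the budget."""
    fraction = parallax_m / SCREEN_WIDTH_M
    if fraction >= 0:
        return fraction <= POSITIVE_LIMIT
    return -fraction <= NEGATIVE_LIMIT

# e.g. a background element sitting 7 cm behind the screen plane:
print(within_depth_budget(0.07))    # True  (0.7% < 0.8%)
# and a foreground element popping 25 cm off the screen:
print(within_depth_budget(-0.25))   # False (2.5% > 2%)
```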

Anaglyph view from high above on the technocrane (taken during rehearsals, before the temporal disparity was corrected)

The MPE-200 will also correct minor alignment issues, which is helpful for fixing sub-pixel misalignments. Within the alignment interface is the option to flip and flop the individual cameras, which turned out to be an invaluable asset to the production. Since the cameras were mounted in a mirror rig, the vertical camera (the right camera) has to have a flip applied to its image. The crew initially applied this flip in camera, which introduces a single-frame delay. Any sort of temporal delay is detrimental to 3D and damages the effectiveness of the stereo image; it also severely diverges the entire image whenever the camera pans to the right.

Luckily the MPE-200 processor allows the operator to flip and flop the camera images without introducing any delay. Once this issue was corrected, the 3D became much more enjoyable.
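
To put a rough number on why that single frame matters, here is a back-of-the-envelope sketch: whatever the image shifts between frames during a pan shows up as false disparity on the delayed eye. The frame rate, resolution and field of view below are assumed values, and the linear degrees-to-pixels conversion is only an approximation.

```python
# Rough estimate of the false disparity introduced by a one-frame delay
# on one eye while the camera is panning. All figures are assumptions.

FRAME_RATE = 29.97          # assumed HD frame rate
IMAGE_WIDTH_PX = 1920
HORIZ_FOV_DEG = 60.0        # assumed wide-angle horizontal field of view

def false_disparity_px(pan_deg_per_sec):
    """Horizontal disparity (pixels) added by a one-frame delay on one eye."""
    deg_per_frame = pan_deg_per_sec / FRAME_RATE
    return deg_per_frame * (IMAGE_WIDTH_PX / HORIZ_FOV_DEG)

# Even a gentle 10 degree-per-second pan adds roughly 10 pixels of uniform
# disparity across the whole frame, easily blowing a tight depth budget.
print(f"{false_disparity_px(10.0):.1f} px")
```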

Because the cabling from the cameras had to run all the way from the top of the technocrane into multiple 3D monitors, processors and decks, it became crucial that the cables were distinctly marked to differentiate the left and right camera. There were a few occasions early on where the cables had been swapped, which caused the image to become diverged and unwatchable.

The Queen's visit was an ideal 3D capture situation for many reasons. It was shot with a great deal of light, providing a large depth of field, which is often most enjoyable for audiences as it allows them to look deep into the background as well as the foreground, in 3D.

The single, extensive camera move was thoroughly rehearsed, which gave the stereoscopic team ample time to prepare for any possible stereo window violations and uncomfortable 3D moments. It also allowed us to fully rehearse the I/A and convergence pulling, so we could carefully set our marks and ensure the interaxial distance and convergence point provided the best possible 3D at all of the crucial moments.

The photo below shows how we coped with camera moves that at times came as close as 2 ft to some objects.

Convergence on pole
Because it would be uncomfortable to have the pole coming far out in negative parallax (jumping out, off the screen) while it sits on the edge of frame for a while, we decided to lower the interaxial distance between the cameras and converge on the pole, so that to viewers the pole appears to sit at the theatre screen itself, as if they were looking through a window. This is typically more comfortable than having half of an object floating in the middle of the room. Our brains accept occluded objects much more easily when they appear to be behind the screen than when they appear to be floating in the air in front of it.
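
In terms of the parallax relationship sketched earlier, converging on the pole simply drives its screen parallax to zero, placing it on the screen plane. A quick illustration, with made-up distances rather than the ones used on the day:

```python
# When the convergence distance equals the subject distance, the
# (1/C - 1/Z) term goes to zero, so the pole lands on the screen plane.

def parallax_sign_term(convergence_m, pole_m):
    return 1.0 / convergence_m - 1.0 / pole_m   # sign of the screen parallax

print(parallax_sign_term(2.0, 2.0))   # 0.0  -> pole sits on the screen plane
print(parallax_sign_term(4.0, 2.0))   # < 0  -> pole pushed out into the room
```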

Although the lenses (Fujinon) were capable of zooming, the aesthetic choice was to leave them locked at a fairly wide angle, which is conducive to more immersive 3D. This also avoids the misalignment issues that often occur when zooming two separate cameras in 3D. The lenses tended to breathe a lot when the focus ring was adjusted, so to avoid any discontinuity between the cameras, they were left focused at the best hyperfocal distance for the scene.
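
For reference, here is a quick hyperfocal calculation of the kind that informs that choice. The focal length, stop and circle of confusion below are assumptions for illustration, not the settings used on the day.

```python
# Hyperfocal sanity check with assumed values (wide lens, 2/3" sensor).

def hyperfocal_m(focal_mm, f_number, coc_mm):
    """Hyperfocal distance in metres: focus here and everything from
    roughly half this distance to infinity stays acceptably sharp."""
    return (focal_mm ** 2 / (f_number * coc_mm) + focal_mm) / 1000.0

# e.g. an 8 mm setting at f/5.6 with a ~0.011 mm circle of confusion:
h = hyperfocal_m(8, 5.6, 0.011)
print(f"hyperfocal: {h:.2f} m, near limit: {h / 2:.2f} m")
```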
