Just when you thought HD was the next big thing, then comes 3D HD

3D HD is THE next big thing. And it comes to sports and music first. The NBA (the National Basketball Association in the USA) plans to shoot some of its games in 3D HD. In a session called “Winning Ways to Wow the Sports Broadcast Viewer” (must be an advertising guy who came up with that title…), the NBA and Pace will show the All-star game in 3D HD.

Specially invited guests saw the NBA All-star game in 3D HD on February 18th, and now attendees at the NAB exhibition in Las Vegas get to see the game and hear about the advanced technology behind it.

“Thomp” at NowPublic saw the All-star game, and is convinced this is the new thing:

…made me feel like I paid 6,000 dollars to sit among the stars on the floor of the hardwood court. The up-close FUSION 3D HD action made me forget I was miles away in a theater wearing 3D glasses, and the sound was as if I were sitting courtside. I felt so close that when Jay-Z bent over to whisper to Beyonce, I wanted to tap his shoulder and say she’s with me! Action seemed so close that when a loose ball flew into the crowd, several people in the theatre (including me) put our hands out to catch it!

He also has some thoughts on the future of the system:

In summary, I can see this technology being a big hit in the future, especially for common folk like me who cannot pay the large sums of money to partake in these events live. PACE should not limit itself to major sports events like the Super Bowl or the NBA. This technology can be used to see high-priced concerts or highly publicized Las Vegas shows. This can be another viable revenue stream for artists and promoters. Instead of booking a plane ticket and a hotel, you can jump in your car, drive to your local IMAX theater and save a couple of thousand dollars.

And – saving the planet while having a great time, I might add.

How is it done?

The technology behind this is exciting. James Cameron is shooting both his forthcoming movies with the same 3D cameras. The $200 million “Avatar” (2008) and “Battle Angel” (2009), the movie adaptation of the manga series “Battle Angel Alita”, are both shot in 3D HD. Cameron and Vince Pace are the founders of Pace 3D technologies, which makes the special versions of the camera. Vince Pace is also second-unit director of photography on both movies.

The cameras are customized versions of the US$115,000 Sony HDC-950, mounted in a specially designed rig. The camera sensors are placed 70mm apart to capture left-eye and right-eye imagery separately. Pace is also developing the camera system to include the newer Sony HDC-1500 HD cameras.

NBA used six Sony cameras to capture the All-star game:

These camera feeds were distributed via fiber to a portable HD fly-pack system (provided by Bexel) in the arena that included a Sony MVS 8000A switcher and an EVS XT[2] server.

The Sony switcher includes a feature that allowed the director to aggregate multiple camera feeds and lock them together. The director then switched the multicamera production as he would a typical game broadcast, but to get the full 3-D effect, he often held on shots longer than usual. Instant replays and graphics were also presented in 3-D, using the EVS server.

The output of the switcher (two uncompressed HD signals at about 3Gb/s) was sent to the Mandalay Bay ballrooms via fiber cabling. The larger ballroom was set up in a stadium-seating configuration, on risers to give the full effect. The smaller space was standing room only. Images were displayed with two stacked Sony SXRD 4K projectors in each room on 47ft and 30ft screens. The projectors were fitted with a special polarizing filter supplied by 3-D specialists Real D. Audience members wore special polarizing glasses to get the full 3-D effect.
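That “about 3Gb/s” figure checks out with some back-of-the-envelope arithmetic. This is my own sketch, not from Sony’s presentation: I assume each eye is a standard 1080i60, 10-bit 4:2:2 stream as carried on an HD-SDI link.

```python
# Back-of-the-envelope check (my assumption: 1080i60, 10-bit 4:2:2 video,
# as carried on a standard HD-SDI link) of the "about 3Gb/s" figure
# for the two uncompressed HD signals leaving the switcher.

WIDTH, HEIGHT = 1920, 1080
BITS_PER_SAMPLE = 10
SAMPLES_PER_PIXEL = 2      # 4:2:2: one luma sample + half of each chroma
FRAMES_PER_SECOND = 30     # 1080i at 60 fields/s = 30 full frames/s

active_bitrate = (WIDTH * HEIGHT * SAMPLES_PER_PIXEL
                  * BITS_PER_SAMPLE * FRAMES_PER_SECOND)
print(f"active video per eye: {active_bitrate / 1e9:.2f} Gb/s")  # ~1.24 Gb/s

# The serial link also carries the blanking intervals, so each stream
# runs at the standard HD-SDI line rate of 1.485 Gb/s.
HDSDI_LINE_RATE = 1.485e9
total = 2 * HDSDI_LINE_RATE
print(f"two eyes on the wire: {total / 1e9:.2f} Gb/s")           # ~2.97 Gb/s
```

Two full-bandwidth streams instead of one is exactly why the fiber runs to the ballrooms, rather than ordinary coax, make sense here.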

Sports and concerts

Both Marilyn Manson and Gwen Stefani have used this new technology for music videos, and Universal Music Group’s Interscope Records has signed a deal with Pace to use the technology for concerts. Expect to see Dr. Dre, Eminem, U2, Gwen Stefani, 50 Cent, Sheryl Crow and the Pussycat Dolls doing their thing in 3D at your local cinema soon.

That is, if your local movie theatre can show 3D. Only 700 of 37,000 US movie theatres have 3D projection screens (I have not been able to find numbers for the rest of the world). But Sony does, of course, want to change that:

“We are ready to roll into any theater with the two-projector system.”

…said John Kaloukian, director of Sony Electronics’ professional display group, to Reuters (note: some Reuters articles only stay up for three weeks, so this link may go dead after some time).

Most of the 700 movie screens with 3D projection are supplied by Real D, which even has a blog with interesting articles like this one about “Composing for Stereo: The Filmmaker’s Point of View”:

The other concern I call the strength of the stereoscopic image. That is determined by two interrelated factors. One of them is completely new to the stereoscopic cinema, and has no direct counterpart in the planar cinema – and that is the distance between the spacing of the cameras or the camera lenses. Whether we’re shooting live action, or we’re in a CGI virtual space, the distance between the camera’s lenses is a critical factor. We’re going to call whatever we’re shooting with a stereo camera. A stereo camera, unlike a conventional camera, captures two perspective viewpoints. So we’re not going to refer to stereoscopic cameras. We’re going to call it a stereoscopic camera, and we’re going to say it has two lenses – a left lens and a right lens.

The distance or the spacing between the two lenses is called the interaxial spacing – that is, the distance between the lens axes. If you think about it, if the lenses are superimposed – in other words, if the axes had zero spacing – you’re shooting a planar movie. The farther apart the lenses go, the deeper the image looks. The use of this control is closely related to the focal length you use. Wide angle lenses tend to stress perspective, because objects that are closer to the lens appear to be proportionately larger, and the background appears to be smaller. The stereoscopic depth sense, which is technically known as stereopsis, is weighted or scaled by extra-stereoscopic cues – that is, by non-stereoscopic or monocular cues. One of the strongest of these is perspective – and perspective is often determined by the choice of focal length. So it turns out that with wide angle lenses you can use a reduced interaxial, and for telephoto lenses you can use a larger interaxial.
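The quoted point about focal length and interaxial spacing can be made concrete with the simplest parallel-camera disparity model. This sketch is mine, not from the Real D article: the formula d = f·t/Z and all lens and distance figures below are assumptions chosen for illustration.

```python
# Illustrative sketch of a parallel-camera stereo model (my assumption,
# not from the article): on-sensor disparity d = f * t / Z, where f is
# the focal length, t the interaxial spacing, and Z the point's distance.

def disparity_mm(focal_mm, interaxial_mm, distance_mm):
    """Horizontal on-sensor disparity of a point at distance_mm."""
    return focal_mm * interaxial_mm / distance_mm

INTERAXIAL = 70.0  # mm, as in the Pace rig

# Wide-angle close-up: 25 mm lens, subject at 2 m, background at 10 m.
wide_subject = disparity_mm(25, INTERAXIAL, 2_000)
wide_background = disparity_mm(25, INTERAXIAL, 10_000)

# Telephoto with the same subject framing: 200 mm lens, subject at 16 m,
# background at 24 m.
tele_subject = disparity_mm(200, INTERAXIAL, 16_000)
tele_background = disparity_mm(200, INTERAXIAL, 24_000)

# Same subject disparity (same framing), but the telephoto shot has much
# less disparity *spread* between subject and background, so its depth
# looks flatter -- hence a longer lens calls for a larger interaxial.
print(wide_subject - wide_background)  # 0.70 mm of depth range
print(tele_subject - tele_background)  # ~0.29 mm of depth range
```

With identical framing the subject’s disparity comes out the same in both shots, but the telephoto compresses the disparity range between foreground and background to less than half; that flattening is what the larger interaxial on long lenses is compensating for.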

New tools = new ways of thinking! This may also be the technology that takes people back to the cinemas. If you have a 50″ plasma/LCD screen at home, a Blu-ray or HD-DVD player, a great chair and your kitchen nearby – why go to a movie theatre? With 3D HD you have a new reason, as I don’t think there will be a home version of the Sony SXRD 4K projectors anytime soon. Not to mention the players delivering the actual movies.


I’m curious: how do you edit this? HD with two separate cameras, one for each eye? What are your options for post-processing? Do you need special software, or do you edit in “normal” tools like FCP or Avid systems? Feel free to comment below if you know!


Camcorderinfo.com made a video from the Sony/NBA event at NAB, as seen below (you may have to click through to brilliantdays.com/3d-hd to see the video if you read this in a newsreader).