Skitch is a brand new app for uploading pictures. “Heh, I need an app for that??”, you might say. Yes, you do. Because Skitch does it in ways you wouldn’t believe. It looks amazing, behaves like it has a Nobel Prize brain, and the icon is sweet.

Check out the three-minute intro video, which says it all:

Then go grab the semi-public beta from Colin Devroe, or just wait a few hours until it goes into public beta.

Final Cut Pro wipes

I have used Final Cut Pro since version 1.0 in 1999. In those eight years I have used wipes two or three times. Wipes are ugly.

Heh. But today a friend of mine called and asked:

What do I buy if I want the most crazy wipes out there for Final Cut Pro?

Fair enough. He makes a children’s program, so he DOES need them. It’s part of the deal to make wacky transitions every 6 seconds.

So I made a list of some interesting things:

The Final Cut Pro wipes list

GeniusDV has a tutorial called “Utilizing Gradient Wipes in Final Cut Pro”. It shows how to use a matte image as the source for crazy wipes in FCP (there’s a small sketch of the principle after this list).

CHV Electronics has a collection of plugins for FCP. One of them is called “The AlphaWipe-collection V1.0”. I have never tried it, but it looks like you get a lot for your $.

CGM also has lots of FCP plugins, and many of them have crazy transitions. Have a look at the demo movies (links are on the top of the page).

Mattias in Sweden makes a great bundle of free FCP plugins, available for download from his site. They are not wipes, but they are free and should go into any FCP setup. Donate him some PayPal money if you use them!

Digital Heaven in the UK has some great-quality FCP plugins. I use DH_Subtitle, a subtitling plugin, myself. And when it comes to transitions, they have the nice DH_WhipPan. The plugins are cheap, and they are great quality.

And finally: Toolfarm has a list of FCP plugins that do transitions.
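As promised above, here is the gradient-wipe idea in a nutshell: the matte’s brightness decides when each pixel flips from the outgoing clip to the incoming one, with a soft edge around a threshold that sweeps from black to white. This is a minimal sketch of the general principle in Python/NumPy – my own illustration, not how FCP or the GeniusDV tutorial implements it, and the function and parameter names are made up:

```python
import numpy as np

def gradient_wipe(frame_a, frame_b, matte_gray, progress, softness=0.05):
    """Reveal frame_b over frame_a using a grayscale matte.

    progress: 0.0 (all frame_a) -> 1.0 (all frame_b).
    Darker areas of the matte transition first; softness controls the
    width of the blended edge around the sweeping threshold.
    """
    m = matte_gray.astype(np.float32) / 255.0             # matte to 0..1
    threshold = progress * (1.0 + softness)                # sweeps past 1.0 so the wipe completes
    mask = np.clip((threshold - m) / softness, 0.0, 1.0)   # 1 where frame_b is fully revealed
    mask = mask[..., None]                                  # broadcast over the color channels
    out = frame_b.astype(np.float32) * mask + frame_a.astype(np.float32) * (1.0 - mask)
    return out.astype(frame_a.dtype)
```

Feed it a radial, diagonal or hand-painted gradient as the matte and you get exactly the kind of custom “crazy” wipe the tutorial builds, driven by nothing more than a still image.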

Do you have any suggestions? Things that actually work?

Final Cut Studio 2

In October 2006 I wrote:

I think the new version of Motion will be Motion on steroids. Apple will put lots of the stuff Shake can do inside Motion, and also include the great color tools from Final Touch. I’m not even sure they will continue FinalTouch as a separate product. It depends on whether they manage to get a workflow for colorists inside Motion.

Yesterday Apple showed its biggest upgrade to Final Cut Pro ever: Final Cut Studio 2, with new versions of just about everything (except LiveType).

Let me start off with the single most important thing to me as a video editor: the Open Format Timeline. With this new feature in FCP, I can drop all my most-used video formats into the same timeline… (drum roll) with no rendering! I’m going to save hours and hours with this feature alone. These are the formats it supports:

Open Format Timeline - formats supported

FinalTouch = Color

Apple bought FinalTouch last autumn, and now they are throwing it into Final Cut Studio 2 – for free! Yes, it’s part of the package. You pay either $1299 for a new package, $499 to upgrade from Final Cut Studio, or $699 to upgrade from any previous version of FCP – including version 1 from 1999. That’s a very good deal. Applause for Apple, who have sold more than 800,000 copies of Final Cut Pro so far and will sell tons more with this upgrade.

Have a look at the new features of FinalTouch, now called Color.

I’ll have a full rundown of all important new features of the new Final Cut Studio 2 package soon.

Just when you thought HD was the next big thing, along comes 3D HD

3D HD is THE next big thing. And it comes to sports and music first. The NBA (the National Basketball Association in the USA) plans to shoot some of its games in 3D HD. In a session called “Winning Ways to Wow the Sports Broadcast Viewer” (must have been an advertising guy who came up with that title…), the NBA and Pace will show the All-Star game in 3D HD.

Specially invited guests saw the NBA All-Star game in 3D HD on February 18th, and now attendees at the NAB exhibition in Las Vegas get to see the game and hear about the advanced technology behind it.

“Thomp” at NowPublic saw the All-Star game and is convinced this is the next big thing:

…made me feel like I paid 6,000 dollars to sit among the stars on the floor of the hardwood court. The up-close FUSION 3D HD action made me forget I was miles away in a theater wearing 3D glasses, and the sound was as if I was sitting at courtside. I felt I was so close that when Jay-Z bent over to whisper to Beyonce, I wanted to tap his shoulder and say she’s with me! Action seemed so close that when a loose ball flew into the crowd, several people in the theatre (including me) put our hands out to catch it!

He also has some thoughts on the future of the system:

In summary, I can see this technology being a big hit in the future, especially for common folk like me who cannot pay the large sums of money to partake in these events live. PACE should not limit itself to major sports events like the Super Bowl or the NBA. This technology can be used to see high-priced concerts or highly publicized Las Vegas shows. This can be another viable revenue stream for artists and promoters. Instead of booking a plane ticket and a hotel, you can jump in your car, drive to your local IMAX theater and save a couple of thousand dollars.

And – saving the planet while having a great time, I might add.

How is it done?

The technology behind this is exciting. James Cameron is shooting both his forthcoming movies with the same 3D cameras: the $200 million “Avatar” (2008) and the movie adaptation of the manga series “Battle Angel Alita”, called “Battle Angel” (2009), are both shot in 3D HD. Cameron and Vince Pace are the founders of Pace 3D technologies, which makes the special versions of the cameras. Vince Pace is also second-unit director of photography on both movies.

The cameras are customized versions of the US$115,000 Sony HDC-950, in a specially designed rig. The camera sensors are placed 70mm apart to capture left-eye and right-eye imagery separately. Pace is also developing the camera system to include the newer Sony HDC-1500 HD cameras.

The NBA used six Sony cameras to capture the All-Star game:

These camera feeds were distributed via fiber to a portable HD fly-pack system (provided by Bexel) in the arena that included a Sony MVS 8000A switcher and an EVS XT[2] server.

The Sony switcher includes a feature that allowed the director to aggregate multiple camera feeds and lock them together. The director then switched the multicamera production as he would a typical game broadcast; but, to get the full 3-D effect, he often held on shots longer than usual. Instant replays and graphics were also presented in 3-D, using the EVS server.

The output of the switcher (two uncompressed HD signals at about 3Gb/s) was sent to the Mandalay Bay ballrooms via fiber cabling. The larger ballroom was set up in a stadium-seating configuration, on risers to give the full effect. The smaller space was standing room only. Images were displayed with two stacked Sony SXRD 4K projectors in each room on 47ft and 30ft screens. The projectors were fitted with a special polarizing filter supplied by 3-D specialists Real D. Audience members wore special polarizing glasses to get the full 3-D effect.
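A quick sanity check on that “about 3Gb/s” figure, assuming the signals are standard 1080i 4:2:2 10-bit HD-SDI (1.485 Gb/s per link) – the article doesn’t spell out the exact format, so treat this as a back-of-the-envelope estimate:

```python
# Rough data-rate estimate for one uncompressed HD stream.
# Assumptions (mine, not from the article): 1920x1080 active pixels,
# 30 full frames/s (1080i60), 10 bits per sample, 4:2:2 chroma
# (2 samples per pixel: luma plus a half-rate chroma pair).

active_payload_gbps = 1920 * 1080 * 30 * 2 * 10 / 1e9
print(f"Active payload per eye:   ~{active_payload_gbps:.2f} Gb/s")   # ~1.24 Gb/s

# The nominal HD-SDI link rate also carries blanking: a 2200 x 1125 total raster.
link_rate_gbps = 2200 * 1125 * 30 * 2 * 10 / 1e9
print(f"HD-SDI link rate per eye:  {link_rate_gbps:.3f} Gb/s")        # 1.485 Gb/s
print(f"Two eyes:                 ~{2 * link_rate_gbps:.2f} Gb/s")    # ~2.97 Gb/s, i.e. 'about 3 Gb/s'
```

So “about 3Gb/s” is simply two full HD-SDI links, one per eye, running side by side.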

Sports and concerts

Both Marilyn Manson and Gwen Stefani have used this new technology for music videos, and Universal Music Group’s Interscope Records has signed a deal with Pace to use the technology for concerts. Expect to see Dr. Dre, Eminem, U2, Gwen Stefani, 50 Cent, Sheryl Crow and the Pussycat Dolls doing their thing in 3D at your local cinema soon.

That is, if your local movie theatre can show 3D. Only 700 of 37 000 US movie theatres have 3D projection screens (I have not been able to find numbers for the rest of the world). But Sony, of course, wants to change that:

“We are ready to roll into any theater with the two-projector system.”

…said John Kaloukian, director of Sony Electronics’ professional display group, to Reuters (note: some Reuters articles only stay up for three weeks, so this link may go dead after a while).

Most of the 700 movie screens with 3D projection are delivered by Real D, which even has a blog with interesting articles like this one about “Composing for Stereo: The Filmmaker’s Point of View”:

The other concern I call the strength of the stereoscopic image. That is determined by two interrelated factors. One of them is completely new to the stereoscopic cinema, and has no direct counterpart in the planar cinema – and that is the distance between the spacing of the cameras or the camera lenses. Whether we’re shooting live action, or we’re in a CGI virtual space, the distance between the camera’s lenses is a critical factor. We’re going to call whatever we’re shooting with a stereo camera. A stereo camera, unlike a conventional camera, captures two perspective viewpoints. So we’re not going to refer to stereoscopic cameras. We’re going to call it a stereoscopic camera, and we’re going to say it has two lenses – a left lens and a right lens.

The distance or the spacing between the two lenses is called the interaxial spacing – that is, the distance between the lens axes. If you think about it, if the lenses are superimposed – in other words, if the axes had zero spacing – you’re shooting a planar movie. The farther apart the lenses go, the deeper the image looks. The use of this control is closely related to the focal length you use. Wide angle lenses tend to stress perspective, because objects that are closer to the lens appear to be proportionately larger, and the background appears to be smaller. The stereoscopic depth sense, which is technically known as stereopsis, is weighted or scaled by extra-stereoscopic cues – that is, by non-stereoscopic or monocular cues. One of the strongest of these is perspective – and perspective is often determined by the choice of focal length. So it turns out that with wide angle lenses you can use a reduced interaxial, and for telephoto lenses you can use a larger interaxial.
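To get a feel for the interaxial/depth relationship described above, here is a small, heavily simplified model in Python. The formula is the standard textbook approximation for a parallel-lens stereo rig that is “converged” by shifting the two images horizontally; the numbers and parameter names are my own illustration, not anything specific to the Pace rig or Real D’s tools.

```python
def screen_parallax_mm(interaxial_mm, focal_mm, sensor_width_mm, screen_width_mm,
                       convergence_m, object_m):
    """Approximate on-screen parallax for a parallel-lens stereo rig that is
    'converged' by shifting the two images horizontally.

    Simplified pinhole model (an assumption for illustration, not a production
    formula): parallax ~= M * f * t * (1/Zc - 1/Z), where M is the screen/sensor
    magnification, f the focal length, t the interaxial, Zc the convergence
    distance and Z the object distance. Positive = behind the screen,
    negative = in front of it; zero at the convergence plane.
    """
    magnification = screen_width_mm / sensor_width_mm
    z_mm = object_m * 1000.0
    zc_mm = convergence_m * 1000.0
    return magnification * focal_mm * interaxial_mm * (1.0 / zc_mm - 1.0 / z_mm)

# Example: 70 mm interaxial (the spacing mentioned for the Pace rig), a 25 mm lens
# on a roughly 9.6 mm wide 2/3" sensor, a 9 m wide screen, converged at 10 m.
for distance_m in (5, 10, 20, 50):
    p = screen_parallax_mm(70, 25, 9.6, 9000, convergence_m=10, object_m=distance_m)
    print(f"object at {distance_m:>3} m -> screen parallax {p:+7.1f} mm")
```

Playing with the parameters shows how directly the interaxial and the lens scale the on-screen depth: background parallax grows toward M·f·t/Zc, so screen size, focal length and interaxial all have to be balanced together, which is exactly what the quote is getting at.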

New tools = new ways of thinking! This may also be the technology that takes people back to the cinema. If you have a 50″ plasma/LCD screen at home, a Blu-ray or HD DVD player, a great chair and your kitchen nearby – why go to a movie theatre? With 3D HD you have a new reason, as I don’t think there will be a home version of the Sony SXRD 4K projector anytime soon. Not to mention the players delivering the actual movies.


I’m curious: how do you edit this? HD with two separate cameras, one for each eye? What are your options for post-processing? Do you need special software, or do you edit in “normal” tools like FCP or Avid systems? Feel free to comment below if you know!

Update: There’s a video from the Sony/NBA event at NAB, as seen below (you may have to click through to see the video if you read this in a newsreader).

Blackmagic Disk Speed Test showing slower results

We have been installing a new Xsan system over the last few days (more about that in a later post). When testing the performance we used two different tools, and they gave very different results: 186 MB/second vs. 136 MB/second.

The first one is Xsan Tuner, which comes with Xsan:

Xsan tuner

Speed is measured at 186 MB/second.

We use DeckLink Extreme HD cards in the Macs (10-bit uncompressed SDI). Blackmagic also has its own test tool for disk speeds, the Blackmagic Disk Speed Test:

BlackMagic Disk Speed Test

Now, the same disks show 136 MB/second.

Which one to trust? Blackmagic makes dead-solid products, but on this one I hope their tool is showing the wrong result…

If you need to test your drives for speed, whether you use Blackmagic cards or not, download the DeckLink drivers for Mac OS X or Windows XP, and you’ll get the Blackmagic Disk Speed Test application when you install them. If you don’t need the actual drivers, just set the app aside and uninstall the drivers again.

Xsan Tuner can be downloaded from Apple’s site.

Why the difference?

Do you have any idea why the two apps show different speeds? The jump from 136 MB/second to 186 MB/second is almost two more streams of uncompressed video in real time. Less rendering, more done.
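Here’s the rough arithmetic behind that claim, assuming uncompressed 10-bit 4:2:2 standard definition (PAL, 720x576 at 25 fps) at roughly 26 MB/second per stream – adjust the assumed stream rate for your own format:

```python
# Rough stream-count arithmetic for the two measured throughputs.
# Assumed stream: uncompressed 10-bit 4:2:2 SD PAL, 720x576 at 25 fps,
# 2 samples per pixel at 10 bits each -> about 26 MB/s.

stream_mb_s = 720 * 576 * 25 * 2 * 10 / 8 / 1e6       # ~25.9 MB/s per stream

for label, throughput in (("Blackmagic Disk Speed Test", 136),
                          ("Xsan Tuner", 186)):
    print(f"{label:26s}: {throughput} MB/s ~= {throughput / stream_mb_s:.1f} streams")

difference = 186 - 136
print(f"Difference: {difference} MB/s ~= {difference / stream_mb_s:.1f} extra uncompressed streams")
```

About 5.2 streams versus 7.2 streams, so the gap really is almost two more real-time streams.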

Multitouch will revolutionize your computer

Giles Turnbull at O’Reilly has a short update on Jeff Han, who makes the amazing multitouch interface. Jeff has founded the company Perceptive Pixel. The website is just a front page (with lamp graphics in multitouch) and not much else.

O’Reilly also has this video, which shows how much cooler multitouch has become in just a year. Go back to my original multitouch post and have a look at the video there. Now, multitouch is a whole wall.

(Click through to see the video.)


Aperture to Final Cut Pro

I love it when someone solves a problem just before I need it solved! This happened yesterday. Eirik and I have been looking for the ultimate photo archive solution. I have been using iPhoto since I got my first digital camera some years ago. iPhoto is a great product, but it has been missing some more advanced features, like a better tagging system and more advanced editing.

Aperture solves that, and so far I’m very happy with version 1.5.

And just the other day I was putting “figure out how to export lots of pictures from Aperture to FCP quickly and easily” on my to-do list. Then along comes “Aperture to Final Cut Pro” from Connected Flow. Check. Great work, Fraser! There’s a dead-easy manual online, and the price is nice (read: free).

Aperture to Final Cut Pro menu

Also check out the excellent FlickrExport for iPhoto and Aperture.

Final Print

Final Print is a standalone application which prints a list of markers from a Final Cut Pro sequence. This provides a very useful workflow enhancement when handing off a project to someone else for further work.
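Final Print obviously does this with far more polish, but as a rough illustration of the hand-off workflow, here is a minimal sketch that turns a marker list into something printable. It assumes you have already gotten the markers out of FCP as a tab-delimited text file with name, timecode and comment columns – the exact export route and column order depend on your FCP version, so this is illustrative only:

```python
import csv
import sys

def print_marker_list(path):
    """Read a tab-delimited marker export (assumed columns: name, timecode,
    comment) and print a simple hand-off list."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f, delimiter="\t"):
            if len(row) < 2:
                continue                     # skip blank or malformed lines
            name, timecode = row[0], row[1]
            comment = row[2] if len(row) > 2 else ""
            print(f"{timecode}  {name:<30}  {comment}")

if __name__ == "__main__":
    print_marker_list(sys.argv[1])           # e.g. python markers.py sequence_markers.txt
```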

Final Print itself is an excellent little application from Digital Heaven. I also recommend their Final Cut Pro plugins, of which I own DH_SubTitle. It does what it says: makes subtitles. (I’m so lucky that I live in a European country where we DO NOT DUB foreign movies, but subtitle them. Going to the movies in other European countries (or watching other European channels) is a total nightmare. Dubbing ruins movies…)

Now, back to the subject. Digital Heaven is run by the talented Martin Baker, who has also made the VideoSpace widget, a widget all video editors should have on their Dashboard.


Update: There’s a newer video at “Multitouch will revolutionize your computer”. This is certainly something we’ll see in one form or another in Mac OS X pretty soon.

Apple let the iPhone out of the box today… and wow! It’s a revolution, especially on the UI side.

So now we know where the multi-touch technology that Jeff Han demoed at his TED talk in February 2006 went: into the iPhone.

Just to remind you, here’s the video at Google again:

And for RSS readers: here’s the direct link.

Now, watch how it is used in the iPhone.

Where do you think all the other cool things Jeff demos will go? Obviously 10.5. All MacBooks and MacBook Pros could use the same technology with the touchpad. Or maybe the rumoured new Apple display will have a multi-touch feature similar to the iPhone’s? Just think about it. Apple has already done this now, on a small scale, in the iPhone. Mind-boggling.

Used in Pro apps

Imagine having this way of working in the pro apps: Final Cut Pro, Motion, Logic. Zooming in and out of timelines. Zooming in on video effects. Controlling faders and graphs directly on the screen. And you could use your fingers to draw vectors for movements and do graphics work more easily. It would be so much more organic. Like playing on the computer, using it more like a musical instrument. I would love it.

Four minutes into the video, Jeff moves lots of pictures around with his hands. Imagine using that to organize your content before putting it on the timeline.

The new OS?

What if the new OS X worked like this in the Finder? Zooming into folders, organizing content. Sorting and analyzing. That would certainly be something different from Vista!

So, Eirik, it’s Apple shares you should buy.

The guys behind it are setting up a company and hope to put it into production. Where can I buy shares?

Or rather, should have bought. Apple shares went up 8.3% today.

Eirik also has a nice video of the screen in action at SIGGRAPH 2006.

Update: TED blog

Several blogs write about this, including the TED Blog. Chris Anderson asked Jeff Han about multitouch, and he answered:

The iPhone is absolutely gorgeous, and I’ve always said, if there ever were a company to bring this kind of technology to the consumer market, it’s Apple. I just wish it were a bit bigger so I could really use both of my hands.

Hm. Does it make you any wiser? A bit secretive?

Just to make this a little more fun: seven different computers at Apple read this post a few minutes after it was Dugg… Read into it what you want. I still think 2007 will be an amazing year for Apple.

Update 2: Exciting updates coming?

Now Jeff Han comments on the project page for multitouch:

Update: Yes, we saw the keynote too! We have some very, very exciting updates coming soon- stay tuned!

(via Daring Fireball). Still in doubt that something big is in the works?

Update 3: Fast Company

Fast Company has an article about Jeff Han, written before the iPhone launch:

Not everyone is sold on Han’s idea. Ben Shneiderman, a computer science professor at the University of Maryland and a founding director of the Human-Computer Interaction Lab, calls Han a “great showman” who has “opened the door to exciting possibilities.” But he doesn’t think Han’s technology would be suitable for a large-scale consumer product, nor as useful as a mouse on a large display. If you are standing in front of the screen, Shneiderman wonders, how would people behind you be able to see what you’re doing?

One way, Han counters, is for the demonstrator to simply move his ass out of the way. Another: Use a drafting-table display, as Han did at TED, and project the image on a wall-size screen.

But criticisms like these are a million light years from Han’s mind. We’re in his cluttered and cramped office at NYU. Books line a shelf, and a skein of wires unfurls across the floor. A computer circuit board is half taken apart (he stopped losing screws long ago), and a nearby whiteboard contains blueprints and sketches of the touch screen, plus a clever trick for hacking programming code.

Han is explaining why he formed Perceptive Pixel. “I want to create an environment where I can create technology, get it into the hands of someone to market it, and move on to other technologies so I can keep innovating,” he says. “I want to be a serial entrepreneur: Incubate an idea, get it to a good state, and make that an enabler to get to the next state. It’s every researcher’s fantasy.”

Update 4: FingerWorks

According to both Engadget and Charles Arthur, it is FingerWorks technology that is inside the iPhone. Apple bought their technology some time ago.

If you head over to their site, you’ll see…

Important note!

FingerWorks has ceased operations as a business.

FingerWorks products are no longer available for resale, and no further updates to software drivers will be developed.

The fun thing is that both Jeff Han and FingerWorks call their technology Multitouch.

Update 5: Revolution!

You may want to know how multitouch will revolutionize your computer.

Joe’s filters

If you’re using Final Cut Pro and NOT using Joe’s Filters… spend a few minutes at Joe’s Filters. Joe is about to upgrade the filters, he writes in an e-mail today:

I’m finishing a major upgrade (way, way too long in coming) and starting to look for new ways to make the filters bigger, faster and better.

He also wants you to help with the new filters, so if you have answers to these questions…

  • Which filters do you use most?
  • What formats are you working in? (SD, HD, HDV, DVCPRO, Red, etc)
  • How did you first learn about Joe’s Filters?
  • What other plugins do you use?
  • What kinds of new tools would you like to see? (existing things done better, or things that haven’t been done yet)


1 000 Amazing Circles (actually 9 743)

In November 2005 I promised not to mention Amazing Circles on my site until member number 1 000… I broke my promise. Twice. I admit it.

Amazing circles 1000

Screenshot: The latest creations in the Flickr Amazing Circles group

But now there are 1 002 members in the Amazing Circles Flickr group, and member number 1 000 is JJjunki from Tsukuba, Japan. Welcome! Here’s a circle JJjunki made.

The group is now approaching 10 000 Amazing Circles, with the most active members at close to 200 circles each.

Also, a big round of applause for Nadano Kamome, who started all this with the picture “How to edit Reversing world” in October 2005. I almost met Kamome in April 2006, when I visited Japan with my family. Almost, because we couldn’t find a time and place where we could both be at once. I think we even passed each other on different trains between Kyoto and Kobe at some point! Next time, Kamome, I promise! I’m definitely going back to Japan, which was just brilliant. Remind me to finish the Japan series that I started (tag “Japan”) but never finished.

If you do a Google search for “Amazing Circles”, you’ll get over 37 000 hits. That’s quite a bit for a name I “invented” only 15 months ago. According to Trendmapper (the Amazing Circles chart), it has been as high as about 95 000 hits back in May 2006. And when Digg caused a stampede in February 2006, the site went down. Heh.

Some people have gone bananas with pictures of themselves. So if you want to make some really scary New Year’s cards…