
Tracking a Camera's Movement

My musings on what I would like to see in a hybrid stills/motion camera.

I’ve been an active participant in the Red camera Reduser and Scarletuser forums.
This is a series of articles compiled from posts I’ve made on those forums.

First up, a bit of background…
http://en.wikipedia.org/wiki/Accelerometer

 
I do a lot of hand-held shooting, which I feed through the AE motion-stabilising routines.
It’s a bit of a pain because AE needs to examine every frame to track the movement.

I know that Red is able to record lens movement in the metadata stream.

What if Red installed an accelerometer into Scarlet and recorded the operator’s movements in the metadata?
This data feed could be picked up by a Post app, and used to smooth out the motion in the video. Not just up/down right/left and XY angle but probably Z angle shifts too.
Great for 1080p, and probably good enough to smooth out 2k, depending on how extreme the movements were.
Most of the time I just want to remove vibration, or I get a bit weary after five minutes and my aim starts to wander, or I just want to straighten the horizon.

Maybe the accelerometer data doesn’t have enough granularity to smooth out the footage completely, but it could give the software some pretty good signposts to follow.
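As a rough sketch of those “signposts”: if each frame carried a recorded roll angle in its metadata, a post app could low-pass filter the recorded path and apply the difference as a corrective rotation, no feature tracking required. The metadata fields and filter are my invention here, not anything Red has specified.

```python
# Hypothetical sketch: use per-frame orientation metadata as the starting
# point for stabilisation, instead of tracking features in every frame.
# The idea of a per-frame roll angle in the metadata is an assumption.

def smooth(samples, window=9):
    """Moving-average low-pass filter over a list of angles (degrees)."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def correction_angles(recorded_roll):
    """Per-frame rotation to apply in post: smoothed path minus recorded path."""
    smoothed = smooth(recorded_roll)
    return [s - r for s, r in zip(smoothed, recorded_roll)]

# A shaky hand-held clip: roll jittering around level.
jitter = [0.0, 1.2, -0.8, 1.5, -1.1, 0.9, -0.4, 0.7, -0.6]
print(correction_angles(jitter))
```

A deliberately steady clip (constant roll) produces all-zero corrections, which is the sanity check you’d want: the filter only fights the jitter, not the operator’s intended framing.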


Stabilisation

I’m thinking that this movement data, if captured by the Scarlet, could be applied by Red Alert, making compatibility with post apps a non-issue.

What about image stabilization within the lens?
This was a question from the Red User group. Unfortunately I didn’t note the name of the originator.
Good point. I have this function in my TX1 and it is amazing to watch it kick in, in the viewfinder. I wouldn't complain if Red had lens- or chip-based stabilising of footage.

Potential problems are that a few companies have patented a lot of lens-stabilising tech, and it would be hard for Red to come up with a system that didn’t infringe on others’ patents.

Lens stabilising is limited to how far the lens (or imaging chip) can be physically pivoted.

Despite running the lens stabiliser in my camera all the time, I still have many clips which need to be stabilised in post.
'Doing it in post' allows a lot more latitude in how much the image is moved around. Especially if one is down-sampling from a larger image.

Recording the data to the metadata stream would probably use less battery charge than driving an extra piece of stabilising hardware (possibly? I’m not a hardware guy).

And finally, I like that Red cameras capture and store all the data that they can, and allow the user to massage the files to suit their incredibly diverse applications.
Saving the camera-tracked movement rather than applying it ‘in cam’ would allow us to decide how we want to use it.

Match Motion (thank you, Beckspace)

Great idea!
One could move the camera away from the subject and back, and the shot would still be tracked, because the data from the camera movement is captured at the time of shooting, not extracted from the image in post.
Interesting thing about the Scarlet is that the data stream could include data on what the lens was doing at the time of the shot such as zooms, focusing, etc.

Viewfinder rotation

Rotate the camera and the image displayed in the viewfinder reorients to display correctly to the user.
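This is the same trick phones use: read the gravity vector from the accelerometer and snap the display to the nearest quarter turn. A minimal sketch, assuming camera axes with x pointing right and y pointing up when the camera is held level (those conventions are my assumption):

```python
import math

# Sketch: choose the viewfinder orientation from the accelerometer's
# gravity vector. (gx, gy) is the direction gravity points, expressed
# in the camera's own frame; the axis convention is an assumption.

def display_rotation(gx, gy):
    """Degrees (0/90/180/270) to rotate the viewfinder image so it
    stays upright for the operator, snapped to the nearest quarter turn."""
    angle = math.degrees(math.atan2(gx, -gy))  # 0 when gravity pulls along -y
    return int(round(angle / 90.0)) % 4 * 90

print(display_rotation(0.0, -9.8))   # camera held level
print(display_rotation(9.8, 0.0))    # camera rolled 90 degrees clockwise
```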

Camera shooting control

The user could set the camera to stop recording whenever it’s pointed at the ground, just in case the user has forgotten to turn off the cam.
It would also be possible to shut off shooting when the cam is moving wildly, and restart it when the cam has returned to a state where the recording isn’t going to make the audience puke (the Cloverfield function).
This could also be useful for quickly starting up a camera from being held in a relaxed position, to instant shooting when pointed at something interesting (usually stuff that is a bit more horizontal).
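The two rules above boil down to a couple of thresholds on the accelerometer readings. The threshold values and inputs below are invented for illustration; a real camera would expose its own:

```python
# Sketch of the auto start/stop idea. Thresholds are made up; tune to taste.
TILT_DOWN_DEG = 60    # this far below horizontal = "pointed at the ground"
MOTION_LIMIT = 4.0    # m/s^2 of non-gravity acceleration = "wild movement"

def should_record(pitch_deg, accel_magnitude):
    """pitch_deg: angle below horizontal (positive = pointing down).
    accel_magnitude: measured acceleration with gravity removed, m/s^2."""
    if pitch_deg > TILT_DOWN_DEG:       # operator forgot to stop the cam
        return False
    if accel_magnitude > MOTION_LIMIT:  # the 'Cloverfield' case
        return False
    return True

print(should_record(5, 0.3))    # level and steady
print(should_record(80, 0.3))   # pointed at the ground
print(should_record(5, 9.0))    # being swung around
```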

‘Frame is level’ indication

Say you’re shooting a wedding with your Scarlet because the writers have gone on strike again, and you’re desperately trying to pay off a sub-prime mortgage.
Set the tripod down and start shooting, because the bride has just snogged the best man and you don’t want to miss the shot.
You have neglected to set the tripod level.
The Accelerometer notes that the cam is off horizontal and inserts the data into the metadata.
Back at ‘Weddings are Art Too Ltd.’ you suck the clip up into AE (or whatever post app), select the metadata plugin, and the clip is instantly rotated to level.
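The plugin’s job is tiny: recover the roll error from the gravity vector recorded in the metadata and hand that angle to the app’s rotate filter. A sketch, with my assumed axis convention (x right, y up in the camera’s frame):

```python
import math

# Sketch: recover the roll error from the gravity vector recorded in the
# clip's metadata. The metadata field and axis convention are assumptions.

def roll_error_degrees(gx, gy):
    """Angle the frame must be rotated to bring the horizon level.
    (gx, gy): direction gravity points, measured in the camera's frame."""
    return -math.degrees(math.atan2(gx, -gy))

# The tripod was left 3 degrees off level: gravity leans into the x axis.
g = 9.8
tilt = math.radians(3)
print(round(roll_error_degrees(g * math.sin(tilt), -g * math.cos(tilt)), 1))
```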

The potential

Wouldn’t it be bizarre to have AE, or Nuke, or even Max extract the 3D movement of the camera while it was taking a shot and plot that in 3D space? Would it be possible to position the recorded frames in virtual space and, because of the resolution of the frames, alter the camera movements a bit?
Add a nice flat pan maybe?
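The naive way to turn accelerometer samples into that 3D path is double integration: acceleration to velocity, velocity to position. Real IMU data drifts badly without a gyro and serious filtering, so this only illustrates the plumbing, not a usable tracker:

```python
# Sketch: naive double integration of accelerometer samples into a camera
# path. Drift makes this unusable on real data without heavy filtering.

def integrate_path(accels, dt):
    """accels: list of (ax, ay, az) in m/s^2, gravity already removed.
    dt: sample interval in seconds. Returns positions, starting at origin."""
    pos = [(0.0, 0.0, 0.0)]
    vel = (0.0, 0.0, 0.0)
    for a in accels:
        vel = tuple(v + ac * dt for v, ac in zip(vel, a))
        pos.append(tuple(p + v * dt for p, v in zip(pos[-1], vel)))
    return pos

# A constant push along x, sampled at 24 fps for four frames.
path = integrate_path([(1.0, 0.0, 0.0)] * 4, dt=1 / 24)
print(path[-1])
```

Once you have a path like this, repositioning the frames in virtual space and substituting a nice flat pan is “just” a resampling problem, limited by how much spare resolution the frames have.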