Accelerometer recorded to metadata for stabilizing / tracking in post.
Open, Normal, Public

Description

I would like to submit a suggestion to integrate an accelerometer (dirt cheap) into the camera hardware. The accelerometer data should be recorded as metadata, in sync with the images, and preferably at twice the frame rate or more. This data could be used in post production to stabilize handheld footage, or for special tracking needs in high-end post. Please consider this, as the hardware should be cheap to add and the metadata would offer lots of opportunities for experimenting in post.

Cheers, George Tyszkiewicz
Apertus Axiom rocks!

derkiki created this task. Dec 8 2014, 10:32 PM
derkiki updated the task description.
derkiki raised the priority of this task to Needs Triage.
derkiki claimed this task.
derkiki updated the task description.
derkiki added a subscriber: derkiki.
sebastian triaged this task as Normal priority. Dec 8 2014, 11:21 PM
sebastian added subscribers: Bertl, sebastian.

We are planning to put a 9-DOF IMU (3D accelerometer, 3D gyroscope and 3D magnetometer, e.g. for image stabilization) directly behind/next to the image sensor. Great to see this is something people care about!

@Bertl, did we narrow down the component choice already?

That's great to hear! Another huge wish for the future: an entry level model with a cheaper sensor and a lower price tag.

Another huge wish for the future: an entry level model with a cheaper sensor and a lower price tag.

Your suggestion is a new idea/wish and deserves its own lab task! I am very curious to learn how many folks would be interested in a cheaper entry-level option.

Hyuri.Pimentel added a subscriber: Hyuri.Pimentel. Edited Dec 22 2014, 9:11 PM

Indeed! In addition to image stabilization, having data for camera tracking sounds awesome; huge!

Summary from talking to a VFX guy specialized in motion capture and real-time studio camera tracking:

-) Onboard camera motion sensors will not be accurate enough for real-time 3D tracking - expect results similar to virtual-reality apps on your smartphone. The metadata is still highly valuable for post production, as it provides great starting points for the actual 3D tracking done in post, for example.
-) Rotation tracking is already very accurate, so stabilization could work out pretty well.
-) Position tracking is not accurate enough yet, as errors sum up over time - the only thing that really works with precision is optical feature tracking in the image, or external optical tracking of the camera movement (aka motion tracking).
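The difference between the last two points can be illustrated with a toy simulation: a gyro's bias error grows linearly when integrated once to get rotation, while an accelerometer's bias grows quadratically when integrated twice to get position. All constants below are made-up illustration values, not real sensor specs:

```python
import random

random.seed(0)
DT = 1.0 / 100.0   # 100 Hz IMU sample rate
BIAS = 0.02        # constant sensor bias (rad/s for gyro, m/s^2 for accel)
NOISE = 0.05       # uniform white-noise amplitude

def drift_after(seconds, double_integrate):
    """Integrate a zero-motion sensor (bias + noise only) and return
    the accumulated error: once for rotation, twice for position."""
    v = 0.0   # first integral (angle, or velocity)
    p = 0.0   # second integral (position)
    for _ in range(int(seconds / DT)):
        sample = BIAS + random.uniform(-NOISE, NOISE)
        v += sample * DT
        if double_integrate:
            p += v * DT
    return p if double_integrate else v

angle_err = drift_after(10.0, double_integrate=False)  # ~bias * t = ~0.2
pos_err = drift_after(10.0, double_integrate=True)     # ~0.5 * bias * t^2 = ~1.0
```

After 10 seconds the single-integrated (rotation-style) error is around 0.2, while the double-integrated (position-style) error is around 1.0 and keeps accelerating - which is why rotation stabilization is feasible but inertial position tracking drifts away.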

3dsman added a subscriber: 3dsman. Mar 31 2015, 12:39 PM

I'm working in the VFX industry and I'm awaiting this feature with great excitement. I hope it will be usable by the Blender tracker as a basis for 3D tracking (optical tracking could then just be used for refinement), because optical tracking alone doesn't always work (during fast motion, for example). It could also be used to automatically detect elements that move within the shot and separate them from the camera motion.
Maybe real-time optical camera tracking could even be done by moving the optical search zone for tracking points to its estimated position based on the IMU measurement?
This could be a killer app...

You probably know this project already:

https://www.kickstarter.com/projects/1091165875/steadxp-the-future-of-video-stabilization/description

It could be a great option in the future to develop the same thing directly in the camera. But I think the algorithm is pretty complicated (not much crop; it's really impressive in their videos)!

What do you think about the feasibility?

Yes

It could be a great option in the future to develop the same thing directly in the camera. But I think the algorithm is pretty complicated (not much crop; it's really impressive in their videos)!

What do you think about the feasibility?

In our case we will track the motion directly behind the image sensor center, so the distance between the motion sensors and the image plane is well defined. Since the SteadXP can be attached on top of any camera, these distances vary with every camera, and I guess they have to do quite some guesswork when interpreting the motion data to recover the actual pixel motion.
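With a known geometry, the mapping from a gyro-measured pan rate to on-screen pixel motion is just the pinhole relation. A sketch with made-up example values (not actual AXIOM sensor or lens specs):

```python
import math

# Hypothetical example values, not actual AXIOM specs:
SENSOR_WIDTH_MM = 22.5    # active sensor width
IMAGE_WIDTH_PX = 4096     # horizontal resolution
FOCAL_LENGTH_MM = 35.0    # lens focal length

def gyro_to_pixel_shift(omega_rad_s, dt_s):
    """Approximate horizontal pixel shift caused by a camera pan
    between two frames, for a distant scene (pure rotation, no
    parallax): shift = f_px * tan(omega * dt)."""
    focal_px = FOCAL_LENGTH_MM * IMAGE_WIDTH_PX / SENSOR_WIDTH_MM
    return focal_px * math.tan(omega_rad_s * dt_s)

# A 10 deg/s pan sampled at 50 fps shifts the image by ~22 px per frame:
shift = gyro_to_pixel_shift(math.radians(10.0), 1.0 / 50.0)
```

This is the calculation an external box like the SteadXP has to guess at, since it doesn't know the focal length or its own offset from the image plane.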

Great!
I found this one; probably you know it as well: http://docs.opencv.org/3.1.0/#gsc.tab=0

They have open algorithms for video stabilization, motion tracking, etc.

The results are pretty good: https://www.youtube.com/watch?v=CP0KBGKClyI

Would it be possible to take this approach for motion tracking directly on the Beta? And maybe in OpenCine?
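For what it's worth, the core of most of those OpenCV stabilization recipes is: estimate per-frame motion via feature tracking, accumulate it into a camera path, smooth the path, and warp each frame by the difference. The smoothing step alone, with made-up motion numbers standing in for what OpenCV feature tracking would produce:

```python
RADIUS = 2  # smoothing window half-size, in frames (made-up value)

def smooth_trajectory(traj, radius=RADIUS):
    """Moving-average smooth of a cumulative camera path."""
    out = []
    for i in range(len(traj)):
        lo, hi = max(0, i - radius), min(len(traj), i + radius + 1)
        window = traj[lo:hi]
        out.append(sum(window) / len(window))
    return out

# Cumulative x-position of a shaky pan (sum of per-frame dx values,
# which would normally come from optical-flow feature tracking):
path = [0.0, 3.0, 2.0, 6.0, 5.0, 9.0, 8.0, 12.0]
smoothed = smooth_trajectory(path)
# Per-frame correction to warp by = smoothed path minus original path:
corrections = [s - p for s, p in zip(smoothed, path)]
```

The same smoothing could be driven by IMU-derived motion instead of optical flow, which is where the Beta's metadata would come in.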

Bertl added a comment. May 13 2016, 9:22 PM

While it certainly works well in post processing, this probably isn't a good idea for real-time stabilization in the AXIOM Beta.

Why?

  • OpenCV is (typically heavy) image processing. It usually requires a lot of resources, like parallel processing on a graphics card.
  • Proper motion estimation is a challenge at high resolutions.
  • There are easier ways to get the information (e.g. via IMUs).

Best,
Herbert

The only application for doing this live in the Beta that comes to mind would be sports/broadcast, and there dedicated mechanical stabilization systems exist already... So I agree this is mainly a post-production thing, but I think that's what chooksprod meant by mentioning OpenCine anyway.

There is another application: real-time tracking could be very useful for green-screen previsualization of a live human in a 3D environment, or the other way around, with a real environment and a VFX character like the troll in The Lord of the Rings (that was made with a motion-capture system, but today it should be possible to do it with camera motion tracking).

For camera stabilization it would be very cool if somebody made an extension card to drive a brushless motor directly.

Maybe information to easily find the optical center, and the offset between the IMU and the optical center, would be useful too.

In T212#9722, @3dsman wrote:

There is another application: real-time tracking could be very useful for green-screen previsualization of a live human in a 3D environment, or the other way around, with a real environment and a VFX character like the troll in The Lord of the Rings (that was made with a motion-capture system, but today it should be possible to do it with camera motion tracking).

For camera stabilization it would be very cool if somebody made an extension card to drive a brushless motor directly.

Maybe information to easily find the optical center, and the offset between the IMU and the optical center, would be useful too.

For this kind of application the IMU data from inside the camera is not accurate enough; it typically requires external visual tracking of the 3D positions of the camera, actors, etc.

chooksprod added a comment. Edited May 20 2016, 7:30 PM

So I don't understand exactly how IMU data recording works in the Beta.
Can you take data from the IMU and use it directly in post production for better stabilization?

I use IMUs for gimbals, but that's a "live" application; I don't know how to record the data from the IMU and map it back onto pixels, but maybe a solution already exists!

I was thinking of putting a stabilization algorithm from OpenCV into OpenCine; maybe software developers could then optimize the data from the Beta with this algorithm and improve stabilization in post with a "customized algorithm"?

I know that the stabilizers from Adobe etc. are very powerful, but I hope we can do better with this IMU and maybe an open algorithm in OpenCine!

I just tried putting a second IMU on my gimbal (one vertical and one horizontal), and it works really well and improves stabilization. Do you think it will be possible to do the same and have two IMUs' data in the future?

The IMU just records the motion of the camera (3D rotation, acceleration, etc.); what you then do with that kind of information in post production is up to the user/software.

Stabilization systems with motors also use IMUs, but actually do things quite differently: they are closed-loop systems that basically "use the motors to work against the motion the IMU measures" constantly, so the result can be measured by the IMU immediately...
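The closed-loop difference can be sketched in a few lines: a gimbal controller continuously commands the motor against whatever error the IMU currently measures, so disturbances never accumulate. Gain and disturbance values below are made up:

```python
KP = 50.0          # proportional gain (made-up value)
DT = 0.001         # 1 kHz control loop
DISTURBANCE = 0.5  # rad/s of hand shake pushing on the camera

# Closed loop: each tick, the motor rate opposes the measured error,
# so the error settles near disturbance / KP instead of growing.
error = 0.0  # angle error the IMU would measure (rad)
for _ in range(5000):  # 5 seconds
    motor_rate = -KP * error
    error += (DISTURBANCE + motor_rate) * DT
```

An open-loop recorder (like the Beta's metadata) just logs the motion; here the motor acts on it immediately, which is why the two uses of the same IMU are so different.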

Bertl added a comment. May 21 2016, 7:27 PM

Actually (nitpicking here :) the IMU doesn't record the motion, it tracks it with several sensors.

Those sensors can be read from the FPGA or ARM cores and put into metadata streams, either within the HDMI output or in a separate file with precise timestamps.

This data can then be used in post production as you like ...

Hope that clarifies,
Herbert
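A separate file with precise timestamps, as described above, could be as simple as one CSV row per IMU sample; matching samples to frames is then a nearest-timestamp lookup. A sketch where the field names and rates are assumptions, not the actual AXIOM format:

```python
import csv
import io

# Hypothetical sidecar format: one row per IMU sample, with a
# monotonic microsecond timestamp so samples can be matched to
# frames (or HDMI metadata) later.
FIELDS = ["t_us", "gx", "gy", "gz", "ax", "ay", "az"]

def write_samples(samples, stream):
    """Write (t_us, gx, gy, gz, ax, ay, az) tuples as CSV rows."""
    writer = csv.writer(stream)
    writer.writerow(FIELDS)
    for s in samples:
        writer.writerow(s)

def sample_for_frame(samples, frame_idx, fps=25):
    """Index of the IMU sample closest to a frame's timestamp."""
    t_frame_us = frame_idx * 1_000_000 // fps
    return min(range(len(samples)),
               key=lambda i: abs(samples[i][0] - t_frame_us))

# Fake 1 kHz samples covering 0.1 s of footage:
samples = [(i * 1000, 0, 0, 0, 0, 0, 0) for i in range(100)]
buf = io.StringIO()
write_samples(samples, buf)
idx = sample_for_frame(samples, frame_idx=2, fps=25)  # frame at 80 ms
```

In post, a tool would interpolate between the two nearest samples rather than picking one, but the timestamp bookkeeping is the same.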

KG12-12 added a subscriber: KG12-12. Edited Oct 9 2016, 10:53 AM

Well, accelerometer info for a camera is a great idea, actually. But the problem is how it can be included in the metadata. Should it be stored in another format, like a hypothetical .aclmif (Accelerometer Info) file, containing very precise tracking of the accelerometer over time? You should include timecode info too, so the accelerometer information can be parsed over time. And would the only program that can render .aclmif be OpenCine, with Premiere Pro adding support in 2021? Well, it's not a bad idea to add a new format, but it should not be in this section - it belongs in the software section.

Bertl added a comment. Oct 9 2016, 2:43 PM

We need to record metadata anyway, for example to keep information about exposure time, sensor register settings, etc.
If we decide to do it in a separate file/stream, then we need very precise timestamps (or frame numbers) to go with it.

Best,
Herbert

I'm not sure what the desirable specifications for an IMU are, but the InvenSense MPU-9250 is pretty easy to use and rather inexpensive. It has a sample rate of up to 32 kHz.

https://www.invensense.com/products/motion-tracking/9-axis/mpu-9250/

Bertl added a comment. Oct 27 2016, 3:53 PM

Other candidates are:

  • LSM330DLC
  • FXOS8700CQT

Partial solutions:

  • MMA8451QT
  • MMA8652FC

Note: we have two sides for solder-on modules, but they share power supply and data connections.

Best,
Herbert

What about this guy?

"""
The single-chip FireFly ICM-30670 is the world’s first dual interface Optical Image Stabilization (OIS) and 6-axis motion tracking solution for User Interface, UI, with an integrated sensor-hub and framework software.
"""
https://www.invensense.com/products/motion-tracking/6-axis/icm-30670/

Pricing:
https://store.invensense.com/ProductDetail/ICM30670-InvenSense-Inc/593894/

Bertl added a comment. Oct 27 2016, 9:45 PM

Why not. After all, the IMU is currently 'solder-on' so we can try different ones without changing the entire design.
With a little trick (connecting the solder-on boards with small pins) they can also be easily swapped.
All that is required is a tiny PCB with the proper connections.

Best,
Herbert

Hi everybody,
I asked SteadXP if they can tell us which component they use. No answer yet, but I will let you know if I get one!

Maybe it will be helpful (maybe not), but I found some info on IMU stabilization.

First the paper: https://graphics.stanford.edu/papers/stabilization/karpenko_gyro.pdf
and the code: https://github.com/alexgo1/Video-Stabilization

And a second one: https://www.youtube.com/watch?v=yvMZa6y-6_M
and the code: https://github.com/analogicalnexus/UMD-course-projects/tree/master/DVS-IMU-Stabilization
Kalman filter: https://github.com/analogicalnexus/UMD-course-projects/tree/master/Kalman_IMU

Hope it will be helpful.
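As a footnote to the Kalman filter link: a complementary filter is a much simpler way to fuse a gyro with an accelerometer, so a tilt estimate stays bounded instead of drifting like a pure gyro integration. A minimal sketch with made-up constants:

```python
import math

ALPHA = 0.98      # trust the gyro short-term, the accelerometer long-term
DT = 1.0 / 100.0  # 100 Hz update rate

def complementary_filter(angle, gyro_rate, ax, az, alpha=ALPHA, dt=DT):
    """Blend the gyro-integrated angle with the accelerometer's
    gravity-based tilt estimate: high-pass the gyro, low-pass the accel."""
    accel_angle = math.atan2(ax, az)  # tilt from the gravity direction
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Stationary camera tilted 0.1 rad; the gyro has a 0.05 rad/s bias.
# Pure gyro integration would drift 1.0 rad over these 20 seconds;
# the fused estimate settles near the true tilt instead.
angle = 0.0
for _ in range(2000):
    angle = complementary_filter(angle, gyro_rate=0.05,
                                 ax=math.sin(0.1), az=math.cos(0.1))
```

A Kalman filter does the same fusion with an estimated, time-varying blend instead of the fixed ALPHA, at the cost of a lot more bookkeeping.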