Algorithms for Motion Tracking?
Keith Handy asks: "I seem to be unable to find algorithms and/or open source programs that will do accurate motion tracking, i.e. you mark a point on an object in frame 36, and the program can follow that point on that object through all the following frames. This is useful not just for analyzing motion, but also for interpolating/extrapolating frames of video -- so if you had something at only 15 fps, you could generate in-between frames (which are not just crossfades between the frames) and actually smooth the effect of the motion. Nothing so complicated as to get into actual physics -- just something that will indicate where (in 2D only) that part of the object has moved from one frame to the next, for any given point in the whole picture. And for that matter it doesn't have to be 100% accurate -- just any means of generating a reasonable motion-flow map." This doesn't strike me as an easy algorithm to develop, but are there any papers, online or offline, that might describe an algorithm that can at least track objects in an image?
"In other words, I want something that does this, in order to write code that will do things like this and this. I already know how to write code to blur and warp images, so being able to track motion would give me (and you) the same capabilities as these expensive plug-ins.
Anyone know any other resources, directions, or existing code I could look into to find out more about how this works, so I can incorporate it into my own programming instead of paying hundreds or thousands of dollars for limited, proprietary use of the technology?"
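To make the frame-interpolation goal concrete: once you have a motion-flow map (a 2-D displacement per pixel, however it was estimated), generating an in-between frame is mostly a warping problem, which the asker already knows how to code. Below is a minimal Python/NumPy sketch, with hypothetical names; it backward-warps each original frame part of the way along the flow and blends the results, using nearest-neighbour sampling for simplicity:

```python
import numpy as np

def interpolate_frame(frame_a, frame_b, flow, t=0.5):
    """Generate an in-between frame at time t in (0, 1), given a dense
    flow field mapping frame_a pixels toward frame_b.
    flow has shape (H, W, 2) holding a (dy, dx) displacement per pixel."""
    h, w = frame_a.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Backward-warp frame_a: sample it at positions displaced by -t*flow,
    # rounding to the nearest pixel (nearest-neighbour sampling).
    src_y = np.clip(np.round(ys - t * flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - t * flow[..., 1]).astype(int), 0, w - 1)
    warped_a = frame_a[src_y, src_x]
    # Warp frame_b the remaining (1 - t) of the way back toward frame_a.
    src_y = np.clip(np.round(ys + (1 - t) * flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + (1 - t) * flow[..., 1]).astype(int), 0, w - 1)
    warped_b = frame_b[src_y, src_x]
    # Cross-fade the two warped frames; unlike a plain crossfade of the
    # originals, moving content has already been shifted into place.
    return (1 - t) * warped_a + t * warped_b
```

This assumes the flow is roughly constant over small neighbourhoods; a real implementation would use bilinear sampling and handle occlusions.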
what about intercorrelation? (Score:3, Interesting)
Compute the intercorrelation (i.e. cross-correlation) function of two neighbouring frames. The maxima are more or less where the objects have moved.
I have only used this method on artificially generated frames, i.e. one frame with translation and noise added. Even there, the intercorrelation drops off quite fast; on natural images there must be a lot of fiddling to do.
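A minimal sketch of this idea in Python/NumPy (function name hypothetical), for grayscale frames: the cross-correlation is computed via the FFT (correlation theorem), and the location of its peak gives the dominant translation between the frames:

```python
import numpy as np

def dominant_shift(frame_a, frame_b):
    """Estimate the dominant (dy, dx) translation between two equal-size
    grayscale frames from the peak of their circular cross-correlation."""
    fa = np.fft.fft2(frame_a)
    fb = np.fft.fft2(frame_b)
    # Cross-correlation = IFFT of conj(F(a)) * F(b); its peak sits at the
    # circular shift that best aligns frame_a with frame_b.
    corr = np.fft.ifft2(np.conj(fa) * fb).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative (wrapped) shifts.
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)
```

This gives a single global shift; applied to small windows around a marked point, the same trick yields a local motion estimate, which is essentially what the "fiddling" on natural images amounts to.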
Identify Features and Label (Score:1, Interesting)
Basically, identify all "peaks" (whatever feature you're interested in) and sort them. Start with the most outstanding feature and associate its nearest neighbours with it. Repeat many times. You will end up with a data structure of references which will produce a map of islands and isthmuses, depending on how far down you look.
Attach a "label" (unique ID) to each significant feature in the frame.
Repeat for the next frame.
Compare significant features. Using some sort of threshold, you can attach a confidence level that you're looking at the same feature as in the previous frame.
That's a simplistic overview, but I did it many years ago for looking at the output of stellar formation simulations.
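A toy Python sketch of the peak-finding and frame-to-frame matching steps described above (names and thresholds are made up, and a real implementation would handle plateaus and ambiguous matches more carefully):

```python
import numpy as np

def find_peaks(img, threshold):
    """Return (row, col) coordinates of local maxima above threshold,
    sorted brightest first -- the 'significant features' of a frame."""
    peaks = []
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = img[y, x]
            # A peak is a pixel that equals the max of its 3x3 neighbourhood.
            if v > threshold and v == img[y-1:y+2, x-1:x+2].max():
                peaks.append((v, y, x))
    peaks.sort(reverse=True)
    return [(y, x) for v, y, x in peaks]

def match_features(prev_peaks, curr_peaks, max_dist=5.0):
    """Greedy nearest-neighbour matching: pair each previous feature with
    the closest unclaimed current feature within max_dist."""
    matches, taken = {}, set()
    for i, (py, px) in enumerate(prev_peaks):
        best, best_d = None, max_dist
        for j, (cy, cx) in enumerate(curr_peaks):
            if j in taken:
                continue
            d = ((py - cy) ** 2 + (px - cx) ** 2) ** 0.5
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            matches[i] = best
            taken.add(best)
    return matches
```

The matched pairs directly give per-feature motion vectors; the confidence threshold from the comment corresponds to `max_dist` (and could also weigh in the peak values).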
tracking motion (Score:3, Interesting)
http://motion.technolust.cx [technolust.cx]
there are some examples and a sample video which demonstrate tracking "motion."
some brainstormed ideas... (Score:2, Interesting)
Break up the image into N x N submatrices and do a Fourier transform on each subsection. Then do the same for the next frame, calculate the phase differences between the two, and use linear/cubic/etc. interpolation to generate the frames in between. Not too difficult, and I think there is even a 2-D FFT library located somewhere on download.com. This might introduce a couple of artifacts, but if you're doing high-framerate video, it shouldn't be too noticeable.
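A rough Python/NumPy sketch of the per-block idea (names hypothetical): estimate each block's translation from the cross-correlation peak, then move the block a fraction t of the way there using the Fourier shift theorem -- a translation is just a linear phase ramp in the frequency domain, which is where the "phase differences" in the comment come from:

```python
import numpy as np

def fourier_shift(block, dy, dx):
    """Shift a 2-D block by (dy, dx), possibly fractional, by applying
    a linear phase ramp in the Fourier domain (circular boundary)."""
    h, w = block.shape
    fy = np.fft.fftfreq(h).reshape(-1, 1)
    fx = np.fft.fftfreq(w).reshape(1, -1)
    ramp = np.exp(-2j * np.pi * (fy * dy + fx * dx))
    return np.fft.ifft2(np.fft.fft2(block) * ramp).real

def interpolate_block(block_a, block_b, t=0.5):
    """Estimate the integer translation between two blocks from the
    cross-correlation peak, then move block_a a fraction t of the way."""
    fa, fb = np.fft.fft2(block_a), np.fft.fft2(block_b)
    corr = np.fft.ifft2(np.conj(fa) * fb).real
    py, px = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrapped peak positions to signed shifts.
    dy = py if py <= block_a.shape[0] // 2 else py - block_a.shape[0]
    dx = px if px <= block_a.shape[1] // 2 else px - block_a.shape[1]
    return fourier_shift(block_a, t * dy, t * dx)
```

The circular wrap-around at block edges is one source of the artifacts the comment anticipates; windowing each block before the FFT would soften it.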
or even more far-fetched:
Assuming that the translations of the objects in the image plane between frames are small and uniform enough, you might also be able to pull this off with a properly trained neural network operating on subsections of the image (sized so each individual feature fits approximately in one subsection). Neural networks can do non-linear regression, and their outputs are continuous, so I figure if you train it right, it'll give you what you want.
good luck