I am using a Kinect to do computer vision work. My program filters out everything beyond a certain depth, and once something large enough to be a hand enters the remaining region, the program assumes it is one.
However, I want to extend this functionality. Currently, if the hand leaves the depth-filtered region, the software loses track of its position. How can I keep following the hand after I've recognized it, regardless of depth?
You can have a look at mean shift tracking: http://www.comp.nus.edu.sg/~cs4243/lecture/meanshift.pdf
With mean shift it's possible to keep tracking the blob even as it gets smaller or bigger (moves farther away or closer).
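To illustrate the idea, here is a minimal sketch of the core mean-shift step, written from scratch rather than using any particular library: repeatedly re-center a search window on the weighted centroid of a likelihood map. In your case that map could come from a back-projected skin-color histogram or a hand-probability image; the Gaussian blob below is just a stand-in for such a map, and all names and parameters are illustrative.

```python
import numpy as np

def mean_shift(weights, window, max_iter=20, eps=1.0):
    """Shift a tracking window to a local mode of a 2D weight map.

    weights : 2D array of per-pixel likelihoods (e.g. a back-projection).
    window  : (x, y, w, h) initial search window in pixel coordinates.
    """
    x, y, w, h = window
    for _ in range(max_iter):
        roi = weights[y:y + h, x:x + w]
        total = roi.sum()
        if total == 0:
            break  # no support under the window; give up
        ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
        cx = (xs * roi).sum() / total  # weighted centroid inside the ROI
        cy = (ys * roi).sum() / total
        # Move the window so its center lands on the centroid.
        dx = cx - (w - 1) / 2.0
        dy = cy - (h - 1) / 2.0
        x = int(round(x + dx))
        y = int(round(y + dy))
        # Clamp so the window stays inside the image.
        x = max(0, min(x, weights.shape[1] - w))
        y = max(0, min(y, weights.shape[0] - h))
        if abs(dx) < eps and abs(dy) < eps:
            break  # converged
    return (x, y, w, h)

# Toy likelihood map: a Gaussian "hand" blob centered at (60, 40).
yy, xx = np.mgrid[0:100, 0:100]
blob = np.exp(-((xx - 60) ** 2 + (yy - 40) ** 2) / (2 * 8.0 ** 2))
tracked = mean_shift(blob, (30, 20, 20, 20))
print(tracked)  # window has shifted so its center sits near (60, 40)
```

Because the window follows a likelihood peak rather than a depth threshold, the track survives the hand moving out of your filtered range. If you are using OpenCV, `cv2.meanShift` implements this step for you, and `cv2.CamShift` additionally adapts the window size, which handles the blob growing or shrinking with distance.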