|Title||Model-based tracking: Temporal conditional random fields|
|Publication Type||Conference Paper|
|Year of Publication||2010|
|Authors||Shafiee, M. J., Z. Azimifar, and P. Fieguth|
|Conference Name||17th IEEE International Conference on Image Processing (ICIP)|
|Keywords||constant motion, CRF-based predictor, image sequences, model-based tracking, motion compensation, object motion modeling, probabilistic framework, probability, temporal conditional random fields, tracking|
We present Temporal Conditional Random Fields (TCRF), a probabilistic framework for modeling object motion. The state-of-the-art discriminative approach to tracking is known as dynamic conditional random fields. That method models an event based on spatial and temporal relations between pixels in an image sequence, but performs no prediction. To augment this powerful graphical model with prediction and thus obtain a CRF-based predictor, we propose a set of new temporal relations for object tracking, with feature functions such as optical flow (computed between consecutive frames) and line field features. We validate the proposed method on real data sequences and show that the TCRF prediction is nearly equivalent to the result of template matching. Experimental results indicate that our TCRF can predict the future state of a maneuvering target with nearly zero error during its constant-motion phases. Not only does the proposed TCRF have a simple, easy-to-implement structure, it also outperforms state-of-the-art predictors such as the Kalman filter.
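The idea of a CRF-based predictor can be illustrated with a toy sketch (this is not the paper's implementation; the feature functions, weights, and candidate set below are illustrative assumptions): each candidate next state is scored by a weighted sum of temporal feature functions, here a constant-motion feature plus a placeholder appearance term standing in for cues such as optical flow, and the highest-scoring candidate is returned as the prediction.

```python
import numpy as np

def motion_feature(prev, curr, cand):
    # Constant-motion feature: largest (zero) when cand continues
    # the prev -> curr velocity, increasingly negative otherwise.
    predicted = 2 * np.asarray(curr) - np.asarray(prev)  # curr + (curr - prev)
    return -float(np.linalg.norm(np.asarray(cand) - predicted))

def appearance_feature(patch_scores, cand):
    # Placeholder for appearance cues (e.g., optical-flow or template
    # similarity scores); a dict lookup stands in for a real feature map.
    return patch_scores.get(tuple(cand), 0.0)

def tcrf_predict(prev, curr, candidates, patch_scores,
                 w_motion=1.0, w_app=0.5):
    # CRF-style scoring: weighted sum of feature functions per candidate;
    # the prediction is the argmax over candidate next states.
    best, best_score = None, -np.inf
    for cand in candidates:
        s = (w_motion * motion_feature(prev, curr, cand)
             + w_app * appearance_feature(patch_scores, cand))
        if s > best_score:
            best, best_score = cand, s
    return best

# Example: a target moving with constant velocity (+2, +1) per frame.
prev, curr = (10, 10), (12, 11)
candidates = [(13, 11), (14, 12), (15, 13)]
print(tcrf_predict(prev, curr, candidates, {}))  # (14, 12) continues the motion
```

Under constant motion the motion feature alone identifies the correct next state, which mirrors the abstract's claim of near-zero prediction error in that regime; the appearance weight lets other cues dominate when the motion model is violated.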