Tracking Occluded Objects and Recovering Incomplete Trajectories by Reasoning about Containment Relations and Human Actions
Abstract

This paper studies the challenging problem of tracking severely occluded objects in long video sequences. The proposed method reasons about containment relations and human actions, thereby inferring and recovering the identities of occluded objects while they are contained in or blocked by other objects. Two conditions lead to incomplete trajectories: i) Contained. The occlusion is caused by a containment relation formed between two objects; e.g., an unobserved laptop inside a backpack forms a containment relation between the laptop and the backpack. ii) Blocked. The occlusion is caused by other objects blocking the view from certain locations, during which the containment relation does not change. By explicitly distinguishing these two causes of occlusion, the proposed algorithm formulates the tracking problem as a network flow representation that encodes containment relations and their changes. By assuming that occlusions do not happen spontaneously but are triggered only by human actions, a MAP inference jointly interprets the trajectory of an object through detections in space and human actions in time. To quantitatively evaluate our algorithm, we collected a new occluded-object dataset captured with a Kinect sensor, consisting of RGB-D videos and human skeletons with multiple actors, various objects, and different changes of containment relations. In the experiments, the proposed method outperforms baseline methods on tracking occluded objects.
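For intuition on the network-flow formulation mentioned above, the sketch below shows a generic min-cost-flow tracker in which each detection becomes a capacity-one edge and a MAP trajectory assignment corresponds to a minimum-cost flow. It is only an illustrative backbone, not the authors' implementation: the detections, costs, and the number of trajectories K are hypothetical, and the paper's actual network additionally encodes containment relations and action-triggered changes.

import networkx as nx

# Hypothetical detections per frame: (detection_id, reward for using it).
# All numbers are made-up integers; min_cost_flow prefers integer weights.
detections = {
    0: [("a0", 5), ("b0", 4)],
    1: [("a1", 5), ("b1", 4)],
    2: [("a2", 5), ("b2", 4)],
}

def transition_cost(u, v):
    # Cheap to keep the same (hypothetical) identity, expensive to switch.
    return 1 if u[0] == v[0] else 6

K = 2                        # number of trajectories to recover
G = nx.DiGraph()
G.add_node("S", demand=-K)   # source emits K units of flow
G.add_node("T", demand=K)    # sink absorbs K units of flow

for frame, dets in detections.items():
    for det_id, reward in dets:
        # Split each detection into an in/out pair; the inner edge carries
        # a negative cost (reward) for explaining that detection.
        G.add_edge(det_id + "_in", det_id + "_out", capacity=1, weight=-reward)
        # Entry/exit edges let a trajectory start or end at any detection.
        G.add_edge("S", det_id + "_in", capacity=1, weight=1)
        G.add_edge(det_id + "_out", "T", capacity=1, weight=1)

for frame in sorted(detections)[:-1]:
    for u, _ in detections[frame]:
        for v, _ in detections[frame + 1]:
            # Transition edges link detections in consecutive frames.
            G.add_edge(u + "_out", v + "_in", capacity=1,
                       weight=transition_cost(u, v))

flow = nx.min_cost_flow(G)   # MAP assignment <-> minimum-cost flow of value K
links = [(u, nbr) for u, nbrs in flow.items() for nbr, f in nbrs.items() if f > 0]
print(links)                 # edges carrying flow, i.e., the recovered tracks

Because the graph is acyclic and all capacities are unit, the global MAP data association can be solved exactly and efficiently by this flow reduction; in the paper's setting the MAP inference is additionally coupled with human actions in time, which this sketch omits.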

BibTeX
@inproceedings{liang2018tracking,
    title={Tracking Occluded Objects and Recovering Incomplete Trajectories by Reasoning about Containment Relations and Human Actions},
    author={Liang, Wei and Zhu, Yixin and Zhu, Song-Chun},
    booktitle={Proceedings of AAAI Conference on Artificial Intelligence (AAAI)},
    year={2018}
}
Acknowledgements

The work reported herein was supported by Natural Science Foundation of China (NSFC) grants No. 61472038 and No. 61375044 (to Liang), DARPA XAI grant N66001-17-2-4029, and ONR MURI grant N00014-16-1-2007 (to Zhu).