Abstract / Description of output
We propose an unsupervised approach for discovering
characteristic motion patterns in videos of highly articulated objects performing natural, unscripted behaviors,
such as tigers in the wild. We discover consistent patterns
in a bottom-up manner by analyzing the relative displacements of large numbers of ordered trajectory pairs through
time, such that each trajectory is attached to a different
moving part on the object. The pairs-of-trajectories descriptor relies entirely
on motion and is more discriminative
than state-of-the-art features that employ single trajectories. Our method generates temporal video intervals, each
automatically trimmed to one instance of the discovered
behavior, and clusters them by type (e.g., running, turning head, drinking water). We present experiments on two
datasets: dogs from YouTube-Objects and a new dataset of
National Geographic tiger videos. Results confirm that our
proposed descriptor outperforms existing appearance- and
trajectory-based descriptors (e.g., HOG and DTFs) on both
datasets and enables us to segment unconstrained animal
video into intervals containing single behaviors.
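The paper's exact descriptor is not specified in this abstract, but the core idea of measuring relative displacements of an ordered trajectory pair can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and the (T, 2) point-trajectory representation are assumptions.

```python
import numpy as np

def pair_displacement_descriptor(traj_a, traj_b):
    """Toy sketch of a pairs-of-trajectories motion feature.

    traj_a, traj_b: (T, 2) arrays of (x, y) positions of two tracked
    points over T frames, each attached to a different moving part.
    Returns the (T-1, 2) per-frame change of the offset between the
    two parts, i.e. their relative displacement through time.
    """
    rel = traj_b - traj_a        # offset between the two parts at each frame
    return np.diff(rel, axis=0)  # how that offset changes frame to frame

# Example: two parts moving apart horizontally at a constant rate.
a = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
b = np.array([[0.0, 0.0], [2.0, 0.0], [4.0, 0.0]])
d = pair_displacement_descriptor(a, b)
# d == [[1., 0.], [1., 0.]]: the gap grows by one unit per frame
```

Because the descriptor depends only on the *relative* motion of the two points, it is invariant to common translation of the whole object, which is one plausible reason a pairwise motion feature can be more discriminative than single-trajectory features.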
| Original language | English |
| --- | --- |
| Title of host publication | Computer Vision and Pattern Recognition (CVPR), 2015 IEEE Conference on |
| Publisher | Institute of Electrical and Electronics Engineers |
| Pages | 2151-2160 |
| Number of pages | 10 |
| DOIs | |
| Publication status | Published - 2015 |