Abstract
Image-based tracking of animals in their natural habitats can provide rich behavioural data, but is very challenging due to complex and dynamic background and target appearances. We present an effective method to recover the positions of terrestrial animals in cluttered environments from video sequences filmed using a freely moving monocular camera. The method uses residual motion cues to detect the targets and is thus robust to different lighting conditions, requiring no a priori appearance model of the animal or environment. The detection is globally optimised by formulating it as an inference problem on a factor graph. This handles ambiguities such as occlusions and intersections and provides automatic initialisation. Furthermore, this formulation allows seamless integration of occasional user input for the most difficult situations, so that the effect of a few manual position estimates is smoothly distributed over long sequences. Testing our system against a benchmark dataset featuring small targets in natural scenes, we obtain 96% accuracy for fully automated tracking. We also demonstrate reliable tracking on a new dataset that includes different targets (insects, vertebrates or artificial objects) in a variety of environments (desert, jungle, meadows, urban) using different imaging devices (day/night vision cameras, smartphones) and modalities (stationary, hand-held, drone-operated).
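The factor-graph idea described above can be illustrated with a minimal sketch (not code from the paper; a hypothetical 1-D simplification with quadratic factors): detection factors pull the track toward observed positions, smoothness factors link consecutive frames, and a heavily weighted manual annotation propagates its influence along the chain, so a few user clicks stabilise long ambiguous stretches.

```python
import numpy as np

def smooth_track(obs, weights, smooth=5.0):
    """Solve a 1-D chain of quadratic factors in closed form.

    obs     : (T,) observed positions per frame (use 0.0 where missing)
    weights : (T,) confidence per observation; 0 = no detection,
              large values model trusted manual annotations
    smooth  : strength of the frame-to-frame smoothness factors
    """
    T = len(obs)
    A = np.zeros((T, T))
    b = np.zeros(T)
    for t in range(T):
        # Detection factor: w * (x_t - obs_t)^2
        if weights[t] > 0:
            A[t, t] += weights[t]
            b[t] += weights[t] * obs[t]
        # Smoothness factor: smooth * (x_{t+1} - x_t)^2
        if t + 1 < T:
            A[t, t] += smooth
            A[t + 1, t + 1] += smooth
            A[t, t + 1] -= smooth
            A[t + 1, t] -= smooth
    # The quadratic objective is minimised by the linear system A x = b.
    return np.linalg.solve(A, b)
```

With no observations between two heavily weighted manual anchors, the smoothness factors alone interpolate the gap, mirroring how occasional user input is distributed over the sequence in the paper's formulation.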
Title of host publication: 2017 IEEE International Conference on Computer Vision Workshop (ICCVW)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Published: 23 Jan 2018
Event: International Conference on Computer Vision Workshop 2017, Venice, Italy
Duration: 22 Oct 2017 → 29 Oct 2017
Title: Visual Tracking of Small Animals in Cluttered Natural Environments Using a Freely Moving Camera
Associated project: 1 (finished), 28/02/15 → 31/08/18