Classification of Animal Behaviour Using Dynamic Models of Movement

Michael A. Dewar, T.C. Lukins, J. A. Heward, Douglas Armstrong

Research output: Contribution to conference › Poster


Traditional behavioural assays analyse videos of animals in controlled environments. In these assays, most of the information about the animal's behavioural phenotype captured in the video is reduced to crude behavioural statistics that summarise the range of behaviours displayed.

We aim to avoid this attenuation by using the dynamic information in the animal's behaviour to automatically generate a chain of meaningful behavioural symbols. This is achieved by inferring the behavioural symbols and their transition probabilities, as well as the dynamic models of movement and their hidden states, within the switching state-space model framework.
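The switching state-space model described above can be sketched generatively: a Markov chain over behavioural symbols selects, at each time step, which linear dynamic model drives the hidden state. The symbol count, transition matrix, and dynamics below are invented purely for illustration; they are not the models fitted in this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 2 behavioural symbols, each with its own
# linear dynamic model x_t = A_k @ x_{t-1} + noise on a 2-D hidden state.
A = np.array([[[1.0, 0.1],
               [0.0, 1.0]],      # symbol 0: drifting motion
              [[0.9, 0.0],
               [0.0, 0.9]]])     # symbol 1: decaying motion
trans = np.array([[0.95, 0.05],  # symbol transition probabilities
                  [0.10, 0.90]])

def sample_chain(T):
    """Sample a behavioural-symbol chain and the hidden-state
    trajectory it drives."""
    symbols = np.empty(T, dtype=int)
    states = np.empty((T, 2))
    symbols[0], states[0] = 0, np.array([1.0, 0.0])
    for t in range(1, T):
        # The active symbol evolves as a Markov chain...
        symbols[t] = rng.choice(2, p=trans[symbols[t - 1]])
        # ...and picks which linear dynamics the hidden state follows.
        states[t] = A[symbols[t]] @ states[t - 1] + 0.01 * rng.standard_normal(2)
    return symbols, states

symbols, states = sample_chain(200)
```

Inference then runs this model in reverse: given the observed trajectory, recover the symbol chain, the transition probabilities, and the per-symbol dynamics.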

Our example data set consists of seven videos of fruit-fly courtship assays. The position and orientation of the flies have been extracted using machine vision techniques [1, 2]. We have initially approached the segmentation of the behaviour displayed in these videos from a semi-supervised standpoint. Two of the seven videos are manually segmented into a sequence of behaviours. Dynamic models of these behaviours are then found, and are used as the starting point for a full switching Kalman filter across the whole data set, retaining the manual segmentations as a constraint on the learning.
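One way the per-behaviour dynamic models could be initialised from the manually segmented videos is a least-squares fit of the transition matrix on each labelled segment. This is a minimal sketch under that assumption, not necessarily the estimator used in the work:

```python
import numpy as np

def fit_linear_dynamics(X):
    """Least-squares estimate of A in x_t ≈ A @ x_{t-1}, given one
    labelled segment X of shape (T, d) of extracted fly states."""
    past, future = X[:-1], X[1:]
    # Solve future ≈ past @ A.T in the least-squares sense.
    A_T, *_ = np.linalg.lstsq(past, future, rcond=None)
    return A_T.T

# Tiny demo on synthetic data with known dynamics.
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
X = np.empty((50, 2))
X[0] = [1.0, 1.0]
for t in range(1, 50):
    X[t] = A_true @ X[t - 1]
A_hat = fit_linear_dynamics(X)
```

Fitting one such matrix per labelled behaviour gives the starting point for the switching Kalman filter over the remaining, unlabelled videos.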

Initial results have shown that basic movement patterns can be automatically extracted from video by learning linear, second order dynamic models from the data. Future work involves learning nonlinear dynamic models to describe more complex behaviours.
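A linear, second-order dynamic model can be read as a state of position and velocity filtered with a Kalman filter; a segment can then be assigned to whichever behaviour's model gives it the highest likelihood. The models, noise levels, and observation setup below are illustrative assumptions, not the fitted values from this study:

```python
import numpy as np

def kalman_loglik(y, A, C, Q, R, mu0, P0):
    """Log-likelihood of observations y (shape (T, p)) under a
    linear-Gaussian state-space model, via the innovations form
    of the Kalman filter."""
    mu, P, ll = mu0, P0, 0.0
    for obs in y:
        # Predict the hidden state forward one step.
        mu, P = A @ mu, A @ P @ A.T + Q
        # Innovation (prediction error) and its covariance.
        v = obs - C @ mu
        S = C @ P @ C.T + R
        ll += -0.5 * (v @ np.linalg.solve(S, v)
                      + np.linalg.slogdet(2 * np.pi * S)[1])
        # Measurement update.
        K = P @ C.T @ np.linalg.inv(S)
        mu, P = mu + K @ v, (np.eye(len(mu)) - K @ C) @ P
    return ll

# Two hypothetical second-order models on a [position, velocity] state:
A_walk = np.array([[1.0, 1.0],   # constant-velocity motion
                   [0.0, 1.0]])
A_stop = np.array([[0.5, 0.0],   # rapidly decaying motion
                   [0.0, 0.5]])
C = np.array([[1.0, 0.0]])       # only position is observed
Q, R = 0.01 * np.eye(2), np.array([[0.01]])
```

Running `kalman_loglik` once per candidate model and taking the argmax classifies a segment; the switching Kalman filter does this jointly over time rather than segment by segment.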
Original language: English
Publication status: Published - 2008
Event: Stochastic Models of Behaviour - Whistler, B.C., Canada
Duration: 13 Dec 2008 → …




