Information measures in distributed multitarget tracking

Murat Uney, Daniel Clark, Simon Julier

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we consider the role that different information measures play in the problem of decentralised multi-target tracking. In many sensor networks, it is not possible to maintain the full joint probability distribution, and so suboptimal algorithms must be used. We use a distributed form of the Probability Hypothesis Density (PHD) filter based on a generalisation of covariance intersection known as exponential mixture densities (EMDs). However, EMD-based fusion must be actively controlled to optimise the relative weights placed on different information sources. We explore the performance consequences of using different information measures to optimise the update. By considering approaches that minimise absolute information (entropy and Rényi entropy) or equalise divergence (Kullback-Leibler divergence and Rényi divergence), we show that the divergence measures are both simpler and easier to work with. Furthermore, in our simulation scenario, the performance is very similar across all the information measures considered, suggesting that the simpler measures can be used.
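The abstract's fusion rule and weight-selection criterion can be illustrated with a small sketch. The following is not the paper's implementation — it is a minimal, hypothetical example on finite discrete distributions, assuming the EMD form p^ω q^(1−ω) (renormalised) and choosing ω by equalising the KL divergence from the fused density to each source, via a simple grid scan:

```python
import math

def emd(p, q, w):
    """Exponential mixture density: p^w * q^(1-w), renormalised."""
    m = [pi ** w * qi ** (1 - w) for pi, qi in zip(p, q)]
    z = sum(m)
    return [mi / z for mi in m]

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def equalised_weight(p, q, steps=1000):
    """Pick w in (0, 1) that best equalises KL(emd || p) and KL(emd || q)."""
    best_w, best_gap = 0.5, float("inf")
    for i in range(1, steps):
        w = i / steps
        m = emd(p, q, w)
        gap = abs(kl(m, p) - kl(m, q))
        if gap < best_gap:
            best_w, best_gap = w, gap
    return best_w

# Two (hypothetical) local posteriors over the same three-point state space.
p = [0.7, 0.2, 0.1]
q = [0.1, 0.3, 0.6]
w = equalised_weight(p, q)
fused = emd(p, q, w)
```

Since KL(emd‖p) − KL(emd‖q) moves continuously from positive (at ω → 0) to negative (at ω → 1), a weight that equalises the two divergences always exists, which is one reason the divergence-based criteria are easy to work with in practice.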
Original language: English
Title of host publication: 14th International Conference on Information Fusion
Publisher: Institute of Electrical and Electronics Engineers
Pages: 1-8
Number of pages: 8
ISBN (Electronic): 978-0-9824438-2-8
ISBN (Print): 978-1-4577-0267-9
Publication status: Published - 8 Aug 2011
