Incremental One-Class Learning with Bounded Computational Complexity

Rowland R. Sillito, Bob Fisher

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


An incremental one-class learning algorithm is proposed for the purpose of outlier detection. Outliers are identified by estimating, and then thresholding, the probability distribution of the training data. In the early stages of training, a non-parametric estimate of the training data distribution is obtained using kernel density estimation. Once the number of training examples reaches the maximum computationally feasible limit for kernel density estimation, we treat the kernel density estimate as a maximally complex Gaussian mixture model, and keep the model complexity constant by merging a pair of components for each new kernel added. This method is shown to outperform a current state-of-the-art incremental one-class learning algorithm (Incremental SVDD [5]) on a variety of datasets, while requiring only an upper limit on model complexity to be specified.
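The core idea in the abstract (accumulate kernels as Gaussian mixture components, then merge a pair whenever a new kernel would exceed the complexity bound) can be illustrated with a minimal one-dimensional sketch. This is an assumption-laden illustration, not the paper's implementation: the fixed bandwidth, the closest-means merge criterion, and the class/method names below are all simplifications introduced here.

```python
import math

class BoundedMixtureDensity:
    """Incremental 1-D density estimate with bounded complexity (sketch).

    Each training point is added as a Gaussian kernel component; once
    max_components is exceeded, the pair of components with the closest
    means is merged moment-preservingly, so model size stays constant.
    """

    def __init__(self, max_components=10, bandwidth=1.0):
        self.k = max_components
        self.h2 = bandwidth ** 2          # kernel variance (assumed fixed)
        self.comps = []                   # list of (weight, mean, variance)
        self.n = 0                        # number of points seen

    def add(self, x):
        self.n += 1
        # Rescale existing weights so all component weights sum to 1.
        scale = (self.n - 1) / self.n
        self.comps = [(w * scale, m, v) for w, m, v in self.comps]
        self.comps.append((1.0 / self.n, x, self.h2))
        if len(self.comps) > self.k:
            self._merge_closest_pair()

    def _merge_closest_pair(self):
        # Simplified merge criterion: smallest distance between means.
        best = None
        for i in range(len(self.comps)):
            for j in range(i + 1, len(self.comps)):
                d = abs(self.comps[i][1] - self.comps[j][1])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        wi, mi, vi = self.comps[i]
        wj, mj, vj = self.comps[j]
        w = wi + wj
        m = (wi * mi + wj * mj) / w
        # Moment-preserving variance of the merged component.
        v = (wi * (vi + (mi - m) ** 2) + wj * (vj + (mj - m) ** 2)) / w
        self.comps[i] = (w, m, v)
        del self.comps[j]

    def density(self, x):
        return sum(
            w * math.exp(-((x - m) ** 2) / (2 * v)) / math.sqrt(2 * math.pi * v)
            for w, m, v in self.comps
        )
```

Outlier detection then reduces to thresholding `density(x)`: points in low-density regions of the learned mixture are flagged, while the component count, and hence per-update cost, never grows past the specified bound.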
Original language: English
Title of host publication: Artificial Neural Networks - ICANN 2007
Editors: Joaquim Marques de Sa, Luis A. Alexandre, Wlodzislaw Duch, Danilo Mandic
Publisher: Springer-Verlag GmbH
Number of pages: 10
ISBN (Electronic): 978-3-540-74690-4
ISBN (Print): 978-3-540-74689-8
Publication status: Published - 2007

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin Heidelberg

