Correntropy for random variables: Properties and applications in statistical inference

Weifeng Liu, Puskal Pokharel, Jianwu Xu, Sohan Seth

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Similarity is a key concept for quantifying temporal signals or static measurements. Similarity is difficult to define mathematically; however, one rarely dwells on this difficulty and naturally translates similarity into correlation. This is one more example of how ingrained second-order moment descriptors of the probability density function are in scientific thinking. Successful engineering or pattern recognition solutions built on these methodologies rely heavily on the Gaussianity and linearity assumptions, for exactly the same reasons discussed in Chapter 3.
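
The chapter's central quantity, correntropy, generalizes correlation by replacing the second-order moment with the expected value of a kernel evaluated on the difference of the two variables, V_sigma(X, Y) = E[G_sigma(X - Y)]. As a rough illustration of that contrast (a minimal sketch, not the chapter's own code; the kernel width sigma, the toy data, and the outlier are arbitrary choices for the example), the snippet below estimates sample correntropy with a Gaussian kernel and compares it with the Pearson correlation on data containing a single gross outlier.

    import numpy as np

    def sample_correntropy(x, y, sigma=1.0):
        # Empirical correntropy: mean of a normalized Gaussian kernel
        # evaluated on the pointwise differences x_i - y_i.
        d = np.asarray(x) - np.asarray(y)
        return np.mean(np.exp(-d**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi)))

    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = x + 0.1 * rng.normal(size=200)
    y[0] += 50.0  # a single gross outlier

    # The two numbers are not on the same scale; the point is only that the
    # first is a second-order statistic, while the second is a kernel-based
    # (higher-order) statistic that is far less sensitive to the outlier.
    print("Pearson correlation:", np.corrcoef(x, y)[0, 1])
    print("Sample correntropy :", sample_correntropy(x, y, sigma=1.0))
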
Original language: English
Title of host publication: Information Theoretic Learning
Subtitle of host publication: Renyi's Entropy and Kernel Perspectives
Publisher: Springer
Pages: 385-413
Number of pages: 29
ISBN (Electronic): 978-1-4419-1570-2
ISBN (Print): 978-1-4419-1569-6
DOIs
Publication status: Published - 2010

Publication series

Name: Information Science and Statistics
Publisher: Springer New York
ISSN (Print): 1613-9011
