On speeding up computation in information theoretic learning

Sohan Seth, José C. Príncipe

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

With the recent progress in kernel-based learning methods, computation with Gram matrices has received immense attention. However, the complexity of computing the entire Gram matrix is quadratic in the number of samples, and therefore considerable work has focused on extracting relevant information from the Gram matrix without accessing all of its elements. Most of these methods exploit the positive definiteness and rapidly decaying eigenstructure of the Gram matrix. Although information theoretic learning (ITL) is conceptually different from kernel-based learning, several ITL estimators can be written in terms of Gram matrices. Unlike kernel-based methods, however, a few ITL estimators involve a special type of matrix that is neither positive definite nor symmetric. In this paper we discuss how the techniques used in kernel-based learning can be applied to reduce the computational complexity of ITL estimators involving both Gram matrices and these other matrices.
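The positive definiteness and rapidly decaying eigenspectrum mentioned in the abstract are what make low-rank approximation of the Gram matrix G possible: G can be replaced by L Lᵀ with L of size n × d and d ≪ n. Below is a minimal sketch of one such technique, pivoted incomplete Cholesky factorization, used to approximate the ITL information potential V = (1/n²) Σᵢⱼ G[i, j] in roughly O(nd²) time instead of O(n²). The Gaussian kernel, the bandwidth sigma, and all function names are illustrative assumptions, not code from the paper.

import numpy as np

def incomplete_cholesky(X, sigma, tol=1e-6, max_rank=100):
    """Pivoted incomplete Cholesky factorization of the Gaussian Gram matrix
    G[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)), computed without ever
    forming G. Returns L of shape (n, d) with G ~= L @ L.T; d stays small
    when the eigenspectrum of G decays rapidly. (Illustrative sketch.)"""
    n = X.shape[0]
    m = min(max_rank, n)
    diag = np.ones(n)                 # Gaussian kernel => G[i, i] = 1
    L = np.zeros((n, m))
    d = 0
    while d < m and diag.max() > tol:
        i = int(np.argmax(diag))      # pivot on the largest residual diagonal
        piv = np.sqrt(diag[i])
        # Column i of G, evaluated on the fly from the data.
        g = np.exp(-np.sum((X - X[i]) ** 2, axis=1) / (2.0 * sigma ** 2))
        L[:, d] = (g - L[:, :d] @ L[i, :d]) / piv
        diag = np.maximum(diag - L[:, d] ** 2, 0.0)
        d += 1
    return L[:, :d]

def information_potential(X, sigma, tol=1e-6):
    """Approximate the information potential V = (1/n^2) * 1^T G 1
    via G ~= L L^T, so that 1^T G 1 ~= ||L^T 1||^2."""
    n = X.shape[0]
    L = incomplete_cholesky(X, sigma, tol)
    s = L.sum(axis=0)                 # s = L^T 1
    return float(s @ s) / n ** 2

# Example usage: 2000 samples from a 1-D Gaussian, kernel bandwidth 0.5.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((2000, 1))
    print(information_potential(X, sigma=0.5))

Note that the stopping rank d is governed by the kernel's eigenvalue decay, which is why the positive-definiteness assumption matters: the pivot values are square roots of residual diagonal entries and must be nonnegative. For the non-symmetric, non-positive-definite matrices that the abstract highlights, this trick does not apply as-is, which is precisely the gap the paper addresses.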
Original language: English
Title of host publication: Neural Networks, 2009. IJCNN 2009. International Joint Conference on
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 2883-2887
Number of pages: 5
ISBN (Print): 978-1-4244-3548-7
Publication status: Published - 14 Jun 2009
