Quantized mixture kernel least mean square

R. Pokharel, S. Seth, J. C. Principe

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The use of multiple kernels in conventional kernel algorithms is gaining popularity, as it addresses the kernel selection problem and improves performance. Kernel least mean square (KLMS) has recently been extended to multiple kernels through several approaches, one of which is mixture kernel least mean square (MxKLMS). Although this method addresses the kernel selection problem and improves performance, it suffers from a linearly growing dictionary, as in KLMS. In this paper, we present the quantized MxKLMS (QMxKLMS) algorithm, which achieves sub-linear growth of the dictionary. This method quantizes the input space using the conventional criterion, Euclidean distance in the input space, as well as a new criterion, Euclidean distance in the RKHS induced by the sum kernel. The empirical results suggest that QMxKLMS with the latter metric is suitable for non-stationary environments with abruptly changing modes, as it is able to exploit information about the relative importance of the kernels. Moreover, QMxKLMS with both metrics is compared with QKLMS and the existing multi-kernel methods MKLMS and MKNLMS-CS, showing improved performance over these methods.
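The two quantization criteria mentioned in the abstract can be sketched as follows. This is a minimal illustrative example, not the authors' reference implementation: the function names, the Gaussian bandwidths, and the quantization threshold `epsilon` are all assumptions. The RKHS distance uses the standard identity ||φ(x) − φ(c)||² = K(x,x) + K(c,c) − 2K(x,c), here with K taken as a sum of Gaussian kernels.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    # Gaussian (RBF) kernel with bandwidth sigma.
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

def sum_kernel(x, y, sigmas):
    # The "sum kernel": a sum of Gaussian kernels with different bandwidths.
    return sum(gaussian_kernel(x, y, s) for s in sigmas)

def input_space_distance(x, c, sigmas=None):
    # Conventional QKLMS criterion: Euclidean distance in the input space.
    return np.linalg.norm(np.asarray(x, dtype=float) - np.asarray(c, dtype=float))

def rkhs_distance(x, c, sigmas):
    # Distance in the RKHS induced by the sum kernel:
    # ||phi(x) - phi(c)||^2 = K(x,x) + K(c,c) - 2 K(x,c).
    k_xx = sum_kernel(x, x, sigmas)
    k_cc = sum_kernel(c, c, sigmas)
    k_xc = sum_kernel(x, c, sigmas)
    return np.sqrt(max(k_xx + k_cc - 2.0 * k_xc, 0.0))

def quantize(x, dictionary, epsilon, dist, sigmas=None):
    """Return the index of the nearest dictionary center if it lies within
    epsilon under the chosen metric; otherwise return -1, signaling that x
    should be added as a new center (this is what keeps growth sub-linear)."""
    if not dictionary:
        return -1
    dists = [dist(x, c, sigmas) for c in dictionary]
    i = int(np.argmin(dists))
    return i if dists[i] <= epsilon else -1
```

Under this sketch, an incoming sample either merges its update into an existing center (index returned) or opens a new one (−1), so the dictionary grows only when the input visits a genuinely new region under the chosen metric.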
Original language: English
Title of host publication: Neural Networks (IJCNN), 2014 International Joint Conference on
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 4168-4174
Number of pages: 7
ISBN (Print): 978-1-4799-6627-1
Publication status: Published - 1 Jul 2014

