The use of multiple kernels in conventional kernel algorithms is gaining popularity, as it addresses the kernel selection problem and improves performance. Kernel least mean square (KLMS) has recently been extended to multiple kernels using several approaches, one of which is mixture kernel least mean square (MxKLMS). Although MxKLMS addresses the kernel selection problem and improves performance, it suffers from a linearly growing dictionary, as in KLMS. In this paper, we present the quantized MxKLMS (QMxKLMS) algorithm, which achieves sub-linear dictionary growth. The method quantizes the input space using two criteria: the conventional one based on the Euclidean distance in the input space, and a new one based on the Euclidean distance in the RKHS induced by the sum kernel. The empirical results suggest that QMxKLMS with the latter metric is suitable for non-stationary environments with abruptly changing modes, as it exploits information about the relative importance of the kernels. Moreover, QMxKLMS with both metrics is compared with QKLMS and the existing multi-kernel methods MKLMS and MKNLMS-CS, showing improved performance over these methods.
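The two quantization criteria contrasted in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a mixture of Gaussian kernels with hypothetical weights `betas` and bandwidths `sigmas`, and uses the standard RKHS identity ||φ(x) − φ(c)||² = k(x,x) − 2k(x,c) + k(c,c) applied to the sum kernel.

```python
import numpy as np

def gaussian(x, y, sigma):
    # Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def input_space_dist(x, c, sigmas=None, betas=None):
    # Conventional QKLMS-style criterion: Euclidean distance in the input space.
    return np.linalg.norm(x - c)

def rkhs_dist(x, c, sigmas, betas):
    # New criterion: distance in the RKHS induced by the sum kernel
    # k = sum_m beta_m k_m, via ||phi(x)-phi(c)||^2 = k(x,x) - 2 k(x,c) + k(c,c).
    # The weights beta_m carry the relative importance of each kernel.
    k_xx = sum(b * gaussian(x, x, s) for b, s in zip(betas, sigmas))
    k_xc = sum(b * gaussian(x, c, s) for b, s in zip(betas, sigmas))
    k_cc = sum(b * gaussian(c, c, s) for b, s in zip(betas, sigmas))
    return np.sqrt(max(k_xx - 2.0 * k_xc + k_cc, 0.0))

def quantize(x, dictionary, eps, dist, sigmas=None, betas=None):
    # Return the index of the nearest dictionary centre if it lies within
    # the quantization radius eps; return -1 to signal "add a new centre".
    if not dictionary:
        return -1
    d = [dist(x, c, sigmas, betas) for c in dictionary]
    j = int(np.argmin(d))
    return j if d[j] <= eps else -1
```

With either distance plugged into `quantize`, a new input within `eps` of an existing centre updates that centre's coefficient instead of enlarging the dictionary, which is what yields the sub-linear growth.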
Title of host publication: Neural Networks (IJCNN), 2014 International Joint Conference on
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Number of pages: 7
Publication status: Published - 1 Jul 2014