Orthogonal Least Squares Algorithm for Training Multi-output Radial Basis Function Networks

S. Chen, P. M. Grant, C. F. N. Cowan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The radial basis function (RBF) network offers a viable alternative to the two-layer neural network in many signal processing applications. A novel learning algorithm for RBF networks (S. Chen et al., 1990, 1991) has been derived based on the orthogonal least squares (OLS) method operating in a forward regression manner (Chen et al., 1989). This is a rational way to choose RBF centres from data points because each selected centre maximizes the increment to the explained variance of the desired output, and the algorithm does not suffer from numerical ill-conditioning problems. This learning algorithm was originally derived for RBF networks with a scalar output. The present study extends this previous result to multi-output RBF networks. The basic idea is to use the trace of the desired output covariance as the selection criterion instead of the original variance in the single-output case. Reconstruction of PAM signals and nonlinear system modelling are used as two examples to demonstrate the effectiveness of this learning algorithm.
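The selection procedure described in the abstract can be sketched as forward orthogonal least squares: every data point supplies a candidate Gaussian regressor, candidates are orthogonalized against the columns already chosen, and at each step the candidate with the largest error-reduction ratio, summed over all outputs (the trace criterion), is added. The following minimal NumPy sketch illustrates the idea; the Gaussian basis, the fixed width parameter, and the function names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def gaussian_design(X, centres, width):
    """N x M matrix of Gaussian RBF responses (assumed basis, fixed width)."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def ols_select_centres(X, D, width, n_centres):
    """Forward OLS selection of RBF centres from the data points.

    X: (N, d) inputs, D: (N, q) desired outputs (q > 1 for multi-output).
    At each step the candidate whose orthogonalized regressor maximizes the
    increment to the explained output energy, summed over the q outputs
    (the trace criterion), is selected.
    """
    M = X.shape[0]
    P = gaussian_design(X, X, width)   # every data point is a candidate centre
    total = np.trace(D.T @ D)          # total desired-output energy (trace)
    selected, W = [], []               # chosen indices, orthogonal regressors
    for _ in range(n_centres):
        best_idx, best_err, best_w = -1, -np.inf, None
        for i in range(M):
            if i in selected:
                continue
            w = P[:, i].copy()
            for wk in W:               # Gram-Schmidt against chosen basis
                w -= (wk @ P[:, i]) / (wk @ wk) * wk
            denom = w @ w
            if denom < 1e-12:          # numerically dependent candidate: skip
                continue
            # error-reduction ratio summed over all outputs
            err = ((w @ D) ** 2).sum() / (denom * total)
            if err > best_err:
                best_idx, best_err, best_w = i, err, w
        if best_idx < 0:               # no usable candidate remains
            break
        selected.append(best_idx)
        W.append(best_w)
    return selected
```

Orthogonalizing each candidate before scoring it is what keeps the selection well-conditioned: the contribution of a new centre is measured only by the part of its response that is not already spanned by the chosen regressors.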
Original language: English
Title of host publication: Second International Conference on Artificial Neural Networks, 1991
Pages: 336-339
Number of pages: 4
DOIs
Publication status: Published - 1991
Event: Second International Conference on Artificial Neural Networks, 1991 - Bournemouth, United Kingdom
Duration: 18 Nov 1991 - 20 Nov 1991

Conference

Conference: Second International Conference on Artificial Neural Networks, 1991
Country: United Kingdom
City: Bournemouth
Period: 18/11/91 - 20/11/91

