Abstract / Description of output
For a given set of training data, a method already exists for learning optimally generalizing neural networks using a functional analytic approach. Here, we consider the case where additional training data becomes available at a later stage. We devise a method of carrying out optimal learning with respect to the entire set of training data (including the newly added data) by reusing the results of the previously learned stage. This ensures that both the learning operator and the learned function can be computed incrementally, leading to a reduced computational cost. Finally, we provide a simplified relationship between the newly learned function and the previous one, opening avenues for future work on the selection of an optimal training set.
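The sketch below illustrates the general idea of incremental optimal learning described in the abstract, under the assumption of a projection-learning-style setting in which the learned function is a linear combination of fixed basis functions and the optimal weights are a (regularized) least-squares solution. The block update shown uses the Woodbury identity, as in recursive least squares; it is an illustrative stand-in for the paper's operator-level update, not the authors' exact formulas. The names `batch_fit` and `incremental_update` are hypothetical.

```python
# Illustrative sketch (not the paper's formulas): incremental update of a
# regularized least-squares solution when a new block of training data arrives.
import numpy as np

def batch_fit(Phi, y, reg=1e-8):
    """Initial fit: returns weights w and the inverse (regularized) Gram matrix P."""
    P = np.linalg.inv(Phi.T @ Phi + reg * np.eye(Phi.shape[1]))
    w = P @ Phi.T @ y
    return w, P

def incremental_update(w, P, Phi_new, y_new):
    """Fold a new block (Phi_new, y_new) into (w, P) via the Woodbury identity,
    so the cost scales with the size of the new block, not the full training set."""
    k = Phi_new.shape[0]
    S = np.eye(k) + Phi_new @ P @ Phi_new.T      # k x k innovation matrix
    K = P @ Phi_new.T @ np.linalg.inv(S)         # gain
    w = w + K @ (y_new - Phi_new @ w)            # correct the previous solution
    P = P - K @ Phi_new @ P                      # updated inverse Gram matrix
    return w, P

# Usage: fit on the initial training set, then incorporate later data incrementally.
rng = np.random.default_rng(0)
Phi0, y0 = rng.normal(size=(50, 5)), rng.normal(size=50)
Phi1, y1 = rng.normal(size=(10, 5)), rng.normal(size=10)
w, P = batch_fit(Phi0, y0)
w, P = incremental_update(w, P, Phi1, y1)
# w now matches (up to regularization) the batch solution computed on all 60 samples.
```

The key point mirrored here is that the previously computed quantities (`w`, `P`) are reused, so nothing is recomputed from scratch when new data arrives.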
| Original language | English |
| --- | --- |
| Title of host publication | Neural Networks, 1995. Proceedings., IEEE International Conference on |
| Pages | 777-782 |
| Number of pages | 6 |
| Volume | 2 |
| DOIs | |
| Publication status | Published - 1995 |