Title :
VC dimension theory for a learning system with forgetting
Author_Institution :
Math. Inf. Sci., Electrotech. Lab., Ibaraki, Japan
Abstract :
In a changing environment, forgetting old samples is an effective way to improve the adaptability of a learning system. Forgetting too quickly, however, degrades generalization performance. In this paper, we analyze the generalization performance of a learning system with a forgetting parameter. For a class of binary discriminant functions, it is proved that the generalization error is O(√(hα)) (O(hα) in a certain case), where h is the Vapnik-Chervonenkis (VC) dimension of the class of functions and 1-α is the forgetting rate. The result provides a criterion for determining the optimal forgetting rate.
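One common way to realize such a forgetting parameter is to weight past samples by an exponentially decaying factor when forming the empirical risk. The sketch below is an illustration of that idea, not the paper's own algorithm: each sample i steps in the past is weighted by (1-α)^i, so the weights sum to roughly 1/α, which acts as an effective sample size; plugging this into a standard VC bound of order √(h/N) yields the O(√(hα)) behavior stated in the abstract. All function names here are assumptions for illustration.

```python
# Illustration (assumed formulation, not taken from the paper):
# exponentially discounted 0-1 empirical risk with forgetting
# parameter alpha in (0, 1]; index 0 is the newest sample.

def effective_sample_size(alpha, n):
    # sum_{i=0}^{n-1} (1 - alpha)**i, which tends to 1/alpha as n grows,
    # so slower forgetting (small alpha) means more effective samples.
    return sum((1 - alpha) ** i for i in range(n))

def weighted_empirical_error(predictions, labels, alpha):
    """Exponentially discounted 0-1 error over a sample stream."""
    weights = [(1 - alpha) ** i for i in range(len(labels))]
    total = sum(weights)
    err = sum(w * (p != y) for w, p, y in zip(weights, predictions, labels))
    return err / total
```

Under this reading, a small α gives a large effective sample size 1/α and hence a small O(√(hα)) generalization error, at the cost of slow adaptation to environment changes; the trade-off is what the paper's optimality criterion balances.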
Keywords :
generalisation (artificial intelligence); learning (artificial intelligence); learning systems; neural nets; VC dimension theory; Vapnik-Chervonenkis dimension; binary discriminant functions; forgetting rate; generalization performance; learning system; neural networks; Frequency; Informatics; Learning systems; Neural networks; Performance analysis; Probability distribution; Risk analysis; Risk management; Testing
Conference_Titel :
Proceedings of the 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya)
Print_ISBN :
0-7803-1421-2
DOI :
10.1109/IJCNN.1993.713961