DocumentCode :
2620446
Title :
Consistency and generalization in incrementally trained connectionist networks
Author :
Martinez, Tony
Author_Institution :
Dept. of Comput. Sci., Brigham Young Univ., Provo, UT, USA
fYear :
1990
fDate :
1-3 May 1990
Firstpage :
706
Abstract :
Aspects of consistency and generalization in connectionist networks that learn through incremental training by examples or rules are discussed. Differences between training-set learning and incremental rule or example learning are presented. Generalization, the ability to produce reasonable mappings when presented with novel input patterns, is examined in light of these learning methods. Hamming-distance generalization is contrasted with generalization by high-order combinations of critical variables. Examples of detailed rules for an incremental learning model are presented for both consistency and generalization constraints.
Keywords :
learning systems; neural nets; Hamming distance generalization; consistency; example learning; generalization; high-order combinations of critical variables; incremental training; incrementally trained connectionist networks; mappings learning; training set learning; Biological neural networks; Computer science; Data mining; Hamming distance; Impedance matching; Intelligent networks; Learning systems; Nervous system; Neural networks; Prototypes;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IEEE International Symposium on Circuits and Systems, 1990
Conference_Location :
New Orleans, LA
Type :
conf
DOI :
10.1109/ISCAS.1990.112177
Filename :
112177