DocumentCode :
288383
Title :
Feature induction by backpropagation
Author :
Ronald, Edmund ; Schoenauer, Marc ; Sebag, Martine
Author_Institution :
CMAP, Ecole Polytech., Palaiseau, France
Volume :
1
fYear :
1994
fDate :
27 Jun-2 Jul 1994
Firstpage :
531
Abstract :
A method for investigating the internal knowledge representation constructed by neural-net learning is described: given the weight matrix of a trained feedforward artificial neural net, characteristic patterns can be induced for each of the classes of inputs the net classifies. These characteristic patterns, called prototypes, are found by a gradient descent search of the space of inputs. After an exposition of the theory, results are given for the well-known LED recognition problem, in which a network recognizes decimal digits displayed on a seven-segment LED display. Contrary to theoretical intuition, the experimental results indicate that the computed prototypes retain only some of the features of the original input patterns. Thus the method appears to extract those features deemed significant by the net.
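The prototype-induction idea in the abstract (freeze the trained weights, then run gradient descent in input space toward a chosen class) can be illustrated with a minimal sketch. This is not the authors' original code: the 7-4-10 sigmoid network, its random weights, the squared-error loss, and the learning rate below are all illustrative assumptions standing in for a net actually trained on the LED task.

```python
import numpy as np

# Illustrative weights for a 7-4-10 sigmoid net (7 LED segments -> 10 digits).
# In the paper these would come from a net trained by backpropagation.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 7)), np.zeros(4)
W2, b2 = rng.normal(size=(10, 4)), np.zeros(10)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(W1 @ x + b1)   # hidden activations
    y = sigmoid(W2 @ h + b2)   # class outputs
    return h, y

def prototype(target_class, steps=500, lr=0.5):
    """Gradient-descend the *input* (weights frozen) toward a pattern the
    net assigns to `target_class` -- the prototype induction described above."""
    t = np.zeros(10)
    t[target_class] = 1.0
    x = rng.uniform(0.4, 0.6, size=7)         # start near an uninformative input
    for _ in range(steps):
        h, y = forward(x)
        # Squared-error loss E = 0.5 * ||y - t||^2, backpropagated to the input.
        dy = (y - t) * y * (1 - y)            # dE/d(pre-activation of output layer)
        dh = (W2.T @ dy) * h * (1 - h)        # dE/d(pre-activation of hidden layer)
        dx = W1.T @ dh                        # dE/dx
        x = np.clip(x - lr * dx, 0.0, 1.0)    # keep segment intensities in [0, 1]
    return x

# Induced 7-segment prototype for digit "3" (meaningful only with trained weights).
print(np.round(prototype(3), 2))
```

With a genuinely trained network, inspecting which segments the induced prototype drives toward 0 or 1 (and which it leaves near 0.5) is one way to see which input features the net treats as significant, as the abstract reports.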
Keywords :
backpropagation; feature extraction; feedforward neural nets; knowledge representation; search problems; LED recognition problem; backpropagation; characteristic patterns; feature induction; feedforward neural net; gradient descent search; internal knowledge representation; learning; prototypes; weight matrix; Artificial neural networks; Backpropagation; Computational modeling; Displays; Feature extraction; Feedforward neural networks; Knowledge representation; Light emitting diodes; Neural networks; Prototypes;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
Type :
conf
DOI :
10.1109/ICNN.1994.374220
Filename :
374220