DocumentCode :
423348
Title :
Combining PCA and entropy criterion to build ANN's architecture
Author :
Li, Ai-jun ; Luo, Si-Wei ; Liu, Yun-Hui ; Nan, Zhi-Hong
Author_Institution :
Dept. of Comput. Sci., Beijing Jiaotong Univ., China
Volume :
5
fYear :
2004
fDate :
26-29 Aug. 2004
Firstpage :
3052
Abstract :
Designing the architecture of an artificial neural network (ANN, or NN) is a fundamental problem that has drawn researchers' attention. This paper proposes PCA combined with an entropy criterion for selecting neurons and presents a method, PCA-ENN, for building NNs. First, exploiting the similarity or equivalence between decision trees (DTs) and NNs, PCA-ENN applies PCA to extract new feature attributes. Second, it selects the best cut point for each new attribute by the entropy criterion and chooses the attribute best suited for classification as a neural unit. It then sets the connection weights between the input units and the outer inputs to the coefficients obtained from PCA, and sets the biases of the input units from the best cut points. At the same time, PCA-ENN constructs the hidden- and output-layer units and initializes their connection weights. PCA-ENN not only builds the architecture of an NN effectively, but also makes incremental learning of the NN possible.
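The abstract describes a concrete procedure: extract PCA attributes, score an entropy-optimal cut point for each, and convert the best attribute into a neural unit whose incoming weights are the PCA coefficients and whose bias comes from the cut point. The following is a minimal sketch of that idea, not the authors' code; the function names and the scikit-learn-based implementation are assumptions made for illustration.

```python
# Hypothetical sketch of the PCA-ENN selection step described in the abstract.
import numpy as np
from sklearn.decomposition import PCA


def entropy(y):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))


def best_cut_point(values, y):
    """Return (cut, info_gain) for the threshold maximizing information gain."""
    order = np.argsort(values)
    v, y_sorted = values[order], y[order]
    base = entropy(y)
    best_gain, best_cut = -np.inf, None
    for i in range(1, len(v)):
        if v[i] == v[i - 1]:
            continue
        cut = (v[i] + v[i - 1]) / 2.0
        left, right = y_sorted[:i], y_sorted[i:]
        cond = (len(left) * entropy(left) + len(right) * entropy(right)) / len(y)
        gain = base - cond
        if gain > best_gain:
            best_gain, best_cut = gain, cut
    return best_cut, best_gain


def pca_enn_unit(X, y, n_components=3):
    """Pick the PCA attribute with the best entropy-based split and return
    (weights, bias) for one candidate input unit."""
    pca = PCA(n_components=n_components).fit(X)
    Z = pca.transform(X)  # new feature attributes extracted by PCA
    # Score every new attribute by its best cut point's information gain.
    j, cut, _ = max(
        ((j,) + best_cut_point(Z[:, j], y) for j in range(Z.shape[1])),
        key=lambda t: t[2],
    )
    weights = pca.components_[j]              # connection weights from outer inputs
    bias = -(weights @ pca.mean_ + cut)       # bias derived from the best cut point
    return weights, bias
```

In this reading, the returned (weights, bias) pair initializes one input-layer unit, and repeating the selection yields further units, roughly mirroring how a decision tree grows nodes; the construction of hidden and output units mentioned in the abstract is not shown here.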
Keywords :
decision trees; entropy; feature extraction; learning (artificial intelligence); neural net architecture; principal component analysis; ANN architecture; NN incremental learning; PCA; artificial neural network; connection weights; decision tree; entropy criterion; feature extraction; hidden layer unit; neuron models; Artificial neural networks; Computer architecture; Decision trees; Entropy; Feature extraction; Learning systems; Network topology; Neural networks; Partial response channels; Principal component analysis;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2004 International Conference on Machine Learning and Cybernetics
Print_ISBN :
0-7803-8403-2
Type :
conf
DOI :
10.1109/ICMLC.2004.1378556
Filename :
1378556