DocumentCode :
445814
Title :
Information maximization and cost minimization in information-theoretic competitive learning
Author :
Kamimura, Ryotaro
Author_Institution :
Inf. Sci. Lab., Tokai Univ., Kanagawa, Japan
Volume :
1
fYear :
2005
fDate :
31 July-4 Aug. 2005
Firstpage :
202
Abstract :
In this paper, we introduce costs into the framework of information maximization and try to maximize the ratio of information to its associated cost. We have previously shown that competitive learning can be realized by maximizing the mutual information between input patterns and competitive units. One shortcoming of that method is that maximizing information does not necessarily produce representations faithful to the input patterns: information maximization focuses primarily on those parts of the input patterns that distinguish one pattern from another. We therefore introduce a cost that represents the average distance between input patterns and connection weights. By minimizing this cost, the final connection weights obtained by information maximization closely reflect the input patterns. We applied the method to a political data analysis and the Wisconsin cancer problem. Experimental results confirmed that introducing the cost yields representations faithful to the input patterns and significantly improves generalization performance.
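The abstract describes two quantities: the mutual information between input patterns and competitive units, and a cost defined as the average distance between input patterns and connection weights, with the goal of maximizing their ratio. The sketch below is only an illustration of those two quantities under common assumptions (a softmax firing rule and a simple responsibility-weighted weight update); the function names, the inverse-temperature parameter beta, and the update rule are not taken from the paper.

```python
import numpy as np

# Illustrative sketch, not the paper's exact algorithm: soft competitive
# units, mutual information I(patterns; units), and an average-distance
# cost as described in the abstract.  The softmax firing rule and the
# weighted-mean update are assumptions made for illustration only.

def firing_probs(X, W, beta=2.0):
    """p(j|x^s): softmax over negative squared distances to unit weights."""
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)   # (S, M)
    logits = -beta * d2
    logits -= logits.max(axis=1, keepdims=True)
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def mutual_information(P):
    """I = (1/S) sum_s sum_j p(j|x^s) log( p(j|x^s) / p(j) ), p(s)=1/S."""
    pj = P.mean(axis=0)                        # unit marginal p(j)
    eps = 1e-12
    return np.mean((P * (np.log(P + eps) - np.log(pj + eps))).sum(axis=1))

def average_cost(X, W, P):
    """Average squared distance between inputs and weights, weighted by p(j|x)."""
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    return np.mean((P * d2).sum(axis=1))

# Toy run: a soft competitive update that pulls each weight vector toward
# the inputs it responds to, which reduces the average-distance cost.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))                  # S = 100 patterns, 4 features
W = rng.normal(size=(3, 4))                    # M = 3 competitive units
for _ in range(50):
    P = firing_probs(X, W)
    W = (P.T @ X) / P.sum(axis=0)[:, None]     # responsibility-weighted means
P = firing_probs(X, W)
print("I =", mutual_information(P),
      " cost =", average_cost(X, W, P),
      " ratio =", mutual_information(P) / average_cost(X, W, P))
```

The update shown only minimizes the distance cost; the paper's method additionally maximizes the information term (e.g., by optimizing the ratio of information to cost), so this is a simplified stand-in rather than the authors' procedure.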
Keywords :
information theory; minimisation; pattern classification; unsupervised learning; Wisconsin cancer problem; cost minimization; final connection weights; information maximization; information-theoretic competitive learning; mutual information; political data analysis; Cancer; Cost function; Data analysis; Entropy; Information science; Information theory; Laboratories; Mutual information; Neurons; Self organizing feature maps;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN :
0-7803-9048-2
Type :
conf
DOI :
10.1109/IJCNN.2005.1555830
Filename :
1555830