DocumentCode :
288393
Title :
Improving generalization performance by controlling α-information
Author :
Kamimura, Ryotaro ; Nakanishi, Shohachiro
Author_Institution :
Inf. Sci. Lab., Tokai Univ., Kanagawa, Japan
Volume :
1
fYear :
1994
fDate :
27 Jun-2 Jul 1994
Firstpage :
594
Abstract :
The authors show that α-information, defined in terms of the Rényi entropy, is effective in improving generalization performance. α-information is defined as the difference between the maximum Rényi entropy and the entropy observed after learning has finished; it therefore measures the information content stored in the network architecture, and improving generalization performance corresponds to adjusting this stored information. To evaluate α-information, two language-acquisition problems were examined: inference of regular verbs and inference of irregular verbs with grammatical determination. In both cases, generalization performance tended to improve when α-information was adjusted appropriately, especially as the network size became larger.
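As a rough illustration of the quantity described in the abstract, the following Python sketch computes a Rényi entropy of order α and takes α-information as the gap between the maximum entropy log N (attained by the uniform distribution for every α) and the entropy of an observed distribution. The example distribution over hidden-unit states is hypothetical and not taken from the paper.

import numpy as np

def renyi_entropy(p, alpha):
    # Rényi entropy of order alpha for a discrete distribution p;
    # the alpha -> 1 limit recovers the Shannon entropy.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def alpha_information(p, alpha):
    # Difference between the maximum Rényi entropy (log N for N states)
    # and the entropy of the observed distribution p.
    return np.log(len(p)) - renyi_entropy(p, alpha)

# Hypothetical distribution over hidden-unit states after learning.
p_observed = [0.70, 0.20, 0.05, 0.05]
print(alpha_information(p_observed, alpha=2.0))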
Keywords :
entropy; generalisation (artificial intelligence); inference mechanisms; learning (artificial intelligence); neural nets; α-information; generalization; grammatical determination; information content; irregular verb; language acquisition; learning; maximum Renyi entropy; regular verbs; Entropy; Information science; Laboratories; Logistics; Mutual information;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
Type :
conf
DOI :
10.1109/ICNN.1994.374232
Filename :
374232