Title :
On learning a function of perceptrons
Author_Institution :
Dept. of Math., London Sch. of Econ., UK
Abstract :
This paper concerns the generalization accuracy obtained when training a classifier that is a fixed Boolean function of the outputs of a number of perceptrons. The analysis involves the 'margins' achieved by the constituent perceptrons on the training data. A special case is that in which the fixed Boolean function is the majority function (where we have a 'committee of perceptrons'). Recent work of Auer et al. studied the computational properties of such networks (where they were called 'parallel perceptrons') and proposed an incremental learning algorithm for them. The results given here provide further motivation for the use of this learning rule.
Keywords :
Boolean functions; error analysis; generalisation (artificial intelligence); learning (artificial intelligence); multilayer perceptrons; pattern classification; Boolean function; classifier training; generalization error bounds; incremental learning algorithm; parallel perceptrons; Artificial neural networks; Boolean functions; Circuits; Computer networks; Concurrent computing; Mathematics; Training data;
Conference_Titel :
2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004), Proceedings
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1380064