DocumentCode :
2770159
Title :
The Generalisation of the Recursive Deterministic Perceptron
Author :
Elizondo, David ; Birkenhead, Ralph ; Taillard, Eric
Author_Institution :
De Montfort Univ., Leicester
fYear :
0
fDate :
0-0 0
Firstpage :
1776
Lastpage :
1783
Abstract :
The Recursive Deterministic Perceptron (RDP) feed-forward multilayer neural network is a generalisation of the single layer perceptron topology. This model is capable of solving any two-class classification problem, as opposed to the single layer perceptron, which can only solve classification problems involving linearly separable sets (two classes X and Y of R^d are said to be linearly separable if there exists a hyperplane such that the elements of X and Y lie on the two opposite sides of R^d delimited by this hyperplane). For all classification problems, the construction of an RDP is performed automatically and convergence is always guaranteed. Three methods for constructing RDP neural networks exist: Batch, Incremental, and Modular. The Batch method has been extensively tested, but no testing had previously been done on the Incremental and Modular methods. Unlike the Batch method, the complexity of these two methods is not NP-Complete. A study of the three methods is presented. This study highlights the main advantages and disadvantages of each method by comparing the levels of generalisation obtained when building RDP neural networks with each of the three methods. The networks were trained and tested using the following standard benchmark classification datasets: IRIS and SOYBEAN.
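The notion of linear separability used in the abstract can be illustrated with a short sketch. The Python snippet below is an illustrative assumption, not the authors' implementation: the function name linearly_separable and the use of SciPy's linprog are my own choices. It checks whether two finite point sets in R^d admit a separating hyperplane by posing the question as a linear-programming feasibility problem; separability tests of this kind are the basic building block that RDP-style constructions apply repeatedly when adding intermediate neurons.

# Minimal sketch (assumed, not from the paper): test linear separability of
# two finite point sets X and Y in R^d via LP feasibility.
import numpy as np
from scipy.optimize import linprog

def linearly_separable(X, Y):
    """Return True if there is a hyperplane w.x + b = 0 with
    w.x + b >= 1 for all x in X and w.y + b <= -1 for all y in Y."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    d = X.shape[1]
    # Decision variables: [w_1, ..., w_d, b].
    # w.x + b >= 1  ->  -(w.x + b) <= -1   (rows built from X)
    # w.y + b <= -1 ->    w.y + b  <= -1   (rows built from Y)
    A_ub = np.vstack([np.hstack([-X, -np.ones((len(X), 1))]),
                      np.hstack([ Y,  np.ones((len(Y), 1))])])
    b_ub = -np.ones(len(X) + len(Y))
    res = linprog(c=np.zeros(d + 1), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (d + 1), method="highs")
    return res.success

# Example: an AND-like split of {0,1}^2 is separable, an XOR-like split is not.
print(linearly_separable([[1, 1]], [[0, 0], [0, 1], [1, 0]]))        # True
print(linearly_separable([[0, 1], [1, 0]], [[0, 0], [1, 1]]))        # False

The margin of 1 in the constraints is without loss of generality for finite sets, since any strictly separating hyperplane can be rescaled to satisfy it.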
Keywords :
multilayer perceptrons; pattern classification; batch method; feed-forward multilayer neural network; incremental methods; modular methods; recursive deterministic perceptron; single layer perceptron topology; two-class classification problem; Benchmark testing; Convergence; Feedforward neural networks; Feedforward systems; Iris; Multi-layer neural network; Multilayer perceptrons; Network topology; Neural networks; Neurons;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Neural Networks, 2006. IJCNN '06. International Joint Conference on
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
Type :
conf
DOI :
10.1109/IJCNN.2006.246894
Filename :
1716324