DocumentCode
1850993
Title
Deterministic neuron: a model for faster learning
Author
Ahmed, Farid ; Awwal, Abdul Ahad S
Author_Institution
Dept. of Comput. Sci. & Eng., Wright State Univ., Dayton, OH, USA
fYear
1993
fDate
24-28 May 1993
Firstpage
845
Abstract
Training in most neural network architectures is currently done by updating the network's weights so as to reduce some error measure. The well-known backpropagation algorithm, among other training algorithms, uses this approach. This has been quite successful in mimicking the way biological neurons function, but the problems of slow learning and of getting trapped in local minima of the error-function domain deserve serious investigation. Various models have been proposed, with varying levels of success, to overcome these two problems. In this work, we propose a deterministic model of the neuron that guarantees faster learning by modifying the nonlinearity associated with each neuron. Only one such neuron is required to solve the generalized N-bit parity problem.
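The record does not state which nonlinearity the paper uses, but the claim that a single neuron can solve generalized N-bit parity is commonly realized with a periodic activation. A minimal sketch, assuming unit weights, zero bias, and a cosine nonlinearity (all illustrative choices, not necessarily the authors' model):

```python
import math
from itertools import product

def parity_neuron(bits):
    """One neuron: weighted sum (unit weights, zero bias)
    passed through a periodic nonlinearity cos(pi * s).
    Output is +1 for even parity, -1 for odd parity."""
    s = sum(bits)                # net input of the single neuron
    return math.cos(math.pi * s)

def parity(bits):
    """Map the neuron's output to the parity bit (0 = even, 1 = odd)."""
    return 0 if parity_neuron(bits) > 0 else 1

# The single neuron reproduces N-bit parity for every input pattern.
N = 4
for bits in product([0, 1], repeat=N):
    assert parity(list(bits)) == sum(bits) % 2
```

Because the periodicity of the activation encodes the alternating structure of parity directly, no hidden layer is needed; a conventional sigmoidal network would require hidden units growing with N for the same task.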
Keywords
deterministic automata; feedforward neural nets; learning (artificial intelligence); character recognition; deterministic model; error measures; feedforward network; generalized N-bit parity problem; learning speed increase; neural network architectures; nonlinearity; smart neurons; training algorithms; weights; Associative memory; Backpropagation algorithms; Biological system modeling; Character recognition; Computer architecture; Computer errors; Computer science; Neural networks; Neurons; Upper bound;
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the IEEE 1993 National Aerospace and Electronics Conference (NAECON 1993)
Conference_Location
Dayton, OH
Print_ISBN
0-7803-1295-3
Type
conf
DOI
10.1109/NAECON.1993.290833
Filename
290833
Link To Document