Title :
Capacity of several neural networks with respect to digital adder and multiplier
Author :
Biederman, Daniel C. ; Ososanya, Esther
Author_Institution :
Dept. of Electr. Eng., Tennessee Technol. Univ., Cookeville, TN, USA
Abstract :
Many neural network designers are curious about the capacity of a neural network. If they knew more about the capacity of neural networks, they would have an easier time deciding which neural network architecture to use, as well as how many hidden neurons are needed for a neural network to perform a given function. Knowing the best architecture saves training time and allows for less expensive circuitry. This paper reviews which neural network architectures are necessary to perform certain functions. Two neural network architectures are considered: (1) a multilayer, multiple-output feedforward network, and (2) a multilayer, single-output feedforward network. The objectives are to examine the advantages of each network and to confirm or refute the following claim: that a network with a single bit per output uses fewer neurons overall to perform the same function as a multilayered network. The problems studied include a multiple-bit digital adder and a multiple-bit digital multiplier. The problem size ranges from 1 bit to 4 bits.
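The abstract (and the keyword list below) indicates the networks are multilayer feedforward nets trained by backpropagation on digital-adder truth tables. A minimal sketch of that setup, for the 1-bit full adder only, is given here; the hidden-layer size (8), learning rate, sigmoid activation, and squared-error objective are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch (not the authors' implementation): train a small
# multilayer feedforward network with plain backpropagation to learn a
# 1-bit full adder (inputs a, b, carry-in; outputs sum, carry-out).
import numpy as np

rng = np.random.default_rng(0)

# Truth table of the 1-bit full adder: all 8 input combinations.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)],
             dtype=float)
Y = np.array([[(a + b + c) % 2, (a + b + c) // 2]
              for a, b, c in X.astype(int)], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 neurons, 2 output neurons (sum and carry-out).
W1 = rng.normal(0.0, 1.0, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 2)); b2 = np.zeros(2)

def forward(inputs):
    H = sigmoid(inputs @ W1 + b1)       # hidden-layer activations
    return H, sigmoid(H @ W2 + b2)      # network outputs

def sq_error():
    return float(((forward(X)[1] - Y) ** 2).sum())

loss_init = sq_error()
lr = 0.5
for _ in range(30000):
    H, O = forward(X)
    dO = (O - Y) * O * (1 - O)          # output-layer delta (squared error)
    dH = (dO @ W2.T) * H * (1 - H)      # backpropagated hidden-layer delta
    W2 -= lr * H.T @ dO; b2 -= lr * dO.sum(axis=0)
    W1 -= lr * X.T @ dH; b1 -= lr * dH.sum(axis=0)
loss_final = sq_error()

pred = (forward(X)[1] > 0.5).astype(int)
print("final error:", loss_final,
      "correct rows:", int((pred == Y).all(axis=1).sum()), "of 8")
```

The same loop extends to the paper's larger cases by widening the input/output layers (e.g. two 4-bit operands in, a 5-bit sum out), or by training one single-output network per output bit, which is the alternative architecture the paper compares.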
Keywords :
adders; feedforward neural nets; learning (artificial intelligence); multilayer perceptrons; multiplying circuits; neural net architecture; 1 to 4 bit; circuitry; functions; hidden neurons; multilayer multiple output feedforward network; multilayer single output feedforward network; multiple bit digital adder; multiple bit digital multiplier; neural network architecture; neural network capacity; training; Adders; Backpropagation algorithms; Circuits; Feedforward neural networks; Microprocessors; Multi-layer neural network; Neural networks; Neurons; Space technology; Very large scale integration;
Conference_Titel :
Proceedings of the Twenty-Seventh Southeastern Symposium on System Theory, 1995
Conference_Location :
Starkville, MS
Print_ISBN :
0-8186-6985-3
DOI :
10.1109/SSST.1995.390564