Title :
A backpropagation neural network design using adder-only arithmetic
Author :
Mahoney, Vicente ; Elhanany, Itamar
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Tennessee Univ., Knoxville, TN
Abstract :
Feedforward and recurrent neural networks have enjoyed great popularity in machine learning. Hardware implementations of these networks, which often use the backpropagation algorithm for learning, offer advantages over software implementations, chiefly the speedup gained by exploiting the inherent parallelism of neural networks. However, hardware designs introduce tradeoffs among speed, area, precision, and other factors. Moreover, many of the required calculations involve multiplications, which are costly and complex to implement in custom hardware. This paper proposes a piecewise linear approximation architecture that replaces multiplications with a series of shifts and additions. Combined with the same approximation for the nonlinear activation function and its derivative, this yields a neural network realization in which all arithmetic and function computations are carried out using adders only.
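The idea in the abstract can be illustrated with a minimal Python sketch. This is not the paper's exact scheme: here a multiplication is approximated by quantizing one operand to a sum of a few signed powers of two (so the product reduces to shifted copies of the other operand), and the activation is the well-known PLAN piecewise-linear sigmoid, whose segment slopes (1/4, 1/8, 1/32) are powers of two and therefore shift-friendly in hardware. Function names and the two-term limit are illustrative assumptions.

```python
import math

def shift_add_mul(x, w, terms=2):
    # Approximate x * w using only shifts and adds: quantize the
    # integer weight w to a sum of up to `terms` signed powers of two,
    # then accumulate shifted copies of x. Hypothetical sketch, not
    # the paper's exact quantization scheme.
    acc = 0
    residual = w
    for _ in range(terms):
        if residual == 0:
            break
        sign = 1 if residual > 0 else -1
        k = round(math.log2(abs(residual)))  # nearest power of two
        acc += sign * (x << k)               # one shift + one add
        residual -= sign * (1 << k)
    return acc

def pwl_sigmoid(x):
    # PLAN piecewise-linear sigmoid approximation; every segment slope
    # is a power of two, so y = slope*|x| + offset needs only a shift
    # and an add in fixed-point hardware.
    a = abs(x)
    if a < 1.0:
        y = 0.25 * a + 0.5
    elif a < 2.375:
        y = 0.125 * a + 0.625
    elif a < 5.0:
        y = 0.03125 * a + 0.84375
    else:
        y = 1.0
    return y if x >= 0 else 1.0 - y
```

For weights that happen to be sums of two powers of two, the shift-add product is exact (e.g. `shift_add_mul(10, 6)` gives `60`, since 6 = 8 - 2); otherwise it is an approximation whose error shrinks as `terms` grows.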
Keywords :
approximation theory; backpropagation; digital arithmetic; feedforward neural nets; neural chips; nonlinear functions; recurrent neural nets; adder-only arithmetic; backpropagation algorithm; backpropagation neural network design; feedforward neural networks; machine learning; nonlinear activation function approximation; piecewise linear approximation architecture; recurrent neural networks; Arithmetic; Backpropagation algorithms; Computer architecture; Computer networks; Machine learning; Neural network hardware; Neural networks; Parallel processing; Piecewise linear approximation; Recurrent neural networks;
Conference_Title :
Circuits and Systems, 2008. MWSCAS 2008. 51st Midwest Symposium on
Conference_Location :
Knoxville, TN
Print_ISBN :
978-1-4244-2166-4
Electronic_ISBN :
1548-3746
DOI :
10.1109/MWSCAS.2008.4616944