Title :
Hardware implementation of BFNN and RBFNN in FPGA technology: Quantization issues
Author :
Krid, Mohamed ; Masmoudi, Dorra Sellami ; Chtourou, Mohamed
Author_Institution :
Dept. of Electr. Eng., Nat. Sch. of Eng. of Sfax, Sfax
Abstract :
Hardware implementation of neural networks is often required to exploit their inherent parallelism and to minimize computing time so as to meet real-time application requirements. This work describes hardware implementation issues of neural networks in an FPGA environment. Without loss of generality, two examples of neural networks are considered: a backpropagation feedforward neural network (BFNN) and a radial basis function neural network (RBFNN). Although local quantization adds extra complexity to the design task, it yields smaller quantization errors than global quantization. It was found that the RBFNN is more sensitive to quantization effects; preserving acceptable design accuracy therefore suggests increasing the size of its hidden layer. The hardware implementation relies on a sequential approach with pipelining in order to achieve the best compromise between speed and silicon area. The proposed design methodology was applied to an illustrative example with a sine input-output function. The BFNN yielded a more compact implementation on the FPGA circuit than the RBFNN.
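As a rough illustration of the global-versus-local quantization trade-off mentioned in the abstract (not the authors' implementation), the following Python sketch compares a single global fixed-point format against per-parameter-set (local) formats for a small, randomly initialized 1-8-1 feedforward network. The 10-bit word length, the network topology, and the helper names quantize and frac_bits_for are illustrative assumptions.

import numpy as np

def quantize(w, frac_bits):
    # Round to a signed fixed-point grid with the given number of fractional bits.
    scale = 2.0 ** frac_bits
    return np.round(w * scale) / scale

def frac_bits_for(params, total_bits=10):
    # Fractional bits left after reserving sign and integer bits for the largest magnitude.
    m = max(np.max(np.abs(p)) for p in params)
    int_bits = max(1, int(np.ceil(np.log2(m))) + 1)
    return total_bits - int_bits

def forward(x, W1, b1, W2, b2):
    # 1-8-1 feedforward network: tanh hidden layer, linear output.
    h = np.tanh(W1 @ x[None, :] + b1[:, None])
    return (W2 @ h + b2[:, None]).ravel()

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)
params = (W1, b1, W2, b2)

x = np.linspace(0.0, np.pi, 200)
ref = forward(x, *params)  # full-precision reference output

# Global quantization: one fixed-point format sized for the worst-case magnitude overall.
fb_global = frac_bits_for(params)
global_q = [quantize(p, fb_global) for p in params]

# Local quantization: a separate format per parameter set, sized for its own dynamic range.
local_q = [quantize(p, frac_bits_for([p])) for p in params]

print("global max output error:", np.max(np.abs(forward(x, *global_q) - ref)))
print("local  max output error:", np.max(np.abs(forward(x, *local_q) - ref)))

Because each local format spends its fractional bits where that parameter set's range actually lies, its maximum output deviation is no larger than the global one, mirroring the error behaviour reported above, at the cost of tracking a different format per layer in the datapath.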
Keywords :
backpropagation; field programmable gate arrays; logic design; neural nets; FPGA technology; backpropagation feedforward neural network; hardware implementation; minimum quantization error; neural networks hardware implementation; sequential approach; Backpropagation; Computer networks; Concurrent computing; Feedforward neural networks; Field programmable gate arrays; Neural network hardware; Neural networks; Parallel processing; Pipelines; Quantization;
Conference_Titel :
2005 12th IEEE International Conference on Electronics, Circuits and Systems (ICECS 2005)
Conference_Location :
Gammarth, Tunisia
Print_ISBN :
978-9972-61-100-1
Electronic_ISBN :
978-9972-61-100-1
DOI :
10.1109/ICECS.2005.4633500