Title : 
An analog Gaussian synapse for artificial neural networks
            Author : 
Lee, S.T. ; Lau, K.T.
            Author_Institution : 
Sch. of Electr. & Electron. Eng., Nanyang Technol. Univ., Singapore
            Abstract : 
Using a normalized Gaussian function in feedforward neural networks with a single hidden layer has been proven to provide universal approximation capability. Back-propagation neural networks with Gaussian-function synapses show better convergence than those with linear multiplying synapses. A compact analog Gaussian synapse is presented in this paper. The standard deviation and the magnitude of the proposed Gaussian synapse can be programmed externally.
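The synapse described in the abstract can be modeled by a Gaussian transfer function with externally programmable spread and magnitude. The sketch below is an illustrative behavioral model only, not the paper's circuit; the function name, the `center` parameter, and the exact parameterization are assumptions.

```python
import math

def gaussian_synapse(x, center, sigma, magnitude):
    """Behavioral model of a Gaussian synapse (illustrative, not the
    paper's CMOS implementation): the output peaks at x == center,
    `sigma` sets the standard deviation (spread) and `magnitude`
    scales the peak value -- the two externally programmable knobs.
    """
    return magnitude * math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

# A wider sigma gives a broader response around the center;
# a larger magnitude raises the peak output.
```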
            Keywords : 
CMOS analogue integrated circuits; analogue processing circuits; backpropagation; convergence; feedforward neural nets; neural chips; transfer functions; CMOS analogue ANN; analog Gaussian synapse; artificial neural networks; backpropagation neural networks; convergence; feedforward neural networks; normalized Gaussian function; single hidden layer; universal approximation; Artificial neural networks; Circuits; Convergence; Differential amplifiers; Feedforward neural networks; Microelectronics; Mirrors; Neural networks; Neurons; Transconductance; Variable structure systems; Voltage;
            Conference_Title : 
Proceedings of the 38th Midwest Symposium on Circuits and Systems, 1995
            Print_ISBN : 
0-7803-2972-4
            DOI : 
10.1109/MWSCAS.1995.504382