Title :
Approximation of activation functions for vector equalization based on recurrent neural networks
Author :
Mostafa, Mohamad ; Teich, Werner G. ; Lindner, Jürgen
Author_Institution :
Institute of Communications and Navigation, German Aerospace Center (DLR), Wessling, Germany
Abstract :
Activation functions are an essential element of all neural network structures. Because of their nonlinear characteristics, they decisively influence the overall behavior of a neural network. Discrete- and continuous-time recurrent neural networks are a special class of neural networks. They have been shown to perform vector equalization without the need for a training phase because, under specific conditions, they are Lyapunov stable. In this case, the activation function depends on the symbol alphabet and is computationally complex to evaluate. In addition, numerical instability can occur during the evaluation. There is thus a need for a computationally less complex and numerically stable evaluation. For the continuous-time recurrent neural network in particular, the evaluation must be suitable for an analog implementation. In this paper, we introduce an approximation of the activation function for vector equalization with recurrent neural networks. The activation function is approximated as a sum of shifted hyperbolic tangent functions, which can easily be realized in analog form by a differential amplifier. Based on our ongoing research in this field, the analog implementation of vector equalization with recurrent neural networks is expected to improve the power/speed ratio by several orders of magnitude compared with a digital one.
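Code_Sketch :
The following is a minimal numerical sketch of the idea summarized in the abstract, not the authors' implementation. The closed form chosen here for the alphabet-dependent activation function (a posterior-mean nonlinearity over a 4-ASK alphabet under Gaussian noise), the noise variance, and the particular sum-of-shifted-tanh ansatz are all illustrative assumptions; the paper itself defines the exact functions used.

    # Sketch (illustrative assumptions): approximate an alphabet-dependent
    # activation function by a sum of shifted hyperbolic tangent functions.
    import numpy as np
    from scipy.optimize import curve_fit

    alphabet = np.array([-3.0, -1.0, 1.0, 3.0])  # assumed 4-ASK symbol alphabet
    sigma2 = 0.5                                 # assumed noise variance

    def activation_exact(u):
        """Posterior-mean nonlinearity E[a | u] for Gaussian noise, evaluated
        with a max-shift in the exponent to avoid the numerical overflow the
        abstract warns about."""
        z = (np.outer(u, alphabet) - 0.5 * alphabet**2) / sigma2
        z -= z.max(axis=1, keepdims=True)  # stabilize exp() against overflow
        w = np.exp(z)
        return (w @ alphabet) / w.sum(axis=1)

    def activation_approx(u, a1, a2, s, b):
        """Sum of shifted tanh branches: one centered branch plus a symmetric
        pair, matching the odd symmetry of the alphabet. Each tanh branch can
        be realized by a differential-amplifier stage in analog hardware."""
        return a1 * np.tanh(b * u) + a2 * (np.tanh(b * (u - s)) + np.tanh(b * (u + s)))

    # Fit the tanh sum to the exact nonlinearity on a grid of input values.
    u = np.linspace(-6.0, 6.0, 400)
    popt, _ = curve_fit(activation_approx, u, activation_exact(u), p0=[1.0, 1.0, 2.0, 1.0])
    max_err = np.max(np.abs(activation_exact(u) - activation_approx(u, *popt)))
    print("fitted (a1, a2, s, b):", popt, "max abs error:", max_err)

Note that the fitted approximation replaces the exponential evaluations of the exact nonlinearity with a fixed number of tanh evaluations, which is both numerically stable and directly implementable in analog circuitry.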
Keywords :
Lyapunov matrix equations; approximation theory; computational complexity; differential amplifiers; equalisers; learning (artificial intelligence); recurrent neural nets; telecommunication computing; transfer functions; Lyapunov stable; activation functions; analog implementation; computation complexity; continuous-time recurrent neural networks; differential amplifier; power/speed ratio; shifted hyperbolic tangent functions; training phase; vector equalization; Approximation methods; Equalizers; Neurons; Optimized production technology; Recurrent neural networks; Vectors
Conference_Title :
2014 8th International Symposium on Turbo Codes and Iterative Information Processing (ISTC)
Conference_Location :
Bremen, Germany
DOI :
10.1109/ISTC.2014.6955084