Title :
Approximation by fully complex MLP using elementary transcendental activation functions
Author :
Kim, Taehwan ; Adali, Tülay
Abstract :
Recently, we presented 'fully' complex multi-layer perceptrons (MLPs) using a subset of complex elementary transcendental functions as the nonlinear activation functions. These functions jointly process the in-phase (I) and quadrature (Q) components of the data, while taking full advantage of well-defined gradients in error back-propagation. The characteristics of these elementary transcendental functions are categorized, and their common almost everywhere (a.e.) bounded and analytic properties are investigated. More importantly, it is proved that fully complex MLPs are a.e. convergent and are therefore capable of universally approximating any nonlinear complex mapping to arbitrary accuracy. Numerical examples demonstrate the benefit of the isolated essential singularities included in a subgroup of elementary transcendental functions for achieving arbitrarily close approximation of the desired mapping.
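Code_Sketch :
The following is a minimal sketch, not the authors' implementation, of the forward pass of a fully complex MLP: weights, biases, and data are complex-valued, and the hidden nonlinearity is a complex elementary transcendental function (here the complex tanh, which is analytic everywhere except for isolated singularities at z = j(pi/2 + k*pi)). All layer sizes, names, and the initialization scheme are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def complex_init(fan_in, fan_out):
    # Illustrative small complex initialization: independent real and
    # imaginary parts, scaled by 1/sqrt(fan_in).
    scale = 1.0 / np.sqrt(fan_in)
    return scale * (rng.standard_normal((fan_in, fan_out))
                    + 1j * rng.standard_normal((fan_in, fan_out)))

W1 = complex_init(2, 8)            # input -> hidden (sizes are assumptions)
b1 = np.zeros(8, dtype=complex)
W2 = complex_init(8, 1)            # hidden -> output
b2 = np.zeros(1, dtype=complex)

def forward(z):
    # z: (batch, 2) complex array; the I and Q components enter jointly
    # as the real and imaginary parts of each complex input.
    h = np.tanh(z @ W1 + b1)       # complex tanh processes I and Q jointly
    return h @ W2 + b2

x = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
print(forward(x))

Because tanh is analytic a.e. on the complex plane, the gradients used in complex error back-propagation are well defined, which is the property the abstract highlights; a split real/imaginary activation would forgo this joint I/Q processing.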
Keywords :
approximation theory; backpropagation; integration; multilayer perceptrons; transfer functions; analytic properties; arbitrarily close approximation; arbitrary accuracy; common almost everywhere properties; complex elementary transcendental functions; complex multi-layer perceptrons; elementary transcendental activation functions; elementary transcendental functions; error back-propagation; fully complex MLP; in-phase; isolated essential singularity; nonlinear activation functions; nonlinear complex mapping; quadrature components; universal approximation; Algorithm design and analysis; Independent component analysis; Neural networks; Neurons
Conference_Title :
Neural Networks for Signal Processing XI, 2001. Proceedings of the 2001 IEEE Signal Processing Society Workshop
Conference_Location :
North Falmouth, MA
Print_ISBN :
0-7803-7196-8
DOI :
10.1109/NNSP.2001.943125