Title :
Using function approximation to analyze the sensitivity of MLP with antisymmetric squashing activation function
Author :
Yeung, Daniel S. ; Sun, Xuequan
Author_Institution :
Dept. of Comput., Hong Kong Univ., China
fDate :
1/1/2002
Abstract :
Sensitivity analysis of a neural network is usually performed after the network has been designed and trained; few studies have treated it as a critical issue prior to network design. Piché's statistical method (1992, 1995) is useful for multilayer perceptron (MLP) design, but it imposes overly severe limitations on both input and weight perturbations. This paper attempts to generalize Piché's method by deriving a universal expression of MLP sensitivity for antisymmetric squashing activation functions, without any restriction on input and output perturbations. Experimental results based on a three-layer MLP with 30 nodes per layer agree closely with our theoretical investigations. The effects of network design parameters such as the number of layers, the number of neurons per layer, and the chosen activation function are analyzed, providing useful information for network design decision-making. Based on the sensitivity analysis of the MLP, we present a network design method that, for a given application, determines the network structure and estimates the permitted weight range for network training.
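The sketch below is an illustrative Monte Carlo estimate of MLP output deviation under small input and weight perturbations, matching the experimental configuration mentioned in the abstract (three layers, 30 neurons per layer, an antisymmetric squashing activation such as tanh). It is not the paper's analytic derivation; the perturbation magnitudes (sigma_x, sigma_w), the uniform weight initialization range, and the helper names are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_output(x, weights, act=np.tanh):
    """Forward pass through an MLP using an antisymmetric squashing activation."""
    h = x
    for W in weights:
        h = act(W @ h)
    return h

# Network roughly matching the experimental setup in the abstract:
# three layers with 30 neurons per layer.
# (Weight range and distribution are assumptions, not the paper's values.)
n = 30
weights = [rng.uniform(-1.0, 1.0, size=(n, n)) for _ in range(3)]

def empirical_sensitivity(weights, sigma_x=0.01, sigma_w=0.01, trials=2000):
    """Monte Carlo estimate of E[ ||y(x + dx, W + dW) - y(x, W)|| ]
    over random inputs and small Gaussian input/weight perturbations."""
    devs = []
    for _ in range(trials):
        x = rng.uniform(-1.0, 1.0, size=n)
        y0 = mlp_output(x, weights)
        dx = rng.normal(0.0, sigma_x, size=n)
        perturbed = [W + rng.normal(0.0, sigma_w, size=W.shape) for W in weights]
        y1 = mlp_output(x + dx, perturbed)
        devs.append(np.linalg.norm(y1 - y0))
    return float(np.mean(devs))

print(empirical_sensitivity(weights))
```

Repeating the estimate while varying the number of layers, the layer width, or the activation function gives an empirical counterpart to the design-parameter effects the paper analyzes theoretically.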
Keywords :
function approximation; multilayer perceptrons; sensitivity analysis; MLP design; MLP sensitivity analysis; antisymmetric squashing activation function; multilayer perceptron design; statistical method; decision making; design methodology; information analysis; neural networks; neurons; statistical analysis
Journal_Title :
IEEE Transactions on Neural Networks