DocumentCode
911334
Title
Sensitivity analysis of multilayer perceptron with differentiable activation functions
Author
Choi, Jin Young; Choi, Chong-Ho
Author_Institution
Dept. of Control & Instrum. Eng., Seoul Nat. Univ., South Korea
Volume
3
Issue
1
fYear
1992
fDate
1/1/1992
Firstpage
101
Lastpage
107
Abstract
In a neural network, many different sets of connection weights can approximately realize a given input-output mapping, and the sensitivity of the network varies depending on which set of weights is used. To select weights with lower sensitivity or to estimate output perturbations in an implementation, it is important to measure the sensitivity with respect to the weights. A sensitivity measure that depends on the weight set of a single-output multilayer perceptron (MLP) with differentiable activation functions is proposed. Formulas are derived to compute the sensitivity arising from additive/multiplicative weight perturbations or input perturbations for a specific input pattern. The concept of sensitivity is then extended so that it applies to any input pattern, and several sensitivity measures for the multiple-output MLP are suggested. To verify the validity of the proposed sensitivities, computer simulations were performed; theoretical and simulated results agree well for small weight perturbations.
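The abstract describes sensitivities derived from the derivatives of differentiable activation functions and verified against simulated weight perturbations. The following is a minimal sketch of that general idea, not the paper's exact formulas: a first-order (gradient-based) estimate of the output change of a single-output, one-hidden-layer MLP with tanh activations under a small additive weight perturbation, compared with the directly simulated output change. All network sizes, variable names, and the perturbation scale are illustrative assumptions.

    # Minimal sketch (assumed setup, not the paper's formulas): first-order
    # sensitivity of a single-output MLP with tanh activations to additive
    # weight perturbations, checked against direct simulation.
    import numpy as np

    rng = np.random.default_rng(0)

    # One hidden layer: x -> tanh(W1 x + b1) -> w2 . h + b2
    n_in, n_hid = 4, 8
    W1 = rng.normal(size=(n_hid, n_in))
    b1 = rng.normal(size=n_hid)
    w2 = rng.normal(size=n_hid)
    b2 = rng.normal()

    def forward(x, W1, b1, w2, b2):
        h = np.tanh(W1 @ x + b1)
        return w2 @ h + b2

    def output_gradients(x, W1, b1, w2, b2):
        """Analytic dy/dW1 and dy/dw2 for the single output (chain rule)."""
        z1 = W1 @ x + b1
        h = np.tanh(z1)
        # dy/dw2[i] = h[i];  dy/dW1[i, j] = w2[i] * (1 - tanh(z1[i])**2) * x[j]
        dW1 = (w2 * (1.0 - h ** 2))[:, None] * x[None, :]
        return dW1, h

    x = rng.normal(size=n_in)
    dW1, dw2 = output_gradients(x, W1, b1, w2, b2)

    # Small additive weight perturbation (input perturbations are analogous):
    # compare the first-order Taylor estimate with the simulated output change.
    eps = 1e-3
    dW1_pert = eps * rng.normal(size=W1.shape)
    dw2_pert = eps * rng.normal(size=w2.shape)
    predicted = np.sum(dW1 * dW1_pert) + np.sum(dw2 * dw2_pert)
    actual = (forward(x, W1 + dW1_pert, b1, w2 + dw2_pert, b2)
              - forward(x, W1, b1, w2, b2))
    print(f"first-order estimate: {predicted:+.6f}   simulated change: {actual:+.6f}")

For small perturbations the two values should nearly coincide, mirroring the agreement between theory and simulation reported in the abstract; for larger perturbations the first-order estimate degrades.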
Keywords
neural nets; perturbation theory; sensitivity analysis; additive/multiplicative weight perturbations; computer simulations; connection weights; differentiable activation functions; input pattern; input-output mapping; multilayer perceptron; neural network; output perturbations; sensitivity; Computational modeling; Computer simulation; Instruments; Multilayer perceptrons; Neural network hardware; Neural networks; Neurons; Sensitivity analysis;
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/72.105422
Filename
105422