Title :
On exploiting sparsity in CANFIS neuro-fuzzy modular network learning by second-order stagewise backpropagation
Author_Institution :
Dept. of Ind. Manage., Nat. Taiwan Univ. of Sci. & Technol., Taipei, Taiwan
Abstract :
We describe efficient evaluation of the (global) Hessian matrix of the sum-squared-error measure for CANFIS neuro-fuzzy modular network learning. The network consists of multiple (local-expert) multilayer perceptrons (MLPs) mediated by fuzzy membership functions, leading to an iteratively reweighted nonlinear least squares problem. In this setting, we show how our second-order stagewise backpropagation procedure, recently developed for learning with a single MLP, efficiently exploits the sparsity of the Hessian matrix that arises in a multiple-response problem. Despite the network's complex modular architecture, the procedure performs this evaluation efficiently. This efficiency matters in practice: fast Hessian evaluation is crucial for implementing Newton-type second-order algorithms that can exploit negative curvature when the Hessian matrix is indefinite, as well as for Hessian analysis in any type of modular neural-network learning.
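Illustrative example (Python): the sketch below is not the authors' stagewise procedure; it only illustrates, under stated assumptions, the kind of block sparsity the abstract refers to. It uses a hypothetical two-expert toy model in which small MLP experts are gated by Gaussian membership functions, replaces the exact Hessian with a Gauss-Newton approximation, and uses forward-difference Jacobians. The names (expert_out, expert_jacobian, memberships) and all parameter values are illustrative assumptions, not quantities from the paper.

import numpy as np

rng = np.random.default_rng(0)

def memberships(x, centers, widths):
    # Normalized Gaussian membership values, one per local expert (assumed gating form).
    g = np.exp(-((x - centers) ** 2) / (2.0 * widths ** 2))
    return g / g.sum()

def expert_out(x, w):
    # Tiny 1-3-1 MLP local expert; w packs (w1, b1, w2, b2) as a flat vector.
    w1, b1, w2, b2 = w[:3], w[3:6], w[6:9], w[9]
    return float(w2 @ np.tanh(w1 * x + b1) + b2)

def expert_jacobian(xs, w, eps=1e-6):
    # Forward-difference Jacobian of one expert's outputs w.r.t. its OWN weights.
    # Its derivative w.r.t. any other expert's weights is structurally zero;
    # that zero block is the sparsity the assembly step below exploits.
    base = np.array([expert_out(x, w) for x in xs])
    J = np.zeros((len(xs), w.size))
    for j in range(w.size):
        wp = w.copy()
        wp[j] += eps
        J[:, j] = (np.array([expert_out(x, wp) for x in xs]) - base) / eps
    return J

# Toy setup (illustrative values): two local experts, 10 weights each,
# gated by Gaussian memberships over a scalar input.
xs = np.linspace(-2.0, 2.0, 25)
centers, widths = np.array([-1.0, 1.0]), np.array([1.0, 1.0])
weights = [rng.normal(size=10), rng.normal(size=10)]

M = np.array([memberships(x, centers, widths) for x in xs])  # (N, K) gate values
J = [expert_jacobian(xs, w) for w in weights]                # per-expert Jacobian blocks

# Blockwise Gauss-Newton curvature approximation:
#   H[k][l] = sum_n m_k(x_n) m_l(x_n) J_k[n, :]^T J_l[n, :]
# The structurally zero cross-expert Jacobian blocks are never formed or multiplied.
K = len(weights)
H = [[(M[:, [k]] * J[k]).T @ (M[:, [l]] * J[l]) for l in range(K)] for k in range(K)]
print("curvature block shapes:", [[blk.shape for blk in row] for row in H])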
Keywords :
Hessian matrices; backpropagation; fuzzy neural nets; multilayer perceptrons; CANFIS neuro-fuzzy modular network learning; Hessian matrix; Newton-type second-order algorithms; multiple multilayer perceptrons; multiple-response problem; neural-network learning; second-order stagewise backpropagation; sum-squared-error measure; Backpropagation; Education; Fuzzy systems; Jacobian matrices; Joints; Neural networks; Sparse matrices;
Conference_Titel :
2011 Eighth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD)
Conference_Location :
Shanghai
Print_ISBN :
978-1-61284-180-9
DOI :
10.1109/FSKD.2011.6019560