DocumentCode :
13973
Title :
A New Gradient Descent Approach for Local Learning of Fuzzy Neural Models
Author :
Wanqing Zhao ; Kang Li ; George W. Irwin
Author_Institution :
Sch. of Electron., Electr. Eng. & Comput. Sci., Queen's Univ. Belfast, Belfast, UK
Volume :
21
Issue :
1
fYear :
2013
fDate :
Feb. 2013
Firstpage :
30
Lastpage :
44
Abstract :
Most learning methods reported to date for Takagi-Sugeno-Kang (TSK) fuzzy neural models focus mainly on improving accuracy. However, a key design requirement in building an interpretable fuzzy model is that each rule consequent must match the local behavior of the system well when all the rules are aggregated to produce the overall system output; this is one of the characteristics that distinguishes fuzzy models from black-box models such as neural networks. Finding a desirable set of fuzzy partitions, and hence identifying the corresponding consequent models that can be explained directly in terms of system behavior, is therefore a critical step in fuzzy neural modeling. In this paper, a new learning approach is proposed that considers both the nonlinear parameters in the rule premises and the linear parameters in the rule consequents. Unlike the conventional two-stage optimization procedure widely practiced in the field, in which the two sets of parameters are optimized separately, the consequent parameters are transformed into a set dependent on the premise parameters, enabling an integrated gradient descent learning approach. A new Jacobian matrix is then proposed and computed efficiently, yielding a more accurate approximation of the cost function when the second-order Levenberg-Marquardt optimization method is applied. Several further interpretability issues regarding the fuzzy neural model are also discussed and integrated into the new learning approach. Numerical examples illustrate the resulting structure of the fuzzy neural models and the effectiveness of the proposed algorithm, and the results are compared with those of several well-known methods.
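The integrated premise/consequent optimization described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: all function names are assumptions, the consequent parameters are expressed as a least-squares function of the premise parameters (the "dependent set" idea), and a finite-difference Jacobian stands in for the paper's analytically derived one.

```python
import numpy as np

def residuals(p, x, y, n_rules, d):
    """Model residuals as a function of premise parameters only.

    p packs Gaussian membership centers then widths, each (n_rules, d).
    The linear consequent parameters are recovered by least squares,
    making them a dependent function of the premise parameters.
    """
    centers = p[: n_rules * d].reshape(n_rules, d)
    widths = p[n_rules * d :].reshape(n_rules, d)
    # rule firing strengths from Gaussian membership functions
    phi = np.exp(-((x[:, None, :] - centers[None]) ** 2
                   / (2 * widths[None] ** 2)).sum(-1))
    w = phi / phi.sum(axis=1, keepdims=True)            # normalized activations
    X1 = np.concatenate([np.ones((len(x), 1)), x], 1)   # affine consequent regressors
    P = (w[:, :, None] * X1[:, None, :]).reshape(len(x), -1)
    theta, *_ = np.linalg.lstsq(P, y, rcond=None)       # dependent consequents
    return P @ theta - y

def levenberg_marquardt(p, x, y, n_rules, d, iters=50, lam=1e-2, eps=1e-6):
    """Second-order LM update on the premise parameters alone."""
    for _ in range(iters):
        r = residuals(p, x, y, n_rules, d)
        # finite-difference Jacobian of residuals w.r.t. premise parameters
        # (the paper derives this Jacobian analytically and efficiently)
        J = np.empty((len(r), len(p)))
        for j in range(len(p)):
            dp = np.zeros_like(p)
            dp[j] = eps
            J[:, j] = (residuals(p + dp, x, y, n_rules, d) - r) / eps
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(p)), -J.T @ r)
        p_new = p + step
        if np.sum(residuals(p_new, x, y, n_rules, d) ** 2) < np.sum(r ** 2):
            p, lam = p_new, lam * 0.7   # accept step, relax damping
        else:
            lam *= 2.0                  # reject step, increase damping
    return p
```

Because the consequents are re-solved inside `residuals`, every LM step on the premise parameters implicitly updates both parameter sets at once, in contrast to the conventional two-stage procedure.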
Keywords :
Jacobian matrices; fuzzy neural nets; gradient methods; learning (artificial intelligence); Jacobian matrix; TSK; Takagi-Sugeno-Kang; black-box models; distinctive characteristics; fuzzy neural models; integrated gradient descent learning approach; local learning; new gradient descent approach; second-order Levenberg-Marquardt optimization method; Computational modeling; Cost function; Fuzzy neural networks; Jacobian matrices; Vectors; Fuzzy systems; Takagi–Sugeno–Kang (TSK) model; gradient descent; interpretability; local learning;
fLanguage :
English
Journal_Title :
IEEE Transactions on Fuzzy Systems
Publisher :
IEEE
ISSN :
1063-6706
Type :
jour
DOI :
10.1109/TFUZZ.2012.2200900
Filename :
6203571