Title :
A novel global training algorithm and its convergence theorem for fuzzy neural networks
Author :
Liang-Jie Zhang ; Yan-Da Li ; Hui-Min Chen
Author_Institution :
Dept. of Autom., Tsinghua Univ., Beijing, China
Abstract :
In this paper, a new global optimization algorithm that combines a modified quasi-Newton method with an improved genetic algorithm is proposed to find the global minimum of the total error function of a fuzzy neural network. Within the modified quasi-Newton model, a global linear search algorithm based on fuzzy logic and combinatorial interpolation techniques is developed. It is shown that the algorithm converges to a global minimum with probability 1 over a compact region of the weight vector space. Computer simulation results further show that the algorithm has better convergence properties and significantly reduces the number of global searches.
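The sketch below illustrates, in very rough form, the kind of hybrid scheme the abstract describes: a genetic algorithm explores a compact weight region globally, and a quasi-Newton step locally refines the best candidate. It does not reproduce the paper's fuzzy-logic-based linear search or combinatorial interpolation; the error function, population sizes, and all parameter names are illustrative assumptions only.

```python
# Hypothetical sketch of a hybrid GA + quasi-Newton minimizer (not the paper's method).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def total_error(w):
    """Toy multimodal stand-in for a fuzzy neural network's total error function."""
    return np.sum(w**2) + 2.0 * np.sum(np.sin(3.0 * w)**2)

def quasi_newton_refine(w0):
    """Local refinement with BFGS, standing in for the modified quasi-Newton step."""
    res = minimize(total_error, w0, method="BFGS")
    return res.x, res.fun

def genetic_search(pop, n_gen=30, sigma=0.3):
    """Very small GA: keep the better half, blend crossover, Gaussian mutation."""
    for _ in range(n_gen):
        fitness = np.array([total_error(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: len(pop) // 2]]
        a = parents[rng.integers(len(parents), size=len(pop))]
        b = parents[rng.integers(len(parents), size=len(pop))]
        alpha = rng.random((len(pop), 1))
        pop = alpha * a + (1.0 - alpha) * b           # blend crossover
        pop += rng.normal(0.0, sigma, pop.shape)      # mutation
    return pop

def hybrid_train(dim=4, pop_size=20, rounds=5):
    """Alternate global GA exploration with local quasi-Newton refinement."""
    pop = rng.uniform(-2.0, 2.0, (pop_size, dim))     # compact region of weight space
    best_w, best_e = None, np.inf
    for _ in range(rounds):
        pop = genetic_search(pop)
        candidate = min(pop, key=total_error)         # best individual from the GA
        w, e = quasi_newton_refine(candidate)
        if e < best_e:
            best_w, best_e = w, e
    return best_w, best_e

if __name__ == "__main__":
    w, e = hybrid_train()
    print("best error:", e)
```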
Keywords :
Newton method; backpropagation; convergence of numerical methods; error analysis; fuzzy logic; fuzzy neural nets; genetic algorithms; interpolation; combinatorial interpolation; convergence theorem; error function; fuzzy neural networks; genetic algorithm; global linear search algorithm; global training algorithm; probability; quasi-Newton method; weight vector space; Computer simulation; Convergence; Fuzzy control; Fuzzy logic; Fuzzy neural networks; Fuzzy reasoning; Genetic algorithms; Input variables; Interpolation; Optimization methods;
Conference_Titel :
Proceedings of the IEEE International Conference on Neural Networks, 1995
Conference_Location :
Perth, WA, Australia
Print_ISBN :
0-7803-2768-3
DOI :
10.1109/ICNN.1995.487557