Title :
Optimization Research on Artificial Neural Network Model
Author :
Huanping, Zhao ; Congying, Lv ; Xinfeng, Yang
Author_Institution :
Dept. of Comput. Sci. & Technol., Nanyang Inst. of Technol., Nanyang, China
Abstract :
Optimization of the artificial neural tree network model (NTNM) is divided into two parts: optimizing the topology structure and optimizing the parameters. For topology optimization, a building-block-library based genetic programming algorithm, an anarchical variable probability vector based probabilistic incremental program evolution algorithm, and a tree-encoding based particle swarm optimization algorithm are proposed. These algorithms effectively reduce the number of invalid individuals generated during the evolution process and improve the convergence speed and error precision of the NTNM. For parameter optimization, the differential evolution algorithm is introduced; it has few control parameters, is easy to implement, and is unlikely to fall into local minima, which makes it well suited to optimizing the parameters.
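The abstract names differential evolution as the parameter optimizer. The sketch below illustrates a standard DE/rand/1/bin loop applied to a generic continuous parameter vector; the fitness function, bounds, and control settings (F, CR, population size) are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of differential evolution (DE/rand/1/bin) for tuning the
# continuous parameters of a fixed model structure (e.g., a neural tree).
# Fitness function, bounds, and hyperparameters below are assumed for
# illustration only.
import numpy as np

def differential_evolution(fitness, dim, bounds=(-1.0, 1.0),
                           pop_size=30, F=0.5, CR=0.9, max_gen=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))   # initial population
    fit = np.array([fitness(x) for x in pop])         # evaluate every member

    for _ in range(max_gen):
        for i in range(pop_size):
            # pick three distinct individuals, all different from the target i
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lo, hi)

            # binomial crossover mixes the mutant and the target vector
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True            # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])

            # greedy selection: the better of trial and target survives
            f_trial = fitness(trial)
            if f_trial < fit[i]:
                pop[i], fit[i] = trial, f_trial

    best = int(np.argmin(fit))
    return pop[best], fit[best]

if __name__ == "__main__":
    # Toy usage: recover a target parameter vector by minimising squared error.
    target = np.array([0.3, -0.7, 0.1])
    sse = lambda w: float(np.sum((w - target) ** 2))
    w_best, e_best = differential_evolution(sse, dim=3)
    print(w_best, e_best)
```

This structure reflects why DE is attractive for parameter tuning as described in the abstract: only F, CR, and the population size need to be set, and the mutation step draws differences from the population itself, which helps it escape local minima.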
Keywords :
convergence; genetic algorithms; neural nets; particle swarm optimisation; probability; topology; trees (mathematics); vectors; NTNM; anarchical variable probability vector; artificial neural network model; artificial neural tree network model; building-block-library based genetic programming algorithm; convergence speed; differential evolution algorithm; error precision; evolution process; invalid individuals; optimization research; optimizing parameters; optimizing topology structure; parameter optimization; probabilistic incremental program evolution algorithm; tree-encoded based particle swarm optimization algorithm; Classification algorithms; Computational modeling; Encoding; Genetics; Optimization; neural tree network model; optimization; parameters; topology;
Conference_Title :
Computer Science and Network Technology (ICCSNT), 2011 International Conference on
Conference_Location :
Harbin
Print_ISBN :
978-1-4577-1586-0
DOI :
10.1109/ICCSNT.2011.6182301