Title :
Neural networks with dynamic structure using a GA-based learning method
Author :
Fall, Everett ; Chiang, Hsin-Han
Author_Institution :
Dept. of Electr. Eng., Fu Jen Catholic Univ., Taipei, Taiwan
Abstract :
Artificial neural networks (NNs) are traditionally designed with distinctly defined layers (an input layer, hidden layers, and an output layer), and network design techniques and training algorithms are accordingly built around this concept of strictly defined layers. In this paper, a new approach to designing neural networks is presented. The structure of the proposed NN is not strictly defined: each neuron may receive input from any other neuron. Instead, the initial network structure can be randomly generated, and traditional training methods such as back-propagation are replaced or augmented by a genetic algorithm (GA). The weights of each neuron's inputs are encoded as the genes for the GA. Using the training data provided to the supervised network, each neuron's contribution to producing the desired output serves as the selection function. Each neuron is further modified to store and recall past weightings for possible future use. As a proof of concept, a simple network is trained to recognize vertical and horizontal lines.
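Illustrative_Sketch :
As a rough illustration of the approach described in the abstract, the Python sketch below evolves the input weights of a small non-layered network (every neuron may read every input and every other neuron) with a simple genetic algorithm, using the vertical/horizontal-line task as the fitness data. All sizes, the network dynamics, and the selection, crossover, and mutation scheme are assumptions made for illustration only, and the paper's weight-retention mechanism is omitted; this is not the authors' implementation.

# Hypothetical sketch only: a GA evolving the weights of a non-layered
# network in which any neuron may receive input from any other neuron.
# Sizes, dynamics, and GA operators are illustrative assumptions; the
# paper's weight-retention mechanism (storing/recalling past weightings)
# is not modeled here.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_NEURONS, N_OUT = 9, 12, 2                 # 3x3 pixel input, 2 output classes (assumed)
GENOME_LEN = N_NEURONS * (N_IN + N_NEURONS)       # each neuron sees every input and every neuron

def forward(genome, x, steps=3):
    # Iterate a few update steps because the connection graph may contain cycles.
    W = genome.reshape(N_NEURONS, N_IN + N_NEURONS)
    state = np.zeros(N_NEURONS)
    for _ in range(steps):
        state = np.tanh(W @ np.concatenate([x, state]))
    return state[:N_OUT]                          # first neurons are read out as the network output

def fitness(genome, X, Y):
    # Negative squared error over the training set plays the role of the selection function.
    return -sum(np.sum((forward(genome, x) - y) ** 2) for x, y in zip(X, Y))

# Toy task from the abstract: distinguish vertical from horizontal lines on a 3x3 grid.
vert = [np.zeros((3, 3)) for _ in range(3)]
horiz = [np.zeros((3, 3)) for _ in range(3)]
for i in range(3):
    vert[i][:, i] = 1.0                           # vertical line in column i
    horiz[i][i, :] = 1.0                          # horizontal line in row i
X = [img.ravel() for img in vert + horiz]
Y = [np.array([1.0, 0.0])] * 3 + [np.array([0.0, 1.0])] * 3

# Randomly generated initial population of weight genomes, then a plain GA loop.
pop = [rng.normal(size=GENOME_LEN) for _ in range(40)]
for gen in range(200):
    pop.sort(key=lambda g: fitness(g, X, Y), reverse=True)
    parents = pop[:10]                            # truncation selection (assumed)
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = rng.choice(len(parents), size=2, replace=False)
        cut = int(rng.integers(1, GENOME_LEN))    # one-point crossover on the weight genome
        child = np.concatenate([parents[a][:cut], parents[b][cut:]])
        mask = rng.random(GENOME_LEN) < 0.05      # sparse Gaussian mutation
        children.append(child + rng.normal(scale=0.1, size=GENOME_LEN) * mask)
    pop = parents + children

print("best fitness after evolution:", fitness(pop[0], X, Y))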
Keywords :
backpropagation; genetic algorithms; neural nets; ANN; GA-based learning method; artificial neural networks; dynamic structure; selection function; supervised network; Algorithm design and analysis; Biological neural networks; Neurons; Training; Neural network; biologically inspired; complex neuron; deep learning; weight retention
Conference_Title :
2015 IEEE 12th International Conference on Networking, Sensing and Control (ICNSC)
Conference_Location :
Taipei
DOI :
10.1109/ICNSC.2015.7116001