Title : 
Towards minimal network architectures with evolutionary growth networks
            Author : 
Romaniuk, Steve G.
            Author_Institution : 
Dept. of Inf. Syst. & Comput. Sci., Nat. Univ. of Singapore, Singapore
            fDate : 
27 Jun-2 Jul 1994
            Abstract : 
This paper points out how simple learning rules, such as the perceptron and delta rules, can be re-introduced as local learning techniques to yield an effective automatic network construction algorithm. This feat is accomplished by choosing the right training set during network construction. The choice of partitions can have profound effects on the quality of the created networks, in terms of the number of hidden units and connections. Partitions are selected during the various network construction phases by means of evolutionary processes. Empirical evidence underscoring the effectiveness of this approach is provided for several well-known benchmark problems, such as the parity, encoder, and adder functions.
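As an illustration of the general scheme the abstract describes, the following is a minimal Python sketch, assuming a cascade-style constructive procedure: a small genetic algorithm evolves a binary partition (a relabelling of the training patterns) that a single threshold unit can learn locally with the perceptron/delta rule, the trained unit is added as a hidden unit, and its output is fed forward as an extra input feature. All function names, the fitness definition, and the hyper-parameters here are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch only (not the paper's algorithm): function names,
# fitness definition and GA parameters below are assumptions.
import random

def perceptron(patterns, labels, epochs=100, lr=0.2):
    """Train one threshold unit with the perceptron/delta rule; return weights and training accuracy."""
    dim = len(patterns[0])
    w = [0.0] * (dim + 1)                       # last component is the bias weight
    for _ in range(epochs):
        for x, t in zip(patterns, labels):
            xe = list(x) + [1.0]
            y = 1 if sum(a * b for a, b in zip(w, xe)) > 0 else 0
            for i in range(dim + 1):
                w[i] += lr * (t - y) * xe[i]
    acc = sum(
        (1 if sum(a * b for a, b in zip(w, list(x) + [1.0])) > 0 else 0) == t
        for x, t in zip(patterns, labels)
    ) / len(patterns)
    return w, acc

def evolve_partition(patterns, targets, pop=30, gens=40):
    """GA over binary labelings of the training set; the (assumed) fitness rewards
    labelings that a single unit can learn and that still agree with the true targets."""
    n = len(patterns)

    def fitness(part):
        _, acc = perceptron(patterns, part, epochs=20)
        agree = sum(p == t for p, t in zip(part, targets)) / n
        return acc + agree                      # assumed fitness; not from the paper

    population = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop // 2]
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)        # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:           # mutation: flip one bit
                child[random.randrange(n)] ^= 1
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

def grow_network(patterns, targets, max_units=10):
    """Grow hidden units one at a time until a single output unit fits the targets."""
    hidden = []
    feats = [list(x) for x in patterns]
    for _ in range(max_units):
        w_out, acc = perceptron(feats, targets)
        if acc == 1.0:
            return hidden, w_out
        part = evolve_partition(feats, targets)
        w_h, _ = perceptron(feats, part)
        hidden.append(w_h)
        for i, x in enumerate(feats):           # feed the new unit's output forward as a feature
            feats[i] = x + [1 if sum(a * b for a, b in zip(w_h, x + [1.0])) > 0 else 0]
    return hidden, perceptron(feats, targets)[0]

if __name__ == "__main__":
    # 3-bit parity, one of the benchmarks mentioned in the abstract
    xs = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    ys = [a ^ b ^ c for a, b, c in xs]
    hidden, out = grow_network(xs, ys)
    print(f"hidden units grown: {len(hidden)}")
```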
            Keywords : 
learning (artificial intelligence); neural nets; perceptrons; adder functions; benchmark problems; delta; encoder; evolutionary growth networks; evolutionary processes; learning rules; local learning techniques; minimal network architectures; network construction; parity; partitions selection; perceptron; training set; Genetic algorithms; Information systems; Interference; NP-hard problem; Neural networks; Partitioning algorithms; Test pattern generators; Testing; Transfer functions; Traveling salesman problems;
            Conference_Title : 
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
            Conference_Location : 
Orlando, FL
            Print_ISBN : 
0-7803-1901-X
            DOI : 
10.1109/ICNN.1994.374413