Title : 
Hidden-layer size reducing for multilayer neural networks using the orthogonal least-squares method
         
            Author_Institution : 
Fac. of Comput. Eng. & Syst. Sci., Kyushu Inst. of Technol., Fukuoka, Japan
         
            Abstract : 
This paper proposes a new approach to reducing the hidden-layer size of multilayer neural networks, using the orthogonal least-squares (OLS) method based on the Gram-Schmidt orthogonal transformation. A neural network with a large hidden layer is first trained with a standard learning rule. The OLS method is then applied to identify and eliminate redundant neurons, yielding a simpler network: employed as a forward regression procedure, it selects a suitable subset of the preliminarily trained hidden neurons such that the input to the output-layer neuron is reconstructed with fewer hidden neurons. Simulation results are included to demonstrate the efficiency of the proposed method.
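As a concrete illustration of the selection step the abstract describes, below is a minimal sketch of OLS forward regression with Gram-Schmidt orthogonalization over the activations of pre-trained hidden neurons. This is not the paper's implementation: the function name ols_select, the activation matrix Phi, the target d, and the energy-based stopping tolerance tol are all assumptions made for illustration.

```python
import numpy as np

def ols_select(Phi, d, tol=0.99):
    """Forward OLS selection of hidden neurons via Gram-Schmidt.

    Phi : (N, M) array, activations of M pre-trained hidden neurons
          over N training samples (hypothetical layout).
    d   : (N,) array, target signal to reconstruct -- here taken as
          the net input to the output-layer neuron.
    tol : assumed stopping rule -- stop once the selected subset
          explains this fraction of the target's energy.
    """
    _, M = Phi.shape
    dd = d @ d                    # total target energy
    selected, basis = [], []      # chosen indices, orthogonal basis
    remaining = list(range(M))
    explained = 0.0
    while remaining and explained < tol:
        best_err, best_j, best_w = -1.0, None, None
        for j in remaining:
            # Orthogonalize candidate against the basis (modified Gram-Schmidt).
            w = Phi[:, j].copy()
            for q in basis:
                w -= (q @ w) / (q @ q) * q
            ww = w @ w
            if ww < 1e-12:        # nearly dependent on already-selected neurons
                continue
            # Error-reduction ratio: fraction of the target energy that
            # this orthogonal component accounts for.
            err = (w @ d) ** 2 / (ww * dd)
            if err > best_err:
                best_err, best_j, best_w = err, j, w
        if best_j is None:        # no independent candidates remain
            break
        selected.append(best_j)
        basis.append(best_w)
        remaining.remove(best_j)
        explained += best_err
    return selected
```

In such a scheme, the output weights of the reduced network would then be refit by least squares on the reduced activation matrix Phi[:, selected].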
         
            Keywords : 
feedforward neural nets; learning (artificial intelligence); least squares approximations; redundancy; transforms; Gram-Schmidt orthogonal transform; forward regression; hidden neurons; hidden-layer size reduction; learning rule; multilayer neural networks; orthogonal least-squares; redundancy elimination; Backpropagation; Computer networks; Multi-layer neural network; Neural networks; Neurons; Paper technology; Pattern classification; Redundancy; Systems engineering and theory; Training data
         
Conference_Title : 
SICE '97. Proceedings of the 36th SICE Annual Conference. International Session Papers
         
            Conference_Location : 
Tokushima
         
            DOI : 
10.1109/SICE.1997.624936