Title :
On sequential construction of binary neural networks
Author_Institution :
Istituto per i Circuiti Elettronici, CNR, Genova, Italy
Date :
5/1/1995 12:00:00 AM
Abstract :
A new technique, called sequential window learning (SWL), is presented for the construction of two-layer perceptrons with binary inputs. It generates the number of hidden neurons together with the correct values for the weights, starting from any binary training set. The introduction of a new type of neuron, having a window-shaped activation function, considerably increases the convergence speed and the compactness of the resulting networks. Furthermore, a preprocessing technique, called Hamming clustering (HC), is proposed for improving the generalization ability of constructive algorithms for binary feedforward neural networks. Its insertion into sequential window learning is straightforward. Tests on classical benchmarks show the good performance of the proposed techniques, both in terms of network complexity and recognition accuracy.
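As a rough illustration only (the paper's exact parameterization may differ), a window-shaped activation can be sketched as a unit that fires when the weighted sum of its binary inputs falls inside an interval around a target value, rather than merely above a threshold as in a standard perceptron; the names `center` and `width` below are assumptions for the sketch:

```python
def window_activation(x, w, center, width):
    """Window-shaped neuron: output 1 only when the weighted sum
    of the binary inputs lies within `width` of `center`, else 0.
    A standard threshold neuron would instead test s >= center."""
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if abs(s - center) <= width else 0
```

For example, with weights `[1, 1, 1]`, center 2, and width 0, the unit responds only to inputs whose active bits sum to exactly 2.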
Keywords :
convergence; feedforward neural nets; generalisation (artificial intelligence); learning (artificial intelligence); multilayer perceptrons; pattern recognition; binary neural networks; binary training set; convergence speed; feedforward neural networks; generalization ability; Hamming clustering; network complexity; recognition accuracy; sequential window learning; two-layer perceptrons; window-shaped activation function; Backpropagation algorithms; Clustering algorithms; Convergence; Feedforward neural networks; Network synthesis; Neural networks; Neurons; Performance evaluation; Testing;
Journal_Title :
Neural Networks, IEEE Transactions on