DocumentCode :
982826
Title :
Symmetries and discriminability in feedforward network architectures
Author :
Shawe-Taylor, John
Author_Institution :
Dept. of Comput. Sci., R. Holloway & Bedford New Coll., Egham, UK
Volume :
4
Issue :
5
fYear :
1993
fDate :
9/1/1993
Firstpage :
816
Lastpage :
826
Abstract :
This paper investigates the effects of introducing symmetries into feedforward neural networks, termed symmetry networks. This technique allows more efficient training for problems in which the output of a network is required to be invariant under a set of transformations of the input. The particular problem of graph recognition is considered, in which case the network is designed to deliver the same output for isomorphic graphs. This raises the question of which inputs can be distinguished by such architectures. A theorem characterizing when two inputs can be distinguished by a symmetry network is given. As a consequence, a particular network design is shown to be able to distinguish nonisomorphic graphs if and only if the graph reconstruction conjecture holds.
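Illustrative_Sketch :
As an illustration of the weight-sharing idea behind symmetry networks, the following Python snippet is a minimal sketch, not the paper's construction; the function and variable names (symmetry_layer, shared_w, shared_b) are assumptions introduced here. It ties the weights of a hidden layer across an orbit of the vertex-permutation group acting on adjacency-matrix entries, so the layer's output depends only on the isomorphism class of the input graph.

import numpy as np

# Illustrative sketch (assumed names, not the paper's exact architecture):
# a one-layer "symmetry network" whose weights are tied across an orbit of
# the vertex-permutation group acting on adjacency-matrix entries.  For a
# simple undirected graph the off-diagonal entries form a single orbit, so
# one shared weight per hidden unit suffices; the resulting output is
# invariant under any relabelling of the vertices.

def symmetry_layer(adj, shared_w, shared_b):
    """Hidden activations that depend only on the isomorphism class of adj.

    adj      : (n, n) 0/1 adjacency matrix of an undirected simple graph
    shared_w : (h,) one tied weight per hidden unit (per orbit of entries)
    shared_b : (h,) biases
    """
    edge_sum = adj.sum() / 2.0          # orbit statistic: number of edges
    return np.tanh(shared_w * edge_sum + shared_b)

rng = np.random.default_rng(0)
w, b = rng.normal(size=3), rng.normal(size=3)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])               # path on 3 vertices
P = np.eye(3)[[2, 0, 1]]                # a vertex relabelling (permutation matrix)
A_iso = P @ A @ P.T                     # isomorphic copy of the same graph

# Isomorphic inputs produce identical outputs.
assert np.allclose(symmetry_layer(A, w, b), symmetry_layer(A_iso, w, b))

Because each shared weight in this toy layer sees only the orbit statistic (here the edge count), the layer cannot separate nonisomorphic graphs with the same number of edges; characterizing exactly which inputs a symmetry network can distinguish is the discriminability question the paper addresses.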
Keywords :
feedforward neural nets; graph theory; learning (artificial intelligence); pattern recognition; discriminability; feedforward neural networks; graph recognition; isomorphic graphs; nonisomorphic graphs; symmetry networks; Books; Computer science; Delay effects; Feedforward neural networks; Handwriting recognition; Intelligent networks; Multilayer perceptrons; Neural networks; Neurons;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.248459
Filename :
248459