DocumentCode
747954
Title
Comments on "Noise injection into inputs in back propagation learning"
Author
Grandvalet, Yves; Canu, Stéphane
Author_Institution
Centre de Recherches de Royallieu, Univ. de Technol. de Compiegne, France
Volume
25
Issue
4
fYear
1995
fDate
4/1/1995
Firstpage
678
Lastpage
681
Abstract
The generalization capacity of neural networks learning from examples is important. Several authors have shown experimentally that training a neural network with noise-injected inputs can improve its generalization abilities. In the original paper (ibid., vol. 22, no. 3, pp. 436-440, 1992), Matsuoka explained this fact formally, claiming that training with noise-injected inputs is equivalent to reducing the sensitivity of the network. However, the present authors show that an error in Matsuoka's calculations led him to inadequate conclusions. This paper corrects these calculations and conclusions.
Keywords
backpropagation; generalisation (artificial intelligence); learning by example; neural nets; backpropagation learning; example-based learning; generalization capacity; neural networks; noise injection; Bayesian methods; Computational efficiency; Computer networks; Intelligent networks; Jacobian matrices; Neural networks; Noise reduction; Risk management; Surface reconstruction; Transfer functions;
fLanguage
English
Journal_Title
IEEE Transactions on Systems, Man and Cybernetics
Publisher
IEEE
ISSN
0018-9472
Type
jour
DOI
10.1109/21.370200
Filename
370200
Link To Document