Abstract:
Experiments are reported demonstrating that, whereas digital inaccuracy in neural arithmetic, in the form of bit-length limitation, degrades neural learning, analogue noise enhances it dramatically. The classification task chosen is vowel recognition with a multilayer perceptron network, but the findings appear to be general in the neural context, and have ramifications for all learning processes in which weights evolve incrementally and slowly.
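To make the two forms of imprecision contrasted above concrete, the following is a minimal sketch (not the paper's actual simulation) of how bit-length limitation and analogue noise might each be injected into the forward pass of a small multilayer perceptron during incremental training. The network size, learning rate, noise level, and toy task are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, bits=4, w_max=1.0):
    """Digital inaccuracy: round weights onto a fixed grid of 2**bits levels."""
    step = 2 * w_max / (2 ** bits)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def analogue_noise(w, level=0.05):
    """Analogue imprecision: zero-mean Gaussian perturbation of the weights."""
    return w + rng.normal(0.0, level, size=w.shape)

def train_step(W1, W2, x, y, lr=0.5, perturb=lambda w: w):
    """One backprop step on a 1-hidden-layer sigmoid MLP.
    The forward pass uses perturbed copies of the weights, so the
    imprecision affects learning without being stored back directly."""
    P1, P2 = perturb(W1), perturb(W2)
    h = 1.0 / (1.0 + np.exp(-x @ P1))          # hidden activations
    out = 1.0 / (1.0 + np.exp(-h @ P2))        # network output
    err = out - y
    d_out = err * out * (1.0 - out)            # output-layer delta
    dW2 = h.T @ d_out
    d_h = (d_out @ P2.T) * h * (1.0 - h)       # hidden-layer delta
    dW1 = x.T @ d_h
    return W1 - lr * dW1, W2 - lr * dW2, float(np.mean(err ** 2))

# Toy task (logical AND) standing in for the vowel-recognition data.
x = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [0.], [0.], [1.]])

W1 = rng.normal(0.0, 0.5, size=(2, 4))
W2 = rng.normal(0.0, 0.5, size=(4, 1))
losses = []
for _ in range(500):
    W1, W2, loss = train_step(W1, W2, x, y, perturb=analogue_noise)
    losses.append(loss)
```

Swapping `perturb=analogue_noise` for `perturb=quantize` (or the identity) lets the three regimes be compared under otherwise identical training conditions.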