DocumentCode :
1021433
Title :
Probability distribution normalisation of data applied to neural net classifiers
Author :
Tattersall, G.D.
Author_Institution :
Sch. of Inf. Syst., East Anglia Univ., Norwich, UK
Volume :
30
Issue :
1
fYear :
1994
fDate :
1/6/1994
Firstpage :
56
Lastpage :
57
Abstract :
The individual elements of pattern vectors generated by real systems often have widely different value ranges. Direct application of these patterns to a distance-based classifier such as a multilayer perceptron can cause the large-value-range elements to dominate the classification decision. A commonly used remedy is to normalise the variance of each pattern element before use. However, the author shows that this approach is often inappropriate and that better results can be obtained by nonlinearly scaling the pattern elements so that their probability distributions are approximately uniform and their variances equal.
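A minimal sketch, not taken from the paper, of one way such a normalisation can be implemented: each pattern element is mapped through a rank-based estimate of its empirical CDF, which makes its marginal distribution approximately uniform, and all elements are then rescaled to zero mean and unit variance. The data matrix X and the function name are illustrative assumptions.

import numpy as np

def uniform_normalise(X):
    """Nonlinearly rescale each pattern element (column of X) so that its
    empirical distribution is approximately uniform, then shift and scale
    every column to zero mean and unit variance before classification.

    X : ndarray of shape (n_patterns, n_elements)
    """
    n, d = X.shape
    U = np.empty_like(X, dtype=float)
    for j in range(d):
        # Rank of each sample within its column; dividing by n gives an
        # empirical-CDF value, so the mapped column is roughly uniform.
        ranks = np.argsort(np.argsort(X[:, j]))
        U[:, j] = (ranks + 0.5) / n
    # A uniform variable on [0, 1] has mean 1/2 and variance 1/12;
    # centre and rescale so all elements share the same (unit) variance.
    return (U - 0.5) / np.sqrt(1.0 / 12.0)

# Usage: columns with very different ranges become directly comparable.
rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(0.0, 1000.0, 500),     # large-range element
                     rng.exponential(0.01, 500)])       # small-range, skewed element
Xn = uniform_normalise(X)
print(Xn.mean(axis=0), Xn.var(axis=0))  # approx. zero mean, unit variance per column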
Keywords :
feedforward neural nets; pattern recognition; probability; vectors; distance-based classifier; multilayer perceptron; neural net classifiers; nonlinearly scaling; pattern elements; pattern vectors; probability distribution normalisation;
fLanguage :
English
Journal_Title :
Electronics Letters
Publisher :
IET
ISSN :
0013-5194
Type :
jour
DOI :
10.1049/el:19940042
Filename :
260601