DocumentCode :
353361
Title :
Metrics that learn relevance
Author :
Kaski, Samuel ; Sinkkonen, Janne
Author_Institution :
Neural Networks Res. Centre, Helsinki Univ. of Technol., Espoo, Finland
Volume :
5
fYear :
2000
fDate :
2000
Firstpage :
547
Abstract :
We introduce an algorithm for learning a local metric for a continuous input space that measures distances in terms of relevance to the processing task. Relevance is defined as local change in discrete auxiliary information, which may be, for example, the class of the data items, an index of performance, or a contextual input. A set of neurons first learns representations that maximize the mutual information between their outputs and the random variable representing the auxiliary information. The implicit knowledge gained about relevance is then transformed into a new metric on the input space that measures change in the auxiliary information in the sense of local approximations to the Kullback-Leibler divergence. The new metric can be used in further processing by other algorithms. It is especially useful in data analysis applications, since the distances can be interpreted in terms of the local relevance of the original variables.
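The metric described in the abstract can be illustrated with a minimal sketch. Assuming the class posterior p(c|x) comes from a simple softmax model (an illustrative stand-in for the mutual-information-maximizing neurons, not the authors' implementation), the local metric is the Fisher information of p(c|x): its quadratic form is a second-order local approximation to the Kullback-Leibler divergence between the posteriors at nearby points. All names and the toy model below are assumptions for illustration.

```python
import numpy as np

def posterior(x, W):
    """Softmax class posterior p(c|x) for a linear model with weights W (C x D)."""
    logits = W @ x
    logits -= logits.max()          # numerical stability
    p = np.exp(logits)
    return p / p.sum()

def fisher_metric(x, W):
    """Fisher information J(x) of p(c|x): the local metric
    d^2(x, x+dx) ~ dx^T J(x) dx, proportional to a local approximation
    of the KL divergence D(p(c|x) || p(c|x+dx))."""
    p = posterior(x, W)
    wbar = p @ W                    # posterior-weighted mean weight vector
    J = np.zeros((W.shape[1], W.shape[1]))
    for c, pc in enumerate(p):
        g = W[c] - wbar             # grad_x log p(c|x) for the softmax model
        J += pc * np.outer(g, g)
    return J

def local_distance2(x, dx, W):
    """Squared displacement length in the learned (relevance) metric."""
    return dx @ fisher_metric(x, W) @ dx

# Toy example: only the first input coordinate affects the classes,
# so the second coordinate is irrelevant to the auxiliary information.
W = np.array([[ 1.0, 0.0],
              [-1.0, 0.0]])        # 2 classes, 2 input dimensions
x = np.array([0.2, 0.5])
d_rel = local_distance2(x, np.array([0.1, 0.0]), W)   # relevant direction
d_irr = local_distance2(x, np.array([0.0, 0.1]), W)   # irrelevant direction
```

In this toy setting the displacement along the relevant coordinate receives a positive distance, while the same-length displacement along the irrelevant coordinate has distance zero: the metric measures only changes that matter to the auxiliary variable.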
Keywords :
data analysis; feature extraction; neural nets; performance evaluation; Kullback-Leibler divergence; continuous input space; data analysis; discrete auxiliary information; implicit knowledge; learning; local metric; mutual information; neurons; performance index; processing task; random variable; Clustering algorithms; Extraterrestrial measurements; Feature extraction; Gain measurement; Input variables; Mutual information; Neural networks; Neurons; Random variables; Space technology;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como
ISSN :
1098-7576
Print_ISBN :
0-7695-0619-4
Type :
conf
DOI :
10.1109/IJCNN.2000.861526
Filename :
861526