DocumentCode :
3494316
Title :
Recurrent neural network learning for text routing
Author :
Wermter, Stefan ; Arevian, Garen ; Panchev, Christo
Author_Institution :
Centre for Inf., Sunderland Univ., UK
Volume :
2
fYear :
1999
fDate :
1999
Firstpage :
898
Abstract :
This paper describes recurrent plausibility networks with internal recurrent hysteresis connections. These recurrent connections in multiple layers encode the sequential context of word sequences. We show how these networks can support the text routing of noisy newswire titles according to different given categories. We demonstrate the potential of these networks using an 82,339-word corpus from the Reuters newswire, reaching recall and precision rates above 92%. In addition, we carefully analyze the internal representations using cluster analysis and the output representations using a new surface error technique. In general, based on the current recall and precision performance, as well as the detailed analysis, we show that recurrent plausibility networks hold considerable potential for developing learning and robust newswire agents for the Internet.
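A minimal sketch of the mechanism named in the abstract, assuming the usual formulation of recurrent hysteresis context: the context c(t) is a blend of the previous hidden state and the previous context, c(t) = (1 - phi) * h(t-1) + phi * c(t-1), so earlier words of a title decay gradually instead of being overwritten. The function names, the single context layer, the value of phi, and all dimensions below are illustrative assumptions, not the paper's exact multi-layer architecture or training procedure.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def route_title(word_vectors, W_in, W_ctx, W_out, phi=0.5):
    # Score a newswire title (a sequence of word vectors) against routing categories.
    hidden_dim = W_in.shape[1]
    h = np.zeros(hidden_dim)   # hidden state
    c = np.zeros(hidden_dim)   # hysteresis context
    for x in word_vectors:                      # one word of the title per step
        c = (1.0 - phi) * h + phi * c           # hysteresis update of the context
        h = sigmoid(x @ W_in + c @ W_ctx)       # hidden state from current word + context
    return sigmoid(h @ W_out)                   # one score per news category

# Usage with random weights and illustrative dimensions only.
rng = np.random.default_rng(0)
embed_dim, hidden_dim, n_categories = 16, 8, 4
title = rng.normal(size=(5, embed_dim))         # a five-word title, already embedded
scores = route_title(
    title,
    W_in=rng.normal(size=(embed_dim, hidden_dim)),
    W_ctx=rng.normal(size=(hidden_dim, hidden_dim)),
    W_out=rng.normal(size=(hidden_dim, n_categories)),
)
print(scores)  # routing scores for the four hypothetical categories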
Keywords :
recurrent neural nets; Reuters newswire; cluster analysis; internal recurrent hysteresis connections; noisy newswire titles; recurrent plausibility networks; sequential context; surface error technique; text routing; word sequences;
fLanguage :
English
Publisher :
IET
Conference_Title :
Artificial Neural Networks, 1999. ICANN 99. Ninth International Conference on (Conf. Publ. No. 470)
Conference_Location :
Edinburgh
ISSN :
0537-9989
Print_ISBN :
0-85296-721-7
Type :
conf
DOI :
10.1049/cp:19991226
Filename :
818051