DocumentCode :
2651434
Title :
Using the H-Divergence to Prune Probabilistic Automata
Author :
Bernard, Marc ; Jeudy, Baptiste ; Peyrache, Jean-Philippe ; Sebban, Marc ; Thollard, Franck
Author_Institution :
Lab. Hubert Curien, Univ. de Lyon, Lyon, France
fYear :
2011
fDate :
7-9 Nov. 2011
Firstpage :
725
Lastpage :
731
Abstract :
A problem usually encountered in probabilistic automata learning is the difficulty of dealing with large training samples and/or wide alphabets. This is partially due to the size of the resulting Probabilistic Prefix Tree (PPT), from which state-merging-based learning algorithms are generally applied. In this paper, we propose a novel method for pruning PPTs that makes use of the H-divergence dH, recently introduced in the field of domain adaptation. dH is based on the classification error made by a hypothesis learned from unlabeled examples drawn according to the two distributions to be compared. Through a thorough comparison with state-of-the-art divergence measures, we provide experimental evidence demonstrating the efficiency of our method based on this simple and intuitive criterion.
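Note :
As a rough illustration of the criterion described in the abstract (and not the authors' implementation), the sketch below estimates an H-divergence-style quantity between two string samples by training a hypothesis to tell them apart and reading the divergence off its classification error, in the spirit of the proxy-A-distance used in the domain-adaptation literature. The character n-gram encoding, the logistic-regression hypothesis class, and the 2*(1 - 2*error) formula are assumptions made for the example.

```python
# Minimal sketch: classifier-based estimate of a dH-like divergence
# between two samples of strings. Assumptions (not from the paper):
# character n-gram features, logistic regression as the hypothesis
# class, and the proxy-A-distance formula 2 * (1 - 2 * error).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def estimate_h_divergence(sample_a, sample_b):
    """Train a hypothesis to separate the two samples; the lower its
    held-out error, the more divergent the underlying distributions."""
    strings = list(sample_a) + list(sample_b)
    labels = [0] * len(sample_a) + [1] * len(sample_b)

    # Simple character n-gram encoding of the strings (illustrative choice).
    vec = CountVectorizer(analyzer="char", ngram_range=(1, 3))
    X = vec.fit_transform(strings)

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, labels, test_size=0.5, random_state=0, stratify=labels
    )
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    error = 1.0 - clf.score(X_te, y_te)

    # Proxy-A-distance-style estimate, clipped to stay non-negative.
    return max(0.0, 2.0 * (1.0 - 2.0 * error))


if __name__ == "__main__":
    # Toy samples drawn from two visibly different string patterns.
    a = ["ab", "aab", "abb", "aabb"] * 10
    b = ["ba", "bba", "baa", "bbaa"] * 10
    print(estimate_h_divergence(a, b))
```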
Keywords :
learning (artificial intelligence); probabilistic automata; trees (mathematics); H-divergence; domain adaptation; merging based learning algorithms; probabilistic automata learning; probabilistic automata pruning; probabilistic prefix tree; Adaptation models; Learning automata; Merging; Noise measurement; Probabilistic logic; Size measurement; Training; H-divergence; Probabilistic Prefix Tree; Pruning methods;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Tools with Artificial Intelligence (ICTAI), 2011 23rd IEEE International Conference on
Conference_Location :
Boca Raton, FL
ISSN :
1082-3409
Print_ISBN :
978-1-4577-2068-0
Electronic_ISBN :
1082-3409
Type :
conf
DOI :
10.1109/ICTAI.2011.114
Filename :
6103405