DocumentCode
454620
Title
Multitask Learning for Spoken Language Understanding
Author
Tur, Gokhan
Author_Institution
Speech Technology & Research Laboratory, SRI International
Volume
1
fYear
2006
fDate
14-19 May 2006
Abstract
In this paper, we present a multitask learning (MTL) method for intent classification in goal-oriented human-machine spoken dialog systems. MTL aims at training tasks in parallel while using a shared representation, so that what is learned for one task can help the other tasks be learned better. Our goal is to automatically re-use the existing labeled data from various applications, which are similar but may have different intents or intent distributions, in order to improve performance. For this purpose, we propose an automated intent mapping algorithm across applications. We also propose employing active learning to selectively sample the data to be re-used. Our results indicate that we can achieve significant improvements in intent classification performance, especially when the labeled data size is limited.
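As a rough illustration of the data re-use idea summarized above, the sketch below is not the paper's implementation; the scikit-learn classifier, the function name, and the confidence threshold are all assumptions. It trains an intent classifier on a small target-application set, maps source-application utterances onto the target intent set by classifier confidence (standing in for the automated intent mapping step), and selectively keeps only confidently mapped examples before retraining.

```python
# Hypothetical sketch only (not the paper's method): reuse labeled utterances from a
# source application to improve a target application's intent classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline


def reuse_source_data(target_texts, target_intents,
                      source_texts, confidence_threshold=0.8):
    """Map source-application utterances to target intents and keep only the
    confidently mapped ones when retraining the target classifier."""
    # 1. Train an initial classifier on the (small) target-application data.
    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    clf.fit(target_texts, target_intents)

    # 2. Classify source-application utterances with the target intent set;
    #    this plays the role of an automated intent-mapping step.
    probs = clf.predict_proba(source_texts)
    mapped_intents = clf.classes_[probs.argmax(axis=1)]
    confidences = probs.max(axis=1)

    # 3. Selective sampling: reuse only utterances mapped with high confidence.
    reused_texts, reused_intents = [], []
    for text, intent, conf in zip(source_texts, mapped_intents, confidences):
        if conf >= confidence_threshold:
            reused_texts.append(text)
            reused_intents.append(intent)

    # 4. Retrain on the combined target and reused source data.
    clf.fit(list(target_texts) + reused_texts,
            list(target_intents) + reused_intents)
    return clf
```

The confidence threshold here is only a stand-in for the paper's active-learning criterion; the general point is that out-of-application data is filtered rather than reused wholesale.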
Keywords
interactive systems; learning (artificial intelligence); man-machine systems; pattern classification; speech processing; active learning; automated intent mapping algorithm; human-machine spoken dialog systems; intent classification; multitask learning method; spoken language understanding; Adaptation model; Backpropagation algorithms; Humans; Laboratories; Libraries; Man machine systems; Natural languages; Neural networks; Routing; Speech processing
fLanguage
English
Publisher
ieee
Conference_Title
2006 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2006) Proceedings
Conference_Location
Toulouse
ISSN
1520-6149
Print_ISBN
1-4244-0469-X
Type
conf
DOI
10.1109/ICASSP.2006.1660088
Filename
1660088
Link To Document