Title :
Tracking focus of attention in meetings
Author :
Stiefelhagen, Rainer
Author_Institution :
Interactive Syst. Labs., Karlsruhe Univ., Germany
Abstract :
The author presents an overview of his work on tracking focus of attention in meeting situations. He has developed a system that estimates participants' focus of attention from multiple cues: an omni-directional camera simultaneously tracks the faces of participants sitting around a meeting table, neural networks estimate their head poses, and microphones detect who is speaking. The system predicts participants' focus of attention from the acoustic and visual information separately and then combines the outputs of the audio- and video-based predictors. He also reports recent experimental results: to determine how well a subject's focus of attention can be predicted solely from head orientation, he conducted an experiment in which the head and eye orientations of meeting participants were recorded using special tracking equipment. The results show that head orientation was a sufficient indicator of the subjects' focus target 89% of the time. Finally, he discusses how the neural networks used for head-orientation estimation can be adapted to new locations and new illumination conditions.
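The abstract describes fusing separate audio- and video-based focus-of-attention predictions into a single estimate. As a minimal illustrative sketch (the paper's actual fusion rule is not given here; the target names, probabilities, and the weight `alpha` below are assumptions), one common approach is a weighted linear combination of per-target probability distributions:

```python
# Hedged sketch of audio/video fusion for focus-of-attention prediction.
# All names and values are illustrative assumptions, not from the paper.

def combine_focus_estimates(p_video, p_audio, alpha=0.5):
    """Fuse two per-target probability dicts into one distribution.

    alpha weights the video-based estimate; (1 - alpha) the audio-based one.
    The result is renormalized so the probabilities sum to 1.
    """
    targets = set(p_video) | set(p_audio)
    combined = {
        t: alpha * p_video.get(t, 0.0) + (1.0 - alpha) * p_audio.get(t, 0.0)
        for t in targets
    }
    total = sum(combined.values()) or 1.0
    return {t: p / total for t, p in combined.items()}

def most_likely_focus(p_combined):
    """Return the focus target with the highest combined probability."""
    return max(p_combined, key=p_combined.get)

if __name__ == "__main__":
    # Head pose weakly favors Alice; the audio cue (Bob is speaking)
    # shifts the combined estimate toward Bob.
    p_video = {"Alice": 0.6, "Bob": 0.3, "whiteboard": 0.1}
    p_audio = {"Alice": 0.2, "Bob": 0.7, "whiteboard": 0.1}
    fused = combine_focus_estimates(p_video, p_audio)
    print(most_likely_focus(fused))  # → Bob
```

The weight `alpha` could be fixed or tuned on held-out meeting data; the paper's own combination scheme may differ.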
Keywords :
image motion analysis; lighting; neural nets; speech recognition; tracking; user interfaces; acoustic information; attention predictors; audio; experimental results; focus of attention tracking; head orientation; head pose estimation; illumination; meetings; microphones; multiple cues; neural networks; omni-directional camera; video; visual information; Cameras; Competitive intelligence; Face detection; Focusing; Head; Humans; Interactive systems; Laboratories; Layout; Lighting
Conference_Titel :
Proceedings of the Fourth IEEE International Conference on Multimodal Interfaces (ICMI 2002)
Print_ISBN :
0-7695-1834-6
DOI :
10.1109/ICMI.2002.1167006