DocumentCode :
2641529
Title :
Perceptual system of networked partner robots
Author :
Masuta, Hiroyuki ; Kubota, Naoyuki
Author_Institution :
Tokyo Metropolitan Univ., Tokyo
fYear :
2007
fDate :
17-20 Sept. 2007
Firstpage :
1845
Lastpage :
1850
Abstract :
This paper discusses how to extract human motion patterns for networked partner robots. Human motions can typically be classified into body motions and gesture motions, which are extracted by human tracking and gesture recognition, respectively. To improve the performance of these motion recognition methods, we use both the CCD camera image and the distance information from a laser range finder. The proposed human tracking method is composed of differential extraction, particle velocity estimation, facial direction extraction, adaptive color matching, and 3D human modeling based on the laser range finder. Finally, we discuss the effectiveness of the proposed method through several experimental results.
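The abstract names a pipeline that fuses camera images with laser range finder (LRF) distance data. As a minimal illustrative sketch only (the paper's actual algorithms are not reproduced here), the snippet below shows two generic building blocks such a system might use: frame differencing for motion extraction from camera images, and range-gap segmentation of an LRF scan into candidate human regions. Function names, thresholds, and the segmentation rule are all assumptions, not the authors' method.

```python
import numpy as np

def frame_difference(prev_frame, curr_frame, threshold=30):
    """Binary motion mask from two consecutive grayscale frames.
    A pixel is 'moving' if its intensity changed by more than `threshold`.
    (Illustrative stand-in for the paper's differential extraction step.)"""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    return diff > threshold

def segment_scan(lrf_scan, max_range=4.0, gap=0.3):
    """Group consecutive LRF range readings into candidate object segments.
    Readings at or beyond `max_range` are treated as background; a new
    segment starts wherever the range jumps by more than `gap` metres.
    (Hypothetical segmentation rule, assumed for illustration.)"""
    segments, current = [], []
    for i, r in enumerate(lrf_scan):
        if r >= max_range:
            if current:
                segments.append(current)
                current = []
            continue
        if current and abs(r - lrf_scan[i - 1]) > gap:
            segments.append(current)
            current = []
        current.append(i)
    if current:
        segments.append(current)
    return segments
```

A fused tracker could then keep only LRF segments whose bearing overlaps the image regions flagged by the motion mask, which is one common way camera and range data are combined.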
Keywords :
gesture recognition; image colour analysis; image matching; image motion analysis; robot vision; CCD camera image; adaptive color matching; body motions; distance information; facial direction extraction; gesture motions; gesture recognition; human motion patterns; human tracking; laser range finder; motion recognition methods; networked partner robots; particle velocity estimation; perceptual system; Competitive intelligence; Data mining; Data security; Human robot interaction; Information security; Intelligent robots; Intelligent sensors; Monitoring; Robot sensing systems; Speech recognition; 3D human modeling; computational intelligence; human tracking; partner robots;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
SICE, 2007 Annual Conference
Conference_Location :
Takamatsu
Print_ISBN :
978-4-907764-27-2
Electronic_ISBN :
978-4-907764-27-2
Type :
conf
DOI :
10.1109/SICE.2007.4421286
Filename :
4421286