Title :
Hand action perception for robot programming
Author :
Jiar, Yunde ; Wheeler, Mark ; Ikeuchi, Katsushi
Author_Institution :
Sch. of Comput. Sci., Carnegie Mellon Univ., Pittsburgh, PA, USA
Abstract :
This paper presents a general and robust approach to hand action perception for automatic robot programming using depth image sequences. The human instructor need only demonstrate an assembly task in front of a vision system in the human world; no dataglove or special markings are necessary. The recorded image sequences are used to recover a depth image sequence, which drives model-based tracking of the human hand and the manipulated object to form the perceptual data stream. This data stream is then segmented and interpreted to generate a task sequence describing the human hand action and the relationship between the manipulated object and the hand. The task sequence may consist of a series of subtasks, each involving four phases: approaching, pre-manipulating, manipulating, and departing. We also discuss a robot system that replicates the observed task and automatically validates the replication results in the robot world.
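Illustration (not from the paper): a minimal Python sketch of how the abstract's task sequence, built from subtasks that each pass through the four phases, might be represented. All names (Phase, Subtask, TaskSequence, grasp labels) are hypothetical and chosen only to mirror the structure described above.

# Illustrative sketch only; the paper gives no code. Names below are assumptions.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

class Phase(Enum):
    APPROACHING = auto()
    PRE_MANIPULATING = auto()
    MANIPULATING = auto()
    DEPARTING = auto()

@dataclass
class Subtask:
    """One manipulation unit recovered from the segmented perceptual data stream."""
    object_name: str          # the manipulated object
    grasp: str                # hand/object relationship, e.g. a grasp label
    phases: List[Phase] = field(default_factory=lambda: [
        Phase.APPROACHING,
        Phase.PRE_MANIPULATING,
        Phase.MANIPULATING,
        Phase.DEPARTING,
    ])

@dataclass
class TaskSequence:
    """Ordered subtasks describing the observed assembly task."""
    subtasks: List[Subtask] = field(default_factory=list)

# Example: a two-step assembly observed from a human demonstration.
task = TaskSequence(subtasks=[
    Subtask(object_name="peg", grasp="precision-grasp"),
    Subtask(object_name="block", grasp="power-grasp"),
])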
Keywords :
assembling; automatic programming; image recognition; image segmentation; image sequences; manipulators; robot programming; approach; assembly task; automatic robot programming; data stream segmentation; departure; depth image sequences; hand action perception; model-based human hand tracking; object tracking; perceptual data stream; pre-manipulation; recorded image sequences; subtasks; task sequence; vision system; Assembly systems; Data gloves; Humans; Image sequences; Machine vision; Robot programming; Robotic assembly; Robotics and automation; Robustness; Streaming media;
Conference_Title :
Proceedings of the 1996 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '96)
Conference_Location :
Osaka
Print_ISBN :
0-7803-3213-X
DOI :
10.1109/IROS.1996.569024