Title :
Modelling human assembly actions from observation
Author :
Paul, George V. ; Jiar, Yunde ; Wheeler, Mark D. ; Ikeuchi, Katsushi
Author_Institution :
Robotics Inst., Carnegie Mellon Univ., Pittsburgh, PA, USA
Abstract :
This paper describes a system that can model an assembly task performed by a human. The actions are recorded in real time using a stereo vision system, and the assembled objects and the fingers of the hand are tracked through the image sequence. We use the spatial relations between the fingers and the objects to temporally segment the task into approach, pre-manipulate, manipulate, and depart phases, and broadly interpret the actions in each segment as grasp, push, fine motion, and so on. We then analyze the contact relations between the objects during the manipulate phase to reconstruct the fine-motion path of the manipulated object. In configuration space, the fine motion is a series of connected path segments lying on features (c-surfaces) of the configuration-space obstacle. We project the observed configurations onto these c-surfaces and reconstruct the path segments; the connected segments form the fine-motion path. We demonstrate the system on the peg-in-hole task.
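The distance-based temporal segmentation outlined in the abstract can be illustrated with a minimal Python sketch. It assumes the stereo tracker already provides a per-frame distance between the fingertips and the manipulated object; the threshold values, phase logic, and function name are illustrative assumptions, not the authors' parameters.

    # Sketch of phase segmentation from finger-object distance (assumed inputs).
    from typing import List

    APPROACH_DIST = 0.10   # metres; hand still far from the object (assumed value)
    CONTACT_DIST = 0.005   # metres; fingers effectively in contact (assumed value)

    def segment_phases(finger_obj_dist: List[float]) -> List[str]:
        """Label each frame with a task phase from finger-object distance."""
        phases = []
        touched = False  # becomes True once the manipulate phase has begun
        for d in finger_obj_dist:
            if d <= CONTACT_DIST:
                touched = True
                phases.append("manipulate")
            elif d <= APPROACH_DIST:
                # Close but not in contact: pre-manipulate before first contact,
                # depart once contact has been broken again.
                phases.append("depart" if touched else "pre-manipulate")
            else:
                phases.append("depart" if touched else "approach")
        return phases

    # Synthetic distance trace for one reach-grasp-release cycle.
    trace = [0.30, 0.15, 0.08, 0.004, 0.003, 0.004, 0.07, 0.20]
    print(segment_phases(trace))
    # ['approach', 'approach', 'pre-manipulate', 'manipulate',
    #  'manipulate', 'manipulate', 'depart', 'depart']

The projection step can be sketched similarly, modelling a c-surface locally as a hyperplane {q : n.q = d} in configuration space. In the paper the c-surfaces are derived from the observed contact relations between the tracked objects; the plane and sample configurations here are illustrative stand-ins.

    # Sketch of snapping noisy observed configurations onto a c-surface,
    # modelled locally as the hyperplane {q : n.q = d} (an assumption for
    # illustration; the system obtains c-surfaces from contact relations).
    import numpy as np

    def project_onto_c_surface(q: np.ndarray, n: np.ndarray, d: float) -> np.ndarray:
        """Orthogonally project configuration q onto the hyperplane n.q = d."""
        return q - ((n @ q - d) / (n @ n)) * n

    # Noisy observed configurations that should lie on the plane z = 0,
    # e.g. a peg sliding on the top face of the block holding the hole.
    n = np.array([0.0, 0.0, 1.0])
    observed = np.array([[0.10, 0.00, 0.012],
                         [0.08, 0.00, -0.007],
                         [0.05, 0.01, 0.003]])
    path_segment = np.array([project_onto_c_surface(q, n, 0.0) for q in observed])
    print(path_segment)  # z components snap to 0: the segment lies on the c-surface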
Keywords :
assembling; computer vision; image sequences; learning systems; modelling; path planning; real-time systems; robot programming; stereo image processing; configuration space; fine motion path; human assembly action modelling; manipulate phase; object tracking; path segment reconstruction; stereo imaging system; Assembly systems; Fingers; Humans; Image reconstruction; Image segmentation; Layout; Motion analysis; Real time systems; Robotic assembly; Robotics and automation
Conference_Title :
1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-3700-X
DOI :
10.1109/MFI.1996.572201