Title :
Knowledge-based robotic grasping
Author :
Stansfield, S.A.
Author_Institution :
Sandia Nat. Lab., Albuquerque, NM, USA
Abstract :
A general-purpose robotic grasping system for use in unstructured environments is described. Using computer vision and a compact set of heuristics, the system automatically generates the robot arm and hand motions required for grasping an unmodeled object. The utility of such a system is most evident in environments where the robot will have to grasp and manipulate a variety of unknown objects, but where many of the manipulation tasks may be relatively simple. Examples of such domains are planetary exploration and astronaut assistance, undersea salvage and rescue, and nuclear waste site clean-up. A two-stage model of grasping is described. Stage one is an orientation of the hand and wrist and a ballistic reach toward the object; stage two is hand preshaping and adjustment. Visual features are first extracted from the unmodeled object. These features and their relations are used by an expert system to generate a set of valid reaches/grasps for the object. These grasps are then used to drive the robot hand and arm, bringing the fingers into contact with the object in the desired configuration.
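The abstract outlines a feature-driven, rule-based pipeline: extract coarse visual features, apply heuristic rules to propose candidate reaches/grasps, then preshape the hand before contact. The sketch below is a hypothetical Python illustration of that flow, not the authors' implementation; the feature names, rules, grasp types, and numeric thresholds are all illustrative assumptions.

```python
# Hypothetical sketch of the two-stage, rule-based grasp selection described
# in the abstract. Features, rules, and thresholds are assumptions for
# illustration only, not the paper's actual expert-system rules.

from dataclasses import dataclass
from typing import List


@dataclass
class VisualFeatures:
    """Coarse shape features assumed to come from the vision stage."""
    width_cm: float       # narrowest graspable dimension
    height_cm: float
    has_handle: bool
    is_flat: bool         # e.g. plate- or sheet-like


@dataclass
class Grasp:
    grasp_type: str       # e.g. "wrap", "pinch", "hook"
    approach: str         # e.g. "from_above", "from_side"


def generate_grasps(f: VisualFeatures) -> List[Grasp]:
    """Stage one: heuristic rules map features to candidate reach/grasp pairs
    (hand and wrist orientation plus approach direction)."""
    grasps: List[Grasp] = []
    if f.has_handle:
        grasps.append(Grasp("hook", "from_side"))
    if f.width_cm <= 8.0 and not f.is_flat:
        grasps.append(Grasp("wrap", "from_side"))
    if f.is_flat or f.width_cm <= 3.0:
        grasps.append(Grasp("pinch", "from_above"))
    return grasps


def preshape(g: Grasp, f: VisualFeatures) -> dict:
    """Stage two: hand preshaping, reduced here to a nominal finger aperture."""
    aperture = min(f.width_cm + 2.0, 12.0)   # add clearance, clamp to hand span
    return {"grasp": g, "aperture_cm": aperture}


if __name__ == "__main__":
    mug = VisualFeatures(width_cm=9.0, height_cm=10.0, has_handle=True, is_flat=False)
    for g in generate_grasps(mug):
        print(preshape(g, mug))
```

In this toy version the rules simply fire on geometric predicates; the paper's system additionally uses relations between features when generating the set of valid reaches/grasps.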
Keywords :
computer vision; computerised control; knowledge based systems; robots; expert system; heuristics; manipulation tasks; robotic grasping; visual feature extraction; Fingers; Friction; Grasping; Knowledge based systems; Laboratories; Radioactive pollution; Robot sensing systems; Robotic assembly; Robotics and automation; Wrist;
Conference_Title :
Proceedings of the 1990 IEEE International Conference on Robotics and Automation
Conference_Location :
Cincinnati, OH
Print_ISBN :
0-8186-9061-5
DOI :
10.1109/ROBOT.1990.126173