DocumentCode
3514224
Title
The next best touch for model-based localization
Author
Hebert, Paul ; Howard, T. ; Hudson, Nicolas ; Ma, Jiaxin ; Burdick, Joel W.
Author_Institution
Jet Propulsion Lab., California Inst. of Technol., Pasadena, CA, USA
fYear
2013
fDate
6-10 May 2013
Firstpage
99
Lastpage
106
Abstract
This paper introduces a tactile or contact method whereby an autonomous robot equipped with suitable sensors can choose the next sensing action involving touch in order to accurately localize an object in its environment. The method uses an information gain metric based on the uncertainty of the object's pose to determine the next best touching action. Intuitively, the optimal action is the one that is the most informative. The action is then carried out and the state of the object's pose is updated using an estimator. The method is further extended to choose the most informative action to simultaneously localize and estimate the object's model parameter or model class. Results are presented both in simulation and in experiment on the DARPA Autonomous Robotic Manipulation Software (ARM-S) robot.
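The selection criterion described in the abstract (pick the touch action with the greatest expected information gain over the pose belief, then update the estimate from the resulting contact) can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the sampled pose belief, the Gaussian contact-noise model, the projection-based predict_contact stand-in, and all function names are assumptions made only for the sketch.

```python
import numpy as np

# Hypothetical sketch of information-gain-based "next best touch" selection
# over a sampled belief of the object pose (not the paper's actual code).

def entropy(weights):
    """Shannon entropy of a normalized belief (weights over pose hypotheses)."""
    w = weights[weights > 1e-12]
    return -np.sum(w * np.log(w))

def predict_contact(pose, direction):
    """Predicted contact reading for a probe moving along `direction`:
    here simply the object offset projected onto the probe axis,
    a stand-in for simulating contact against the object model."""
    return float(np.dot(direction, pose))

def expected_information_gain(weights, poses, direction, noise_std=0.005, n_bins=40):
    """Expected reduction in pose-belief entropy for one candidate touch action."""
    prior_h = entropy(weights)
    z_pred = np.array([predict_contact(p, direction) for p in poses])
    z_grid = np.linspace(z_pred.min() - 3 * noise_std,
                         z_pred.max() + 3 * noise_std, n_bins)
    exp_post_h, total_pz = 0.0, 0.0
    for z in z_grid:
        lik = np.exp(-0.5 * ((z - z_pred) / noise_std) ** 2)  # Gaussian contact noise
        pz = np.sum(weights * lik)
        if pz < 1e-12:
            continue
        posterior = weights * lik / pz   # Bayes update for this hypothetical reading
        exp_post_h += pz * entropy(posterior)
        total_pz += pz
    return prior_h - exp_post_h / max(total_pz, 1e-12)

def next_best_touch(weights, poses, candidate_directions):
    """Return the probe direction with maximum expected information gain."""
    gains = [expected_information_gain(weights, poses, d) for d in candidate_directions]
    return candidate_directions[int(np.argmax(gains))], gains

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy belief: object (x, y) offset, far more uncertain along x than y.
    poses = rng.normal(0.0, [0.05, 0.005], size=(500, 2))
    weights = np.full(len(poses), 1.0 / len(poses))
    candidates = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]  # probe along x or y
    best, gains = next_best_touch(weights, poses, candidates)
    print("expected gains:", np.round(gains, 3), "-> probe along", best)
```

In the toy run, the prior is far more uncertain along x than along y, so probing along x yields the larger expected gain and is selected. The paper applies the same idea to full object pose on the ARM-S robot and, in its extension, to simultaneously estimating model parameters or model class.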
Keywords
dexterous manipulators; image sensors; mobile robots; parameter estimation; pose estimation; robot vision; touch (physiological); ARM-S; DARPA autonomous robotic manipulation software robot; autonomous robot; contact method; information gain metric; model class estimation; model parameter estimation; model-based localization; next best touch; next sensing action; object localization; object pose uncertainty; tactile method; Computational modeling; Entropy; Equations; Gain measurement; Mathematical model; Robots; Trajectory;
fLanguage
English
Publisher
ieee
Conference_Titel
2013 IEEE International Conference on Robotics and Automation (ICRA)
Conference_Location
Karlsruhe, Germany
ISSN
1050-4729
Print_ISBN
978-1-4673-5641-1
Type
conf
DOI
10.1109/ICRA.2013.6630562
Filename
6630562