Title :
Calibration-free robots for cost-effective, dependable and robust automation
Author_Institution :
Intell. Robots Lab., Bundeswehr Univ., Munich
Abstract :
Robots that never need any calibration of their kinematics, actuators or sensors promise great advantages in terms of maintenance cost, robustness and dependability. Here we propose an approach for realizing such robots for calibration-free, vision-based object manipulation. The underlying concepts rest on laws of projective geometry that hold regardless of camera characteristics, and in some cases on learning by doing. No quantitative models of the robot's kinematics, control characteristics or sensors are used. Consequently, even gross changes in those characteristics, as may be caused, e.g., by the aging of parts or by maintenance work, are tolerated, often without any degradation of the robot's performance. The proposed approach is based on a vision system with an uncalibrated camera pair observing the robot's work space as the main sensor, together with a strategy that lets the robot automatically learn the relationships between motor control commands and the resulting sensor data. Real robots controlled on the basis of this approach have proved their effectiveness, adaptability and robustness in extensive real-world tests.
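The "learning by doing" strategy sketched in the abstract, relating motor control commands to resulting sensor data without any camera or kinematic model, can be illustrated with a standard uncalibrated visual-servoing idea: fit an image Jacobian from observed exploratory moves and invert it to drive the image-space error toward zero. This is a minimal sketch of that general technique, not the authors' exact method; the function names and the linear-mapping assumption are illustrative only.

```python
import numpy as np

def estimate_jacobian(moves, feature_deltas):
    """Least-squares fit of the mapping: feature change ~= J @ motor move.

    moves:          (N, m) array of exploratory motor commands
    feature_deltas: (N, f) array of resulting image-feature shifts (pixels)
    Returns J of shape (f, m); no camera parameters are needed.
    """
    # Solve moves @ X = feature_deltas for X = J.T in the least-squares sense.
    X, *_ = np.linalg.lstsq(moves, feature_deltas, rcond=None)
    return X.T

def command_toward(J, feature_error, gain=0.5):
    """Motor command that moves the observed features toward the target.

    Uses the pseudo-inverse of the learned Jacobian; gain < 1 damps the
    step so that model error does not cause overshoot.
    """
    return -gain * np.linalg.pinv(J) @ feature_error
```

In this sketch, recalibration after a gross change (worn gears, a replaced camera) amounts to collecting a few fresh exploratory moves and refitting `J`, which is exactly the kind of automatic adaptation the abstract claims.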
Keywords :
calibration; image sensors; learning (artificial intelligence); manipulator kinematics; object detection; robot vision; robust control; actuator; calibration-free robot; motor control command; projective geometry; robot kinematics; robot learning; robust cost-effective robot automation; sensor data; uncalibrated camera pair; vision-based object manipulation; Cameras; Kinematics; Orbital robotics; Robot control; Robot sensing systems; Robot vision systems; Robotics and automation; Robustness; Sensor phenomena and characterization; Sensor systems;
Conference_Titel :
2008 IEEE International Conference on Automation and Logistics (ICAL 2008)
Conference_Location :
Qingdao
Print_ISBN :
978-1-4244-2502-0
Electronic_ISBN :
978-1-4244-2503-7
DOI :
10.1109/ICAL.2008.4636157