DocumentCode :
3449274
Title :
Sensor fusion in telerobotic task controls
Author :
Bejczy, Antal K. ; Kim, Won S.
Author_Institution :
Jet Propulsion Lab., California Inst. of Technol., Pasadena, CA, USA
Volume :
3
fYear :
1995
fDate :
5-9 Aug 1995
Firstpage :
241
Abstract :
This paper describes a display and control methodology for combining (or fusing) different multidimensional sensor data to guide the successful performance of telerobotic contact or near-contact tasks in both manual and supervised automatic modes of control. Success is defined as a mapping of control goals or subgoals into a multidimensional data space. Several implemented examples are presented and illustrated. The methodology can also be extended to virtual reality simulation of telerobotic task scenarios through proper physical modeling of sensors.
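The abstract's core idea, defining task success as a mapping of control subgoals into a multidimensional sensor data space, can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's implementation: the fused vector, the box-region subgoal test, and all numeric values are assumptions for illustration only.

```python
def fuse_readings(readings):
    """Concatenate heterogeneous sensor readings (e.g. force, torque,
    proximity) into one flat vector in a multidimensional data space."""
    return [x for reading in readings for x in reading]

def subgoal_satisfied(fused, lower, upper):
    """A control subgoal expressed as a box region in the fused data
    space: 'success' means every component lies within its bounds."""
    return all(lo <= x <= hi for x, lo, hi in zip(fused, lower, upper))

# Hypothetical contact-task check (illustrative values, not from the paper):
# keep each contact-force axis under 5 N while torques stay near zero.
force = [1.2, 0.4, 3.0]        # N
torque = [0.05, 0.02, 0.01]    # N*m
fused = fuse_readings([force, torque])
lower = [0.0] * 3 + [-0.1] * 3
upper = [5.0] * 3 + [0.1] * 3
print(subgoal_satisfied(fused, lower, upper))  # prints True for these readings
```

The same test could drive either a manual-mode display (highlighting which components are out of bounds) or a supervised automatic controller (gating the next subgoal), which is the dual use the abstract describes.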
Keywords :
discrete event simulation; sensor fusion; telerobotics; virtual reality; event driven displays; multidimensional data space; multidimensional sensor data; physical modeling; simulation; telerobotic task control; Automatic control; Displays; Force sensors; Motion control; Multidimensional systems; Robotics and automation; Sensor fusion; Tactile sensors; Telerobotics; Torque
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '95): Human Robot Interaction and Cooperative Robots
Conference_Location :
Pittsburgh, PA
Print_ISBN :
0-8186-7108-4
Type :
conf
DOI :
10.1109/IROS.1995.525890
Filename :
525890