DocumentCode
3095485
Title
Learning task specific plans through sound and visually interpretable demonstrations
Author
Veeraraghavan, Harini ; Veloso, Manuela
Author_Institution
Comput. Sci. Dept., Carnegie Mellon Univ., Pittsburgh, PA
fYear
2008
fDate
22-26 Sept. 2008
Firstpage
2599
Lastpage
2604
Abstract
Autonomous robots operating in human environments will need to learn to perform new tasks automatically, without requiring the implementation of task-specific actions or time-consuming deliberative planning at run-time. In this work, we contribute a demonstration-based approach for teaching a robot task-specific planners for complex sequential tasks with repetitions. Task complexity arises from step repetitions, execution failures, and conditional plan execution. Our demonstration approach uses sound and visually interpretable cues to guide the robot and to indicate the various actions and objects. The robot in turn performs the actions and generalizes its execution into a task-specific planner. We demonstrate successful plan learning for two different tasks implemented in real-world settings.
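
Illustration (not part of the original record): the abstract describes generalizing a demonstrated execution into a planner that handles step repetitions. As a minimal sketch of that idea only, and assuming a hypothetical representation of a demonstration as a sequence of (action, object) steps indicated by cues, the following Python code compresses consecutively repeated blocks of demonstrated steps into looped plan steps; the names and the block-matching heuristic are illustrative assumptions, not the authors' method.

    # Hypothetical sketch: each demonstrated step is an (action, object) pair
    # indicated by a sound/visual cue; repeated blocks become loops in the plan.
    from typing import List, Tuple

    Step = Tuple[str, str]            # (action, object), e.g. ("pick", "ball")
    PlanStep = Tuple[List[Step], int]  # (block of steps, repetition count)

    def generalize_demo(demo: List[Step], max_block: int = 4) -> List[PlanStep]:
        """Collapse consecutively repeated blocks of steps into looped plan steps."""
        plan: List[PlanStep] = []
        i = 0
        while i < len(demo):
            best_len, best_reps = 1, 1
            # Try block lengths from longest to shortest and count repetitions.
            for blen in range(min(max_block, len(demo) - i), 0, -1):
                block = demo[i:i + blen]
                reps = 1
                while demo[i + reps * blen: i + (reps + 1) * blen] == block:
                    reps += 1
                if reps > 1 and reps * blen > best_reps * best_len:
                    best_len, best_reps = blen, reps
            plan.append((demo[i:i + best_len], best_reps))
            i += best_len * best_reps
        return plan

    if __name__ == "__main__":
        demo = [("pick", "ball"), ("drop", "bin"),
                ("pick", "ball"), ("drop", "bin"),
                ("wave", "camera")]
        for block, reps in generalize_demo(demo):
            print(f"repeat {reps}x: {block}")

On this toy demonstration the pick/drop block is detected as repeating twice, so the learned plan has a two-iteration loop followed by a single wave step; handling execution failures and conditional branches, as the paper does, would require additional plan structure beyond this sketch.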
Keywords
intelligent robots; learning systems; autonomous robots; complex sequential tasks; demonstration-based approach; execution failures; human environments; learning task specific plans; plan learning; sound interpretable demonstrations; task-specific actions; time-consuming deliberative planning; visually interpretable demonstrations; Humans; Object recognition; Planning; Robot sensing systems; Robots; Uncertainty; Visualization;
fLanguage
English
Publisher
ieee
Conference_Title
2008 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2008)
Conference_Location
Nice
Print_ISBN
978-1-4244-2057-5
Type
conf
DOI
10.1109/IROS.2008.4651002
Filename
4651002