DocumentCode :
2424982
Title :
Interactive biped locomotion based on visual/auditory information
Author :
Ogura, Yu. ; Sugahara, Yusuke ; Kaneshima, Yoshiharu ; Hieda, Naoki ; Lim, Hun-ok ; Takanishi, Atsuo
Author_Institution :
Graduate Sch. of Sci. & Eng., Waseda Univ., Tokyo, Japan
fYear :
2002
fDate :
2002
Firstpage :
253
Lastpage :
258
Abstract :
This paper describes an interactive locomotion method for a biped humanoid robot. The method consists of two main parts: a pattern generator and a human-robot interface. The human-robot interface is used to achieve real-time interactive locomotion. In particular, visual information and voice instructions are employed to determine locomotion parameters such as step length, step direction, and the number of steps. The motion of the lower limbs is generated by the online pattern generator based on these locomotion parameters. Continuous locomotion experiments are carried out in real time using WABIAN-RV. The experimental results show the feasibility of the proposed interactive locomotion method.
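The abstract outlines a pipeline in which recognized voice instructions and visually obtained information are mapped to locomotion parameters (step length, step direction, number of steps) that feed an online pattern generator. The snippet below is a minimal, hypothetical Python sketch of that parameter-mapping step only; the names, structure, and numeric values are illustrative assumptions and do not come from the paper.

```python
# Hypothetical sketch of mapping a recognized voice command plus a visually
# estimated target bearing to locomotion parameters, as described in the abstract.
# All names and values are illustrative, not taken from the paper.
from dataclasses import dataclass


@dataclass
class LocomotionParameters:
    step_length_m: float        # forward step length per step
    step_direction_deg: float   # heading change per step
    num_steps: int              # number of steps to execute


def parameters_from_instruction(command: str,
                                target_bearing_deg: float) -> LocomotionParameters:
    """Map a recognized voice command and a visually measured bearing
    to locomotion parameters for an online pattern generator."""
    if command == "forward":
        return LocomotionParameters(0.15, 0.0, 3)
    if command == "turn":
        # Distribute the visually measured bearing over a few steps.
        return LocomotionParameters(0.0, target_bearing_deg / 3.0, 3)
    # "stop" or any unrecognized command: stand still.
    return LocomotionParameters(0.0, 0.0, 0)


if __name__ == "__main__":
    params = parameters_from_instruction("turn", 30.0)
    print(params)  # LocomotionParameters(step_length_m=0.0, step_direction_deg=10.0, num_steps=3)
```

In the paper's system these parameters would be consumed online by the lower-limb pattern generator; the mapping above merely illustrates the interface between recognized instructions and walking parameters.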
Keywords :
interactive systems; legged locomotion; real-time systems; robot programming; robot vision; speech recognition; speech-based user interfaces; WABIAN-RV; biped humanoid robot; continuous locomotion experiments; human-robot interface; interactive biped locomotion; lower-limb motion; pattern generator; real-time interactive locomotion; step direction; step length; visual information; visual/auditory information; voice instructions; Collaborative work; Design engineering; Educational robots; Human robot interaction; Humanoid robots; Learning systems; Legged locomotion; Manufacturing automation; Robot sensing systems; Robotics and automation;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 11th IEEE International Workshop on Robot and Human Interactive Communication (ROMAN 2002)
Print_ISBN :
0-7803-7545-9
Type :
conf
DOI :
10.1109/ROMAN.2002.1045631
Filename :
1045631