DocumentCode :
3051496
Title :
Multichannel audio aided dynamical perception for prosthetic hand biofeedback
Author :
González, Jose ; Yu, Wenwei
Author_Institution :
Chiba Univ., Chiba, Japan
fYear :
2009
fDate :
23-26 June 2009
Firstpage :
240
Lastpage :
245
Abstract :
Visual input is a prerequisite in most biofeedback studies of prosthetic hand control, since amputees have lost part of their proprioception. This study explores whether audio feedback alone can convey more than one independent variable, without relying on visual input, to improve the learning of new perceptions; in this case, artificial proprioception of a prosthetic hand. In this way, different features of the audio feedback can be examined to design applications that couple different sensory inputs and thereby improve the acceptance and control of prosthetic devices. Experiments were conducted to determine whether audio signals can serve as a multivariable dynamical sensory substitution in reaching movements. The results show that auditive feedback can be used to create a body image without visual contact as a guide, and thus can assist prosthetic hand users in internalizing new perceptions.
Keywords :
audio signal processing; mechanoception; prosthetics; artificial proprioception; auditive feedback; grasping movement; multichannel audio aided dynamical perception; prosthetic devices; prosthetic hand biofeedback; Artificial limbs; Biological control systems; Brain modeling; Medical treatment; Monitoring; Neurofeedback; Prosthetic hand; Real time systems; Rehabilitation robotics; Visual system;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Rehabilitation Robotics, 2009. ICORR 2009. IEEE International Conference on
Conference_Location :
Kyoto International Conference Center, Kyoto, Japan
ISSN :
1945-7898
Print_ISBN :
978-1-4244-3788-7
Electronic_ISBN :
1945-7898
Type :
conf
DOI :
10.1109/ICORR.2009.5209521
Filename :
5209521