DocumentCode
663181
Title
HARMONIE: A multimodal control framework for human assistive robotics
Author
Katyal, Kapil D. ; Johannes, Matthew S. ; McGee, Timothy G. ; Harris, Andrew J. ; Armiger, Robert S. ; Firpi, Alexander H. ; McMullen, David ; Hotson, Guy ; Fifer, Matthew S. ; Crone, Nathan E. ; Vogelstein, R. Jacob ; Wester, Brock A.
Author_Institution
Johns Hopkins Univ. Appl. Phys. Lab., Laurel, MD, USA
fYear
2013
fDate
6-8 Nov. 2013
Firstpage
1274
Lastpage
1278
Abstract
Effective user control of highly dexterous and robotic assistive devices requires intuitive and natural modalities. Although surgically implanted brain-computer interfaces (BCIs) strive to achieve this, a number of non-invasive engineering solutions may provide a quicker path to patient use by eliminating surgical implantation. We present the development of a semi-autonomous control system that utilizes computer vision, prosthesis feedback, effector-centric device control, smooth movement trajectories, and appropriate hand conformations to interact with objects of interest. Users can direct a prosthetic limb through an intuitive graphical user interface to complete multi-stage tasks using patient-appropriate combinations of control inputs such as eye tracking, conventional prosthetic controls/joysticks, surface electromyography (sEMG) signals, and neural interfaces (ECoG, EEG). Aligned with activities of daily living (ADL), these tasks include directing the prosthetic to specific locations or objects, grasping objects by modulating hand conformation, and acting upon grasped objects, such as self-feeding. This Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE) semi-autonomous control system lowers the user's cognitive load, leaving the bulk of command and control of the device to the computer. This flexible and intuitive control system could serve patient populations ranging from wheelchair-bound quadriplegics to upper-limb amputees.
Keywords
artificial limbs; assisted living; augmented reality; brain-computer interfaces; cognition; dexterous manipulators; electroencephalography; electromyography; end effectors; graphical user interfaces; handicapped aids; medical robotics; medical signal processing; mobile robots; robot vision; wheelchairs; ADL; BCI; ECoG; EEG; HARMONIE; activities of daily living; command and control; computer vision; dexterous assistive devices; effector centric device control; eye tracking; graphical user interface; hand conformations; human assistive robotics; hybrid augmented reality multimodal operation neural integration environment; intuitive control system; intuitive modalities; multimodal control framework; multistage tasks; natural modalities; neural interfaces; noninvasive engineering solutions; objects grasping; objects interaction; patient populations; patient use; prosthesis feedback; prosthetic controls; prosthetic joysticks; prosthetic limb; robotic assistive devices; sEMG signals; self-feeding; semiautonomous control system; smooth movement trajectories; surface electromyography signal; surgical implantation; surgically implanted brain-computer interfaces; upper-limb amputees; user cognitive load; user control; wheelchair-bound quadriplegics; Computers; Electroencephalography; Prosthetics; Robots; Sensors; Trajectory; Assistive robotics; brain-computer interface; brain-machine interface; computer vision; hybrid BCI/BMI; intelligent robotics; modular prosthetic limb; neural prosthetic system; prosthetics; robotic limb; semi-autonomous;
fLanguage
English
Publisher
ieee
Conference_Titel
6th International IEEE/EMBS Conference on Neural Engineering (NER), 2013
Conference_Location
San Diego, CA
ISSN
1948-3546
Type
conf
DOI
10.1109/NER.2013.6696173
Filename
6696173