DocumentCode
252248
Title
A Kinect based vibrotactile feedback system to assist the visually impaired
Author
Yelamarthi, Kumar; DeJong, Brian P.; Laubhan, Kevin
Author_Institution
Sch. of Eng. & Technol., Central Michigan Univ., Mount Pleasant, MI, USA
fYear
2014
fDate
3-6 Aug. 2014
Firstpage
635
Lastpage
638
Abstract
This paper presents a Microsoft Kinect-based vibrotactile feedback system that aids navigation for the visually impaired. The lightweight wearable system interprets the visual scene and presents obstacle distance and characteristic information to the user. The scene is converted into a distance map by the Kinect, then processed and interpreted on an Intel Next Unit of Computing (NUC). That information is converted by a microcontroller into vibrotactile feedback, presented to the user through two four-by-four vibration motor arrays woven into gloves. The system is shown to successfully identify, track, and present the closest object, the closest human, and multiple humans, and to perform distance measurements.
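To illustrate the processing chain the abstract describes (depth map in, per-motor vibration intensities out), the following is a minimal sketch, not the authors' implementation. It assumes the Kinect depth frame is already available as a NumPy array in millimetres; the function name, the 500-4000 mm working range, and the intensity scaling are illustrative assumptions, while the four-by-four grid mirrors the motor array size stated in the abstract. Frame acquisition and the serial link from the NUC to the microcontroller are left out.

# Hypothetical sketch: collapse a Kinect depth frame into a 4x4 grid of
# vibration intensities, loosely following the pipeline in the abstract.
# Depth acquisition and the microcontroller link are outside this sketch.
import numpy as np

def depth_to_vibration(depth_mm, grid=4, min_range_mm=500.0, max_range_mm=4000.0):
    """Map a depth image (mm) to a grid x grid array of motor intensities
    in [0, 255]; nearer obstacles produce stronger vibration (assumed scaling)."""
    h, w = depth_mm.shape
    out = np.zeros((grid, grid), dtype=np.uint8)
    for r in range(grid):
        for c in range(grid):
            cell = depth_mm[r * h // grid:(r + 1) * h // grid,
                            c * w // grid:(c + 1) * w // grid]
            valid = cell[cell > 0]          # zero marks missing depth readings
            if valid.size == 0:
                continue                    # nothing detected in this cell
            nearest = float(valid.min())
            # Linear map: max_range -> 0, min_range (or closer) -> 255.
            scale = (max_range_mm - nearest) / (max_range_mm - min_range_mm)
            out[r, c] = int(255 * np.clip(scale, 0.0, 1.0))
    return out

# Example with a synthetic 480x640 frame: a nearby object in the upper-left
# corner yields a strong intensity in that cell and weak intensities elsewhere.
frame = np.full((480, 640), 3500, dtype=np.uint16)
frame[:120, :160] = 800
print(depth_to_vibration(frame))

In the system described, one such intensity grid per hand would be sent to the microcontroller, which drives the corresponding four-by-four vibration motor array in each glove.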
Keywords
feedback; handicapped aids; haptic interfaces; navigation; wearable computers; Intel Next Unit of Computing; Microsoft Kinect; NUC; lightweight wearable system; navigation; obstacle distance; vibrotactile feedback system; visually impaired assistance; Assistive devices; Microcontrollers; Object recognition; Sonar navigation; Vibrations; Visualization; blind; Kinect sensor; navigation assistance; tactile feedback; visually impaired
fLanguage
English
Publisher
ieee
Conference_Title
2014 IEEE 57th International Midwest Symposium on Circuits and Systems (MWSCAS)
Conference_Location
College Station, TX, USA
ISSN
1548-3746
Print_ISBN
978-1-4799-4134-6
Type
conf
DOI
10.1109/MWSCAS.2014.6908495
Filename
6908495