DocumentCode :
3764030
Title :
On combining wearable sensors and visual SLAM for remote controlling of low-cost micro aerial vehicles
Author :
José Martínez-Carranza;Francisco Marquez;Esteban O. Garcia;Angélica Muñoz-Melendez;Walterio Mayol-Cuevas
Author_Institution :
Instituto Nacional de Astrofísica, Óptica y Electrónica, Puebla, 72840, México
fYear :
2015
Firstpage :
232
Lastpage :
240
Abstract :
In this work we present initial results of a system that combines wearable technology and monocular simultaneous localisation and mapping (SLAM) for remote control of a low-cost micro aerial vehicle (MAV) that flies beyond the visual line of sight. To this purpose, as a first step, we use a state-of-the-art visual SLAM system, called ORB-SLAM, to create a 3D map of the scene. The visual data feeding ORB-SLAM comes from imagery transmitted by the on-board camera of our low-cost vehicle. This vehicle cannot process data on board; however, it can transmit images at a rate of 15-20 Hz, which we found sufficient to carry out the visual localisation and mapping. The second step in our system is to replace the conventional controller with a pair of wearable-sensor-based gloves worn by the user, so that he/she can command the MAV solely through hand gestures. Our goal is to show that the user can fly the vehicle beyond the line of sight using only the vehicle's pose and map estimates in real time, and that commanding the MAV with hand gestures allows him/her to focus more on the flight task. Our preliminary results indicate the feasibility of our approach.
Keywords :
"Vehicles","Visualization","Simultaneous localization and mapping","Cameras","Three-dimensional displays","Wearable sensors"
Publisher :
ieee
Conference_Titel :
Research, Education and Development of Unmanned Aerial Systems (RED-UAS), 2015 Workshop on
Type :
conf
DOI :
10.1109/RED-UAS.2015.7441012
Filename :
7441012