Title :
Aggressive navigation using high-speed natural feature point tracking
Author :
Raabe, Christian ; Henell, Daniel ; Saad, Emad ; Vian, John
Author_Institution :
Univ. of Tokyo, Tokyo, Japan
Abstract :
Presently, most autonomous aerial vehicles rely on satellite navigation such as GPS to sense their position in the Earth reference frame. However, reliance on GPS restricts the vehicle to missions where GPS signals are readily received. Motion capture systems are capable of indoor localization but require substantial infrastructure and are prone to occlusion. To overcome these restrictions, a self-contained high-speed vision system was developed at the University of Tokyo in collaboration with Boeing Research & Technology. The system has been flight tested and shown to be capable of drift-free position and attitude estimates without any reliance on GPS signals. Furthermore, its positional accuracy and update rate are at least one order of magnitude superior to those of uncorrected GPS. The vision system combines a high-speed camera with a lightweight computer and power supply into a self-contained computer-vision package. The computer processes the incoming image stream with a modified version of the University of Oxford's Parallel Tracking and Mapping (PTAM) SLAM algorithm. Using this algorithm, the location and pose of the camera (and of the MAV it is attached to) are estimated as it moves through space by mapping natural features as they first appear and by tracking those features as they move through or reappear in the camera's view. Our vision system was demonstrated on a hexacopter test bed. In a pair of experiments, the hexacopter was able to autonomously repeat a circuit of takeoffs and landings at predetermined, separated sites using only MEMS gyro sensors and our vision system. One experiment was performed inside Boeing's motion-capture-equipped Collaborative Systems Laboratory (CSL) to prove independence from GPS and to measure the accuracy of the vision system. The hexacopter performed 5 circuits of the navigation task over an area of approximately 8 x 8 m at an altitude of approximately 2 m. The vision system's camera was set to provide an image stream of 640 x 480 pixel resolution at 50 Hz. Upon comparison with motion capture data, position estimates from the vision system were shown to be free of drift, with an average error of 2.2 cm and a maximum error of 9.7 cm when the vision system coordinate frame was optimally aligned to the motion capture coordinate frame. A second experiment was performed in an open outdoor area, allowing for safe execution of more aggressive maneuvering. In this experiment, the vision system's camera was set to provide an image stream of 320 x 240 pixel resolution at 120 Hz. This experiment demonstrated the ability to perform takeoffs, landings, and transit at higher speeds than in the indoor experiment.
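Editor's note: the following is a minimal sketch of the kind of natural feature point tracking the abstract describes, not the authors' implementation. PTAM detects FAST corners and maintains a keyframe-based map; this stand-in uses OpenCV's Shi-Tomasi detector with pyramidal Lucas-Kanade optical flow, and all parameter values and the camera source are illustrative assumptions.

import cv2

cap = cv2.VideoCapture(0)  # hypothetical camera source; the paper used 50-120 Hz streams
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Detect an initial set of natural features (corners) as they first appear.
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300,
                                 qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    if points is not None and len(points) > 0:
        # Track existing features from the previous frame into the current one.
        new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
        points = new_points[status.ravel() == 1]

    # Replenish when too many tracks are lost (features left the camera's view).
    if points is None or len(points) < 100:
        points = cv2.goodFeaturesToTrack(gray, maxCorners=300,
                                         qualityLevel=0.01, minDistance=7)
    prev_gray = gray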
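Editor's note: the reported drift figures imply a trajectory-alignment step, since the vision system's positions live in an arbitrary frame and must be optimally aligned to the motion-capture frame before errors are measured. The abstract does not state the alignment method; a standard choice is SVD-based least-squares rigid alignment (Kabsch/Umeyama), sketched below with NumPy on synthetic stand-in data. The real evaluation would use the recorded flight logs in place of the generated arrays.

import numpy as np

def align_rigid(A, B):
    # Least-squares rigid alignment (Kabsch): find rotation R and translation t
    # minimizing ||(A @ R.T + t) - B||^2 over time-synchronized Nx3 trajectories.
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)               # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cb - R @ ca

# Synthetic stand-in data: mocap_xyz and vision_xyz stand in for the recorded
# motion-capture and vision-system position logs (Nx3, meters), here spanning
# roughly the 8 x 8 m area at ~2 m altitude described in the abstract.
rng = np.random.default_rng(0)
mocap_xyz = rng.uniform([-4.0, -4.0, 1.8], [4.0, 4.0, 2.2], size=(500, 3))
yaw = 0.3  # arbitrary frame offset between the two coordinate systems
R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
vision_xyz = (mocap_xyz - [1.0, 2.0, 0.0]) @ R_true.T + rng.normal(0.0, 0.02, (500, 3))

# Align, then measure residual error; the paper reports a 2.2 cm average and
# 9.7 cm maximum for the real indoor flight data after this kind of alignment.
R, t = align_rigid(vision_xyz, mocap_xyz)
err = np.linalg.norm(vision_xyz @ R.T + t - mocap_xyz, axis=1)
print(f"mean error {err.mean():.3f} m, max error {err.max():.3f} m")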
Keywords :
Global Positioning System; aerospace control; aerospace instrumentation; autonomous aerial vehicles; gyroscopes; image resolution; microsensors; object tracking; test equipment; GPS signals; MEMS gyro sensors; hexacopter test bed; high-speed camera; high-speed natural feature point tracking; lightweight computer; motion capture coordinate frame; power supply; self-contained high-speed vision system; vision system coordinate frame; navigation
Conference_Title :
Aerospace Conference, 2014 IEEE
Conference_Location :
Big Sky, MT
Print_ISBN :
978-1-4799-5582-4
DOI :
10.1109/AERO.2014.6836340