Title :
Sensor fusion methods for synthetic vision systems
Author :
Allerton, David J. ; Clare, Anthony J.
Author_Institution :
Dept. of Autom. Control & Syst. Eng., Sheffield Univ., UK
Abstract :
A millimetric radar imaging sensor can present a forward-looking view on a head-up display (HUD) to provide enhanced vision in the final stages of an approach, particularly in conditions of very low visibility. Although this increases situational awareness for the flight crew, the image quality is poor and there is no direct measure of system integrity. This paper describes a synthetic vision system that uses real-time image feature extraction to detect the runway in the image. This information is combined with knowledge of the aircraft position and attitude to provide flight guidance cues and to monitor the aircraft flight path. In the initial phase of the approach, GPS measurements are used to align the inertial reference system; during the final stages of the approach, inertial reference measurements are combined with imaging data to locate the vertices of the runway. Sensor fusion methods are used to provide flight guidance cues in the HUD and to derive integrity measures for the imaging system. A synthetic vision system overlays the computed runway position on the cluttered radar image and displays essential flight data. The paper outlines a radar sensor model that runs on a PC-based visual system; this model has been used to provide a realistic real-time radar image during development of the tracking algorithms. The inertial reference system and the tracking system are also modeled and combined in an extended Kalman filter to provide flight guidance and to give timely warning of system failures to the flight crew. The paper describes the sensor fusion method developed for failure detection and provides examples of low-visibility approaches flown in a flight simulator to demonstrate the effectiveness of these techniques.
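As a rough illustration of the fusion scheme summarized above, the Python sketch below shows an innovation-gated Kalman-filter update of the general kind the abstract describes: inertial data drive the prediction, image-derived runway-vertex measurements drive the correction, and a chi-square test on the innovation provides a simple integrity check. This is not the authors' implementation: the class name RunwayFusionEKF, the state and measurement dimensions, the matrices F, H, Q, R, and the gate threshold are all assumptions chosen for illustration.

```python
import numpy as np

class RunwayFusionEKF:
    """Minimal sketch of a filter blending inertial predictions with
    runway-vertex measurements extracted from a radar image. All names,
    dimensions, and noise values are illustrative, not from the paper."""

    def __init__(self, x0, P0, Q, R):
        self.x = x0    # error-state estimate (e.g., position error, m)
        self.P = P0    # state covariance
        self.Q = Q     # process noise: inertial drift between updates
        self.R = R     # measurement noise: image-feature jitter

    def predict(self, F):
        # Propagate the error state with the inertial model.
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, h, H, gate):
        # z: runway-vertex measurement from the image tracker
        # h: predicted measurement for the current state; H: its Jacobian
        nu = z - h(self.x)                    # innovation
        S = H @ self.P @ H.T + self.R         # innovation covariance
        # Chi-square test on the normalized innovation: an out-of-bounds
        # value flags a suspect radar measurement before it can corrupt
        # the guidance solution, giving a crude integrity alert.
        d2 = float(nu @ np.linalg.solve(S, nu))
        if d2 > gate:
            return False                      # reject measurement, raise alert
        K = self.P @ H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ nu
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P
        return True

# Illustrative use with a 3-state error model and a direct (linear)
# observation; a real system would use the perspective projection of the
# runway vertices for h and its Jacobian for H.
n = 3
ekf = RunwayFusionEKF(x0=np.zeros(n), P0=100.0 * np.eye(n),
                      Q=0.01 * np.eye(n), R=4.0 * np.eye(n))
F = np.eye(n)                 # placeholder inertial error dynamics
H = np.eye(n)                 # placeholder measurement Jacobian
h = lambda x: H @ x
ekf.predict(F)
accepted = ekf.update(np.array([1.0, -0.5, 0.2]), h, H,
                      gate=16.27)  # ~chi-square(3 dof, 99.9%)
print("measurement accepted:", accepted, "state:", ekf.x)
```

The gating step is what links the fusion to integrity monitoring: a single rejected innovation may be ordinary clutter, but a sustained run of rejections suggests an imaging-sensor fault and can be used to warn the flight crew.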
Keywords :
Global Positioning System; Kalman filters; aerospace control; aircraft displays; aircraft landing guidance; feature extraction; head-up displays; inertial navigation; inertial systems; radar tracking; sensor fusion; GPS measurements; aircraft flight path; aircraft position; cluttered radar image; extended Kalman filter; failure detection; flight crew; flight guidance cues; flight simulator; forward-looking view; head-up display; image quality; inertial reference measurements; inertial reference system; low visibility approaches; millimetric radar imaging sensor; radar model; real-time image feature extraction; runway position; runway vertices; sensor fusion; situational awareness; synthetic vision systems; system failures; system integrity; tracking system; Aircraft; Displays; Feature extraction; Image quality; Image sensors; Machine vision; Radar imaging; Radar tracking; Real time systems; Sensor fusion;
Conference_Title :
The 23rd Digital Avionics Systems Conference, 2004 (DASC 04)
Print_ISBN :
0-7803-8539-X
DOI :
10.1109/DASC.2004.1391310