Omnidirectional vision applied to Unmanned Aerial Vehicles (UAVs) attitude and heading estimation

Iván F. Mondragón, Pascual Campoy, Carol Martinez, Miguel Olivares

Research output: Contribution to journal › Article › peer-review

46 Citations (Scopus)

Abstract

This paper presents an aircraft attitude and heading estimator that uses catadioptric images as the principal sensor for a UAV, or as a redundant system for IMU (Inertial Measurement Unit) and gyro sensors. First, we explain how the unified theory for central catadioptric cameras is applied to attitude and heading estimation, showing how the skyline is projected onto the catadioptric image and how it is segmented and used to calculate the UAV's attitude. Then, we use appearance images to obtain a visual compass and calculate the relative rotation and heading of the aerial vehicle. Finally, tests and results using the UAV COLIBRI platform are presented and validated in real flights, comparing the estimated data with the inertial values measured on board.
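The visual-compass idea mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes the catadioptric image has already been unwrapped into a panoramic image, reduces it to a 1-D "appearance row" by column averaging, and estimates relative yaw as the circular column shift that minimizes the sum of squared differences between two frames. Function names (`appearance_row`, `visual_compass`, `shift_to_heading`) are illustrative only.

```python
def appearance_row(image):
    """Average each column of a 2-D grayscale image (list of rows)
    to get a 1-D appearance signature of the panorama."""
    h, w = len(image), len(image[0])
    return [sum(image[r][c] for r in range(h)) / h for c in range(w)]

def visual_compass(row_ref, row_cur):
    """Return the circular column shift that best aligns row_cur to
    row_ref (brute-force SSD over all shifts)."""
    w = len(row_ref)
    best_shift, best_ssd = 0, float("inf")
    for s in range(w):
        ssd = sum((row_ref[c] - row_cur[(c + s) % w]) ** 2 for c in range(w))
        if ssd < best_ssd:
            best_ssd, best_shift = ssd, s
    return best_shift

def shift_to_heading(shift, width):
    """Convert a column shift to a relative heading in degrees,
    assuming the panorama spans a full 360 degrees."""
    return shift * 360.0 / width
```

For example, a panorama that is a pure circular shift of the reference by 3 columns (out of 8) yields a shift of 3, i.e. a relative heading of 135 degrees; in practice, lighting changes and parallax make the SSD minimum approximate rather than exact.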

Original language: English
Pages (from-to): 809-819
Number of pages: 11
Journal: Robotics and Autonomous Systems
Volume: 58
Issue: 6
DOI
Status: Published - 30 Jun 2010
Published externally: Yes

