The UZH-FPV Drone Racing Dataset

High-speed, Aggressive 6DoF Trajectories for State Estimation and Drone Racing

About

We introduce the UZH-FPV Drone Racing dataset, the most aggressive visual-inertial odometry dataset to date. Large accelerations, fast rotations, and rapid apparent motion in the vision sensors make aggressive trajectories difficult for state estimation, yet many compelling applications, such as autonomous drone racing, require exactly this kind of high-speed state estimation, and existing datasets do not address it. These sequences were recorded with a first-person-view (FPV) drone racing quadrotor fitted with sensors and flown aggressively by an expert pilot. The trajectories include fast laps around a racetrack with drone racing gates, as well as free-form trajectories around obstacles, both indoors and outdoors. We provide camera images and IMU data from a Qualcomm Snapdragon Flight board, ground truth from a Leica Nova MS60 laser tracker, event data from an mDAVIS 346 event camera, and high-resolution RGB images from the pilot's FPV camera. With this dataset, our goal is to help advance the state of the art in high-speed state estimation.

Publication

When using this work in an academic context, please cite the following publication:

Are We Ready for Autonomous Drone Racing? The UZH-FPV Drone Racing Dataset

J. Delmerico, T. Cieslewski, H. Rebecq, M. Faessler, D. Scaramuzza

IEEE International Conference on Robotics and Automation (ICRA), 2019.


@InProceedings{Delmerico19icra,
 author = {Jeffrey Delmerico and Titus Cieslewski and Henri Rebecq and Matthias Faessler and Davide Scaramuzza},
 title = {Are We Ready for Autonomous Drone Racing? The {UZH-FPV} Drone Racing Dataset},
 booktitle = {{IEEE} Int. Conf. Robot. Autom. ({ICRA})},
 year = 2019
}

License

This dataset is released under the Creative Commons license (CC BY-NC-SA 3.0), which is free for non-commercial use (including research).

Acknowledgements

This work was supported by the National Centre of Competence in Research (NCCR) Robotics through the Swiss National Science Foundation, the SNSF-ERC Starting Grant, and the DARPA FLA Program.

This work would not have been possible without the assistance of Stefan Gächter, Zoltan Török, and Thomas Mörwald of Leica Geosystems and their support in gathering our data. Additional thanks go to Innovation Park Zürich and the Fässler family for providing experimental space, to iniVation AG and Prof. Tobi Delbruck for their support and guidance with the mDAVIS sensors, and to Hanspeter Kunz from the Department of Informatics at the University of Zurich for his support in setting up this website.