Previous Competitions

ICRA 2020 FPV Drone Racing VIO Competition

A second competition using this dataset was held for ICRA 2020. The goal was to estimate the quadrotor motion as accurately as possible, using any desired sensor combination, and to improve upon the previous year's submissions. Unfortunately, none of the entries outperformed the previous year's best submission. The competition was hosted on this page.

Results

  • Evaluation: Relative pose errors are computed over sub-trajectories of lengths {40, 60, 80, 100, 120} meters. The translation and rotation errors, averaged over all sequences, are used for ranking; a sketch of this metric is given after this list.
  • Naming rule: The name in the following table is the combination of the initials of the participant's last name and the affiliation.
  • Sensor coding: S – sensors from the Snapdragon board; D – sensors from the DAVIS.
  • References: References (e.g., a report) are made available with the participants' consent.
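
For reference, the following is a minimal Python sketch of how this sub-trajectory relative pose error can be computed. It assumes time-synchronized, spatially aligned ground-truth and estimated poses given as 4x4 homogeneous matrices; all function names are illustrative, and this is not the official evaluation code.

```python
import numpy as np

def traveled_distances(positions):
    """Cumulative ground-truth path length (meters) at each pose."""
    steps = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    return np.concatenate(([0.0], np.cumsum(steps)))

def relative_pose_errors(T_gt, T_est, lengths=(40, 60, 80, 100, 120)):
    """Mean translation error (% of length) and rotation error (deg/m)
    for each sub-trajectory length, given lists of 4x4 pose matrices."""
    dist = traveled_distances(np.array([T[:3, 3] for T in T_gt]))
    results = {}
    for L in lengths:
        t_errs, r_errs = [], []
        for i in range(len(T_gt)):
            # First pose j that lies at least L meters of traveled path after i.
            j = int(np.searchsorted(dist, dist[i] + L))
            if j >= len(T_gt):
                break  # remaining starting points cannot span L meters
            # Relative motion over the sub-trajectory, ground truth vs. estimate.
            dT_gt = np.linalg.inv(T_gt[i]) @ T_gt[j]
            dT_est = np.linalg.inv(T_est[i]) @ T_est[j]
            E = np.linalg.inv(dT_gt) @ dT_est  # residual transform
            t_errs.append(np.linalg.norm(E[:3, 3]) / L * 100.0)  # % of L
            cos_angle = np.clip((np.trace(E[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
            r_errs.append(np.degrees(np.arccos(cos_angle)) / L)  # deg/m
        if t_errs:
            results[L] = (float(np.mean(t_errs)), float(np.mean(r_errs)))
    return results
```

The per-length errors returned here would then be averaged over all sequences to produce the rankings in the tables below.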
Ranking | Name      | Sensors               | Translation (%) | Rotation (deg/m) | References
1       | OKVIS 2.0 | stereo (S) + inertial | 7.148           | 0.262            | report
2       | OpenVINS  | mono (S) + inertial   | 7.198           | 0.267            | report; code
3       | OSU-ETHZ  | stereo (S) + inertial | 7.277           | 0.266            | report

  • OKVIS 2.0: Dr. Stefan Leutenegger leads the Smart Robotics Lab (SRL) at Imperial College London.
  • OpenVINS: Patrick Geneva is part of the Robot Perception and Navigation Group (RPNG) at the University of Delaware.
  • OSU-ETHZ: Dr. Jianzhu Huai is with the SPIN lab at The Ohio State University.

IROS 2019 FPV Drone Racing VIO Competition

A first competition using this dataset was held jointly with the IROS 2019 Workshop “Challenges in Vision-based Drone Navigation” on November 8, 2019, in Macau. The goal was to estimate the quadrotor motion as accurately as possible, using any desired sensor combination. The winner was awarded 1,000 USD and invited to present their approach at the workshop. The competition was hosted on this page.

Results

  • Evaluation: Relative pose errors are computed over sub-trajectories of lengths {40, 60, 80, 100, 120} meters, as in the sketch given in the section above. The translation and rotation errors, averaged over all sequences, are used for ranking.
  • Naming rule: The name in the following table is the combination of the initials of the participant's last name and the affiliation.
  • Sensor coding: S – sensors from the Snapdragon board; D – sensors from the DAVIS.
  • References: References (e.g., a report) are made available with the participants' consent.
Ranking | Name  | Sensors                  | Translation (%) | Rotation (deg/m) | References
1       | g-d   | binocular (S) + inertial | 7.023           | 0.264            | report; code (OpenVINS)
2       | m-l   | mono (S) + inertial      | 7.034           | 0.266            | report
3       | u-t   | stereo (S) + inertial    | 7.778           | 0.285            | report; code (Basalt)
4       | a-u * | stereo (S) + inertial    | 11.869          | 0.619            | code
5       | r-u   | stereo (S) + inertial    | 36.048          | 1.894            |

  • g-d: Patrick Geneva is part of the Robot Perception and Navigation Group (RPNG) at the University of Delaware.
  • m-l: Thomas Mörwald is with Leica Geosystems.
  • u-t: Vladyslav Usenko is with the Computer Vision Group at the Technical University of Munich.
* The affiliation of this submission is unknown (not provided by the participant).