Previous Competitions

IROS 2020 FPV Drone Racing VIO Competition

A third competition using this dataset was held for IROS 2020. The goal was to estimate the quadrotor motion as accurately as possible, using any desired sensor combination, and to improve upon the submissions of the previous competitions. The top three submissions outperformed the best submission of the previous competition.

Results

  • Evaluation: Relative pose errors are computed over sub-trajectories of lengths {40, 60, 80, 100, 120} meters. The average translation and rotation errors over all sequences are used for ranking.
  • Naming rule: The name in the following table is the combination of the initials of the participant’s last name and the affiliation.
  • Sensor coding: S – sensors from the Snapdragon board; D – sensors from the DAVIS.
  • References: References (e.g., reports) are made available with the participants’ consent.
Ranking  Name                Sensors                Translation (%)  Rotation (deg/m)  References
1        MEGVII-3D           stereo (S) + inertial  6.819            0.263             report
         Can Huang, Ran Yan, and Xiao Liu are with the 3D Team at MEGVII Research.
2        VCU Robotics        stereo (S) + inertial  6.891            0.265             report
         He Zhang and Cang Ye are with the Dept. of Computer Science at Virginia Commonwealth University.
3        LARVIO              mono (S) + inertial    6.919            0.266             report
         Xiaochen Qiu is part of the Navigation and Embedded System Laboratory at Beihang University.
4        Lenovo_LR_ShangHai  stereo (S) + inertial  7.005            0.282             report
         LiXin Gao is at Lenovo Research, Shanghai.
5        Basalt              stereo (S) + inertial  7.494            0.268             report
         Nikolaus Demmel is with the Computer Vision Group, TUM.
6        Xin Zhang           stereo (S) + inertial  9.140            0.219             report
         Xin Zhang is with Shanghai Jiao Tong University.
7        QuetzalC++          mono (D) + inertial    34.273           1.635             report
         J. Arturo Cocoma-Ortega, L. Oyuki Rojas-Perez and Prof. Jose Martinez-Carranza are with the Group of Intelligent Unmanned Aerial Systems (iUAS) at the National Institute of Astrophysics, Optics and Electronics, San Andrés Cholula, Puebla, Mexico.
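The relative pose error used for ranking can be sketched as follows. This is a minimal NumPy illustration of the idea, not the official evaluation script; the function name, argument layout, and the assumption that poses are given as rotation matrices and positions sampled along the trajectory are our own.

```python
import numpy as np

def relative_pose_errors(gt_positions, gt_rotations, est_positions, est_rotations,
                         segment_lengths=(40, 60, 80, 100, 120)):
    """Relative pose error over sub-trajectories of the given lengths:
    translation drift as a percentage of the segment length, and rotation
    drift in degrees per meter, averaged over all segments."""
    # Cumulative distance traveled along the ground-truth trajectory.
    dists = np.concatenate(([0.0], np.cumsum(
        np.linalg.norm(np.diff(gt_positions, axis=0), axis=1))))
    trans_errs, rot_errs = [], []
    for L in segment_lengths:
        for i in range(len(gt_positions)):
            # First index j whose traveled distance exceeds that of i by L.
            j = np.searchsorted(dists, dists[i] + L)
            if j >= len(gt_positions):
                break  # remaining segments are shorter than L
            # Relative motion from i to j, ground truth vs. estimate.
            dR_gt = gt_rotations[i].T @ gt_rotations[j]
            dt_gt = gt_rotations[i].T @ (gt_positions[j] - gt_positions[i])
            dR_est = est_rotations[i].T @ est_rotations[j]
            dt_est = est_rotations[i].T @ (est_positions[j] - est_positions[i])
            # Translation error as a percentage of the segment length.
            trans_errs.append(100.0 * np.linalg.norm(dt_gt - dt_est) / L)
            # Rotation error angle (degrees) per meter traveled.
            dR_err = dR_gt.T @ dR_est
            angle = np.degrees(np.arccos(np.clip((np.trace(dR_err) - 1) / 2, -1, 1)))
            rot_errs.append(angle / L)
    # Assumes the trajectory is longer than the largest segment length.
    return float(np.mean(trans_errs)), float(np.mean(rot_errs))
```

In the actual evaluation, these per-segment errors are additionally averaged over all test sequences to produce a single translation and rotation score per submission.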

ICRA 2020 FPV Drone Racing VIO Competition

A second competition using this dataset was held for ICRA 2020. The goal was to estimate the quadrotor motion as accurately as possible, using any desired sensor combination, and to improve upon the previous year’s submissions. Unfortunately, none of the submissions outperformed the previous year’s best submission. The competition was hosted on this page.

Results

  • Evaluation: Relative pose errors are computed over sub-trajectories of lengths {40, 60, 80, 100, 120} meters. The average translation and rotation errors over all sequences are used for ranking.
  • Naming rule: The name in the following table is the combination of the initials of the participant’s last name and the affiliation.
  • Sensor coding: S – sensors from the Snapdragon board; D – sensors from the DAVIS.
  • References: References (e.g., reports) are made available with the participants’ consent.
Ranking  Name       Sensors                Translation (%)  Rotation (deg/m)  References
1        OKVIS 2.0  stereo (S) + inertial  7.148            0.262             report
         Dr. Stefan Leutenegger is leading the Smart Robotics Lab (SRL) at Imperial College London.
2        OpenVINS   mono (S) + inertial    7.198            0.267             report; code
         Patrick Geneva is part of the Robot Perception and Navigation Group (RPNG) at the University of Delaware.
3        OSU-ETHZ   stereo (S) + inertial  7.277            0.266             report
         Dr. Jianzhu Huai is with the SPIN Lab at The Ohio State University.

IROS 2019 FPV Drone Racing VIO Competition

A first competition using this dataset was held in conjunction with the IROS 2019 Workshop “Challenges in Vision-based Drone Navigation” on November 8, 2019 in Macau. The goal was to estimate the quadrotor motion as accurately as possible, using any desired sensor combination. The winner was awarded 1,000 USD and invited to present their approach at the workshop. The competition was hosted on this page.

Results

  • Evaluation: Relative pose errors are computed over sub-trajectories of lengths {40, 60, 80, 100, 120} meters. The average translation and rotation errors over all sequences are used for ranking.
  • Naming rule: The name in the following table is the combination of the initials of the participant’s last name and the affiliation.
  • Sensor coding: S – sensors from the Snapdragon board; D – sensors from the DAVIS.
  • References: References (e.g., reports) are made available with the participants’ consent.
Ranking  Name   Sensors                   Translation (%)  Rotation (deg/m)  References
1        g-d    binocular (S) + inertial  7.023            0.264             report; code (OpenVINS)
         Patrick Geneva is part of the Robot Perception and Navigation Group (RPNG) at the University of Delaware.
2        m-l    mono (S) + inertial       7.034            0.266             report
         Thomas Mörwald is with Leica Geosystems.
3        u-t    stereo (S) + inertial     7.778            0.285             report; code (Basalt)
         Vladyslav Usenko is with the Computer Vision Group at the Technical University of Munich.
4        a-u *  stereo (S) + inertial     11.869           0.619             code
5        r-u    stereo (S) + inertial     36.048           1.894
* The affiliation of this submission is unknown (not provided by the participant).