Drone-based vision has drawn increasing attention in recent years, with many applications including aerial photography and surveillance. Despite significant progress in general computer vision, existing algorithms are usually not optimal for drone platforms. Developing and evaluating new vision algorithms for drone-generated visual data is therefore a fundamental problem in drone-based applications. However, this goal is severely limited by the lack of large-scale dedicated benchmarks for assessing vision algorithms on drone platforms.
Motivated by this, we propose to organize the ICCV 2023 workshop challenge “Vision Meets Drones: A Challenge” (VisDrone2023) on Oct. 3rd, 2023, in conjunction with the IEEE International Conference on Computer Vision (ICCV 2023) in Paris, France, covering several core vision tasks on drone platforms. To this end, we have collected the large-scale drone-captured VisDrone2023 dataset with rich annotations, on which we evaluate and discuss state-of-the-art algorithms. In addition, we will invite researchers to participate in the challenges and discuss their research at the workshop, as well as to submit papers describing their research, experiments, or applications.
In this workshop, “Vision Meets Drones: A Challenge” (VisDrone2023), we aim to advance research in drone-captured visual data analysis across topics such as computer vision, large-scale learning, and visual surveillance. Encouraged by our previous successes with VisDrone 2018 (with ECCV), VisDrone 2019 (with ICCV), VisDrone 2020 (with ECCV), VisDrone 2021 (with ICCV), and VisDrone 2022 (with PRCV), we will further improve the workshop and focus on two core tasks: object detection and zero-shot object detection. The VisDrone Group hosts the object detection task, and the BRAIN Lab hosts the zero-shot object detection task.
The Brain and Artificial Intelligence Laboratory (BRAIN Lab) of Northwestern Polytechnical University is affiliated with the Key Laboratory of Information Fusion Technology under the Ministry of Education. The team targets cutting-edge research in the field of artificial intelligence and closely aligns with the country’s strategic needs. We carry out theoretical research, key technology breakthroughs, and system integration verification in the fields of intelligent remote sensing information processing, brain cognition and intelligent computing, and visual intelligent perception and processing.
Note: The top three contestants of each task will receive certificates. The winner will also receive 10,000 RMB and be invited to give a presentation at the ICCV 2023 Workshop.
- VisDrone 2023 will be organized in conjunction with ICCV 2023.
- VisDrone 2023 is co-organized by VisDrone Group and BRAIN Lab.
- The Zoom link for the ICCV 2021 Workshop “Vision Meets Drones 2021: A Challenge” is available.
- The paper submission system is now available. The deadline for workshop papers is August 7, 2021, AoE time.
- The deadline for the competition is 24:00 on July 15, 2021, AoE time.
- VisDrone 2021 will be organized in conjunction with ICCV 2021.
- Aug. 28, 2020: The Computer Vision for UAVs Workshop and Challenge will be held at 8:00 (UTC+1) on August 28.
- July 14, 2020: The evaluation server will be closed at 23:59 on July 15 (UTC+0 time).
- July 9, 2020: The paper submission system is now available. The paper submission deadline is extended to July 15.
- June 26, 2020: Due to the impact of COVID-19, the submission deadline is extended to July 15. Each team will have 5 additional submission opportunities.
Zhu P, Wen L, Du D, et al. Detection and Tracking Meet Drones Challenge[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021.
Huang P, Han J, Cheng D, et al. Robust region feature synthesizer for zero-shot object detection[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022: 7622-7631.
Li K, Wan G, Cheng G, et al. Object detection in optical remote sensing images: A survey and a new benchmark[J]. ISPRS Journal of Photogrammetry and Remote Sensing, 2020, 159: 296-307.