Overview
- The AGIUSEP 2022 Crowd Counting Challenge requires participating algorithms to count persons in each frame.
- The goal of a counting algorithm is to estimate the number of people in an image. We will provide a dataset of paired RGBT images. Each algorithm is evaluated by computing the mean absolute error (MAE) and mean squared error (MSE) between its predicted counts and the ground-truth counts on the evaluation set.
Dates
- [08.01]: Training, validation and testing data released
- [08.01]: Evaluation software released
- [09.15]: Result submission deadline
- [10.16]: Challenge results released
- [10.16]: Winner presents
- The result submission deadline is 24:00 (Beijing time) on September 15th.
Notice
Each team may register only one account; submission quota can be obtained by joining the WeChat group. To prevent a single team from registering multiple accounts, all members of a participating team are required to join the WeChat group. If the QR code expires, we will update it promptly. Old accounts cannot be reused; you must register a new account.
If you do not have WeChat, please send your application to tju.drone.vision@gmail.com. The application should include your account name, real name, institution, country, and email address, along with the names and institutions of all team members.

Challenge Guidelines
- The crowd counting evaluation page lists detailed information regarding how submissions will be scored.
- We encourage participants to use the provided training data, but additional training data is also allowed. The use of external data must be declared at submission time.
- The training images and their annotations, as well as the images in the test-challenge set, are available on the download page. Before participating, every user must create an account using an institutional email address; if you have any problems registering, please contact us. After registration, users submit their results through their accounts. Submitted results are evaluated according to the rules described on the evaluation page; please refer to that page for a detailed explanation.
Tools and Instructions
We provide extensive API support for the VisDrone images, annotations, and evaluation code. Please visit our GitHub repository to download the VisDrone API. For additional questions, please find the answers here or contact us.