Bridging Sensing, Planning and Interaction
Participants are provided with a sparse map of an environment and a simulated robot equipped with a camera that can rotate independently of the mobile base. Given a set of robot waypoints, the objective is to orient the camera toward the most informative parts of the environment to improve localization accuracy.
Full challenge details are available at: github.com/rvp-group/actloc_benchmark. If you have any questions or encounter issues, please open an issue on the repository; answers there help other participants as well.
There are two challenge tracks:
To ensure a fair comparison across all participants, we will evaluate your submission on a commercial workstation with the following specifications: an RTX 4090 GPU (24 GB VRAM) and 64 GB of RAM. Please make sure your method fits within these hardware constraints; submissions that exceed them may be disqualified.
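To sanity-check the 24 GB VRAM limit before submitting, a back-of-the-envelope estimate of a model's inference footprint can help. The helper below is a rough rule of thumb, not an official benchmark requirement: the 1.5x overhead factor for activations and CUDA context is an assumption, and real usage should be measured on a GPU.

```python
def model_memory_gb(num_params, bytes_per_param=2, overhead=1.5):
    """Rough VRAM estimate for inference: weight bytes times an overhead factor.

    bytes_per_param=2 assumes fp16/bf16 weights; overhead=1.5 is a crude
    allowance for activations and CUDA context (an assumption, not a rule).
    """
    return num_params * bytes_per_param * overhead / 1024**3

# e.g. a 7B-parameter model in fp16:
#   7e9 * 2 * 1.5 / 2^30 ~= 19.6 GB -- uncomfortably close to the 24 GB limit
```

If the estimate lands near the limit, consider measuring actual peak usage (for example with your framework's memory-profiling utilities) before submitting.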
We offer two ways to submit your method:

- Private submission: email a .zip file of your full repository to activepws@gmail.com with the subject line [CHALLENGE]. Include a README describing your changes, especially if they go beyond the method folder.
- Public submission: open a pull request on the benchmark repository, or attach a .zip of your method folder.

You can freely decide to submit privately or publicly; this does not affect your score. We expect most of your changes to be contained in the method folder. We will run your models on the test sets and publish the results on the website. For public submissions, scores will also be posted as a comment on your pull request.
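For either route, the repository can be packaged with the Python standard library alone. A minimal sketch follows; the paths are illustrative, and note that `shutil.make_archive` includes everything under the directory, so prune large artifacts (checkpoints, .git) you do not want shipped.

```python
import os
import shutil

def package_submission(repo_dir, out_stem):
    """Create <out_stem>.zip containing repo_dir, using only the stdlib.

    Note: shutil.make_archive archives everything under repo_dir
    (including e.g. .git), so remove anything you do not want to submit.
    """
    return shutil.make_archive(out_stem, "zip", root_dir=repo_dir)

# usage sketch (paths are illustrative):
# package_submission("my_actloc_repo", "submission")  # -> "submission.zip"
```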
The leaderboard will be published here as submissions are evaluated.