Bridging Sensing, Planning and Interaction

in conjunction with IROS

Monday 20th October - 8:30 (room 301)

Hangzhou, CHINA


Overview

Active perception plays a central role in robotics, enabling systems to intelligently acquire informative data by optimizing their sensing and interaction strategies. This workshop focuses on the intersection of perception, planning, and interaction, addressing key challenges in vision-based navigation and manipulation.

Topics of interest include data-driven perception, reinforcement learning for active vision, multi-robot exploration, and perception through interaction—encompassing understanding, reasoning, and decision-making. The event aims to bridge the gap between planning-centric robotics and perception-driven methodologies.

Manipulation and Grasping

Robot arms and hands adjust their pose, force, or tactile sensing to better understand objects before grasping them.

Exploration & Mapping (SLAM)

Robots actively plan their viewpoints to improve Simultaneous Localization and Mapping (SLAM) efficiency and accuracy.

Sensor Fusion

Actively combining information from RGB, LiDAR, radar, thermal, event cameras (...) to improve perception.

Next-Best-View Planning

Deciding where to move sensors next for better perception in tasks like localization, mapping, or object search.

Uncertainty-Aware Planning

Robots actively gather missing or uncertain information to improve their models and make better decisions.

Poster Presentation and Highlight Talks

Speakers from Diverse Application Areas

Keynote Speakers 🎤

Workshop Schedule ⏰

08:25 – 08:35 Welcome Remarks
Organizing Committee
08:35 – 09:00 Sebastian Scherer (CMU, remote)
Plenary Talk: Multi-Robot Information Gathering in Challenging Environments
09:00 – 09:25 Tai Wang (Shanghai AI Lab)
Plenary Talk: Towards a Vision-Language Navigation Foundation Model via Sim2Real
09:25 – 09:50 Marija Popovic (TU Delft)
Plenary Talk: Reinforcement Learning for Active Perception using UAVs
09:50 – 10:10 Spotlight Talks
Presentations from selected award finalists
10:10 – 10:50 Coffee Break ☕
Poster Session
10:50 – 11:15 Boyu Zhou (SUSTech)
Plenary Talk: Autonomous Exploration: From Traditional to AI-driven Approaches
11:15 – 11:40 Jianxiang Feng (Agile Robots)
Plenary Talk: Learning Robust Perception and Manipulation via Uncertainty-Aware Intelligence
11:40 – 12:05 Stefan Leutenegger (ETH Zurich)
Plenary Talk: Exploration with Drones for Geometric and Semantic Reconstruction in the Wild
12:05 – 12:30 Interactive Discussion
Guided group discussions including invited speakers and organizers on “What’s Next in Active Perception”
12:30 – 12:45 Award & Closing Remarks
Final remarks by the organizing committee and award presentation

Active Localization Challenge 🚀

You are provided with a sparse map of the environment and a robot equipped with a camera that can rotate freely with respect to the mobile base. Given a set of robot waypoints, the goal is to rotate the camera towards more informative parts of the map, improving localization accuracy.
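As a rough illustration of the task (not the official challenge interface), one simple baseline is to sweep candidate camera yaws at each waypoint and pick the one that keeps the most map landmarks inside the field of view. The function name, the 2D landmark representation, and the landmark-counting objective below are all illustrative assumptions.

```python
import math

def best_yaw(waypoint, landmarks, fov_deg=90.0, n_candidates=36):
    """Pick the world-frame camera yaw that sees the most landmarks.

    waypoint: (x, y) robot position; landmarks: iterable of (x, y) map points.
    This greedy visibility count is only a sketch of the idea; a real entry
    would score landmarks by their expected localization information.
    """
    half_fov = math.radians(fov_deg) / 2.0
    best_yaw_val, best_count = 0.0, -1
    for i in range(n_candidates):
        yaw = 2.0 * math.pi * i / n_candidates
        count = 0
        for lx, ly in landmarks:
            bearing = math.atan2(ly - waypoint[1], lx - waypoint[0])
            # wrap the angular difference to [-pi, pi] before the FOV test
            diff = (bearing - yaw + math.pi) % (2.0 * math.pi) - math.pi
            if abs(diff) <= half_fov:
                count += 1
        if count > best_count:
            best_yaw_val, best_count = yaw, count
    return best_yaw_val
```

With landmarks clustered along the positive x-axis, the baseline points the camera straight at them (yaw ≈ 0).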

Organizers

Sponsors

Chingmu
Z-up