





Bridging Sensing, Planning and Interaction
Active Perception is a key component in robotics, enabling systems to dynamically acquire useful information by optimizing their sensing strategies. This workshop will explore the intersection of sensor placement, planning, and interaction, addressing challenges in uncertainty-aware decision-making, localization, and multi-robot coordination.
Topics will include deep learning-driven perception, reinforcement learning for sensor selection, multi-robot exploration, and perception through interaction. The event aims to bridge the gap between planning-driven robotics approaches and perception-focused strategies.
Robot arms and hands adjust their pose, force, or tactile sensing to better understand objects before grasping them.
Robots actively plan their viewpoints to improve Simultaneous Localization and Mapping (SLAM) efficiency and accuracy.
Actively combining information from RGB, LiDAR, radar, thermal, and event cameras, among other modalities, to improve perception.
Deciding where to move sensors next for better perception in tasks like localization, mapping, or object search.
Robots actively gather missing or uncertain information to improve their models and make better decisions.
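As a concrete illustration of the last two topics, the choice of where to sense next is often framed as picking the viewpoint with the highest expected information gain over the robot's current belief. The snippet below is a minimal toy sketch of that idea, assuming a binary occupancy belief and an idealized sensor that fully resolves the cells it observes; all names and the 1-D grid are illustrative, not from any particular framework.

```python
import numpy as np

def cell_entropy(p):
    """Binary entropy (bits) of occupancy probabilities, safe at 0 and 1."""
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def expected_information_gain(belief, visible_mask):
    """Entropy currently held in the cells a viewpoint would observe.
    Assumes an ideal sensor, so observing a cell removes all its entropy."""
    return cell_entropy(belief[visible_mask]).sum()

# Toy 1-D occupancy belief: left half unknown (p = 0.5), right half known.
belief = np.array([0.5, 0.5, 0.5, 0.5, 0.99, 0.01, 0.99, 0.01])

# Candidate viewpoints mapped to the cells each one would observe.
views = {
    "left":  np.array([True, True, True, True, False, False, False, False]),
    "right": np.array([False, False, False, False, True, True, True, True]),
}

# Greedy next-best-view: the unknown left half carries ~1 bit per cell.
best = max(views, key=lambda v: expected_information_gain(belief, views[v]))
```

Real systems replace the toy grid with a 3-D map, model sensor noise and occlusion, and trade information gain against travel cost, but the greedy entropy-reduction loop above is the common core.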
Poster Presentation and Highlight Talks
Speakers from Diverse Application Areas
Big Reveal Coming Soon!
Stay Tuned
Talk: 🤖 Exploring the Frontiers of Active Perception
You are provided with a sparse map of the environment and a robot equipped with a camera that can rotate freely with respect to the mobile base. Given a set of robot waypoints, the goal is to rotate the camera toward the more informative parts of the map, improving localization accuracy.
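One simple baseline for this challenge is, at each waypoint, to sweep candidate camera yaws and pick the one that keeps the most map landmarks inside the field of view. The sketch below assumes a 2-D map of point landmarks and a fixed angular field of view; the function name, the landmark-count objective, and all parameters are illustrative assumptions, not the official challenge interface.

```python
import numpy as np

def best_camera_yaw(waypoint, landmarks, fov_deg=90.0, n_candidates=36):
    """Return the camera yaw (rad) that maximizes landmarks inside the FOV.

    waypoint:  (2,) array, robot position in the map frame
    landmarks: (N, 2) array of sparse map landmark positions
    """
    deltas = landmarks - waypoint                      # waypoint -> landmark
    bearings = np.arctan2(deltas[:, 1], deltas[:, 0])  # angle to each landmark
    half_fov = np.deg2rad(fov_deg) / 2.0
    candidates = np.linspace(-np.pi, np.pi, n_candidates, endpoint=False)
    counts = []
    for yaw in candidates:
        # wrap the angular difference to [-pi, pi] before thresholding
        diff = np.arctan2(np.sin(bearings - yaw), np.cos(bearings - yaw))
        counts.append(np.sum(np.abs(diff) <= half_fov))
    return candidates[int(np.argmax(counts))]

# Example: robot at the origin, three landmarks clustered to the east,
# one isolated to the northwest -- the chosen yaw should face east.
landmarks = np.array([[2.0, 0.3], [3.0, -0.2], [1.5, 0.1], [-2.0, 2.0]])
yaw = best_camera_yaw(np.zeros(2), landmarks)
```

A stronger entry would weight landmarks by their expected contribution to pose uncertainty rather than just counting them, but this greedy sweep is a reasonable starting point.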