Sensing and Perception

  • The role of robotics in asset management has conventionally been passive, consisting chiefly of repeated observation of assets and environments under fixed sampling strategies. We aim to advance a program of active and interactive sensing and perception that progresses through increasing levels of sophistication: robots that adaptively observe their environments; active perception, in which sensor reconfiguration forms part of the planning process; and, ultimately, interactive perception, in which mechanical intervention forms part of the perception process itself. This theme will tackle fundamental research questions in three areas:

    • Sensor design and adaptive control. Emerging adaptive sensing technologies such as steerable LiDAR and mask-based plenoptic imaging promise unprecedented robotic capabilities, such as seeing through murky water and around corners. However, there are no established methods for designing and controlling these devices in robotic tasks like asset inspection and maintenance. We will therefore develop novel methods that account for the dynamics of robotic systems and their working environments and that cope with the speed and flexibility of these emerging sensing technologies (a sensor-steering sketch follows this list).
    • Interpreting new kinds of sensors. Integrating a new sensing device into a robotic system is a challenging and skilled task: the sensing technologies impacting robotics today have each taken a decade or longer to master, and we need to accelerate this process. Unsupervised machine learning promises to let robotic systems learn to use new devices autonomously, but open questions remain around handling multiple, dissimilar sensors, especially in asynchronous and distributed systems (a cross-sensor calibration sketch follows this list). We will address these questions and develop ways to guarantee stability and performance in lifelong unsupervised learning.
    • Linking sensing with planning and intervention. When robots use manipulation as part of the perception process, new opportunities arise to jointly design the manipulators, sensors and algorithms that drive planning, perception and intervention. Current approaches consider these in isolation, following the principles of modular design; by breaking this modularity we propose to endow robotic systems with new abilities. We will establish novel methods to make this whole-system design approach tractable and maintainable, and to plan (and potentially dynamically allocate) computing tasks across the system’s heterogeneous distribution of computing and communication resources (a task-allocation sketch follows this list).
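
The following is a minimal sketch of the kind of adaptive steering loop envisaged in the first bullet. It is illustrative only: it assumes a steerable sensor that observes one azimuth bin per step, a Bernoulli occupancy belief per bin, and a slew-rate limit standing in for the platform dynamics; all names and parameter values are hypothetical.

```python
"""Minimal sketch: greedy information-gain steering for an adaptive sensor.

Illustrative assumptions (not from the proposal): the scene is a ring of
azimuth bins, each with an occupancy probability; the sensor observes one
bin per step; platform dynamics limit how far the beam can slew per step.
"""
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a Bernoulli occupancy belief."""
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def next_bearing(belief, current_bin, max_slew_bins):
    """Pick the reachable azimuth bin whose observation is expected to
    reduce uncertainty the most (greedy; exact only for an ideal sensor)."""
    n = len(belief)
    reachable = [(current_bin + d) % n
                 for d in range(-max_slew_bins, max_slew_bins + 1)]
    gains = {b: entropy(belief[b]) for b in reachable}
    return max(gains, key=gains.get)

def update(belief, b, hit, p_hit_occ=0.9, p_hit_free=0.1):
    """Bayes update of the occupancy belief for the observed bin."""
    prior = belief[b]
    like_occ = p_hit_occ if hit else 1 - p_hit_occ
    like_free = p_hit_free if hit else 1 - p_hit_free
    belief[b] = like_occ * prior / (like_occ * prior + like_free * (1 - prior))
    return belief

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truth = rng.random(36) < 0.3        # hidden occupancy of 36 azimuth bins
    belief = np.full(36, 0.5)           # uninformative prior
    bearing = 0
    for _ in range(100):
        bearing = next_bearing(belief, bearing, max_slew_bins=3)
        hit = truth[bearing] if rng.random() < 0.9 else not truth[bearing]
        belief = update(belief, bearing, hit)
    print(f"mean residual uncertainty: {entropy(belief).mean():.3f} bits")
```

The greedy expected-information-gain policy is the simplest possible choice; the methods proposed here would instead reason jointly over sensor reconfiguration, vehicle motion and the working environment.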
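As an illustration of the second bullet, the sketch below shows one self-supervised route to interpreting an uncalibrated sensor: regress its raw readings onto a time-aligned stream from a trusted sensor, with no labels. The sensors, timestamps and linear gain/offset model are assumptions made for the example; the proposed scope (dissimilar modalities, distributed systems, lifelong learning with stability guarantees) goes well beyond this.

```python
"""Minimal sketch: self-supervised interpretation of an uncalibrated sensor.

Illustrative assumptions: a trusted sensor and a new sensor observe the same
latent quantity asynchronously; we recover the new sensor's unknown gain and
offset, with no labels, by regressing its readings onto the trusted stream
after time-aligning the two by interpolation.
"""
import numpy as np

def align(t_ref, x_ref, t_new):
    """Interpolate the trusted stream onto the new sensor's timestamps."""
    return np.interp(t_new, t_ref, x_ref)

def fit_calibration(x_trusted, x_new):
    """Least-squares fit of x_trusted ≈ gain * x_new + offset."""
    A = np.stack([x_new, np.ones_like(x_new)], axis=1)
    (gain, offset), *_ = np.linalg.lstsq(A, x_trusted, rcond=None)
    return gain, offset

def latent(t):
    """Quantity both sensors observe (unknown to the learner)."""
    return 2.0 + np.sin(t)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t_ref = np.sort(rng.uniform(0, 10, 500))   # trusted sensor timestamps
    t_new = np.sort(rng.uniform(0, 10, 300))   # new sensor timestamps (asynchronous)
    x_ref = latent(t_ref) + 0.02 * rng.normal(size=t_ref.size)
    x_new = 3.5 * latent(t_new) - 1.2 + 0.05 * rng.normal(size=t_new.size)  # unknown distortion
    gain, offset = fit_calibration(align(t_ref, x_ref, t_new), x_new)
    # Expect gain ≈ 1/3.5 and offset ≈ 1.2/3.5: the inverse of the simulated distortion.
    print(f"recovered gain={gain:.3f}, offset={offset:.3f}")
```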
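Finally, for the third bullet, the sketch below illustrates allocating perception and planning tasks across a heterogeneous set of computing resources. The processors, task costs and the greedy earliest-finish heuristic are illustrative assumptions, not the proposed method, which would additionally handle dynamic re-allocation and communication constraints.

```python
"""Minimal sketch: greedy allocation of perception/planning tasks across
heterogeneous compute resources.

Illustrative assumptions: each independent task has a compute cost and an
input data size; each processor has a speed and a link bandwidth back to
the sensor host; tasks are placed where they would finish earliest
(queueing delay + transfer + compute), a simple list-scheduling heuristic.
"""
from dataclasses import dataclass, field

@dataclass
class Processor:
    name: str
    speed: float        # operations per second
    bandwidth: float    # bytes per second from the sensor host
    busy_until: float = 0.0
    tasks: list = field(default_factory=list)

def allocate(tasks, processors):
    """Assign each (name, ops, data_bytes) task to the processor on which it
    finishes earliest, largest tasks first."""
    schedule = {}
    for name, ops, data in sorted(tasks, key=lambda t: -t[1]):
        best, best_finish = None, float("inf")
        for p in processors:
            finish = p.busy_until + data / p.bandwidth + ops / p.speed
            if finish < best_finish:
                best, best_finish = p, finish
        best.busy_until = best_finish
        best.tasks.append(name)
        schedule[name] = (best.name, best_finish)
    return schedule

if __name__ == "__main__":
    procs = [Processor("onboard-cpu", speed=1e9, bandwidth=1e9),
             Processor("onboard-gpu", speed=1e10, bandwidth=1e9),
             Processor("edge-server", speed=5e10, bandwidth=1e7)]
    tasks = [("lidar-segmentation", 2e10, 5e6),
             ("contact-model-update", 5e9, 1e5),
             ("grasp-planning", 1e10, 2e5)]
    for task, (proc, t) in allocate(tasks, procs).items():
        print(f"{task:22s} -> {proc:12s} finishes at {t:.2f} s")
```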