Coordinator (PI): Nicolás Navarro-Guerrero, L3S Research Center, Leibniz Universität Hannover
Contributors: Nicolás Navarro-Guerrero and Wadhah Zai El Amri
Start Date: 1st April 2023
End Date: 31st March 2026
A glaring problem in robotics is the poor re-utilization of data and solutions. While data and code can be shared, they cannot be directly deployed on other robotic systems, as these may differ in configuration, performance, sensor array, etc. Similarly, the data cannot simply be aggregated into a more extensive dataset. Hence, training from scratch for each particular robotic system is still the norm.
Tactile perception is a good example of this challenge because tactile data depends strongly on the robot hand or gripper used and on the type of sensor. Moreover, unlike cameras (RGB), tactile sensors have no standard data representation. Additionally, tactile and proprioceptive data are intrinsically active, i.e., exploratory movements are key for recognition.
This project aims to develop a unified representation of tactile data. Such a representation could allow robotic systems to operate efficiently and effectively across diverse modalities and robots. Another aim is to reuse knowledge from other robotic systems. The outcomes of this project are intended to improve object recognition, object manipulation, and robot dexterity. However, the applications of this type of transfer and representation learning go beyond these use cases and beyond robotics.