During my work as a student research assistant at the Knowledge Technology Group, Universität Hamburg, I worked on a cross-modal sim-to-real learning scenario using the humanoid robot NICO (Neuro Inspired COmpanion). In a simulation environment, we trained a neural model to recognize the location of objects on a table. The model was then deployed to the real-life NICO to test its ability to recognize the real objects in our lab. The demo was presented as part of the Cross-Modal-Learning project’s autumn school. See our demo in the video below!
My role in this project was to implement a synthetic data generation pipeline (using domain randomization), to train different models on the dataset, and to deploy the final model to the NICO robot for the demo.
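To illustrate the idea behind domain randomization, here is a minimal, hypothetical sketch (not the project's actual pipeline): each training image is rendered from a scene whose parameters (object position, colors, lighting, camera pose) are sampled at random, so the model learns features that transfer from simulation to the real lab. All names and parameter ranges below are illustrative assumptions.

```python
import random

def randomize_scene(rng):
    """Sample one randomized scene configuration (illustrative ranges)."""
    return {
        # object position on the table (metres, table-local frame) — assumed range
        "object_xy": (rng.uniform(-0.3, 0.3), rng.uniform(0.1, 0.5)),
        # random RGB textures for object and table
        "object_rgb": tuple(rng.random() for _ in range(3)),
        "table_rgb": tuple(rng.random() for _ in range(3)),
        # lighting intensity and small camera-angle jitter
        "light_intensity": rng.uniform(0.5, 1.5),
        "camera_jitter_deg": rng.uniform(-5.0, 5.0),
    }

def generate_dataset(n_samples, seed=42):
    """Generate n_samples randomized scene configs; a renderer would turn
    each config into a labeled training image."""
    rng = random.Random(seed)
    return [randomize_scene(rng) for _ in range(n_samples)]

if __name__ == "__main__":
    for scene in generate_dataset(3):
        print(scene["object_xy"])
```

In a full pipeline, each sampled configuration would be passed to the simulator's renderer, and the sampled `object_xy` would serve as the regression label for the location-recognition model.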
This scenario was designed and implemented with the goal of creating a testbed for multi-modal explanations in Human-Robot Interaction.