Journal Article


Goal-directed tactile exploration for body model learning through self-touch on a humanoid robot

Abstract

An early integration of tactile sensing into motor coordination is the norm in animals, but still a challenge for robots. Tactile exploration through touches on the body gives rise to first body models and bootstraps further development such as reaching competence. Reaching to one's own body requires only connections between the tactile and motor spaces. Still, the problems of high dimensionality and motor redundancy persist. Through an embodied computational model for the learning of self-touch on a simulated humanoid robot with artificial sensitive skin, we demonstrate that this task can be achieved (i) effectively and (ii) efficiently at scale by employing the computational frameworks for the learning of internal models for reaching: intrinsic motivation and goal babbling. We relate our results to infant studies on spontaneous body exploration as well as reaching to vibrotactile targets on the body. We analyze the reaching configurations of one infant followed weekly between 4 and 18 months of age and derive further requirements for the computational model: accounting for (iii) continuous rather than sporadic touch and (iv) consistent redundancy resolution. Results show the general success of the learning models in the touch domain, but also point out limitations in achieving fully continuous touch.
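The abstract names two learning frameworks, goal babbling and intrinsic motivation. As a rough illustration of how the two combine for self-touch, the following is a minimal Python sketch under toy assumptions: the forward map touch() (a random smooth joint-to-skin map with nearest-taxel contact), the taxel grid, and the learning-progress measure are illustrative stand-ins, not the paper's actual humanoid simulation or skin model.

import numpy as np

rng = np.random.default_rng(0)

# --- Toy stand-ins (assumptions, not the paper's setup) ---------------------
N_JOINTS, N_TAXELS = 4, 25
W = rng.normal(size=(2, N_JOINTS))                # toy "forward kinematics"
TAXELS = rng.uniform(-1, 1, size=(N_TAXELS, 2))   # taxel centres on the skin

def touch(q):
    """Which taxel does configuration q contact, and how far off-centre?"""
    p = np.tanh(W @ q)                            # contact point on the skin
    d = np.linalg.norm(TAXELS - p, axis=1)
    return int(np.argmin(d)), float(d.min())

# --- Goal babbling steered by a learning-progress (intrinsic) signal --------
memory = {t: [] for t in range(N_TAXELS)}         # taxel -> configs that reached it
errors = {t: [] for t in range(N_TAXELS)}         # taxel -> history of goal errors

def interest(t):
    """Learning progress: recent change in error; unvisited goals score high."""
    h = errors[t]
    if len(h) < 4:
        return 1.0
    return abs(np.mean(h[-4:-2]) - np.mean(h[-2:])) + 1e-6

for step in range(2000):
    # 1. Pick a tactile goal, biased toward high learning progress.
    scores = np.array([interest(t) for t in range(N_TAXELS)])
    goal = int(rng.choice(N_TAXELS, p=scores / scores.sum()))

    # 2. Inverse model: perturb the latest config known to reach this goal,
    #    or bootstrap with a random motor command.
    if memory[goal]:
        q = memory[goal][-1] + 0.1 * rng.normal(size=N_JOINTS)
    else:
        q = rng.uniform(-2, 2, N_JOINTS)

    # 3. Execute and observe which taxel was actually touched.
    hit, dist = touch(q)
    errors[goal].append(dist if hit == goal else 1.0)

    # 4. Goal babbling: every outcome trains the model, even "missed" goals.
    memory[hit].append(q)

print(f"taxels reached: {sum(bool(v) for v in memory.values())}/{N_TAXELS}")

The sketch shows the two ingredients side by side: step 4 is the goal-babbling property that every executed movement is stored under the taxel it actually touched, so even misses refine the inverse model, while the learning-progress score in step 1 implements intrinsic motivation by steering goal selection toward taxels where error is currently changing fastest.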

Authors

Gama, Filipe
Shcherban, Maksym
Rolf, Matthias
Hoffmann, Matej

Oxford Brookes departments

School of Engineering, Computing and Mathematics

Dates

Year of publication: 2021
Date of RADAR deposit: 2022-03-07


License

This work is licensed under a Creative Commons Attribution 4.0 International License.


Related resources

This RADAR resource is identical to "Goal-directed tactile exploration for body model learning through self-touch on a humanoid robot".

Details

  • Owner: Joseph Ripp
  • Collection: Outputs
  • Version: 1
  • Status: Live