Detecting features of tools, objects, and actions from effects in a robot using deep learning

Namiko Saito, Kitae Kim, Shingo Murata, Tetsuya Ogata, Shigeki Sugano

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose a tool-use model that can detect the features of tools, target objects, and actions from the given effects of object manipulation. Taking infant learning as a guiding concept, we construct a model that enables robots to manipulate objects with tools. To realize this, we train a deep learning model on sensory-motor data recorded while a robot performs a tool-use task. The experiments involve four factors, (1) tools, (2) objects, (3) actions, and (4) effects, which the model considers simultaneously. For evaluation, the robot generates predicted images and motions given information about the effects of using unknown tools and objects. We confirm that the robot can detect features of tools, objects, and actions by learning the effects and executing the task.
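The abstract describes a model that predicts future images and motions from recorded sensory-motor sequences. The paper's actual architecture is not given here, so the following is only a minimal sketch, assuming a simple Elman-style recurrent network over concatenated image-feature and motor vectors; all dimensions, weights, and names are hypothetical stand-ins for parameters that would be learned from the robot's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: image features, motor joint angles, hidden units.
IMG_DIM, MOTOR_DIM, HID_DIM = 10, 7, 20
IN_DIM = IMG_DIM + MOTOR_DIM

# Randomly initialised weights stand in for parameters that would be
# learned from the robot's recorded sensory-motor sequences.
W_in = rng.standard_normal((HID_DIM, IN_DIM)) * 0.1
W_rec = rng.standard_normal((HID_DIM, HID_DIM)) * 0.1
W_out = rng.standard_normal((IN_DIM, HID_DIM)) * 0.1

def predict_sequence(seq, h=None):
    """Roll the RNN over a sensory-motor sequence and return
    one-step-ahead predictions of the next image/motor vector."""
    if h is None:
        h = np.zeros(HID_DIM)
    preds = []
    for x in seq:
        h = np.tanh(W_in @ x + W_rec @ h)  # update recurrent state
        preds.append(W_out @ h)            # predict next sensory-motor vector
    return np.array(preds), h

# A toy 5-step sequence of concatenated image + motor vectors.
seq = rng.standard_normal((5, IN_DIM))
preds, _ = predict_sequence(seq)
print(preds.shape)  # (5, 17)
```

In the evaluation described above, such a predictor would be conditioned on effect information and rolled forward to generate the predicted images and motions for unknown tools and objects.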

Original language: English
Title of host publication: 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
Publisher: Institute of Electrical and Electronics Engineers
Pages: 91-96
Number of pages: 6
ISBN (Electronic): 9781538661109
DOIs
Publication status: Published - 15 Jul 2018
Event: Joint 8th IEEE International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018 - Tokyo, Japan
Duration: 16 Sept 2018 - 20 Sept 2018

Publication series

Name: 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018

Conference

Conference: Joint 8th IEEE International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
Country/Territory: Japan
City: Tokyo
Period: 16/09/18 - 20/09/18

Keywords

  • cognitive robotics
  • development of infants
  • neural network
  • tool-use

