Sparse-Dense Motion Modelling and Tracking for Manipulation without Prior Object Models

Christian Rauch, Ran Long, Vladimir Ivan, Sethu Vijayakumar

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

This work presents an approach for modelling and tracking previously unseen objects for robotic grasping tasks. Using the motion of objects in a scene, our approach segments rigid entities from the scene and continuously tracks them, creating dense and sparse models of the objects and the environment. While the dense tracking enables interaction with these models, the sparse tracking makes the approach robust to fast movements and enables the redetection of already modelled objects. The evaluation on a dual-arm grasping task demonstrates that our approach 1) enables a robot to detect new objects online without a prior model and to grasp them using only a simple parameterisable geometric representation, and 2) is significantly more robust than state-of-the-art methods.
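The abstract states that grasping uses only a simple parameterisable geometric representation of each segmented object; the exact representation is not specified on this page. The following is a minimal illustrative sketch, not the authors' implementation: it fits an oriented box to a segmented object point cloud via PCA and derives a naive grasp centre, approach direction, and gripper width. The function names (fit_box_model, top_down_grasp) and the grasp heuristic are assumptions made for illustration only.

import numpy as np

def fit_box_model(points):
    """Fit an oriented box (centre, axes, extents) to an (N, 3) point
    cloud of a segmented object -- one example of a simple
    parameterisable geometric representation."""
    centre = points.mean(axis=0)
    centred = points - centre
    # Principal axes of the point cloud give the box orientation.
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    axes = vt                                   # rows: box axes in camera frame
    extents = np.ptp(centred @ axes.T, axis=0)  # side length along each axis
    return centre, axes, extents

def top_down_grasp(centre, axes, extents):
    """Hypothetical heuristic: approach along the shortest box axis and
    close the gripper across the next-shortest side."""
    order = np.argsort(extents)
    approach = axes[order[0]]   # shortest axis -> approach direction
    width = extents[order[1]]   # gripper opening across the object
    return centre, approach, width

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for a segmented object: points inside a 6x3x2 cm box.
    pts = rng.uniform([-0.03, -0.015, -0.01], [0.03, 0.015, 0.01], (500, 3))
    centre, axes, extents = fit_box_model(pts)
    _, approach, width = top_down_grasp(centre, axes, extents)
    print("box extents (m):", np.round(extents, 3))
    print("approach axis:", np.round(approach, 2), "grasp width (m):", round(width, 3))

In the pipeline described by the abstract, such a primitive would be fitted to the dense model produced by the motion-based segmentation; here a synthetic point cloud stands in for that model.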
Original language: English
Pages (from-to): 11394-11401
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 7
Issue number: 4
Early online date: 22 Aug 2022
DOIs:
Publication status: Published - 1 Oct 2022

Keywords

  • Perception for Grasping and Manipulation
  • Visual Tracking
  • SLAM

