Edinburgh Research Explorer

Hybrid Multi-camera Visual Servoing to Moving Target

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Open Access permissions: Open

Documents

https://ieeexplore.ieee.org/document/8593652
Original language: English
Title of host publication: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2018)
Place of Publication: Madrid, Spain
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 1132-1137
Number of pages: 6
ISBN (Electronic): 978-1-5386-8094-0
ISBN (Print): 978-1-5386-8095-7
Publication status: Published - 7 Jan 2019
Event: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems - Madrid, Spain
Duration: 1 Oct 2018 - 5 Oct 2018
https://www.iros2018.org/

Conference

Conference: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems
Abbreviated title: IROS 2018
Country: Spain
City: Madrid
Period: 1/10/18 - 5/10/18
Internet address: https://www.iros2018.org/

Abstract

Visual servoing is a well-known task in robotics. However, challenges remain when multiple visual sources must be combined to guide the robot accurately, or when occlusions appear. In this paper we present a novel visual servoing approach that uses hybrid multi-camera input data to lead a robot arm accurately to dynamically moving target points in the presence of partial occlusions. The approach uses four RGBD sensors as Eye-to-Hand (EtoH) visual input and an arm-mounted stereo camera as Eye-in-Hand (EinH). A Master supervisor task selects between the EtoH and the EinH input depending on the distance between the robot and the target. The Master also selects the subset of EtoH cameras that best perceive the target. When the EinH sensor is in use and the target becomes occluded or leaves the sensor's view frustum, the Master switches back to the EtoH sensors to re-track the object. Using this adaptive visual input data, the robot is controlled by an iterative planner that uses position, orientation and joint configuration to estimate the trajectory. Since the target is dynamic, this trajectory is updated at every time-step. Experiments show good performance in four different situations: tracking a ball, targeting a bulls-eye, guiding a straw to a mouth and delivering an item to a moving hand. The experiments cover both simple situations, such as a ball that is mostly visible from all cameras, and more complex situations, such as a mouth that is partially occluded from some of the sensors.

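To make the supervisory logic concrete, the sketch below gives one plausible reading of the Master's source-selection rule described in the abstract. It is a minimal illustration, not the authors' implementation: the names (Master, select_etoh_subset, target_estimate, EtoHObservation), the confidence-based camera ranking, and the 0.3 m switching threshold are all assumptions introduced here.

```python
# A minimal sketch of the Master supervisor described in the abstract.
# All names and the numeric threshold are illustrative assumptions,
# not the authors' code.
from dataclasses import dataclass
from typing import List, Optional

import numpy as np

SWITCH_DISTANCE_M = 0.30  # assumed hand-tuned EinH/EtoH switching distance


@dataclass
class EtoHObservation:
    camera_id: str
    target_position: Optional[np.ndarray]  # None if this camera lost the target
    confidence: float                      # detection quality in [0, 1]


class Master:
    """Selects the visual source (EinH vs. EtoH) at every time-step."""

    def select_etoh_subset(self, observations: List[EtoHObservation],
                           k: int = 2) -> List[EtoHObservation]:
        """Keep the k Eye-to-Hand cameras that best perceive the target."""
        visible = [o for o in observations if o.target_position is not None]
        return sorted(visible, key=lambda o: o.confidence, reverse=True)[:k]

    def target_estimate(self,
                        einh_target: Optional[np.ndarray],
                        etoh_observations: List[EtoHObservation],
                        end_effector: np.ndarray) -> np.ndarray:
        """Return the target position from the currently preferred source."""
        # Prefer the arm-mounted stereo camera (EinH) when the arm is close
        # and the target is inside its view frustum (einh_target is not None).
        if (einh_target is not None
                and np.linalg.norm(einh_target - end_effector) < SWITCH_DISTANCE_M):
            return einh_target
        # Otherwise (far away, occluded, or out of frustum) fall back to the
        # best subset of fixed RGBD cameras and fuse their estimates.
        best = self.select_etoh_subset(etoh_observations)
        if not best:
            raise RuntimeError("target lost by all cameras")
        return np.mean([o.target_position for o in best], axis=0)
```

Because the target moves, the iterative planner would be re-run with the latest estimate from target_estimate at every control step, executing only the first segment of each planned trajectory before replanning.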


ID: 69748688