Biorobot log data for visual navigation with transverse oscillating route following (TORF)

  • Jan Stankiewicz (Creator)

Dataset

Abstract

The purpose of this work was to develop a flying robot that can navigate using a bee-inspired visual route-following approach. To this end, we performed several missions in which an aerial robot with an onboard camera and processing unit flies along a predefined route and samples images of the ground. Thereafter, it attempts to navigate back along the learned route autonomously by comparing the current view with the bank of views acquired on the learning flight. By centring an oscillatory flight path on the most familiar views, the biorobot can traverse the route without any additional information. See the attached video (torf_operation_5x_speed.mp4) for a visual impression.

This dataset contains all the data used to generate the figures in the associated publication. This includes information about the state of the robot and, in some cases, the images acquired along the way. The data structures are defined in the attached "readme" file.

The software used to generate this data is hosted at: https://github.com/jannsta1/torf
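The core view-matching step described in the abstract can be illustrated with a minimal sketch. Note this is a simplified, hypothetical familiarity metric (mean absolute pixel difference against the stored view bank), not the actual implementation, which lives in the repository above:

```python
import numpy as np

def most_familiar(view, memory_bank):
    """Return the index of the most familiar stored view and its familiarity.

    Familiarity is taken here as the negative mean absolute pixel
    difference (a hypothetical stand-in for the metric used on the robot).
    """
    diffs = [np.mean(np.abs(view.astype(float) - m.astype(float)))
             for m in memory_bank]
    best = int(np.argmin(diffs))
    return best, -diffs[best]

# Learning flight: store downsampled ground views sampled along the route.
rng = np.random.default_rng(0)
route_views = [rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
               for _ in range(5)]

# Return flight: a view identical to stored view 2 should be matched to it,
# and the oscillatory flight path would then be centred on that view.
best, score = most_familiar(route_views[2].copy(), route_views)
```

On the robot, this comparison runs continuously during the oscillatory return flight, so that the turning points of the oscillation are steered toward whichever stored view currently scores highest.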

Data Citation

Stankiewicz, JT. (2021). Biorobot log data for visual navigation with transverse oscillating route following (TORF) [dataset]. University of Edinburgh, School of Informatics, Institute of Perception, Action and Behaviour. https://doi.org/10.7488/ds/2970.
Date made available: 11 Jan 2021
Publisher: Edinburgh DataShare
