Rotation invariant visual processing for spatial memory in insects

Thomas Stone, Michael Mangan, Antoine Wystrach, Barbara Webb

Research output: Contribution to journal › Article › peer-review


Visual memory is crucial to navigation in many animals, including insects. Here, we focus on the problem of visual homing, that is, using comparison of the view at a current location with a view stored at the home location to control movement towards home by a novel shortcut. Insects show several visual specializations that appear advantageous for this task, including an almost panoramic field of view and ultraviolet light sensitivity, which enhances the salience of the skyline. We discuss several proposals for subsequent processing of the image to obtain the required motion information, focusing on how each might deal with the problem of yaw rotation of the current view relative to the home view. Possible solutions include tagging of views with information from the celestial compass system, using multiple views pointing towards home, or rotation invariant encoding of the view. We illustrate briefly how a well-known shape description method from computer vision, Zernike moments, could provide a compact and rotation invariant representation of sky shapes to enhance visual homing. We discuss the biological plausibility of this solution, and also a fourth strategy, based on observed behaviour of insects, that involves transfer of information from visual memory matching to the compass system.
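The rotation-invariance property of Zernike moments mentioned in the abstract follows from their construction: rotating an image by an angle only multiplies each moment Z_nm by a phase factor e^(-imα), so the magnitude |Z_nm| is unchanged. The sketch below (not the authors' implementation; the toy image and the chosen (n, m) orders are illustrative assumptions) computes Zernike moments directly from their definition and checks that the magnitudes survive a 90° rotation of the image:

```python
import numpy as np
from math import factorial

def zernike_moment(img, n, m):
    """Zernike moment Z_nm of a square image sampled on the unit disk.

    Valid orders require |m| <= n and n - |m| even.
    """
    N = img.shape[0]
    # pixel-centre coordinates in [-1, 1], symmetric about the image centre
    coords = (2 * np.arange(N) + 1 - N) / N
    x, y = np.meshgrid(coords, coords)
    rho = np.sqrt(x**2 + y**2)
    theta = np.arctan2(y, x)
    mask = rho <= 1.0
    # radial polynomial R_nm(rho)
    R = np.zeros_like(rho)
    for k in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** k * factorial(n - k)
             / (factorial(k)
                * factorial((n + abs(m)) // 2 - k)
                * factorial((n - abs(m)) // 2 - k)))
        R += c * rho ** (n - 2 * k)
    # project the image onto the (conjugate) Zernike basis function
    V = R * np.exp(-1j * m * theta)
    return (n + 1) / np.pi * np.sum(img[mask] * V[mask])

# toy stand-in for a sky shape: an off-centre bright blob
N = 64
yy, xx = np.mgrid[0:N, 0:N]
img = np.exp(-((xx - 40) ** 2 + (yy - 28) ** 2) / 50.0)

# rotation changes only the phase of Z_nm, so |Z_nm| is invariant
for n, m in [(2, 0), (3, 1), (4, 2)]:
    a = abs(zernike_moment(img, n, m))
    b = abs(zernike_moment(np.rot90(img), n, m))
    print(f"n={n}, m={m}: |Z|={a:.6f}, rotated |Z|={b:.6f}")
```

A set of such magnitudes taken over several (n, m) orders gives a compact descriptor of the sky shape that can be compared between the current and stored views without first recovering the yaw angle.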
Original language: English
Number of pages: 16
Journal: Interface Focus
Issue number: 4
Publication status: Published - 15 Jun 2018


