Multisensory architectures for action-oriented perception

L. Alba, P. Arena, S. De Fiore, J. Listán, L. Patané, A. Salem, G. Scordino, B. Webb; Paolo Arena (Editor), Ángel Rodríguez-Vázquez (Editor), Gustavo Liñán-Cembrano (Editor)

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


To solve the navigation problem of a mobile robot in an unstructured environment, a versatile sensory system and efficient locomotion control algorithms are necessary. In this paper, an innovative sensory system for action-oriented perception applied to a legged robot is presented. An important problem we address is how to exploit a large variety and number of sensors while keeping the system able to operate in real time. Our solution is to use sensory systems that incorporate analog and parallel processing, inspired by biological systems, to reduce the required data exchange with the motor control layer. In particular, for the visual system we use the Eye-RIS v1.1 board made by AnaFocus, which is based on a fully parallel mixed-signal array sensor-processor chip. The hearing sensor is inspired by the cricket auditory system and allows efficient localization of a specific sound source with a very simple analog circuit. Our robot also carries sensors for touch, posture, load, distance, and heading, and thus requires customized, parallel processing for concurrent acquisition. Therefore, Field Programmable Gate Array (FPGA) based hardware is used to manage the multi-sensory acquisition and processing. This choice was made because FPGAs permit the implementation of customized digital logic blocks that operate in parallel, allowing the sensors to be driven simultaneously. With this approach, the proposed multisensory architecture can achieve real-time performance.
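As an aside, the cricket-inspired localization principle mentioned above can be illustrated with a short digital simulation of the pressure-difference (delay-and-subtract) receiver: each "ear" combines its direct microphone signal with a delayed copy of the contralateral signal, and the louder ear indicates the side of the source. This is a minimal sketch of the general principle, not the paper's circuit; the carrier frequency, microphone separation, sampling rate, and internal delay are illustrative assumptions.

```python
import numpy as np

def cricket_direction(angle_deg, freq=4700.0, mic_sep=0.01, fs=192000, c=343.0):
    """Toy delay-and-subtract model of a cricket-like pressure-difference
    receiver. Returns which side ("left" or "right") the source lies on.
    All parameter values are illustrative assumptions."""
    t = np.arange(0, 0.02, 1.0 / fs)
    # Interaural time difference for a source at angle_deg (positive = left)
    itd = mic_sep * np.sin(np.radians(angle_deg)) / c
    left = np.sin(2 * np.pi * freq * (t + itd / 2))   # left mic leads for a left source
    right = np.sin(2 * np.pi * freq * (t - itd / 2))
    # Internal delay tuned near a quarter period of the carrier
    delay = int(round(fs / (4 * freq)))
    # Each ear subtracts the delayed contralateral signal from its own
    ear_l = left[delay:] - right[:-delay]
    ear_r = right[delay:] - left[:-delay]
    amp_l, amp_r = np.abs(ear_l).max(), np.abs(ear_r).max()
    return "left" if amp_l > amp_r else "right"
```

With the internal delay near a quarter period, the phase shifts add on the side facing the source and cancel on the far side, so a simple amplitude comparison (the kind of operation an analog circuit does cheaply) yields the direction without any explicit time-of-arrival computation.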
Original language: English
Title of host publication: Proc. SPIE 6592, Bioengineered and Bioinspired Systems III, 65920A (May 22, 2007)
Number of pages: 12
Publication status: Published - 18 May 2007

