Edinburgh Research Explorer

Study of Multimodal Interfaces and the Improvements on Teleoperation

Research output: Contribution to journal › Article

  • Eleftherios Triantafyllidis
  • Christopher McGreavy
  • Jiacheng Gu
  • Zhibin Li

Open Access permissions: Open

Documents

https://ieeexplore.ieee.org/document/9076603
Original language: English
Pages (from-to): 78213 - 78227
Number of pages: 15
Journal: IEEE Access
Volume: 8
DOIs
Publication status: Published - 23 Apr 2020

Abstract

Research into multimodal interfaces aims to provide immersive solutions and to increase overall human performance. A promising direction is to combine auditory, visual and haptic interaction between the user and the simulated environment. However, no extensive comparison exists showing how combined audiovisuohaptic interfaces affect human perception and, by extension, task performance. This paper explores that question and presents a thorough, full-factorial comparison of how all combinations of audio, visual and haptic interfaces affect performance during manipulation. We evaluated each combination in a user study (N = 25) consisting of manipulation tasks of varying difficulty. Overall performance was assessed with both subjective measurements, via cognitive workload and system usability, and objective measurements, via time-based and spatial-accuracy-based metrics. The results showed that, regardless of task complexity, stereoscopic vision through a virtual reality headset increased performance across all measurements by 40% compared to monocular vision on a generic display monitor. In addition, haptic feedback improved outcomes by 10%, and auditory feedback accounted for approximately a 5% improvement.
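The full-factorial design described in the abstract can be sketched as the complete crossing of the three modality factors. The level labels below are illustrative assumptions for exposition, not the paper's exact condition names:

```python
from itertools import product

# Factor levels inferred from the abstract (illustrative labels only):
# visual display type, haptic feedback, and auditory feedback, fully crossed.
visual = ("monitor (monocular)", "VR headset (stereoscopic)")
haptic = ("haptics off", "haptics on")
audio = ("audio off", "audio on")

# A full-factorial design tests every combination of factor levels.
conditions = list(product(visual, haptic, audio))
print(len(conditions))  # 2 x 2 x 2 = 8 fully crossed conditions
```

Each of the eight resulting conditions corresponds to one interface combination evaluated in the study.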

Research areas

  • Audiovisuohaptic, auditory feedback, haptic feedback, immersive manipulation, immersive teleoperation, multimodal interface, multimodal interaction, virtual reality

ID: 144084888