Abstract
In this work, we investigate the use of audio and haptic feedback to augment the display of a mobile device controlled by tilt input. We address the following questions: How do people begin searching in unfamiliar spaces? What patterns do users follow, and which techniques do they employ to accomplish the experimental task? What effect does a prediction of the future state in the audio space, based on a model of the human operator, have on subjects’ behaviour? In a pilot study, we examined subjects’ navigation in a state space containing seven randomly placed audio sources, displayed via audio and vibrotactile modalities. In the main study, we compared only the efficiency of different forms of audio feedback. Both experiments ran on a Pocket PC instrumented with an accelerometer and a headset. We measured selection accuracy, exploration density, and the orientation of approach to each target. The results quantify the changes brought by predictive, or “quickened”, sonified displays in mobile, gestural interaction. They also highlight subjects’ search patterns and the effects, on navigation patterns, of both the combination of independent variables and each variable individually.
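A “quickened” display, in the manual-control sense used here, presents the operator with a prediction of the system’s future state rather than its current state. The sketch below is a minimal illustration of that idea for a tilt-driven cursor: it extrapolates the estimated state one prediction horizon ahead and maps the predicted distance to a target onto a pitch cue. The class, the constants (`TAU`, `DT`), and the audio mapping are illustrative assumptions, not the implementation described in the chapter.

```python
import math

TAU = 0.4   # prediction horizon in seconds (assumed value)
DT = 0.02   # accelerometer sampling period, 50 Hz (assumed)

class QuickenedCursor:
    """Tilt-controlled cursor with first-order quickening (hypothetical sketch)."""

    def __init__(self):
        self.x = self.y = 0.0    # current position in the state space
        self.vx = self.vy = 0.0  # estimated velocity

    def update(self, tilt_x, tilt_y):
        """Integrate tilt (treated as acceleration) into the state,
        then extrapolate TAU seconds ahead for display."""
        self.vx += tilt_x * DT
        self.vy += tilt_y * DT
        self.x += self.vx * DT
        self.y += self.vy * DT
        # Quickening: display the predicted state, not the current one.
        px = self.x + TAU * self.vx
        py = self.y + TAU * self.vy
        return px, py

def pitch_cue(px, py, target):
    """Map predicted distance-to-target onto frequency (assumed mapping):
    the closer the predicted state is to the target, the higher the pitch."""
    d = math.hypot(target[0] - px, target[1] - py)
    return 880.0 / (1.0 + d)
```

Because the sonified cue is driven by the predicted position, the user hears the consequence of their current tilt before the cursor arrives, which is the behavioural effect the experiments set out to quantify.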
| Original language | English |
| --- | --- |
| Title of host publication | Handbook of Research on User Interface Design and Evaluation for Mobile Technology |
| Publisher | IGI Global |
| Chapter | 29 |
| Pages | 478-506 |
| Number of pages | 29 |
| ISBN (Electronic) | 9781599048727 |
| ISBN (Print) | 9781599048710 |
| Publication status | Published - 2008 |