Eyetracking for two-person tasks with manipulation of a virtual world

Jean Carletta, Robin L. Hill, Craig Nicol, Tim Taylor, Jan Peter Ruiter, Ellen Gurman Bard

Research output: Contribution to journal › Article › peer-review

Abstract

Eyetracking facilities are typically restricted to monitoring a single person viewing static images or prerecorded video. In the present article, we describe a system that makes it possible to study visual attention in coordination with other activity during joint action. The software links two eyetracking systems in parallel and provides an on-screen task. By locating eye movements against dynamic screen regions, it permits automatic tracking of moving on-screen objects. Using existing SR technology, the system can also cross-project each participant’s eyetrack and mouse location onto the other’s on-screen work space. Keeping a complete record of eyetrack and on-screen events in the same format as subsequent human coding, the system permits the analysis of multiple modalities. The software offers new approaches to spontaneous multimodal communication: joint action and joint attention. These capacities are demonstrated using an experimental paradigm for cooperative on-screen assembly of a two-dimensional model. The software is available under an open source license.
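The abstract describes locating eye movements against dynamic screen regions so that gaze can be attributed to moving on-screen objects. As a minimal illustrative sketch (not the authors' implementation; the class and function names here are hypothetical), the core operation is a hit test of each gaze sample against the current bounding box of every tracked object:

```python
from dataclasses import dataclass

@dataclass
class DynamicRegion:
    """Axis-aligned bounding box of an on-screen object at one moment in time."""
    x: float
    y: float
    width: float
    height: float

    def contains(self, gx: float, gy: float) -> bool:
        # True if the gaze sample (gx, gy) falls inside this region.
        return (self.x <= gx <= self.x + self.width
                and self.y <= gy <= self.y + self.height)

def hit_test(gaze_x: float, gaze_y: float, regions: dict) -> list:
    """Return the names of all regions containing the gaze sample.

    `regions` maps object names to their DynamicRegion for the current
    frame; because the boxes are updated as objects move, the same gaze
    coordinates can map to different objects over time.
    """
    return [name for name, region in regions.items()
            if region.contains(gaze_x, gaze_y)]

# Example: an object ("piece_3") currently positioned at (100, 200).
regions = {"piece_3": DynamicRegion(100, 200, 50, 50)}
print(hit_test(120, 220, regions))  # → ['piece_3']
```

Re-running the hit test on each frame against the updated region positions is what makes automatic tracking of moving objects possible, in contrast to the fixed regions of interest used with static images.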
Original language: English
Pages (from-to): 254-265
Number of pages: 12
Journal: Behavior Research Methods
Volume: 42
Issue number: 1
DOIs
Publication status: Published - 1 Feb 2010

