Purposeful listening in challenging conditions: A study of prediction during consecutive interpreting in noise

Rhona Amos, Robert Hartsuiker, Kilian Seeber, Martin J. Pickering

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

Prediction is often used during language comprehension. However, studies of prediction have tended to focus on L1 listeners in quiet conditions. Thus, it is unclear how listeners predict outside the laboratory and in specific communicative settings. Here, we report two eye-tracking studies which used a visual-world paradigm to investigate whether prediction during a consecutive interpreting task differs from prediction during a listening task in L2 listeners, and whether L2 listeners are able to predict in the noisy conditions that might be associated with this communicative setting. In the first study, thirty-six Dutch-English bilinguals either just listened to, or else listened to and then consecutively interpreted, predictable sentences presented in speech-shaped sound. In the second study, another thirty-six Dutch-English bilinguals carried out the same tasks in clear speech. Our results suggest that L2 listeners predict the meaning of upcoming words in noisy conditions. However, we did not find that predictive eye movements depended on task, nor that L2 listeners predicted upcoming word form. We also did not find a difference in predictive patterns when we compared our two studies. Thus, L2 listeners predict in noisy circumstances, supporting theories which posit that prediction regularly takes place in comprehension, but we did not find evidence that a subsequent production task or noise affects semantic prediction.
Original language: English
Article number: e0288960
Number of pages: 27
Journal: PLoS ONE
Volume: 18
Issue number: 7
DOIs
Publication status: Published - 20 Jul 2023

