This dataset comprises videos recorded for an artificial language learning study in the manual modality. Three experiments were conducted. In experiment 1, pairs of participants communicated about a set of concepts using only gestures. Each pair was trained on gestures produced by a previous participant, and the gestures they produced in testing were used as training for a new pair of participants; experiment 1 thus operationalised both transmission to new learners and interaction between communicators. In experiment 2, we isolated transmission and interaction. In one condition, individual participants learned gestures from a model and then produced their own gestures for the same concepts in testing; in the other, pairs of participants communicated repeatedly with a partner about the same concepts, without transmission to new participants. In the first two experiments, participants involved in interaction were given explicit pressures to communicate quickly and accurately. In experiment 3, we re-ran experiment 1 and the interaction-only condition of experiment 2 without these explicit pressures.
Motamedi, Yasamin; Schouwstra, Marieke; Smith, Kenny; Culbertson, Jennifer; Kirby, Simon. (2018). Evolving artificial sign languages in the lab: from improvised gesture to systematic sign, 2015-2018 [dataset]. University of Edinburgh, Centre for Language Evolution. https://doi.org/10.7488/ds/2447.
Date made available | 27 Sept 2018
Publisher | Edinburgh DataShare
Temporal coverage | Jul 2015 - Mar 2018
Geographical coverage | Edinburgh, United Kingdom