Edinburgh Research Explorer

Assessing the validity of a learning analytics expectation instrument: A multinational study

Research output: Contribution to journal › Article

  • Alexander Whitelock-Wainwright
  • Dragan Gasevic
  • Yi-Shan Tsai
  • Hendrik Drachsler
  • Maren Scheffel
  • Pedro Muñoz-Merino
  • Kairit Tammets
  • Carlos Delgado Kloos

Original language: English
Number of pages: 53
Journal: Journal of Computer Assisted Learning
Publication status: Accepted/In press - 1 Oct 2019

Abstract

To assist Higher Education Institutions in meeting the challenge of limited student engagement in the implementation of Learning Analytics services, the Questionnaire for Student Expectations of Learning Analytics (QSELA) was developed. This instrument contains 12 items, explained by a purported two-factor structure of Ethical and Privacy Expectations and Service Expectations. As it stands, however, the QSELA has only been validated with students at a UK university, which is problematic given that interest in Learning Analytics extends well beyond this context. The aim of the current work was therefore to assess whether translated versions of the QSELA can be validated in three further contexts (an Estonian, a Spanish, and a Dutch university). The findings show that the model provided an acceptable fit in both the Spanish and Dutch samples, but was not supported in the Estonian sample. In addition, an assessment of local fit was undertaken for each sample, which raises important points to be considered in future work. Finally, a general comparison of expectations across contexts is presented and discussed in relation to the General Data Protection Regulation (GDPR, 2018).
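Although the abstract does not detail the analysis, a two-factor validation of this kind is typically carried out as a confirmatory factor analysis fitted separately to each national sample. The sketch below illustrates how such a model might be specified and fitted; the item names, the 5/7 split of the 12 items across the two factors, and the use of the semopy package are assumptions made for illustration, not details taken from the paper.

```python
# Minimal sketch of a two-factor confirmatory factor analysis, assuming the
# semopy package. Item names (ep1..ep5, se1..se7) and their assignment to the
# two factors are hypothetical placeholders, not the actual QSELA items.
import pandas as pd
import semopy

# Lavaan-style measurement model: two latent factors measured by the 12 items.
MODEL_DESC = """
EthicalPrivacy =~ ep1 + ep2 + ep3 + ep4 + ep5
Service        =~ se1 + se2 + se3 + se4 + se5 + se6 + se7
"""

def fit_two_factor_model(responses: pd.DataFrame) -> pd.DataFrame:
    """Fit the two-factor model to one sample's item responses and return
    global fit statistics (chi-square, CFI, TLI, RMSEA, ...)."""
    model = semopy.Model(MODEL_DESC)
    model.fit(responses)
    return semopy.calc_stats(model)

# Usage per national sample (column names must match the items in MODEL_DESC):
# stats = fit_two_factor_model(estonian_sample)
# print(stats.T)
```

In practice, the same specification would be re-fitted to each sample and the resulting global fit indices compared against conventional cut-offs, with local fit (residuals and modification indices) inspected where the global fit is poor.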
