Assessing the validity of a learning analytics expectation instrument: A multinational study

Alexander Whitelock-Wainwright, Dragan Gasevic, Yi-Shan Tsai, Hendrik Drachsler, Maren Scheffel, Pedro Muñoz-Merino, Kairit Tammets, Carlos Delgado Kloos

Research output: Contribution to journal › Article › peer-review


To assist Higher Education Institutions in meeting the challenge of limited student engagement in the implementation of Learning Analytics services, the Questionnaire for Student Expectations of Learning Analytics (QSELA) was developed. This instrument contains 12 items, explained by a purported two-factor structure of Ethical and Privacy Expectations and Service Expectations. As it stands, however, the QSELA has only been validated with students from a UK university, which is problematic given that interest in Learning Analytics extends beyond this context. The aim of the current work was therefore to assess whether the translated QSELA can be validated in three further contexts (an Estonian, a Spanish, and a Dutch university). The findings show that the model provided an acceptable fit in both the Spanish and Dutch samples, but was not supported in the Estonian sample. In addition, an assessment of local fit is undertaken for each sample, which raises important points to be considered in future work. Finally, a general comparison of expectations across contexts is undertaken and discussed in relation to the General Data Protection Regulation (GDPR, 2018).
Original language: English
Pages (from-to): 209-240
Number of pages: 32
Journal: Journal of Computer Assisted Learning
Issue number: 2
Early online date: 14 Jan 2020
Publication status: Published - 30 Apr 2020


  • learning analytics
  • multinational
  • questionnaire
  • student expectations

