An examination of student user experience (UX) and perceptions of remote invigilation during online assessment

Authors

  • Dr Lesley Sefcik, Curtin University, https://orcid.org/0000-0002-6877-6943
  • Dr Terisha Veeran-Colton, Office of the Academic Registrar, Curtin University, GPO Box U1987, Perth, Western Australia 6845, Australia
  • Dr Michael Baird, School of Marketing, Curtin University, GPO Box U1987, Perth, Western Australia 6845, Australia
  • Dr Connie Price, Curtin Learning and Teaching, Curtin University, GPO Box U1987, Perth, Western Australia 6845, Australia
  • Steve Steyn, Codemaster Institute, Perth, Western Australia, Australia

DOI:

https://doi.org/10.14742/ajet.6871

Keywords:

remote invigilation, invigilation, online tests, online learning, artificial intelligence, user experience

Abstract

This study aimed to understand the effects of a custom-developed, artificial intelligence–based, asynchronous remote invigilation system on the student user experience. The study was conducted over 3 years at a large Australian university, and findings demonstrate that familiarity with the system over time improved student attitudes towards remote invigilation. Positive experiences were related to ease of use and the convenience of test sitting. The majority of students reported that it was important for the institution to have approaches such as remote invigilation to discourage cheating, and they believed that the system was useful in this regard. Perceived technical problems were found to provoke anxiety about being remotely invigilated, and students suggested that greater clarity on expectations of appropriate behaviour, privacy and data security would help alleviate discomfort and improve the system.

Implications for practice or policy:

  • Educators can improve the student user experience of remote invigilation by ensuring that students are provided the opportunity to practise and become familiar with using remote invigilation software before any summative assessment task.
  • Administrators should provide clear policy guidance about the management of student data collected during remotely invigilated assessment tasks.


Author Biography

Dr Lesley Sefcik, Curtin University

Dr Lesley Sefcik is a Senior Lecturer and Academic Integrity Advisor at Curtin University. She provides university-wide teaching, advice and academic research within the field of academic integrity. Dr Sefcik has an interdisciplinary education, with a PhD in Environmental Science (University of Michigan) focusing on plant physiological ecology and global change, a Bachelor of Science in Biology (University of Michigan) and a Graduate Diploma in Education (Murdoch University) majoring in Science and Humanities and Social Science. She is a registered secondary teacher in Western Australia and has been awarded an Outstanding Teacher rating for the National Professional Standards for Teachers in all seven domains. She is a Homeward Bound Fellow and a Senior Fellow of the Higher Education Academy. Dr Sefcik’s professional background is situated in assessment and quality learning within the domain of learning and teaching. Past projects include development of the External Referencing of Standards (ERoS) system for external peer review of assessment. Current projects include the development and implementation of academic integrity–related programs for students and staff at Curtin, and research related to the development and implementation of remote invigilation for online assessment.

Published

2022-02-21

How to Cite

Sefcik, L., Veeran-Colton, T., Baird, M., Price, C., & Steyn, S. (2022). An examination of student user experience (UX) and perceptions of remote invigilation during online assessment. Australasian Journal of Educational Technology, 38(2), 49–69. https://doi.org/10.14742/ajet.6871

Issue

Vol. 38 No. 2 (2022)

Section

Articles