Making sense of audit trail data
Abstract
In this paper we argue that the use of audit trail data for research and evaluation purposes has attracted scepticism due to real and perceived difficulties associated with the data's interpretation. We suggest that educational technology researchers and evaluators need to better understand how audit trail data can be processed and analysed effectively, and we identify three stages of audit trail analysis. We present an investigation of a computer-based learning resource as a vehicle for exploring strategies that can assist researchers and evaluators in the analysis and interpretation of audit trail data. The analytical approach we describe is iterative in nature, moving to greater levels of specificity as it proceeds. By combining this approach with primarily descriptive techniques we were able to establish distinct patterns of access to the learning resource. We then performed a series of cluster analyses which, guided by a clear understanding of two critical components of the learning environment, led to the identification of four distinct 'types' or 'categories' of users. Our results demonstrate that it is possible to document meaningful usage patterns at a number of levels of analysis using electronic records from technology-based learning environments. The implications of these results for future work are discussed.
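The abstract's central analytical step, clustering audit trail records into user 'types', can be sketched in code. The following is a minimal illustrative example only, not the paper's actual method or data: it assumes two hypothetical session features (screens accessed and minutes on task) and synthetic data generated for demonstration, and applies k-means clustering with four clusters to mirror the four user categories the analysis identified.

```python
# Illustrative sketch only: clustering synthetic audit-trail session
# features into four user 'types'. The features, data, and choice of
# k-means are assumptions for demonstration, not the paper's method.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.preprocessing import StandardScaler

# Synthetic sessions drawn from four invented usage patterns.
# Each row is (screens accessed, minutes on task) for one session.
X, _ = make_blobs(
    n_samples=200,
    centers=[[5, 10], [40, 15], [10, 60], [45, 70]],
    cluster_std=3.0,
    random_state=0,
)

# Standardise features so neither dominates the distance metric.
X_scaled = StandardScaler().fit_transform(X)

# Partition sessions into four clusters, one per hypothesised user type.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_scaled)
```

In practice, as the abstract emphasises, the number and meaning of clusters would be guided by an understanding of the learning environment rather than chosen a priori; the fixed `n_clusters=4` here simply reflects the outcome reported in the paper.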
Articles published in the Australasian Journal of Educational Technology (AJET) are available under Creative Commons Attribution Non-Commercial No Derivatives Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant AJET right of first publication under CC BY-NC-ND 4.0.