A study of meta-analyses reporting quality in the large and expanding literature of educational technology

Authors

Tamim, R. M., Borokhovski, E., Bernard, R. M., Schmid, R. F., Abrami, P. C., & Pickup, D. I.

DOI:

https://doi.org/10.14742/ajet.6322

Keywords:

meta-analysis, systematic review, reporting quality, bias, educational technology, research methodology

Abstract

As the empirical literature in educational technology continues to grow, meta-analyses are increasingly being used to synthesise research to inform practice. However, not all meta-analyses are equal. To examine their evolution over the past 30 years, this study systematically analysed the quality of 52 meta-analyses (1988–2017) on educational technology. Methodological and reporting quality is defined here as the completeness of the descriptive and methodological reporting features of meta-analyses. The study employed the Meta-Analysis Methodological Reporting Quality Guide (MMRQG), an instrument designed to assess 22 areas of reporting quality in meta-analyses. Overall, MMRQG scores were negatively related to average effect size (i.e., the higher the quality, the lower the effect size). Owing to the presence of poor-quality syntheses, the contribution of educational technologies to learning has been overestimated, potentially misleading researchers and practitioners. Nine MMRQG items discriminated between higher and lower average effect sizes. A publication date analysis revealed that older reviews (1988–2009) scored significantly lower on the MMRQG than more recent reviews (2010–2017). Although the increase in quality bodes well for the educational technology literature, many recent meta-analyses still show only moderate levels of quality. Identifying and using only the best evidence-based research is thus imperative to avoid bias.


Implications for practice or policy:

  • Educational technology practitioners should make use of meta-analytical findings that systematically synthesise primary research.
  • Academics, policymakers and practitioners should consider the methodological quality of meta-analyses, because meta-analyses vary in reliability.
  • Academics, policymakers and practitioners could avoid being misled by biased research evidence by using the MMRQG to evaluate the quality of meta-analyses.
  • Meta-analyses with lower MMRQG scores should be treated with caution, as they appear to overestimate the effect of educational technology on learning.


Published

2021-05-29

How to Cite

Tamim, R. M., Borokhovski, E., Bernard, R. M., Schmid, R. F., Abrami, P. C., & Pickup, D. I. (2021). A study of meta-analyses reporting quality in the large and expanding literature of educational technology. Australasian Journal of Educational Technology, 37(4), 100–115. https://doi.org/10.14742/ajet.6322

Issue

37(4), 2021

Section

Articles