A study of meta-analyses reporting quality in the large and expanding literature of educational technology
DOI: https://doi.org/10.14742/ajet.6322
Keywords: meta-analysis, systematic review, reporting quality, bias, educational technology, research methodology
Abstract
As the empirical literature in educational technology continues to grow, meta-analyses are increasingly being used to synthesise research to inform practice. However, not all meta-analyses are equal. To examine their evolution over the past 30 years, this study systematically analysed the quality of 52 meta-analyses (1988–2017) on educational technology. Methodological and reporting quality is defined here as the completeness of the descriptive and methodological reporting features of meta-analyses. The study employed the Meta-Analysis Methodological Reporting Quality Guide (MMRQG), an instrument designed to assess 22 areas of reporting quality in meta-analyses. Overall, MMRQG scores were negatively related to average effect size (i.e., the higher the quality, the lower the effect size). Owing to the presence of poor-quality syntheses, the contribution of educational technologies to learning has been overestimated, potentially misleading researchers and practitioners. Nine MMRQG items discriminated between higher and lower average effect sizes. A publication-date analysis revealed that older reviews (1988–2009) scored significantly lower on the MMRQG than more recent reviews (2010–2017). Although the increase in quality bodes well for the educational technology literature, many recent meta-analyses still show only moderate levels of quality. Identifying and using only the best evidence-based research is thus imperative to avoid bias.
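For context on the quantity the MMRQG scores were compared against: the average effect size of a meta-analysis is conventionally computed as an inverse-variance weighted mean of the study-level effects. The following is a standard textbook formulation (a general sketch, not the specific pooling method used by the reviewed syntheses):

\[
\bar{d} = \frac{\sum_{i=1}^{k} w_i\, d_i}{\sum_{i=1}^{k} w_i},
\qquad
w_i = \frac{1}{\mathrm{SE}(d_i)^2},
\]

where \(d_i\) is the effect size from primary study \(i\) (of \(k\) studies) and \(w_i\) is its inverse-variance weight, so that more precise studies contribute more to the average.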
Implications for practice or policy:
- Educational technology practitioners should make use of meta-analytical findings that systematically synthesise primary research.
- Academics, policymakers and practitioners should consider the methodological quality of meta-analyses, as these vary in reliability.
- Academics, policymakers and practitioners can avoid being misled by biased research evidence by using the MMRQG to evaluate the quality of meta-analyses.
- Meta-analyses with lower MMRQG scores should be interpreted with caution, as they appear to overestimate the effect of educational technology on learning.
License
Articles published in the Australasian Journal of Educational Technology (AJET) are available under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 Licence (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant AJET the right of first publication under CC BY-NC-ND 4.0.
This copyright notice applies to articles published in AJET volumes 36 onwards. Please read about the copyright notices for previous volumes under Journal History.