TY - JOUR
AU - Tamim, Rana M.
AU - Borokhovski, Evgueni
AU - Bernard, Robert M.
AU - Schmid, Richard F.
AU - Abrami, Philip C.
AU - Pickup, David I.
PY - 2021/05/29
Y2 - 2024/03/28
TI - A study of meta-analyses reporting quality in the large and expanding literature of educational technology
JF - Australasian Journal of Educational Technology
JA - AJET
VL - 37
IS - 4
SE - Articles
DO - 10.14742/ajet.6322
UR - https://ajet.org.au/index.php/AJET/article/view/6322
SP - 100
EP - 115
AB - As the empirical literature in educational technology continues to grow, meta-analyses are increasingly being used to synthesise research to inform practice. However, not all meta-analyses are equal. To examine their evolution over the past 30 years, this study systematically analysed the quality of 52 meta-analyses (1988–2017) on educational technology. Methodological and reporting quality is defined here as the completeness of the descriptive and methodological reporting features of meta-analyses. The study employed the Meta-Analysis Methodological Reporting Quality Guide (MMRQG), an instrument designed to assess 22 areas of reporting quality in meta-analyses. Overall, MMRQG scores were negatively related to average effect size (i.e., the higher the quality, the lower the effect size). Owing to the presence of poor-quality syntheses, the contribution of educational technologies to learning has been overestimated, potentially misleading researchers and practitioners. Nine MMRQG items discriminated between higher and lower average effect sizes. A publication date analysis revealed that older reviews (1988–2009) scored significantly lower on the MMRQG than more recent reviews (2010–2017). Although the increase in quality bodes well for the educational technology literature, many recent meta-analyses still show only moderate levels of quality. Identifying and using only the best evidence-based research is thus imperative to avoid bias. Implications for practice or policy: (1) Educational technology practitioners should make use of meta-analytical findings that systematically synthesise primary research. (2) Academics, policymakers and practitioners should consider the methodological quality of meta-analyses, as they vary in reliability. (3) Academics, policymakers and practitioners could avoid misleading bias in research evidence by using the MMRQG to evaluate the quality of meta-analyses. Meta-analyses with lower MMRQG scores should be considered with caution, as they seem to overestimate the effect of educational technology on learning.
ER -