2020 Jul 26;68(4):1991–2001. doi: 10.1007/s11423-020-09811-3

The research we have is not the research we need

Thomas C. Reeves, Lin Lin
PMCID: PMC7382956  PMID: 32837123

Abstract

The special issue “A Synthesis of Systematic Review Research on Emerging Learning Environments and Technologies” edited by Drs. Florence Martin, Vanessa Dennen, and Curtis Bonk has assembled a noteworthy collection of systematic review articles, each focusing on a different aspect of emerging learning technologies. In this conclusion, we focus on these evidence-based reviews and their practical implications for practitioners as well as future researchers. While recognizing the merits of these reviews, we conclude our analysis by encouraging readers to consider conducting educational design research to address serious problems related to teaching, learning, and performance, collaborating more closely with teachers, administrators, and other practitioners in tackling these problems, and always striving to make a difference in the lives of learners around the world.


There is incredible potential for digital technology in and beyond the classroom, but it is vital to rethink how learning is organized if we are to reap the rewards.

- Geoff Mulgan quoted in Burns (2012)

A story in the USA Today newspaper (García Mathewson and Butrymowicz 2020) was titled “Online programs used for coronavirus-era school promise results. The claims are misleading.” The article highlighted the fact that many online education providers who market their products to K-12 educators claim that their online programs are “proven” to be effective by scientific research, but that the evidence for such claims is typically very weak or even nonexistent. The article went on to state that this problem has been exacerbated by a virtual stampede to provide online learning opportunities in the wake of the coronavirus pandemic. School district administrators and other decision-makers have had to rush to find resources to support teaching and learning activities that have suddenly been forced online, and in this scramble they are likely to have been susceptible to false advertising. The authors pointed out that whereas the Food and Drug Administration in the USA restricts what pharmaceutical manufacturers can claim about their drugs and supplements, merchandizers of digital educational technologies have no such restraints.

The authors of the USA Today article are staff writers from The Hechinger Report, an independent nonprofit, nonpartisan organization focused on educating the public about education and how it can be improved. Perhaps there would be little need for The Hechinger Report if the What Works Clearinghouse that was established nearly twenty years ago by the U.S. Department of Education actually functioned as a reliable source of information about the quality of research on education products and programs. However, García Mathewson and Butrymowicz (2020) noted that “out of 10,654 studies included in the What Works Clearinghouse in mid-April, only 188—less than 2 percent—concluded that a product had strong or moderate evidence of effectiveness.”

So where are educational administrators, practitioners, and others to go to find the information that they need to make informed decisions about which educational programs to adopt if commercial advertising is untrustworthy and the What Works Clearinghouse can’t tell them what actually works? Hopefully they can find guidance in the types of systematic reviews found in this special edition of Educational Technology Research and Development. In this concluding paper, we examine the reviews published in this special issue, with special attention to the advice for practitioners that might be found in them. A caveat is warranted given that the systematic reviews in this special issue were not conducted specifically to provide practical implications, but nonetheless they all do to some extent as delineated below.

Countering bogus claims with evidence-based guidance

Exaggerated claims about education technology innovations are hardly new, nor are literature reviews. In describing programmed instruction more than 50 years ago, Slack (1968) wrote:

Thanks to the genius of a few men who have devoted their efforts to new ways to write self-instructional materials — men such as B. F. Skinner, Thomas F. Gilbert, Lloyd Homme, Donald Tosti, Frederick Keller and others — the lowly workbook has undergone marvelous improvement in the last decade. The workbook has become so good that it works all by itself without a teacher to help fill in the answers. The new workbooks are guaranteed to teach all by themselves, without a teacher. They must be passed out, however, filled in by the student and collected by someone capable of managing the class. But this person does not have to do the teaching. That is done by the workbook, which, by the way, works just as well with drop-outs and delinquents as it does with nice little boys and girls. All in all, a marvelous thing. (Bold in original.)

The over one-hundred-year history of educational technology is replete with such outlandish claims starting perhaps with Edison’s prediction in 1913 that films would soon replace textbooks in classrooms (Reiser 2001). Literature reviews have also been around for decades. For example, more than a half century ago, Chu and Schramm (1967) synthesized the research on instructional television. As comprehensive as their seminal literature review was, the distinguished authors provided at best ambiguous guidance to practitioners:

Assuming a degree of caution in applying results, what kinds of guidelines can we extract from this body of research? For one thing, it has become clear that there is no longer any reason to raise the question whether instructional television can serve as an efficient tool of learning. This is not to say that it always does. But the evidence is now overwhelming that it can, and, under favorable circumstances, does. (p. 98; underlined words were italicized in original.)

The editors of this special edition of ETRD point out that the papers herein are not literature reviews in the traditional sense, but systematic reviews that utilize “a specific methodology to locate, select, evaluate, analyze, synthesize, and report evidence that helps to draw conclusions.” Meta-analysis is utilized in five of these systematic reviews whereas the other eight employ different cutting-edge review approaches such as scoping reviews (Arksey and O'Malley 2005).

Meta-analysis was largely unknown in educational technology research until Glass (1976) promoted its value for the educational research community at large. Since then, it has flourished with such high-profile educational researchers as Robert E. Slavin (cf. Cheung and Slavin 2012), John Hattie (cf. Hattie 2009), and James A. Kulik (cf. Kulik et al. 1985) conducting these types of analyses over the years. Unfortunately, educational technologies per se generally have not fared well in these analyses. For example, Cheung and Slavin (2013) conducted a meta-analysis to answer the question: “Do education technology applications improve mathematics achievement in K-12 classrooms as compared to traditional teaching methods without education technology?” They reported “the findings suggest that educational technology applications generally produced a positive, though modest, effect (ES = +0.15)” (p. 88).

The modest effect size reported by Cheung and Slavin (2013) does not come close to the 0.4 effect size that Hattie (2009) argued is necessary for any educational treatment to be taken seriously. Hattie’s (2009) comprehensive review of meta-analyses examining the effectiveness of 135 educational variables showed that only instructional design processes (e.g. feedback to students and mastery learning) rather than technologies per se (e.g. computer-based instruction or games) have demonstrated worthwhile impact on learning.
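For readers unfamiliar with the effect size metric these meta-analyses rely on, the following sketch illustrates how a standardized mean difference (Cohen's d, the basis of the ES values quoted above) is computed. The numbers are invented for illustration only and are not drawn from any study cited here:

```python
import math

def cohens_d(mean_treatment, mean_control, sd_treatment, sd_control,
             n_treatment, n_control):
    """Standardized mean difference (Cohen's d) using the pooled standard deviation."""
    pooled_var = (((n_treatment - 1) * sd_treatment ** 2 +
                   (n_control - 1) * sd_control ** 2) /
                  (n_treatment + n_control - 2))
    return (mean_treatment - mean_control) / math.sqrt(pooled_var)

# Hypothetical classroom comparison: a 1.5-point gain on a test whose
# scores have a standard deviation of about 10 points in both groups.
d = cohens_d(76.5, 75.0, 10.0, 10.0, 50, 50)
print(round(d, 2))  # 0.15
```

A d of 0.15 means the average treatment-group student scored only 0.15 standard deviations above the average control-group student, which conveys why such effects are described as modest relative to the 0.4 benchmark Hattie proposed.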

What guidance do systematic reviews provide practitioners?

In this section of the paper, we have attempted to tease out kernels of practical advice for practitioners that can be found in the 13 systematic reviews of learning technologies and environments. Of course, we recognize that these systematic reviews were not written for practitioners but were primarily oriented toward other researchers and/or developers. We also acknowledge that the evidence that educational decision makers and practitioners actually avail themselves of educational research findings of any kind is lacking (Ion and Iucu 2014; Vanderlinde and van Braak 2010). Table 1 (below) summarizes the recommendations for other researchers proffered by the authors in their assorted reviews as well as the implications for practitioners that we have gleaned from the reviews.

Table 1.

Summary of advice for researchers and practitioners based on the 13 systematic reviews

Topic: Social media and school context
Authors: Dennen, Choi, and Word
Recommendations for researchers: The systemic nature of social media use by individuals should be considered; studies bridging disciplines and contexts need to be conducted.
Implications for practitioners: There does not yet exist a robust set of design principles for incorporating social media into teaching and learning at any level. Hence, efforts to utilize social media to support teaching and learning should be approached carefully and best implemented and tested using action research with clear pedagogical needs in mind.

Topic: Social media and professional development in higher education
Authors: Luo, Freeman, and Stefaniak
Recommendations for researchers: Further systematic research is needed using different research methods to investigate specific components and claims of social media-based professional learning.
Implications for practitioners: Social media-supported professional learning networks and communities of practice can potentially contribute to faculty learning; yet, challenges exist in sustaining faculty participation, engagement, and effective navigation of the social media space.

Topic: MOOC research
Authors: Zhu, Sari, and Lee
Recommendations for researchers: Most MOOC research to date has focused on learner issues. MOOC researchers should design comprehensive studies of various MOOC stakeholders including instructors, designers, or program administrators.
Implications for practitioners: MOOCs are researched around the world and research is becoming cross-institutionally and internationally collaborative. MOOC instructors and instructional designers should communicate and collaborate globally to share design approaches and pedagogical practices to improve the quality of MOOCs and meet diverse learner needs and expectations.

Topic: Using technology in special education
Authors: Oluvabunmi, Akcayir, Ishola, and Demmans Epp
Recommendations for researchers: More research is needed into learning technologies for life, job skills and training, and social skills; further studies in everyday settings could help students to contextualize their learning.
Implications for practitioners: We need to keep improving the design and deployment of educational technologies for supporting those with developmental and cognitive disorders to contribute towards creating a future that offers equitable educational opportunities to every student.

Topic: Mobile technologies on preschool and elementary children's literacy achievement
Authors: Eutsler, Mitchell, Stamm, and Kogut
Recommendations for researchers: Transparency is needed in reporting participant characteristics so that the studies are applicable to other contexts; standardized achievement measures are needed to address reliability and replicability of the studies.
Implications for practitioners: To use mobile apps effectively, practitioners must ensure alignment between the app characteristics and the learning goals (e.g. literacy domain). It is unrealistic to expect too much impact from a single app, and thus multiple interventions and/or apps may be needed to improve students’ literacy over time.

Topic: Mobile devices on language learning
Authors: Chen, Chen, Jia, and An
Recommendations for researchers: MALL (mobile-assisted language learning) studies employing the situated and collaborative features of mobile learning produce a high effect. These features of mobile technologies are transforming the way we live, work, and learn. Future research should explore the role of mobile devices in shaping the relationship between people, technologies, and learning contexts.
Implications for practitioners: Language learning through mobile devices can be more effective than some traditional instructional approaches. Effective language learning can be enhanced through situated and collaborative features in MALL.

Topic: Mobile game-based learning in STEM education
Authors: Gao, Li, and Sun
Recommendations for researchers: The following studies are needed: studies that move beyond simple comparison of traditional approach and mobile game-based approach; studies that base the design of mobile games on learning and motivational theories; studies that adopt a theoretical framework to categorize learning outcomes; and studies that utilize innovative research techniques.
Implications for practitioners: Mobile game-based learning has a potential to motivate students to learn STEM and improve their learning. It is important for instructional designers and educators to make sure that the game design is aligned with learning principles. Other factors including learner characteristics, learning contexts, learning content, learning goals, learning activities, and game features should be taken into consideration throughout the design and implementation process.

Topic: Wearables research in educational settings
Authors: Havard and Podsiad
Recommendations for researchers: Researchers need to explore wearable technology and conduct rigorous quantitative studies that can evolve the current literature.
Implications for practitioners: When using wearables in educational settings, instructors should focus on aligning objectives, pedagogical strategies, specific affordances of the wearables, and assessment strategies in their lesson planning and implementation.

Topic: Competition in digital game-based learning
Authors: Chen, Shih, and Law
Recommendations for researchers: More empirical research is needed in this area, particularly examinations of different game elements to enhance cognitive and non-cognitive learning outcomes.
Implications for practitioners: Competitions worked better in math education than in language learning and science education. Education practitioners are encouraged to include competition in game-based learning environments.

Topic: Gamification in educational settings on student learning outcomes
Authors: Huang, Ritzhaupt, Sommer, Zhu, Stephen, Valle, Hampton, and Li
Recommendations for researchers: Researchers should move past this current era of “pointification” (points, badges, and leaderboards) and evolve into something with more potential in facilitating learning.
Implications for practitioners: The integration of gamification along with other innovations (e.g. flipped classroom or learning analytics) in educational settings has the potential to advance educational outcomes.

Topic: Adaptive learning research
Authors: Martin, Chen, Moore, and Westine
Recommendations for researchers: Adaptive learning researchers need to consider the broader scope of the adaptive learning model to include both the source and target. Future studies should focus on the increasing availability and capacities of adaptive learning as a learning technology to assist individual learning and personalized growth.
Implications for practitioners: For designers, developers, and instructors, it might be beneficial to know that learning can be adapted based on a variety of learner characteristics such as knowledge and metacognitive knowledge, preference, behavior, profile, ability, and interest. In addition, content (including presentation, assessment, feedback, and navigation) can be adapted in several ways, such as topic and question difficulty, learning sequence, path, pacing, and material format.

Topic: Adaptivity in educational games
Authors: Liu, Moon, Kim, and Dai
Recommendations for researchers: The adaptive learning condition did not result in a substantial overall effect compared to a non-adaptive condition. Future research should encourage more rigorous experimental design to validate the added value of adaptivity.
Implications for practitioners: Researchers and practitioners should seek sound theories from the learning sciences and instructional design that explain: (a) how learning happens through in-game interactions, and (b) how to support game-based learning experiences.

Topic: Utilizing learning analytics for study success
Authors: Ifenthaler and Yau
Recommendations for researchers: A wider adoption of learning analytics systems is needed. Standardized measures, visualizations, and interventions need to be integrated into digital learning environments so as to reliably predict at-risk students and to provide personalized prevention and intervention strategies.
Implications for practitioners: Teachers should be encouraged to further their educational data literacy, especially with respect to the ethically responsible collection, management, analysis, comprehension, interpretation, and application of data from educational contexts.

The reviews published in this special issue of Educational Technology Research and Development have all been conducted rigorously according to the guidelines of the specific review processes used and reported skillfully with the guidance of the editors. As such, the papers provide an up-to-date portrayal of educational technology research across a wide variety of contexts. These reviews are especially useful in providing departure points for other researchers who seek to advance educational technology research in these contexts. Although clearly not intended to do so, each review also offers useful implications for practitioners.

In light of the research we have, what is the research we need?

Despite the high quality of the systematic reviews found in this special issue, they are only as good as the research studies that were incorporated into them. A major contributor to the paucity of practical guidance in these and other types of systematic reviews is that they are primarily focused on the “things” of our field such as wearable technologies and mobile devices rather than on the “problems” faced by teachers and students such as the lack of engagement of students in online learning (Stott 2016) or the failure to develop higher order learning in STEM education (Sadler and Zeidler 2009). Bonk and Wiley (2020) also note this tendency to focus on things rather than problems in the Preface to this special issue.

Improving educational opportunities is essential to addressing the major problems that the world confronts today such as poverty, climate change, racism, and the current global pandemic (Desai et al. 2018). Unfortunately, educational technology research does not have a distinguished record in dealing with local educational problems, much less global ones (Reich, in press). Arguably, a major contributor to this poor record is the focus on the things of educational technology rather than the problems of practitioners (Reeves and Reeves 2015a).

Fortunately, there are other ways to conduct educational technology research than the quasi-experimental methods employed in most of the studies included in the systematic reviews found in this special issue. One viable option is educational design research (EDR) (McKenney and Reeves 2019). EDR is a genre of educational research in which the iterative development of solutions to complex educational problems is pursued through empirical investigation, in tandem with efforts to reveal and enhance theoretical understanding. Such efforts can serve to guide educational practitioners as well as other researchers. As a genre of educational inquiry rather than a discrete methodology, EDR encompasses a family of approaches that endeavor to accomplish the twofold goal of designing and implementing interventions that solve serious educational problems while at the same time developing enhanced theoretical understanding to inform further practice. This family of approaches includes design-based research (Design-Based Research Collective 2003), design-based implementation research (Fishman et al. 2013), development research (van den Akker 1999), design experiments (Middleton et al. 2008), and formative research (Newman 1990), among others.

For most of its decades-long history as a distinct field (Reiser 2001), educational technology researchers have focused primarily on the question “What works?” with respect to teaching and learning with technology, leading to findings that are often so weak as to be practically inconsequential. Educational design research fundamentally shifts the focus of research from the often fruitless “what works?” question to the more socially responsible questions: “What is the problem, how can we solve it, and what new knowledge can be derived from the solution?”

It should be clear that we can no longer afford to have research agendas exclusively focused on things (e.g. digital games) rather than problems (e.g. the under-representation of women and minorities in STEM fields). Fortunately, the situation is improving. Notably, the new 5th edition of the Handbook of Research on Educational Communications and Technology (Bishop et al. 2020) has a very different focus from the previous four Handbooks, which included many chapters focused on specific technologies as things. Whereas the fourth edition of the Handbook (Spector et al. 2014) has a dozen chapters on ‘emerging technologies’ such as e-books and open educational resources (OER), most chapters in the 5th edition are focused on complex problems and review the extent to which these problems can be addressed using innovative learning designs and appropriate applications of technology. In the new Handbook, researchers and practitioners will find an excellent set of contributions that target serious educational problems such as increasing the accessibility of online learning environments and motivating and engaging students using emerging technologies. As a bonus, the 2020 Handbook includes thirteen design cases that are uniquely indexed with the research chapters focused on specific problems.

What else can be done to move from the research we have to the research we need? Reeves and Reeves (2015b) considered this question and responded by stating that our field is at a critical fork in the road:

Educational technology researchers may continue as we have for decades, conducting isolated studies focused on new things rather than significant problems, publishing our research in refereed journals just enough to ensure that our careers are advanced, working as recognized “scholars” but having virtually no impact on practice, and being largely unresponsive to the enormous challenges the world confronts around issues related to teaching and learning. Or we could take a new direction whereby we develop robust, multi-year research agendas focused on important problems and innovative solutions, judge our worthiness for promotion and tenure on evidence of impact rather than simple article counts, closely collaborate with practitioners, and establish our field as preeminent in meeting global problems related to education.

Of course, we are not suggesting that research focused on the things of educational technology has no merit, and we are fully cognizant that such studies will continue to be conducted every time an educational technology innovation appears. However, we encourage readers of this noteworthy special issue of ETR&D to focus on serious problems related to teaching, learning, and performance, collaborate closely with teachers, administrators, and other practitioners, and seek to make a difference in the lives of learners around the world. Instead of expecting the next “killer app” or technical innovation to transform education, perhaps we can better transform education by fostering incremental changes through collaborative research and development with practitioners (Reich, in press).

Biographies

Thomas C. Reeves

is Professor Emeritus of Learning, Design and Technology at the University of Georgia. Dr. Reeves earned his Ph.D. at Syracuse University. He was a Fulbright Lecturer in Peru and he has been an invited speaker in the USA and 30 other countries.

Lin Lin

is a Professor of Learning Technologies at the University of North Texas. She received her doctoral degree at Teachers College, Columbia University. Her research interests lie in the intersections of mind, brain, technology, and learning.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflicts of interest.

Ethical approval

No human participants were involved in this writing because this is a concluding piece for a special issue.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Thomas C. Reeves, Email: treeves@uga.edu

Lin Lin, Email: Lin.Lin@unt.edu.

References

  1. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. International Journal of Social Research Methodology. 2005;8(1):19–32. [Google Scholar]
  2. Bishop MJ, Boling E, Elen J, Svihla V, editors. Handbook of research on educational communications and technology. New York: Springer; 2020. [Google Scholar]
  3. Bonk CJ, Wiley DA. Preface: Reflections on the waves of emerging learning technologies. Educational Technology Research and Development. 2020 doi: 10.1007/s11423-020-09809-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Burns, J. (2012). Costly hi-tech kit lies unused in schools, says study. BBC News. https://www.bbc.com/news/education-20348322
  5. Cheung AC, Slavin RE. The effectiveness of educational technology applications for enhancing mathematics achievement in K-12 classrooms: a meta-analysis. Educational Research Review. 2013;9:88–113. [Google Scholar]
  6. Cheung AC, Slavin RE. How features of educational technology applications affect student reading outcomes: a meta-analysis. Educational Research Review. 2012;7(3):198–215. [Google Scholar]
  7. Chu G, Schramm W. Learning from television: what the research says. Stanford, CA: Institute for Communications Research; 1967. [Google Scholar]
  8. Desai RM, Kato H, Kharas H, McArthur JW. From summits to solutions: innovations in implementing the Sustainable Development Goals. Washington, DC: Brookings Institution Press; 2018. [Google Scholar]
  9. Design-Based Research Collective Design-based research: an emerging paradigm for educational inquiry. Educational Researcher. 2003;32(1):5–8. [Google Scholar]
  10. Fishman BJ, Penuel WR, Allen A-R, Cheng BH, Sabelli N. Design based implementation research: an emerging model for transforming the relationship of research and practice. National Society for the Study of Education. 2013;112(2):136–156. [Google Scholar]
  11. García Mathewson, T., & Butrymowicz, S. (2020) Online programs used for coronavirus-era school promise results. The claims are misleading. USA Today. Retrieved from https://www.usatoday.com/story/news/education/2020/05/20/coronavirus-online-school-programs-learning-games/5218747002/
  12. Glass GV. Primary, secondary, and meta-analysis of research. Educational Researcher. 1976;5(10):3–8. [Google Scholar]
  13. Hattie JAC. Visible learning: a synthesis of 800+ meta-analyses on achievement. New York: Routledge; 2009. [Google Scholar]
  14. Ion G, Iucu R. Professionals' perceptions about the use of research in educational practice. European Journal of Higher Education. 2014;4(4):334–347. [Google Scholar]
  15. Kulik JA, Kulik CLC, Bangert-Drowns RL. Effectiveness of computer-based education in elementary schools. Computers in Human Behavior. 1985;1(1):59–74. [Google Scholar]
  16. McKenney SE, Reeves TC. Conducting educational design research. 2. New York: Routledge; 2019. [Google Scholar]
  17. Middleton J, Gorard S, Taylor C, Bannan-Ritland B. The “compleat” design experiment: from soup to nuts. In: Kelly A, Lesh R, Baek J, editors. Handbook of design research methods in education: innovations in science, technology, engineering, and mathematics learning and teaching. New York: Routledge; 2008. pp. 21–46. [Google Scholar]
  18. Newman D. Opportunities for research on the organizational impact of school computers. Educational Researcher. 1990;19(3):8–13. [Google Scholar]
  19. Reeves TC, Reeves PM. Reorienting educational technology research from things to problems. Learning: Research and Practice. 2015;1(1):91–93. [Google Scholar]
  20. Reeves TC, Reeves PM. Educational technology research in a VUCA world. Educational Technology. 2015;55(2):26–30. [Google Scholar]
  21. Reich, J. (in press). Failure to disrupt. Why technology alone can’t transform education. Cambridge, MA: Harvard University Press.
  22. Reiser RA. A history of instructional design and technology: part I: a history of instructional media. Educational Technology Research and Development. 2001;49(1):53–64. [Google Scholar]
  23. Sadler TD, Zeidler DL. Scientific literacy, PISA, and socioscientific discourse: assessment for progressive aims of science education. Journal of Research in Science Teaching. 2009;46:909–921. [Google Scholar]
  24. Slack CW. Who is the educational technologist? And where is he? Educational Technology. 1968;8(14):13–18. [Google Scholar]
  25. Spector JM, Merrill MD, Elen J, Bishop MJ, editors. Handbook of research on educational communications and technology. New York: Springer; 2014. [Google Scholar]
  26. Stott P. The perils of a lack of student engagement: Reflections of a “lonely, brave, and rather exposed” online instructor. British Journal of Educational Technology. 2016;47(1):51–64. [Google Scholar]
  27. van den Akker J. Principles and methods of development research. In: van den Akker J, Branch R, Gustafson K, Nieveen N, Plomp T, editors. Design approaches and tools in education and training. Dordrecht, NL: Kluwer Academic Publishers; 1999. pp. 1–14. [Google Scholar]
  28. Vanderlinde R, van Braak J. The gap between educational research and practice: Views of teachers, school leaders, intermediaries and researchers. British Educational Research Journal. 2010;36(2):299–316. [Google Scholar]

Articles from Educational Technology Research and Development are provided here courtesy of Nature Publishing Group
