Frontiers in Psychology
Editorial

2026 Mar 31;17:1779841. doi: 10.3389/fpsyg.2026.1779841

Evaluating the credibility of claims and recommendations in educational psychology: a grand challenge

Daniel H Robinson 1,*
PMCID: PMC13076310  PMID: 41987972

I started my term as Specialty Chief Section Editor for Frontiers in Psychology: Educational Psychology (FIP:EP) on September 1, 2025. I previously served as editor of Educational Psychology Review (2006–2025) and as associate editor of the Journal of Educational Psychology (2014–2020). What is most different about my latest editorial role is the sheer number of manuscripts. Among the 34 sections of Frontiers in Psychology, the educational psychology section is the largest. In 2025, we received well over 2,000 submissions. For comparison, the Journal of Educational Psychology received 837 submissions in 2023.

Beyond taking on a role that requires herculean efforts from everyone involved in the editorial process, I am also becoming more familiar with the type of research that is submitted, the claims the authors make, and the implications for the field. One question has far outweighed all others as I find my footing in my new role: what is the evidence base for claims and recommendations?

This question is not a new one for me by any means. For over 20 years, my colleagues and I have examined trends in the field of educational psychology in an attempt to answer it (Brady et al., 2023; Hsieh et al., 2005; Reinhart et al., 2013; Robinson et al., 2007). Three themes have emerged from this work: (1) the proportion of empirical articles published in educational psychology journals (Journal of Educational Psychology, American Educational Research Journal, Cognition and Instruction, Journal of Experimental Education, and Contemporary Educational Psychology) that employ intervention methods has decreased from 47% in 1994 to 25% in 2020; (2) conversely, the proportion that employ observational methods has increased from 43% in 1994 to 66% in 2020; and (3) the proportion of observational articles that include recommendations for practice has increased from 30% in 1994 to 66% in 2020.

An astute reader will have noticed that FIP:EP is not listed among the journals we consider to be the Big Five journals in the field. There are two reasons for this exclusion. First, the journal is relatively new, having launched in 2010. Second, unlike the other five journals, FIP:EP is exclusively open access and charges author publication fees. But now that FIP:EP is clearly the most popular educational psychology journal in the world, it is time that we take stock of where we are and how far we have come. Thus, my grand challenge in this editorial is for a similar examination of the FIP:EP section.

The latest article in our line of inquiry is from Brady et al. (2023). As of December 22, 2025, this article has been cited 50 times according to Google Scholar and 34 times according to the Web of Science. It seems that questioning the scientific validity of a field elicits a considerable response. I want to challenge the readers of, and contributors to, FIP:EP to seriously consider what this means for them.

In a response to the reactions from the field, Robinson and Wainer (2023) wrote:

In educational psychology, we have moved away from a very important component of research—rigor. Instead, we have been seduced by what is simple and easy. The fruits of this latter type of research are rarely useful in providing causal evidence. (p. 82)

As section editor, I want to invite manuscripts that examine whether the articles published in FIP:EP have an evidence base that supports their corresponding claims and recommendations.

I also welcome other examinations of FIP:EP, including of who has contributed as authors, editorial board members, and so on. Are women participating at a rate consistent with their increasing involvement in the field as researchers, faculty, and students? I have been involved in similar investigations (e.g., Fong et al., 2022; Griffin et al., 2023) and am convinced that such studies can serve as a healthy audit for the journal.

Besides looking backwards and assessing how far we have come as a field, we also need to consider the future. What will be the key issues and new directions in educational psychology? I see two areas that are relevant here. First, how will artificial intelligence (AI) influence how we pursue knowledge? Generative AI has already begun to shape the way we teach, assess, and learn (Mayrath et al., in press). But along with enthusiasm about its potential, there is an accompanying fear that AI will hurt student engagement and learning by providing easy shortcuts and new ways to plagiarize. I expect to see more submissions that address how we can harness AI to augment, rather than replace, student learning. How can AI be used to improve the assessment of learning? How will the roles of the instructor change? How will learning materials be different in this age of AI?

Second, getting back to the main point of this editorial, educational psychology must also continue to deal with the abundance of misinformation, pseudoscience, and quackery that appears daily in popular media outlets. We must continue to serve as watchdogs and call out policies that are not supported by scientific research. In the U.S., we have witnessed a resurgence of childhood diseases such as measles, once declared eliminated, due to vaccine hesitancy. Distrust in science is reaching an all-time high. In education, we continue to see recommendations to teach to students' learning styles to improve learning despite overwhelming evidence to the contrary (Cleary and Robinson, 2025; Robinson et al., 2022). Other popular educational practices have been challenged recently, including universal design for learning (Boysen, 2024) and growth mindset training (Macnamara and Burgoyne, 2023). Our first obligation must be to protect society from our field. Educators look to us for practical solutions to difficult problems. When we publish empirical findings that are accompanied by unwarranted claims and recommendations, we do a huge disservice by providing more misinformation. Educational psychology needs to continue to identify what does and does not work.

Footnotes

Edited and reviewed by: Axel Cleeremans, Université Libre de Bruxelles, Belgium

Author contributions

DR: Writing – original draft, Writing – review & editing.

Conflict of interest

The author(s) declared that this work was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Generative AI statement

The author(s) declared that generative AI was not used in the creation of this manuscript.

Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.

Publisher's note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

  1. Boysen G. A. (2024). Lessons (not) learned: the troubling similarities between learning styles and universal design for learning. Scholarsh. Teach. Learn. Psychol. 10, 207–221. doi: 10.1037/stl0000280 [DOI] [Google Scholar]
  2. Brady A. C., Griffin M. M., Lewis A. R., Fong C. J., Robinson D. H. (2023). How scientific is educational psychology research? The increasing trend of squeezing causality and recommendations from non-intervention studies. Educ. Psychol. Rev. 35, 36–37. doi: 10.1007/s10648-023-09759-9 [DOI] [Google Scholar]
  3. Cleary A. M., Robinson D. H. (2025). Institutionalized misinformation in U.S. education: combatting the overselling of learning styles and underselling of spaced effort. J. Soc. Issues 81, 1–16. doi: 10.1111/josi.70005 [DOI] [Google Scholar]
  4. Fong C. J., Flanigan A. E., Hogan E., Brady A. C., Griffin M. M., Gonzales C., et al. (2022). Individual and institutional productivity in educational psychology journals from 2015–2021. Educ. Psychol. Rev. 34, 2379–2403. doi: 10.1007/s10648-022-09704-2 [DOI] [Google Scholar]
  5. Griffin M. M., Hogan E., Fong C. J., Gonzales C., Fathi Z., Robinson D. H. (2023). Women as top-producing authors, editors, and editorial board members in educational psychology journals from 2017–2021. Educ. Psychol. Rev. 35:10. doi: 10.1007/s10648-023-09744-2 [DOI] [Google Scholar]
  6. Hsieh P., Acee T., Chung W.-H., Hsieh Y.-P., Kim H., Thomas G. D., et al. (2005). Is educational intervention research on the decline? J. Educ. Psychol. 97, 523–529. doi: 10.1037/0022-0663.97.4.523 [DOI] [Google Scholar]
  7. Macnamara B. N., Burgoyne A. P. (2023). Do growth mindset interventions impact students' academic achievement? A systematic review and meta-analysis with recommendations for best practices. Psychol. Bull. 149, 133–173. doi: 10.1037/bul0000352 [DOI] [PubMed] [Google Scholar]
  8. Mayrath M., Behrens J. T., Robinson D. H. (eds.). (in press). Handbook of Generative Artificial Intelligence in Education: Integrating Research Into Practice. Springer. [Google Scholar]
  9. Reinhart A. L., Haring S. H., Levin J. R., Patall E. A., Robinson D. H. (2013). Models of not-so-good behavior: yet another way to squeeze causality and recommendations for practice out of correlational data. J. Educ. Psychol. 105, 241–247. doi: 10.1037/a0030368 [DOI] [Google Scholar]
  10. Robinson D. H., Levin J. R., Thomas G. D., Pituch K. A., Vaughn S. (2007). The incidence of “causal” statements in teaching-and-learning research journals. Am. Educ. Res. J. 44, 400–413. doi: 10.3102/0002831207302174 [DOI] [Google Scholar]
  11. Robinson D. H., Wainer H. (2023). It's just an observation. Educ. Psychol. Rev. 35:83. doi: 10.1007/s10648-023-09804-7 [DOI] [Google Scholar]
  12. Robinson D. H., Yan V. X., Kim J. A. (Eds.). (2022). Learning Styles on Trial: Implications for Classroom Instruction and Student Achievement. Cham: Springer. doi: 10.1007/978-3-030-90792-1 [DOI] [Google Scholar]

Articles from Frontiers in Psychology are provided here courtesy of Frontiers Media SA
