Canadian Journal of Psychiatry / Revue Canadienne de Psychiatrie. 2015 Dec;60(12):534–540. doi: 10.1177/070674371506001203

Evidence-Based Practice: Separating Science From Pseudoscience

Catherine M Lee, John Hunsley
PMCID: PMC4679161  PMID: 26720821

Abstract

Evidence-based practice (EBP) requires that clinicians be guided by the best available evidence. In this article, we address the impact of science and pseudoscience on psychotherapy in psychiatric practice. We outline the key principles of evidence-based intervention, then describe pseudoscience and provide illustrative examples of popular intervention practices that have not been abandoned, despite evidence that they are not efficacious and may be harmful. We distinguish efficacy from effectiveness, and describe modular approaches to treatment. Reasons for the persistence of practices that are not evidence based are examined at both the individual and the professional system level. Finally, we offer suggestions for the promotion of EBP through clinical practice guidelines, modelling of scientific decision making, and training in core skills.

Keywords: pseudoscience, evidence-based practice, critical appraisal, heuristics and biases


Burgeoning rates of mental disorder in most countries (including Canada) and a global shortage of psychiatrists1 require judicious allocation of resources to address mental health problems.2 The promotion of EBP helps ensure that health care resources yield maximum returns.3 The commitment to EBP in Canadian psychiatry was articulated in the submission made by the RCPSC to the Romanow Commission on Health Care in 2002. Requirements for psychiatry residency education from the RCPSC and the US Accreditation Council for Graduate Medical Education include competence in the provision of EBP, although they do not address how training should be delivered or how outcomes should be measured.4

With the broad professional context of EBP in mind, we address the impact of science and pseudoscience on psychiatry. We begin by discussing key aspects of the EBP movement in mental health care. EBP involves initial assessment, followed by the selection of a treatment, and ongoing monitoring to determine the usefulness of the treatment for the patient. In this article, we focus on psychotherapy, presenting examples that are evidence based as well as those that are not (see online Supplemental Materials for a review of issues in evidence-based assessment). Finally, we examine reasons for the reluctance of many clinicians to embrace EBP and conclude with constructive strategies to promote EBP.

Evidence-Based Practice

There is a long-standing emphasis on the importance of research evidence in psychiatry. Almost a century ago, Eugen Bleuler criticized the “autistic and undisciplined thinking”5, p 1 in medicine that allowed unsubstantiated treatments to flourish. Bleuler suggested that the motivation to alleviate suffering, combined with limited knowledge, created conditions in which physicians and patients are fooled into believing that an ineffective treatment works. There is no ethically defensible argument against the need to offer treatments that have been demonstrated to work for patients with similar problems and that are compatible with their needs and preferences. Still, a thorny question remains: What constitutes evidence that an intervention works?

One model for choosing psychotherapies is comparable to that used to evaluate medications: establish methodological criteria for evaluating studies, create lists of treatments that meet these criteria, and promote these treatments. This approach was adopted in the early 1990s, by task forces of the American Psychological Association’s Society of Clinical Psychology, to generate lists of ESTs.6 Classification as an EST required sufficient research of high quality (multiple RCTs or single-participant designs), demonstrating efficacy of the intervention in clients with a specific condition.

These initiatives underlined that efficacious psychosocial treatments exist for many disorders across the lifespan (see also Roth and Fonagy,7 Fonagy et al,8,9 and Nathan and Gorman10–13). This encouraging news has not been embraced by clinicians as positively as one might expect, nor has it led to the widespread adoption of ESTs.14,15 Critics of ESTs have expressed concerns about the scientific soundness of the endeavour and its potential negative impact on clinicians who do not offer ESTs.16 They noted that a treatment with a positive, but limited, evidence base could fall short of EST criteria. The EST initiative was an important step in the development and promotion of EBP, but it is not synonymous with EBP. Originating in Canada and the United Kingdom, evidence-based medicine is based on the premise that the application of empirical knowledge improves patient care.3 This movement spread to the United States17 and other countries. The term evidence based denotes the synthesis of information from a wide array of sources to guide clinicians in selecting the best available treatment options.17 Training in EBP encompasses attention to the content of the evidence base and to the process of evidence-based decision making.4 Sources of information typically considered in EBP include systematic data, clinical expertise, and patient preferences.

Two North American psychological associations created task forces to operationalize evidence-based psychotherapies. The American Psychological Association Presidential Task Force on Evidence-Based Practice18 defined EBP as the integration of the best available research and clinical expertise within the context of client characteristics, culture, values, and treatment preferences. The assertion that treatment should be informed by research but determined by other clinical information, client choice, and the likely costs and benefits of treatments has sometimes been interpreted as suggesting that research evidence is the least important consideration in EBP.19 Additionally, the American Psychological Association’s statement was silent on the need to ensure that clients’ views were based on accurate rather than inaccurate information. Clients often request a treatment they have heard about from a friend or read about online, unaware that it is not backed by scientific evidence. In such situations, clinicians bear a responsibility to educate their clients to ensure that their choices are informed by sound knowledge.

Clinical Implications

  • EBP is an essential safeguard against pseudoscience and potentially harmful assessment and treatment methods.

  • Even interventions that appear to be plausible can be ineffective or even iatrogenic.

Limitations

  • It can be difficult to distinguish between science and some forms of pseudoscience.

  • Research on the effectiveness of training psychiatrists in EBP is still in its infancy.

The Canadian Psychological Association Task Force on Evidence-Based Practice of Psychological Treatments20 asserted that EBP should rely, first and foremost, on peer-reviewed research. The task force recognized the potential contribution of diverse methodologies to yield evidence to guide practice, but established a hierarchy in which the strongest weight is given to research that is based on data from the highest-quality research designs and that has been scrutinized by peer review. Clinicians are exhorted to apply their knowledge of the best available research, considering client characteristics, cultural backgrounds, and treatment preferences. Further, they are expected to continuously monitor the effects of treatment and to adjust treatment when appropriate. The monitoring of treatment effects is a key element of EBP and has been shown to have a substantial positive impact on treatment outcomes.21 Although surveys indicate that mental health professionals underuse patient-report measures when making decisions about clinical services,22,23 there are widely available sources that summarize psychometrically sound instruments that can be used for treatment planning and monitoring.24,25 Additionally, recent initiatives in the United States have led to the widespread availability of psychometrically sound, patient-reported health status and social well-being measures that can also be used for service delivery decisions (see Patient Reported Outcomes Measurement Information System [PROMIS] Network26).
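To make the monitoring element concrete, consider a minimal sketch in Python. It assumes a hypothetical symptom questionnaire on which lower scores are better; the clinical cutoff and reliable-change threshold are invented for illustration (real instruments publish their own values). The logic is simply to compare each session's score with baseline and flag changes that exceed measurement noise in either direction.

```python
# Minimal sketch of session-by-session outcome monitoring.
# The cutoff and reliable-change values below are illustrative
# assumptions, not values from any published instrument.

CLINICAL_CUTOFF = 14   # hypothetical boundary of the clinical range
RELIABLE_CHANGE = 5    # hypothetical reliable-change index: smaller
                       # score shifts may be measurement noise

def monitor(baseline: int, session_scores: list[int]) -> list[str]:
    """Flag each session as improved, deteriorated, or unchanged
    relative to baseline, using the reliable-change threshold."""
    flags = []
    for score in session_scores:
        change = baseline - score  # positive = symptoms decreased
        if change >= RELIABLE_CHANGE and score < CLINICAL_CUTOFF:
            flags.append("reliably improved, below clinical range")
        elif change >= RELIABLE_CHANGE:
            flags.append("reliably improved")
        elif change <= -RELIABLE_CHANGE:
            flags.append("reliably deteriorated -- review treatment plan")
        else:
            flags.append("no reliable change")
    return flags

# Example: a patient whose scores worsen mid-treatment is flagged
# at session 3, prompting the clinician to adjust course.
print(monitor(baseline=20, session_scores=[18, 19, 26, 15, 9]))
```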

Science, Compared With Pseudoscience

With the move toward scientifically supported practice, one commonly sees monikers, such as scientifically proven or evidence-informed, attached to descriptions of treatments. The cachet of scientific support is a potent marketing tool. Lilienfeld et al27 viewed science and pseudoscience as existing along a continuum. Features of pseudoscience include the following:

  1. The overuse of ad hoc hypotheses to account for negative research findings.

  2. Avoidance of peer review.

  3. Emphasis on confirmation rather than refutation.

  4. Lack of connection with basic or applied research.

  5. Overreliance on anecdotal evidence.

  6. Reversed burden of proof in which proponents of a technique demand that critics refute claims of treatment efficacy.

The greater the number of such features, the more likely a practice is based on pseudoscience rather than science.
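Because Lilienfeld et al treat the distinction as one of degree, the heuristic amounts to a tally of warning signs. The sketch below illustrates that logic; the three verbal bands are our own assumption for demonstration, as the authors propose no numeric thresholds.

```python
# Illustrative tally of the pseudoscience warning signs listed above.
# Treating the continuum as three rough bands is an assumption made
# for illustration; Lilienfeld et al give no numeric thresholds.

PSEUDOSCIENCE_FEATURES = [
    "ad hoc hypotheses excuse negative findings",
    "avoids peer review",
    "seeks confirmation rather than refutation",
    "disconnected from basic or applied research",
    "relies on anecdotal evidence",
    "reverses the burden of proof",
]

def appraise(flags: dict[str, bool]) -> str:
    """Count how many warning signs a practice exhibits and return
    a rough position on the science-pseudoscience continuum."""
    count = sum(flags.get(f, False) for f in PSEUDOSCIENCE_FEATURES)
    if count == 0:
        return "no warning signs flagged"
    if count <= 2:
        return f"{count} warning sign(s): warrants closer scrutiny"
    return f"{count} warning signs: likely closer to the pseudoscience pole"

# Example: a treatment promoted only through testimonials, with no
# peer-reviewed trials, is flagged on several features at once.
print(appraise({
    "avoids peer review": True,
    "relies on anecdotal evidence": True,
    "reverses the burden of proof": True,
}))
```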

Thought Field Therapy (TFT), a treatment applied to mood, anxiety, and trauma-related disorders, is a prime example of a practice founded on pseudoscience. TFT is based on the premise that bodily energy imbalances cause negative emotions. Treatment is purported to rectify imbalances by tapping on acupuncture meridians.28 Virtually no peer-reviewed research supports this treatment rationale. With only methodologically weak reports available in the literature, the so-called science cited to support TFT is primarily anecdotal and does not rule out placebo effects.29–31 Despite these criticisms, the TFT website32 continues to advance unsupported claims about TFT’s ability to cure almost any emotional problem.

Clinicians may find it difficult to resist the pull of anecdotal evidence, such as unsystematic clinical observations, in decision making. Dramatic changes in patient functioning following an intense session may appear to provide more compelling evidence of a treatment’s impact than do the results of RCTs or meta-analyses. Although anecdotal evidence can inform hypotheses to be evaluated systematically, such evidence should not be equated with scientific data. Without the controls afforded by scientific practices and scientific thinking, unsystematic clinical observations can lead to erroneous conclusions about the value of a clinical procedure. Lilienfeld et al33,34 enumerated more than 2 dozen cognitive errors that can contribute to spurious impressions of therapeutic effectiveness. These include naive realism (for example, “If I saw it, it must be true”), confirmation bias (looking only for evidence that supports a hypothesis and failing to look for evidence that refutes it), illusory correlation (perceiving 2 events or characteristics as linked merely because they co-occur), and illusory causation (inferring that one event causes another merely because they co-occur). Clearly, it is insufficient to think scientifically only when conducting research: one must think scientifically in all aspects of one’s professional activities, including clinical services.

Evidence-Based Treatment

Before the 1960s, few studies had evaluated psychotherapy outcomes; since the 1970s, thousands of studies have filled this gap. There are now evidence-based treatments for many mental health disorders in children, adolescents, and adults.9,12 However, less emphasis has been placed on the need to desist from delivering interventions that are inefficacious or harmful.

For example, media reports of natural and human-made tragedies almost inevitably end with the statement that counsellors will be available on site to assist. Given the woeful shortage of services for mental health problems, it is striking that such services are commonly offered.35 The rationale for postdisaster debriefing is intuitively appealing. If one could protect people from developing posttraumatic stress disorder by devoting a few hours to hearing their stories, it would be time well invested. Nevertheless, literature reviews do not support the efficacy of single or multiple sessions of debriefing and suggest that it can sometimes be harmful, as it may impede natural recovery processes.36,37

Another program based on popular belief is Scared Straight. Prison visits are designed to deter youth on a trajectory toward delinquency by arousing fear about the life they could expect if they engaged in crime.38 Fuelled by anecdotal reports suggesting high success rates, Scared Straight was widely implemented in the United States. Echoing the pattern found for debriefing, reviews of RCTs provided no evidence that Scared Straight or other juvenile awareness programs are efficacious; to the contrary, they boost the odds of future offending.39

An important distinction in treatment outcome research is that between efficacy and effectiveness.40 Efficacy refers to evidence of treatment effects obtained in controlled research, whereas effectiveness refers to evidence of treatment effects as evaluated in the real world. Skeptics of EBP argue that findings from efficacy trials, often conducted in academic training centres, have limited applicability to services delivered in the exigencies of real clinical practice, in which patients have multiple diagnoses and severe problems.15 Nevertheless, Jensen-Doss and Weisz41 found no evidence of poorer treatment effects in youth with multiple problems. Kazdin and Whitley42 found that comorbidity was associated with greater change in young people with disruptive behaviour disorders who received evidence-based parent training or problem-solving treatments. These findings suggest that evidence-based services are useful to people with multiple problems.

Using a benchmarking strategy, in which results of efficacy trials are used as a point of comparison, Hunsley and Lee43 found that effectiveness studies of evidence-based treatments for adults reported completion and improvement rates comparable to those obtained in efficacy trials. A meta-analysis of effectiveness studies of cognitive-behavioural therapy for adult anxiety disorders similarly found that the mean treatment outcome in over 50 studies was consistent with results from efficacy studies.44 Lee et al45 examined treatment effectiveness studies conducted in regular clinical settings for children and adolescents. Across studies, more than 75% of patients completed the services. Improvement rates for internalizing problems were comparable to those reported in efficacy trials. There was greater variability in outcomes for parenting interventions for disruptive behaviour problems, with several studies yielding superior results compared with the benchmark and a smaller number yielding poorer results. Overall, benchmarking studies indicate that evidence-based treatments for children and youth can be effective in routine practice settings.
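The benchmarking logic can be made concrete with a small sketch. All counts below are invented for illustration (they are not figures from the cited studies), and a two-proportion z-test is just one simple way to ask whether a clinic's observed improvement rate departs from an efficacy-trial benchmark; published benchmarking studies use a range of comparison methods.

```python
# Minimal sketch of a benchmarking comparison: does a clinic's
# improvement rate differ from an efficacy-trial benchmark?
# All counts below are made up for illustration.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical benchmark: 120 of 200 efficacy-trial patients improved (60%).
# Hypothetical clinic sample: 66 of 120 patients improved (55%).
z, p = two_proportion_z(66, 120, 120, 200)
print(f"z = {z:.2f}, p = {p:.3f}")
# A non-significant difference would be consistent with the treatment
# transporting from efficacy trials to routine care.
```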

Critics of EBP have raised concerns that the use of treatment manuals to guide services interferes with the therapeutic alliance. To test this hypothesis, Langer et al46 randomly assigned youth with internalizing disorders in community clinics to receive manualized services or nonmanualized usual care. Observer ratings revealed that early on in the process, youth receiving manualized services had a stronger therapeutic alliance than did youth receiving usual care. By the end of services, there were no significant group differences. Although this study requires replication, it is noteworthy that there was no evidence that manualized services impaired the therapeutic relationship; instead, these services were associated with a stronger early alliance.

One pressing question in the delivery of EBP concerns the identification of active treatment ingredients.47 Chorpita and Daleiden48 analyzed the practice elements in 322 RCTs of treatments for children and adolescents. They identified constellations of strategies consistently found to be helpful for specific problems. For example, for anxiety, treatments commonly included modules for exposure, relaxation, cognitive interventions, modelling, and psychoeducation. The authors recommended that clinicians individualize treatments by integrating the treatment modules that best address a client’s problems.

The usefulness of a modular approach to the treatment of depression, anxiety, and conduct problems in people aged 7 to 13 was examined by Weisz et al.49 Community clinicians were randomly assigned to 1 of the following: treatment as usual; an evidence-based treatment for anxiety, depression, or conduct problems; or a modularized approach in which clinicians flexibly delivered modules that integrated the 3 evidence-based approaches as needed. Modular treatment outperformed both usual care and a single problem-focused, evidence-based treatment, suggesting that this approach holds promise for treatments delivered to youth.
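The modular approach can be pictured as a lookup from presenting problems to practice elements, composed into an individualized plan. In the sketch below, the anxiety modules paraphrase those named earlier; the depression and conduct entries are assumptions for demonstration, not the contents of any specific protocol.

```python
# Illustrative mapping of presenting problems to treatment modules.
# The anxiety list paraphrases the elements named above; the other
# entries are assumed for illustration, not taken from any protocol.

MODULES = {
    "anxiety": ["psychoeducation", "exposure", "relaxation",
                "cognitive interventions", "modelling"],
    "depression": ["psychoeducation", "behavioural activation",
                   "cognitive interventions"],
    "conduct": ["psychoeducation", "parent training",
                "problem solving"],
}

def build_plan(problems: list[str]) -> list[str]:
    """Compose an individualized plan: pool the modules for each
    presenting problem, keeping each module once, in first-seen order."""
    plan: list[str] = []
    for problem in problems:
        for module in MODULES.get(problem, []):
            if module not in plan:
                plan.append(module)
    return plan

# A youth presenting with both anxiety and conduct problems receives
# the union of the two module sets, sequenced by the clinician.
print(build_plan(["anxiety", "conduct"]))
```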

How Can We Understand Persistence in Offering Services That Are Not Evidence Based?

From the treatment of scurvy to routine handwashing to prevent infection, the history of health care is replete with examples of long delays between the demonstration that a practice is efficacious and its adoption in routine care.50 Similarly, clinicians do not always abandon a mental health service that is ineffective or harmful. To understand why clinicians remain loyal to unsupported or iatrogenic practices and why they do not embrace efficacious practices, we must consider both individual and contextual variables. Individual-level variables include challenges in keeping up with the scientific literature, perceived difficulties in accessing training, and attitudes toward life-long learning in general and toward EBP in particular.51 Contextual variables include the culture of the profession,4 training in EBP, and institutional and professional support for EBP.52

Keeping up with the literature requires clinicians to formulate a question related to patient care, seek the best research evidence that addresses the question, critically appraise the validity and applicability of the research to the patient, apply the information, and systematically evaluate treatment effects.53 Further, it requires recognition that the evidence base is constantly evolving; thus, a practice that is currently the best available may later become obsolete or require modification. With the proliferation of efficacious treatments, psychiatrists may feel overwhelmed, believing that they require proficiency in numerous treatments to serve their patients’ needs. Clinicians who are convinced that their current practice is efficacious may not perceive a need to devote energy and time to achieving competence in new treatments. Clinicians may be most reluctant to consider new interventions that are grounded in a theoretical framework different from the one in which they were trained.

The culture of a profession can influence clinicians’ openness to EBP. The antithesis of EBP is practice based on tradition and authority (facetiously dubbed eminence-based practice54), in which recommendations are accepted because the person delivering them is regarded as an expert. Unless backed by high-quality evidence, the opinions of recognized experts are just that—opinions.55 An apprenticeship model of training provides fertile ground for eminence-based practice, in which the views of charismatic or authoritative supervisors are adopted, regardless of their evidence base. Recent discussion in The Canadian Journal of Psychiatry has examined the susceptibility of psychiatry to fads that arouse enthusiasm but are later abandoned.56 Some commentators (for example, see Goldberg57) have suggested that the tradition in psychiatry of convening expert panels to offer pronouncements about practice can be problematic. Goldberg warned that psychiatrists must be aware of the risks of adopting or abandoning practices on the basis of expert consensus rather than research evidence. The dialectic between the promotion of hope and certainty in the helping professions and the need for skepticism in science is especially acute in psychiatry.56

Responsibility also rests with researchers to ensure their findings are presented accurately. In a review of the American Association for the Advancement of Science’s online coverage of press releases in all areas of science, Yavchitz et al58 found an unwarranted emphasis on the benefits of experimental treatments (such as focusing only on findings showing a treatment effect) in 40% of journal abstracts, 47% of press releases, and 51% of news items based on press releases. In most instances, the spin in press releases and media reports originated in journal abstracts, suggesting that researchers themselves are responsible for at least some of the overselling of findings. Gonon et al59 examined media reports of influential studies on attention-deficit hyperactivity disorder. Most findings reported in newspapers during the 1990s were not replicated, with only 2 of the 10 most cited studies supported by later investigations. Only one newspaper article reported that the findings stemming from one of the most commonly mentioned studies were not replicated.

The practising psychiatrist faces a daunting task in offering EBP. The Draft CanMEDS 2015 Milestones Guide—May 201453 highlights that a physician must “integrate best available evidence, contextualized to specific situations, into real-time decision-making.”p 38, 42, 46 The foundation for critical appraisal skills is to be established in medical school, with demonstration of these skills expected by the time of transition to practice. The steps toward competence in formulating testable hypotheses about a patient, consulting the relevant scientific literature and appraising its relevance, and monitoring the ongoing usefulness of a practice are elaborated only in broad brushstrokes. Because the Guide offers few specific suggestions, it is unclear how training assists in the development of this competence. Nevertheless, psychiatrists require training that promotes a scientific approach to decision making, access to research syntheses, and an environment supportive of EBP.

Constructive Strategies to Promote Use of Evidence-Based Practices

Given the rapid evolution of the knowledge base, psychiatric residents must not only master current knowledge but also learn strategies for continual questioning and updating of knowledge in a process of life-long learning. Our recommendations focus on strategies that create a context favourable for EBP.

Promoting Use of Clinical Practice Guidelines

The Draft CanMEDS 2015 Milestones Guide—May 201453 specifies that by their transition to practice, physicians must be competent in identifying, selecting, and navigating pre-appraised evidence. Psychiatrists have access to many high-quality EBP databases, including the Canadian Psychiatric Association Clinical Practice Guidelines,60 American sites, such as the American Psychiatric Association Clinical Practice Guidelines,61 Evidence-Based Behavioral Practice,62 and the Society for Clinical Psychology’s site on psychological treatments,63 and sites based in the United Kingdom, including the National Institute for Health and Care Excellence.64

Despite the wealth of practice guidelines based on research syntheses, the extent to which clinicians access these resources is unknown. A survey65 found that psychologists were moderately knowledgeable about general research databases, such as PsycINFO and MEDLINE, but were much less knowledgeable about online databases that provide integrated information on evidence-based health and mental health services. Slightly less than one-half of respondents reported feeling competent with at least one evidence-based database. Clinicians will require training in the use of databases to allow for their ongoing integration into practice. We recommend that case presentations, which are routinely used for training and consultation, include information not only about the patient but also about how the clinician searched for and selected an intervention.66

Modelling of Scientific Reasoning in Clinical Decision Making

In the apprenticeship model of training in psychiatry, residents consolidate their didactic learning by applying knowledge in the clinical context. Peers and supervisors model how they generate hypotheses about patients, search the literature for evidence, and monitor the usefulness of their practice.4 The adoption of EBP requires a commitment to a humble stance in which testable hypotheses guide a search for the best evidence, and ongoing monitoring informs the usefulness of an intervention. In this model, the capacity to generate questions, search the literature, and critically appraise evidence is more important than is knowledge of the literature.

Training in Core Skills

An increasing number of online tools have been developed to assist in EBP training. The Psychotherapy Training e-Resources based at McMaster University are designed for training and continuing education of mental health professionals.67 The PracticeWise site68 offers a database on evidence-based services for children and youth, as well as practice guides and tools for treatment plans and tracking progress. Expertise in using scientifically sound assessment tools, to both develop treatment plans and monitor treatment progress, is essential for true EBP. Because no clinician can expect to master all evidence-based protocols for all disorders, there is growing interest in identifying core treatment skills that can be flexibly applied to different problems. Although there was a time when EBP was largely synonymous with cognitive-behavioural therapy, interventions from other theoretical frameworks (for example, interpersonal psychotherapy and short-term psychodynamic therapies) are also efficacious for certain conditions.69,70

Final Comment

The goal of psychiatric services is to provide the best treatment possible to people with mental disorders. Attending to the research evidence in planning these services and systematically monitoring their impact provides the optimal means of achieving this goal. For this goal to be realized, clinicians must recognize not only the value of disciplined scientific inquiry but also that clinical judgment should be continually informed by such inquiry. By developing the attitudes, values, and skills consistent with EBP principles, people in training to provide psychiatric services will be well placed to deliver the best services possible.

Acknowledgments

The authors have no conflicts of interest to report.

The Canadian Psychiatric Association proudly supports the In Review series by providing the authors with an honorarium.

Abbreviations

EBP: evidence-based practice
EST: empirically supported treatment
RCT: randomized controlled trial
RCPSC: Royal College of Physicians and Surgeons of Canada
TFT: Thought Field Therapy

References

1. World Health Organization (WHO). 2014 mental health atlas [Internet]. Geneva (CH): WHO; 2015 [cited 2015 Oct 6]. Available from: http://www.who.int/mental_health/evidence/atlas/mental_health_atlas_2014/en.
2. Canadian Psychiatric Association (CPA), Ontario Psychiatric Association (OPA). CPA and OPA welcome report on psychiatric supply and practice patterns [Internet]. Ottawa (ON): CPA; 2014 Jul [cited 2015 Sep 15]. Available from: http://www.cpa-apc.org/media.php?mid=2024.
3. Sackett DL, Rosenberg WM, Gray JA, et al. Evidence-based medicine: what it is and what it isn’t. Br Med J. 1996;312:71–72. doi: 10.1136/bmj.312.7023.71.
4. Agrawal S, Szatmari P, Hanson M. Teaching evidence-based psychiatry: aligning the formal and hidden curricula. Acad Psychiatry. 2008;32:470–474. doi: 10.1176/appi.ap.32.6.470.
5. Stam J, Vermeulen MJ. Eugen Bleuler (1857–1939), an early pioneer of evidence based medicine. J Neurol Neurosurg Psychiatry. 2013;84:594–595. doi: 10.1136/jnnp-2012-303715.
6. Chambless DL, Ollendick TH. Empirically supported psychological interventions: controversies and evidence. Annu Rev Psychol. 2001;52:685–716. doi: 10.1146/annurev.psych.52.1.685.
7. Roth A, Fonagy P. What works for whom? A critical review of psychotherapy research. New York (NY): Guilford Press; 1996.
8. Fonagy P, Target M, Cottrell D, et al. What works for whom? A critical review of treatments for children and adolescents. New York (NY): Guilford Press; 2002.
9. Fonagy P, Cottrell P, Phillips J, et al. What works for whom? A critical review of treatments for children and adolescents. 2nd ed. New York (NY): Guilford Press; 2014.
10. Nathan PE, Gorman JM. A guide to treatments that work. New York (NY): Oxford University Press; 1998.
11. Nathan PE, Gorman JM. A guide to treatments that work. 2nd ed. New York (NY): Oxford University Press; 2002.
12. Nathan PE, Gorman JM. A guide to treatments that work. 3rd ed. New York (NY): Oxford University Press; 2007.
13. Nathan PE, Gorman JM. A guide to treatments that work. 4th ed. New York (NY): Oxford University Press; 2015.
14. von Ranson KM, Wallace LM, Stevenson A. Psychotherapies provided for eating disorders by community clinicians: infrequent use of evidence-based treatment. Psychother Res. 2013;23:333–343. doi: 10.1080/10503307.2012.735377.
15. Weissman MM, Verdeli H, Gameroff MJ, et al. National survey of psychotherapy training in psychiatry, psychology, and social work. Arch Gen Psychiatry. 2006;63:925–934. doi: 10.1001/archpsyc.63.8.925.
16. Westen D, Novotny CM, Thompson-Brenner H. The empirical status of empirically supported psychotherapies: assumptions, findings, and reporting in controlled clinical trials. Psychol Bull. 2004;130:631–663. doi: 10.1037/0033-2909.130.4.631.
17. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington (DC): National Academy Press; 2002.
18. American Psychological Association Presidential Task Force on Evidence-Based Practice. Evidence-based practice in psychology. Am Psychol. 2006;61:271–285. doi: 10.1037/0003-066X.61.4.271.
19. Stuart RB, Lilienfeld SO. The evidence missing from evidence-based practice. Am Psychol. 2007;62:613–614. doi: 10.1037/0003-066X.62.6.615.
20. Dozois DJA, Mikail S, Alden LE, et al. The CPA presidential task force on evidence-based practice of psychological treatments. Can Psychol. 2014;55:153–160.
21. Lambert MJ, Shimokawa K. Collecting client feedback. Psychotherapy (Chic). 2011;48:72–79. doi: 10.1037/a0022238.
22. Bhugra D, Easter A, Mallaris Y, et al. Clinical decision making in psychiatry by psychiatrists. Acta Psychiatr Scand. 2011;124:403–411. doi: 10.1111/j.1600-0447.2011.01737.x.
23. Ionita G, Fitzpatrick M. Bringing science to clinical practice: a Canadian survey of psychological practice and usage of progress monitoring measures. Can Psychol. 2014;55:187–196.
24. Antony MM, Barlow DH. Handbook of assessment and treatment planning for psychological disorders. 2nd ed. New York (NY): Guilford Press; 2010.
25. Hunsley J, Mash EJ. A guide to assessments that work. New York (NY): Oxford University Press; 2008.
26. Patient Reported Outcomes Measurement Information System (PROMIS) Network. Dynamic tools to measure health outcomes from the patient perspective [Internet]. Chapel Hill (NC): PROMIS; 2011 [cited 2015 Aug 1]. Available from: http://www.nihpromis.org/about/abouthome. National Institutes of Health–funded.
27. Lilienfeld SO, Lynn SJ, Lohr JM. Science and pseudoscience in clinical psychology: initial thoughts, reflections, and considerations. In: Lilienfeld SO, Lynn SJ, Lohr JM, editors. Science and pseudoscience in clinical psychology. 2nd ed. New York (NY): Guilford Press; 2015. p. 1–16.
28. Callahan R. The impact of thought field therapy on heart rate variability. J Clin Psychol. 2001;57:1153–1170. doi: 10.1002/jclp.1082.
29. Gaudiano BA, Herbert JD. Can we really tap our problems away? A critical analysis of thought field therapy. The Skeptical Inquirer. 2000;24:29–33, 36.
30. Lohr JM, Hooke W, Gist R, et al. Novel and controversial treatments for trauma-related and stress disorders. In: Lilienfeld SO, Lynn SJ, Lohr JM, editors. Science and pseudoscience in clinical psychology. New York (NY): Guilford Press; 2003. p. 243–272.
31. Pignotti M. Callahan fails to meet the burden of proof for thought field therapy claims. J Clin Psychol. 2005;61:251–255. doi: 10.1002/jclp.20053.
32. Callahan Techniques, Ltd. Callahan Techniques: thought field therapy [Internet]. La Quinta (CA): Callahan Techniques, Ltd; 2015 [cited 2015 Aug 1]. Available from: http://www.rogercallahan.com/index.php.
33. Lilienfeld SO, Ritschel LA, Lynn SJ, et al. Why many clinical psychologists are resistant to evidence-based practice: root causes and constructive remedies. Clin Psychol Rev. 2013;33:883–890. doi: 10.1016/j.cpr.2012.09.008.
34. Lilienfeld SO, Ritschel LA, Lynn SJ, et al. Why ineffective psychotherapies appear to work: a taxonomy of causes of spurious therapeutic effectiveness. Perspect Psychol Sci. 2014;9:355–387. doi: 10.1177/1745691614535216.
35. North CS, Pfefferbaum B. Mental health response to community disasters: a systematic review. JAMA. 2013;310:507–518. doi: 10.1001/jama.2013.107799.
36. Rose S, Bisson J, Churchill R, et al. Psychological debriefing for preventing posttraumatic stress disorder (PTSD). Cochrane Database Syst Rev. 2002;(2):CD000560. doi: 10.1002/14651858.CD000560.
37. Roberts NP, Kitchiner NJ, Kenardy J, et al. Multiple session early psychological interventions for prevention of post-traumatic stress disorder. Cochrane Database Syst Rev. 2009;(3):CD006869. doi: 10.1002/14651858.CD006869.pub2.
38. Petrosino A, Turpin-Petrosino C, Buehler J. Scared Straight and other juvenile awareness programs for preventing juvenile delinquency. Sci Rev Ment Health Pract. 2005;4:48–54. doi: 10.1002/14651858.CD002796.
39. Lilienfeld SO. Psychological treatments that cause harm. Perspect Psychol Sci. 2007;2:53–70. doi: 10.1111/j.1745-6916.2007.00029.x.
40. Seligman MEP. The effectiveness of psychotherapy: the Consumer Reports study. Am Psychol. 1995;50:965–974. doi: 10.1037/0003-066x.50.12.965.
41. Jensen-Doss A, Weisz JR. Syndrome co-occurrence and treatment outcomes in youth mental health clinics. J Consult Clin Psychol. 2006;74:416–425. doi: 10.1037/0022-006X.74.3.416.
42. Kazdin AE, Whitley MK. Comorbidity, case complexity, and effects of evidence-based treatments for children referred for disruptive behavior. J Consult Clin Psychol. 2006;74:455–467. doi: 10.1037/0022-006X.74.3.455.
43. Hunsley J, Lee CM. Research-informed benchmarks for psychological treatments: efficacy studies, effectiveness studies, and beyond. Prof Psychol Res Pr. 2007;38:21–33.
44. Stewart RE, Chambless DL. Cognitive-behavioral therapy for adult anxiety disorders in clinical practice: a meta-analysis of effectiveness studies. J Consult Clin Psychol. 2009;77:595–606. doi: 10.1037/a0016032.
45. Lee CM, Horvath C, Hunsley J. Does it work in the real world? Effectiveness studies of treatments for psychosocial problems in children and adolescents. Prof Psychol Res Pr. 2013;44:81–88.
46. Langer DA, McLeod BD, Weisz JR. Do treatment manuals undermine youth–therapist alliance in community clinical practice? J Consult Clin Psychol. 2011;79:427–432. doi: 10.1037/a0023821.
47. Kazdin AE. Psychotherapy for children and adolescents. Annu Rev Psychol. 2003;54:253–276. doi: 10.1146/annurev.psych.54.101601.145105.
48. Chorpita BF, Daleiden EL. Mapping evidence-based treatments for children and adolescents: application of the distillation and matching model to 615 treatments from 322 randomized trials. J Consult Clin Psychol. 2009;77:566–579. doi: 10.1037/a0014565.
49. Weisz JR, Chorpita BF, Palinkas LA, et al. Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth. Arch Gen Psychiatry. 2012;69:274–282. doi: 10.1001/archgenpsychiatry.2011.147.
50. Glouberman S. Knowledge transfer and the complex story of scurvy. J Eval Clin Pract. 2009;15:553–557. doi: 10.1111/j.1365-2753.2009.01165.x.
51. Gallo KP, Barlow DH. Factors involved in clinician adoption and nonadoption of evidence-based interventions in mental health. Clin Psychol Sci Pr. 2012;19:93–106.
52. Beidas RS, Kendall PC. Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol Sci Pr. 2010;17:1–30. doi: 10.1111/j.1468-2850.2009.01187.x.
53. Frank JR, Snell LS, Sherbino J, et al. Draft CanMEDS 2015 milestones guide—May 2014. Ottawa (ON): Royal College of Physicians and Surgeons of Canada; 2014 May.
54. Isaacs D, Fitzgerald D. Seven alternatives to evidence-based medicine. BMJ. 1999;319:1618. doi: 10.1136/bmj.319.7225.1618.
55. Mullen EJ, Streiner DL. The evidence for and against evidence-based practice. Brief Treat Crisis Interv. 2004;4:111–121.
56. Paris J. Why is psychiatry prone to fads? Can J Psychiatry. 2013;58:560–565. doi: 10.1177/070674371305801004.
57. Goldberg D. Fads in psychiatry. Can J Psychiatry. 2013;58:553–554. doi: 10.1177/070674371305801001.
58. Yavchitz A, Boutron I, Bafeta A, et al. Misrepresentation of randomized controlled trials in press releases and news coverage: a cohort study. PLoS Med. 2012;9(9):e1001308. doi: 10.1371/journal.pmed.1001308.
59. Gonon F, Konsman J-P, Cohen D, et al. Why most biomedical findings echoed by newspapers turn out to be false: the case of attention deficit hyperactivity disorder. PLoS ONE. 2012;7(9):e44275. doi: 10.1371/journal.pone.0044275.
60. Canadian Psychiatric Association (CPA). Clinical practice guidelines [Internet]. Ottawa (ON): CPA; 2001–2012 [cited 2015 Aug 1]. Available from: http://publications.cpa-apc.org/browse/documents/67.
61. American Psychiatric Association (APA). Clinical practice guidelines [Internet]. Arlington (VA): APA; 2015 [cited 2015 Oct 6]. Available from: http://www.psychiatry.org/psychiatrists/practice/clinical-practice-guidelines.
62. EBBP: Evidence-Based Behavioral Practice [Internet]. Chicago (IL): Bonnie Spring, Northwestern University; 2007 [cited 2015 Oct 6]. Available from: http://www.ebbp.org.
63. Division 12: Society for Clinical Psychology. Research-supported psychological treatments [Internet]. Atlanta (GA): Division 12 of the American Psychological Association; 2013 [cited 2014 Oct 1]. Available from: http://www.div12.org/psychological-treatments.
64. National Institute for Health and Care Excellence (NICE). Guidance list [Internet]. London (GB): NICE; 2014 [cited 2015 Oct 6]. Available from: http://www.nice.org.uk/guidance/published?type=cg.
65. Berke DM, Rozell CA, Hogan TP, et al. What clinical psychologists know about evidence-based practice: familiarity with online resources and research methods. J Clin Psychol. 2011;67:329–339. doi: 10.1002/jclp.20775.
66. Edmunds JM, Beidas RS, Kendall PC. Dissemination and implementation of evidence-based practices: training and consultation as implementation strategies. Clin Psychol Sci Pr. 2013;20:152–165. doi: 10.1111/cpsp.12031.
67. McMaster University. Psychotherapy training e-resources [Internet]. Hamilton (ON): McMaster University; 2015 [cited 2015 Aug 1]. Available from: https://pter.mcmaster.ca.
68. PracticeWise. What works in children’s mental health [Internet]. Satellite Beach (FL): PracticeWise; 2005–2015 [cited 2014 Oct 1]. Available from: https://www.practicewise.com.
