Psychiatry, Psychology, and Law
2019 Aug 13;26(5):753–765. doi: 10.1080/13218719.2019.1618755

Science or pseudoscience? A distinction that matters for police officers, lawyers and judges

Louise Marie Jupe, Vincent Denault
PMCID: PMC6896483  PMID: 31984109

Abstract

Scientific knowledge has been a significant contributor to the development of better practices within law enforcement agencies. However, some alleged ‘experts’ have been shown to have disseminated information to police officers, lawyers and judges that is neither empirically tested nor supported by scientific theory. The aim of this article is to provide organisations within the justice system with an overview of a) what science is and is not; b) what constitutes an empirically driven, theoretically founded, peer-reviewed approach; and c) how to distinguish science from pseudoscience. Using examples in relation to non-verbal communication, this article aims to demonstrate that not all information which is presented as comprehensively evaluated is methodologically reliable for use in the justice system.

Key words: pseudoscience, investigative interviews, trials, justice system, non-verbal communication

Introduction

Scientific knowledge has been a significant contributor to the development of better practices within law enforcement agencies. Academics often collaborate with various agencies to evaluate and advise upon empirically or theoretically supported approaches which aid the pursuit of justice (e.g. the High-Value Detainee Interrogation Group [HIG]; see FBI, 2018). Academics are often called upon to provide expert testimony within criminal trials (Brodsky, 2013) and can be key advisors to both prosecutors and defence lawyers.

However, some alleged ‘experts’ have been shown to have disseminated ambiguous information to members of the justice community – information that is neither empirically tested nor supported by scientific theory (see Lilienfeld & Landfield, 2008). Because of a lack of both scientific literacy (Faigman, 2006; Moreno, 2003; Redding, Floyd, & Hawk, 2001; Tadei, Finnilä, Reite, Antfolk, & Santtila, 2016) and clear guidance on how to differentiate between what science is and what it is not, individuals within the justice system may not have the knowledge required to identify questionable information, or even pseudoscience. Pseudoscience has been referred to as the ‘romanticisation’ of science, and is often based on little more than myths and legends (Allchin, 2004):

Pseudoscience is necessarily defined by its relation to science and typically involves subjects that are either on the margins or borderlands of science and are not yet proven, or have been disproven, or make claims that sound scientific but in fact have no relationship to science. (Shermer, 2013, p. 203)

Therefore, understanding what pseudoscience is – and how to distinguish it from science – is crucial both in evaluating the approaches used by so-called experts and in developing better professional practice. This is an important undertaking, as the use of and reliance upon pseudoscience by members of the justice community can result in adverse human, social and economic consequences (e.g., Denault & Jupe, 2017; Kageleiry, 2007; Lilienfeld & Landfield, 2008; Makgoba, 2002; White, 2014). However, despite the rising popularity of pseudoscience (Heller, 2017), the justice system does not have a standardised advisory system which informs police officers, lawyers and judges about how to differentiate science from pseudoscience. The aim of this article therefore is to provide organisations within the justice system with an overview of a) what science is and is not; b) what constitutes an empirically driven, theoretically founded, peer-reviewed approach; and c) how to distinguish science from pseudoscience. Using examples in relation to non-verbal communication, this article aims to demonstrate that not everything which is presented as comprehensively evaluated is methodologically reliable for use in the justice system.

How to distinguish ‘nonscience’ from science

Philosophers nowadays recognize that there is no sharp line dividing sense from nonsense, and moreover that doctrines starting out in one camp may over time evolve into the other. For example, alchemy was a (somewhat) legitimate science in the times of Newton and Boyle, but it is now firmly pseudoscientific (movements in the opposite direction, from full-blown pseudoscience to genuine science, are notably rare). (Pigliucci & Boudry, 2013a)

Whilst distinguishing pseudoscience from science can be difficult, there are methods to aid in making such distinctions. Several indicators which are suggestive of ‘nonscience’ have been proposed, such as lack of falsifiability, misuse of scientific vocabulary, absence of connectivity, extravagant claims, argument from authority and lack of self-correction (Damer, 2013; Denault, 2015; Lilienfeld & Landfield, 2008). However, whilst such cautionary advice may entice some individuals to raise questions regarding ambiguous approaches, in practice pseudoscientists can often provide what appear to be convincing counterarguments. Synergology is a good example of this, as expounded upon below.

The pseudoscience of synergology

Synergology is a self-proclaimed ‘scientific discipline for reading body language’ (Synergology, 2019b, authors’ translation). The proponents of synergology claim that the approach allows the extrapolation of specific states of mind from non-verbal behaviour, supposedly based upon the rigorous analysis of thousands of videos (e.g. Axelrad, 2012; Jarry, 2016, 2018; Moody, 2014; Turchet, 2012). Training sessions in synergology are offered to health, education, justice and security professionals by synergologists (individuals who have received at least 200 hours of synergology training; see Synergology, 2019a). Recently, synergology has been marketed as a discipline that can help in preventing terrorist attacks:

Whether used for crowd monitoring, interrogation, videos analysis or through surveillance cameras, the observation of a suspect or of an interaction between people could prevent terrorist attacks, manage a crisis and more. Synergology’s analysis of non-verbal behaviour is a logical complement to the important work of the various security officers in reading a threat. (Gagnon, 2018)

According to the founder of synergology, ‘[e]motions hold a fundamental place in our lives as human beings. They are at the root of all our decisions, yet they are, paradoxically ignored by mainstream science’ (Turchet, 2012, p. 17). Synergology thus purports to offer a way to understand emotions: ‘Scratching the body or the face is an expression of repressed emotions’ (Turchet, 2012, p. 150). For example, according to the founder of synergology,

[t]he joints give flexibility to the body. The brain moves the hands there each time that the ability to be or an interest in being flexible is questioned […] When someone scratches the inside of the left elbow, the need to become flexible, to change the rhythm of the relationship, is tackled. (Turchet, 2012, pp. 182–183)

Synergology associates such meanings with all parts of the face and body. Claims specific to synergology do not appear to have been subjected to peer review or replication – yet when synergology is in the spotlight, its proponents are fairly proficient at providing counterarguments.

Regarding the lack of falsifiability, the proponents of synergology assert that claims specific to their approach are falsifiable and therefore scientific (Quebec Association of Synergology, 2019). This misuse of scientific vocabulary will not be apparent to those members of the justice community who do not understand what falsifiability is. Falsifiability refers to the potential for a theory or hypothesis to be shown to be false through contradictory evidence or observation, and is an essential component of the scientific method. If the theory or hypothesis is not thoroughly justified, falsifiability cannot be ascertained – and if it does not allow for testable predictions, it is not falsifiable (Popper, 1968).
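
In schematic terms (a standard logical rendering, not one offered by the proponents of synergology), Popper's falsification test follows the deductively valid argument form known as modus tollens. If a theory $T$ entails an observable prediction $O$, and careful observation establishes that $O$ does not occur, then $T$ must be rejected:

$$(T \rightarrow O) \land \neg O \;\vdash\; \neg T$$

A claim that entails no such prediction $O$ offers nothing that observation could ever contradict, and is therefore unfalsifiable in Popper's sense.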

However, claims specific to synergology are not published in peer-reviewed papers. According to Philippe Turchet, the founder of synergology, as well as other synergologists, ‘a synergologist has no peer’ (Jarry, 2016) – i.e. nobody but a synergologist can criticise synergology. Therefore, their assertion regarding falsifiability is misleading – more so considering that falsifiability as a demarcation criterion is still debated (Pigliucci & Boudry, 2013b) and that pseudoscientists can offer falsifiable claims (e.g. graphologists; see Lilienfeld & Landfield, 2008). The misuse is even more apparent considering that the founder of synergology has also stated that ‘what we absolutely do not believe in within synergology is experiment, because body language is made in such a way that when we participate in an experiment, it does not work’ (European Institute of Synergology, 2015, authors’ translation) and argued that ‘you can’t use replication when dealing with humans’ (Jarry, 2016).

Regarding the absence of connectivity, pseudoscientists can combine their claims with common sense and scientific assertions, which may suggest that their theories or hypotheses were not developed in isolation (e.g. Denault & Jupe, 2017). For example, proponents of synergology regularly assert that one should not jump to conclusions whilst observing others (e.g. Moody, 2014), something that has been advised by academics in the past (e.g. Ekman, 1992). However, they subsequently make extravagant claims, such as that the holding of your right hand with your left hand ‘indicates a control of the speech, a filtering of the words used and the rationalization of the emotion’ (Gagnon, 2018). Several other extravagant claims are made – for example: ‘Our methods permit the detection of 80 percent of lies in this test called “guilty/innocent” […] The success rate is 90 percent when people work in a group’ (Turchet, 2012, p. 322).

However, if judicial officers do not understand the science around the topic within the training sessions they receive, these common sense and scientific assertions will likely allay suspicions of extravagant claims – more so if pseudoscientists describe their field as a scientific discipline. This is even more likely if pseudoscientists refer to important practitioners and organisations to which they have also provided training (Denault, Larivée, Plouffe, & Plusquellec, 2015). Although such an argument from authority should raise questions, it can also be quite persuasive to naïve observers. Ultimately, if pseudoscientists are challenged by academics regarding their arguments from authority, their retaliatory statements can imply that the academics are themselves using arguments from authority when they refer to peer-reviewed papers, and that their criticism is therefore unfounded – notably because the pseudoscientists’ approach has allegedly evolved, thus counteracting a supposed lack of self-correction (Denault, 2019).

In view of the foregoing, for dubious claims to be rejected by members of the judiciary there must first be an understanding of what constitutes a peer-reviewed paper. This must include an understanding of the publishing process; that is, how academics empirically and/or theoretically study a specific aspect within the justice system and then have their work scrutinised by members of the scientific community both before and after it is published in a scientific journal (Ware, 2008).

For example, after academics have run an empirical study, a manuscript is prepared that reports on all aspects of their research process. The academics thoroughly explain in writing how they conducted their study, the results they obtained, how they carried out the analysis of their results, their conclusions and also any limitations of their analysis (Shipman, 2014). The reason work is so rigorously reported is to allow readers to draw sound evaluative conclusions from a manuscript while considering any pitfalls, and to allow for replication; that is, can another researcher take the manuscript and run the same study again to either support or contest the original findings? This is a critical part of the publishing process within psychological science (Asendorpf et al., 2013; Lindsay, 2015).

Once completed, the manuscript is submitted to a scientific journal – also known as a peer-reviewed journal – and subjected to a critical review from experts on the subject of the manuscript. Following the critical review, the manuscript may be rejected, the authors may be asked to revise and resubmit with major or minor amendments, or it may be accepted for publication. Once a manuscript is accepted, it is then published in accordance with the journal’s specifications and becomes a peer-reviewed paper. This process has been adapted over the years to meet the ever-growing need for stringent evaluation (Spier, 2002).

It should be noted, however, that the publication process is lengthy at best. Scientific journals can take three to six months to undertake an initial review of a manuscript, and then – depending on subsequent requests – a further three to nine months for amendments and a further three to six months for publication. These are approximate figures, which vary within disciplines and from one scientific journal to another. Rejection rates also differ between scientific journals. American Psychological Association (APA) journals have an average rejection rate of 70% – for example, Psychological Review has a rejection rate of 86% and Professional Psychology: Research and Practice has a rejection rate of 56% (American Psychological Association, 2018).

There are of course instances when manuscripts are not reviewed by individuals with the specific credentials required (Elmore, 2017), and weakly founded articles do occasionally slip through the net and into publication – but such instances will likely decline with the recent transition to more open and transparent science (Open Science Collaboration, 2012). This transition allows other academics worldwide to evaluate peer-reviewed papers critically and provide commentary – and in cases of serious misrepresentation, papers may be retracted. The humanities, however, have been shown to be among the most stringent disciplines in terms of the peer-review process (Huisman & Smits, 2017). Therefore, if judicial officers understand what peer-reviewed papers are, they will more frequently reject approaches which are claimed to be ‘rigorous’ or ‘scientifically founded’ but in fact have never been subject to the process of critical appraisal.

From the field to the laboratory and back to the field

To appreciate the value of knowledge in peer-reviewed papers, one should also understand what comes prior to the publication process. Research by academics often starts in the laboratory using willing participants, and then moves into the field; that is, the techniques which are found to have solid empirical support in the laboratory are then evaluated within the justice system itself. This approach often stems from previously identified problems. A prime example of this is the cognitive interview (CI). After the RAND Corporation evaluated the criminal investigation process, it became clear that the testimony given by witnesses is key in the collection and evaluation of evidence (Greenwood & Petersilia, 1975). From this, Geiselman, Fisher, MacKinnon, and Holland (1986) developed the CI as a means of increasing the accuracy of eyewitness memory. After a series of initial laboratory studies, the CI was then tested in the field (Fisher, Geiselman, & Amador, 1989) and has since become standard practice within police investigations, while continuing to be developed, including within the field of deception detection (Dodson, Powers, & Lytell, 2015; Frosina et al., 2018; Sooniste, Granhag, Strömwall, & Vrij, 2015). Further examples of laboratory-to-field approaches include treatment programmes for sex offenders (Brown, 2005; Friendship, Mann, & Beech, 2003) and violent offenders (Serin, Gobeil, & Preston, 2009).

However, standard practices that are used during police investigations can lack empirical evidence; that is, despite being examined under the empirical magnifying glass, the results may not have been replicated and the findings may not be supported. One primary example of this is the behavioural analysis interview (BAI), which claims to be similar to the CI but uses behaviour-provoking questions to try to elicit specific behavioural indications of a suspect’s guilt or innocence (Inbau, Reid, Buckley, & Jayne, 2013). The BAI relies heavily on non-verbal and (para)linguistic cues that deception research has shown to be unreliable (DePaulo et al., 2003). In addition, as noted by Vrij, Hope, and Fisher (2014), a field study often cited in defence of the BAI was only able to establish ground truth in 2 of the 60 cases that were examined (Horvath, Jayne, & Buckley, 1994). Moreover, empirical studies have found the opposite of what the BAI claims to elicit from interviewees (Vrij, Mann, & Fisher, 2006), and the BAI has also been shown to be based upon little more than ‘common sense’ assumptions (Masip, Barba, & Herrero, 2012). A further example of a tool lacking empirical evidence referred to by Vrij et al. (2014) is that of ‘micro-expressions’, which are commonly used amongst practitioners within the detection of deception. Whilst micro-expressions were first introduced by Ekman (1992) as a symptomatic indication of ‘leakage’, there is no evidence that micro-expressions are valid cues for detecting deception in real time (Honts, Hartwig, Kleinman, & Meissner, 2009) or that they occur often (Porter & ten Brinke, 2008). Furthermore, a reliance on non-verbal behaviour could be detrimental to one’s ability to detect lies (Bond & DePaulo, 2006).

The ‘anything goes’ nature of pseudoscience

Organisations within the justice system do use empirically and theoretically supported approaches (e.g. Leone, 2015; Memon, Meissner, & Fraser, 2010). However, some implemented approaches lack empirical evidence. In more perturbing cases, police officers, lawyers and judges may resort to pseudoscience – that is, bodies of information that may appear to be scientific but, in reality, lack the characteristics of scientific knowledge (Lilienfeld, Lynn, & Lohr, 2014). As mentioned above, if members of the justice community are not advised about the publishing process, the counterarguments that pseudoscientists are often proficient at providing will be all the more persuasive. In addition, pseudoscientists can use several other fallacious arguments to achieve maximum support for their approaches.

For example, pseudoscientists might argue that their approaches are supported by a select number of articles, theses or books, and that they are reliable due to their acceptance by important organisations (Denault, Larivée, Plouffe, & Plusquellec, 2015). However, if upon reading such literature it becomes apparent that there is no empirical or theoretical support, or that the steps leading to the conclusions are not thoroughly justified (be this methodologically or through evaluation), the implementation of their approaches remains without sound foundation. In addition, such reference to important organisations – often known as ‘name-dropping’ – proves nothing about the validity of an approach; an organisation’s endorsement may simply suggest that it was unable to distinguish pseudoscience from science and did not understand the role that science plays in developing better professional practice.

Pseudoscientists can also respond to negative comments in ways that attempt to discourage further criticism from members of the scientific community. They can engage in legal threats (Jarry, 2019) and ad hominem attacks – that is, opposition to an argument ‘by questioning the personal circumstances or personal trustworthiness of the arguer who advanced it’ (Walton, 1987, p. 317). For example, if academics raise concerns regarding a particular pseudoscience without having attended its associated seminars, pseudoscientists might assert that the academics do not have the required understanding and that, as such, their criticism is of no value. If the academics had indeed attended the seminars, the pseudoscientists might instead suggest that their concerns are raised for obscure or malicious reasons (Denault, 2018; Larivée, 2014; Shermer, 2002). Pseudoscientists might even state that they are criticised due to their revolutionary approach and refer to a quote dubiously attributed to the German philosopher Arthur Schopenhauer: ‘All truth passes through three stages. First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as self-evident’ (Shallit, 2005).

However, as Sagan rightly points out,

the fact that some geniuses were laughed at does not imply that all who are laughed at are geniuses. They laughed at Columbus, they laughed at Fulton, they laughed at the Wright brothers. But they also laughed at Bozo the Clown. (Sagan, 1979, p. 75)

Unfortunately, if organisations within the justice system encounter and use pseudoscientific approaches, the above fallacious arguments can still be persuasive as counterarguments to criticism (Blancke, Boudry, & Pigliucci, 2017).

Although it might be comfortable to blame organisations within the justice system that resort to pseudoscience, such a conclusion is far too simplistic. There is no clear guidance available to police officers, lawyers and judges on how to distinguish empirically driven, theoretically founded, peer-reviewed approaches from ambiguous ones. Moreover, if organisations within the justice system do not have access to empirically and/or theoretically supported approaches and are turning to what is easily accessible, part of the responsibility lies with academics (Colwell, Miller, Miller, & Lyons, 2006; Denault et al., 2019). If academics have not adequately disseminated scientific knowledge and developed clear guidance on how to recognise what is and what is not science, it should not be surprising when ‘nonscience’ finds its way to members of the justice community – more so considering the large body of questionable information on forensic science being broadcast by popular media streams.

Whilst an influx of evidence-based practices within medicine has infiltrated popular media, and thus the mainstream (Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996), the same is not true for justice practices. In fact, television programmes have contributed to distorted knowledge amongst the public, such as the ‘CSI effect’ (Byers & Johnson, 2009; Schweitzer & Saks, 2007), which is commonly described as fostering arbitrary beliefs amongst jurors regarding forensic evidence:

Prosecutors, judges and police officers have noted what they believe to be a so-called CSI effect whereby the popular television forensics programs have led jurors to have unreasonable expectations for the quality and quantity of physical evidence. (Houck, 2006, p. 86)

In addition, evidence suggests that watching the popular television series Lie to Me actually decreases individuals’ ability to detect deception (Levine, Serota, & Shulman, 2010). Lie to Me is heavily built upon the concept of using micro-expressions as a tool for detecting deception, which has little or no empirical support within the scientific literature (e.g. Vrij et al., 2017). Whilst such examples are not directly related to pseudoscience per se, they exemplify the ease with which questionable information is able to have a wide and unfavourable effect on an audience.

Science or pseudoscience? A working example of non-verbal communication

Before a particular approach is presented to members of the justice community, an initial assessment should be required as to whether the concepts disseminated are in fact empirically driven, theoretically founded and peer-reviewed. If the evidence which supports the approach does not meet these requirements, or if the approach has the potential to cause serious harm (e.g. Denault et al., 2019), questions should arise over its place within the justice system. However, counterarguments to criticism and fallacious arguments can be compelling. Pseudoscience can seem logical and appear to be adequately supported; very plausible and comprehensible statements can stem from pseudoscience, whilst the most incomprehensible or confusing statements may be of high scientific significance:

Thus a statement may be pseudoscientific even if it is eminently ‘plausible’ and everybody believes in it, and it may be scientifically valuable even if it is unbelievable and nobody believes in it. A theory may even be of supreme scientific value even if no one understands it, let alone believes in it. (Lakatos, 1980, p. 1)

Therefore, until such an assessment has been made, police officers, lawyers and judges should be advised to refrain from too readily concluding that a particular approach is scientifically valuable. This call for caution is all the more important considering that pseudoscientists can combine their claims with common sense and scientific assertions (e.g. Denault & Jupe, 2017).

When ambiguous approaches find their way into the justice system, can this result in dire consequences?

In the following example of a training session offered to judicial officers by a so-called expert, the approach asserts that different facial expressions and gestures are associated with particular states of mind, irrespective of the fact that there are no peer-reviewed papers to support such associations. Even if some of the pseudoscientist’s claims appear to be extraordinary, any concern can be allayed by reasonable underlying principles; thus, a combination of scientific and pseudoscientific assertions gives the impression that the approach is grounded in science. For example, the so-called expert might assert that no single facial expression or gesture, such as Pinocchio’s nose, gives away lies – an empirically supported assumption (Vrij, 2008) – and that one should look for a combination of non-verbal cues and ask further questions to substantiate initial observations before drawing definitive conclusions as to whether or not someone is lying. However, whilst this advice may appear to be empirically driven, theoretically founded and peer-reviewed, evidence suggests that it is not (e.g. DePaulo et al., 2003; Hartwig & Bond, 2011; Vrij, 2008).

Since the 1960s, thousands of peer-reviewed papers have addressed the issue of non-verbal communication (Burgoon, Guerrero, & Floyd, 2016; Knapp, Hall, & Horgan, 2014; Moore, Hickson, & Stacks, 2014). The overall scientific consensus is that there is no cue akin to Pinocchio’s nose when it comes to detecting deception (Vrij, 2008). Of the cues that have been shown to be associated with deception, the correlations are often weak (DePaulo et al., 2003). Therefore, the advice to look for a combination of non-verbal cues in face-to-face interactions, and to ask further questions, can be inadequate and, more importantly, unsafe during investigative interviews and trials.
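
To illustrate how little a weak cue conveys (using a hypothetical value for illustration, not one drawn from the cited meta-analysis), the proportion of variance in veracity accounted for by a cue is the square of its correlation with deception:

$$r = .10 \quad \Rightarrow \quad r^2 = .01$$

that is, a cue correlating with deception at $r = .10$ accounts for only 1% of the variance in whether someone is lying, leaving the remaining 99% unexplained.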

For example, the pseudoscientist may assert that hiding one’s hands, scratching one’s nose, lowering one’s head, closing one’s mouth and looking in specific directions are non-verbal indicators of deceit (Denault, 2015). However, many of these indicators stem from stereotypical beliefs regarding deceptive behaviours (Bogaard, Meijer, Vrij, & Merckelbach, 2016; Global Deception Research Team, 2006). In fact, research suggests that indirect methods are more likely to result in higher accuracy rates when making deception judgements (ten Brinke, Stimson, & Carney, 2014; cf. Bond, Levine, & Hartwig, 2015). Furthermore, some individuals, whilst achieving quite high accuracy rates when making deception judgements, mention indicators that are not present during the interview they have observed (Jupe, Akehurst, Vernham, & Allen, 2016). Therefore, considering that there is no conclusive scientific evidence for the above indicators and that non-verbal indicators of deception are generally faint and invalid, decisions which are made by judicial officers through looking for a combination of non-verbal cues not supported by peer-reviewed evidence are likely to be inaccurate (Denault & Jupe, 2017; DePaulo et al., 2003; Otgaar & Howe, 2017).

In addition, if members of the justice community ask further questions to substantiate their initial veracity judgements, they could unknowingly steer their interaction towards confirming their belief that a witness, or a suspect, is lying. This is known as confirmation bias; that is, ‘the seeking or interpreting of evidence in ways that are partial to existing beliefs, expectations, or a hypothesis in hand’ (Nickerson, 1998, p. 175).

Confirmation bias often results in guilt-presumptive questioning during investigative interviews; when such interviews are listened to by independent evaluators, this frequently produces a self-fulfilling bias (Hill, Memon, & McGeorge, 2008). Furthermore, forming an initial presumption of guilt based upon questionable information, or upon pseudoscience, may mean that those involved in the investigative process fail to initiate dialogue with suspects which would enable the eliciting of verifiable (Nahari, 2018) and reliable forms of information (Vrij & Granhag, 2012). During trial, confirmation bias can lead to erroneous credibility assessments (Porter & ten Brinke, 2009; Porter, ten Brinke, & Gustaw, 2010). Considering that ‘[c]redibility is an issue that pervades most trials, and at its broadest may amount to a decision on guilt or innocence’ (R. v. Handy, 2002, p. 951), the implementation of ambiguous approaches, or even pseudoscience, is of serious concern. This manifestation of confirmation bias can be entirely unintended but can nevertheless result in adverse human, social and economic consequences (Hill et al., 2008; Vrij et al., 2017).

Conclusion: evidence is not only a matter of investigation

The aim of this article was to provide the justice system with an overview of what science is and what it is not, what constitutes an empirically driven, theoretically founded, peer-reviewed approach and how to distinguish science from pseudoscience. Whilst the importance of empirically and theoretically supported approaches has been outlined, there is no reason to question the intentions of most pseudoscientists. Advocates of pseudoscience usually have the primary intention of assisting police officers, lawyers and judges. However, good faith is not a synonym for good practice. When approaches are implicitly or explicitly presented as scientific, or when science is used as a backdrop to create authenticity and influence, the justice system needs to acknowledge that evidence is not only a matter of investigation. Before training sessions are delivered, police officers, lawyers and judges should systematically request and evaluate the supporting evidence for the proposed approach. To this end, it is recommended that organisations within the justice system set up a joint advisory committee of academics and practitioners to carry out such evaluations. This would allow an assessment to be made as to whether or not an approach is in fact empirically driven, theoretically founded and peer-reviewed.

Finally, the implementation of approaches that may appear to be scientific should be preceded by careful consideration, even if their subject matter is listed as a soft skill: ‘Soft skills are interpersonal qualities, also known as people skills, and personal attributes that one possesses’ (Robles, 2012, p. 453). For example, one might intuitively believe that a soft skill such as non-verbal communication has a lower value than several other skill sets. However, non-verbal communication can have a ubiquitous influence on a number of daily decisions made by police officers, lawyers and judges, including those made during investigative interviews and trials (e.g. Abbe & Brandon, 2014; Broaders & Goldin-Meadow, 2010; Denault, 2015). Therefore, organisations within the justice system should be acutely aware of the importance of distinguishing pseudoscience from science and of understanding the role that science plays in developing better professional practice. When thousands of peer-reviewed papers address a subject matter, the scientific knowledge should, at the very least, be understood and considered. Failing to do so could ultimately result in miscarriages of justice (e.g. Kozinski, 2015).

Acknowledgements

The authors would like to thank Jonathan Jarry for his constructive comments on an earlier version of this manuscript.

Ethical standards

Declarations of conflicts of interest

Louise Jupe has declared no conflicts of interest.

Vincent Denault has declared no conflicts of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

References

  1. Abbe A., & Brandon S. E. (2014). Building and maintaining rapport in investigative interviews. Police Practice and Research: An International Journal, 15, 207–220. doi: 10.1080/15614263.2013.827835 [DOI] [Google Scholar]
  2. Allchin D. (2004). Pseudohistory and pseudoscience. Science & Education, 13, 179–195. doi: 10.1023/B:SCED.0000025563.35883.e9 [DOI] [Google Scholar]
  3. American Psychological Association (2018). Summary report of journal operations, 2017. American Psychologist, 73, 683–684. doi: 10.1037/amp0000347 [DOI] [PubMed] [Google Scholar]
  4. Asendorpf J. B., Conner M., De Fruyt F., De Houwer J., Denissen J. J. A., Fiedler K., … Wicherts J. M. (2013). Recommendations for increasing replicability in psychology. European Journal of Personality, 27, 108–119. doi: 10.1002/per.1919 [DOI] [Google Scholar]
  5. Axelrad B. (2012). Quand le corps dit tout haut ce que l’esprit pense tout bas [When the body says out loud what the mind thinks quietly]. Retrieved from http://www.pseudo-sciences.org/spip.php?article1911
  6. Blancke S., Boudry M., & Pigliucci M. (2017). Why do irrational beliefs mimic science? The cultural evolution of pseudoscience. Theoria, 83, 78–97. doi: 10.1111/theo.12109 [DOI] [Google Scholar]
  7. Bogaard G., Meijer E. H., Vrij A., & Merckelbach H. (2016). Strong, but wrong: Lay people’s and police officers’ beliefs about verbal and nonverbal cues to deception. Plos One, 11, e0156615. doi: 10.1371/journal.pone.0156615 [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Bond C. F., & DePaulo B. M. (2006). Accuracy of deception judgments. Personality and Social Psychology Review, 10, 214–234. doi: 10.1207/s15327957pspr1003_2 [DOI] [PubMed] [Google Scholar]
  9. Bond C. F., Levine T. R., & Hartwig M. (2015). New findings in non-verbal lie detection. In Granhag P. A., Vrij A., & Verschuere B. (Eds.), Detecting deception: Current challenges and cognitive approaches. Chichester, UK: John Wiley & Sons. [Google Scholar]
  10. Broaders S. C., & Goldin-Meadow S. (2010). Truth is at hand: How gesture adds information during investigative interviews. Psychological Science, 21, 623–628. doi: 10.1177/0956797610366082 [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Brodsky S. L. (2013). Testifying in court: Guidelines and maxims for the expert witness (2nd ed.). Washington, DC: American Psychological Association. doi: 10.1037/14037-000 [DOI] [Google Scholar]
  12. Brown S. (2005). Treating sex offenders: An introduction to sex offender treatment programmes. Cullompton: Willan. [Google Scholar]
  13. Burgoon J. K., Guerrero L. K., & Floyd K. (2016). Nonverbal communication. New York: Routledge. [Google Scholar]
  14. Byers M., & Johnson V. M. (2009). The CSI effect: Television, crime, and governance. Plymouth: Lexington Books. [Google Scholar]
  15. Colwell L. H., Miller H. A., Miller R. S., & Lyons P.M. (2006). US police officers’ knowledge regarding behaviors indicative of deception: Implications for eradicating erroneous beliefs through training. Psychology, Crime & Law, 12, 489–503. doi: 10.1080/10683160500254839 [DOI] [Google Scholar]
  16. Damer T. E. (2013). Attacking faulty reasoning: A practical guide to fallacy-free arguments (7th ed). Boston, MA: Cengage Learning. [Google Scholar]
  17. Denault V. (2015). Communication non verbale et crédibilité des témoins [Non-verbal communication and the credibility of witnesses]. Cowansville: Yvon Blais. [Google Scholar]
  18. Denault V. (2019). Developing critical thinking in a world of irrational beliefs. Autoethnographic perspectives from a former advocate of pseudoscience on transitioning to evidence-based academia. Manuscript in preparation. [Google Scholar]
  19. Denault V., & Jupe L. M. (2017). Justice at risk! An evaluation of a pseudoscientific analysis of a witness’ nonverbal behavior in the courtroom. The Journal of Forensic Psychiatry & Psychology, 126, 1–22. doi: 10.1080/14789949.2017.1358758 [DOI] [Google Scholar]
  20. Denault V., Larivée S., Plouffe D., & Plusquellec P. (2015). La synergologie, une lecture pseudoscientifique du langage corporel [Synergology, a pseudoscientific reading of body language]. Revue de psychoéducation, 43, 425–455. doi: 10.7202/1039262ar [DOI] [Google Scholar]
  21. Denault V., Plusquellec P., Jupe L. M., St-Yves M., Dunbar N. E., Hartwig M., … van Koppen P. (2019). The analysis of nonverbal communication: The dangers of pseudoscience in security and justice contexts. Anuario de Psicología Jurídica. doi: 10.5093/apj2019a9 [DOI] [Google Scholar]
  22. DePaulo B. M., Lindsay J. J., Malone B. E., Muhlenbruck L., Charlton K., & Cooper H. (2003). Cues to deception. Psychological Bulletin, 129, 74–118. doi: 10.1037/0033-2909.129.1.74 [DOI] [PubMed] [Google Scholar]
  23. Dodson C. S., Powers E., & Lytell M. (2015). Aging, confidence, and misinformation: recalling information with the cognitive interview. Psychology and Aging, 30, 46–61. doi: 10.1037/a0038492 [DOI] [PubMed] [Google Scholar]
  24. Ekman P. (1992). Telling lies: Clues to deceit in the marketplace, politics, and marriage. New York: W. W. Norton. [Google Scholar]
  25. Elmore S. A. (2017). Update on the manuscript peer review process. Toxicologic Pathology, 45, 1028–1031. doi: 10.1177/0192623317742616 [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. European Institute of Synergology (2015). Info synergo!!! [Online video]. Retrieved July 28, 2017, from https://www.facebook.com/InstitutEuropeenDeSynergologie/videos/598476310301856
  27. FBI [Federal Bureau of Investigation] (2018). High-Value Detainee Interrogation Group. Retrieved from https://www.fbi.gov/about/leadership-and-structure/national-security-branch/high-value-detainee-interrogation-group
  28. Fisher R. P., Geiselman R. E., & Amador M. (1989). Field test of the cognitive interview: Enhancing the recollection of actual victims and witnesses of crime. Journal of Applied Psychology, 74, 722–727. doi: 10.1037/0021-9010.74.5.722 [DOI] [PubMed] [Google Scholar]
  29. Faigman D. L. (2006). Judges as amateur scientists. Boston University Law Review, 86, 1207–1226. [Google Scholar]
  30. Friendship C., Mann R. E., & Beech A. R. (2003). Evaluation of a national prison-based treatment program for sex offenders in England and Wales. Journal of Interpersonal Violence, 18, 744–759. doi: 10.1177/0886260503253236 [DOI] [PubMed] [Google Scholar]
  31. Frosina P., Logue M., Book A., Huizinga T., Amos S., & Stark S. (2018). The effect of cognitive load on nonverbal behavior in the cognitive interview for suspects. Personality and Individual Differences, 130, 51–58. doi: 10.1016/j.paid.2018.03.012 [DOI] [Google Scholar]
  32. Gagnon C. (2018). ‘State of the art’ agent: Threat detection. Retrieved from http://www.christinegagnon.ca/en/blog/state-of-the-art-agent-threat-detection
  33. Geiselman R. E., Fisher R., MacKinnon D. P., & Holland H. L. (1986). Enhancement of eyewitness memory with the cognitive interview. The American Journal of Psychology, 99, 385–401. doi: 10.2307/1422492 [DOI] [PubMed] [Google Scholar]
  34. Global Deception Research Team (2006). A world of lies. Journal of Cross-Cultural Psychology, 37, 60–74. doi: 10.1177/0022022105282295 [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Greenwood P. W., & Petersilia J. R. (1975). The criminal investigation process. Santa Monica, CA: RAND Corporation. [Google Scholar]
  36. Hartwig M., & Bond C. F. (2011). Why do lie-catchers fail? A lens model meta-analysis of human lie judgments. Psychological Bulletin, 137, 643–659. doi: 10.1037/a0023589 [DOI] [PubMed] [Google Scholar]
  37. Heller Á. (2017). Reflections on gullibility. Telos, 2017, 36–47. doi: 10.3817/0617179036 [DOI] [Google Scholar]
  38. Hill C., Memon A., & McGeorge P. (2008). The role of confirmation bias in suspect interviews: A systematic evaluation. Legal and Criminological Psychology, 13, 357–371. doi: 10.1348/135532507X238682 [DOI] [Google Scholar]
  39. Honts C. R., Hartwig M., Kleinman S. M., & Meissner C. A. (2009). Credibility assessment at portals: Portals committee report. Final Report of the Portals Committee to the Defense Academy for Credibility Assessment. [Google Scholar]
  40. Horvath F., Jayne B., & Buckley J. (1994). Differentiation of truthful and deceptive criminal suspects in behavior analysis interviews. Journal of Forensic Sciences, 39, 793–807. doi: 10.1520/JFS13657J [DOI] [PubMed] [Google Scholar]
  41. Houck M. M. (2006). CSI: Reality. Scientific American, 295, 84–89. doi: 10.1038/scientificamerican0706-84 [DOI] [PubMed] [Google Scholar]
  42. Huisman J., & Smits J. (2017). Duration and quality of the peer review process: The author’s perspective. Scientometrics, 113, 633–650. doi: 10.1007/s11192-017-2310-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  43. Inbau F. E., Reid J. E., Buckley J. P., & Jayne B. C. (2013). Criminal interrogation and confessions. Burlington, MA: Jones & Bartlett Learning. [Google Scholar]
  44. Jarry J. (2016, May 21). Vlog 10: Lie to me, synergology [Online video]. Retrieved from https://www.youtube.com/watch?v=F2kvuLG_57c
  45. Jarry J. (2018, January 12). Lies and nonverbal communication [Online video]. Retrieved from https://www.youtube.com/watch?v=qIgwbE7XUC8
  46. Jarry J. (2019). Censoring science communication by screaming defamation. Retrieved from https://www.mcgill.ca/oss/article/general-science/censoring-science-communication-screaming-defamation [Google Scholar]
  47. Jupe L. M., Akehurst L., Vernham Z., & Allen J. (2016). Teenage offenders’ ability to detect deception in their peers. Applied Cognitive Psychology, 30, 401–408. doi: 10.1002/acp.3214 [DOI] [Google Scholar]
  48. Kageleiry P. J. (2007). Psychological police interrogation methods: Pseudoscience in the interrogation room obscures justice in the courtroom. Military Law Review, 193, 1–51. [Google Scholar]
  49. Knapp M., Hall J., & Horgan T. (2014). Nonverbal communication in human interaction. Boston, MA: Wadsworth. [Google Scholar]
  50. Kozinski A. (2015). Preface: Criminal law 2.0. Georgetown Law Journal Annual Review Criminal Procedure, 44, iii–xliv. [Google Scholar]
  51. Lakatos I. (1980). The methodology of scientific research programmes. Cambridge: Cambridge University Press. [Google Scholar]
  52. Larivée S. (2014). Quand le paranormal manipule la science [When the paranormal manipulates science]. Montréal: Multimondes. [Google Scholar]
  53. Leone C. (2015). U.S. v. Hodge: A case study in the use of the cognitive interview as a means to promote therapeutic jurisprudence in sexual assault cases under the Uniform Code of Military Justice (UCMJ). The Air Force Law Review, 74, 201–228. [Google Scholar]
  54. Levine T. R., Serota K. B., & Shulman H. C. (2010). The impact of Lie to Me on viewers’ actual ability to detect deception. Communication Research, 37, 847–856. doi: 10.1177/0093650210362686 [DOI] [Google Scholar]
  55. Lilienfeld S. O., & Landfield K. (2008). Science and pseudoscience in law enforcement: A user-friendly primer. Criminal Justice and Behavior, 35, 1215–1230. doi: 10.1177/0093854808321526 [DOI] [Google Scholar]
  56. Lilienfeld S. O., Lynn S. J., & Lohr J. M. (2014). Science and pseudoscience in clinical psychology. New York: Guilford Press. [Google Scholar]
  57. Lindsay D. S. (2015). Replication in psychological science. Psychological Science, 26, 1827–1832. doi: 10.1177/0956797615616374 [DOI] [PubMed] [Google Scholar]
  58. Makgoba M. W. (2002). Politics, the media and science in HIV/AIDS: The peril of pseudoscience. Vaccine, 20, 1899–1904. doi: 10.1016/S0264-410X(02)00063-4 [DOI] [PubMed] [Google Scholar]
  59. Masip J., Barba A., & Herrero C. (2012). Behaviour Analysis Interview and common sense: A study with novice and experienced officers. Psychiatry, Psychology and Law, 19, 21–34. doi: 10.1080/13218719.2010.543402 [DOI] [Google Scholar]
  60. Memon A., Meissner C. A., & Fraser J. (2010). The cognitive interview: A meta-analytic review and study space analysis of the past 25 years. Psychology, Public Policy, and Law, 16, 340–372. doi: 10.1037/a0020518 [DOI] [Google Scholar]
  61. Moody M. (2014). Nonverbal communication expert to keynote ACFE Canadian Fraud Conference. Retrieved from https://acfeinsights.squarespace.com/acfe-insights/2014/8/5/hg49z60sjilsch75gdhcqjvlmaki32
  62. Moore N.-J., Hickson M., & Stacks D. W. (2014). Nonverbal communication: Studies and applications. Oxford: Oxford University Press. [Google Scholar]
  63. Moreno J. A. (2003). Einstein on the bench: Exposing what judges do not know about science and using child abuse cases to improve how courts evaluate scientific evidence. Ohio State Law Journal, 64, 351–584. [Google Scholar]
  64. Nahari G. (2018). The applicability of the Verifiability Approach to the real world. In Rosenfeld J. P. (Ed.), Detecting concealed information and deception: Recent developments. London: Academic Press. [Google Scholar]
  65. Nickerson R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175–220. doi: 10.1037/1089-2680.2.2.175 [DOI] [Google Scholar]
  66. Open Science Collaboration (2012). An open, large-scale, collaborative effort to estimate the reproducibility of psychological science. Perspectives on Psychological Science, 7, 657–660. doi: 10.1177/1745691612462588 [DOI] [PubMed] [Google Scholar]
  67. Otgaar H., & Howe M. (2017). Finding the truth in the courtroom: Dealing with deception, lies, and memories. New York: Oxford University Press. [Google Scholar]
  68. Pigliucci M., & Boudry M. (2013a). The dangers of pseudoscience. Retrieved from https://opinionator.blogs.nytimes.com/2013/10/10/the-dangers-of-pseudoscience
  69. Pigliucci M., & Boudry M. (2013b). Philosophy of pseudoscience: Reconsidering the demarcation problem. Chicago: University of Chicago Press. [Google Scholar]
  70. Popper K. (1968). The logic of scientific discovery. New York: Harper. [Google Scholar]
  71. Porter S., & ten Brinke L. (2008). Reading between the lies: Identifying concealed and falsified emotions in universal facial expressions. Psychological Science, 19, 508–514. doi: 10.1111/j.1467-9280.2008.02116.x [DOI] [PubMed] [Google Scholar]
  72. Porter S., & ten Brinke L. (2009). Dangerous decisions: A theoretical framework for understanding how judges assess credibility in the courtroom. Legal and Criminological Psychology, 14, 119–134. doi: 10.1348/135532508X281520 [DOI] [Google Scholar]
  73. Porter S., ten Brinke L., & Gustaw C. (2010). Dangerous decisions: The impact of first impressions of trustworthiness on the evaluation of legal evidence and defendant culpability. Psychology Crime & Law, 16, 477–491. doi: 10.1080/10683160902926141 [DOI] [Google Scholar]
  74. Quebec Association of Synergology (2019). En quoi la synergologie est-elle une discipline scientifique [In what way synergology is a scientific discipline]. Retrieved from http://www.monaqs.ca/la-discipline
  75. Redding R. E., Floyd M. Y., & Hawk G. L. (2001). What judges and lawyers think about the testimony of mental health experts: A survey of the courts and bar. Behavioral Science and the Law, 19, 583–594. doi: 10.1002/bsl.455 [DOI] [PubMed] [Google Scholar]
  76. Robles M. M. (2012). Executive perceptions of the top 10 soft skills needed in today’s workplace. Business and Professional Communication Quarterly, 75, 453–465. doi: 10.1177/1080569912460400 [DOI] [Google Scholar]
  77. R. v. Handy , [2002] 2 SCR 908, 2002 SCC 56 (CanLII). [Google Scholar]
  78. Sackett D. L., Rosenberg W. M., Gray J. A., Haynes R. B., & Richardson W. S. (1996). Evidence based medicine: What it is and what it isn’t. BMJ, 312, 71–72. doi: 10.1136/BMJ.312.7023.71 [DOI] [PMC free article] [PubMed] [Google Scholar]
  79. Sagan C. (1979). Broca’s brain: Reflections on the romance of science. New York: Presidio Press. [Google Scholar]
  80. Schweitzer N. J., & Saks M. J. (2007). The CSI Effect: Popular fiction about forensic science affects the public’s expectations about real forensic science. Jurimetrics, 47, 357–364. doi: 10.2307/29762978 [DOI] [Google Scholar]
  81. Serin R. C., Gobeil R., & Preston D. L. (2009). Evaluation of the persistently violent offender treatment program. International Journal of Offender Therapy and Comparative Criminology, 53, 57–73. doi: 10.1177/0306624X07313985 [DOI] [PubMed] [Google Scholar]
  82. Shallit J. (2005). Science, pseudoscience, and the three stages of truth. Retrieved from https://cs.uwaterloo.ca/~shallit/Papers/stages.pdf
  83. Shermer M. (2002). Why people believe weird things. New York: Henry Holt. [Google Scholar]
  84. Shermer M. (2013). Science and pseudoscience: The difference in practice and the difference it makes. In Pigliucci M. & Boudry M. (Eds.), Philosophy of pseudoscience. Chicago: University of Chicago Press. doi: 10.7208/chicago/9780226051826.001.0001 [DOI] [Google Scholar]
  85. Shipman M. (2014). The limitations of social research. London: Routledge. [Google Scholar]
  86. Sooniste T., Granhag P. A., Strömwall L. A., & Vrij A. (2015). Statements about true and false intentions: Using the Cognitive Interview to magnify the differences. Scandinavian Journal of Psychology, 56, 371–378. doi: 10.1111/sjop.12216 [DOI] [PubMed] [Google Scholar]
  87. Spier R. (2002). The history of the peer-review process. Trends in Biotechnology, 20, 357–358. doi: 10.1016/S0167-7799(02)01985-6 [DOI] [PubMed] [Google Scholar]
  88. Synergology (2019a). Etudier la discipline Synergologie pour comprendre le non-verbal [Studying the discipline of Synergology to understand the non-verbal]. Retrieved from http://formation.synergologie.org/
  89. Synergology (2019b). Synergologie, une discipline scientifique de lecture du langage corporel [Synergology, a scientific discipline for reading body language]. Retrieved from http://non-verbal.synergologie.org/nonverbal/communication-non-verbale/le-langage-corporel-doit-il-etre-un-objet-scientifique
  90. Tadei A., Finnilä K., Reite A., Antfolk J., & Santtila P. (2016). Judges’ capacity to evaluate psychological and psychiatric expert testimony. Nordic Psychology, 68, 204–217. doi: 10.1080/19012276.2015.1125303 [DOI] [Google Scholar]
  91. ten Brinke L., Stimson D., & Carney D. R. (2014). Some evidence for unconscious lie detection. Psychological Science, 25, 1098–1105. doi: 10.1177/0956797614524421 [DOI] [PubMed] [Google Scholar]
  92. Turchet P. (2012). The secrets of body language: An illustrated guide to knowing what people are really thinking and feeling. New York: Skyhorse. [Google Scholar]
  93. Vrij A. (2008). Detecting lies and deceit: Pitfalls and opportunities. Chichester: John Wiley & Sons. [Google Scholar]
  94. Vrij A., & Granhag P. A. (2012). Eliciting cues to deception and truth: What matters are the questions asked. Journal of Applied Research in Memory and Cognition, 1, 110–117. doi: 10.1016/j.jarmac.2012.02.004 [DOI] [Google Scholar]
  95. Vrij A., Hope L., & Fisher R. P. (2014). Eliciting reliable information in investigative interviews. Policy Insights from the Behavioral and Brain Sciences, 1, 129–136. doi: 10.1177/2372732214548592 [DOI] [Google Scholar]
  96. Vrij A., Mann S., & Fisher R. P. (2006). An empirical test of the Behaviour Analysis Interview. Law and Human Behavior, 30, 329–345. doi: 10.1007/s10979-006-9014-3 [DOI] [PubMed] [Google Scholar]
  97. Vrij A., Meissner C. A., Fisher R. P., Kassin S. M., Morgan C. A. III, & Kleinman S. M. (2017). Psychological perspective on interrogation. Perspectives on Psychological Science, 12, 927–955. doi: 10.1177/1745691617706515 [DOI] [PubMed] [Google Scholar]
  98. Walton D. N. (1987). The ad hominem argument as an informal fallacy. Argumentation, 1, 317–331. doi: 10.1007/BF00136781 [DOI] [Google Scholar]
  99. Ware M. (2008). Peer review: benefits, perceptions and alternatives. London: Publishing Research Consortium. [Google Scholar]
  100. White E. (2014). Science, pseudoscience, and the frontline practitioner: The vaccination/autism debate. Journal of Evidence-Based Social Work, 11, 269–274. doi: 10.1080/15433714.2012.759470 [DOI] [PubMed] [Google Scholar]
