The Clinical Biochemist Reviews. 2013 Aug;34(2):93–103.

On-the-Job Evidence-Based Medicine Training for Clinician-Scientists of the Next Generation

Elaine YL Leung 1,*, Sadia M Malick 2, Khalid S Khan 1; the EBM-CONNECT Collaboration
PMCID: PMC3799223  PMID: 24151345

Abstract

Clinical scientists sit at the unique interface between laboratory science and frontline clinical practice, supporting clinical partnerships for evidence-based practice. In an era of molecular diagnostics and personalised medicine, evidence-based laboratory practice (EBLP) is also crucial in helping clinical scientists keep up to date with this expanding knowledge base. However, there are recognised barriers to the implementation of EBLP and its training. The aim of this review is to provide a practical summary of potential strategies for training clinician-scientists of the next generation.

Current evidence suggests that clinically integrated evidence-based medicine (EBM) training is effective. Tailored e-learning EBM packages and evidence-based journal clubs have been shown to improve knowledge and skills of EBM. Moreover, e-learning is no longer restricted to computer-assisted learning packages. For example, social media platforms such as Twitter have been used to complement existing journal clubs and provide additional post-publication appraisal information for journals.

In addition, the delivery of an EBLP curriculum has influence on its success. Although e-learning of EBM skills is effective, having EBM trained teachers available locally promotes the implementation of EBM training. Training courses, such as Training the Trainers, are now available to help trainers identify and make use of EBM training opportunities in clinical practice. On the other hand, peer-assisted learning and trainee-led support networks can strengthen self-directed learning of EBM and research participation among clinical scientists in training. Finally, we emphasise the need to evaluate any EBLP training programme using validated assessment tools to help identify the most crucial ingredients of effective EBLP training.

In summary, we recommend on-the-job training of EBM with additional focus on overcoming barriers to its implementation. In addition, future studies evaluating the effectiveness of EBM training should use validated outcome tools, endeavour to achieve adequate power and consider the effects of EBM training on learning environment and patient outcomes.


Evidence-based laboratory practice is crucial

The understanding of human pathology and associated technological innovations is driving the evolution of the clinical laboratory into an era of molecular diagnostics and personalised medicine. The need for clinical scientists in training to learn how to keep themselves up to date with this expanding knowledge base is apparent.

Clinical laboratory tests are medical interventions1 with associated risks and benefits. They are also a growing area for the biotechnology industry, and an increasing number of clinical trials are incorporating these tests to evaluate the potential therapeutic implications of molecular differences (e.g. genetic mutation and expression) observed in their participants. Research on clinical laboratory testing is also exposed to the same vulnerability of unpublished outcomes and unreported methods observed in clinical trials of therapeutics.2,3 Moreover, with the restrictions on health spending and expanding access to care experienced in many countries, judicious cost-effectiveness assessments of these novel medical interventions are warranted.4

In addition, a significant proportion of medical errors are related to diagnostic test results.5 These costly6 errors are often secondary to failure to implement clinical practice of established efficacy.5,7 Therefore, it is essential that clinical scientists are able to keep abreast of the existing evidence and help educate frontline health professionals, to ensure that the most appropriate and cost-effective laboratory tests are used for our patients.

Implementing evidence-based laboratory practice is not easy

Evidence-based laboratory practice has previously been defined as ‘the conscientious, judicious and explicit use of best evidence in the use of laboratory medicine investigations for assisting in making decisions about the care of individual patients’.8 Practical advice on day-to-day practice of evidence-based laboratory medicine has recently been reviewed.8 However, there are specific barriers that have slowed the implementation of evidence-based laboratory practice.8–11 For example, the interpretation of scientific literature on diagnostic tests is not straightforward.9 Initiatives such as the STAndards for the Reporting of Diagnostic accuracy studies (STARD), which aim to improve the reporting quality of diagnostic accuracy studies and help readers assess the validity of such studies,12 have achieved variable success.13–15 It remains important for clinical scientists and clinicians to be able to understand and critically appraise diagnostic research reports to inform practice.

An example: urine protein-creatinine ratio for the diagnosis of pre-eclampsia

The difficulties encountered in evidence-based laboratory practice can be demonstrated using the urine protein-creatinine ratio for the diagnosis of pre-eclampsia in pregnant women. In the past, for pregnant women suspected to have pre-eclampsia, it was routine to measure the 24-hour urine protein excretion level to identify proteinuria, which is one of the defining clinical features of the condition. This test remains the current reference standard (‘the gold standard’). However, it is a cumbersome and lengthy test for patients, with potential risks of delaying their diagnosis. There is a clinical need for a valid and accurate test to identify proteinuria. More recently, the clinical use of the spot urine protein-creatinine ratio (PCR) for the diagnosis of pre-eclampsia has become more widespread and has reduced the number of women requiring 24-hour urine collection for protein measurement in practice.

Although extensively evaluated outside pregnancy, until recently relatively few primary studies had examined the diagnostic accuracy of PCR in diagnosing pre-eclampsia16,17 and predicting complications associated with pre-eclampsia.18 Moreover, previous primary studies were heterogeneous and reporting of methods was often poor.17 In addition, a substantial proportion (>10%) of women included in these studies never completed their 24-hour urine collection for protein. The applicability of PCR in practice may therefore differ from the results summarised in the latest systematic reviews.16–18

A further observation from these reviews was variation in test performance across primary studies. Negative and positive predictive values of a diagnostic test vary with the prevalence of the tested condition.17 Although a negative urine PCR result is helpful for excluding significant proteinuria, the false positive rate may be high in low-prevalence settings. Clinical scientists and frontline clinicians could work together to allow judicious interpretation of such test results in local settings.
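The dependence of predictive values on prevalence can be made concrete with a short calculation. The sketch below uses hypothetical test characteristics (sensitivity 90%, specificity 85%), not figures drawn from the cited reviews, to show how the same test yields very different positive predictive values in low- and high-prevalence settings:

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Return (PPV, NPV) for a test applied at a given disease prevalence."""
    tp = sensitivity * prevalence              # true positive fraction
    fp = (1 - specificity) * (1 - prevalence)  # false positive fraction
    fn = (1 - sensitivity) * prevalence        # false negative fraction
    tn = specificity * (1 - prevalence)        # true negative fraction
    ppv = tp / (tp + fp)                       # positive predictive value
    npv = tn / (tn + fn)                       # negative predictive value
    return ppv, npv

# Same hypothetical test, two prevalence settings:
for prev in (0.05, 0.40):
    ppv, npv = predictive_values(0.90, 0.85, prev)
    print(f"prevalence {prev:.0%}: PPV {ppv:.2f}, NPV {npv:.2f}")
# PPV rises from 0.24 to 0.80 as prevalence rises from 5% to 40%,
# while NPV stays high in both settings.
```

This illustrates the point above: a negative result remains reassuring across settings, but a positive result in a low-prevalence population is far more likely to be a false positive.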

Significant barriers exist to on-the-job EBM training

Despite the recognised need for evidence-based laboratory practice8–10 and although EBM training has proven to be effective,19–21 these findings did not automatically translate into implementation of on-the-job EBM training.22 In a recent multinational survey of EBM teachers,22 the lack of time (for teaching EBM and for learners to learn) was the strongest perceived barrier to implementation of EBM training. Others stated the lack of requirement for EBM skills in clinical training curricula to be a major barrier. Although postgraduate learning tends to be driven by self-motivation and perceived relevance to practice (instead of curriculum and examinations),19,23 the majority of EBM teachers suggested that assessing EBM skills in both undergraduate and postgraduate examinations is an important element for implementing EBM teaching in clinical practice.22

Implementation of evidence is an essential and perhaps the most important component of evidence-based practice, but it does not happen spontaneously and is not straightforward.24–27 In this review, we aim to provide a practical summary of potential strategies for training clinician-scientists of the next generation through an overview of existing evidence combined with our experience in the teaching of evidence-based medicine.

Models of interactive and clinically integrated EBM training

Hierarchy of effective EBM teaching

Doctors have the duty to keep their skills and knowledge up-to-date.28,29 Practicing clinicians often spend a considerable amount of time undertaking courses, conferences, workshops, journal clubs and reading relevant literature and clinical guidelines. Many of these activities fall under the auspices of Continuing Professional Development (CPD),20 Knowledge Translation30 and Continuing Medical Education.31 However, there remains a discrepancy between knowing the best evidence and implementing this knowledge into practice.32,33

It is therefore important to evaluate why CPD activities have failed to consistently encourage evidence-based practice.20,30 The problem may lie with the clinicians, the educational interventions undertaken or the adaptability of the health care workplace to adopt evidence-based changes. It is important to recognise that practitioners, rather than the evidence itself, change practice.

Clinically integrated interactive EBM training consistently improves knowledge of EBM and critical appraisal skills, and is associated with a more positive attitude towards and adoption of evidence-based practice when compared to didactic sessions.19–21 Primary studies combining clinically integrated EBM teaching methods with essential evidence-based practice components (e.g. formulating answerable clinical questions, literature search, critical appraisal and summarising the evidence) have achieved particular success. Participants in these studies learnt to appreciate the cycle of evidence-based practice (Figure 1), which could be the key to the observed success.

Figure 1. The process of evidence-based practice.

We have previously proposed a hierarchy of effective EBM teaching based on a systematic review of empirical evidence, combined with theoretical considerations (Figure 2),20 which highlighted the importance of clinical integration and interactions for effective EBM teaching. There are a number of reasons proposed as to why clinically integrated interactive EBM teaching is more effective.20,23,34 Firstly, learners determine the educational needs important to their current practice and evidence is actively sought. Secondly, problem-based learning encourages deep learning35 and problem solving. Thirdly, as information is directly relevant to clinical practice, it can easily be incorporated into local guidelines to encourage its application. Finally, as the clinical scenarios are encountered in real life, local barriers are identified and can be dealt with in the workplace.20 The ability to use newly acquired knowledge and skills to resolve clinical problems reminds learners of the relevance of EBM teaching and reinforces the reward of learning such knowledge and skills. However, teaching EBM at the time when clinical questions are encountered is not straightforward.22 Here we discuss some clinically integrated activities that have been shown to be effective, and which can be tailored and implemented for training clinical scientists.

Figure 2. Hierarchy of evidence-based medicine (EBM) teaching. Reproduced from Khan & Coomarasamy, 2006.20

E-learning for EBM

Learning EBM at the point of service delivery is difficult, particularly in rural areas and low-middle-income countries21,36 where trained teachers in EBM are scarce. Moreover, clinical scientists and frontline clinicians often have few opportunities to interact in practice. E-learning packages tailored to specific specialties have been developed to complement work-based EBM teaching,21,36,37 and could be as good as face-to-face lecture-based teaching of EBM.38–40 E-learning also has the advantage of allowing clinicians the flexibility to manage their own learning. In addition, it provides the possibility of standardising teaching in different countries and clinical settings, which enables harmonisation in accreditation of competencies.21,36

E-learning packages are not the only form of digital teaching material available. A vast amount of resources for teaching EBM is available online (Table 1), often free of charge. Recently, social media has gained popularity among news outlets, politicians, charities and a wide range of commercial organisations as a means of disseminating information. The medical and research communities have been slower to adopt it, owing to concerns about potential unprofessional use.29,41 However, medical educators are increasingly keen to use social media to supplement the formal curriculum.42,43 Researchers and medical professionals active on social media platforms often publish their appraisal of the latest research studies via these channels. In addition, students and trainees worldwide are now accustomed to social media and digital learning, with the more sophisticated use of virtual learning environments at higher education institutes. In the future, e-learning of EBM skills is likely to extend beyond the use of standardised packages, and will provide the feedback and interactions needed to maximise its effectiveness in promoting evidence-based practice.

Table 1.

Some useful online resources related to evidence-based medicine (EBM).

Evidence-based journal clubs and rounds

Journal clubs and training in critical appraisal skills are the commonest methods used in EBM training.22 The existing evidence on the effectiveness of journal clubs44,45 showed significant heterogeneity between studies. Although meta-analysis of the data was therefore not appropriate, authors of a number of previous systematic reviews have consistently identified similar key features in effective journal clubs44,45 and in training in critical appraisal46,47 and EBM skills.19,48

Effective journal clubs are clinically oriented and have clear educational goals.44,45 Depending on the experience of the learners, mentorship (e.g. by a clinical librarian) and didactic support (e.g. by a journal club facilitator) can be provided to facilitate such journal clubs. In addition, structured instruments for critical appraisal and the availability of experts to facilitate discussions of clinical applicability encourage interactions within these journal clubs.

Evidence-based journal clubs are based on systematic evaluation of clinical problems that have created uncertainty in their management. Each journal club simulates the process of evidence-based practice (Figure 1), converting an identified clinical scenario into answerable questions (first step) for searching the literature (second step). The third step is critical appraisal of the selected articles, preferably based on structured guidelines.49 At the fourth and final step of an evidence-based journal club, the clinical question and results of its systematic evaluation are presented to other journal club members. This provides an opportunity for all members to appraise the evidence and reflect on its clinical applicability. Any conclusion and potential impact on management are recorded and deposited in a repository. These clinical questions could be re-evaluated with up-to-date evidence at a later date to further promote evidence-based practice.

Evidence-based journal clubs can be implemented in specialties with fewer clinician-patient encounters, e.g. clinical biochemistry, to help clinical scientists link clinical scenarios with new developments in laboratory-based medicine. For example, Clinical Chemistry, the official journal of the American Association for Clinical Chemistry, posts an original article with a complementary PowerPoint slide set on its dedicated journal club page at www.clinchem.org50 for journal club hosts worldwide.

As discussed previously, researchers and clinicians can share and discuss the latest research reports and their implications in practice51,52 using social media platforms. Social media could be an adjunct to existing journal clubs and provide additional post-publication appraisal information for journals.52,53 It can also be considered as an interactive element in the e-learning of EBM skills.52 Medical journals have begun to realise the potential benefits and pitfalls of social media.51,52,54 A recent systematic review55 suggested that social media use in medical education was associated with improved knowledge, attitudes and skills. At this stage, at least one peer-reviewed journal has started to systematically collect data on post-publication appraisal via a Twitter-based journal club.52

Despite the widespread use of journal clubs for EBM teaching, the most crucial ingredients of an effective journal club warrant further investigations.44,45 Future studies should include validated instruments for assessing the effectiveness of journal clubs and consider their impact on workplace environment and on patient outcomes.45

An example: incorporating the use of social media for evidence-based journal club

Evidence-based journal clubs have practice-changing potential,49 but could be further enhanced by immediate access to a varied range of clinicians, including authors and editors, to answer questions and to explore practical implications. BJOG: An International Journal of Obstetrics and Gynaecology (http://www.bjog.org/; @BJOGTweets on Twitter), in conjunction with the Katherine Twining (KT) Network (http://www.ktnetwork.org/; @kt_network on Twitter), a multidisciplinary research collaboration, has developed a Twitter-based journal club to achieve this.

The aims of the initiative, #BlueJC, are to collate post-publication appraisal information and encourage professional development activities through interactions on social media platforms.52 Each month, a paper selected by the editorial board is chosen as the journal club paper and made open-access for all readers. A complementary discussion guide and slide set is produced for each journal club paper, which is also suitable for face-to-face journal clubs. A scheduled Twitter-based #BlueJC discussion is announced at the time of publication of a journal club paper. The hashtag #BlueJC is used so that conversations can be searched, tracked and followed on Twitter. The latest #BlueJC sessions had participants from different countries and disciplines, including authors of a journal club paper residing in Saudi Arabia.52

Although #BlueJC is not the first Twitter-based journal club, it is the first to link with both a research network and an international peer-reviewed journal. In addition, the discussion guide and slide set produced follow the evidence-based journal club format based on clinical scenarios, structured questions and focused on learning needs. The research dissemination landscape of the 21st century is changing. E-learning is no longer restricted to computer-assisted learning packages. Innovative solutions can encourage clinically integrated learning of evidence-based practice.

How to deliver an EBM-focused curriculum?

Training the trainers

Although e-learning of EBM skills could reduce the existing burden of clinical teaching,21,56 having EBM trained teachers available locally is crucial in helping promote the implementation of EBM teaching.56 The shortage of trained EBM teachers remains a significant barrier to the implementation of EBM training not only in developing countries21,56 but also in developed European countries.21,57 Therefore, empowering clinical teachers with skills to train others in the delivery of EBM skills is important when developing an EBM-focused curriculum for medical training.

The Training the Trainers course (TTT) is a European training programme targeting senior clinicians with existing EBM knowledge, which aims to help them identify and make use of EBM training opportunities in clinical practice. TTT was developed by the EU EBM Unity Partnership involving a collaboration of seven European countries.18 The training materials are available in multiple languages (English, French, German, Hungarian, Polish) with a supplementary e-learning package providing practical advice and methods by which EBM can be taught during various clinical activities. Variations of the TTT course have been adopted in multiple European countries. Most of the TTT courses have incorporated the e-learning package and have been officially certified.57 Apart from the lack of funding to provide such training, providers of the TTT course reported the need for help in providing practical examples of successful EBM teaching techniques.57 This feedback needs to be taken into account when designing a curriculum for trainers teaching EBM.

Training portfolio to encourage self-directed learning

Portfolios (electronic and traditional paper-based) are commonly used as tools for trainees in multiple medical disciplines to summarise and reflect on their learning progress.58,59 Assessments can direct learning.60,61 Portfolios have also been used in undergraduate and postgraduate curricula as part of trainees’ assessments of progress.58,59,62,63

However, the effectiveness of portfolios for learning and assessments has been controversial.63 The results of available primary research on the effectiveness of training portfolios have been mixed58,59 and good-quality research is lacking.58,59 Primary studies of successful portfolios (ones associated with improved knowledge and self-reflection) appeared to be related to the amount of support in place to mentor students through their learning process.58,59,63,64 No previous studies have evaluated how individual components of a training portfolio (e.g. the quality of mentorship) contributed to educational benefits.58,59 It would be worthwhile assessing the value of portfolios for clinical scientists in training.

A recognised disadvantage of portfolios is the time required to complete them by both trainees and trainers.58,59,63,65 Moreover, learning styles vary,66 and some have argued that tools such as portfolios and personal development plans bring little to the most proficient self-directed learners62 and that compulsory portfolios might be counter-productive62,63 for a proportion of learners. In addition, trainees in different medical specialties express different willingness to change practice according to feedback recorded on training portfolios.67 To our knowledge, the opinion of trainee clinical scientists on the use of portfolios has not been evaluated.

A growing number of medical training programmes now use electronic portfolios (e-portfolios) as review and assessment tools. E-portfolios encourage tracking of progress and are less cumbersome. They also have the potential to promote flexibility for both trainers and trainees to give and receive feedback.59,68 Users tend to spend longer with electronic portfolios compared to paper-based ones.59 However, e-portfolios require both trainees and trainers to have sufficient skills to navigate the host software, and to be supported by a compatible information technology infrastructure. In addition, the development of electronic portfolios can be costly,69 and these costs are often passed on to individual trainees as subscription fees.70

Despite the potential of portfolios, developers of a training programme must evaluate the suitability of incorporating training portfolios to support their curriculum based on the above factors. Implementation of training portfolios needs to be coupled with evaluation and review mechanisms for change according to feedback from users.

Working in clinical partnerships to support an environment for EBM training

A supportive environment is crucial for learning71,72 and training of clinicians.73 Therefore, a successful EBM training programme should endeavour to create a learning-oriented culture. Although trainee clinicians experience some of the same barriers to practicing EBM (e.g. the lack of time and poor accessibility of electronic resources), they encounter additional situational barriers to learning and practicing EBM.73 Trainees are often less familiar with the key clinical questions in their specialties and experience difficulties in knowing when to stop searching for further evidence.73,74 Moreover, training of clinicians often relies on team-based learning, and hence trainees could be more susceptible to the threat of criticism, senior resistance to EBM75 and variable team dynamics.73 Adult learning involves egos;76 reducing the fear of judgment and of potential impact on career progression is important during learning and discussion of evidence-based practice.34

Laboratory-based clinical scientists can work with frontline clinicians, using their different expertise, to nurture a culture that actively implements advances in laboratory medicine for our patients in practice. Face-to-face meetings (such as hospital grand rounds), although requiring additional effort and leadership from senior clinicians and scientists, provide opportunities for this interaction. Multidisciplinary teams could also combine resources for further support of EBM training, such as incorporating the use of clinical librarians to help search for best evidence in a time-efficient manner. Although novel tests and treatments are often expensive, evidence-based practice could potentially minimise medical practice that has no proven clinical benefit and improve the cost-effectiveness of our practice.

The EBM Education Environment Measure (EBMEEM)77 was developed based on the model of the Dundee Ready Education Environment Measure (DREEM)72 to assess the educational environment for EBM. The tool evaluates key themes including learning opportunities, self-directed EBM learning, availability of learning resources, EBM-specific teaching, supervision and support, and the evidence-based medicine atmosphere. It was validated externally by a steering committee77 prior to its use in a recent randomised controlled trial on e-learning of evidence-based practice.21 This validated tool could potentially be used to assess and monitor the learning environment when implementing new EBM training initiatives.

Collaboration, networking and peer-assisted learning

Current evidence suggests collaboration and networks of support are important catalysts for implementing evidence-based practice.21,78,79 Many are familiar with international multidisciplinary collaborations advocating evidence-based decision making and practice such as the Cochrane Collaboration (http://www.cochrane.org/). However, individual users (including clinicians, health system managers and policy makers) often have difficulty interpreting and utilising the high-quality evidence collated in such databases.80,81 Collaboration within an individual specialist area (e.g. The Laboratory Medicine Practice Guidelines) has the potential to focus on the implementation of evidence through various activities, including evidence-based guideline development.

More recently, trainee-led collaborations and networks have shown promise for nurturing primary research projects82 and encouraging systematic review of evidence.83 In the field of clinical chemistry, the Clinical Chemistry Trainee Council (http://direct.aacc.org/) has recently launched a free web-based educational programme for trainees worldwide, available in multiple languages. In addition, trainees who have received EBM training, and are thus better equipped to incorporate EBM in practice, could take on the role of tutor for their peers. Peer-assisted learning is defined as the acquisition of knowledge and skill through active help and support among companions who are matched and/or equal in status.84 The teaching experience is rewarding for the peer tutor,85 but could be further encouraged by non-monetary incentives and recognition by senior colleagues. Today’s trainees are the future leaders in medicine. These trainee-led collaborations could be an engine to promote an environment for learning and practice of EBM, as well as producing high-quality research.

Consideration of language and cultural barriers

International collaborations in evidence-based medicine teaching are increasingly common. International projects such as the Evidence-based Medicine Collaboration Network for guideline development, teaching and dissemination (EBM Connect) aim to combine the expertise of different international centres in EBM, to share skills and develop new methods for evidence synthesis. In order to set standards for assessing competencies across different countries, harmonisation of teaching and training of EBM is imperative.37 Additional factors for consideration include language barriers86,87 and cultural differences22 in practice. Increasing awareness and more sophisticated digital resources are encouraging signs for tackling these barriers.

Evaluating outcomes of EBM training

Training in evidence-based practice has gained recognition, and some progress has been made in its implementation.22,48 However, many studies reporting the effectiveness of such training are of low methodological quality.19,46–48 Few used validated outcome tools,48 and current evidence lacks evaluation of EBM training on patient outcomes. The research community is calling for a focus on outcome research, which involves developing novel methods to examine the effects of medical education interventions on clinical outcomes.88,89 However, this area remains challenging, with difficulties in adequately powering interventional studies in medical education, where effects are diluted along healthcare provision pathways,90 as well as potential bias in outcome selection and in matching educational interventions to the assessment tools used.

When evaluating the effectiveness of an EBM training programme, it is important not only to assess the learners’ performance, but also any changes in the learning environment. Validated tools to assess the improvement of EBM knowledge and skills91–93 and the training environment72,77,94 exist (Table 2 and the above discussion on EBMEEM). Future evaluation of EBM training programmes should therefore incorporate existing validated assessment tools. If these existing validated tools are not appropriate, assessors should attempt to evaluate the validity of an instrument (the trustworthiness of its assessments) before adoption.

Table 2.

Examples of validated tools for assessing EBM training.

  • The Berlin Questionnaire:91 assesses knowledge and skills of EBM

  • Fresno test of competence:92 assesses knowledge and skills of EBM

  • Taylor et al:90 offers a questionnaire for assessing EBM attitudes

  • The Dundee Ready Education Environment Measure (DREEM):72 assesses the undergraduate training environment (including metrics such as the level of autonomy, quality of teaching and social support)

  • The EBM Education Environment Measure (EBMEEM):77 a validated tool to assess health care workplace environment for EBM

Despite these challenges, evaluation should be an important consideration for medical educators planning an EBM training curriculum, in order to strengthen the existing evidence base. As discussed above, the use of validated tools and work in collaboration with stakeholders (including trainees) could help deliver and evaluate a successful training programme.

Conclusions and recommendations

Translating new knowledge gained by research into laboratory practice is crucial but difficult to implement. Skills are required to identify and interpret existing evidence and establish mechanisms in the workplace for implementing change.

Fortunately, these skills can be taught and learnt. Effective training in evidence-based practice requires clinically integrated methods and training material.20 We recommend that educators planning an EBM-focused curriculum consider combining EBM journal clubs with innovative e-learning strategies.

We also advocate paying particular attention to overcoming the significant barriers to the delivery of on-the-job EBM training.22 The shortage of trained EBM teachers could be addressed through collaborations, staff exchanges and training courses such as Training the Trainers. Training portfolios and peer-assisted learning, if used appropriately, could also encourage the learning of EBM skills. For clinical scientists, working with frontline clinicians is essential to bring advances in laboratory medicine to our patients. Awareness of differences in language and culture is also crucial for implementing EBM training globally.

Finally, the quality of studies evaluating the effectiveness of EBM training must improve. Future studies should use validated outcome tools and endeavour to achieve adequate power. In addition, assessments of effectiveness should include both the learners’ performance and any changes in their learning environment. Novel methods are needed to evaluate the effects of EBM training on patient outcomes.

Footnotes

Competing Interests: None declared.

References:

1. Lundberg GD. The need for an outcomes research agenda for clinical laboratory testing. JAMA. 1998;280:565–6. doi: 10.1001/jama.280.6.565.
2. Ross JS, Tse T, Zarin DA, Xu H, Zhou L, Krumholz HM. Publication of NIH funded trials registered in ClinicalTrials.gov: cross sectional analysis. BMJ. 2012;344:d7292. doi: 10.1136/bmj.d7292.
3. Hart B, Lundh A, Bero L. Effect of reporting bias on meta-analyses of drug trials: reanalysis of meta-analyses. BMJ. 2012;344:d7202. doi: 10.1136/bmj.d7202.
4. Phillips KA. The intersection of biotechnology and pharmacogenomics: health policy implications. Health Aff (Millwood). 2006;25:1271–80. doi: 10.1377/hlthaff.25.5.1271.
5. Singh H, Giardina TD, Meyer AN, Forjuoh SN, Reis MD, Thomas EJ. Types and origins of diagnostic errors in primary care settings. JAMA Intern Med. 2013;173:418–25. doi: 10.1001/jamainternmed.2013.2777.
6. Van Den Bos J, Rustagi K, Gray T, Halford M, Ziemkiewicz E, Shreve J. The $17.1 billion problem: the annual cost of measurable medical errors. Health Aff (Millwood). 2011;30:596–603. doi: 10.1377/hlthaff.2011.0084.
7. Hayward RA, Asch SM, Hogan MM, Hofer TP, Kerr EA. Sins of omission: getting too little medical care may be the greatest threat to patient safety. J Gen Intern Med. 2005;20:686–91. doi: 10.1111/j.1525-1497.2005.0152.x.
8. Price CP. Evidence-based laboratory medicine: is it working in practice? Clin Biochem Rev. 2012;33:13–9.
9. Price CP. Evidence-based laboratory medicine: supporting decision-making. Clin Chem. 2000;46:1041–50.
10. McQueen MJ. Overview of evidence-based medicine: challenges for evidence-based laboratory medicine. Clin Chem. 2001;47:1536–46.
11. Hawkins RC. The Evidence Based Medicine approach to diagnostic testing: practicalities and limitations. Clin Biochem Rev. 2005;26:7–18.
12. Bossuyt PM, Reitsma JB, Bruns DE, Gatsonis CA, Glasziou PP, Irwig LM, et al. Towards complete and accurate reporting of studies of diagnostic accuracy: the STARD initiative. Clin Biochem. 2003;36:2–7. doi: 10.1016/s0009-9120(02)00443-5.
13. Whiting P, Rutjes AW, Dinnes J, Reitsma JB, Bossuyt PM, Kleijnen J. A systematic review finds that diagnostic reviews fail to incorporate quality despite available tools. J Clin Epidemiol. 2005;58:1–12. doi: 10.1016/j.jclinepi.2004.04.008.
14. Siddiqui MA, Azuara-Blanco A, Burr J. The quality of reporting of diagnostic accuracy studies published in ophthalmic journals. Br J Ophthalmol. 2005;89:261–5. doi: 10.1136/bjo.2004.051862.
15. Fontela PS, Pant Pai N, Schiller I, Dendukuri N, Ramsay A, Pai M. Quality and reporting of diagnostic accuracy studies in TB, HIV and malaria: evaluation using QUADAS and STARD standards. PLoS One. 2009;4:e7753. doi: 10.1371/journal.pone.0007753.
16. Côté A-M, Brown MA, Lam E, von Dadelszen P, Firoz T, Liston RM, et al. Diagnostic accuracy of urinary spot protein:creatinine ratio for proteinuria in hypertensive pregnant women: systematic review. BMJ. 2008;336:1003–6. doi: 10.1136/bmj.39532.543947.BE.
17. Morris RK, Riley RD, Doug M, Deeks JJ, Kilby MD. Diagnostic accuracy of spot urinary protein and albumin to creatinine ratios for detection of significant proteinuria or adverse pregnancy outcome in patients with suspected pre-eclampsia: systematic review and meta-analysis. BMJ. 2012;345:e4342. doi: 10.1136/bmj.e4342.
18. Thangaratinam S, Coomarasamy A, O’Mahony F, Sharp S, Zamora J, Khan KS, et al. Estimation of proteinuria as a predictor of complications of pre-eclampsia: a systematic review. BMC Med. 2009;7:10. doi: 10.1186/1741-7015-7-10.
19. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329:1017. doi: 10.1136/bmj.329.7473.1017.
20. Khan KS, Coomarasamy A. A hierarchy of effective teaching and learning to acquire competence in evidenced-based medicine. BMC Med Educ. 2006;6:59. doi: 10.1186/1472-6920-6-59.
21. Kulier R, Gülmezoglu AM, Zamora J, Plana MN, Carroli G, Cecatti JG, et al. Effectiveness of a clinically integrated e-learning course in evidence-based medicine for reproductive health training: a randomized trial. JAMA. 2012;308:2218–25. doi: 10.1001/jama.2012.33640.
22. Oude Rengerink K, Thangaratinam S, Barnfield G, Suter K, Horvath AR, Walczak J, et al. How can we teach EBM in clinical practice? An analysis of barriers to implementation of on-the-job EBM teaching and learning. Med Teach. 2011;33:e125–30. doi: 10.3109/0142159X.2011.542520.
23. Malick S, Das K, Khan KS. Tips for teaching evidence-based medicine in a clinical setting: lessons from adult learning theory. Part two. J R Soc Med. 2008;101:536–43. doi: 10.1258/jrsm.2008.080713.
24. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7:149–58. doi: 10.1136/qshc.7.3.149.
25. Grol R. Successes and failures in the implementation of evidence-based guidelines for clinical practice. Med Care. 2001;39:II46–54. doi: 10.1097/00005650-200108002-00003.
26. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients’ care. Lancet. 2003;362:1225–30. doi: 10.1016/S0140-6736(03)14546-1.
27. Wensing M, Bal R, Friele R. Knowledge implementation in healthcare practice: a view from The Netherlands. BMJ Qual Saf. 2012;21:439–42. doi: 10.1136/bmjqs-2011-000540.
28. Australian Medical Council. Good Medical Practice: A Code of Conduct for Doctors in Australia. Canberra: Australian Medical Council; 2009. Section 14.
29. General Medical Council. Good Medical Practice (2013). Domain 1: Knowledge, Skills and Performance. Great Britain: General Medical Council; 2013.
30. Davis D, Evans M, Jadad A, Perrier L, Rath D, Ryan D, et al. The case for knowledge translation: shortening the journey from evidence to effect. BMJ. 2003;327:33–5. doi: 10.1136/bmj.327.7405.33.
31. Davis D, O’Brien MA, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282:867–74. doi: 10.1001/jama.282.9.867.
32. Grimshaw JM, Russell IT. Effect of clinical guidelines on medical practice: a systematic review of rigorous evaluations. Lancet. 1993;342:1317–22. doi: 10.1016/0140-6736(93)92244-n.
33. Moscone F, Tosetti E, Costantini M, Ali M. The impact of scientific research on health care: Evidence from the OECD countries. Econ Model. 2013;32:325–32.
34. Das K, Malick S, Khan KS. Tips for teaching evidence-based medicine in a clinical setting: lessons from adult learning theory. Part one. J R Soc Med. 2008;101:493–500. doi: 10.1258/jrsm.2008.080712.
35. Albanese MA, Mitchell S. Problem-based learning: a review of literature on its outcomes and implementation issues. Acad Med. 1993;68:52–81. doi: 10.1097/00001888-199301000-00012.
36. Kulier R, Hadley J, Weinbrenner S, Meyerrose B, Decsi T, Horvath AR, et al. Harmonising evidence-based medicine teaching: a study of the outcomes of e-learning in five European countries. BMC Med Educ. 2008;8:27. doi: 10.1186/1472-6920-8-27.
37. Coppus SF, Emparanza JI, Hadley J, Kulier R, Weinbrenner S, Arvanitis TN, et al. A clinically integrated curriculum in evidence-based medicine for just-in-time learning through on-the-job training: the EU-EBM project. BMC Med Educ. 2007;7:46. doi: 10.1186/1472-6920-7-46.
38. Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A. Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomised controlled trial. BMC Med Educ. 2007;7:23. doi: 10.1186/1472-6920-7-23.
39. Davis J, Crabb S, Rogers E, Zamora J, Khan K. Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomized controlled trial. Med Teach. 2008;30:302–7. doi: 10.1080/01421590701784349.
40. Hadley J, Kulier R, Zamora J, Coppus SF, Weinbrenner S, Meyerrose B, et al. Effectiveness of an e-learning course in evidence-based medicine for foundation (internship) training. J R Soc Med. 2010;103:288–94. doi: 10.1258/jrsm.2010.100036.
41. Chretien KC, Azar J, Kind T. Physicians on Twitter. JAMA. 2011;305:566–8. doi: 10.1001/jama.2011.68.
42. Bahner DP, Adkins E, Patel N, Donley C, Nagel R, Kman NE. How we use social media to supplement a novel curriculum in medical education. Med Teach. 2012;34:439–44. doi: 10.3109/0142159X.2012.668245.
43. DiVall MV, Kirwin JL. Using Facebook to facilitate course-related discussion between students and faculty members. Am J Pharm Educ. 2012;76:32. doi: 10.5688/ajpe76232.
44. Deenadayalan Y, Grimmer-Somers K, Prior M, Kumar S. How to run an effective journal club: a systematic review. J Eval Clin Pract. 2008;14:898–911. doi: 10.1111/j.1365-2753.2008.01050.x.
45. Harris J, Kearley K, Heneghan C, Meats E, Roberts N, Perera R, et al. Are journal clubs effective in supporting evidence-based decision making? A systematic review. BEME Guide No. 16. Med Teach. 2011;33:9–23. doi: 10.3109/0142159X.2011.530321.
46. Taylor R, Reeves B, Ewings P, Binns S, Keast J, Mears R. A systematic review of the effectiveness of critical appraisal skills training for clinicians. Med Educ. 2000;34:120–5. doi: 10.1046/j.1365-2923.2000.00574.x.
47. Parkes J, Hyde C, Deeks J, Milne R. Teaching critical appraisal skills in health care settings. Cochrane Database Syst Rev. 2001;3:CD001270. doi: 10.1002/14651858.CD001270.
48. Flores-Mateo G, Argimon JM. Evidence based practice in postgraduate healthcare education: a systematic review. BMC Health Serv Res. 2007;7:119. doi: 10.1186/1472-6963-7-119.
49. Afifi Y, Davis J, Khan K, Publicover M, Gee H. The journal club: a modern model for better service and training. Obstet Gynecol. 2006;8:186–9.
50. American Association for Clinical Chemistry. Journal Club Articles. http://www.aacc.org/publications/clin_chem/journalclub/pages/default.aspx (Accessed 2 June 2013).
51. Mandavilli A. Peer review: Trial by Twitter. Nature. 2011;469:286–7. doi: 10.1038/469286a.
52. Leung EY, Tirlapur SA, Siassakos D, Khan KS. #BlueJC: BJOG and Katherine Twining Network collaborate to facilitate post-publication peer review and enhance research literacy via a Twitter journal club. BJOG. 2013;120:657–60. doi: 10.1111/1471-0528.12197.
53. Eysenbach G. Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact. J Med Internet Res. 2011;13:e123. doi: 10.2196/jmir.2012.
54. Chretien KC, Kind T. Social media and clinical care: ethical, professional, and social implications. Circulation. 2013;127:1413–21. doi: 10.1161/CIRCULATIONAHA.112.128017.
55. Cheston CC, Flickinger TE, Chisolm MS. Social media use in medical education: a systematic review. Acad Med. 2013;88:893–901. doi: 10.1097/ACM.0b013e31828ffc23.
56. Prasad K. Teaching evidence-based medicine in resource-limited countries. JAMA. 2012;308:2248–9. doi: 10.1001/jama.2012.74124.
57. Oude Rengerink K, Khan K, Horvath AR, Meyerrose B, Walczak J, Suter K, et al; EU EBM Unity. Who teaches the evidence-based medicine teacher? Med Teach. 2012;34:866. doi: 10.3109/0142159X.2012.716185.
58. Buckley S, Coleman J, Davison I, Khan KS, Zamora J, Malick S, et al. The educational effects of portfolios on undergraduate student learning: a Best Evidence Medical Education (BEME) systematic review. BEME Guide No. 11. Med Teach. 2009;31:282–98. doi: 10.1080/01421590902889897.
59. Tochel C, Haig A, Hesketh A, Cadzow A, Beggs K, Colthart I, et al. The effectiveness of portfolios for postgraduate assessment and education: BEME Guide No 12. Med Teach. 2009;31:299–318. doi: 10.1080/01421590902883056.
60. Shepard LA. The role of assessment in a learning culture. Educational Res. 2000;29:4–14.
61. Boud D, Falchikov N. Aligning assessment with long-term learning. Assess Eval High Educ. 2006;31:399–413.
62. Jennings SF. Personal development plans and self-directed learning for healthcare professionals: are they evidence based? Postgrad Med J. 2007;83:518–24. doi: 10.1136/pgmj.2006.053066.
63. Driessen E, van Tartwijk J, van der Vleuten C, Wass V. Portfolios in medical education: why do they meet with mixed success? A systematic review. Med Educ. 2007;41:1224–33. doi: 10.1111/j.1365-2923.2007.02944.x.
64. Dekker H, Driessen E, Ter Braak E, Scheele F, Slaets J, Van Der Molen T, et al. Mentoring portfolio use in undergraduate and postgraduate medical education. Med Teach. 2009;31:903–9. doi: 10.3109/01421590903173697.
65. Van Tartwijk J, Driessen EW. Portfolios for assessment and learning: AMEE Guide no. 45. Med Teach. 2009;31:790–801. doi: 10.1080/01421590903139201.
66. Coffield F, Moseley D, Hall E, Ecclestone K. Learning styles and pedagogy in post-16 learning: a systematic and critical review. London: Learning and Skills Research Centre; 2004.
67. Miller A, Archer J. Impact of workplace based assessment on doctors’ education and performance: a systematic review. BMJ. 2010;341:c5064. doi: 10.1136/bmj.c5064.
68. Duque G, Finkelstein A, Roberts A, Tabatabai D, Gold SL, Winer LR; Members of the Division of Geriatric Medicine, McGill University. Learning while evaluating: the use of an electronic evaluation portfolio in a geriatric medicine clerkship. BMC Med Educ. 2006;6:4. doi: 10.1186/1472-6920-6-4.
69. Lorenzo G, Ittelson J. An overview of e-portfolios. ELI Paper. 2005;1.
70. Pereira EA, Dean BJ. British surgeons’ experiences of mandatory online workplace-based assessment. J R Soc Med. 2009;102:287–93. doi: 10.1258/jrsm.2009.080398.
71. Tessmer M, Richey R. The role of context in learning and instructional design. Educ Technol Res Dev. 1997;45:85–115.
72. Roff S, McAleer S, Skinner A. Development and validation of an instrument to measure the postgraduate clinical learning and teaching educational environment for hospital-based junior doctors in the UK. Med Teach. 2005;27:326–31. doi: 10.1080/01421590500150874.
73. Hoff TJ, Pohl H, Bartfield J. Creating a learning environment to produce competent residents: the roles of culture and context. Acad Med. 2004;79:532–9. doi: 10.1097/00001888-200406000-00007.
74. Slotnick HB. Physicians’ learning strategies. Chest. 2000;118(Suppl):18S–23S. doi: 10.1378/chest.118.2_suppl.18s.
75. Bhandari M, Montori V, Devereaux PJ, Dosanjh S, Sprague S, Guyatt GH. Challenges to the practice of evidence-based medicine during residents’ surgical training: a qualitative study using grounded theory. Acad Med. 2003;78:1183–90. doi: 10.1097/00001888-200311000-00022.
76. Speck M. Best practice in professional development for sustained educational change. ERS Spectr. 1996;14:33–41.
77. Kulier R, Khan KS, Gulmezoglu AM, Carroli G, Cecatti JG, Germar MJ, et al. A cluster randomized controlled trial to evaluate the effectiveness of the clinically integrated RHL evidence-based medicine course. Reprod Health. 2010;7:8. doi: 10.1186/1742-4755-7-8.
78. Burgers JS, Grol R, Klazinga NS, Mäkelä M, Zaat J; AGREE Collaboration. Towards evidence-based clinical practice: an international survey of 18 clinical guideline programs. Int J Qual Health Care. 2003;15:31–45. doi: 10.1093/intqhc/15.1.31.
79. Horbar JD, Carpenter JH, Buzas J, Soll RF, Suresh G, Bracken MB, et al. Collaborative quality improvement to promote evidence based surfactant for preterm infants: a cluster randomised trial. BMJ. 2004;329:1004. doi: 10.1136/bmj.329.7473.1004.
80. Rosenbaum SE, Glenton C, Cracknell J. User experiences of evidence-based online resources for health professionals: user testing of The Cochrane Library. BMC Med Inform Decis Mak. 2008;8:34. doi: 10.1186/1472-6947-8-34.
81. Murthy L, Shepperd S, Clarke MJ, Garner SE, Lavis JN, Perrier L, et al. Interventions to improve the use of systematic reviews in decision-making by health system managers, policy makers and clinicians. Cochrane Database Syst Rev. 2012;9:CD009401. doi: 10.1002/14651858.CD009401.pub2.
82. Pinkney T, Bartlett D, Hawkins W, Mak T, Youssef H, Futaba K, et al. Reduction of surgical site infection using a novel intervention (ROSSINI): study protocol for a randomised controlled trial. Trials. 2011;12:217. doi: 10.1186/1745-6215-12-217.
83. Gheorghe A, Calvert M, Pinkney TD, Fletcher BR, Bartlett DC, Hawkins WJ, et al; West Midlands Research Collaborative; ROSSINI Trial Management Group. Systematic review of the clinical effectiveness of wound-edge protection devices in reducing surgical site infection in patients undergoing open abdominal surgery. Ann Surg. 2012;255:1017–29. doi: 10.1097/SLA.0b013e31823e7411.
84. Topping KJ. Trends in peer learning. Educ Psychol. 2005;25:631–45.
85. Malick S. Peer assisted learning for doctors. BMJ Careers. July 2005. http://careers.bmj.com/careers/advice/view-article.html?id=1014 (Accessed 2 June 2013).
86. Matsui K, Ban N, Fukuhara S, Shimbo T, Koyama H, Nakamura S, et al. Poor English skills as a barrier for Japanese health care professionals in learning and practising evidence-based medicine. Med Educ. 2004;38:1204. doi: 10.1111/j.1365-2929.2004.01973.x.
87. Letelier LM, Zamarin N, Andrade M, Gabrielli L, Caiozzi G, Viviani P, et al. Exploring language barriers to Evidence-based Health Care (EBHC) in post-graduate medical students: a randomised trial. Educ Health (Abingdon). 2007;20:82.
88. Chen FM, Bauchner H, Burstin H. A call for outcomes research in medical education. Acad Med. 2004;79:955–60. doi: 10.1097/00001888-200410000-00010.
89. Hemingway H, Croft P, Perel P, Hayden JA, Abrams K, Timmis A, et al; PROGRESS Group. Prognosis research strategy (PROGRESS) 1: a framework for researching clinical outcomes. BMJ. 2013;346:e5595. doi: 10.1136/bmj.e5595.
90. Cook DA, West CP. Perspective: Reconsidering the focus on “outcomes research” in medical education: a cautionary note. Acad Med. 2013;88:162–7. doi: 10.1097/ACM.0b013e31827c3d78.
91. Taylor R, Reeves B, Mears R, Keast J, Binns S, Ewings P, et al. Development and validation of a questionnaire to evaluate the effectiveness of evidence-based practice teaching. Med Educ. 2001;35:544–7. doi: 10.1046/j.1365-2923.2001.00916.x.
92. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H-H, Kunz R. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ. 2002;325:1338–41. doi: 10.1136/bmj.325.7376.1338.
93. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ. 2003;326:319–21. doi: 10.1136/bmj.326.7384.319.
94. Mi M, Moseley JL, Green ML. An instrument to characterize the environment for residents’ evidence-based medicine learning and practice. Fam Med. 2012;44:98–104.

Articles from The Clinical Biochemist Reviews are provided here courtesy of Australasian Association for Clinical Biochemistry and Laboratory Medicine