Abstract
Purpose
Behavioral and social science (BSS) competencies are needed to provide quality health care, but psychometrically validated measures to assess these competencies are difficult to find. Moreover, they have not been mapped to existing frameworks, like those from the Liaison Committee on Medical Education (LCME) and Accreditation Council for Graduate Medical Education (ACGME). This systematic review aimed to identify and evaluate the quality of assessment tools used to measure BSS competencies.
Method
The authors searched the literature published between January 2002 and March 2014 for articles reporting psychometric or other validity/reliability testing, using OVID, CINAHL, PubMed, ERIC, Research and Development Resource Base, SOCIOFILE, and PsycINFO. They reviewed 5,104 potentially relevant titles and abstracts. To guide their review, they mapped BSS competencies to existing LCME and ACGME frameworks. The final, included articles fell into three categories: instrument development articles, which were of the highest quality; educational research articles, of the second highest quality; and curriculum evaluation articles, of lower quality.
Results
Of the 114 included articles, 33 (29%) yielded strong evidence supporting tools to assess communication skills, cultural competence, empathy/compassion, behavioral health counseling, professionalism, and teamwork. Sixty-two (54%) articles yielded moderate evidence and 19 (17%) weak evidence. Articles mapped to all LCME standards and ACGME core competencies; the most common was communication skills.
Conclusions
These findings serve as a valuable resource for medical educators and researchers. More rigorous measurement validation and testing and more robust study designs are needed to understand how educational strategies contribute to BSS competency development.
In a 2004 report, the Institute of Medicine (IOM) concluded that, although 50% of the causes of premature morbidity and mortality are related to behavioral and social factors, medical school curricula in these areas are insufficient.1–3 The behavioral and social science (BSS) domains that the IOM deemed critical in its report included: (1) mind-body interactions in health and disease, (2) patient behavior, (3) physician role and behavior, (4) physician-patient interactions, (5) social and cultural issues in health care, and (6) health policy and economics.1 Within these six domains, the IOM identified 26 high-priority topics, such as health risk behaviors, principles of behavior change, ethics, physician well-being, communication skills, socioeconomic inequalities, and health care systems design.1 The Association of American Medical Colleges (AAMC) similarly identified core BSS content areas and connected them with other educational frameworks, including the Canadian Medical Education Directions for Specialists (CanMEDS) competency framework and the Accreditation Council for Graduate Medical Education (ACGME) core competencies.4
In addition, the Liaison Committee on Medical Education (LCME) incorporates BSS domains into its educational program requirements for accreditation5 and requires that schools identify the competencies in these areas that both the profession and the public can expect of a practicing physician. Medical schools must use both content and outcomes-based assessments to demonstrate their learners’ progress toward and achievement of these competencies. To do so, many schools use the broad ACGME core competencies--professionalism, medical knowledge, patient care, interpersonal skills and communication, systems-based practice, and practice-based learning and improvement.6 Within these six categories, BSS competencies are nested among other milestones intended to mark learners’ progression toward knowledge and skill acquisition. At present, no fully articulated, standardized list of BSS competencies exists, nor has there been a cross-translation of the LCME standards, the IOM-defined BSS domains, and the ACGME core competencies.
This lack of standardization makes it difficult to pool evaluation data collected across medical schools, which could help evaluate the effectiveness of different training models or instructional designs for BSS curricula. Moreover, determining the levels of achievement of entrustable professional activities or milestones7 as well as conducting rigorous educational research require that measures of competency development be validated. However, this important step is often skipped entirely, left incomplete, or performed without the rigor needed to produce reliable results. Given the breadth of the competency assessment literature and the existence of contradictory or incomplete findings, a systematic review of published work will be valuable to educators as well as administrators seeking to satisfy the LCME standards and instruct their learners in the ACGME core competencies.
Thus, we conducted a systematic review to identify and evaluate the quality of the assessment tools used to measure BSS competencies. Studies were classified by article type and quality. The strongest assessment tools were mapped to both the IOM-defined BSS domains and to the BSS-relevant LCME standards and ACGME core competencies. Our findings can guide educators and educational researchers to both validated instruments for assessing BSS competencies in learners and the best evaluation designs and educational strategies to determine what may be needed in future educational efforts.
Method
Guiding principles
We used the Best Evidence Medical and Health Professional Education Guide8 in our systematic review. As such, we created two review groups, one to conduct the actual review (P.A.C., R.T.P., M.F.M., E.K.T.) and a second to act as a wider authorship and editorial advisory group (S.E.E., D.K.L., F.E.B., C.R.T., A.L., J.M.S.). We next specified our research question: What valid and reliable instruments have been developed to assess learner (medical student and resident) competencies specifically related to the social and behavioral sciences? We considered instruments that may be applicable to other health professions learners as well. Subsequently, we identified a practical, conceptual framework to identify those competencies specifically related to the social and behavioral sciences that would be of the greatest utility to educators and administrators. To accomplish this step, we analyzed the LCME accreditation requirements,5 which are divided into five sections: (1) institutional setting (e.g., governance and organizational environment); (2) educational program for the MD degree (e.g., objectives, learning environment and approach, structure in design and content); (3) medical students (e.g., student demography, admissions, student services); (4) faculty (e.g., qualifications, personnel, organization and governance); and (5) educational resources (e.g., faculty background and time, finances and facilities). As quality assessments of BSS competencies are needed in graduate medical education as well, we also included the ACGME core competencies (professionalism, medical knowledge, patient care, interpersonal skills and communication, systems-based practice, and practice-based learning and improvement) in the development of our conceptual framework.
To focus our review, we selected components from the LCME’s Section II: Educational Program for the MD Degree (ED) and focused specifically on educational content. (The LCME standards provided more detail than the ACGME milestones, and thus we relied heavily on the LCME language as we refined our review.) We reviewed each of the content areas (ED-10 through ED-23) to identify those most relevant to the six IOM-defined BSS domains. Of the 13 possible components, we selected six BSS-relevant curriculum requirements (ED-10, ED-19 through ED-23) and three BSS-relevant integrative program requirements (ED-13 through ED-15), which provided the conceptual framework and core search terms for our literature review (see Table 1). We then weighted each selected LCME standard using a consensus process that included all authors but two (W.J.H., R.T.P.). The weights were assigned to reflect the strength of each standard’s relationship to each IOM-defined BSS domain, with no assigned weight indicating no relationship, + indicating a somewhat relevant relationship, ++ indicating a moderately relevant relationship, and +++ indicating a very relevant relationship.
Table 1.
Conceptual Frameworks, and Assigned Weights, Used in a Systematic Review of the Literature on Tools to Assess Behavioral and Social Science Competencies in Medical Educationa
| BSS-relevant LCME standards | Mind-body interactions in health & disease | Patient behavior | Physician-patient interactions | Physician role & behavior | Social & cultural issues in health care | Health policy & economics |
|---|---|---|---|---|---|---|
| Curriculum requirements: The curriculum of a medical education program must… | | | | | | |
| ED-10. Include behavioral and socioeconomic subjects in addition to basic science and clinical disciplines. | + | ++ | +++ | +++ | ||
| ED-19. Include specific instruction in communication skills as they relate to physician responsibilities, including communication with patients and their families, colleagues, and other health professionals. | +++ | +++ | ||||
| ED-20. Prepare medical students for their role in addressing the medical consequences of common societal problems (e.g., provide instruction in the diagnosis, prevention, appropriate reporting, and treatment of violence and abuse). | +++ | +++ | ++ | +++ | ++ | |
| ED-21. The faculty and medical students of a medical education program must demonstrate an understanding of the manner in which people of diverse cultures and belief systems perceive health and illness and respond to various symptoms, diseases, and treatments. | ++ | +++ | +++ | +++ | +++ | ++ |
| ED-22. The medical students in a medical education program must learn to recognize and appropriately address gender and cultural biases in themselves, in others, and in the process of health care delivery. | + | ++ | ++ | +++ | +++ | |
| ED-23. Include instruction in medical ethics and human values and require its medical students to exhibit scrupulous ethical principles in caring for patients and in relating to patients’ families and to others involved in patient care. | ++ | +++ | +++ | |||
| Integrative program requirements: The curriculum of a medical education program must… | | | | | | |
| ED-13. Cover all organ systems, and include the important aspects of preventive, acute, chronic, continuing, rehabilitative, and end-of-life care. | + | + | + | + | + | |
| ED-14. Include clinical experience in primary care. | + | + | + | + | + | + |
| ED-15. Prepare students to enter any field of graduate medical education and include content and clinical experiences related to each phase of the human life cycle that will prepare students to recognize wellness, determinants of health, and opportunities for health promotion; recognize and interpret symptoms and signs of disease; develop differential diagnoses and treatment plans; and assist patients in addressing health-related issues involving all organ systems. | + | ++ | ++ | +++ | +++ | + |
Abbreviations: LCME indicates Liaison Committee on Medical Education; BSS, behavioral and social science; IOM, Institute of Medicine.
aWeights were assigned to reflect the strength of each LCME standard’s relationship to each IOM domain, with no assigned weight indicating no relationship, + indicating a somewhat relevant relationship, ++ indicating a moderately relevant relationship, and +++ indicating a very relevant relationship.
Search terms
We conducted a preliminary search for articles published between January 1, 2002 and March 1, 2014 using the databases OVID (Medline), CINAHL, PubMed, ERIC, Research and Development Resource Base (RDRB), SOCIOFILE, and PsycINFO. With guidance from a library science expert, we chose search terms that included: education, curriculum, course evaluation, students, teaching, competence, and program evaluation. These terms were further combined with the selected BSS-relevant LCME standards and the IOM-defined BSS domain keywords. See Supplemental Digital Appendix 1 (at [LWW INSERT URL]) for a sample search strategy with the limits and quotations used to search the OVID (Medline) database.
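To make the structure of such a query concrete, the sketch below shows one hypothetical way core education terms could be combined with domain keywords using Boolean operators. The term lists and combination logic here are illustrative assumptions only; the authors’ actual strategy, with its database-specific limits, appears in Supplemental Digital Appendix 1.

```python
# Hypothetical sketch of a Boolean search string for a bibliographic
# database; not the authors' actual strategy (see Supplemental Digital
# Appendix 1). Term lists are illustrative, not exhaustive.

core_terms = ["education", "curriculum", "course evaluation", "students",
              "teaching", "competence", "program evaluation"]
domain_terms = ["communication skills", "cultural competence",
                "medical ethics", "health policy"]  # illustrative subset

def or_block(terms):
    """Quote each term and join the list into a parenthesized OR block."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

# Require at least one education term AND at least one BSS domain term.
query = f"{or_block(core_terms)} AND {or_block(domain_terms)}"
print(query)
```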
Inclusion/Exclusion criteria
We sought to include articles reporting on some form of validity or reliability testing in more than one learning setting for BSS competency assessment measures. When articles were identified, we reviewed their reference lists for additional articles to consider. Two specially trained research assistants independently reviewed all titles and abstracts manually to assess appropriateness for inclusion. The two research assistants and one author (P.A.C.) initiated a consensus process, which continued until agreement among the group was reached on inclusion and exclusion according to title and abstract. Figure 1 outlines the process we undertook to search for and ultimately identify the final articles for detailed review. We excluded articles that did not cover the BSS competencies, that reported solely on learners’ satisfaction or self-reported or self-assessed competency, or that did not describe some form of validation of the assessment instrument.
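Restated schematically, these criteria act as three independent screens, each sufficient for exclusion on its own. The sketch below encodes that logic with illustrative field names; in the study itself, the screening was performed by hand by the trained research assistants.

```python
# Schematic restatement of the exclusion criteria; field names are
# illustrative assumptions, and the actual screening was done manually.
def passes_screen(article: dict) -> bool:
    """Keep an article only if it covers a BSS competency, reports more
    than learner satisfaction or self-assessment, and describes some
    form of validation of the assessment instrument."""
    return (article["covers_bss_competency"]
            and not article["self_report_only"]
            and article["reports_validation"])

# Example: an article whose outcomes are satisfaction-only is excluded.
print(passes_screen({"covers_bss_competency": True,
                     "self_report_only": True,
                     "reports_validation": True}))  # False
```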
Figure 1.
Literature search and article selection process for a systematic review of the literature, published between January 2002 and March 2014, on assessment tools used to evaluate behavioral and social science competencies in medical education. aPrimary reasons for rejection at title and abstract review included: (1) lack of reporting on psychometric properties or validity or reliability testing in more than one learning setting; (2) measures that did not assess learner competency in one of the selected areas; (3) results that were based solely on learners’ satisfaction or self-reported or self-assessed competency; or (4) the curriculum being tested did not address the behavioral and social sciences (e.g., it focused on anatomy or surgical skills). bSome articles were rejected after partial data abstraction for multiple reasons and therefore were counted twice here.
Methods for data abstraction
The review group (P.A.C., R.T.P., M.F.M., E.K.T.) created an abstraction form using the following variables: the type of article, how it was found, whether the article described a BSS learner competency and one or more measures of that competency, the quality of the instrument (whether the study described a form of validation of the instrument(s) used), whether institutional review board (IRB) review was mentioned, the type of study, the site of the study, learner level of the participants, curriculum specialty, the BSS or competency measurement framework used, the curriculum format tested and for how many hours, how the competency was assessed, what was measured and when, and our classification of the strength of the instrument testing (as described below). The data abstraction form was tested with approximately 30 articles and was iteratively revised and retested to ensure that data capture during the abstraction process was accurate and that only applicable studies would be included. Selected members of the advisory group (F.E.B., A.L., J.M.S.) provided feedback and contributed to the consensus process as needed.
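As a rough illustration of the form’s structure, the sketch below renders these variables as a single typed record; the field names are our paraphrases of the variables listed above, not the authors’ actual form items. A shared record like this is one way to keep abstractions comparable across reviewers.

```python
# Schematic rendering of the abstraction form as a data record; field
# names are paraphrases of the variables listed in the text, not the
# authors' actual form items.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AbstractionRecord:
    article_type: str                     # instrument development, educational research, or curriculum evaluation
    how_found: str                        # database search vs. reference list
    bss_competency_described: bool        # describes a BSS learner competency and measure(s)?
    instrument_validation: Optional[str]  # form of validation described, if any
    irb_mentioned: bool
    study_type: str
    study_site: str
    learner_level: str                    # e.g., medical students, residents
    curriculum_specialty: str
    measurement_framework: Optional[str]  # BSS or competency measurement framework used
    curriculum_format: str
    curriculum_hours: Optional[float]
    assessment_method: str                # how the competency was assessed
    outcome_measured: str                 # what was measured and when
    evidence_strength: str                # weak, moderate, or strong (as described below)
```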
Methods for assessing instrument quality and study design
We focused both on previously validated BSS competency assessment instruments and on new instruments validated within the included article. Assessing the evidence derived from the included articles necessarily involved commingling assessments of the strength of the instrument itself and of the strength of the study design, as studies rarely focused on just one of these features. For example, a high-quality article was one that applied a validated BSS instrument (either from the published literature or the included article) using a rigorous study design, such as a randomized controlled trial. A low-quality article was one that applied an unvalidated measure of BSS competency and used a weak study design to measure the impact of the educational intervention, such as a post-intervention survey of student satisfaction.
We categorized the level of evidence supporting each BSS competency assessment instrument and study design as weak, moderate, or strong. The weak evidence category included studies containing limited information on the validity and/or reliability of the evaluation instrument or using a weak study design, such as a single group pre-post design. The moderate evidence category included studies that provided some information about the reliability of the measures used but did not assess them rigorously or retest them in the study sample, or that had a moderately strong study design, such as a single group historical cohort assessment. The strong evidence category included studies in which the evaluation instruments were tested rigorously in the study population and a strong study design was used, such as a randomized controlled or crossover trial design.
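Read as a decision rule, the rubric combines instrument rigor and design strength roughly as follows. This is only a schematic restatement under our reading of the text; the authors assigned categories by consensus, not by an algorithm.

```python
# Schematic restatement of the evidence rubric; the authors assigned
# categories by consensus, so this decision rule is only a sketch.
def evidence_category(instrument_rigor: str, design_strength: str) -> str:
    """Both inputs take the values 'weak', 'moderate', or 'strong'."""
    if instrument_rigor == "strong" and design_strength == "strong":
        # Instrument tested rigorously in the study population AND a
        # strong design such as a randomized controlled or crossover trial.
        return "strong"
    if instrument_rigor == "weak" or design_strength == "weak":
        # Limited validity/reliability information, or a weak design
        # such as a single group pre-post study.
        return "weak"
    # In between: some reliability information, or a moderately strong
    # design such as a single group historical cohort assessment.
    return "moderate"

print(evidence_category("strong", "moderate"))  # moderate
```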
Methods for article categorization, data entry, and analysis
Articles identified for data abstraction were classified into three categories: (1) instrument development with psychometric assessment only, defined as articles devoted to the statistical validation of a new or existing competency tool, such as a measure of physician empathy; (2) educational research, defined as articles that used a specific study design and BSS competency assessment tool to draw conclusions about a defined educational research question; and (3) curriculum evaluation, defined as articles that assessed specific curriculum features.
Three authors (P.A.C., R.T.P., M.F.M.) independently abstracted data from all articles that met the review criteria and then employed a rigorous consensus process for determining the final content of all abstractions during weekly consensus meetings. At these meetings, the variables from each article were discussed until consensus was reached. In one instance, the three authors could not come to a consensus. In this case, the larger advisory group was consulted for a final decision. The final consensus-based abstraction forms were entered into a database designed for this purpose. The data files were then checked and cleaned prior to analysis.
The Web-based system we used for database entry was a free and open-source application (LimeSurvey; Carsten Schmitz, Hamburg, Germany; https://www.limesurvey.org/en/), which was run on a secure and private server hosted at Oregon Health & Science University. Access to the system was limited to team members only, and its use allowed us to easily confirm which team members were reviewing which articles. Descriptive statistics were used to characterize the included articles (SPSS, IBM Corp., Armonk, NY).
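As a minimal sketch of the descriptive analysis step, assume the cleaned consensus abstractions were exported as a flat file. The authors used SPSS, so the pandas version below is an illustrative stand-in; the file name and column names are hypothetical.

```python
# Minimal sketch of the descriptive summary step; the authors used SPSS,
# so this pandas version is an illustrative stand-in. The file name and
# column names are hypothetical.
import pandas as pd

records = pd.read_csv("bss_abstractions.csv")

# Frequency of each article category, then the distribution of evidence
# strength (weak/moderate/strong) within each category.
print(records["article_type"].value_counts())
print(pd.crosstab(records["article_type"], records["evidence_strength"]))
```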
Results
Our initial literature review identified 5,104 study titles and abstracts, many of which did not meet our criteria for further review (see Figure 1). Detailed title and abstract review along with searches of reference lists yielded 170 articles that we retrieved for full text review and data abstraction. Of these, we categorized 21 studies as instrument development with psychometric assessment only, 62 as educational research, and 87 as curriculum evaluation (see Supplemental Digital Appendix 2 at [LWW INSERT URL]). A more complete review during data abstraction revealed that 114 met our criteria for full abstraction (see Supplemental Digital Appendix 2).9–122 At the partial abstraction stage, most article exclusions occurred because instrument validation was absent; this exclusion was most common among articles in the curriculum evaluation category. Other exclusions occurred because the article or assessment tool described did not actually address a BSS competency.
More than 70% of articles (13 of 20 instrument development studies, 35 of 48 educational research studies, and 36 of 46 curriculum evaluation studies) mentioned IRB review, with most receiving approval or exemption (see Supplemental Digital Appendix 2). Randomized study designs with or without controls were most common among educational research studies (23 of 48, 48%) compared with instrument development studies (1 of 20, 5%) and curriculum evaluation studies (0 of 46, 0%), while prospective cohort pre-post designs were most common among curriculum evaluation studies (24 of 46, 52%) compared with educational research studies (6 of 48, 13%) and instrument development studies (1 of 20, 5%) (see Supplemental Digital Appendix 2). Validation using formal psychometric assessment was most common among instrument development studies (19 of 20, 95%) and educational research studies (25 of 48, 52%) compared with curriculum evaluation studies (17 of 46, 37%). We noted significant variability in the BSS frameworks and competency measures used to guide or evaluate the assessment instruments (see Supplemental Digital Appendix 2).
The most common BSS learner competency assessed across all types of articles was communication skills (see Supplemental Digital Appendix 3 at [LWW INSERT URL]). Cultural competence and behavior change counseling (which included motivational interviewing) also were commonly assessed, especially in educational research and curriculum evaluation studies. Using the ACGME competency language, interpersonal skills and communication (in > 90% of included articles), patient care (> 62% of articles), and medical knowledge (> 43% of articles) were most commonly assessed, with practice-based learning and improvement (≤ 10% of articles) and systems-based practice (≤ 10% of articles) less commonly assessed (see Supplemental Digital Appendix 3).
Validated instruments that assessed knowledge, attitudes, and skills were most commonly used to evaluate BSS competencies (65%–85%), with standardized patients assessing learners’ performance being the second most common approach (30%–44%) (see Supplemental Digital Appendix 3). Very few assessments were based on the direct observation of learners. Articles reporting on psychometric assessments typically provided strong evidence for the validity of the instrument (52%–90%); 16 articles mentioned testing without specifying the validation method used. Validation by expert consensus was also reported (15%–42%), though less often than psychometric assessment.
We ranked 33 articles (29%) as contributing strong evidence to support BSS competency measures of communication skills, cultural competence, empathy/compassion, behavioral health counseling, professionalism, and teamwork. Most of these were educational research studies (see Supplemental Digital Appendix 3). In Appendix 1, we present the tools we found to have the strongest evidence for validity and reliability as well as those with strong study or evaluation designs. We also map these tools to both the LCME standards and the ACGME core competencies.
In Supplemental Digital Appendix 4, we provide additional details regarding the included articles. In Supplemental Digital Appendices 5 and 6, we describe the 62 articles (54%) that yielded moderate evidence in support of a BSS assessment tool and the 19 articles (17%) that yielded weak evidence, respectively. In Supplemental Digital Appendix 7, we map these articles to the BSS-relevant LCME standards. The majority (n = 65) mapped to the communication skills standard (ED-19), though all LCME accreditation requirements specific to or integrated with BSS competencies are represented in the included articles. Not all articles mapped to the IOM domains, however, with mind-body interactions in health and disease and health policy and economics being represented least often. All supplemental digital content is available at [LWW INSERT URL].
Discussion
This systematic review is the first to identify valid and reliable instruments that have been developed to assess learner competencies specifically related to the social and behavioral sciences. Our aim was to provide the greatest utility to educators and administrators by linking these instruments with the LCME accreditation requirements and the ACGME core competencies. We learned that tools assessing communication skills were supported by the most rigorous validation and study design approaches. These tools included both written tests assessing knowledge, attitudes, and skills as well as assessments conducted with standardized patients. Overall, we found a paucity of assessments that used the direct observation of learners interacting with actual patients. Although such approaches are time and resource intensive, several articles support the value of direct observation in assessing learner competencies.123–126
Other high-quality assessments evaluated cultural competence, empathy/compassion, behavior change counseling (e.g., motivational interviewing), and professionalism. However, only one high-quality assessment tool, described in a 2008 article, evaluated teamwork. As the national interprofessional education center127 has plans to conduct more rigorous instrument development and validation, additional work in this area might be forthcoming. We recommend that educators and educational researchers review the literature for established, validated tools to assess BSS competencies in their learners rather than reinventing the wheel. We found several well-validated tools that were used in only one study.
One of the most significant challenges in completing this review was distinguishing between the strength of the assessment instruments and the strength of the study designs. For example, even a very strong tool could not compensate for an evaluation design too weak to support strong conclusions from the study findings. The strongest articles used well-validated tools combined with robust evaluation designs, such as randomized designs or historical cohort comparisons. We also included several rigorous qualitative studies in this review. These studies utilized strong qualitative research methodologies and well-validated instruments. Conversely, moderate and weak articles used less rigorous approaches to instrument validation, and they had weak study designs that limited the conclusions that could be drawn. Not surprisingly, we found the most rigorous assessments in articles that described robust instrument development and testing. While educational research articles were also likely to apply rigorous study designs, their validation approaches were not always as robust as those described in instrument development articles. This finding is worrisome as readers may draw conclusions from educational research that employs a strong evaluation design, when in reality the design is only as good as the measures used.
Even more concerning is our finding that curriculum assessment studies were the least likely to include validated instruments and frequently used weak research methods. Researchers cannot generate strong evidence for curricular approaches if the evaluation designs or assessment measures they use are sub-optimal. Thus, an important finding from our work is the need for the use of well-validated instruments in quantitative and qualitative studies that represent both educational research and curriculum evaluation. One way to address this issue is to encourage medical school faculty to partner with investigators in either the school of education or public health/community medicine who have more experience with validating and using rigorous instruments and evaluation designs. Efforts to improve the dissemination of validated instruments and study strategies to promote their adoption also could prove beneficial.
The strengths of our study include the rigor with which we approached the consensus process across each phase of the review as well as the detailed information we abstracted from the articles that met our inclusion criteria. This process allowed us to consider not only the strength of the evidence for each included assessment tool but also to map specific studies and instruments to both the LCME accreditation requirements and the ACGME core competencies. By organizing our data in this way, we were able to provide a quick reference for educators who are looking for well-validated instruments to measure medical student competencies in the social and behavioral sciences at their own institutions.
Our systematic review has a number of limitations that arise from the breadth of the topic area, the lack of specificity in describing BSS competencies, and the related but distinctly different frameworks of the IOM, ACGME, and LCME. We identified the quality of the BSS tools and evaluation designs used in studies that were specific to different learner populations, such as undergraduate medical students. Nuances between the IOM, ACGME, and LCME frameworks should be taken into account when applying our findings from one distinct learner population to another. While these nuances do exist, we also feel that the universality of the BSS competencies, as well as the need to assess them rigorously, outweighs any variance in learner level, and thus our findings can be of use in all learner populations. In addition, due to the breadth of the topic area and lack of specificity of the BSS competencies, the search terms we used (and their various Boolean operators) were complex and could be difficult to replicate. Although we searched the CINAHL, PsycINFO, and ERIC databases, our use of the IOM, ACGME, and LCME frameworks in data abstraction might have caused us to over-rely on the medical education literature. We did not include the EMBASE database, truncated search terms, or wildcards, which also limited our search. Next, we determined the quality scores by consensus using a subjective approach to assign articles to the strong, moderate, and weak categories. This process was challenging at times as some articles described high-quality instruments but weak study designs that affected our weighting of the evidence, while others described strong study designs but weak instruments that similarly affected our weighting. Finally, with the growth of peer evaluation and an emphasis on critical reflection in medical school curricula, we may have missed an important body of research because we excluded studies of self-reported competencies in the BSS domains; future reviews should consider addressing this gap.
In conclusion, we abstracted data from 114 articles, after reviewing a total of 5,104 identified studies. Of these, 33 (29%) yielded strong evidence to support BSS assessment tools that evaluated communication skills, cultural competence, empathy/compassion, behavioral health counseling, professionalism, and teamwork. Sixty-two (54%) articles yielded moderate evidence, and 19 (17%) yielded weak evidence. In the future, more rigorous validation and testing of assessment tools as well as more robust evaluation designs are needed in both educational research and curriculum assessment. At the same time, the conceptual and content domains of BSS pedagogy deserve similar, careful definition and increased specificity so educators can better assess medical student competencies in areas such as population health and social inequalities and their influence on health status, particularly with regard to gender, race/ethnicity, socioeconomic resources, and the social organization of health care.
Acknowledgements
The authors gratefully acknowledge the assistance of Claire Diener, summer student, Oregon Health & Science University, in conducting this literature review.
Funding/Support: The following institutions received grant funding to support this work: Oregon Health & Science University--National Cancer Institute 1R25CA158571-01, 5R25CA158571-02; Indiana University--National Institute of Arthritis and Musculoskeletal and Skin Diseases 5R25AR060994; University of California, San Francisco--National Institutes of Health (NIH) Office of Behavioral and Social Sciences Research (OBSSR) and National Center for Complementary and Integrative Health (NCCIH) R25AT006573; University of North Carolina at Chapel Hill--NIH OBSSR and NCCIH 5R25HD068722-02; and Baylor University--NIH OBSSR and NCCIH 5R25HL108183-04.
Appendix 1
Details of the Assessment Tools with “Strong” Evidence of Validity or Reliability Included in a Systematic Review of the Literature on Tools to Assess Behavioral and Social Science Competencies in Medical Education, 2002–2014 (n = 33)
| Article first author, year | Assessment tool | Framework | Validation methodology | Type of study | Learner level | LCME standarda | ACGME core competency |
|---|---|---|---|---|---|---|---|
| Communication skills competencies (n = 17) | | | | | | | |
| Bosse, 20129 | | | | Educational research | Medical students | ED-19, ED-21 | |
| Daeppen, 201210 | | | | Educational research | Medical students | ED-10, ED-19, ED-20 | |
| Gallagher, 200511 | | | | Instrument development | Medical students | ED-10, ED-19 | |
| Guiton, 200412 | | | | Instrument development | Medical students | ED-10, ED-19 | |
| Huntley, 201213 | | | | Instrument development | Medical students | ED-19 | |
| Iramaneerat, 200914 | | | | Educational research | Residents | ED-19, ED-20, ED-23 | |
| Jensen, 201015 | | | | Instrument development | Practicing physicians | ED-19 | |
| Jensen, 201116 | | | | Educational research | Practicing physicians | ED-19 | |
| Joshi, 200417 | | | | Educational research | Residents | ED-19 | |
| Krupat, 200618 | | | | Instrument development | Residents, practicing physicians | ED-19 | |
| Lim, 201119 | | | | Educational research | Medical students | ED-10, ED-19, ED-20 | |
| Lurie, 200820 | | | | Instrument development | Medical students | ED-19, ED-21, ED-22 | |
| Moulton, 200921 | | | | Educational research | Medical students, residents | ED-19 | |
| Rees, 200222 | | | | Educational research | Medical students | ED-19 | |
| Scheffer, 200823 | | | | Educational research | Medical students | ED-19 | |
| Wouda, 201224 | | | | Educational research | Residents | ED-19 | |
| Yedidia, 200325 | | | | Curriculum development | Medical students | ED-10, ED-19 through ED-23, ED-14 | |
| Cultural competence/patient-centered care competencies (n = 3) | | | | | | | |
| Crosson, 200426 | | | | Curriculum development | Medical students | ED-10, ED-19, ED-21, ED-22, ED-14 | |
| Kirby, 201127 | | | | Educational research | Medical students | ED-19, ED-22 | |
| Wilkerson, 201028 | | | | Educational research | Medical students | ED-10, ED-19, ED-21 | |
| Empathy/Compassion (n = 5) | | | | | | | |
| Austin, 200729 | | | | Educational research | Medical students | ED-19 | |
| Fields, 201130 | | | | Instrument development | Nursing students | ED-19 | |
| Hojat, 200231 | | | | Instrument development | Practicing physicians | ED-19, ED-21 | |
| Peterson, 201232 | | | | Instrument development | Medical students | ED-19, ED-23 | |
| Shapiro, 200433 | | | | Educational research | Medical students | ED-19, ED-21, ED-22 | |
| Behavioral health counseling competencies (n = 4) | | | | | | | |
| Mounsey, 200634 | | | | Educational research | Medical students | ED-10, ED-19, ED-20 | |
| Prochaska, 201235 | | | | Educational research | Medical students | ED-10, ED-19, ED-20 | |
| Spollen, 201036 | | | | Educational research | Medical students | ED-10, ED-19, ED-15 | |
| Truncali, 201137 | | | | Educational research | Medical students | ED-10, ED-19, ED-20 | |
| Professionalism competencies (n = 3) | | | | | | | |
| Crossley, 200938 | | | | Instrument development | Medical students | ED-19, ED-21, ED-22, ED-23 | |
| De Haes, 200539 | | | | Instrument development | Medical students | ED-19, ED-22 | |
| Noble, 200740 | | | | Curriculum development | Medical students | ED-19, ED-20, ED-23 | |
| Teamwork competencies (n = 1) | | | | | | | |
| Youngblood, 200841 | | | | Educational research | Medical students, residents | ED-19 | |
Abbreviations: LCME indicates Liaison Committee on Medical Education; ACGME, Accreditation Council for Graduate Medical Education; KAS, knowledge, attitudes, and skills; SP, standardized patient; PCC, patient-centered care; MITI, motivational interviewing treatment integrity; UIC, University of Illinois at Chicago; SADP, Scale of Attitudes Toward Disabled Persons; WC, wheelchair; SEGUE, set the stage, elicit information, give information, understand the patients perspective, and end the encounter; OSCE, objective structured clinical exam.
aSee Table 1 for a list of the LCME standards.
Footnotes
Supplemental digital content for this article is available at [LWW INSERT URL].
Other disclosures: None reported.
Ethical approval: Reported as not applicable.
Contributor Information
Patricia A. Carney, family medicine and of public health & preventive medicine, Oregon Health & Science University School of Medicine, Portland, Oregon.
Ryan T. Palmer, family medicine, Oregon Health & Science University School of Medicine, Portland, Oregon.
Marissa Fuqua Miller, Department of Family Medicine, Oregon Health & Science University School of Medicine, Portland, Oregon.
Erin K. Thayer, Department of Family Medicine, Oregon Health & Science University School of Medicine, Portland, Oregon.
Sue E. Estroff, Department of Social Medicine, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, North Carolina.
Debra K. Litzelman, Medicine and senior director for research in health professions education and practice, Indiana University School of Medicine, Indianapolis, Indiana.
Frances E. Biagioli, family medicine, Oregon Health & Science University School of Medicine, Portland, Oregon.
Cayla R. Teal, Department of Medicine, and director, Educational Evaluation and Research, Office of Undergraduate Medical Education, Baylor College of Medicine, Houston, Texas.
Ann Lambros, social sciences & health policy, Wake Forest School of Medicine, Winston-Salem, North Carolina.
William J. Hatt, Department of Family Medicine, Oregon Health & Science University School of Medicine, Portland, Oregon.
Jason M. Satterfield, clinical medicine, University of California, San Francisco, School of Medicine, San Francisco, California.
References
- 1.Institute of Medicine. Improving Medical Education: Enhancing the Behavioral and Social Science Content of Medical School Curricula. Washington, DC: The National Academies Press; 2004. [PubMed] [Google Scholar]
- 2.McGinnis JM, Foege WH. Actual causes of death in the United States. JAMA. 1993;270:2207–2212. [PubMed] [Google Scholar]
- 3.Centers for Disease Control and Prevention. Behavioral Risk Factor Surveillance System Prevalence Data. [Accessed November 13, 2015];2010 http://www.cdc.gov/brfss/annual_data/annual_2010.htm#information.
- 4.Association of American Medical Colleges Behavioral and Social Science Expert Panel. Washington, DC: AAMC; 2011. [Accessed November 13, 2015]. Behavioral and Social Science Foundations for Future Physicians. https://www.aamc.org/download/271020/data/behavioralandsocialsciencefoundationsforfuturephysicians.pdf. [Google Scholar]
- 5.Liaison Committee on Medical Education. Washington, DC: LCME; 2013. [Accessed December 7, 2015]. Standards for Accreditation of Medical Education Programs Leading to the M.D. Degree. http://www.lcme.org/publications/functions.pdf. [Google Scholar]
- 6.Accreditation Council for Graduate Medical Education (ACGME) Core Competency Definitions. Greensboro Area Heath Education Center. [Accessed December 7, 2015]; http://www.gahec.org/CME/Liasions/0)ACGME%20Core%20Competencies%20Definitions.htm. [Google Scholar]
- 7.Holmboe E, Carraccio C. The Horizon in Medical Education: From Milestones to EPAs to a New Accreditation System. [Accessed December 7, 2015];Health Resources and Services Administration webinar. http://bhpr.hrsa.gov/grants/medicine/technicalassistance/medicaleducation.pdf. [Google Scholar]
- 8.Hammick M, Dornan T, Steinert Y. Conducting a best evidence systematic review. BEME Guide. No. 13. Med Teach. 2010;32:3–15. doi: 10.3109/01421590903414245. [DOI] [PubMed] [Google Scholar]
- 9.Bosse HM, Schultz JH, Nickel M, et al. The effect of using standardized patients or peer role play on ratings of undergraduate communication training: A randomized controlled trial. Patient Educ Couns. 2012;87:300–306. doi: 10.1016/j.pec.2011.10.007. [DOI] [PubMed] [Google Scholar]
- 10.Daeppen JB, Fortini C, Bertholet N, et al. Training medical students to conduct motivational interviewing: A randomized controlled trial. Patient Educ Couns. 2012;87:313–318. doi: 10.1016/j.pec.2011.12.005. [DOI] [PubMed] [Google Scholar]
- 11.Gallagher TJ, Hartung PJ, Gerzina H, Gregory SW, Jr, Merolla D. Further analysis of a doctor-patient nonverbal communication instrument. Patient Educ Couns. 2005;57:262–271. doi: 10.1016/j.pec.2004.06.008. [DOI] [PubMed] [Google Scholar]
- 12.Guiton G, Hodgson CS, Delandshere G, Wilkerson L. Communication skills in standardized-patient assessment of final-year medical students: A psychometric study. Adv Health Sci Educ Theory Pract. 2004;9:179–187. doi: 10.1023/B:AHSE.0000038174.87790.7b. [DOI] [PubMed] [Google Scholar]
- 13.Huntley CD, Salmon P, Fisher PL, Fletcher I, Young B. LUCAS: A theoretically informed instrument to assess clinical communication in objective structured clinical examinations. Med Educ. 2012;46:267–276. doi: 10.1111/j.1365-2923.2011.04162.x. [DOI] [PubMed] [Google Scholar]
- 14.Iramaneerat C, Myford CM, Yudkowsky R, Lowenstein T. Evaluating the effectiveness of rating instruments for a communication skills assessment of medical residents. Adv Health Sci Educ Theory Pract. 2009;14:575–594. doi: 10.1007/s10459-008-9142-2. [DOI] [PubMed] [Google Scholar]
- 15.Jensen BF, Gulbrandsen P, Benth JS, Dahl FA, Krupat E, Finset A. Interrater reliability for the Four Habits Coding Scheme as part of a randomized controlled trial. Patient Educ Couns. 2010;80:405–409. doi: 10.1016/j.pec.2010.06.032. [DOI] [PubMed] [Google Scholar]
- 16.Jensen BF, Gulbrandsen P, Dahl FA, Krupat E, Frankel RM, Finset A. Effectiveness of a short course in clinical communication skills for hospital doctors: Results of a crossover randomized controlled trial. Patient Educ Couns. 2011;84:163–169. doi: 10.1016/j.pec.2010.08.028. [DOI] [PubMed] [Google Scholar]
- 17.Joshi R, Ling FW, Jaeger J. Assessment of a 360-degree instrument to evaluate residents’ competency in interpersonal and communication skills. Acad Med. 2004;79:458–463. doi: 10.1097/00001888-200405000-00017. [DOI] [PubMed] [Google Scholar]
- 18.Krupat E, Frankel R, Stein T, Irish J. The Four Habits Coding Scheme: Validation of an instrument to assess clinicians’ communication behavior. Patient Educ Couns. 2006;62:38–45. doi: 10.1016/j.pec.2005.04.015. [DOI] [PubMed] [Google Scholar]
- 19.Lim BT, Moriarty H, Huthwaite M. “Being-in-role”: A teaching innovation to enhance empathic communication skills in medical students. Med Teach. 2011;33:e663–e669. doi: 10.3109/0142159X.2011.611193. [DOI] [PubMed] [Google Scholar]
- 20.Lurie SJ, Mooney CJ, Nofziger AC, Meldrum SC, Epstein RM. Further challenges in measuring communication skills: Accounting for actor effects in standardised patient assessments. Med Educ. 2008;42:662–668. doi: 10.1111/j.1365-2923.2008.03080.x. [DOI] [PubMed] [Google Scholar]
- 21.Moulton CA, Tabak D, Kneebone R, Nestel D, MacRae H, LeBlanc VR. Teaching communication skills using the integrated procedural performance instrument (IPPI): a randomized controlled trial. Am J Surg. 2009;197:113–118. doi: 10.1016/j.amjsurg.2008.09.006. [DOI] [PubMed] [Google Scholar]
- 22.Rees C, Sheard C, Davies S. The development of a scale to measure medical students’ attitudes towards communication skills learning: The Communication Skills Attitude Scale (CSAS) Med Educ. 2002;36:141–147. doi: 10.1046/j.1365-2923.2002.01072.x. [DOI] [PubMed] [Google Scholar]
- 23.Scheffer S, Muehlinghaus I, Froehmel A, Ortwein H. Assessing students’ communication skills: Validation of a global rating. Adv Health Sci Educ Pract. 2008;13:583–592. doi: 10.1007/s10459-007-9074-2. [DOI] [PubMed] [Google Scholar]
- 24.Wouda JC, van de Wiel HB. The communication competency of medical students, residents and consultants. Patient Educ Couns. 2012;86:57–62. doi: 10.1016/j.pec.2011.03.011. [DOI] [PubMed] [Google Scholar]
- 25.Yedidia MJ, Gillespie CC, Kachur E, et al. Effect of communications training on medical student performance. JAMA. 2003;290:1157–1165. doi: 10.1001/jama.290.9.1157. [DOI] [PubMed] [Google Scholar]
- 26.Crosson JC, Deng W, Brazeau C, Boyd L, Soto-Greene M. Evaluating the effect of cultural competency training on medical student attitudes. Fam Med. 2004;36:199–203. [PubMed] [Google Scholar]
- 27.Kirby RL, Crawford KA, Smith C, Thompson KJ, Sargeant JM. A wheelchair workshop for medical students improves knowledge and skills: A randomized controlled trial. Am J Phys Med Rehabil. 2011;90:197–206. doi: 10.1097/PHM.0b013e318206398a. [DOI] [PubMed] [Google Scholar]
- 28.Wilkerson L, Fung CC, May W, Elliott D. Assessing patient-centered care: One approach to health disparities education. J Gen Intern Med. 2010;25:S86–S90. doi: 10.1007/s11606-010-1273-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Austin EJ, Evans P, Magnus B, O’Hanlon K. A preliminary study of empathy, emotional intelligence and examination performance in MBChB students. Med Educ. 2007;41:684–689. doi: 10.1111/j.1365-2923.2007.02795.x. [DOI] [PubMed] [Google Scholar]
- 30.Fields SK, Mahan P, Tillman P, Harris J, Maxwell K, Hojat M. Measuring empathy in healthcare profession students using the Jefferson Scale of Physician Empathy: Health provider--student version. J Interprof Care. 2011;25:287–293. doi: 10.3109/13561820.2011.566648. [DOI] [PubMed] [Google Scholar]
- 31.Hojat M, Gonnella JS, Nasca TJ, Mangione S, Vergare M, Magee M. Physician empathy: definition, components, measurement, and relationship to gender and specialty. Am J Psychiatry. 2002;159:1563–1569. doi: 10.1176/appi.ajp.159.9.1563. [DOI] [PubMed] [Google Scholar]
- 32.Peterson LN, Eva KW, Rusticus SA, Lovato CY. The readiness for clerkship survey: Can self-assessment data be used to evaluate program effectiveness? Acad Med. 2012;87:1355–1360. doi: 10.1097/ACM.0b013e3182676c76. [DOI] [PubMed] [Google Scholar]
- 33.Shapiro J, Morrison E, Boker J. Teaching empathy to first year medical students: Evaluation of an elective literature and medicine course. Educ Health (Abingdon) 2004;17:73–84. doi: 10.1080/13576280310001656196. [DOI] [PubMed] [Google Scholar]
- 34.Mounsey AL, Bovbjerg V, White L, Gazewood J. Do students develop better motivational interviewing skills through role-play with standardised patients or with student colleagues? Med Educ. 2006;40:775–780. doi: 10.1111/j.1365-2929.2006.02533.x. [DOI] [PubMed] [Google Scholar]
- 35.Prochaska JJ, Gali K, Miller B, Hauer KE. Medical students’ attention to multiple risk behaviors: A standardized patient examination. J Gen Intern Med. 2012;27:700–707. doi: 10.1007/s11606-011-1953-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Spollen JJ, Thrush CR, Mui DV, Woods MB, Tariq SG, Hicks E. A randomized controlled trial of behavior change counseling education for medical students. Med Teach. 2010;32:e170–e177. doi: 10.3109/01421590903514614. [DOI] [PubMed] [Google Scholar]
- 37.Truncali A, Lee JD, Ark TK, et al. Teaching physicians to address unhealthy alcohol use: A randomized controlled trial assessing the effect of a Web-based module on medical student performance. J Subst Abuse Treat. 2011;40:203–213. doi: 10.1016/j.jsat.2010.09.002. [DOI] [PubMed] [Google Scholar]
- 38.Crossley J, Vivekananda-Schmidt P. The development and evaluation of a Professional Self Identity Questionnaire to measure evolving professional self-identity in health and social care students. Med Teach. 2009;31:e603–e607. doi: 10.3109/01421590903193547. [DOI] [PubMed] [Google Scholar]
- 39.De Haes JC, Oort FJ, Hulsman RJ. Summative assessment of medical students’ communication skills and professional attitudes through observation in clinical practice. Med Teach. 2005;27:583–589. doi: 10.1080/01421590500061378. [DOI] [PubMed] [Google Scholar]
- 40.Noble LM, Kubacki A, Martin J, Lloyd M. The effect of professional skills training on patient-centredness and confidence in communicating with patients. Med Educ. 2007;41:432–440. doi: 10.1111/j.1365-2929.2007.02704.x. [DOI] [PubMed] [Google Scholar]
- 41.Youngblood P, Harter PM, Srivastava S, Moffett S, Heinrichs WL, Dev P. Design, development, and evaluation of an online virtual emergency department for training trauma teams. Simul Healthc. 2008;3:146–153. doi: 10.1097/SIH.0b013e31817bedf7. [DOI] [PubMed] [Google Scholar]
- 42.Bachmann C, Barzel A, Roschlaub S, Ehrhardt M, Scherer M. Can a brief two-hour interdisciplinary communication skills training be successful in undergraduate medical education? Patient Educ Couns. 2013;93:298–305. doi: 10.1016/j.pec.2013.05.019. [DOI] [PubMed] [Google Scholar]
- 43.Bombeke K, Van Roosbroeck S, De Winter B, et al. Medical students trained in communication skills show a decline in patient-centred attitudes: An observational study comparing two cohorts during clinical clerkships. Patient Educ Couns. 2011;84:310–318. doi: 10.1016/j.pec.2011.03.007. [DOI] [PubMed] [Google Scholar]
- 44.Feeley TH, Anker AE, Soriano R, Friedman E. Using standardized patients to educate medical students about organ donation. Communication Education. 2010;59:249–262. [Google Scholar]
- 45.Hausberg MC, Hergert A, Kröger C, Bullinger M, Rose M, Andreas S. Enhancing medical students’ communication skills: Development and evaluation of an undergraduate training program. BMC Med Educ. 2012;12:16. doi: 10.1186/1472-6920-12-16. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Joekes K, Noble LM, Kubacki AM, Potts HW, Lloyd M. Does the inclusion of “professional development” teaching improve medical students’ communication skills? BMC Med Educ. 2011;11:41. doi: 10.1186/1472-6920-11-41. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47.Tiuraniemi J, Läärä R, Kyrö T, Lindeman S. Medical and psychology students’ self-assessed communication skills: A pilot study. Patient Educ Couns. 2011;83:152–157. doi: 10.1016/j.pec.2010.05.013. [DOI] [PubMed] [Google Scholar]
- 48.Turan S, Elcin M, Uner S, Odabasi O, Sayek I, Senemoglu N. The impact of clinical visits on communication skills training. Patient Educ Couns. 2009;77:42–47. doi: 10.1016/j.pec.2009.02.012. [DOI] [PubMed] [Google Scholar]
- 49.Claramita M, Majoor G. Comparison of communication skills in medical residents with and without undergraduate communication skills training as provided by the Faculty of Medicine of Gadjah Mada University. Educ Health (Abingdon) 2006;19:308–320. doi: 10.1080/13576280600937887. [DOI] [PubMed] [Google Scholar]
- 50.Mauksch L, Farber S, Greer HT. Design, dissemination, and evaluation of an advanced communication elective at seven U.S. medical schools. Acad Med. 2013;88:843–851. doi: 10.1097/ACM.0b013e31828fd5ed. [DOI] [PubMed] [Google Scholar]
- 51.Hulsman RL, Mollema ED, Hoos AM, de Haes JC, Donnison-Speijer JD. Assessment of medical communication skills by computer: Assessment method and student experiences. Med Educ. 2004;38:813–824. doi: 10.1111/j.1365-2929.2004.01900.x. [DOI] [PubMed] [Google Scholar]
- 52.Hulsman RL, Peters JF, Fabriek M. Peer-assessment of medical communication skills: The impact of students’ personality, academic and social reputation on behavioural assessment. Patient Educ Couns. 2013;92:346–354. doi: 10.1016/j.pec.2013.07.004. [DOI] [PubMed] [Google Scholar]
- 53.Ishikawa H, Hashimoto H, Kinoshita M, Yano E. Can nonverbal communication skills be taught? Med Teach. 2010;32:860–863. doi: 10.3109/01421591003728211. [DOI] [PubMed] [Google Scholar]
- 54.Lie DA, Bereknyei S, Vega CP. Longitudinal development of medical students’ communication skills in interpreted encounters. Educ Health (Abingdon) 2010;23:466. [PubMed] [Google Scholar]
- 55.Karabilgin OS, Vatansever K, Caliskan SA, Durak HI. Assessing medical student competency in communication in the pre-clinical phase: Objective structured video exam and SP exam. Patient Educ Couns. 2012;87:293–299. doi: 10.1016/j.pec.2011.10.008. [DOI] [PubMed] [Google Scholar]
- 56.Shapiro SM, Lancee WJ, Richards-Bentley CM. Evaluation of a communication skills program for first-year medical students at the University of Toronto. BMC Med Educ. 2009;9:11. doi: 10.1186/1472-6920-9-11. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 57.Simmenroth-Nayda A, Weiss C, Fischer T, Himmel W. Do communication training programs improve students’ communication skills?--A follow-up study. BMC Res Notes. 2012;5:486. doi: 10.1186/1756-0500-5-486. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58.Yeap R, Beevi Z, Lukman H. Evaluating IMU communication skills training programme: Assessment tool development. Med J Malaysia. 2008;63:244–246. [PubMed] [Google Scholar]
- 59.LeBlanc VR, Tabak D, Kneebone R, Nestel D, MacRae H, Moulton CA. Psychometric properties of an integrated assessment of technical and communication skills. Am J Surg. 2009;197:96–101. doi: 10.1016/j.amjsurg.2008.08.011. [DOI] [PubMed] [Google Scholar]
- 60.Leung K-K, Wang W-D, Chen Y-Y. Multi-source evaluation of interpersonal and communication skills of family medicine residents. Adc Health Sci Educ Theory Pract. 2012;17:717–726. doi: 10.1007/s10459-011-9345-9. [DOI] [PubMed] [Google Scholar]
- 61.Wagner JA, Pfeiffer CA, Harrington KL. Evaluation of online instruction to improve medical and dental students’ communication and counseling skills. Eval Health Prof. 2011;34:383–397. doi: 10.1177/0163278710380612.
- 62.Bachner YG, Castel H, Kushnir T. Psychosocial abilities of first-year medical students participating in a clinical communication course. Natl Med J India. 2012;25:80–82.
- 63.King AE, Conrad M, Ahmed RA. Improving collaboration among medical, nursing and respiratory therapy students through interprofessional simulation. J Interprof Care. 2013;27:269–271. doi: 10.3109/13561820.2012.730076.
- 64.Koponen J, Pyörälä E, Isotalus P. Comparing three experiential learning methods and their effect on medical students’ attitudes to learning communication skills. Med Teach. 2012;34:e198–e207. doi: 10.3109/0142159X.2012.642828.
- 65.Hook KM, Pfeiffer CA. Impact of a new curriculum on medical students’ interpersonal and interviewing skills. Med Educ. 2007;41:154–159. doi: 10.1111/j.1365-2929.2006.02680.x.
- 66.Mason SR, Ellershaw JE. Preparing for palliative medicine; evaluation of an education programme for fourth year medical undergraduates. Palliat Med. 2008;22:687–692. doi: 10.1177/0269216308090070.
- 67.Schulz C, Möller MF, Seidler D, Schnell MW. Evaluating an evidence-based curriculum in undergraduate palliative care education: Piloting a phase II exploratory trial for a complex intervention. BMC Med Educ. 2013;13:1. doi: 10.1186/1472-6920-13-1.
- 68.Aper L, Reniers J, Koole S, Valcke M, Derese A. Impact of three alternative consultation training formats on self-efficacy and consultation skills of medical students. Med Teach. 2012;34:e500–e507. doi: 10.3109/0142159X.2012.668627.
- 69.Christner JG, Stansfield RB, Schiller JH, Madenci A, Keefer PM, Pituch K. Use of simulated electronic mail (e-mail) to assess medical student knowledge, professionalism, and communication skills. Acad Med. 2010;85:S1–S4. doi: 10.1097/ACM.0b013e3181ed45f8.
- 70.Crandall SJ, George G, Marion GS, Davis S. Applying theory to the design of cultural competency training for medical students: A case study. Acad Med. 2003;78:588–594. doi: 10.1097/00001888-200306000-00007.
- 71.Jarris YS, Bartleman A, Hall EC, Lopez L. A preclinical medical student curriculum to introduce health disparities and cultivate culturally responsive care. J Natl Med Assoc. 2012;104:404–411. doi: 10.1016/s0027-9684(15)30193-0.
- 72.Mihalic AP, Morrow JB, Long RB, Dobbie AE. A validated cultural competence curriculum for US pediatric clerkships. Patient Educ Couns. 2010;79:77–82. doi: 10.1016/j.pec.2009.07.029.
- 73.Hoang L, LaHousse SF, Nakaji MC, Sadler GR. Assessing deaf cultural competency of physicians and medical students. J Cancer Educ. 2011;26:175–182. doi: 10.1007/s13187-010-0144-4.
- 74.Genao I, Bussey-Jones J, St George DM, Corbie-Smith G. Empowering students with cultural competence knowledge: Randomized controlled trial of a cultural competence curriculum for third-year medical students. J Natl Med Assoc. 2009;101:1241–1246. doi: 10.1016/s0027-9684(15)31135-4.
- 75.Weissman JS, Betancourt J, Campbell EG, et al. Resident physicians’ preparedness to provide cross-cultural care. JAMA. 2005;294:1058–1067. doi: 10.1001/jama.294.9.1058.
- 76.Cushing A, Evans D, Hall A. Medical students’ attitudes and behaviour towards sexual health interviewing: Short- and long-term evaluation of designated workshops. Med Teach. 2005;27:422–428. doi: 10.1080/01421590500046502.
- 77.Kim S, Spielberg F, Mauksch L, et al. Comparing narrative and multiple-choice formats in online communication skill assessment. Med Educ. 2009;43:533–541. doi: 10.1111/j.1365-2923.2009.03368.x.
- 78.Wouda JC, Zandbelt LC, Smets EM, van de Wiel HB. Assessment of physician competency in patient education: Reliability and validity of a model-based instrument. Patient Educ Couns. 2011;85:92–98. doi: 10.1016/j.pec.2010.09.007.
- 79.Hoover CR, Wong CC, Azzam A. From primary care to public health: Using problem-based learning and the ecological model to teach public health to first year medical students. J Community Health. 2012;37:647–652. doi: 10.1007/s10900-011-9495-y.
- 80.McGarvey E, Peterson C, Pinkerton R, Keller A, Clayton A. Medical students’ perceptions of sexual health issues prior to a curriculum enhancement. Int J Impot Res. 2003;15:S58–S66. doi: 10.1038/sj.ijir.3901074.
- 81.Tang TS, Fantone JC, Bozynski ME, Adams BS. Implementation and evaluation of an undergraduate Sociocultural Medicine Program. Acad Med. 2002;77:578–585. doi: 10.1097/00001888-200206000-00019.
- 82.Lim BT, Moriarty H, Huthwaite M, Gray L, Pullon S, Gallagher P. How well do medical students rate and communicate clinical empathy? Med Teach. 2013;35:e946–e951. doi: 10.3109/0142159X.2012.715783.
- 83.Shapiro J, Rucker L, Boker J, Lie D. Point-of-view writing: A method for increasing medical students’ empathy, identification and expression of emotion, and insight. Educ Health (Abingdon). 2006;19:96–105. doi: 10.1080/13576280500534776.
- 84.Mercer SW, Maxwell M, Heaney D, Watt GC. The consultation and relational empathy (CARE) measure: Development and preliminary validation and reliability of an empathy-based consultation process measure. Fam Pract. 2004;21:699–705. doi: 10.1093/fampra/cmh621.
- 85.Fernández-Olano C, Montoya-Fernández J, Salinas-Sánchez AS. Impact of clinical interview training on the empathy level of medical students and medical residents. Med Teach. 2008;30:322–324. doi: 10.1080/01421590701802299.
- 86.Bass PF, III, Stetson BA, Rising W, Wesley GC, Ritchie CS. Development and evaluation of a nutrition and physical activity counseling module for first-year medical students. Med Educ Online. 2004;9:23. doi: 10.3402/meo.v9i.4359.
- 87.Bell K, Cole BA. Improving medical students’ success in promoting health behavior change: A curriculum evaluation. J Gen Intern Med. 2008;23:1503–1506. doi: 10.1007/s11606-008-0678-x.
- 88.Brown RL, Pfeifer JM, Gjerde CL, Seibert CS, Haq CL. Teaching patient-centered tobacco intervention to first-year medical students. J Gen Intern Med. 2004;19:534–539. doi: 10.1111/j.1525-1497.2004.30144.x.
- 89.Han PK, Joekes K, Elwyn G, et al. Development and evaluation of a risk communication curriculum for medical students. Patient Educ Couns. 2014;94:43–49. doi: 10.1016/j.pec.2013.09.009.
- 90.Kosowicz LY, Pfeiffer CA, Vargas M. Long-term retention of smoking cessation counseling skills learned in the first year of medical school. J Gen Intern Med. 2007;22:1161–1165. doi: 10.1007/s11606-007-0255-8.
- 91.Martino S, Haeseler F, Belitsky R, Pantalon M, Fortin AH, 6th. Teaching brief motivational interviewing to year three medical students. Med Educ. 2007;41:160–167. doi: 10.1111/j.1365-2929.2006.02673.x.
- 92.McEvoy M, Schlair S, Sidlo Z, Burton W, Milan F. Assessing third-year medical students’ ability to address a patient’s spiritual distress using an OSCE case. Acad Med. 2014;89:66–70. doi: 10.1097/ACM.0000000000000061.
- 93.White LL, Gazewood JD, Mounsey AL. Teaching students behavior change skills: Description and assessment of a new motivational interviewing curriculum. Med Teach. 2007;29:e67–e71. doi: 10.1080/01421590601032443.
- 94.Edwardsen EA, Morse DS, Frankel RM. Structured practice opportunities with a mnemonic affect medical student interviewing skills for intimate partner violence. Teach Learn Med. 2006;18:62–68. doi: 10.1207/s15328015tlm1801_13.
- 95.Haeseler F, Fortin AH, 6th, Pfeiffer C, Walters C, Martino S. Assessment of a motivational interviewing curriculum for year 3 medical students using a standardized patient case. Patient Educ Couns. 2011;84:27–30. doi: 10.1016/j.pec.2010.10.029.
- 96.Prochaska JJ, Teherani A, Hauer KE. Medical students’ use of the stages of change model in tobacco cessation counseling. J Gen Intern Med. 2007;22:223–227. doi: 10.1007/s11606-006-0040-0.
- 97.Stolz D, Langewitz W, Meyer A, et al. Enhanced didactic methods of smoking cessation training for medical students--A randomized study. Nicotine Tob Res. 2012;14:224–228. doi: 10.1093/ntr/ntr186.
- 98.Boenink AD, de Jonge P, Smal K, Oderwald A, van Tilburg W. The effects of teaching medical professionalism by means of vignettes: An exploratory study. Med Teach. 2005;27:429–432. doi: 10.1080/01421590500069983.
- 99.Kittmer T, Hoogenes J, Pemberton J, Cameron BH. Exploring the hidden curriculum: A qualitative analysis of clerks’ reflections on professionalism in surgical clerkship. Am J Surg. 2013;205:426–433. doi: 10.1016/j.amjsurg.2012.12.001.
- 100.Abadel FT, Hattab AS. Patients’ assessment of professionalism and communication skills of medical graduates. BMC Med Educ. 2014;14:28. doi: 10.1186/1472-6920-14-28.
- 101.Kelly M, O’Flynn S, McLachlan J, Sawdon MA. The clinical conscientiousness index: A valid tool for exploring professionalism in the clinical undergraduate setting. Acad Med. 2012;87:1218–1224. doi: 10.1097/ACM.0b013e3182628499.
- 102.Brock D, Abu-Rish E, Chiu CR, et al. Interprofessional education in team communication: Working together to improve patient safety. Postgrad Med J. 2013;89:642–651. doi: 10.1136/postgradmedj-2012-000952rep.
- 103.Yuasa M, Nagoshi M, Oshiro-Wong C, Tin M, Wen A, Masaki K. Standardized patient and standardized interdisciplinary team meeting: Validation of a new performance-based assessment tool. J Am Geriatr Soc. 2014;62:171–174. doi: 10.1111/jgs.12604.
- 104.Baribeau DA, Mukovozov I, Sabljic T, Eva KW, deLottinville CB. Using an objective structured video exam to identify differential understanding of aspects of communication skills. Med Teach. 2012;34:e242–e250. doi: 10.3109/0142159X.2012.660213.
- 105.Chisolm A, Hart J, Mann K, Peters S. Development of a behaviour change communication tool for medical students: The “Tent Pegs” booklet. Patient Educ Couns. 2014;94:50–60. doi: 10.1016/j.pec.2013.09.007.
- 106.Endres J, Laidlaw A. Micro-expression recognition training in medical students: A pilot study. BMC Med Educ. 2009;9:47. doi: 10.1186/1472-6920-9-47.
- 107.Kalet AL, Mukherjee D, Felix K, et al. Can a web-based curriculum improve students’ knowledge of, and attitudes about, the interpreted medical interview? J Gen Intern Med. 2005;20:929–934. doi: 10.1111/j.1525-1497.2005.0193.x.
- 108.Lee CA, Chang A, Chou CL, Boscardin C, Hauer KE. Standardized patient-narrated web-based learning modules improve students’ communication skills on a high-stakes clinical skills examination. J Gen Intern Med. 2011;26:1374–1377. doi: 10.1007/s11606-011-1809-3.
- 109.Mukohara K, Kitamura K, Wakabayashi H, Abe K, Sato J, Ban N. Evaluation of a communication skills seminar for students in a Japanese medical school: A non-randomized controlled study. BMC Med Educ. 2004;4:24. doi: 10.1186/1472-6920-4-24.
- 110.Schildmann J, Kupfer S, Burchardi N, Vollmann J. Teaching and evaluating breaking bad news: A pre-post evaluation study of a teaching intervention for medical students and a comparative analysis of different measurement instruments and raters. Patient Educ Couns. 2012;86:210–219. doi: 10.1016/j.pec.2011.04.022.
- 111.Chun MB, Young KG, Honda AF, Belcher GF, Maskarinec GG. The development of a cultural standardized patient examination for a general surgery residency program. J Surg Educ. 2012;69:650–658. doi: 10.1016/j.jsurg.2012.04.013.
- 112.Sanchez NF, Rabatin J, Sanchez JP, Hubbard S, Kalet A. Medical students’ ability to care for lesbian, gay, bisexual, and transgendered patients. Fam Med. 2006;38:21–27.
- 113.Yacht AC, Suglia SF, Orlander JD. Evaluating an end-of-life curriculum in a medical residency program. Am J Hosp Palliat Care. 2006;23:439–446. doi: 10.1177/1049909106294829.
- 114.Chun MB, Deptula P, Morihara S, Jackson DS. The refinement of a cultural standardized patient examination for a general surgery residency program. J Surg Educ. 2013;71:398–404. doi: 10.1016/j.jsurg.2013.10.005.
- 115.Fletcher I, Leadbetter P, Curran A, O’Sullivan H. A pilot study assessing emotional intelligence training and communication skills with 3rd year medical students. Patient Educ Couns. 2009;76:376–379. doi: 10.1016/j.pec.2009.07.019.
- 116.Goldsmith J, Wittenberg-Lyles E, Shaunfield S, Sanchez-Reilly S. Palliative care communication curriculum: What can students learn from an unfolding case? Am J Hosp Palliat Care. 2011;28:236–241. doi: 10.1177/1049909110385670.
- 117.Loke S-K, Blyth P, Swan J. In search of a method to assess dispositional behaviours: The case of Otago Virtual Hospital. Australasian Journal of Educational Technology. 2012;28:441–458.
- 118.Rosenthal S, Howard B, Schlussel YR, et al. Humanism at heart: Preserving empathy in third-year medical students. Acad Med. 2011;86:350–358. doi: 10.1097/ACM.0b013e318209897f.
- 119.Van Winkle LJ, Fjortoft N, Hojat M. Impact of a workshop about aging on the empathy scores of pharmacy and medical students. Am J Pharm Educ. 2012;76:9. doi: 10.5688/ajpe7619.
- 120.Foley KL, George G, Crandall SJ, Walker KH, Marion GS, Spangler JG. Training and evaluating tobacco-specific standardized patient instructors. Fam Med. 2006;38:28–37.
- 121.Haist SA, Wilson JF, Pursley HG, et al. Domestic violence: Increasing knowledge and improving skills with a four-hour workshop using standardized patients. Acad Med. 2003;78:S24–S26. doi: 10.1097/00001888-200310001-00008.
- 122.Margalit AP, Glick SM, Benbassat J, Cohen A, Katz M. Promoting a biopsychosocial orientation in family practice: Effect of two teaching programs on the knowledge and attitudes of practising primary care physicians. Med Teach. 2005;27:613–618. doi: 10.1080/01421590500097091.
- 123.Hanson JL, Bannister SL, Clark A, Raszka WV, Jr. Oh, what you can see: The role of observation in medical student education. Pediatrics. 2010;126:843–845. doi: 10.1542/peds.2010-2538.
- 124.Fromme HB, Karani R, Downing SM. Direct observation in medical education: A review of the literature and evidence for validity. Mt Sinai J Med. 2009;76:365–371. doi: 10.1002/msj.20123.
- 125.Gigante J. Direct observation of medical trainees. Pediat Therapeut. 2013;3:e118.
- 126.Holmboe ES. Faculty and the observation of trainees’ clinical skills: Problems and opportunities. Acad Med. 2004;79:16–22. doi: 10.1097/00001888-200401000-00006.
- 127.National Center for Interprofessional Practice and Education. https://nexusipe.org. Accessed November 13, 2015.