Physiotherapy Canada. 2017;69(1):65–72. doi: 10.3138/ptc.2015-45E

Reaching Consensus on Measuring Professional Behaviour in Physical Therapy Objective Structured Clinical Examinations

Robyn Davies*,†,‡, Cindy Ellerton*, Cathy Evans*
PMCID: PMC5280039  PMID: 28154446

Abstract

Purpose: We determined which professional behaviours (PBs) are important and feasible to measure in an objective structured clinical examination (OSCE) intended to assess the hands-on skills and knowledge of students in a Canadian physical therapy (PT) program. Methods: We used a modified Delphi technique to identify the criteria required to assess PBs in PT students during an OSCE. We conducted a focus group to better understand the results of the modified Delphi process. Results: Experienced local OSCE examiners participated in the modified Delphi panel, which consisted of two rounds of surveys: round 1 (n=12) and round 2 (n=10). A total of 31 PBs were reduced to 18 through the two rounds. Five of the panellists participated in the focus group, reduced the 18 PBs to 15, and then identified 4 as clinical skills. Participants categorized the remaining 11 as mixed PBs and clinical skills (1 item), PBs (4 items), or communication skills (6 items). Conclusion: This study provides preliminary evidence to support the feasibility and importance of evaluating 5 PB items in practical skills OSCEs for entry-to-practice PT students.

Key Words : educational measurement, professionalism, students


Teaching and assessing professionalism have become highly valued in health care education over the past 25 years.1 A recent systematic review highlighted the challenging nature of defining professionalism and acknowledged that although there is no consensus on the definition, one is necessary to convey meaning.2 The Essential Competency Profile for Physiotherapists in Canada has defined professional as “committed to the best interests of clients and society through ethical practice, support of profession-led regulation, and high personal standards of behaviour.”3(p.14) The professional behaviours (PBs) that comprise the construct of professionalism are in and of themselves complex and somewhat intangible in nature; thus, they can be difficult to teach and assess. Therefore, health care practitioners need to understand the PBs that are integral to their individual practice.

Work has been done in physical therapy (PT) to better understand professionalism. For example, in developing the Generic Abilities assessment, May and colleagues4 identified 10 behaviours or professional attributes that were not an explicit part of PT students' core knowledge or technical skills. This assessment tool allows these professional attributes to be objectively assessed in PT students. Building on this work, a Canadian group examined what the key PBs are for physical therapists5 and, using a consensus exercise, identified a framework of 10 key PBs for physical therapists:5 communication, adherence to legal and ethical codes of practice, respect, empathy and sensitive practice, lifelong learning, best evidence and evidence-based practice, client-centred practice, critical thinking, accountability, and professional image.5 Within this framework, the identified behaviours are large constructs, each containing smaller sample behaviours. Therefore, as with many elements of PT practice, PB may need to be broken down into smaller parts for teaching and assessment purposes.5

The Generic Abilities assessment4 and the American Physical Therapy Association's Physical Therapist Clinical Performance Instrument6 are objective measures available to academic programmes to measure students' PBs. These tools are primarily used in the clinical environment to provide a standardized assessment of students' ability to display predetermined PBs. The Comprehensive Professional Behaviours Development Log developed by Bartlett and colleagues7 is a self-assessment tool that PT students can use across both the academic and the clinical environments.

In the academic environment, the Objective Structured Clinical Examination (OSCE) is one method of assessing PBs in PT students. The OSCE was originally developed to evaluate clinical competencies thought not to be adequately assessed through written examinations.8,9 It is a time-limited, multi-station examination that requires students to perform specific skills with a trained person who simulates patient clinical scenarios,8,9 and it is used to assess elements of hands-on skills and knowledge, often before the students begin to learn in the clinical environment. PBs have been assessed in a stand-alone OSCE station or embedded in a station intended to assess hands-on skills.10 Performance is typically evaluated using checklists, either alone or in combination with global rating scales.8,10

A recent study examining the practices of Canadian university PT programmes determined that the PBs of PT students are assessed in both the academic and the clinical environments.11 All participating programmes confirmed that the OSCE was one measure they used to assess professionalism in the academic environment. Participants in that study collectively identified 31 different PBs assessed in their OSCEs. Each programme reported assessing a varying number of PBs in each OSCE, and no clear relationship was identified between the choice of PBs assessed and the level of the student.

The variability and number of PB items identified in that recent national study make it difficult for individual PT programmes to determine which PB items to include in their OSCE. The purpose of our study was, first, to determine the importance and feasibility of assessing the 31 PB items being used in OSCEs for PT students. Second, we sought to understand the rationale for the panellists' ratings. The results of this study will allow us to (1) make decisions regarding which PB items to include in OSCEs, (2) inform other PT programmes on PB items to be considered for inclusion in an OSCE, and (3) provide a process for selecting PB items that could be used by individual programmes for their OSCEs.

Methods

We used a modified Delphi technique12 to identify criteria that could be used to assess PBs in PT students during an OSCE. The modified Delphi technique is a formal group consensus method in which self-administered questionnaires are sent to a target group by email.12 Multiple rounds take place, during which panellists' responses are summarized and distributed to the group for further consideration. Unlike the classic Delphi technique,13 by which the process continues until 100% consensus is achieved, the modified Delphi technique continues until an a priori consensus threshold has been reached.12 No standard recommendation for an appropriate level of consensus exists for the modified Delphi technique, and suggestions range from 51% to 80%.14 We determined, a priori, that a 70% consensus threshold would be used for this study. The Research Ethics Board at the University of Toronto approved the study (no. 26349).

Recruitment of Delphi panellists

Teaching faculty at the University of Toronto generated a list of 24 potential panellists representing each of the three main clinical areas (cardiorespiratory, musculoskeletal, and neurological PT). A research assistant sent recruitment emails outlining the purpose of the study and the expected role of the panellists. We required each panellist to have a minimum of 2 years of experience as a registered physical therapist in Canada, have participated as an examiner in at least two OSCEs at the University of Toronto, have access to the Internet, and be English speaking. All examiners were familiar with our OSCE assessment rubric, which includes a combination of checklist items (“done” or “not done”) and a global rating scale for evaluating PBs. This rubric was published in an earlier article.15

Modified Delphi technique: round 1

We provided the panellists with a list of criteria that could be used to assess PBs in an OSCE. This list had been generated in an earlier national survey.11 The purpose of round 1 was to begin to reduce the number of criteria and obtain a manageable number to use in an OSCE. We asked panellists whether each criterion was important, clear, and feasible for assessing PBs in the context of an OSCE using “agree,” “neutral/unsure,” and “disagree” response options in a questionnaire delivered through the SurveyMonkey online survey platform. Determining an item's feasibility required the panellists to consider the practical constraints of an OSCE intended to assess hands-on skills and knowledge. Panellists could also comment on each criterion and suggest additional criteria for assessing PBs in an OSCE if they thought that an aspect of PB was not already covered by the list provided. We collated the panellists' responses downloaded from SurveyMonkey, calculated descriptive statistics (including proportions and means), and reviewed the panellists' comments.

For a criterion to be retained for consideration in round 2, at least 70% of the panellists had to agree that it was important and feasible for inclusion in an OSCE (i.e., 70% of panellists had to respond “agree” in those columns). We kept items that we thought were both important and feasible but not clear for consideration in round 2 in an attempt to improve their clarity.
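The round 1 retention rule amounts to a simple filter over the response counts. The sketch below illustrates that rule in Python; the counts shown are hypothetical examples, not the study's actual data.

```python
# Round 1 retention rule: a criterion is kept for round 2 when at least
# 70% of panellists respond "agree" on BOTH importance and feasibility.
# Response counts below are hypothetical, for illustration only.

THRESHOLD = 0.70

def retained(agree_important, agree_feasible, n_panellists):
    """Return True if the criterion meets the a priori 70% consensus rule."""
    return (agree_important / n_panellists >= THRESHOLD
            and agree_feasible / n_panellists >= THRESHOLD)

# 12 panellists; 10 agree on importance (83%), 9 on feasibility (75%).
print(retained(10, 9, 12))  # True: both proportions clear 70%

# 12 panellists; 8 agree on importance (67%): falls below the threshold.
print(retained(8, 7, 12))   # False
```

Note that clarity is deliberately excluded from the filter: as described above, items judged important and feasible but unclear were kept and reworded rather than dropped.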

Modified Delphi technique: round 2

The purpose of the round 2 survey was twofold. First, we asked panellists to rate the importance, feasibility, and clarity of any new criteria that were generated in round 1 using the same “agree,” “neutral/unsure,” and “disagree” response options as in round 1. Second, panellists further rated the relative importance and feasibility of each criterion on a 10-point Likert scale, on which 1 indicated a low level of importance and feasibility and 10 indicated a high level of importance and feasibility. We set a priori hypotheses to evaluate the results of the Likert ratings of an individual criterion. A criterion was judged to be important and feasible if the mean score was 7 or higher.
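The round 2 decision rule can likewise be sketched as a mean-score check; the ratings below are hypothetical examples, not the panellists' actual scores.

```python
# Round 2 rule: a criterion is judged important and feasible when its mean
# rating on the 10-point Likert scale (1 = low, 10 = high) is 7 or higher
# on each dimension. Ratings below are hypothetical, for illustration only.
from statistics import mean

def meets_threshold(importance_ratings, feasibility_ratings, cutoff=7.0):
    """Apply the a priori mean-score criterion to one item's ratings."""
    return (mean(importance_ratings) >= cutoff
            and mean(feasibility_ratings) >= cutoff)

importance = [9, 8, 10, 7, 9, 8, 10, 9, 8, 9]   # n=10 panellists
feasibility = [8, 7, 9, 8, 8, 7, 9, 8, 8, 8]
print(meets_threshold(importance, feasibility))  # True (means 8.7 and 8.0)
```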

Focus group

After the two Delphi surveys had been completed, we held a focus group to better understand the participants' opinions on the selection and categorization of the final PB items. This focus group allowed us to determine which items should be retained to assess PBs in an OSCE. We invited the 10 individuals who completed the two Delphi surveys to participate in the focus group. We obtained written informed consent at the beginning of the focus group.

Conducting the focus group

The members of the research team conducted the focus group using a semi-structured discussion guide. Two researchers acted as facilitators, and the third acted as an observer, managed the logistics of the session, and took field notes to ensure the accuracy of data collection. Field notes included participants' responses, pertinent quotes, and facilitators' observations. The focus group lasted 90 minutes and began with broad, open-ended questions, followed by more specific queries, so that matters significant to the participants could arise naturally before we asked questions of specific interest to us. We asked participants for their opinions regarding the items that had been eliminated and those that had been retained during the consensus exercise. The questions explored professional and communication behaviours as well as their importance, feasibility, and categorization. We used questions, prompts, and probes to encourage respondents to categorize and explain their decisions.

Analysis

A single researcher reviewed the notes from the focus group and then grouped similar comments and observations together. The two other researchers read the focus group records independently and confirmed the information that had been extracted from the field notes. To further confirm the validity of the results, the three researchers discussed identified themes and sent a summary of the discussion back to the participants to determine whether the categorization that we had created adequately reflected the group's discussions and decisions.

Results

Delphi panellists

Of the 24 potential panellists, 12 agreed to participate in the study and completed the demographic questionnaire, for a 50% response rate. (See Table 1 for a demographic summary.) The majority of respondents had more than 16 years of clinical practice and had participated as examiners in five or more OSCEs for PT students. Incomplete data exist for the rating of some criteria in round 1. Of the panellists who completed round 1, two were lost to attrition in round 2, leaving a final group of 10 panellists.

Table 1.

Demographic Characteristics of Panellists (n=12)

Characteristic No. (%) of respondents
Level of education*
 BScPT or equivalent 9 (75)
 Master's (entry to practice) 3 (25)
 Master's 5 (42)
 PhD 0 (0)
No. of years of practice as a physical therapist
 1–5 0 (0)
 6–10 3 (25)
 11–15 1 (8)
 >16 8 (67)
Practice setting
 Private 3 (25)
 Public 9 (75)
Population
 Adult 8 (67)
 Paediatrics 3 (25)
 Geriatrics 1 (8)
Clinical area*
 Musculoskeletal 6 (50)
 Cardiorespiratory 4 (33)
 Neurological 4 (33)
No. of OSCEs examined in past 10 y
 ≤5 4 (33)
 >5 8 (67)
* Panellists could choose more than one response option.

OSCE=objective structured clinical examination.

Modified Delphi technique: round 1

We gave panellists 31 criteria that could be used to assess PBs in PT students during an OSCE.11 Panellists eliminated 11 of the 31 criteria because they were deemed to be both unimportant and infeasible; 4 criteria because they were deemed to be infeasible; and, finally, 1 criterion that was considered to be redundant, compared with another criterion, on the basis of unsolicited comments from the panellists. Hence, 15 criteria remained for consideration in round 2. (See Table 2.)

Table 2.

Round 1: Criteria Determined to Be Unimportant or Infeasible (Ratings <70%) or Redundant (n=12)

No. (%) of panellists rating item as:
Criteria Important Clear Feasible
Criteria determined to be unimportant and infeasible
 Speaks to patient in a calm manner 7 (58) 7 (58) 7 (58)
 Uses appropriate non-verbal communication (i.e., makes eye contact) 7 (58) 8 (67) 8 (67)
 Answers questions throughout the session 6 (50) 6 (50) 6 (50)
 Adapts communication style to patient needs and abilities 7 (58) 7 (58) 7 (58)
 Closes the session appropriately 7 (58) 6 (50) 7 (58)
 Makes an effort to alleviate patient fears 7 (64)* 7 (58) 8 (67)
 Attends to the patient throughout the session 8 (67) 7 (58) 7 (58)
 Makes an effort to build rapport 6 (50) 5 (42) 3 (27)*
 Treats patient with positive regard, dignity, respect, and compassion 8 (67) 8 (67) 8 (67)
 Demonstrates cultural sensitivity 4 (33) 5 (42) 5 (42)
 Demonstrates sensitivity and respect when handling the patient 8 (67) 7 (58) 6 (50)
Criteria determined to be infeasible
 Demonstrates active listening 11 (92) 8 (67) 7 (58)
 Demonstrates confidence 8 (73)* 6 (55)* 6 (55)*
 Maintains professional patient-therapist relationship 9 (82)* 7 (64)* 6 (55)*
 Obtains informed consent 9 (75) 8 (67) 6 (60)
Criteria determined to be redundant
 Ensures patient comfort throughout the session 9 (82)* 8 (73)* 9 (82)*
 Considers patient comfort 12 (100) 10 (83) 11 (92)
New round 2 criterion generated from merged criteria determined to be redundant
 Ensures patient comfort throughout the session
* Data from 11 participants.

† Data from 10 participants.

Panellists generated three new criteria in round 1: “Provides reassurance and encouragement for patient during uncomfortable procedures,” “Uses voice effectively, considering volume, speed, and clarity,” and “Uses language and tone appropriate for the situation.”

Modified Delphi technique: round 2

A total of 18 criteria were included in the round 2 survey: 15 retained from round 1 and 3 new criteria generated in round 1. All 18 criteria met the a priori importance and feasibility rating threshold. (See Table 3.)

Table 3.

Retained and Generated Criteria: Rounds 1 and 2, Focus Group, and Post–Focus Group Follow-Up

Criterion
Round 1 (n=12): panellist agreement on rating, no. (%): Important, Clear, Feasible
Round 2 (n=10): average rating on 10-pt. Likert scale: Important, Feasible
Focus group category
Post–focus group follow-up: results of clustered items
1. Introduces self to patient 12 (100) 10 (83) 11 (92) 9.6 10 PB
2. Explains assessment/treatment techniques in lay terms 12 (100) 11 (92) 11 (92) 10 9.8 C
3. Obtains permission to proceed with interview/assessment/treatment 10 (83) 8 (67) 9 (75) 9.7 9.6 PB
4. Demonstrates an organized approach 9 (75) 7 (58) 10 (83) 8.7 8.2 CS
5. Uses concise verbal communication 8 (73)* 8 (67) 9 (75) 7.8 8.3 C Items 5 and 6 merged. New item: Verbal commands are appropriate in type and timing
6. Verbal commands are appropriate in type and timing 12 (100) 10 (83) 10 (83) 8.3 7.6 C
7. Confirms that patient understands 12 (100) 11 (92) 11 (92) 9.3 9.2 Mixed: C & PB After voting: C
8. Provides patient education (e.g., home program) 11 (92) 10 (83) 9 (75) 7.8 8.1 CS
9. Considers patient dignity, including appropriate draping 11 (92) 8 (67) 9 (75) 9.4 9.1 Mixed: C & PB After voting: PB
10. Considers patient comfort throughout the session 12 (100) 10 (83) 11 (92) 9.0 9.0 Mixed: C & PB After voting: PB
11. Provides opportunity for the patient to ask questions 11 (92) 8 (67) 9 (75) 8.7 9.3 C
12. Monitors patient response throughout the session 11 (100)* 11 (100)* 9 (90) 9.7 9.5 CS Items 12 and 16 merged. New item: Monitors patient response and provides reassurance and encouragement throughout encounter
13. Responds appropriately to patient feedback 10 (91)* 9 (82)* 9 (82)* 9.2 9.1 Mixed: PB & CS After voting: Mixed PB & CS
14. Positions self appropriately throughout the session 9 (82)* 9 (82)* 8 (73)* 8.4 8.1 CS
15. Presents a professional appearance 9 (82)* 8 (73)* 8 (73)* 8.5 8.6 Item eliminated
New criteria generated in round 1: panellist ratings (n=11)
16. Provides reassurance and encouragement for patient during uncomfortable procedures 9 (82)* 9 (82)* 9 (82)* 9.0 8.7 CS Items 12 and 16 merged. New item: Monitors patient response and provides reassurance and encouragement throughout encounter
17. Uses voice effectively, considering volume, speed, and clarity 8 (73)* 9 (82)* 9 (82)* 8.1 7.9 C
18. Uses language and tone appropriate for the situation 8 (73)* 8 (73)* 8 (73)* 8.6 8.2 C
* Data from 11 participants.

† Data from 10 participants.

PB=professional behaviour; C=communication; CS=clinical skills.

Focus group

Participants

Of the 10 participants who completed round 2 of the survey, 5 (50%) attended the focus group. Individuals worked in a variety of clinical specialties, including cardiorespiratory, musculoskeletal, and neurological, with geriatric, adult, and paediatric populations. All institutions were fully affiliated with the University of Toronto. Participants were experienced practitioners, with a mean clinical experience of 20.6 years (range 10–30 y). Participants had an average of 7 years of experience as OSCE examiners (range 3–11 y).

Process

Participants' comments during the focus group remained consistent with comments made during the consensus exercise for both the eliminated and the retained items. After reviewing the retained items, the participants eliminated 1 item and clustered two pairs of like items together, leaving a total of 15 items. As the participants discussed the remaining PBs in greater depth, it became apparent that not all items were purely PBs and could instead be considered clinical or communication skills. As a result, the participants categorized these 15 items into clinical skills, communication skills, or PBs to better identify those items that should be used to assess PBs in an OSCE.

Participants categorized four items as clinical skills, five as communication skills, and two as PBs. Four items could not be placed in a single category and were therefore identified as mixed items belonging to more than one category (see Table 3).

After the focus group ended, a checking-in document asked the five participants to vote on a single category for each of the four items that had been identified as mixed (belonging in two categories). They categorized two of the items (“Considers patient dignity, including appropriate draping” and “Considers patient comfort throughout the session”) as PBs (five and three participants, respectively) and one item (“Confirms that patient understands”) as a communication skill (four participants); the final item (“Responds appropriately to patient feedback”) remained identified as a mixed item (three participants). (See Table 4.)

Table 4.

Summary of Categorized Retained Professional Behaviour and Communication Criteria

Category Criterion (item no.)
Professional behaviour (4 items) Introduces self to patient (1)
Obtains permission to proceed with interview, assessment, and treatment (3)
Considers patient dignity, including appropriate draping (9)
Considers patient comfort throughout the session (10)
Mixed: professional behaviour and clinical skill (1 item) Responds appropriately to patient feedback (13)
Communication (6 items) Explains assessment and treatment techniques in lay terms (2)
Verbal commands are appropriate in type and timing (new item or combined; includes items 5 and 6)
Confirms that patient understands (7)
Provides opportunity for the patient to ask questions (11)
Uses voice effectively, considering volume, speed, and clarity (17)
Uses language and tone appropriate for the situation (18)

Discussion

Rehabilitation students have indicated that they are often uncertain about what is expected of them regarding PBs.16 Views of what constitutes professionalism are similar among health disciplines, which see the interaction of person and context and the importance of situational judgment as being key to professionalism.17 The regulating bodies provide professions with basic minimum standards, regulations, and codes of conduct, which are often intertwined with a profession's set of core beliefs or values rather than a clearly defined skill or knowledge-based behaviour.16 The tacit knowledge related to these beliefs and values is assumed through the professional socialization that a learner experiences when he or she participates in a practice setting.16 The learning related to this role identification and submersion in a new profession is hidden and achieved through observation and modelling.18 PBs are complex for students, and it is therefore not surprising that when students struggle in the clinical environment, it is often because they lack a PB skill.19 It has also been noted that if a student's problematic behaviour has not been addressed or remediated, it is unlikely to improve19 and can result in complaints to the respective professional college.20 By assessing these items, academic programmes can emphasize to students which skills and behaviours are valued; assessing PBs in both the academic and the clinical environments serves to reinforce the value of these behaviours.

It must be recognized that, because of their complex nature, PBs cannot be fully assessed in a single, time-limited encounter but rather must be evaluated during ongoing relationships with clients, other members of a health care team, or both. For example, the Essential Competency Profile for Physiotherapists in Canada lists a key competency for physical therapists in their role as a professional as “Conducts self within legal/ethical requirements.”3(p.14) A relevant component of this competency in the clinical environment is obtaining informed consent. In our study, panellists thought it was not feasible to obtain informed consent during a clinical skills station because of time constraints; as a result, they eliminated the item “Obtains informed consent” from the original 31 criteria during the modified Delphi process. The panellists did think, however, that “Obtains permission to proceed with assessment or treatment techniques” was important and feasible in the context of the OSCE. In the recently developed Canadian Physiotherapy Assessment of Clinical Performance,21 whose foundation is the Essential Competency Profile, the role of professional has similar challenges in that the identified competencies are broad and more appropriately assessed in the authentic clinical environment.

Canadian university PT programmes use the OSCE as one measure to assess PBs.11 It gives students an opportunity to be evaluated on their clinical skills and PBs. The feedback that students receive in PB assessment can be valuable to those still struggling to integrate these complex behaviours into their clinical practice (e.g., effective communication, respect). The value of the OSCE lies in the practice element, which students perceive as enhancing their learning experience.22 Practice and reinforcement through OSCEs support students' mastery and assist them in identifying behaviours that are considered important for their discipline. This study is important because it highlights to educators the PBs that are valued and, therefore, where students should focus their attention for learning. This learning gives students a foundation for the intricacy of practice.

Key PBs have been identified by others using consensus methods.4,5 In a previous national study,11 participants identified 31 PBs as items that have been assessed in an OSCE context. Feasibility is an important consideration in the assessment of PBs,23 and we eliminated several items during our consensus exercise because they were deemed infeasible in an OSCE context. The challenge of assessing these items for feasibility could be related to the complex nature of professional issues that arise in an authentic clinical environment. Of the 18 behaviours that remained after the modified Delphi process and that were discussed in the focus group, one item (“Presents a professional appearance”) was judged to be unnecessary in an OSCE. On further probing into these results, it was apparent that focus group participants were not indicating that the item was unimportant but rather that, in the OSCE context, it should be a minimum standard and therefore did not need to be included as a separate item. Professional appearance or image has been acknowledged as a component of professionalism in the literature,5,16 and lapses in professional appearance may be a more meaningful indicator of a student's professionalism in the more authentic clinical environment.

Participants classified four behaviours as clinical skills. The blurring of behaviours that can be deemed professional versus clinical or technical skills is acknowledged in the literature.20,24 For example, Rogers and Ballantyne20 were purposeful in defining and excluding clinical skills from their definition of PB. We eliminated these technical skills items (see Table 3) from our PB items.

Participants classified 10 items as either PBs or communication. As noted in previous work,11 this finding was expected because PBs are often contextualized by the interaction between individuals, which by definition cannot ignore elements of communication. Using follow-up questions to focus group members, we determined that of these 10 remaining items, 4 were PBs and 6 were communication skills (see Tables 3 and 4). When the members of the focus group needed clarification, they referred to the Essential Competency Profile3 to facilitate their decisions. As per the profile,3 we felt that these two categories were distinct yet interdependent domains and that separating them would enhance clarity for both learning and assessment. As a result, these communication items will be removed from the PB section of the OSCE assessment and added to the communication section. We recognize the interdependence of these two domains and will continue to ensure that both PBs and communication are assessed in each OSCE and in the authentic clinical environment.

Participants determined that one remaining behaviour (“Responds appropriately to patient feedback”) was either a clinical skill or a PB, depending on what the student was responding to. We decided to include this item as a PB, noting that the intention would need to be framed around the interpersonal response to patient feedback. Therefore, we retained five items as potential PBs to be assessed in an OSCE.

This study has some potential limitations that need to be considered. We had a relatively small sample of participants for both the modified Delphi technique and the focus group. However, the majority of the modified Delphi panellists had more than 16 years of clinical experience and had participated as examiners in five or more OSCEs. Although the panel was not representative of the full range of clinical experience (i.e., there were no novice clinicians), we think that this high level of experience better informed our study. The participants did represent diverse clinical environments from both the private and the public sectors.

We drew participants from a local sample to ensure applicability to our OSCE. Familiarity with our existing OSCEs may have influenced participants' feasibility ratings of new items, and ratings may differ for academic programmes that use other OSCE formats. Finally, our results may have differed if we had used an a priori consensus threshold of 100% instead of 70%; however, 70% is an accepted level of agreement14 and was considered reasonable for this exercise.

Conclusion

In summary, this study provides preliminary evidence to support the feasibility and importance of assessing five PB items in practical-skills OSCEs for entry-to-practice PT students. The challenge our participants had in assessing and clustering PB items in this study confirms the complex nature of professionalism and highlights the importance of clinical context. Further study is required to evaluate PB items in different OSCE formats and to determine their relationship to professionalism in authentic clinical environments.

Key Messages

What is already known on this topic

PBs are measured in both the academic and the clinical environments in Canadian university PT programmes. Professional behaviour is a key factor in the success of PT students. Many different sample behaviours are being used to assess PBs in OSCEs at Canadian universities.

What this study adds

We used a consensus exercise to explore a previously generated list of PBs used in OSCEs for entry-to-practice PT students. Participants re-categorized sample behaviours into clinical skills, communication, and PBs. Ultimately, they identified five sample behaviours as being important and feasible in measuring student PBs in OSCEs in Canadian university PT programmes.

References

