1. Assessment formats
Understanding and application of basic and clinical sciences knowledge were assessed using written tests administered four times a year.22 We anticipated that such a progressive approach would allow rich feedback for students, promote deep rather than superficial learning and provide timely monitoring of applied knowledge across all content areas of the curriculum.23
In addition to assessing clinical competence using objective structured clinical examination (OSCE) style clinical assessments, work‐based assessments captured the performance of students over a period of time. We anticipated that these formats would be appropriate for detecting issues in traditionally difficult areas to assess, such as communication skills and professionalism.24
In the assessment of professionalism, assessors provided judgements on individual students' written reflections aimed at promoting self‐regulatory behaviour. This also included assessment tasks on curricular themes, such as Indigenous health, and interprofessional learning activities developed and assessed longitudinally. Additionally, a student progress record documented any late submissions. These assessments were guided by Hodges et al.'s25 three levels of assessment of professionalism: the individual, the interpersonal and the societal–institutional.
2. Learning advisor system
A key element in supporting programmatic assessment, encouraging students to self‐monitor and fostering other elements of professionalism was the learning advisor system, in which a group of five to six students was allocated to a clinician from a teaching hospital.26, 27 Discussions focused on students' learning plans and progress records.
3. Proportionality
In adapting the principle of proportionality,28, 29 the stakes of decisions about a student's progress were intended to be proportional to the credibility or richness of the information.29 Students and assessors were informed that individual assessments in programmatic assessment were regarded as low or medium stakes and were intended to optimise self‐directed learning.
4. Decision making and progression
Decision making about student progression was managed by a portfolio advisory group, which met mid‐year to consider progress and the need for remediation, and at the end of the year to determine progression to the next year. The longitudinal nature and the information richness of the triangulation process were intended to make programmatic assessment defensible for high‐stakes assessment of learning.30
5. Remediation
6. Integration with other aspects of the curriculum
Key curricular elements provided important assessment tasks ‘as’ and ‘for’ learning. Team‐based learning (TBL) was the dominant teaching activity in Year 1, providing a collaborative approach in which students worked in small teams on authentic cases. The sequencing of activities encouraged students to apply conceptual knowledge through a series of steps involving preparation, readiness assurance testing, feedback and the application of knowledge through clinical problem‐solving activities.31, 32 It was supported by the flipped classroom method of learning and teaching.31, 32, 33, 34 Thus, readiness assurance tests and explanatory mechanistic diagrams of the specific clinical problem in each TBL were included in the student progress record.