Abstract
Objective To describe the current methods used by English medical schools to identify prospective medical students for admission to the five year degree course.
Design Review study including documentary analysis and interviews with admissions tutors.
Setting All schools (n = 22) participating in the national expansion of medical schools programme in England.
Results Though there is some commonality across schools with regard to the criteria used to select future students (academic ability coupled with a “well rounded” personality demonstrated by motivation for medicine, extracurricular interests, and experience of team working and leadership skills) the processes used vary substantially. Some schools do not interview; some shortlist for interview only on predicted academic performance while those that shortlist on a wider range of non-academic criteria use various techniques and tools to do so. Some schools use information presented in the candidate's personal statement and referee's report while others ignore this because of concerns over bias. A few schools seek additional information from supplementary questionnaires filled in by the candidates. Once students are shortlisted, interviews vary in terms of length, panel composition, structure, content, and scoring methods.
Conclusion The stated criteria for admission to medical school show commonality. Universities differ greatly, however, in how they apply these criteria and in the methods used to select students. Different approaches to admissions should be developed and tested.
Introduction
Medicine remains one of the more oversubscribed university courses. Failure to gain admission despite attaining the highest academic grades can lead to understandable resentment and perceptions of bias and unfairness in the system.1 Lumsden and colleagues suggested that admission to medical schools in the United Kingdom was based on procedures that were often “secretive and varied.”2 Others have suggested discriminatory practices exist.3,4
The recent independent review of admissions to higher education (the Schwartz report) emphasised the need for a fair and transparent admissions system, especially for subjects such as medicine where demand exceeds supply and where it is difficult for staff to select from a growing pool of academically highly qualified candidates.5 While emphasising its belief in the autonomy of institutions over admissions policies, the report recommended that all admissions systems should strive to use assessment methods that are “reliable and valid.”
Admissions tests used to select medical students in other countries
In the United States, requirements for admission to medical school vary from school to school and include minimum academic levels (indicated by undergraduate grade point averages), performance in the medical college admissions test (MCAT), and interview6 to identify one or more of a range of non-academic characteristics.7 A similar approach to selection is seen among the 17 Canadian medical schools.8 In both the US and Canada, medicine is read as a postgraduate degree. This is not the case in other countries—for example, in Australia, admission may be as a postgraduate or directly from high school. None the less, the methods adopted in selection again comprise a combination of minimum academic attainment, cognitive testing, and interview.9 In Europe, there is even greater heterogeneity—for instance, in the Netherlands, medical schools may select a proportion of entrants to their course via interview and other methods, but the remaining candidates are identified through a lottery among school leavers weighted for academic attainment.10
The heterogeneity in selection processes exists both between and within countries. We examined diversity in England as a first step towards meeting the need for developing and testing different approaches to selection. Specifically, we identified the stated criteria for admissions, the tools with which such criteria are sought, and the processes by which those tools are applied.
Methods
In 1997, the medical workforce standing advisory committee (MWSAC) published a report recommending an increase of 1000 places at medical school to meet the future demand for doctors within the NHS. The government accepted the recommendation and charged a joint implementation group (JIG), consisting of representatives of the Higher Education Funding Council for England and the Department of Health, with coordinating the allocation of these additional places.
Consent for participation
The details of the study were initially presented at a meeting of the Council of Heads of Medical Schools, where all present gave verbal consent to proceed with the work. We then wrote to the heads of all the schools involved in the national expansion programme in England, setting out further details of the review and requesting formal consent for participation.
Once written consent was provided, the heads of schools were asked to nominate a member of staff as contact person—usually the admissions dean or other senior staff member responsible for overseeing the school's admissions process. We then contacted this person and explained the purposes of the review. All schools agreed to participate in the review.
Information collected in the review
We mapped out a generic “pathway” from receipt of the student's application form from UCAS (the Universities and Colleges Admissions Service) by the school through to making an offer to a candidate who had applied to join the five year medical course. We identified key points on this pathway and developed questions necessary to ascertain the nature of the process at these points. The questions were collated into a single proforma, which was used as the template for all subsequent data abstraction.
Information sources
Our first step was to collect and review all documents and information in the public domain relating to each school's selection process. These included prospectuses (both hard copies and electronic) and other documents made available through school and university websites. AP reviewed the documents and abstracted and entered all relevant information on to an electronic copy of the proforma.
Once each school's documents had been reviewed, we forwarded a copy of the proforma including the abstracted information to the school's contact person for checking. We arranged a telephone interview with the contact person to go through the proforma and in particular to fill in any gaps. All telephone meetings were conducted by the same person (AP) and arranged at a time convenient for the contact. The interview was audiotaped (with consent) to provide a contemporaneous record and a backup to the handwritten notes taken during the discussion. Information from the completed proforma was transferred to an electronic database. Where necessary, we contacted the interviewee by email or telephone, or both, for further clarification. All schools were provided with copies of their own proforma plus a copy of the review report (on which this paper is based) to provide an opportunity to check for factual errors and to comment on our observations.
Results
Twenty two of the 23 English schools participating in the expansion programme admit applicants to a five year “traditional” medical course. (The remaining institution is not a separate medical school: it admits students to study for phase I (two years) before they are integrated with students from another school for phase II (three years); because of this arrangement we considered it as a separate medical school.) Within this group are two schools that offer a non-foundation six year course, which includes a one year BSc/BA degree. We grouped schools into five categories depending on the steps taken to process the UCAS form and to decide whether or not to make an applicant an offer (table 1).
Table 1 Pathways used by schools to process UCAS applications and make offers
Pathway | No of schools |
---|---|
UCAS form assessed on academic and non-academic criteria, offer made | 2* |
UCAS form assessed on academic and non-academic criteria, invitation to interview, offer made | 15† |
UCAS form assessed on academic criteria only, invitation to interview, offer made | 1 |
UCAS form assessed on academic criteria only, additional written assessment undertaken, invitation to interview, offer made | 1‡ |
UCAS form assessed on academic and non-academic criteria, additional written assessment undertaken, invitation to interview, offer made | 3§ |
*One will interview some candidates if information on the UCAS form is unclear; one interviews selected international, widening access, and mature non-graduate applicants.
†Two operate identical admission processes; until 2004, one made offers without interview.
‡Additional written information is completed by candidates at interview and used only to help determine offers for borderline candidates (when the candidate's personal statement on the UCAS form is also scanned).
§At one school, all candidates take the BMAT (biomedical admissions test) and are asked to provide further information on a supplementary application questionnaire; some candidates may be asked to undertake additional written assessments, and certain candidates for whom the costs of attending interview are prohibitive may be offered places without interview. At another school, all candidates take the BMAT and, in addition, some may be asked to undertake additional written assessments.
Selection criteria
Academic criteria—Table 2 shows the academic criteria used by schools to identify applicants who would progress further in the assessment process. High A level grades (A or B in the exams taken at age 17-18) in two or more science subjects are a common requirement, but there is discordance with regard to the acceptability of A level re-sits (that is, exams passed at the required grade on a second or subsequent attempt).
Table 2 Academic criteria used by schools in selection
Criteria | Details |
---|---|
A level grades | 18 schools require AAB; two require AAA; and two require ABB. One stated that depending on circumstances and, in particular, GCSE grades, BBB might be acceptable |
A level re-sits | 17 schools did not look favourably on A level re-sits, with applicants not considered or considered only if there had been exceptional extenuating circumstances. Five accept re-sits if passed at the grade originally requested, with two of these commenting not only that they would consider re-sits without prejudice but that they saw the decision to re-sit as evidence of a commitment to read medicine |
A level subjects | All schools required A levels in chemistry or biology, or both, with some schools specifying grade A in these subjects; some schools advised that the third A level should be in a science/maths related subject, while others actively encouraged applicants to consider a non-science subject |
Other qualifications | All schools would consider graduates with a good degree (upper second or higher), with one school indicating that in exceptional circumstances a lower second might be acceptable. Twelve schools made explicit reference to “access to medicine” courses as acceptable means by which academic criteria could be satisfied. All schools recognise alternative equivalent education qualifications—for example, the International Baccalaureate and Scottish Highers |
Non-academic criteria—With the exception of two, all schools considered some aspect of non-academic criteria when assessing the student's personal statement and the referee's report presented in the UCAS application form. The non-academic criteria identified by each school varied in terms of number and nature but there was some commonality in terms of evidence of motivation for, and a commitment to, medicine; team working, leadership, and the acceptance of responsibility; a range of extracurricular interests; and experience of working in health or social care settings.
Short listing candidates for interview
The number of people involved in assessing applicants' UCAS forms ranged from one or two to more than 30 across schools. In four schools this included lay people—that is, people who were not members of the academic or clinical staff, such as a local headteacher. Only 11 schools offered training for people assessing the UCAS forms, and within this group the nature of training varied considerably (table 3). Among the schools that assessed non-academic criteria, 13 either double marked each form or used a second assessor when a form had been rejected by the main assessor.
Table 3 Assessment of UCAS forms
Question | Details |
---|---|
Who assesses the UCAS form? | The pool of assessors ranged from one or two to more than 30 at each school. All schools draw assessors from clinical and academic staff. Three schools have lay assessors, and a fourth has lay people on the admissions committee that oversees the selection process |
Training of assessors | 11 schools offer training: for seven this involved in-house briefing sessions with the marking of sample forms; two held informal sessions at which assessors discussed processes; one used an external two day course; and one had a formal training manual |
No of assessors per form | Nine schools have two assessors score each form independently, with discrepancies resolved by a third; four refer a UCAS form to a second assessor if the first has rejected the candidate. At five schools forms are single marked, but either a sample of each assessor's forms is checked and, if discrepancies are found, the entire set of forms is second marked (three schools), or the forms are then reviewed by the admissions tutor or subdean (two schools). At one other school forms are single marked, with the range of marks awarded by each assessor “eyeballed” by a senior admissions tutor. One school would not reveal details of the number of assessors |
Method of assessing the UCAS form
Of the 20 schools that scored UCAS forms on academic and non-academic criteria, 11 have complex scoring systems based on the allocation of marks (often from 0-3 or 0-5) for a set of predefined criteria, with written guidance to assist scoring. At six other schools assessment is less complex and aims to divide applicants into those who should or should not be called for interview, and those who are borderline. Of the remaining schools, one assesses the UCAS form on a combination of GCSE (general certificate of secondary education, taken at 15-16 years) results, predicted A level grades, and healthcare experience with other non-academic criteria coming into play only for applicants who fall short on this initial scoring, while another has developed an online questionnaire that all applicants complete. This is scored electronically, and the results are coupled with an assessor's scoring of the referee's statement. The last school did not reveal details of how the form was assessed.
Two schools use the biomedical admissions test (BMAT, a two hour paper developed by the University of Cambridge Local Examinations Syndicate that tests critical thinking; www.bmat.org.uk) to provide information additional to that presented on the UCAS form to select candidates, with one of these also requiring candidates to provide A level module scores as further background information. One other school has trialled the personal qualities assessment (PQA) tool but at present does not use it as part of the formal selection process. The PQA is an instrument designed to assess a range of personal qualities considered to be important for the study and practice of medicine, dentistry, and other health professions (www.pqa.net.au). It comprises questions grouped into three sections; the first measures cognitive skills, the other two measure relevant personality/attitudinal traits.
The interview
Who is on the panel?—Only one school said it would be prepared to hold one-on-one interviews, and then only if one of the designated interviewers for that day was unable to attend at the last minute—for example, as a result of illness. All other schools used panels of at least two or three interviewers, with two schools explicitly preferring larger panels of four to five people. Interview panels invariably comprised one or more senior members of the academic or clinical staff. Ten schools included lay members, and six included senior medical students. All schools attempted to achieve a mix of men and women on the panel and, when possible, to include panellists from different ethnic backgrounds. As with the training of assessors of the UCAS forms, training provision varied between medical schools (table 4).
Table 4 The interview: training, length, and scoring
Question | Details |
---|---|
Do the panellists receive training in the interview process? | 18 schools required all interviewers to undergo training either provided in-house (13 schools), by the NHS or university (three), or by external agencies (two) |
Length of interviews | 18 schools interview for 15-20 minutes; a further three interview for 25 minutes, 30-45 minutes, and 40 minutes respectively. Two schools stated that interview length “varied” and could not be specified |
How is the interview scored? | 14 schools score interview numerically. At six scoring is non-numeric with candidates categorised as offer/borderline/reject. Two schools were not explicit about their scoring process |
Interview format—Interviews were designed to elicit replies from candidates that would assist in assessing their performance against several prespecified categories (usually, but not always, correlating with the categories used to assess the UCAS form). Two schools had a standardised process in which candidates were asked predetermined questions drawn from a bank formulated by the school. All other schools, bar three, adopted a more mixed approach in which questions were partly predetermined and partly interviewer led, depending on the candidate's responses and statements on the UCAS form. At the remaining three schools questions were solely interviewer led but still directed to elicit information on pre-agreed criteria. Two schools have introduced variations to the simple question and answer approach within the interview. One has introduced questions to elicit information on communication skills based on a video viewed by candidates while waiting for their interview; at the other, candidates are given a question to consider, again while waiting for their interview, which they are told they will be required to answer during the interview. Other schools are considering the introduction of problem solving tasks and group work as part of the interview process, though these components are not yet in place as part of the actual assessment.
Discussion
Our review suggests that there is commonality across schools with regard to the criteria used to select future students: academic ability coupled with a well rounded personality demonstrated by motivation for medicine, extracurricular interests, and experience of team working and leadership skills. Most schools operate a range of systems based around a two stage approach: shortlisting for interview—on the basis of predicted academic performance and the information presented in the UCAS form—followed by an interview. The implementation of this approach, however, varied substantially: some schools do not interview; some shortlist for interview only on predicted academic performance, while those that shortlist on a wider range of non-academic criteria use various techniques and tools. Some schools use information presented in the candidate's personal statement and referee's report while others ignore this because of concern over bias. A small number seek additional information from supplementary questionnaires filled in by candidates. Interviews vary in terms of length, panel composition, structure, content, and scoring methods.
What methods can be used to select students?
Previous academic performance has been shown both in the UK and the US to predict future academic performance, though correlations with clinical skills and postgraduate performance are less clear.11,12 In the UK, the use of GCSE grades and predicted A level grades as a marker of academic performance has long been the basis of medical school selection. The key studies providing evidence to support this, however, used data from cohorts in the 1970s, when students needed grade B in three science subjects.13 The widening acceptance of a non-science subject in combination with sciences makes it difficult to know how applicable these findings are to current and future students. Furthermore, the increasing number of students gaining three A grades at A level makes differentiation between applicants problematic. Such difficulties have led some to argue for the introduction of intellectual aptitude tests,14 and in this review we identified schools using in-house questionnaires or external tests, or both, as part of the selection process. Others note, however, that because candidates for medical school are already highly preselected, cognitive tests are unlikely to discriminate much further between them, and instead conclude that “using a more finely developed marking system at the top end (A+ and A++, for example) has the greatest potential towards enabling enhanced selection by medical schools' admissions staff.”15
Consideration of non-academic characteristics, such as empathy, conscientiousness, and team working, has some face validity, but there is no absolute consensus on the characteristics medical schools should be seeking among future doctors—indeed, in a review of admissions processes in the US, Albanese et al noted that 87 different personal qualities relevant to the practice of medicine have been identified.7 Moreover, the literature offers little guidance on how best to assess these characteristics. We found that most schools used a combination of the candidate's personal statement, the referee's report, and an interview, though these were not applied in a standardised way across schools. Compelling evidence of the utility of the referee's report and personal statement is lacking,16 and while there is evidence that a structured interview format and the use of trained and experienced interviewers may improve the reliability and validity of the interview, controversy remains as to whether the costs of such a process justify the gains.17,18 This is relevant, as schools in the survey reported difficulty in recruiting and training staff for interviews.
A way forward?
One option to reduce differences between schools' selection processes is the implementation of a centralised admissions system. Such a scenario is not far fetched: as noted, standardised cognitive tests are already in use at some schools in this survey. Moreover, two schools (Nottingham and St George's) conduct joint interviews for graduate students, and we understand there are discussions among northern medical schools about devising a shared bank of interview questions. Crucially, however, the present paucity of evidence on which to base selection cautions against the implementation of a single process based on present procedures. Rather, in the era of evidence based medicine, there is a case for developing a system through a process of experimentation and evaluation, the first stage of which is to clarify what type of student we want to select and why. In principle, three or four assessment processes could be established nationally, with applicants randomised to each and outcomes tracked over time. If a centralised approach is rejected because medical schools want to retain local systems that allow them to recruit a distinctive type of student, there is no less a need to assess more stringently the validity of their selection methods in identifying students who meet their local criteria.
What is already known on this topic
A recent review of admissions to UK higher education emphasised the need for a fair and transparent system
This is particularly necessary for entry to medical school, where demand exceeds supply and there is a growing pool of highly qualified candidates
What this study adds
There is no single process for selection at English medical schools and too little evidence to develop one
Developing a clear definition of suitability for medical training is the first priority, whether locally or nationally
We acknowledge the support and collaboration of the UK Council of Heads of Medical Schools, and the admissions staff at each school.
Contributors: JP, JM, AS, HT, PS, and RL conceived and prepared the study protocol for the National Evaluation of Medical Schools project, of which this study (review of admissions processes) is a key component. JP, JM, and AP constructed the framework that informed the production of the data proforma. AP undertook the documentary analysis and interviews with admissions tutors, supervised by JM and JP. JP produced the first draft of the paper. All authors commented on subsequent drafts and have approved the final version. JP and JM are guarantors.
Funding: Department of Health (Evaluation of the National Expansion of Medical Schools project; reference No 0160056).
Competing interests: None declared.
Ethical approval: The West Midlands multicentre research ethics committee (reference No 04/MRE07/58).
Comment made by an admissions tutor
The problem was excluding people from interview, not taking them for interview, because they were all so good. That was the problem, you felt heartbroken when you got down to the last 10 forms and you had only three places left. How on earth would you select them? It's terribly difficult, and quite unfair. People would joke that we should throw them all up in the air and invite the first 40 you pick up for interview—it would have been just as fair. Several of my colleagues wanted to introduce lotteries.
References
- 1. Laura's moment of truth. http://news.bbc.co.uk/1/hi/education/883358.stm (accessed 19 Jan 2006).
- 2. Lumsden AM, Bore M, Millar K, Jack R, Powis D. Assessment of personal qualities in relation to admission to medical school. Med Educ 2005;39:258-65.
- 3. McManus IC. Factors affecting likelihood of applicants being offered a place at medical schools in the United Kingdom in 1996 and 1997: retrospective study. BMJ 1998;317:1111-6.
- 4. Esmail A, Nelson P, Primarolo D, Toma T. Acceptance into medical school and racial discrimination. BMJ 1995;310:501-2.
- 5. Admissions to Higher Education Steering Group (chair: Steven Schwartz). Fair admissions to higher education: recommendations for good practice. London: Department for Education and Skills, 2004.
- 6. Association of American Medical Colleges. www.aamc.org/students/applying/about/start.htm (accessed 18 Jan 2006).
- 7. Albanese MA, Snow MH, Skochelack SE, Huggett KN, Farrell PM. Assessing personal qualities in medical school admissions. Acad Med 2003;78:313-21.
- 8. Admission requirements of Canadian faculties of medicine. www.afmc.ca/docs/2005AdBk.pdf (accessed 19 Jan 2006).
- 9. Admission to Australian medical schools. www.medical-colleges.net/medical.htm (accessed 19 Jan 2006).
- 10. Coebergh J. Dutch medical schools abandon selection for lottery systems for places. StudentBMJ 2003;11:138.
- 11. Lumb AB, Vail A. Comparison of academic, application form and social factors in predicting early performance on the medical course. Med Educ 2004;38:1002-5.
- 12. Ferguson E, James D, Madeley L. Factors associated with success in medical school: systematic review of the literature. BMJ 2002;324:952-7.
- 13. McManus IC, Smithers E, Partridge P, Keeling A, Fleming PR. A levels and intelligence as predictors of medical careers in UK doctors: 20 year prospective study. BMJ 2003;327:139-42.
- 14. Nicholson S. The benefits of aptitude testing for selecting medical students. BMJ 2005;331:559-60.
- 15. McManus IC, Powis DA, Wakeford R, Ferguson E, James D, Richards P. Intellectual aptitude tests and A levels for selecting UK school leaver entrants for medical school. BMJ 2005;331:555-9.
- 16. Ferguson E, James D, O'Hehir F, Sanders A. Pilot study of the roles of personality, references and personal statements in relation to performance over the five years of a medical degree. BMJ 2003;326:429-32.
- 17. Kreiter CD, Solow C, Brennan RL. Investigating the reliability of the medical school admissions interview. Adv Health Sci Educ Theory Pract 2004;9:147-59.
- 18. Salvatori P. Reliability and validity of admissions tools used to select students for the health professions. Adv Health Sci Educ Theory Pract 2001;6:159-75.