Medical Journal, Armed Forces India. 2019 Mar 29;76(2):207–212. doi: 10.1016/j.mjafi.2018.12.012

Blueprinting of summative theory assessment of undergraduate medical students in microbiology

JS Gill a, Sourav Sen b
PMCID: PMC7244858  PMID: 32476720

Abstract

Background

Assessment drives learning. Written assessments at many universities lack uniformity and validation, and subjectivity influences assessment. Blueprinting has been used as a content validity tool.

Methods

In this study, the Maharashtra University of Health Sciences (MUHS) second year MBBS question papers in Microbiology from the last 5 years were evaluated for content validity. The desired weightage for all topics in microbiology was assigned by the faculty of the Department of Microbiology. The university papers were also evaluated for the level of cognitive domain tested. Closed-ended feedback from the faculty was collected and statistically evaluated.

Results

The study revealed both overrepresentation and underrepresentation of many topics across all of the last 5 years of university papers in microbiology. As per the revised Bloom's taxonomy, only 8% of the questions tested Bloom's level 1, 20% tested level 2, and 8% tested level 3, whereas 64% of the questions were ambiguous. Faculty feedback revealed a significant impact (p < 0.05) of blueprinting in microbiology.

Conclusion

Assessment should be aligned to learning objectives, and blueprinting improves content validity.

Keywords: Assessment, Blueprinting, Validity, Curriculum, Educational measurement

Introduction

An important responsibility of a medical teacher is to conduct assessment of learning. This forms an essential component of medical education and is an integral part of the medical curriculum.1 Many medical colleges and institutes still use conventional written assessment tools. Such tools may not observe the principles of evaluation and have many drawbacks: subjectivity in setting theory papers, lack of uniformity, no prevalidation by peer reviewers, and specific learning objectives that are not clearly defined. Qualitative feedback from students across various medical colleges often suggests that question papers are not framed properly; students' complaints were that the question paper lacked uniformity and was lengthy.2

Blueprinting has been used to increase the validity of examinations.3 A blueprint is prepared as a table in which each question is placed based on its objective and content.3 Studies have shown a positive impact of blueprinting on students' performance and assessment.3, 4 Blueprinting provides a framework to examiners for setting question papers; it reduces interexaminer variability and increases the validity of the examination.4

No studies have been conducted in India on blueprinting of the medical microbiology curriculum and its content validity.

Aims and objectives

The primary objective of the study is to examine the content validity and weightage given to different areas in the subject of microbiology in second year MBBS summative written examinations held by the Maharashtra University of Health Sciences (MUHS) over the last 5 years (2012–2016).

The secondary objective of the study is to prepare a blueprint that can be used in the second year MBBS summative written examination in the subject of microbiology.

Materials & methods

The Medical Council of India (MCI) medical curriculum includes a written assessment of second-year MBBS students in microbiology. This study was based on an analysis of the second-year MBBS microbiology question papers of the last 5 years (2012–2016) held by the MUHS.

Steps involved in blueprinting

  • 1. Listing all content areas in the syllabus of microbiology

As per the MUHS guidelines, the contents of paper I and paper II are as follows:

Paper I: General microbiology; systematic bacteriology, which includes Chlamydia and Mycoplasma; and associated applied microbiology.

Paper II: Virology, immunology, parasitology, mycology, and associated applied microbiology.

  • 2. Pattern of the question papers

As per MUHS guidelines, the mark allotment for microbiology is 40 marks each for paper I and paper II (50 marks each if the marks of the optional questions are added). The pattern of each paper was as follows:

Each paper had three sections: A, B, and C, all of which were compulsory. Section A contained 16 multiple choice questions (MCQs); each MCQ carried 0.5 marks for a correct answer, with no negative marking, for a maximum of 8 marks. Section B offered a choice of 5 out of 6 short answer questions (SAQs); each SAQ carried 4 marks, for a maximum of 20 marks. Section C offered a choice of 2 out of 3 long answer questions (LAQs); each LAQ carried 6 marks, for a maximum of 12 marks.

  • 3. Content validity

Weightage was calculated to establish content validity. Impact and frequency were used to calculate the desired weightage of each topic.5

Impact refers to the clinical relevance of a topic.6 Scoring was as follows: minimal clinical significance = 1; moderate clinical significance = 2; marked clinical significance = 3.

Frequency refers to the prevalence of a disease or health problem.6 Scoring was as follows: low prevalence = 1; moderate prevalence = 2; high prevalence = 3.
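The exact formula for the desired weightage is not stated above; the values in Table 1 and Table 2 are consistent with taking each topic's weightage as its impact x frequency product normalized over all topics, and the final marks as that weightage applied to the 40-mark paper. The following Python sketch, using the paper I scores from Table 1, illustrates this assumed calculation:

# Illustrative sketch of the assumed weightage calculation: each topic's
# desired weightage is its impact x frequency product normalized over all
# topics, and the final marks apply that weightage to the 40-mark paper.
# Impact/frequency scores below are those listed for paper I in Table 1.
paper_i_scores = {                       # topic: (impact, frequency)
    "History, classification, morphology, physiology, genetics": (2, 1),
    "Disinfection, sterilization, isolation and identification": (2, 2),
    "Gram-positive cocci": (3, 3),
    "Gram-negative cocci": (2, 1),
    "Gram-positive bacilli": (2, 1),
    "Gram-negative bacilli": (3, 3),
    "Spirochetes, rickettsiae, Chlamydia, Mycoplasma": (2, 2),
    "Mycobacteria": (3, 3),
    "Applied microbiology": (3, 3),
}

PAPER_MARKS = 40                         # marks per paper, excluding optional questions
total = sum(i * f for i, f in paper_i_scores.values())

for topic, (impact, freq) in paper_i_scores.items():
    weightage = impact * freq / total    # fraction of the paper
    marks = weightage * PAPER_MARKS      # e.g. 0.18 * 40 = 7.2 -> about 7 marks
    print(f"{topic}: {weightage:.0%}, about {round(marks)} marks")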

  • 4. Levels of cognition

All questions in the question papers of the previous 5 years were evaluated for the domain and the level of cognition tested. Questions were categorized into levels based on the revised Bloom's taxonomy action verbs used in the questions. The revised Bloom's hierarchy of cognitive learning was used to assess the cognitive dimension: level 1 comprised recognizing and recalling; level 2 included understanding, interpreting, classifying, and comparing; and level 3 included applying, analyzing, and evaluating.6
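As a purely illustrative aid, this verb-based categorization can be approximated by a simple keyword screen; the verb lists and sample questions below are hypothetical, and questions without a recognizable action verb are treated as ambiguous, mirroring the manual review used in this study.

# Hypothetical sketch: approximate assignment of a question to a revised
# Bloom's taxonomy level from its action verb; None means no recognizable
# verb was found and the question is flagged as ambiguous.
LEVEL_VERBS = {
    1: {"define", "list", "name", "recall", "recognize", "state"},
    2: {"classify", "compare", "describe", "explain", "interpret", "summarize"},
    3: {"apply", "analyze", "differentiate", "evaluate", "justify"},
}

def bloom_level(question: str):
    words = [w.strip(".,:;?!") for w in question.lower().split()]
    for level, verbs in LEVEL_VERBS.items():
        if any(word in verbs for word in words):
            return level
    return None  # ambiguous question

# Hypothetical example questions, not taken from the MUHS papers.
for q in ["Define sterilization and classify the methods of disinfection.",
          "Write a note on hospital-acquired infections."]:
    print(bloom_level(q), "-", q)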

Twelve faculty members from the Department of Microbiology provided written feedback on a Likert scale-based questionnaire. The questionnaire evaluated their perceptions regarding blueprinting in designing a valid theory assessment paper. Feedback was collected both before and after interactive sessions. A total of 11 questions with five-point Likert scale responses were used: strongly disagree, disagree, neutral, agree, and strongly agree.

The means and standard deviations of the scores given by the faculty members were calculated and compared using a paired t-test. Responses of strong disagreement and disagreement were merged; similarly, responses of agreement and strong agreement were merged. Faculty feedback questionnaire responses were presented as percentages. Statistical analysis was performed using Minitab, Version 18 (Coventry, UK).
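For illustration only, the same pre/post comparison can be reproduced with a paired t-test in Python using SciPy rather than Minitab; the 12 pre- and post-session scores below are placeholders, not the study data.

# Illustrative only: paired t-test comparing pre- and post-session mean Likert
# scores for 12 faculty members (placeholder values, not the study data).
import numpy as np
from scipy import stats

pre  = np.array([3.2, 3.5, 3.0, 3.8, 3.4, 3.1, 3.6, 3.3, 3.0, 3.7, 3.2, 3.5])
post = np.array([4.1, 4.4, 4.0, 4.6, 4.3, 4.2, 4.5, 4.1, 4.0, 4.6, 4.2, 4.4])

result = stats.ttest_rel(pre, post)      # paired (dependent samples) t-test
print(f"pre:  mean {pre.mean():.2f}, SD {pre.std(ddof=1):.2f}")
print(f"post: mean {post.mean():.2f}, SD {post.std(ddof=1):.2f}")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")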

This study was approved by the Institutional Human Ethics Committee of Armed Forces Medical College (AFMC), Pune. Informed consent was obtained from the faculty before data collection.

Results

Analysis of the theory paper pattern shows that both question papers (paper I and paper II) carried a maximum of 40 marks each and comprised two LAQs of six marks each, implying that 30% of the total marks were devoted to just two topics, while all remaining topics had to be accommodated in the other 70%. This goes against one of the main principles of assessment, namely that all topics should be given appropriate and adequate representation. Furthermore, there is an extreme concentration of topics in both paper I (owing to the clubbing of general microbiology, systematic bacteriology, and applied microbiology) and paper II (owing to the clubbing of parasitology, mycology, virology, immunology, and applied microbiology).

Desired weightage with expected number of questions and average marks for each topic of paper I and paper II are shown in Table 1 and Table 2, respectively.

Table 1.

Blueprint for microbiology paper I with desired weightage.

S. no. Topic I F W% W × 25 Final marks LAQ SAQ MCQ
1 History, classification, morphology and physiology of bacterial genetics. 2 1 4% 1 1.6∼2 4
2 Disinfection, sterilization, methods of isolation and identification. 2 2 8% 2 3.2∼3 6
3 Gram-positive cocci 3 3 18% 4.5 7.2∼7 1 2
4 Gram-negative cocci 2 1 4% 1 1.6∼2 4
5 Gram-positive bacilli 2 1 4% 1 1.6∼2 4
6 Gram-negative bacilli 3 3 18% 4.5 7.2∼7 1 1
7 Spirochetes, rickettsiae, Chlamydia, Mycoplasma 2 2 8% 2 3.2∼3 6
8 Mycobacteria 3 3 18% 4.5 7.2∼7 1
9 Applied microbiology 3 3 18% 4.5 7.2∼7 1

I, impact; F, frequency; W, weightage; W × 25, expected number of questions (out of the 25 questions set in each paper); LAQ, long answer question; SAQ, short answer question; MCQ, multiple choice question.

Table 2.

Blueprint for microbiology paper II with desired weightage.

S. no. Topic I F W% W × 25 Final marks LAQ SAQ MCQ
1 Antigen, immunoglobulin, complement. 2 2 6.8% 1.7 2.72∼3 6
2 Immunity and hypersensitivity. 2 2 6.8% 1.7 2.72∼3 6
3 Immunodeficiency states and immunological reactions. 2 2 6.8% 1.7 2.72∼3 6
4 DNA viruses. 3 2 10.3% 2.575 4.12∼4 1
5 RNA viruses. 3 3 15.5% 3.875 6.2∼6 1
6 Mycology 3 3 15.5% 3.875 6.2∼6 1 4
7 Protozoa and nematodes 3 3 15.5% 3.875 6.2∼6 1
8 Cestodes and trematodes 2 2 6.8% 1.7 2.72∼3 6
9 Applied microbiology 3 3 15.5% 3.875 6.2∼6 1

I, impact; F, frequency; W, weightage; W × 25, expected number of questions (out of the 25 questions set in each paper); LAQ, long answer question; SAQ, short answer question; MCQ, multiple choice question.

Analysis of the desired weightage against the actual weightage allotted in question paper I (Table 3) and question paper II (Table 4) over the last 5 years (2012–2016) shows underrepresentation of many topics despite their high impact and frequency, for example, gram-positive cocci and gram-negative bacilli in paper I and RNA viruses in paper II.

Table 3.

Year-wise analysis of actual weightage of contents of microbiology paper I.

S. no. Topic 2016 2015 Summer 2015 Winter 2014 Summer 2014 Winter 2013 Summer 2013 Winter 2012
1 History, classification, physiology of bacterial genetics. 18% 17% 13% 31% 28% 10% 22% 28%
2 Disinfection, sterilization, isolation and identification. 2% 23% 8% 10% 24% 26% 16%
3 Gram-positive cocci 9% 1% 2% 9% 1% 8% 1% 8%
4 Gram-negative cocci 1% 2% 1% 8% 1%
5 Gram-positive bacilli 11% 18% 11% 14% 9% 8% 10%
6 Gram-negative bacilli 2% 13% 9% 8% 13% 3% 16%
7 Spirochetes, rickettsiae, Chlamydia, Mycoplasma 21% 13% 9% 14% 9% 8% 3%
8 Mycobacteria 1% 1% 2% 2% 8% 12% 12% 2%
9 Applied microbiology 35% 25% 42% 17% 11% 15% 32% 33%

Table 4.

Year-wise analysis of actual weightage of contents of microbiology paper II.

S. no. Topic 2016 2015 Summer 2015 Winter 2014 Summer 2014 Winter 2013 Summer 2013 Winter 2012
1 Antigen, immunoglobulin, complement. 2% 18% 10% 31% 28% 10% 22% 28%
2 Immunity and hypersensitivity. 8% 1% 10% 10% 24% 26% 16%
3 Immunodeficiency states and immunological reactions. 2% 1% 14% 9% 1% 8% 1% 8%
4 DNA viruses. 13% 14% 13% 1% 8% 1%
5 RNA viruses. 11% 10% 10% 14% 9% 8% 10%
6 Mycology 22% 23% 8% 8% 13% 3% 16%
7 Protozoa and nematodes 31% 24% 15% 9% 14% 9% 8% 3%
8 Cestodes and trematodes 2% 8% 2% 8% 12% 12% 2%
9 Applied microbiology 9% 9% 12% 17% 11% 15% 32% 33%

Similarly, year-wise analysis of the university papers shows overrepresentation of many topics, most notably questions on antigen, immunoglobulin, and complement, which carried a weightage of more than 20% for four consecutive years from 2012 to 2015.

Further analysis of both paper I and paper II also revealed skewed representation of some topics over the years. For instance, in paper I, questions on gram-positive bacilli carried 18% weightage in the summer paper of 2014 yet did not appear at all in the winter paper of 2014. Similar trends were noted for the topic of disinfection and sterilization.
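The year-wise comparison above can be summarized programmatically; the sketch below flags topics whose actual weightage deviates markedly from the blueprint, using figures from Table 1 (desired) and the 2016 column of Table 3 (actual), with an arbitrary tolerance chosen purely for illustration.

# Illustrative sketch: flag over- and underrepresented topics by comparing the
# actual weightage in one paper against the desired blueprint weightage.
# Desired values are from Table 1, actual values from the 2016 column of
# Table 3; the 40% tolerance is an arbitrary illustrative choice.
desired = {"Gram-positive cocci": 18, "Mycobacteria": 18, "Applied microbiology": 18}
actual_2016 = {"Gram-positive cocci": 9, "Mycobacteria": 1, "Applied microbiology": 35}

TOLERANCE = 0.4
for topic, want in desired.items():
    got = actual_2016.get(topic, 0)
    if got < want * (1 - TOLERANCE):
        print(f"{topic}: underrepresented ({got}% vs {want}% desired)")
    elif got > want * (1 + TOLERANCE):
        print(f"{topic}: overrepresented ({got}% vs {want}% desired)")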

The previous years' question papers were analyzed for the cognitive domain being tested. Analysis revealed that only 8% of the questions tested level 1 of the revised Bloom's taxonomy cognitive dimensions, and questions testing Bloom's levels 2 and 3 represented merely 20% and 8%, respectively. The analysis further highlighted that the majority (64%) of the questions were ambiguous.

In total, 12 faculty members gave five-point Likert scale feedback on blueprinting (Fig. 1). The difference in mean scores between the preinteractive and postinteractive sessions was statistically significant (p < 0.05).

Fig. 1. Postinteractive session faculty feedback on blueprinting of question papers. LAQ, long answer question; SAQ, short answer question; MCQ, multiple choice question.

In the postinteractive session feedback, all 12 faculty members agreed that blueprinting of the microbiology syllabus would ensure that questions are framed according to the learning objectives and are uniformly distributed across the syllabus topics. Similarly, 100% of the faculty agreed that a blueprint of the syllabus would act as an essential guide for setting the theory paper, minimizing examiner bias and interexaminer variability.

Discussion

Assessment and evaluation are very important elements of teaching and learning. What is assessed and evaluated, how it is done, and the way results are communicated send a vital message to students about the relative importance of different topics and their relevance to health care.7 For any assessment to be valid, it must have proper coverage of the curriculum. Validity is defined as the extent to which an assessment accurately measures what it is intended to measure.8 A medical student should be assessed both on practical skills and on academic knowledge; however, knowledge lays the foundation for acquiring appropriate clinical skills, and it is best evaluated by written assessment.9

While setting a paper, consistent representation of all topics as per their weightage can be ensured by blueprinting. In simple terms, blueprinting bridges assessment and learning objectives. The aim of blueprinting is to limit two important threats to the validity of an assessment: construct underrepresentation (course contents not adequately sampled) and construct-irrelevant variance (inclusion of unsuitable items, too simple or too tough questions, or examiner bias). A blueprint clearly defines the syllabus topics, the domains of learning being evaluated, and the methods of assessment, and it indicates the marks carried by each question. It acts as a reference guide and layout for examiners setting question papers for medical students.10, 11 Preparing a blueprint ensures that the faculty who set the question paper know the content distribution and how many marks each content area should carry.12, 13, 14

In our study, paper setters' bias and affinity for some topics were observed, leading to overrepresentation of some topics and underrepresentation of many others. The question papers had LAQs of six marks and SAQs of four marks each; this pattern leaves many topics underrepresented or unrepresented because the allotted marks are consumed by a limited number of questions. It is proposed that each LAQ be allotted four marks instead of six and each SAQ three marks instead of four. Additional MCQs or one-mark questions would then allow coverage of a wide array of topics within the limited marks available. Adequate coverage of the course content is necessary to ensure the validity of the assessment.
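To illustrate the arithmetic behind this proposal, assuming the same numbers of LAQs (2) and SAQs (5) are attempted as in the current pattern, the reduced per-question marks roughly double the marks available for MCQs or one-mark questions:

# Illustrative arithmetic for a 40-mark paper, assuming the number of LAQs (2)
# and SAQs (5) attempted remains as in the current MUHS pattern.
PAPER_MARKS = 40
current  = 2 * 6 + 5 * 4    # LAQs at 6 marks, SAQs at 4 marks -> 32 marks
proposed = 2 * 4 + 5 * 3    # LAQs at 4 marks, SAQs at 3 marks -> 23 marks
print("marks left for MCQs/one-mark questions (current): ", PAPER_MARKS - current)   # 8
print("marks left for MCQs/one-mark questions (proposed):", PAPER_MARKS - proposed)  # 17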

Conclusion

Periodic analysis of the various methods of teaching and assessment should be performed. Blueprinting is critical in harmonizing course objectives with assessment content and helps in achieving academic excellence.

Conflicts of interest

All authors have none to declare.

References

  • 1.Garg R., Saxena D., Shekhawat S., Daga N. Analytical study of written examination papers of undergraduate anatomy: focus on its content validity. Indian J Basic Appl Med Res. 2013 Sep;2:1110–1116. [Google Scholar]
  • 2.Adkoli B.V., Deepak K.K. Blueprinting in assessment. In: Singh T., Anshu, editors. Principles of Assessment in Medical Education. Jaypee Brothers Medical Publishers; New Delhi: 2012. pp. 205–213. [Google Scholar]
  • 3.Mookherjee S., Chang A., Boscardin C.K., Hauer K.E. How to develop a competency-based examination blueprint for longitudinal standardized patient clinical skills assessments. Med Teach. 2013;35:883–890. doi: 10.3109/0142159X.2013.809408. [DOI] [PubMed] [Google Scholar]
  • 4.Ahmad R.G., Hamed O.A. Impact of adopting a newly developed blueprinting method and relating it to item analysis on students' performance. Med Teach. 2014;36:S55–S61. doi: 10.3109/0142159X.2014.886014. [DOI] [PubMed] [Google Scholar]
  • 5.Patil S.Y., Gosavi M., Bannur H.B. Blueprinting in assessment: a tool to increase the validity of undergraduate written examinations in pathology. Int J Appl Basic Med Res. 2015;5:76–79. doi: 10.4103/2229-516X.162286. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6.Bloom B., Englehart M., Furst E. 1956. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook I: Cognitive Domain. New York, Toronto: Longmans, Green. [Google Scholar]
  • 7.Narvekar R.S., Bhandare N.N., Bhandare P.N. Analysis of undergraduate pharmacology question papers at Goa medical college as regards to their content areas. Int J Sci Rep. 2016;2:182–186. [Google Scholar]
  • 8.Downing S.M. Validity: on the meaningful interpretation of assessment data. Med Educ. 2003;37:830–837. doi: 10.1046/j.1365-2923.2003.01594.x. [DOI] [PubMed] [Google Scholar]
  • 9.Shumway J.M., Harden R.M. AMEE Guide No. 25: the assessment of learning outcomes for the competent and reflective physician. Med Teach. 2003;25:569–584. doi: 10.1080/0142159032000151907. [DOI] [PubMed] [Google Scholar]
  • 10.Hamdy H. Blueprinting in medical education. N Engl J Med. 2007;356:387–395. [Google Scholar]
  • 11.Downing S.M., Haladyna T.M. Validity and its threats. In: Downing S.M., Yudkowsky R., editors. Assessment in Health Professions Education. Routledge; New York: 2009. pp. 21–56. [Google Scholar]
  • 12.Adkoli B.V., Deepak K.K. Blueprinting in assessment. In: Tejinder S., Anshu, editors. Principles of Assessment in Medical Education. Jaypee Brothers Medical Publishers; India: 2012. pp. 205–213. [Google Scholar]
  • 13.Patil S.Y., Gosavi M., Bannur H.B., Ratnakar A. Blueprinting in assessment: a tool to increase the validity of undergraduate written examinations in Pathology. Int J App Basic Med Res. 2015;5:S76–S79. doi: 10.4103/2229-516X.162286. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Vinod Kumar C.S., Suneeta K., Lakshmi R.P. Descriptive analysis of the MBBS microbiology question papers of RGUHS, Bengaluru. J Educ Res & Med Teach. 2014;2:29–32. [Google Scholar]
