Journal of Clinical Microbiology. 2003 Mar;41(3):917–918. doi: 10.1128/JCM.41.3.917-918.2003

Clinical Microbiology: Past, Present, and Future

Henry D. Isenberg
PMCID: PMC150327  PMID: 12624009

During the last two decades of the 19th century, a plethora of bacteria were isolated and designated etiological agents of human infectious diseases. As with many instances at the interface between cause and effective therapy, the further characterization of these alleged pathogens remained in the hands of a few devoted investigators until drugs with therapeutic potential became available. This vague period before the advent of proper cures for infections explains the shadowy origin of clinical or diagnostic microbiology. But, as R. Porter has stated, “history should be rooted in detail and as messy as life itself” (8); this is an undeniable description of the history of clinical microbiology, long the stepchild, frequently denied legitimacy, among the many siblings that constitute the science of microbiology. Yet the practice of clinical microbiology is the application of knowledge gained to the betterment of the human condition, the goal of clinical microbiologists. To appreciate the history of clinical microbiology, it must be said—without malice or rancor—that this practical side has earned us the disdain of those who emphasize theory exclusively. Our working behind the scenes is misinterpreted by colleagues in related fields whose egos require constant applause. Our role is belittled, but the wondrous ingenuity of our test objects underlines our contributions to health, disease diagnoses, and therapy.

The evolution of clinical microbiology is a response to clinical needs. This can take the form of a technical innovation garnered from the armamentarium of science in general or a reflection of advances in anti-infective therapy demanding recognition of microbial etiologies that now respond to therapeutic intervention. But this history is also a reflection of the political climate, of the perceived threats posed by an emerging field to more established organizational and professional entities. While this struggle greatly influenced the development of clinical microbiology as a distinct specialty, it is sufficiently controversial to be omitted from this essay.

One could propose that clinical microbiology got its start when various stains became available to indicate the presence of different organisms. The Gram stain helped divide the vast array of bacteria in various specimens into categories based, in addition to the staining reaction, on the anatomy of the organism and its source. The acid-fast stain aided in the recognition of mycobacteria, while the Albert stain suggested the presence of Corynebacterium diphtheriae, leading to the administration of a specific antiserum, an emerging therapy at the start of the 20th century that was usually given on the basis of clinical presentation. A variety of selective, supplemented, and enriched culture media became available for isolation, permitting more rapid recognition of the presumptive presence of a significant organism. Antisera were put to use as therapies and to identify isolates. The Quellung reaction became a standard task of interns and resident house officers, performed to identify the type of pneumococcus present and to guide specific serum therapy, as well as to avoid serum sickness by obtaining careful histories of previous treatment with equine antibodies against other etiologic agents. The advances in the grouping and typing of streptococci, salmonellae, and shigellae, the separation of Staphylococcus aureus on the basis of the coagulase reaction, and the growing awareness of the need for safe water and uncontaminated food items established the need for laboratories to assume these responsibilities. It was only logical that microbiology should join endeavors such as chemistry, hematology, and serology under the rubric of clinical pathology. Differential media especially designed to sequester species increased dramatically during World War II; military hospitals developed clinical microbiology sections devoted not only to recognizing agents endangering the health of troops in camps, in battle, and in foreign environments but also to assessing the responses of certain of the microorganisms isolated to several sulfonamides and to that hitherto unknown agent, penicillin. The subsequent explosion of antimicrobial agents (streptomycin, chloramphenicol, tetracyclines, and erythromycin) suggested to the reigning powers of medical facilities that clinical microbiologists could be phased out, since infectious disease would disappear before the onslaught of agents discovered through human ingenuity.

In the interim, cotton plugs gave way to Bakelite, polypropylene, glass, metal, and plastic closures; in-house medium preparation was relieved in part by the beginnings of commercially manufactured ready-to-use media, especially for mycobacteria and antimicrobial susceptibility testing. Alcohol, Bunsen, and Tirrill burners were replaced by microincinerators, eventually followed by disposable loops and transfer needles.

The prescient wisdom of hospital boards was soon shattered by the genetic versatility of the microbial world, dramatically demonstrated by the pandemic of S. aureus 80/81 in the late 1950s and early 1960s and by the emergence of gram-negative rods that revealed the superiority of bacterial physiology over the commercially prepared secondary microbial metabolites that had initially appeared so promising. To be sure, the tug of war between antimicrobial agents, natural and synthetic, and the microorganisms continues unabated, with signs that the evolutionary potential of the microbial world will succeed in the long run (4).

Since the 1960s, numerous ingenious innovations have been introduced and used in clinical microbiology laboratories. Parenthetically, none of them allowed microbiologists to abandon the dogma of the pure culture technique, to escape the need to study microorganisms under the most unnatural conditions, or to work with the laboratory variants of the organisms encountered in the natural setting. Molecular biology techniques promise to revolutionize the diagnosis of infectious disease—to date a promise still in its infancy.

Systems approaches began to replace the single test tube with but one substrate. Perhaps the first was double sugar iron agar (5) for the recognition of so-called enteric pathogens, followed by triple sugar iron agar (5) and the next tentative shortcut, the r/b tube (1). Rollender and Beckford, the inventors of the r/b tube, must be credited with initiating manufacturers' efforts to teach laboratory staffs the vagaries and problems of new system approaches. Shortly thereafter, the API system was introduced in the United States, bringing a novel numerical approach first to the identification of Enterobacteriaceae and then to that of several other categories of microorganisms (6). Similarly, the Roche Enterotube (7) used fewer reaction substrates to decrease the time needed to identify isolates to the species level; initially it was used for members of the Enterobacteriaceae and eventually for other microbial representatives. All systems eventually addressed yeasts and nutritionally demanding bacteria, obviating the multiple-tube approaches in use.
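
The numerical approach deserves a brief illustration. In later API strips, each group of three reactions is collapsed into a single octal digit (positive reactions scoring 1, 2, and 4 by position), and the resulting seven-digit profile is matched against a register compiled from characterized strains. The following Python sketch shows only that coding step; the reaction values and the register entry are invented for illustration and are not the actual API 20E database.

    def profile_number(results):
        """Collapse 21 boolean reactions into a 7-digit profile string."""
        if len(results) != 21:
            raise ValueError("expected 21 reactions (7 triplets)")
        digits = []
        for i in range(0, 21, 3):
            triplet = results[i:i + 3]
            # positive reactions score 1, 2, and 4 according to their position
            digit = sum(w for w, positive in zip((1, 2, 4), triplet) if positive)
            digits.append(str(digit))
        return "".join(digits)

    # Hypothetical register entry; a real system resolves the profile
    # against a database of many thousands of characterized strains.
    register = {"5144572": "Escherichia coli (illustrative entry only)"}

    reactions = [
        True, False, True,    # triplet -> 1 + 4 = 5
        True, False, False,   # -> 1
        False, False, True,   # -> 4
        False, False, True,   # -> 4
        True, False, True,    # -> 5
        True, True, True,     # -> 7
        False, True, False,   # -> 2
    ]
    profile = profile_number(reactions)
    print(profile, register.get(profile, "no match"))  # 5144572 Escherichia coli ...

The appeal of the scheme is that a bench worker's row of reactions becomes one compact number that can be tabulated, shared, and matched without further biochemical reasoning at the bench.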

Miniaturization, plastic disposables, and commercial medium production are the norm in present-day microbiology laboratories. Mechanization of analytical microbiological procedures gave way to automation. The first automated procedure was the Technicon Automated Antibiotic Susceptibility (TAAS) system (2), followed by variations of the TAAS approach such as Autobac (Pfizer) and MS2 (Abbott). Blood cultures were and are still addressed by BACTEC, also used for mycobacterial susceptibilities (6). The McDonnell Douglas system for the detection of life on Mars was modified into the Automated Microbiology System (AMS) for use in the clinical microbiology setting (9); now a product of bioMérieux after a Vitek interval, it has been followed by a growing number of machines performing the tasks of yesterday's technical staffs. Robotization, now available for the routines of chemistry, hematology, endocrinology, some serology, and urinalysis, may eventually find applications in some if not many aspects of clinical microbiology (3).

But the very nature of clinical microbiology has undergone profound changes in the past few decades. Hitherto unknown microorganisms and viruses have appeared, often as the result of human activities with environmental impacts. Etiological agents unknown locally may be introduced by the faster transportation of people, manufactured goods, and foods, wreaking havoc among immunologically innocent populations. The task of identifying, isolating, and evaluating therapies for these emerging host-harmful entities has become part and parcel of the present function of clinical microbiologists, aided frequently by the tools of molecular biology. The threat of bioterrorism adds a new dimension to the steadily growing number of emerging infectious agents.

I implied earlier that the modalities and procedures of clinical microbiology are intimately entwined with advances in the diagnosis and treatment of infectious diseases. The pressure for rapid and accurate diagnosis is immeasurably enhanced when therapeutic measures for a specific disease come into use. Thus, the impetus to improve microbiological acumen was felt as early as the era of serotherapy, when that acumen was needed to monitor vaccine effectiveness and to identify agents of sexually transmitted diseases for early chemotherapeutic measures; it was followed by the era of antimicrobial agents and, more recently (spurred on especially by the human immunodeficiency virus), the era of antiviral and antiparasitic compounds.

Clinical microbiologists are acutely aware of the constantly emerging intruders into the intimate human biosphere. These agents appear as the traditional scourges of humanity are brought under control. But the application of antimicrobial agents to the food chain, cosmetics, and over-the-counter medications, together with advances in medical science that spare individuals afflicted with a variety of diseases but leave them with impaired immunity, has combined to increase nosocomial infections, placing the medical facility at the very apex of the selective-pressure pyramid. The selection results in colonization by microbiota with a minority of antimicrobial-tolerant or -resistant constituents; administration of antimicrobial therapy converts these organisms to a majority. These selected prokaryotes and eukaryotes, along with the emerging viruses, coccidia, yeasts, and molds, pose a dynamic challenge to the clinical microbiologist and promise a continued need for her or his services. But these challenges can be met only by expanding the technical skills brought to bear on an ever-changing microbiota and by the willingness of clinical microbiologists to adopt and practice evolving technologies, to gain knowledge in addition to information, and to remain at the forefront of innovation and invention.
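
The minority-to-majority conversion described above can be made concrete with a toy calculation. In this hypothetical Python sketch, a population that is 1% resistant before treatment is essentially all resistant after a few days of therapy; the kill and growth rates are assumptions chosen only to show the direction of the selection, not measured values.

    def treat(susceptible, resistant, days):
        """Simulate therapy with assumed rates: 90% of susceptible cells
        are killed each day while resistant cells grow 1.5x per day."""
        for _ in range(days):
            susceptible *= 0.1
            resistant *= 1.5
        return susceptible, resistant

    # Start with a 1% resistant minority.
    s, r = treat(susceptible=99_000, resistant=1_000, days=5)
    print(f"resistant fraction after therapy: {r / (s + r):.1%}")  # ~100.0%

However crude, the arithmetic captures why the medical facility sits at the apex of the selective-pressure pyramid: therapy does not create the resistant organisms but merely promotes the minority already present.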

(This review was presented in part at the 99th General Meeting of the American Society for Microbiology, 1999, and an abstract has been published [Clin. Microbiol. Newsl. 21:191-194, 1999].)

REFERENCES

1. Isenberg, H. D., and B. G. Painter. 1971. Comparison of conventional methods, the r/b system and modified r/b system as guide to the major divisions of Enterobacteriaceae. Appl. Microbiol. 22:1126-1134.
2. Isenberg, H. D., A. Reichler, and D. Wiseman. 1971. Prototype of a fully automated device for determination of bacterial antibiotic susceptibility in the clinical laboratory. Appl. Microbiol. 22:980-986.
3. Isenberg, H. D. 1992. Clinical microbiology procedures handbook. American Society for Microbiology, Washington, D.C.
4. Levy, S. B. 2002. The antibiotic paradox: how the misuse of antibiotics destroys their curative powers, 2nd ed. Perseus Publishing, Cambridge, Mass.
5. MacFaddin, J. F. 1985. Media for isolation, cultivation, identification, maintenance of medical bacteria. Williams & Wilkins, Baltimore, Md.
6. Miller, J. M., and C. M. O'Hara. 1999. Manual and automated systems for microbial identification, p. 193-201. In P. R. Murray, E. J. Baron, M. A. Pfaller, F. C. Tenover, and R. H. Yolken (ed.), Manual of clinical microbiology, 7th ed. ASM Press, Washington, D.C.
7. Painter, P. G., and H. D. Isenberg. 1973. Clinical laboratory experience with the improved Enterotube. Appl. Microbiol. 25:896-899.
8. Porter, R. 1998. The greatest benefit to mankind. A medical history of humanity. W. W. Norton, New York, N.Y.
9. Smith, P. B., T. L. Gavan, H. D. Isenberg, A. Sonnenwirth, W. I. Taylor, J. A. Washington II, and A. Balows. 1978. Multilaboratory evaluation of an automated microbiological detection/identification system. J. Clin. Microbiol. 8:657-666.
