Abstract
Between mid‐century and 1992, there was a consensus that the battle against infectious diseases had been won, and the Surgeon General announced that it was time to close the book. Experience with human immunodeficiency virus/acquired immunodeficiency syndrome, the return of cholera to the Americas in 1991, the plague outbreak in India in 1994, and the emergence of Ebola in Zaire in 1995 created awareness of a new vulnerability to epidemics due to population growth, unplanned urbanization, antimicrobial resistance, poverty, societal change, and rapid mass movement of people. The increasing virulence of dengue fever, with dengue hemorrhagic fever and dengue shock syndrome, disproved the theory of the evolution toward commensalism, and the discovery of the microbial origins of peptic ulcer demonstrated the reach of infectious diseases. The Institute of Medicine coined the term ‘emerging and reemerging diseases’ to explain that the world had entered an era in which the vulnerability to epidemics in the United States and globally was greater than ever. The United States and the World Health Organization devised rapid response systems to monitor and contain disease outbreaks and to develop new weapons against microbes. These mechanisms were tested by severe acute respiratory syndrome in 2003, which revealed a series of practical and conceptual blind spots in preparedness.
Keywords: emerging diseases, reemerging diseases, infectious diseases, epidemics
An age of hubris
In the long contest between humans and microbes, the years from mid‐century until 1992 marked a distinctive era. In those euphoric decades, there was a consensus that the decisive battle had been joined and that the moment was at hand to announce the final victory. Almost as if introducing the new period, the US Secretary of State George Marshall declared in 1948 that the world now had the means to eradicate infectious diseases from the earth. Marshall's view was by no means exceptional. For some, in the early postwar years, the triumphant vision applied primarily to a single disease. The heady goal arose first of all within the field of malariology, where the Rockefeller Foundation scientists Fred Soper and Paul Russell thought that they had discovered in DDT (dichlorodiphenyltrichloroethane) a weapon of such unparalleled power that it would enable the world to eliminate the ancient scourge forever. With premature confidence in 1955, Russell published Man's Mastery of Malaria (1), in which he envisaged a global spraying campaign that would free mankind from malaria – cheaply, rapidly, and without great difficulty. Rallying to Russell's optimism, the World Health Organization (WHO) adopted a global campaign of malaria eradication with DDT as its weapon of choice. The director of the campaign, Emilio Pampana, elaborated a one‐size‐fits‐all program of eradication through four textbook steps –‘preparation, attack, consolidation, and maintenance’ (2). Russell's followers Alberto Missiroli, the director of the postwar campaign in Italy, and George Macdonald, the founder of quantitative epidemiology, reasoned that so signal a victory over mosquitoes could be readily expanded to include the elimination of all other vector‐borne tropical diseases, ushering in what Missiroli called a contagion‐free Eden, where medicine would make man not only healthy but also happy (3, 4, 5).
If malariologists, who dominated the international public health community, launched the idea of the final conquest of infectious diseases, it rapidly developed into the prevailing orthodoxy. E. Harold Hinman, chief malariologist to the Tennessee Valley Authority and member of the WHO Expert Committee on Malaria, extrapolated from the conquest of malaria to the conquest of all contagion in his influential work World Eradication of Infectious Diseases (6). Aidan Cockburn, a distinguished epidemiologist at Johns Hopkins and advisor to the WHO, gave expression to this new creed in his revealingly titled work The Evolution and Eradication of Infectious Diseases (7). As Cockburn noted, ‘“Eradication” of infectious disease as a concept in public health has been advanced only within the past two decades, yet it is replacing “control” as an objective’ (7). Although not a single disease had yet been destroyed by his time of writing in 1962, Cockburn believed that the objective of eradication was ‘entirely practical,’ not just for individual illnesses but for the whole category of communicable diseases. Indeed, he argued, ‘it seems reasonable to anticipate that within some measurable time, such as 100 years, all the major infections will have disappeared’ (7). By that time, he explained, ‘the major infections of today should have disappeared, and only remaining should be their memories in textbooks, and some specimens in museums. … With science progressing so rapidly, such an end‐point is almost inevitable, the main matter of interest at the moment is how and when the necessary actions should be taken’ (8). Cockburn's timetable of total eradication by 2060 was, in fact, too slow for some. Just a decade later, in 1973, the Australian virologist and Nobel laureate Frank Macfarlane Burnet went so far as to proclaim, together with his colleague David White, that ‘at least in the affluent West,’ the grand objective had already been reached. ‘One of the immemorial hazards of human existence has gone,’ he reported, because there is a ‘virtual absence of serious infectious disease today’ (9). The WHO also saw the entire planet as ready to enter the new era by the end of the century. Meeting at Alma Ata in 1979, the World Health Assembly adopted the goal of ‘Health for All, 2000’ (10).
What could possibly have led to such overweening confidence in the power of science, technology, and civilization to vanquish communicable disease? One factor was historical. In the industrialized West, rates of mortality and morbidity from infectious diseases began to plummet in the second half of the 19th century, in large part as a result of ‘social uplift’ – dramatic improvements in wages, housing, diet, and education. At the same time, developed nations erected the solid fortifications of sanitation and public health: sewers, drains, sand filtration, and chlorination of water as defenses against cholera and typhoid; sanitary cordons, quarantine, and isolation against bubonic plague; vaccination against smallpox; and the first effective ‘magic bullet’ – quinine – against malaria. Meanwhile, improvements in the handling of food – pasteurization, retort canning, and the sanitation of shellfish beds – yielded major advances against bovine tuberculosis (TB), botulism, and a variety of food‐borne maladies.
Already by the early 20th century, therefore, many of the most feared epidemic diseases of the past were in headlong retreat for reasons that were initially more empirical and spontaneous than the result of the application of science. Science, however, soon added new and powerful weapons. The foundational work of Louis Pasteur and Robert Koch had established the biomedical model of disease that promoted unprecedented understanding and yielded a cascade of scientific discoveries and new sub‐specialties (microbiology, immunology, parasitology, and tropical medicine). The dawn of the antibiotic era with penicillin and streptomycin provided means to treat syphilis, staph infections, and TB. The development of a series of vaccines dramatically lowered the incidence of smallpox, pertussis, diphtheria, tetanus, rubella, measles, mumps, and polio. DDT seemed to furnish a means to abolish malaria and other insect‐borne pathogens. By the 1950s, therefore, scientific discoveries had provided effective weapons against many of the most prevalent infectious diseases. Extrapolating from such dramatic developments, many concluded that it was reasonable to expect that communicable diseases could be eliminated one at a time until the vanishing point was reached. Indeed, the worldwide campaign against smallpox provided just such an example when the WHO announced in 1979 that the disease had become the first ever to be eradicated by intentional human action.
Those who asserted the doctrine of the conquest of infection viewed the microbial world as largely static or only very slowly evolving. For that reason, there was little concern that the victory over existing infections would be challenged by the appearance of new diseases for which humanity was unprepared and immunologically naive. Falling victim to historical amnesia, they ignored the fact that the last 500 years even in the West had been punctuated by the appearance of a series of catastrophic new diseases: bubonic plague in 1347, syphilis in the 1490s, cholera in 1830, Spanish influenza in 1918–1919. Macfarlane Burnet in this regard was typical. Burnet was a founding figure in evolutionary medicine who acknowledged, in theory, the possibility of the emergence of new diseases as a result of mutation. But, in practice, he believed that such appearances are infrequent and that they occur only at such distant intervals as to occasion little concern. ‘There may,’ he wrote, ‘be some wholly unexpected emergence of a new and dangerous infectious disease, but nothing of the sort has marked the last fifty years’ (9). The notion of microbial fixity, that the diseases that we have are the ones that we will face, even underpinned the International Health Regulations adopted in 1969 (IHR 1969), which specified that the three great epidemic killers of the 19th century were the only diseases requiring notification: plague, yellow fever, and cholera. The regulations gave no thought to what action would be required if an unknown but deadly and transmissible new microbe should appear (11, 12).
If belief in the stability of the microbial world was one of the major articles of faith underpinning the eradicationists' vision, a second misplaced evolutionary idea also played a crucial role. This was the doctrine that nature was fundamentally benign. Over time, eradicationists believed, the pressure of natural selection would drive all communicable diseases toward a decline in virulence. The principle was that excessively lethal infectious diseases would prevent their own transmission by prematurely destroying their hosts. The long‐term tendency, the proponents of victory asserted, is toward commensalism and equilibrium. New epidemic diseases are virulent almost by accident as a temporary maladaptation, and they therefore evolve toward mildness, ultimately becoming readily treatable diseases of childhood. Examples were the evolution of smallpox from variola major to variola minor; the transformation of syphilis from the fulminant ‘great pox’ of the 16th century into the slow‐acting disease of today; and the transformation of classic cholera into the far milder El Tor biotype. Similarly, the doctrine held a priori that, in the family of four diseases of human malaria, the most virulent, i.e. Falciparum malaria, was an evolutionary newcomer relative to the less lethal Vivax, Ovale, and Malariae malaria, which were believed to be older and to have evolved toward commensalism. Against this background, the standard textbook of internal medicine in the eradicationist era, the 7th edn of Harrison's Principles of Internal Medicine of 1974, claimed that a feature of infectious diseases is that they ‘as a class are more easily prevented and more easily cured than any other major group of diseases’ (13, 14).
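The reasoning behind the commensalism doctrine, and the assumption hidden inside it, can be made explicit with the standard trade-off notation of evolutionary epidemiology (a schematic sketch added here for illustration; the symbols are not drawn from the sources cited in this article). If the basic reproduction number of a pathogen strain is written as a function of its virulence v (the excess host mortality it causes),

\[ R_0(v) = \frac{\beta(v)}{\mu + \gamma + v}, \]

where \beta(v) is the transmission rate, \mu the background host mortality, and \gamma the recovery rate, then natural selection favors the strains with the highest R_0. If transmission is independent of virulence, R_0 falls as v rises, and the expected drift toward mildness follows. But where transmission does not require a mobile, functioning host – as with vector-borne, waterborne, or foodborne infections – higher virulence carries little or no transmission penalty, a point taken up below in the discussion of dengue.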
The most fully elaborated and most cited theory of the new era was the ‘epidemiologic transition’ or ‘health transition’ theory formulated by Abdel Omran, professor of epidemiology at Johns Hopkins, in 1971 and refined by him in 1977 and 1983. Omran's theory of the transition was an account of the encounter of human societies with disease in the modern period. According to Omran and his followers in such journals as the Health Transition Review, humanity has passed through three eras of modernity in health and disease. Although Omran is ambiguous about the precise chronology of the first era, the ‘age of pestilence and famine,’ it is clear that it lasted until the 18th century in the West and was marked by Malthusian positive checks on demography: epidemics, famines, and wars. There followed the ‘age of receding pandemics,’ which extended from the mid‐18th century until the early 20th century in the developed West and until later in non‐Western countries. During this period there was a declining mortality from infectious diseases in general and from TB in particular. Finally, after World War I in the West and after World War II in the rest of the globe, humanity entered the ‘age of degenerative and man‐made diseases.’ Whereas in the earlier stages of disease evolution, social and economic conditions played the dominant role in determining health and the risk of infection, in the final phase medical technology and science played a major part. In this period, mortality and morbidity from infectious diseases have been progressively replaced by the rise of degenerative diseases such as cardiovascular disease, cancer, diabetes, and metabolic disorders, by man‐made diseases such as occupational and environmental illnesses, and by accidents (15, 16, 17). Adopting the perspective of ‘health transition’ theory, US Surgeon General Julius B. Richmond announced in 1979 that infectious diseases were simply the ‘predecessors’ of the degenerative diseases that succeed and replace them. The course of nature, in his view, was simple, unidirectional, and benign (18).
If memory of the power of public health and science provided a major impetus to overconfidence, forgetfulness also played a vital role. US Surgeon General William H. Stewart declared in 1969 that the time had come to ‘close the book on infectious diseases.’ This view was profoundly Eurocentric. Even as medical experts in Europe and North America declared final victory, infectious diseases remained the leading cause of death worldwide, and nowhere more disastrously than in the poorest and most vulnerable countries of Africa, Asia, and Latin America. While the TB sanatoria were closing their doors in the developed North, the disease continued its ravages in the South. It also continued to ravage the marginalized underclasses of the North itself: the homeless, prisoners, intravenous drug users, immigrants, and racial minorities. As Paul Farmer has argued, TB was emphatically not disappearing; it was just that the bodies it affected were either distant or hidden from sight (19, 20). Indeed, the best estimates in 2008 suggest that there are more people ill with TB today than at any time in human history and that nearly two million will die of it during the course of the year (21, 22).
Raising the alarm
Ultimately, by the early 1990s, the eradicationist position became untenable. Rather than witnessing the rapid fulfillment of the prediction that science and technology would eliminate all infectious diseases from the globe, the industrial West discovered that it remained painfully vulnerable and to a degree that had seemed unimaginable. The decisive event, of course, was the arrival and upsurge of human immunodeficiency virus (HIV)/acquired immunodeficiency syndrome (AIDS). AIDS was first recognized as a new disease entity in 1981, and its etiologic pathogen was identified in 1983. By the end of the decade, it was clear that HIV/AIDS embodied everything that the eradicationists had considered unthinkable. AIDS was a new infectious disease for which there was no cure, it reached the industrial world as well as developing countries, and it unleashed in its train a series of exotic additional opportunistic infections. Furthermore, it had the potential to become the worst pandemic in history as measured not only by mortality and suffering but also by its profound social, economic, and security consequences.
From the front lines of the battle against AIDS, a series of voices sounded the alarm in the 1980s about the severity of the new threat. Most famous of all was the case of the US Surgeon General C. Everett Koop, who became the chief federal spokesman on the disease. In 1988 he produced the brochure Understanding AIDS and took the pioneering step of having it mailed to all 107 million households in the nation (23). Working in greater obscurity in sub‐Saharan Africa, Peter Piot, who later directed UNAIDS, warned in 1983 that AIDS in Africa was not a ‘gay plague’ but an epidemic of the general population. He warned that it was transmitted by heterosexual as well as homosexual intercourse and that in fact it affected women more readily than men.
The warnings of the 1980s, however, were confined to the issue of AIDS: they did not directly confront the larger question of eradicationism or announce a new era in medicine and public health. That task fell first to the National Academy of Sciences' Institute of Medicine (IOM) and its landmark publications on emerging diseases that began in 1992 with Emerging Infections: Microbial Threats to Health in the United States (24). Once raised by the IOM, the cry of alarm was taken up widely and almost immediately: by the Centers for Disease Control and Prevention (CDC), which devised its own response to the crisis in 1994 and founded a new journal, Emerging Infectious Diseases, devoted to the issue; by the National Science and Technology Council (NSTC) in 1995; and by 36 of the world's leading medical journals, which took the unprecedented step of each devoting a theme issue to emerging diseases in January 1996, a month they proclaimed ‘Emergent Diseases Month’ (25, 26, 27). In 1996, in addition, President Bill Clinton (28) issued a fact sheet entitled ‘Addressing the Threat of Emerging Infectious Diseases’ in which he declared them ‘one of the most significant health and security challenges facing the global community.’ There were also highly visible hearings on emerging infections in the US Congress (29). In opening those hearings before the Senate Committee on Labor and Human Resources, Senator Nancy Kassebaum, the committee chairperson, noted,
New strategies for the future begin with increasing the awareness that we must re‐arm the Nation and the world to vanquish enemies that we thought we had already conquered. These battles, as we have learned from the 15‐year experience with AIDS, will not be easy, inexpensive, nor quickly resolved. (29)
Finally, to attract attention at the international level, the WHO, which had designated April 7 of each year World Health Day, declared that the theme for 1997 was ‘Emerging Infectious Diseases – Global Alert, Global Response’ with the lesson that in a global village, no nation is immune (30).
In addition to the voices of scientists, elected officials, and the public health community, the popular press gave extensive coverage to the new and unexpected danger, especially when the lesson was driven home by three events of the 1990s that captured attention worldwide. The first was the onset of a large‐scale epidemic of Asiatic cholera in South and Central America, beginning in Peru in 1991 and rapidly spreading across the continent until 400 000 cases and 4000 deaths were reported in 16 countries (31). Since the Americas had been free of the disease for a century, the arrival of the unwelcome visitor reminded the world of the fragility of painfully won advances in public health. Because cholera is transmitted by the contamination of food and water by fecal matter, it is a ‘misery thermometer’ – an infallible indicator of societal neglect and substandard living conditions (32). Its outbreak in the West late in the 20th century, therefore, caused shock and a sudden awareness of unexpected danger. Indeed, the press informed its readers of the ‘Dickensian slums of Latin America,’ where the residents of Lima and other cities drew their drinking water directly from the ‘sewage‐choked River Rimac’ and similarly polluted sources (33, 34). WHO Director‐General Hiroshi Nakajima proclaimed the South American epidemic an ‘emergency situation.’
The second event to catch worldwide attention was the outbreak of plague in the Indian states of Gujarat and Maharashtra in September and October 1994. The final toll of the epidemic was limited – 700 cases and 56 deaths were reported (31). Nevertheless, the news that plague had broken out in both bubonic and pneumonic forms unleashed an almost Biblical exodus of hundreds of thousands of people from the industrial city of Surat. It cost India an estimated $1.8 billion in lost trade and tourism, and it sent waves of panic around the world. The disproportionate fear, as the New York Times explained, was due to the fact that the very word plague was explosively charged. It evoked cultural memories of the Black Death that killed a quarter of the population of Europe in the 14th century. India's plague, the paper continued, ‘is a vivid reminder that old disease, once thought to have been conquered, can strike unexpectedly anytime, anywhere’ (35).
The third major epidemic shock of the 1990s was an outbreak of the frightening disease of Ebola hemorrhagic fever in the city of Kikwit, Zaire (now Democratic Republic of the Congo), in 1995. Cholera claimed international attention because of the numbers of those it afflicted, even though it had a low case fatality rate if treated early. Plague demanded attention because of its all too familiar potential. Ebola, by contrast, did not inspire terror by giving rise to a major epidemic: it infected only 315 people between January and July 1995. Nor did it create fear because of historical memories of disaster, since it was a new disease first recognized in 1976. Nevertheless, Ebola set off a tidal wave of fear – a ‘modern nightmare,’ in the words of Le Monde – across the globe. The reasons were several: it dramatically revealed the lack of preparedness of both industrial and developing nations to deal with a public health emergency; it ignited primordial western fears of the jungle and of untamed nature; and it fed on racial anxieties about ‘darkest’ Africa. As a result, a prominent aspect of the Kikwit outbreak was its capacity to generate what the Journal of Infectious Diseases termed ‘extraordinary’ and ‘unprecedented’ press coverage that amounted at times to the commercial ‘exploitation’ of human misery and a ‘national obsession’ (36). Descending onto the banks of the Kwilu River, the world's tabloids stressed in vivid hyperbole that Ebola was a zoonotic disease that had sprung directly from the jungles of Africa as a result of the encounter between native charcoal burners and monkeys and now threatened the West. In the revealing headline of The Daily Telegraph of Sydney, ‘Out of the jungle a monster comes’ (37). Even the most legitimate investigators, however, were disturbed to discover that Ebola had eluded public health attention for 12 weeks between the death of the index case on January 6 and the notification of the international community on April 10, despite the fact that the disease had left clusters of severely ill and dying patients in its train. With such a porous surveillance network in place, Ebola aroused the fear that it might spread unnoticed 500 km from Kikwit to Kinshasa, and then throughout the world by means of the Zairian capital's intercontinental airport. There the virus could be loaded on board as ‘a ticking, airborne time bomb’ (38). Most of all, however, the Kikwit outbreak commanded attention because Ebola is almost invariably fatal and because its course in the human body is excruciating, dehumanizing, and dramatic. Commenting on the scenes that he had observed in Zaire, the author Richard Preston explained on television at the height of the outbreak that the mortality rate among sufferers was 90% and that there was no known remedy or prophylactic. He continued:
The victims suffer what amounts to a full‐blown biological meltdown. … When you die of Ebola, there's this enormous production of blood, and that can often be accompanied by thrashing or epileptic seizures. And at the end you go into catastrophic shock and then you die with blood pouring out of any or all of the orifices of the body. And in Africa where this outbreak is going on now, medical facilities are not all that great. I've had reliable reports that doctors … were literally struggling up to their elbows in blood – in blood and black vomit and in bloody diarrhea that looks like tomato soup, and they know they're going to die. (39)
In combination with the announcement by scientists that the world was highly susceptible to new pandemics of just such infections, these events on three continents generated sensationalist headlines. Representative examples were ‘Killers on the Loose,’ ‘Bug War,’ ‘Doomsday Virus Fear,’ ‘Heat from the Hot Zone,’ and ‘Revenge of the Microbes.’ The images invoked were those of the apocalypse, of civilization perched on the slopes of an erupting volcano, of the West besieged by invisible hordes, and of nature exacting its revenge for human presumption. As Forrest Sawyer reported on ABC News, ‘Once the western world thought it was safe from these invisible killers. Not anymore. We are now biologically connected in a web or a net.’ In addition, there was an outpouring of films devoted to the possibility of pandemic disaster, such as Wolfgang Petersen's thriller Outbreak, and of widely read books on the same theme, including Richard Preston's best‐seller The Hot Zone, Laurie Garrett's The Coming Plague: Newly Emerging Diseases in a World Out of Balance, and William Close's Ebola. In the words of David Satcher, director of the CDC, the result was the ‘CNN effect’ – the perception by the public that it was at immediate risk even at times when the actual danger was small (29).
A more dangerous era unfolds
In this climate of anxiety, the term ‘emerging and reemerging diseases’ was coined for the IOM by Joshua Lederberg, winner of the Nobel Prize for Medicine, to mark a new era. Lederberg defined these disease entities as follows: ‘Emerging infectious diseases are diseases of infectious origin whose incidence in humans has increased within the past two decades or threatens to increase in the near future’ (40). Emerging diseases were those that, like AIDS and Ebola, were previously unknown to have afflicted humans; reemerging diseases, such as cholera and plague, were familiar scourges whose incidence was rising, or whose geographical range was expanding.
Lederberg's purpose in devising a new category of diseases was to give notice that the age of euphoria was over. Instead of receding to a vanishing point, he declared, communicable diseases ‘remain the major cause of death worldwide and will not be conquered during our lifetimes … We can also be confident that new diseases will emerge, although it is impossible to predict their individual emergence in time and place’ (24). Indeed, the contest between humans and microbes was a Darwinian contest with the advantage tilted toward the microbes. The stark message of the IOM was that, far from being secure from danger, the United States and the West were at greater risk from contagious and epidemic diseases than at any time in history.
An important reason for this new vulnerability was the legacy of eradicationism itself. The belief that the time had come to close the book on infectious diseases had produced a pervasive climate that critics labeled variously as ‘complacency,’ ‘optimism,’ ‘overconfidence,’ and ‘arrogance.’ The conviction that victory was imminent had led the industrial world to premature and unilateral disarmament. Assured by a consensus of the leading medical authorities for 50 years that the danger was past, federal and state governments in the United States dismantled their public health programs dealing with communicable diseases and slashed their spending. At the same time, investment by private industry in the development of new vaccines and classes of antibiotics dried up, the training of health care workers failed to keep abreast of new knowledge, vaccine development and manufacture were concentrated in fewer laboratories, and the discipline of infectious diseases struggled to attract its aliquot share of research funds and of the best minds. At the nadir in 1992, the United States spent only $74 million on infectious disease surveillance as public health officials prioritized other concerns – chronic diseases, substance abuse, tobacco use, geriatrics, and environmental issues. For these reasons, the assessment of American preparedness to face the challenges of the new era was disheartening. In the words of the CDC in 1994:
The public health infrastructure of this country is poorly prepared for the emerging disease problems of a rapidly changing world. Current systems that monitor infectious diseases domestically and internationally are inadequate to confront the present and future challenges of emerging infections. Many foodborne and waterborne disease outbreaks go unrecognized or are detected late; the magnitude of the problem of antimicrobial drug resistance is unknown; and global surveillance is fragmentary. (25)
More bluntly, Michael Osterholm, the Minnesota state epidemiologist, informed Congress in 1996 that, ‘I am here to bring you the sobering and unfortunate news that our ability to detect and monitor infectious disease threats to health in this country is in serious jeopardy. … For twelve of the States or territories, there is no one who is responsible for food or water‐borne disease surveillance. You could sink the Titanic in their back yard and they would not know they had water’ (29).
A striking example of the effects of complacency on infectious disease is the case of TB in New York City. TB had once been the leading cause of death in the city, but improvements in hygiene and education, followed by the discovery of streptomycin, led to the conviction by the middle of the 20th century that the disease was on the verge of being entirely conquered. As a result, funding was diverted and demonstrably effective TB programs were dismantled, even as the social determinants of the disease – immigration, crowding, homelessness, and rising rates of incarceration – worsened dramatically. Meanwhile, HIV/AIDS supplied large numbers of patients with compromised immunity. The risk of infection therefore increased while access to health care became increasingly difficult, and the city experienced a remarkable and entirely preventable resurgence of the ‘white plague,’ primarily among African American and Hispanic residents. Between 1978 and 1992, the number of cases tripled, while drug resistance developed as a significant additional problem. New York City led the way in a national resurgence of TB as cases increased by 20% between 1985 and 1992. Overweening confidence led directly and rapidly to a local epidemic and a partial reversal nationally of decades of tireless campaigning (29).
If the experience of the United States with TB suggests how fragile advances in health remained even in the industrial world, the situation in developing countries was still more disquieting. There, progress toward the germ‐free Eden during the eradicationist era was nil. In David Satcher's uncompromising observation, ‘Persons living in tropical climates are still as vulnerable to infectious disease as their early ancestors were’ (41).
The critique of 50 years of hubris went deeper than just a protest against a decline in vigilance. In addition, the theorists of emerging diseases argued that, unnoticed by the eradicationists, society since World War II had changed in ways that actively promoted the emergence and reemergence of epidemic diseases. One of the leading features most commonly cited was the impact of globalization in the form of the rapid mass movement of goods and populations. As William McNeill noted in Plagues and Peoples (14), the migration of people throughout history has been one of the most dynamic factors affecting the balance between microbes and man. Humans are permanently engaged in a kind of war in which the social and ecological conditions that they create exert powerful evolutionary pressure on micro‐parasites. By mixing gene pools and by providing access for microbes to populations of non‐immunes living in conditions in which the microbes thrive, globalization gave microorganisms a powerful advantage. In the closing decades of the 20th century and the early years of the 21st, the speed and scale of this phenomenon amounted to a quantum leap, as 2.1 billion passengers boarded airplanes in 2006 (31, 42). In the words of the popular press, the daily movement of people around the globe by airplane means that a disease breaking out today in Kikwit can arrive in New York, Mumbai, and Mexico City tomorrow. The numbers of voluntary travelers, moreover, are massively supplemented by millions of involuntary refugees and displaced persons in flight from warfare, famine, and religious, ethnic, or political persecution. For Lederberg and the IOM, these rapid mass movements have tilted the advantage in favor of microbes, ‘defining us as a very different species from what we were 100 years ago. We are enabled by a different set of technologies. But despite many potential defenses – vaccines, antibiotics, diagnostic tools – we are intrinsically more vulnerable than before, at least in terms of pandemic and communicable diseases’ (43).
After globalization, the second factor most frequently underlined was demographic growth, especially because this growth occurred in circumstances that were the delight of microorganisms and of the insects that often transmit them. In the postwar era, population has soared above all in the poorest and most vulnerable regions of the world, with the global urban population growing at four times the rate of the rural. Its hallmark has been wholesale, chaotic, and unplanned urbanization, led by the resource‐poor nations of sub‐Saharan Africa, which is the most rapidly urbanizing region on the planet (44). The results have been escalating poverty, widening social inequality, the birth of ‘megacities’ exceeding 10 million inhabitants, and the spawning of teeming peri‐urban slums without sanitary, educational, or other infrastructures. Such places were ready‐made for ancient diseases to expand, as cholera demonstrated in the shantytowns and barrios of cities like Lima, Mexico City, and Rio de Janeiro, where millions lived without sewers, drains, secure supplies of drinking water, or appropriate waste management. Already in the 19th century, cholera had flourished in the conditions created in European cities by rapid and unplanned urbanization. In the final decades of the 20th century and the start of the 21st, a much larger process on a global scale reproduced in the cities of Africa, Asia, and Latin America the anomalous sanitary conditions propitious for cholera (45).
Another clear indication of socio‐economic conditions in these new urban ecosystems is the appearance of trench fever (Bartonella quintana) among the inhabitants of homeless shelters in North American cities. Trench fever first emerged in the filth and crowding of soldiers in the trenches of the Western Front in the First World War, when millions of combatants were infected by the lice that covered their bodies. Bartonella quintana, however, had never been documented apart from the vermin and the grime of wartime. The reemergence of the disease in urban America is therefore a clear measure of the insalubrious conditions of marginalized populations among the urban poor (46, 47).
Here too, in urban poverty, were the social determinants that made possible the global pandemic of dengue fever that began in 1950 and has continued unabated until today, when 2.5 billion people are at risk every year and 50–100 million people are infected. Dengue is the ideal type of an emerging disease. An arbovirus transmitted primarily by the highly urban, day‐biting, and domestic Aedes aegypti mosquito, dengue thrives in crowded tropical and semi‐tropical slums wherever there is standing and unregulated water. The mosquito breeds abundantly in gutters, uncovered cisterns, unmounted tires, stagnant puddles, and plastic containers, and the disease takes full advantage of societal neglect and the absence or cessation of vector control programs.
Particularly important for the theorists of ‘emerging diseases’ was the manner in which dengue demonstrated the hollowness of the reassuring dogma that infectious diseases evolve inexorably toward commensalism and reduced virulence. The dengue virus is a complex of four closely related serotypes (DEN‐1, DEN‐2, DEN‐3, and DEN‐4) that have been known to infect humans since the 18th century. Until 1950, however, dengue infections in any geographical area were caused by a single serotype that gave rise to a painful illness marked by fever, rash, headache behind the eyes, vomiting, diarrhea, prostration, and joint pains so severe that the infection earned its nickname ‘break‐bone fever.’ But ‘classical’ dengue was a self‐limiting disease that was followed by lifelong immunity. The mobility of modern populations, however, has allowed all four serotypes to spread indiscriminately around the globe, so that for the first time individuals who have already experienced infection with one dengue serotype can subsequently be infected with one or more of the others, as there is no crossover immunity from one serotype to another. Through mechanisms that are still imperfectly understood, the disease is much more severe in patients suffering re‐infections with different serotypes. Instead of becoming milder, therefore, dengue has become a growing threat, giving rise to far more frequent outbreaks and to sudden, devastating epidemics in which large numbers of patients suffer the severe and often lethal complications of dengue hemorrhagic fever (DHF) and dengue shock syndrome (DSS) that were once unknown.
In the Americas, the first modern epidemic of dengue fever broke out in 1981 in Cuba, producing 344 000 cases, of which 24 000 involved DHF and 10 000 DSS (48). Moreover, since the dengue vectors A. aegypti and Aedes albopictus are present in the United States, scientists at the National Institute of Allergy and Infectious Diseases (NIAID), such as its director Anthony Fauci, have noted that dengue fever has broken out in both Hawaii and Puerto Rico, and that they see no inherent reason why its ongoing global expansion could not include the continental United States (49). Dengue therefore demonstrates the following important evolutionary lessons: (i) infectious diseases that do not depend on the mobility of their host for transmission (because they are vector‐borne, waterborne, or foodborne) are not under selective pressure to become less virulent; (ii) overpopulated and unplanned urban or peri‐urban slums provide ideal habitats for microbes and their arthropod vectors; and (iii) modern transportation and the movements of tourists, migrants, refugees, and pilgrims facilitate the process by which microbes and vectors gain access to these ecological niches.
Paradoxically, the very successes of modern medical science also prepared the way for the emergence of new infections. By prolonging life, medicine gives rise to ever larger numbers of elderly people with compromised immune systems. As part of this process, significant numbers of immunocompromised people have appeared at earlier ages as well – diabetics, cancer and transplant patients undergoing chemotherapy, and AIDS patients whose lives have been radically extended by antiretroviral treatment. Furthermore, such people are frequently concentrated in settings where the transmission of microbes from body to body is amplified, such as hospitals, facilities for the elderly, and prisons. The proliferation of invasive procedures has also increased the opportunities for such infections. Modern nosocomial infections emerged under these conditions, and they have become a major problem of public health as well as an ever growing economic burden. Of these infections, the so‐called ‘superbug’ Staphylococcus aureus – the leading cause of nosocomial pneumonia, of surgical site infections, and of nosocomial bloodstream infections – is the most important and widespread. A recent study notes that in the United States, as of 2008:
Each year approximately two million hospitalizations result in nosocomial infections. In a study of critically ill patients in a large teaching hospital, illness attributable to nosocomial bacteria increased intensive care unit stay by 8 days, hospital stay by 14 days, and the death rate by 35%. An earlier study found that postoperative wound infections increased hospital stay an average of 7.4 days. (50)
A further threatening byproduct of the advance of medical science is the development of ever increasing antimicrobial resistance. Already in his 1945 Nobel Prize acceptance speech, Alexander Fleming, who discovered penicillin, the first antibiotic, issued a prophetic warning. Penicillin, he advised, needed to be administered with care, because the bacteria susceptible to it were likely to develop resistance; the selective pressure of so powerful a medicine would make it inevitable. Echoing Fleming's warning, the emerging diseases theorists argue that antibiotics are a ‘non‐renewable resource’ whose duration of benefit is biologically limited. By the late 20th century, this prediction was reaching fulfillment. On the one hand, the discovery of new classes of antimicrobials had slowed to a trickle, especially in a market in which profit margins are compressed by competition, by regulations requiring large and expensive clinical trials before approval, and by the low tolerance for risk on the part of regulatory agencies charged with the safety of the public. On the other hand, while anti‐infective development stagnates, many microorganisms have evolved extensive resistance. As a result, in one telling metaphor, physicians are rapidly emptying their quiver, and the world stands poised to enter the postantibiotic era (51). Some of the most troubling examples are the emergence of plasmodia that are resistant to all synthetic antimalarials, of S. aureus that is resistant both to penicillin and to methicillin (MRSA), and of strains of Mycobacterium tuberculosis that are resistant not only to first‐line medications (MDR‐TB) but to second‐line medications as well (XDR‐TB) (52). Antimicrobial resistance has become a global crisis, and many anticipate the early appearance of strains of HIV, M. tuberculosis, S. aureus, and malaria parasites that are not susceptible to any available therapy.
In part the problem of antimicrobial resistance is a simple result of Darwinian evolution. As a Rand Corporation study (53) notes, there are tens of thousands of viruses and 300 000 species of bacteria that are capable of infecting human beings, and many of them replicate and evolve billions of times in the course of a single human generation. Evolutionary pressures, in this context, work to the long‐term disadvantage of human beings. But unwise human actions have dramatically hastened the process. Farmers spray crops with pesticides and fruit trees with antibiotics, and they add subtherapeutic doses of antibiotics such as virginiamycin and avoparcin wholesale to animal feed to prevent disease, promote growth, and increase the productivity of chickens, pigs, and feedlot cattle. Indeed, half the world output of antimicrobials by tonnage is used in agriculture (54). At the same time, the popular confidence that microorganisms will succumb to a chemical barrage has led to a profusion of antimicrobials in domestic settings where they serve no purpose (55). Physicians, pressured to give priority in clinical settings to the immediate risk of individual patients over the long‐term interest of the species and to meet patients' expectations, have succumbed to profligate prescribing fashions, administering antibiotics even for non‐bacterial conditions for which they are unnecessary or entirely useless. The classic case in this regard is the pediatric treatment of otitis media (or middle ear infection), for which the overwhelming majority of practitioners in the 1990s prescribed antibiotics, even though two‐thirds of the children derived no benefit from the medication. Widespread possibilities of self‐medication in countries with few regulations or through opportunities created by the internet amplify the difficulties. In the case of diseases such as malaria and TB that require a long and complicated therapeutic regimen, there is also the issue of patients who interrupt their treatment after the alleviation of their symptoms instead of persevering until their condition is cured. Here the problem is not the overuse but the underuse of antibiotics. Sometimes described as simple non‐compliance by patients, the issue in fact raises complex questions of education, poverty, and lack of access to health care. Here the WHO strategies of DOTS (Directly Observed Treatment Short Course) and DOTS‐plus are helpful but cannot solve the underlying problems.
A further issue raised by the new era was the overly rigid conceptualization of disease by the eradicationists, who drew too sharp a distinction between chronic and contagious diseases. Infectious diseases, it became clear during the 1990s, are a more expansive category than scientists previously realized because many diseases long considered non‐infectious in fact have infectious origins. In demonstrating these causal connections, the decisive work was that of the Australian Nobel laureates Barry J. Marshall and Robin Warren with regard to peptic ulcers in the 1980s. Peptic ulcers are a significant cause of suffering, cost, and even death, as one American in 10 develops one during the course of a lifetime, over one million people are hospitalized by them every year, and 6000 die. Marshall noted in his acceptance speech for the Nobel Prize in 2005, however, that in the 1980s the chronic, noninfectious etiology of peptic ulcer was universally accepted as scientific truth. In his words, ‘I realized that the medical understanding of ulcer disease was akin to a religion. No amount of logical reasoning could budge what people knew in their hearts to be true. Ulcers were caused by stress, bad diet, smoking, alcohol and susceptible genes. A bacterial cause was preposterous.’ What Marshall and Warren were able to demonstrate, therefore, was a medical watershed. They proved, in part by means of an auto‐experiment, that the bacterium Helicobacter pylori was the infectious cause of the disease and that antibiotics rather than diet, lifestyle change, and surgery were the appropriate therapy (56). This insight led to the realization that many other non‐acute diseases, such as certain forms of cancer, chronic liver disease, and neurological disorders, are due to infections. Human papillomavirus, for instance, is thought to give rise to cervical cancer, hepatitis B and C viruses to chronic liver disease, Campylobacter jejuni to Guillain‐Barré syndrome, and certain strains of Escherichia coli to renal disease (57, 58). There are indications as well that infections serve as an important trigger to atherosclerosis and arthritis, and there is a growing recognition that epidemics and the fear that accompanies them leave psychological sequelae in their wake, including posttraumatic stress (59, 60). This understanding is what some have termed a new awareness of the ‘infectiousness of non‐infectious diseases’ (61).
Finally, and most emphatically, the concept of emerging and reemerging diseases was intended to raise the most important threat of all – that the spectrum of diseases that humans confront is broadening with unprecedented and unpredictable rapidity. The number of previously unknown conditions that have emerged to afflict humanity since 1970 exceeds 40, with a new disease discovered on average more than once a year. The list includes such frightening names as HIV, Hantavirus, Lassa fever, Marburg fever, Legionnaires' disease, hepatitis C, Lyme disease, Rift Valley fever, Ebola hemorrhagic fever, Nipah virus, West Nile virus, SARS (severe acute respiratory syndrome), bovine spongiform encephalopathy, avian flu H5N1, Chikungunya virus, and group A streptococcus – the so‐called ‘flesh‐eating bacterium.’ Skeptics argue that simply to list the diseases that have emerged since 1970 gives the misleading impression that diseases are emerging at an accelerating rate. This impression, they suggest, is largely an artifact of heightened surveillance and improved diagnostic techniques rather than a new development. The WHO has countered that not only have diseases emerged at record speed, as one would expect from the transformed social and economic conditions of the postwar world, but also that between 2002 and 2007 they gave rise to a record 1100 worldwide epidemic events (31). The most recent and comprehensive examination of the question (62), published in February 2008 in Nature, involved the study of 335 emerging infectious disease (EID) ‘events’ between 1940 and 2004, controlling for the increase in reporting effort produced by more efficient diagnostic methods and more thorough surveillance. The conclusion was that, ‘The incidence of EID events has increased since 1940, reaching a maximum in the 1980s. … Controlling for reporting effort, the number of EID events still shows a highly significant relationship with time. This provides the first analytical support for previous suggestions that the threat of EIDs to global health is increasing’ (62).
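To make concrete what ‘controlling for reporting effort’ means in an analysis of this kind, the sketch below simulates annual counts of EID events and fits a Poisson regression of counts on time with a proxy for reporting effort included as a covariate. It is a purely illustrative reconstruction of the general approach rather than the Nature study's actual data or code; every number and variable name is invented for the example.

import numpy as np
import statsmodels.api as sm

# Simulated data standing in for annual EID event counts, 1940-2004.
rng = np.random.default_rng(0)
years = np.arange(1940, 2005)
reporting_effort = np.linspace(1.0, 8.0, years.size)   # proxy for surveillance/diagnostic effort
true_rate = 0.5 * np.exp(0.02 * (years - 1940)) * np.sqrt(reporting_effort)
eid_events = rng.poisson(true_rate)                     # hypothetical annual counts of EID 'events'

# Poisson regression: log E[events] = b0 + b1*(years since 1940) + b2*log(reporting effort).
X = sm.add_constant(np.column_stack([years - 1940, np.log(reporting_effort)]))
fit = sm.GLM(eid_events, X, family=sm.families.Poisson()).fit()
print(fit.summary())

# A positive and significant coefficient on the time term, with the reporting-effort
# covariate included in the model, is the kind of pattern the study describes as evidence
# that EID events have genuinely increased over time rather than merely being detected more often.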
There are no rational grounds, the public health community concluded, for assuming that none of the diseases yet to emerge will be as virulent and as transmissible as HIV or the Spanish influenza of 1918–1919. Discussion has therefore shifted dramatically from the question of whether new diseases will emerge and old ones resurge to the issue of how the international community can best prepare to face them. In the stark words of the US Department of Defense, ‘Historians in the next millennium may find that the twentieth century's greatest fallacy was the belief that infectious diseases were nearing elimination. The resultant complacency has actually increased the threat’ (63).
Rearmament
A major aspect of the official response to the challenge of emerging and reemerging diseases is that microbes are now regarded as threats to the security of states and to the stability of the international order. For the first time, therefore, not only public health authorities but also intelligence agencies and conservative think tanks have classified infectious diseases as a ‘non‐traditional threat’ to national and global security, and they have assumed the task of envisaging the future and the challenge that communicable diseases will pose. Here a turning point was the Central Intelligence Agency (CIA)'s National Intelligence Estimate (NIE) for 2000 (64), which was devoted to the danger posed by disease and presented defense against epidemic diseases as a major security goal for the United States. As a document, NIE 99‐17D (64) was divided into four major sections: alternative scenarios, impact, implications, and discussion.
In the first section, the CIA attempted to outline three possible scenarios for the course of infectious diseases over the next 20 years: (i) the optimistic contemplation of steady progress in combating communicable disease; (ii) the forecast of a stalemate, with no decisive gains either by microbes or by humans in their long war of attrition; and (iii) the consideration of the most pessimistic prospect of a deterioration in the position of humans, especially if the world population continues, as seems probable, to expand and if megacities continue to spring up with their attendant problems of crowding, sanitation, and unprotected drinking water. Unfortunately, the CIA regarded the optimistic first case as extremely unlikely. The probable course of events, in its view, was that 170 000 Americans would die from infectious diseases every year – or considerably more if a pandemic of influenza or of a still unknown disease occurred, or if there were a dramatic decline in the effectiveness of antiretroviral treatments for HIV/AIDS. Only toward the end of the 20 years did the report foresee possible advances due to enhanced public health initiatives, the development of new drugs and vaccines, and economic development (64).
Against this background, the succeeding sections on ‘impact’ and ‘implications’ outlined a series of likely economic, social, and political results that would occur in the new age of increasing disease burdens. In the most afflicted regions of the world, such as sub‐Saharan Africa, the report anticipated ‘economic decay, social fragmentation, and political destabilization.’ The international consequences of these developments would be growing struggles to control increasingly scarce resources, accompanied by crime, displacement, and the degradation of familial ties. Disease, therefore, would heighten international tensions while it weakened forces, such as international peacekeepers, who might otherwise have played a larger role in controlling regional tensions. US or European military forces deployed abroad in support of humanitarian or other operations would be at high risk. Because the economic and social consequences of increasing burdens of communicable diseases in the developing world are certain to impede economic development, the NIE also predicted that democracy would be imperiled, that civil conflicts and emergencies would multiply, and that the tensions between North and South would deepen.
Three years later, motivated by the CIA's report, an influential national security think tank, the Rand Corporation, turned to the intersection of disease and security when it attempted to provide ‘a more comprehensive analysis than has been done to date, encompassing both disease and security’ (53). In so doing, it envisaged even more somber probabilities than the CIA in the new global environment. The Rand Corporation intelligence report The Global Threat of New and Reemerging Infectious Diseases: Reconciling U.S. National Security and Public Health Policy (53) had two leading themes. The first was that the postwar era had seen a sharp decline in the importance of direct military threats to security. The second was that there had been a corresponding rise in the impact of ‘non‐traditional challenges,’ of which diseases are the major but inadequately recognized component. It has always been accepted, the report stressed, that diseases kill and undermine the quality of individual lives. In addition, it was essential to recognize that the transition to the era of emerging and reemerging diseases marked the opening of a period in which infectious diseases would profoundly affect the ability of states to function and to preserve social order.
The most striking portion of The Global Threat of New and Reemerging Infectious Diseases (53) was its imagining of a probable scenario in which South Africa could become the first modern state to fail specifically because of infectious diseases in general and the HIV/AIDS pandemic in particular. As the report explained, ‘The contemporary HIV/AIDS crisis in South Africa represents an acute example of how infectious diseases can undermine national resilience and regional stability.’ In absolute numbers, South Africa has the highest number of HIV‐positive inhabitants in Africa – 4.7 million people in 2000, or 25% of the country's adult population. Already, such extreme prevalence of the disease has pervasive impacts, affecting all aspects of South African security. But South Africa is just emerging from the first phase of the AIDS pandemic and is therefore far from experiencing the full effects of the crisis, which, even in the absence of resistance to antiretroviral therapy, is expected to produce 6 200 000 patients with HIV and 800 000 with full‐blown AIDS by 2010. In these circumstances, over a quarter of the economically active population will have the disease, causing severe skill shortages, creating poverty, destroying economic development, undermining participation in political life, and giving rise to more than two million orphans who will be impoverished, uneducated, and easily drawn into crime and prostitution. The effects will also be deeply felt in the military, the police, and the legal system, which will be severely deprived of manpower and unable to function just as social tensions deepen. ‘The net effect,’ it concluded, ‘will be entirely negative for South Africa's civil stability, possibly reducing the country to widespread social anarchy within the next five to twenty years.’ This disturbing outcome, moreover, could be hastened by the public health policies of President Thabo Mbeki, who espoused the theories of the AIDS denier Peter Duesberg and rejected the link between HIV and the disease.
The point the Rand Corporation stressed most about South Africa, however, was that it was simply a dramatic illustrative example. What was occurring there as a result of HIV/AIDS could happen without warning elsewhere. ‘A crisis of similar proportions,’ it explained, ‘could therefore break out in any country at any time.’ Indeed, in the context of a growing danger of bioterrorist attack, such an outbreak could be launched intentionally. It was precisely this point – the growing vulnerability of all in the age of globalization – that led the world community, the European Union, and individual nations to rearm in preparation for the inevitable threats to come. In the new climate of preparedness, the United States took a prominent role, beginning almost immediately in the aftermath of the 1992 IOM report. In 1994 the CDC – the chief monitoring agency – drafted a strategic plan that it then updated in 1998, while NIAID – the principal basic research center – established a research agenda. Both agencies' plans were endorsed by the White House, where the NSTC, under the chairmanship of Vice President Al Gore, issued a ‘Fact Sheet: Addressing the Threat of Emerging Infectious Diseases,’ which in turn was backed by a Presidential Decision Directive of June 12, 1996. The result, as Gore explained, was the first national policy by the United States to confront the international problem of infectious diseases (65).
The essential starting point of the plan envisaged by the CDC, NIAID, and the White House was the IOM's description of the Darwinian struggle under way between humans and microbes. In the IOM's analysis of that struggle, microbes possess formidable advantages. They outnumber human beings a billionfold, they enjoy enormous mutability, and they replicate, in Lederberg's estimate, a billion times more quickly than man, with generations measured in minutes rather than decades. In terms of natural evolutionary adaptation, therefore, microbes are genetically favored to win the contest. In Lederberg's observation, ‘Pitted against microbial genes, we have mainly our wits’ (66). Taking this IOM analysis as its starting point, the American response to the new challenge is best seen as the attempt to organize and deploy human wit, backed by newly found financial resources, to counter the microbial genetic challenge (25).
The White House ‘Fact Sheet’ declared in clear alarm: ‘The national and international system of infectious disease surveillance, prevention, and response is inadequate to protect the health of U.S. citizens.’ To remedy the situation, the White House established six policy goals, as follows:
1. Strengthen the domestic infectious disease surveillance and response system, both at the Federal, State, and local levels and at ports of entry into the United States, in cooperation with the private sector and with public health and medical communities.
2. Establish a global infectious disease surveillance and response system, based on regional hubs and linked by modern communications.
3. Strengthen research activities to improve diagnostics, treatment, and prevention, and to improve the understanding of the biology of infectious disease agents.
4. Ensure the availability of the drugs, vaccines, and diagnostic tests needed to combat infectious diseases and infectious disease emergencies through public and private sector cooperation.
5. Expand missions and establish the authority of relevant US Government agencies to contribute to a worldwide infectious disease surveillance, prevention, and response network.
6. Promote public awareness of EIDs through cooperation with non‐governmental organizations and the private sector (65).
In pursuit of goals 2, 3, and 4, NIH funding was doubled between 1998 and 2003. NIAID established a research agenda to develop new weapons to combat epidemic diseases, giving rise to an explosion of knowledge and a burgeoning literature on infectious diseases. Indeed, the agency director, Anthony S. Fauci, claimed in 2008 that HIV/AIDS in particular had become the most extensively studied disease in human history. NIAID's priority is the development of safe and effective vaccines and medications to combat HIV/AIDS, malaria, TB, and influenza. To that end, it has evaluated over 50 HIV vaccine candidates, funded 70 clinical trials, and developed 20 antiretroviral medications. In the field of malariology, it has completed the genomic sequencing of Plasmodium falciparum and of the feared malaria vector Anopheles gambiae, with the expectation that this genetic knowledge is the first step toward the capacity to design antimalarial drugs, vaccines, and pesticides. The work of the federal agency, moreover, has been complemented by that of private organizations, such as the Bill and Melinda Gates Foundation, and of university laboratories (67).
At the same time that NIAID stressed basic research, the CDC developed a defensive strategy against emerging pathogens in compliance with goal 1 of the President's directive. The CDC set out its plan in two seminal documents published in 1994 and 1998, which articulated objectives in four principal areas: surveillance; applied research; prevention and control; and the enhancement of the infrastructure and trained personnel needed for diagnostic laboratories at the federal, state, local, and international levels. In addition, the Atlanta‐based agency strengthened its links with the international public health community and with other surveillance agencies such as the FDA and the Department of Defense. It enhanced its capacity to respond to outbreaks, and it launched the journal Emerging Infectious Diseases as a forum to pool information on communicable diseases. It also sponsored a series of major international conferences on emerging and reemerging diseases, beginning in 1998 with the participation of representatives from all 50 states and 70 countries. The CDC initiatives were widely regarded as a model for the establishment of surveillance and response capabilities in other countries as well (25, 68).
At the global level, the UN and its agency, the WHO, also took major steps to strengthen international preparedness for the ongoing siege by microbial pathogens. A first step was the creation in 1996 of the disease‐specific organization UNAIDS, with the function of raising awareness, mobilizing resources, and monitoring the pandemic. Funding levels in the fight against the disease increased from $300 million in 1996 to nearly $9 billion a decade later (69). A further step was that, like the United States, the United Nations declared that it regarded infectious diseases as threats to international security. In acknowledgement of this new understanding, the General Assembly took the unprecedented step in June 2001 of devoting a Special Session to the HIV/AIDS crisis. The Session adopted the ‘Declaration of Commitment on HIV/AIDS: Global Crisis – Global Action,’ which described the epidemic as a ‘global emergency and one of the most formidable challenges to human life and dignity’ (70). Five years later, in June 2006, the General Assembly reaffirmed its commitment to the campaign and adopted the ‘2006 Political Declaration on HIV/AIDS,’ whose chief goal was the establishment of national campaigns to improve access to care and treatment (69).
A third step was the establishment of a new set of international sanitary regulations – IHR (2005) – to replace the outdated IHR (1969). Whereas the old framework was disease‐specific and required notification only in the event of plague, yellow fever, and cholera, the new rules required notification for any ‘public health emergency of international concern,’ thereby including unknown pathogens and emerging infections. The regulations specified the nature of the ‘events’ that should trigger international concern. They also committed all of the 193 WHO member states to improve their capacity for surveillance and response and to designate ‘national IHR focal points’ as the units responsible for providing notification while requiring, in exchange, that the WHO provide assistance to member states in fulfilling their obligations (71, 72). In addition, recognizing that microbes do not acknowledge political frontiers, IHR (2005) called for effective responses wherever necessary to contain an outbreak on the basis of real‐time epidemiological evidence instead of concentrating on taking defensive measures at international borders.
Finally, the WHO organized a rapid response capacity with the necessary supporting infrastructure. This was the Global Outbreak Alert and Response Network (GOARN), established in 2000 with the goal of ensuring that even the most resource‐poor countries would have access to the experts and resources needed to respond to an epidemic emergency. To that end, GOARN pooled the resources of 60 countries and organized 500 experts in the field. In addition, it stockpiles vaccines and drugs and supervises their distribution during epidemic events. Between its founding and 2005, GOARN responded to 70 outbreaks and attempted to learn from experience by establishing protocols to standardize such matters as field logistics, security, communication, and the deployment of field teams (73). In addition to GOARN, the WHO set up surveillance systems specifically designed to deal with pandemic influenza, which the UN agency regarded as the security threat it most feared. These disease‐specific networks are (i) the Global Influenza Surveillance Network, which provides recommendations twice a year on the appropriate vaccine for the subsequent influenza season by collecting samples from patients in 94 countries and forwarding them to WHO collaborating laboratories for analysis, and (ii) FluNet, which compiles the surveillance data thus collected to provide a global real‐time early‐alert system for the disease (74, 75).
In practice, the first test of the effectiveness of the new structures was the SARS pandemic of 2002/2003 – the first major emerging disease threat of the 21st century. After first appearing in the Chinese province of Guangdong in November 2002, it erupted as an international health threat in March 2003, when the WHO received notification and declared a global travel alert. Between March and the declaration on July 5 that the disease had been contained, SARS affected 8098 people, caused 774 deaths, brought international travel to a halt in entire regions, and cost $60 billion in gross expenditure and business losses to Asian countries alone. As retrospective studies have demonstrated, SARS presented many of the features that most severely expose the vulnerabilities of the global system: SARS is a respiratory disease capable of spreading from person to person without a vector; it has an asymptomatic incubation period of more than a week; it generates symptoms that closely resemble those of other diseases; it takes a heavy toll on caregivers and hospital staff; it readily spreads unobserved aboard aircraft; and it has a case fatality rate of 10%. At the time this new disease appeared, moreover, its causative pathogen (SARS‐associated coronavirus) was unknown, and there was neither a diagnostic test nor a specific treatment. For all of these reasons, it dramatically confirmed the IOM's 1992 prediction that all countries were more vulnerable than ever to EIDs. SARS demonstrated no predilection for any region of the globe and was no respecter of prosperity, education, technology, or access to health care. Indeed, after its outbreak in China, SARS spread by airplane primarily to affluent cities such as Singapore, Hong Kong, and Toronto, where it struck relatively prosperous travelers and their contacts, hospital workers, patients, and hospital visitors, rather than targeting the poor and the marginalized. More than half of the recognized cases occurred in well‐equipped and technologically advanced hospital settings such as the Prince of Wales Hospital in Hong Kong, the Scarborough Hospital in Toronto, and the Tan Tock Seng Hospital in Singapore (31, 76, 77).
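A back‐of‐the‐envelope check, using only the case and death totals reported above and no additional data, confirms the approximate case fatality rate cited for SARS:

\[ \text{CFR} = \frac{\text{deaths}}{\text{reported cases}} = \frac{774}{8098} \approx 0.096 \approx 10\% \]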
In terms of the response to the crisis, the SARS outbreak demonstrated and vindicated the reforms undertaken at both the national and international levels. After the debacle of Chinese obfuscation at the start of the epidemic, national governments cooperated fully with the WHO. The world's best‐equipped laboratories and foremost epidemiologists, working in real‐time collaboration via the internet, succeeded with unprecedented speed in identifying SARS‐CoV in just 2 weeks. At the same time the newly created GOARN, together with such national and international partners as Canada's Global Public Health Intelligence Network, the CDC, and the WHO Global Influenza Surveillance Network, took rapid action to issue global alerts, monitor the progress of the disease, and supervise containment strategies before the disease could establish itself endemically. Ironically, given the high‐tech quality of the diagnostic and monitoring effort, the containment policies were based on traditional methods dating from the public health strategies devised against bubonic plague in the 17th century and from the foundation of epidemiology as a discipline in the 19th. These measures were case tracking, isolation, quarantine, the cancellation of mass gatherings, the surveillance of travelers, recommendations on personal hygiene, and barrier protection by means of masks, gowns, gloves, and eye protection (78). Although SARS affected 27 countries and every continent, the containment operation coordinated by GOARN limited the outbreak overwhelmingly to hospital settings, with only sporadic community involvement, so that by July 5 the WHO could announce that the pandemic was over.
Although SARS tested the newly established global defenses against emerging diseases and the protective ramparts withstood the challenge, doubts nevertheless surfaced. The Chinese policy of concealment between November 2002 and March 2003 had placed international health in jeopardy and revealed that a single weak link could undermine the entire response network. Indeed, resource‐poor countries that complied with the new framework of obligations nonetheless found it difficult or impossible to maintain the surveillance effort for the full 4‐month duration of the emergency. Still more tellingly, it was also clear that a major factor in the containment of SARS was simple good fortune. The world was lucky that SARS is spread by droplets and therefore requires extended contact for transmission, unlike classic airborne diseases such as influenza and smallpox. It was comparatively easy to contain because, except in the infrequent and still poorly understood case of so‐called ‘super spreaders,’ it is not readily transmitted from person to person. As poorly transmissible as it was, however, SARS exposed the absence of ‘surge capacity’ in the hospitals and health care systems of the prosperous and well‐resourced countries it affected. The events of 2003 thereby raised the specter of what might have happened had SARS been pandemic influenza, or had it traveled to resource‐poor nations at the outset instead of mercifully visiting cities with well‐equipped and well‐staffed modern hospitals and public health systems. Furthermore, SARS arrived in peacetime rather than amid the devastation and dislocations of war. In that respect, too, it did not repeat the challenge of the Spanish Lady of 1918–1919. The physician Paul Caulford, who fought the SARS epidemic on the front lines at Scarborough Hospital in Toronto, raised these matters. In December 2003, after the passing of the emergency, he reflected:
SARS must change us, the way we treat our planet, and how we deliver health care, forever. Will we be ready when it returns? SARS brought one of the finest publicly‐funded health systems in the world to its knees in a matter of weeks. It has unnerved me to contemplate what the disease might do to a community without our resources and technologies. Without substantive changes to the way we manage the delivery of health care, both locally and on a worldwide scale, we risk the otherwise preventable annihilation of millions of people, either by this virus, or the next. (79)
In the aftermath of the victory over SARS, the nagging question therefore remains: even after the impressive efforts at re‐armament since 1992, how prepared is the international community for the emerging diseases to come? Have we been forever changed?
Conclusion: blind spots and anxieties
The reforms introduced since the IOM report in 1992 have been profound and important. Indeed, the manner in which the international community responded to SARS was innovative and, in the circumstances, highly successful. There is, however, a disconcerting sense of a systematic blindness in the responses – at all levels – to the crisis described by the IOM, the CIA, the Rand Corporation, the WHO, and the White House. What has been done has been necessary but probably far from sufficient. Some of the issues raised by those who sounded the alarm have been forcefully addressed, but others have been largely ignored.
The responses to date have fallen into two chief categories, both of which are essential and both of which were evident during the SARS pandemic. The first is reactive: the ability to respond rapidly and effectively to the outbreak of new epidemic threats. Through a series of initiatives, the years since 1992 have witnessed the establishment of organized networks for gathering public health intelligence, of an international legal framework to structure emergency interventions, and of well‐equipped response teams of experts to contain and monitor outbreaks. To compare outbreaks of infectious disease to forest fires, the world has provided itself with surveillance satellites, advanced communications infrastructure, and a well‐equipped fire department. One could question details of the response to SARS, such as implementation lapses that risked the spread of the disease from the hospital environment into the community, but overall the world's ‘dress rehearsal’ demonstrated far‐sighted planning and coordination beyond anything ever attempted before on an international scale.
The second category of initiatives is proactive and scientific: the attempt to discover new weapons to attack microbial threats. After half a century of dwindling resources for the fight against infectious diseases, the scientific and public health communities have successfully aroused worldwide awareness of the threat to health and security. They have, at least initially, attracted new levels of funding for basic research from both public and private sources, and they have set research agendas. The result has been an explosion of knowledge, grants, and publications with priority given to genomic approaches to microbes and vectors, to the development of vaccines, and to the search for new medications and diagnostic tools. Naturally there are grounds for criticism of various aspects of these initiatives. There is, for example, general agreement that overall levels of funding remain inadequate to the extent of the crisis and that after initial enthusiasm, governments have not continued to increase their support. There are also reasonable grounds for disagreement as to the relative distribution of research efforts, with discussion, for example, about the balance struck between research against HIV/AIDS and that against such other major diseases as malaria, TB, and pandemic influenza. Some have also questioned whether developing vaccines is the right paradigm on all fronts. For example, should priority be given to those diseases for which the human immune response gives grounds for optimism that – on the basis of historical experience – a safe and effective vaccine can be developed (e.g. influenza and dengue)? Or should other strategies be followed with respect to diseases for which the human immune response makes the development of a vaccine a far more arduous and unpredictable endeavor (e.g. cholera and malaria)? Nevertheless, although there is no basis for false confidence, global research efforts have been galvanized, and major advances have been made in the field of infectious diseases in comparison with the early decades after World War II. There is also a consensus that the effort to find vaccines and medicines is vital and that it must be enhanced in order to replenish the quivers of clinicians and public health officials.
What is more troubling in principle is that there are also systematic blind spots – areas of danger raised by those who first sounded the tocsin regarding emerging diseases that have been addressed not at all or only marginally and sporadically. Broadly speaking, the global community has chosen to address those issues for which scientific and technological responses are appropriate, while giving little sustained priority to what might be termed the social, economic, and environmental determinants of infectious disease. Here there is a considerable irony. The founding figures of the modern concept of emerging and reemerging diseases, such as Joshua Lederberg and Robert Shope, stressed that epidemics do not strike societies randomly or in accord with the caprices of angry gods. Diseases instead reflect the relationships that human beings establish with one another and with the natural and built environments. They then spread by taking advantage of the fault lines created by demography, poverty, environmental degradation, warfare, mass transportation, and societal neglect. The very beginning of the IOM's discussion of the new dangers was the recognition that our new vulnerability is not accidental but is the logical result of the type of society that we have become. In defining this vulnerability in a keynote speech in 1998, for instance, Lederberg stated:
To our disadvantage, we have crowding; we have social, political, economic, and hygienic stratification. We have crowded together a hotbed of opportunity for infectious agents to spread over a significant part of the population. This condensation, stratification, and mobility is unique, defining us as a very different species from what we were 100 years ago. (80)
If our problem results from ‘condensation, stratification, and mobility,’ there is a disturbing silence in the government response. Ironically, the various agencies – NIAID, the CIA, the Department of Defense – tasked by the Presidential Directive with augmenting American preparedness in the fight against infectious diseases neither mention socioeconomic factors nor elaborate a long‐term strategy to address them. The call to action aroused the will to find new means to attack microbes and their vectors, and to contain disease outbreaks in human populations, but not to ameliorate the underlying conditions that have made modern societies vulnerable in the first place.
Three crucial examples illustrate the problem. The first is condensation, or the press of overpopulation. Clearly, unrestrained demographic growth as the world population approaches seven billion strains all resources, degrades the environment, gives rise to the megacities and peri‐urban slums where dengue, TB, and cholera thrive, drives populations to intrude into forests where they are exposed to new zoonotic infections, and overwhelms educational, housing, and hygienic infrastructures. Here, the medical and public health communities agree, is a driving factor in the new human vulnerability to emerging diseases. The remedies, moreover, are already known, involving voluntary universal access for women to family planning education and technologies. One of the few forums even to raise the issue was the ‘First International Conference on Women and Infectious Diseases,’ held in Atlanta, February 27–28, 2004, where it was noted that ‘Women's health, in and of itself, rarely has been at the forefront of international development programs or national health planning and policies’ (81). In the field of infectious diseases, this lacuna is especially glaring because women are, as the conference stressed, more susceptible to infections than men, both for biological reasons and because of their caregiving roles and their relative burden of unemployment and poverty. Women, moreover, suffer more serious complications from infectious diseases, above all during pregnancy.
A second illustration is stratification, the burden of poverty and inequality. Nearly all of the leading studies on emerging diseases regard poverty and its sequelae of poor diet, substandard housing, lack of education, and inadequate access to health care as among the chief determinants of epidemic disease. Poverty prevents people from taking measures to protect their own health; it undermines the immune system; it complicates access to safe water supplies; it leads to overcrowding in unhygienic housing; and it creates patterns of labor mobility and migration that compromise health. Health care workers and clinicians recognize the link between inadequate resources and disease, with the result that many of the leading epidemic infections are widely termed ‘diseases of poverty’ (82). The issue therefore surfaces in WHO campaigns to combat the three most important contemporary epidemics: HIV/AIDS, malaria, and TB. As the 2005 report Addressing Poverty in TB Control stated:
Poverty is the greatest impediment to human and socioeconomic development. The United Nations and its specialized agencies are focusing on poverty reduction as a leading priority. In the health sector, poverty represents a principal barrier to health and health care and, consequently, the World Health Organization has committed to integrate the promotion of pro‐poor policies throughout its work. (83)
The reduction of extreme poverty and hunger also forms part of the UN ‘Millennium Development Goals’ to be achieved by 2015.
Beyond exhortation and moral suasion, however, it is not clear that the WHO has developed specific plans to tackle poverty as a primary determinant of public health, and the promotion of greater equality is ignored entirely. More strikingly, neither issue forms part of the strategic public health thinking of the United States. American analyses recognize poverty as a factor creating an environment favorable for infectious diseases, but they avoid both poverty and inequality as matters of practical health policy. Here is the antithesis of the strategic recommendation of the South African pediatrician Nulda Beyers, who commented:
The Western Cape is in some ways a model of TB epidemiology …. TB is almost non‐existent in the white population, but in the black and coloured populations, where unemployment is running at 60%, and malnutrition and crowded slum housing are the norm, TB deaths can reach 3000 per 100 000. If I had to put my money on only one option – science or social uplift – there is no doubt that social uplift would have the bigger impact. (84)
Poverty, moreover, reinforces both condensation and mobility. Poverty creates a vicious downward spiral by interacting with population pressure, because impoverished women are unable to practice effective family planning. The population explosion of the 21st century is therefore concentrated in the poorest regions of the planet, whereas privileged families in the industrial world, given a free and informed choice, limit their fertility. At the same time, poverty also augments vulnerability to infectious disease by setting in motion great streams of mobile people – the poor who become migrants, refugees, and displaced persons, and who then crowd into slums, mining compounds, refugee camps, and homeless shelters. These are people who are at disproportionately high risk of falling ill and of transporting their microbial burden with them.
Finally, there is the question of access to care. Here the position of the leading figures in the campaign to recognize the importance of emerging and reemerging diseases is strangely contradictory. The IOM examined the managed care revolution in the United States and the implications of for‐profit medicine for the nation's preparedness to face infectious diseases (85). By 2000, managed care already enrolled 150 million Americans and therefore dominated health care delivery. The performance of the managed care revolution, however, did not inspire confidence in the IOM. On the contrary, the IOM produced a list of the major problems that, in its view, managed care created for public health. The list was lengthy and devastating. According to the IOM, managed care creates severe public health difficulties because it does the following: (i) it places such strict controls on reimbursements that it becomes an impediment to effective collaboration with the public health community; (ii) it lowers costs by fostering management of infectious diseases by non‐specialists; (iii) it promotes the shift from inpatient to outpatient treatment, where there are neither the specialists nor the infrastructure to diagnose or contain infectious diseases; (iv) it proliferates bureaucratic complexities that complicate prompt responses to disease outbreaks; (v) it reduces the commitment to training and research; and (vi) it encourages excessive antibiotic use (85).
By leaving tens of millions of people in the United States without insurance coverage and therefore without effective access to care, for‐profit medicine effectively removes them from the disease surveillance network. To the extent that uninsured people avoid care entirely or seek it only at a late stage of their illness, the prompt information on which effective public health depends is undermined. In addition, excluding people from coverage drives them further into poverty and creates an underclass of the marginalized. Finally, managed care relentlessly cuts costs by squeezing out of the system the surge capacity on which populations depend in the event of a disease outbreak. Nevertheless, despite these observations, the IOM reached perfectly anodyne conclusions. It did not conclude that only a system guaranteeing universal access would be compatible with defense against infectious disease threats. Instead, it lamely urged a deeper partnership between the managed care industry and public health officials.
For these reasons, one can only conclude that we are not, in fact, forever changed. On the contrary, at both the national and international levels the response to the challenge of emerging disease threats remains partial, with major gaps that are potentially costly in terms of human life and suffering. The United States and the world health community have established a sophisticated and necessary rapid response system. They have also proclaimed – and partially funded – a new commitment to basic research aimed at finding new antimicrobial weapons. They have not, however, systematically addressed the underlying causes of the new vulnerability.
References
1. Russell PF. Man's Mastery of Malaria. New York: Oxford University Press, 1955.
2. Pampana E. Textbook of Malaria Eradication, 2nd edn. London: Oxford University Press, 1969.
3. Bruce‐Chwatt LJ, Glanville VJ, eds. Dynamics of Tropical Disease: A Selection of Papers with Biographical Introduction and Bibliography by the Late George Macdonald: Chapters 16–19. London: Oxford University Press, 1973.
4. Farid MA. The malaria program – from euphoria to anarchy. World Health Forum 1980;1:8–22.
5. Snowden FM. The Conquest of Malaria: Italy, 1900–1962. New Haven: Yale University Press, 2006:198–212.
6. Hinman EH. World Eradication of Infectious Diseases. Springfield, IL: C.C. Thomas, 1966.
7. Cockburn A. The Evolution and Eradication of Infectious Diseases. Baltimore: Johns Hopkins, 1963.
8. Cockburn A, ed. Infectious Diseases: Their Evolution and Eradication. Springfield, IL: C.C. Thomas, 1967.
9. Burnet F, White DO. Natural History of Infectious Disease, 4th edn. Cambridge: Cambridge University Press, 1972.
10. World Health Assembly Resolution 32.30 of 1979, "Health for All, 2000." Available at http://www.healthpromotionagency.org.uk/Healthpromotion/Health/section6b.htm. Accessed January 28, 2008.
11. World Health Organization (WHO). International Health Regulations (1969), 3rd edn. Geneva: WHO, 1983.
12. Merianos A, Peiris M. International health regulations. Lancet 2005;366:1249–1251.
13. Wintrobe MM, et al, eds. Harrison's Principles of Internal Medicine, 7th edn. New York: McGraw‐Hill, 1974:722–729.
14. McNeill WH. Plagues and Peoples. Garden City, NY: Anchor Press, 1976.
15. Omran AL. Epidemiologic transition: changes of fertility and mortality with modernization. Milbank Q 1971;49:509–538.
16. Omran AL. A century of epidemiologic transition in the United States. Prev Med 1977;6:30–51.
17. Omran AL. The epidemiologic transition theory. A preliminary update. J Trop Pediatr 1983;29:305–316.
18. US Department of Health, Education, and Welfare. Healthy People: The Surgeon General's Report on Health Promotion and Disease Prevention, 1979. Washington, DC: US Department of Health, Education, and Welfare, 1979.
19. Farmer P. Pathologies of Power: Health, Human Rights, and the New War on the Poor. Berkeley: University of California Press, 2003.
20. Farmer P. Infections and Inequalities: The Modern Plagues. Berkeley: University of California Press, 1999.
21. Hayward AC, Coker RJ. Could a tuberculosis epidemic occur in London as it did in New York? Emerg Infect Dis 2000;6:12–16.
22. Navin TR, McNab SJN, Crawford JT. The continued threat of tuberculosis. Emerg Infect Dis 2002;8:1187.
23. US Public Health Service. Understanding AIDS. Rockville, MD: US Department of Health and Human Services, 1988.
24. Institute of Medicine (IOM). Emerging Infections: Microbial Threats to Health in the United States. Lederberg J, Shope RE, Oaks SC, eds. Washington, DC: National Academy Press, 1992.
25. CDC. Addressing Emerging Infectious Disease Threats: A Prevention Strategy for the United States. Atlanta: US Department of Health and Human Services, 1994.
26. National Science and Technology Council, Committee on International Science, Engineering, and Technology, Working Group on Emerging and Re‐emerging Infectious Diseases. Infectious Disease – A Global Threat. Washington, DC: US GPO, 1995.
27. Winkler MA, Flanagin A. Infectious diseases: a global approach to a global problem. JAMA 1996;275:245–246.
28. The White House, Office of Science and Technology Policy. Fact sheet: addressing the threat of emerging infectious diseases, June 12, 1996. Available at http://fas.org/irp/offdocs/pdd_ntsc7.htm. Accessed March 9, 2008.
29. US Congress, Senate Committee on Labor and Human Resources. Emerging Infections: A Significant Threat to the Nation's Health. Washington, DC: US GPO, 1996, p. 3.
30. Fauci AS. World Health Day, April 7, 1997. Available at http://ww3.niaid.nih.gov/news/newsreleases/1997/world.htm. Accessed March 17, 2008.
31. WHO. The World Health Report 2007. Geneva: WHO, 2007.
32. Brooke J. How the cholera scare is waking Latin America. Washington Post, March 8, 1992, section 4:4.
33. Brooke J. Feeding on 19th century conditions, cholera spreads in Latin America. New York Times, April 21, 1991, section 4:2.
34. WHO. Cholera in 1991. Wkly Epidemiol Rec 1992;67:253–259.
35. Altman LK. A 30‐year respite ends: cases of plague reported in India's largest cities. New York Times, October 2, 1994, section 4:2.
36. Peters CJ, Le Duc JW. An introduction to Ebola: the virus and the disease. J Infect Dis 1999;179 (Suppl.):ix–xvi.
37. Out of the jungle a monster comes. The Daily Telegraph, December 28, 2007, features, p. 32.
38. Quarantine at 30,000 feet: Ebola virus as a ticking, airborne time bomb. Daily News, August 2, 1995, p. 31.
39. Author Richard Preston discusses the deadly outbreak of the Ebola virus in Zaire. CBS News Transcripts, May 15, 1995.
40. Davis JR, Lederberg J, eds. Public Health Systems and Emerging Infections: Assessing the Capabilities of the Public and Private Sectors. Washington, DC: National Academy Press, 2000.
41. Satcher D. Emerging infections: getting ahead of the curve. Emerg Infect Dis 1995;1:1–6.
42. Mangili A, Gendreau MA. Transmission of infectious diseases during commercial air travel. Lancet 2005;365:989–996.
43. Lederberg J. Infectious diseases as an evolutionary paradigm. Emerg Infect Dis 1997;3:417–423.
44. Martens P, Hall L. Malaria on the move: human population movement and malaria transmission. Emerg Infect Dis 2000;6:103–109.
45. Snowden FM. Naples in the Time of Cholera, 1884–1911. Cambridge: Cambridge University Press, 1995.
46. Jackson LA, Spach DH. Emergence of Bartonella quintana infection among homeless persons. Emerg Infect Dis 1996;2:141–144.
47. Trench fever identified in Seattle's homeless. NPR transcripts: "All Things Considered," February 18, 1995. Available at http://www.lexisnexis.com/us/Inacademic/results/docview/docview.htm. Accessed March 14, 2008.
48. Stevens JE. Dengue cases on the rise. Washington Post, June 6, 1995, p. Z07.
49. NIAID. NIAID experts see dengue as potential threat to U.S. public health. News release of January 8, 2008. Available at http://www3.niaid.hih.gov/news/newsreleases/2008/dengue.htm. Accessed February 8, 2008.
50. Rubin RJ, et al. The economic impact of Staphylococcus aureus infection in New York City hospitals. Emerg Infect Dis 1999;5:9–17.
51. Weber JT, Courvalin P. An emptying quiver: antimicrobial drugs and resistance. Emerg Infect Dis 2005;11:791–793.
52. Shah NS, et al. Worldwide emergence of extensively drug‐resistant tuberculosis. Emerg Infect Dis 2007;13:380–387.
53. Brower J, Chalk P. The Global Threat of New and Reemerging Infectious Diseases: Reconciling U.S. National Security and Public Health Policy. Santa Monica: RAND, 2003.
54. WHO. The Medical Impact of Antimicrobial Use in Food Animals. Geneva: WHO, 1997.
55. Levy SB. Antibacterial household products: cause for concern. Emerg Infect Dis 2001;7 (Suppl.):512–515.
56. Marshall BJ. Helicobacter connections. Available at http://nobelprize.org/nobel_prizes/medicine/laureates/2005/marshall-lecture.html. Accessed March 18, 2008.
57. Lindsay JA. Chronic sequelae of foodborne disease. Emerg Infect Dis 1997;3:443–452.
58. O'Connor SM, Taylor CE, Hughes JM. Emerging infectious determinants of chronic diseases. Emerg Infect Dis 2006;12:1051–1057.
59. O'Connor SM, et al. Potential infectious etiologies of atherosclerosis: a multifactorial perspective. Emerg Infect Dis 2001;7:780–788.
60. Brower J, Chalk P. The Global Threat of New and Reemerging Infectious Diseases: Reconciling U.S. National Security and Public Health Policy. Santa Monica: RAND, 2003.
61. CIA. The global infectious disease threat and its implications for the United States. NIE 99‐17D, January 2000. Available at http://permanent.access.gpo.gov/websites/www.cia.gov/cia/reports/nie/report/nie99-17d.html. Accessed March 3, 2008.
62. Jones KE, et al. Global trends in emerging infectious diseases. Nature 2008;451:990–993.
63. Department of Defense. Addressing Emerging Infectious Disease Threats: A Strategic Plan for the Department of Defense. Washington, DC: US GPO, 1998.
64. CIA. The global infectious disease threat and its implications for the United States. NIE 99‐17D, January 2000. Available at http://permanent.access.gpo.gov/websites/www.cia.gov/www.cia.gov/cia/reports/nie/report/nie99-17d.html. Accessed February 28, 2008.
65. Presidential Decision Directive NTSC‐7, Addressing the Threat of Emerging Infectious Diseases, June 12, 1996. Available at http://www.fas.org/irp/offdocs/pdd_ntsc7.htm. Accessed March 9, 2008.
66. Lederberg J. Infectious disease – a threat to global health and security. JAMA 1996;275:417–419.
67. Fauci AS, Touchette NA, Folkers GK. Emerging infectious diseases: a 10-year perspective from the National Institute of Allergy and Infectious Diseases. Emerg Infect Dis 2005;11:519–525.
68. CDC. Preventing Emerging Infectious Diseases: A Strategy for the 21st Century. Atlanta: US Department of Health and Human Services, 1998.
69. UNAIDS. Annual Report 2006. Geneva: UNAIDS, 2006.
70. UN General Assembly. Declaration of Commitment on HIV/AIDS, June 2001. Available at http://un.org/ga/aids/conference.html. Accessed March 12, 2008.
71. Rodier G, Greenspan A, Hughes JM, Heymann DL. Global public health security. Emerg Infect Dis 2007;13:1447–1452.
72. WHO. International Health Regulations, 2005. Available at http://www.who.int/csr/ihr/wha_58_3/en/index.html. Accessed July 25, 2008.
73. WHO. Global outbreak alert and response network. Available at http://www.int/csr/outbreaknetwork/en/. Accessed March 12, 2008.
74. WHO. Global influenza surveillance network. Available at http://who.int/csr/disease/influenza/surveillance/en/. Accessed March 15, 2008.
75. Flahault A, Dias‐Ferrao V, Chaberty P, Esteves K, Valleron AJ, Lavanchy D. FluNet as a tool for global monitoring of influenza on the web. JAMA 1998;280:1330–1332.
76. Lingappa JR, McDonald LC, Simone P, Parashar UD. Wresting SARS from uncertainty. Emerg Infect Dis 2004;10:167–170.
77. Heymann DL, Rodier G. Global surveillance, national surveillance, and SARS. Emerg Infect Dis 2004;10:173–175.
78. Bell DM, World Health Organization Working Group on International and Community Transmission of SARS. Public health interventions and SARS spread, 2003. Emerg Infect Dis 2004;10:1900–1906.
79. Caulford P. SARS: aftermath of an outbreak. Lancet 2003;362 (Suppl.):s2.
80. Lederberg J. Infectious disease as an evolutionary paradigm. Available at http://www.cdc.gov.ncidod/eid/vol3no4/lederberg.htm. Accessed March 1, 2008.
81. Periago MR, Fescina R, Ramon‐Pardo P. Steps for preventing infectious diseases in women. Emerg Infect Dis 2004;10:1968–1973.
82. Watts J. Targets now set by G8 countries to reduce "diseases of poverty". Lancet 2000;356:408.
83. WHO. Addressing Poverty in TB Control: Options for National TB Control Programmes. Geneva: WHO, 2005.
84. Abdulla S. Tuberculosis experts back social reform. Lancet 1997;350:1604.
85. IOM. Managed Care Systems and Emerging Infections: Challenges and Opportunities for Strengthening Surveillance, Research and Prevention. Davis JR, ed. Washington, DC: National Academy Press, 2000.