2020 Aug 14;55(Suppl 1):13–30. doi: 10.1007/s41775-020-00088-0

India in the pandemic age

Sumit Guha
PMCID: PMC7427269  PMID: 32836356

Abstract

COVID-19 is only the latest in a series of global pandemics that began when the world of disease was united by the establishment of intensive connections by sea after 1500. India was a major participant in this process. A pandemic has both direct and indirect effects. Human reactions to mass illness both mitigate and enhance these effects. The networks of transmission are paralleled by networks of private and public information. But aggregated information only becomes available as governmental information systems take shape. This article explains the use of quarantine as emerging from both. It then explains why it was introduced to India only after 1800. It then looks at three great pandemics: cholera, bubonic plague and lethal influenza, and governmental and societal responses to each of these. The article analyses the subsidence of pandemics into chronic presences (‘background’) that nonetheless contributed significantly to ill-health, poverty and early death for hundreds of millions. But there is a paradox after Independence. Successful state action in independent India was nevertheless accompanied by the effective collapse of government information systems. This contributed to the massive economic damage from what should have been a minor episode of plague in 1994. The article thus reviews what we know about the effects of pandemic, epidemic and chronic background phenomena on the economic life of the Indian sub-continent through the past 500 years.

Keywords: Information systems, Market networks, Pandemics, India

Introduction

Let me at the outset define a pandemic: it is a sudden, rapidly spreading epidemic on a global scale. It is therefore an interaction between at least two species: humans and a disease-causing organism that infects the former (many pandemics involve other species too). The COVID outbreak is the most recent global pandemic. Diseases create host immune responses that suppress or eliminate them. Furthermore, human awareness about and control of pathogens has advanced enormously in recent centuries as has the information and treatment infrastructure worldwide. Immunization, surveillance, treatment and a general rise in standards of hygiene and nutrition have tamped down many formerly lethal diseases and changed the earth’s biome in unprecedented ways. But major perturbations of disease-host interactions are more common nowadays and it is they that now generate pandemics. This situation demonstrates the deficiencies of the ceteris paribus view of the natural world as a passive backdrop against which the actual drama of economic life is enacted.

If this interaction is visualized in terms of the simplest equilibrium model (the classic economic model where shocks are absorbed and homeostasis restored), such an outbreak could only have two outcomes. One would be the destruction of either the host population or the parasite population (i.e. the extinction of one of the two species) and the other, an adaptation between host and disease organism that would maintain the viability of each species. A shift away from equilibrium thinking has already been occurring in real-world economics, with the recognition of ‘Black Swan events’ and historical contingency generally.

The global economy and the first pandemics

Having made that pitch for my own discipline, let me turn to the birth of pandemics considered as global phenomena. The two halves of the globe, the New World and the Old, only began intensive interaction in 1500 (if we exclude Australia and New Zealand). This was when pandemics became possible. And pandemics there certainly were. The newly arriving Old Worlders harbored many diseases. Some of these, such as smallpox, were lethal in all parts of the world; others had become less fatal childhood diseases in Europe. But they all ravaged American indigenous populations on whom they were suddenly unleashed. Native populations of the Caribbean islands and coast were wiped out and replaced by European settlers and the African slaves they imported. Further away from the coasts, the combined effects of disease, political subjugation and ecosystem disruption were less intense. But they still radically changed the demography and environment of the Americas (Guha 2001: 11–12; Crosby 1986).

One cannot calculate the economic effects of these transformative epidemics because massive structural changes do not permit such computations. (What would be the current value of the lost income and life of thirty million Native Americans?) The conquest of the Americas had important economic effects on India. The subjugation of the peoples of Southern America resulted first in the confiscation of their accumulated gold stocks, followed by a vast conscription of surviving local populations to mine more gold and silver. The bullion was shipped back to Spain (and Portugal). This flood of precious metal then boosted oceanic trade with Asia, where Europeans ran a chronic balance of payments deficit before the Industrial Revolution. It enabled the Mughal Empire to move from a copper-based currency to a trimetallic one.

Meanwhile Spanish sailors and conquerors acquired syphilis in the Americas and brought it back to Europe. It soon spread across the world, into the Middle East and further into Asia. The initial outbreak, however, subsided into a pattern of low-level chronic infection, following the trajectory of many diseases that were lethal in unexposed ‘greenfield’ populations but then settled down into endemic scourges (Zinsser 1935).

Pandemic effects in an information economy

A market exists if price information is being transmitted through it, however imperfectly. An epidemic is recognized by information about many interconnected cases of illness. That too needs an information network. As the current COVID outbreak demonstrates, many of the effects of a pandemic are not the direct consequences of the disease (sickness and death resulting in a reduction of the effective work-force and disruptions of supply chains). Once the presence of an epidemic outbreak is recognized, the direct effects may be dwarfed by the results of reactions to that knowledge. Reactions, in turn, depend on the strength of an information apparatus and the level of information, true or false, diffused through it. Generating and transmitting information has costs, both by way of durable infrastructure and real-time collection and preservation. In the absence of state initiative, these would only be created if private benefits exceeded private costs. They often did not. In 1867, the government of Bombay Presidency instructed an officer to obtain and collate information on past droughts and famines from each administrative jurisdiction. The officer in question (Lt. Col. A.T. Etheridge) did what he could, but ultimately reported that he “found an almost total absence of any authentic account of any past famine or drought whatever. It was not, apparently, a part of the system of revenue management under the Native Governments to register statistics of famines or droughts…”. Quantitative records that Etheridge obtained were therefore limited to two kinds: statements of current prices obtained from merchants’ accounts and of taxes forgone (called ‘remissions’) from government sources.

An extreme example of a limited apparatus of record came from Pahlunpoor (Palanpur) in central Gujarat. The British officer there (Lt. Col. Arthur) was instructed to prepare a report on past famines. He wrote in 1868 that in earlier times, the local people “were altogether uneducated, with the exception of Bunias [businessmen] whose knowledge of writing and accounts consisted mostly in making entries in books of their daily sales.” Much of Etheridge’s report is therefore composed of oral traditions and personal recollections. He had to use this source even for the early British period: there had formerly been a “Road and Tank Department” organized to employ famine victims, but “the files are supposed to have perished in a fire which occurred some years ago” (Etheridge 1868: 1–32 and passim).

The invention of quarantines

The first economic calculations around the costs and methods of epidemic control and information management arose in the trading cities around the Mediterranean in the 1300s. This was when the intensification of caravan trade along the Silk Road under the protection of the Mongol Empire also brought deadly baggage: the bubonic plague. Yersinia pestis is primarily a flea-borne disease of burrow-dwelling furry mammals. It is easily transmitted to the rat populations fostered by human agriculturalists and their grain stores. During major outbreaks, it has been known to establish direct human-to-human (‘pneumonic’) chains of transmission. Mongol protection of caravan trade helped to bring it from its ancestral homeland in Southwest China onto the ships that moved busily around the Mediterranean Sea. This was the second of the great recorded plague epidemics—the first spread around the Mediterranean world c. 540 CE, but died down after raging fiercely for the next century. The third one—which we shall consider more fully—came from Yunnan to Hong Kong and then spread world-wide after 1894 (McNeill 1976: 136–38).

Yersinia produces florid and dramatic symptoms. Up to 90% of patients in pre-modern times died, and died quickly. That reduced the disease’s opportunities for transmission among humans. But the fact that the disease is really one of rats has ensured its survival and periodic recurrence. The plague arising out of Yersinia became a recurring phenomenon in all medieval cities around the Mediterranean and sometimes further inland too. Trading ports were small havens of local autonomy in the landscape of the medieval West. The many contending kings in their hinterlands realized that traders would select harbors where their customs were understood and their property comparatively respected. The many estuaries, harbors and inlets around the Mediterranean competed for trade. How then could they remain open for business and yet protected from the effects of the dreadful epidemic?

Even though there was no medical understanding of the disease, a new information apparatus was created for its management. The practice of quarantine depends upon the collection and recording of information. Aparna Nair has written an exemplary study of the introduction of this practice to India. “Quarantine was intended to identify individuals who were either sick or suspected of disease; isolate them from the general population and thus prevent the transmission of disease.” Ships entering the harbor had to produce an authenticated document of the health of all those aboard, especially if they came from a port where an epidemic was known to be raging. Ships with confirmed or suspected illness aboard were blocked from all communication to the shore or other ships for a set time, traditionally 40 days (Nair 2009).

These measures, like present-day COVID quarantines, had high costs. They suspended the commerce that was vital to the life of cities, and consequently stopped their tax income. This was a time when there were no real institutions of public credit or fiat currency. A town or kingdom could take ‘forced loans’ by postponing payments, but only to a point. But the knowledge that quarantine measures were in place was thought to calm traders and encourage them to visit. Town governments then realized that authentic information was the best way to build confidence, especially since plague deaths were too dramatic and striking to be concealed. Exact counts transparently achieved were needed to counteract rumors. Thus the municipal government of the Spanish city of Barcelona was the first to create a system to record the number of plague deaths. This, they declared, would suppress exaggerated reports about it. The calculation was evidently that such rumors would damage the economy more than the truth would do. The weekly lists of the dead were known as ‘Bills of Mortality’. This practice was soon institutionalized in many Western harbor towns (Smith 1936). Not incidentally perhaps, the urban elite in many places also possessed rural retreats to which they could withdraw for a time until the disease ‘burned itself out’. So they too benefited from exact counts, which enabled them to know when the epidemic was abating and it was safe to return to the city.

We see the process at work in London. The city was a major trading harbor, but also a royal capital. The arrival of the plague usually meant an exodus of nobles and courtiers to their country estates, and of the King and attendants to one of the many royal palaces outside the city. But the permanent abandonment of the town was not feasible. So weekly lists of those who died of plague began to be compiled by parish clerks in the City of London in the 1500s. This became a regular practice in epidemic years after 1562. At this point, the weekly totals were watched in order to determine if the death rate was rising or falling, a practice that still continues today. The trend would help decide when it would be safe for the royal court, attendant nobles and innumerable hangers-on to return to the city. The information was soon commercialized: another innovation. Households that received a copy of the printed weekly list paid four shillings (₤0.20) for the service (Brend 1907). This is an early example of how the numerically inclined and relatively autonomous cities of the West created a new information apparatus. That in turn served to balance the lethal risks and economic costs of pandemic disease.

We may contrast this practice with that of the Mughal Empire in India (c.1550–1720). It created an elaborate network of ‘news-writers’, whose job was to report on local events and the conduct of government servants to the Court. That apparatus was obviously essential for the operation of a far-flung Empire. Financial information was also recorded and compiled from the reign of Akbar (1556–1605) onward. It was used to calculate tax-assessment and salary payments. But the information apparatus did not extend to demography or public health (Guha 2001: 24–67; 2015). It is interesting that while Indian rulers such as the Mughal Emperor Jahangir (r. 1605–1628) sought to minimize their own exposure to deadly epidemics, including the bubonic plague, they did not construct a permanent information apparatus to monitor them. Nor, as far as we know, was there a commercially available source of quantitative information.

The outbreak of the plague that spread from Lahore to Delhi and Agra in 1616 therefore left a deep impression on the Emperor Jahangir. He recorded the appearance, symptoms and course of the disease. Jahangir wrote that it first broke out in rural Punjab and then spread to the city of Lahore, from where the disease spread eastward to Delhi. Many died but he did not record the numbers: perhaps no such tally existed. Perhaps this was because communities and neighborhoods would dispose of their own dead and generate no records. Jahangir also sought to avoid cities where the plague was raging (Jahangir n.d.: 330, 342). But hearsay estimates were current. In 1616, an English merchant in Agra wrote that a great plague raged for three months and that on some days, as many as a thousand people died. One wonders who was counting, or if ‘thousand’ just meant ‘many’ (Roe 1899: vol. 2, 307–8, note 1).

The movements of the imperial Court certainly had a great effect on the regional economy since tens of thousands of servants, craftsmen and traders depended upon it for their livelihood. They perforce followed it wherever it went. The French traveler and doctor, Francois Bernier remarked on this feature of any Mughal capital city. If the Emperor and army departed for any length of time then a great number of bazar traders, artisans and camp-followers left too as they had no means of support other than the patronage from the Court and army. Bernier guessed the total number of men in an imperial camp at 200 or even 300 thousand (Bernier 1914: 220).

Such mobility may explain the general societal lack of interest in recording aggregate vital statistics. Population statistics assume the existence of a stable underlying population whose vital events enter the statistical record. But a vast mobile population makes numerical calculations pointless. Babur, the first Mughal ruler (1526–30), remarked on the great mobility of Indian populations generally. He wrote that a remarkable feature of North India was that a town could be abandoned “in a day, even half a day”. He also wrote that the overall population being “limitless”, a new town could be built at some other location in a short time too (Babur 2002: 335). The conditions that generated the European statistical frame of comparison and record therefore did not exist in the Indian subcontinent.

Recording mortality did become a concern for the English East India Company from its earliest days. It had very few personnel at its Indian settlements, and sickness and death could seriously disrupt its operations. In 1690, for example, there were only five English officials alive in Bombay (Mumbai) and all of them were sick. By 1800, therefore, the local police establishment maintained a record of deaths in the growing city and suburbs and reported from 4000 to 8000 annually. Registration was still affected by migration. The great famine of 1803–4 brought many desperate migrants, and so Bombay recorded 26,000 deaths in that year (Gazetteer 1910: 3, 161–7). Certain diseases were thought inevitably associated with certain climates. Visitors and residents in humid areas expected seasonal bouts of malarial fever, for example, both in India and elsewhere (Guha 1999: 39, 50).

Benefit–cost analysis of plague quarantines

We do not know when the bubonic plague first appeared in India. It certainly revived in the seventeenth century. It appeared, however, at long intervals, which clearly made it noteworthy, unlike everyday diseases like malaria, dysentery or even cholera. Apart from the plague outbreak noticed by Jahangir in 1616, other major outbreaks occurred in the Mughal camp in South India, at the port city of Surat and the small English colony in Bombay, between 1684 and 1702 (Gazetteer 1910: vol. 3, 164–5). The deployment of European statistical and quarantine methods in public health is illustrated by responses to the plague in Bombay and Madras respectively.

The bubonic plague was smoldering in Egypt when the country was thrown into turmoil by the French invasion of 1798. Indian soldiers were shipped out to assist with the destruction of the French Army, which however surrendered before they got there. But the soldiers, despite all the precautions available at the time, soon picked up the bubonic plague. A strict quarantine was immediately imposed. One battalion of Bombay Native Infantry was left behind in Egypt for a considerable time until plague cases ceased to appear in its ranks. Units were then slowly returned to Bombay and Madras. It should be remembered that this involved a sea journey lasting from 2 to 6 weeks. It is entirely possible that the chain of transmission among rats was broken in that time. Nair draws an interesting contrast between the attitudes of the local administrations of the two cities (each the capital of a Presidency government).

The Madras government was anxious to avoid the expense of running a quarantine hospital and also to secure the cargos of rice and rum that had come aboard the ships from Egypt. The Bombay government was much more careful and detained ships and passengers for much longer periods before allowing them to return to normal life. Thus some 700 soldiers and camp-followers of the 7th Infantry regiment were detained for an entire month because it was known that there had been cases among them in Egypt. Their old clothing was burned and new clothes issued before their release (Nair 2009). Why this contrast in the responses of two similarly cash-strapped provincial governments? I suggest that it derived from the different political settings in which the policies were conceived.

The French in Egypt had already suffered severely from the plague. Military officers would thus have immediate knowledge of the damage that an incurable and lethal disease could inflict on any army. Bombay and the harbors dependent on it were going to be the base for military operations being planned against the Marathas. Soldiers and supplies would move through them. Madras harbor on the other hand was at this time, logistically unimportant. The operational front had shifted to northern Mysore (now part of Karnataka) from where Arthur Wellesley’s armies were probing the Maratha frontier to the north. I would therefore argue that this was not an idiosyncratic variation. It was calibrated to preserve the operational capacity of an army that was soon to face the challenging task of confronting the greatest extant Indian empire, that of the Marathas. This made expense a secondary consideration for Bombay but an important one for Madras. As it happened, this importation of the plague was successfully contained. But it was to return.

Background mortality and economic effects in India

When the British government began keeping records, it soon became clear that, even in the absence of major pandemics, India had very high death rates and a correspondingly low life expectancy. The latter varied regionally and by gender but was only between 20 and 34 years. The population was only maintained by high birth rates: only during the influenza pandemic decade 1911–21 did it actually shrink (Dyson 2018: 125–70). The economy had largely adapted. Cotton mills and dockyards, for example, maintained a floating labor force of “badli workers” who were called in to replace absentees. It was even possible for the Ahmedabad textile mills to function during the influenza outbreak with half their regular workforce (Times of India July 15, 1918).

This section will briefly consider the indirect signatures of major pandemics via the contrast of ‘normal background’ and peak mortality. “Fever” of various kinds was widely reported as a cause of death. Some of this may have been malaria, some other infections such as typhoid and some viral diseases like influenza. In a few instances it referred to disguised plague cases, reported post-mortem to avoid state intervention. One indicator of the arrival of a new disease would be a spike in fever deaths. Even today, the concealed effects of new epidemics are partly calculated indirectly via the height of peaks of mortality (Wall Street Journal 2020).

Now, a weather-induced harvest shortfall would reduce employment and earnings in the countryside, and through a rise in prices also affect wage-earners, artisans and the poorer classes in towns. Christophers observed of the poorer classes generally that “in times of scarcity these are accustomed to adapt themselves to circumstances by proportionately restricting the amount of food they take” (cited in Guha 2001: 85).

Thus weakened, numbers of people would fall prey to the infections already present around them, or dormant within them. There would be a sudden multiplication of foci of unusually acute infection—and often ambulant foci at that. Exposure would increase, and those lacking immunity, as well as those whose immunity could not withstand a heavy dose of infection, would succumb. Secondary infections could follow the primary ones. So death from disease would reach out from the poorest to affect other classes. When crop failures were frequent the successive waves would overlap, maintaining a generally elevated level of mortality and a succession of peaks. On the other hand, greater agricultural stability would feed back into lower mortality via a reduction in the morbidity of the more vulnerable sections of the population.

I have shown elsewhere that the years from c.1890 to c.1925 were a period of low population growth, very largely as a consequence of the great famines and epidemics that marked this period. Despite this, the population of India maintained a slow upward trend of approximately 0.5% a year in years without major calamities in large parts of the country.
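For scale, a trend of this sort can be translated into a doubling time with the standard compound-growth formula. A minimal sketch in Python (the 0.5% figure is from the text; the doubling-time arithmetic is illustrative, not a figure from the source):

```python
import math

# Compound annual growth: P(t) = P0 * (1 + r)**t
r = 0.005  # ~0.5% a year, the trend rate cited in the text

# Doubling time: solve (1 + r)**t = 2 for t
doubling_time = math.log(2) / math.log(1 + r)
print(round(doubling_time))  # ~139 years to double at this rate
```

At roughly 139 years to double, such a trend was easily swamped by the mortality peaks of famine and epidemic years.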

Cholera: the first Indian pandemic to go global

This ancient disease has long been endemic and epidemic in the Indian subcontinent. Low-lying areas near or on major river-courses were endemic foci. Pollitzer wrote a major review of the modern epidemiology of the disease. He thought it “probable that the development of a herd immunity in the endemic areas is apt to lead to a kind of equilibrium between the causative organisms and the host population” (Pollitzer 1957: 789). This allowed cholera to remain endemic in the eastern part of the Ganges valley, flaring up only when large numbers of non-immune persons gathered for major pilgrimages and fairs at places such as Hardwar and Prayag. But as most pilgrims would have moved on foot, many small outbreaks might simply die out as the sick perished before traveling far.

The familiar regional epidemic however flared up into a global pandemic after 1817. That year, the expanding British Empire in Asia commenced military operations against several Maratha states and their Pindari allies. British-trained armies moved from cantonments in Bengal westward to converge with troops coming from Bombay and Gujarat. The campaign involved extensive marching and counter-marching in pursuit of various Indian armies. This was not the first time soldiers and their camp-followers had marched out of the lower delta. But perhaps the forces involved were both larger and more mobile. At any rate, it was these campaigns that led to the spread of cholera out of Bengal and across the sub-continent. Indian and English soldiers both succumbed in large numbers, as did camp-followers. It is likely that recovered patients and immune carriers also played a major role in its transmission. Periodic outbreaks became focused around major pilgrimages far from the Ganges valley, such as the gathering at Pandharpur in central Maharashtra (Sholapur Gazetteer 1880: 486–90).

This cholera pandemic slowly spread out of India, through the Mediterranean and ultimately across the oceans to the United States. But it traveled slowly, as befitted the age of sail, first appearing across Russia and then in western Europe. It was fifteen years before it reached the US, and it returned four times after its initial appearance in 1832–34. “After this 2-year visit, North America was free of the disease until the winter of 1848–49. Between 1849 and 1854, however, no twelve-month period passed without cholera appearing in some part of the United States. Then the disease disappeared as abruptly as it had in 1834; it was not to return until 1866.” The causative organism was identified by Robert Koch in 1883 and its water-borne nature firmly established as well. The disease last appeared in the United States in the 1890s (Rosenberg 1987: 3–7). Rosenberg makes the astute observation that the London doctor John Snow, who offered the first evidence of the water-borne nature of the disease, was able to calculate the differential cholera death-rate only because England (and London) had initiated the universal registration of deaths and births a decade earlier (Rosenberg 1966). As with a range of diseases, its epidemiology was understood because of the apparatus of monitoring and control that went back to the European city-states of the sixteenth century, even while its causative agent remained a mystery.

But modern institutions can have paradoxical effects. The disease had accompanied British and Indian soldiers out of its ancient home in 1817. It once again moved with soldiers in 2010. That year, the breakdown of internal order in Haiti after the great earthquake caused a UN peace-keeping force to be posted there. This included a unit from Nepal which evidently had asymptomatic carriers in its ranks. Their camp allowed imperfectly treated sewage to drain into a nearby river and sparked a still-enduring epidemic that brought this ancient scourge back into the Western hemisphere after many decades. The outbreak sickened more than 800,000 people, roughly 7% of the population of Haiti, and killed at least 10,000 in the years that followed. The UN promised to raise $400 million, but maintained its legal immunity from civil litigation (Orata 2014; Washington Post 2020).

Cholera in British India became a larger concern after British authorities realized the general effects of disease on their armies there and elsewhere in the Empire. But that understanding only became possible after different army commands began to require units, both large and small, to maintain and transmit registers of sickness and mortality in their ranks. These were then analyzed using the statistical methods being evolved in Britain. But army units were accompanied by vast numbers of camp-followers, and soldiers inevitably interacted with the local population, whether in licit liquor-shops or illicit brothels. Gradually, therefore, surveillance began to extend to the ‘native’ population, though their health was quite secondary to that of soldiers and officers. Massive programs of barrack construction and the development of cantonments at healthy locations were pushed ahead. On the other hand, bases which showed high average mortality and sickness were evacuated (Guha 2001: 110–139).

Control efforts and economic damage: the Bombay plague 1896–1910

As we have seen, the bubonic plague did not increase to epidemic proportions in Madras or Bombay in 1801–2. The Plague Commission of 1900 inquired closely into the history of previous outbreaks. It concluded that, except for the district in the immediate neighborhood of Kumaun (now in Uttaranchal State), where there was a local outbreak in 1853–4, there had been no plague in the plains of India after 1836. The disease only reappeared in Bombay during the rainy season of 1896, as part of the first global pandemic of this disease. It appears likely that it had come from Hong Kong, and the first known case in India was a warehouse worker in Bombay handling fireworks imported from China. The Plague Commission argued that it was also possible that it came from some endemic focus in the Persian Gulf or Arabia (Plague Commission 1898: vol. 5, 6–11). It is likely that the Commissioners knew that monitoring or limiting pilgrimages had few political costs but interfering in the important cotton yarn export trade to China would be an explosive issue.

The plague was recognized in September 1896 and the number of cases increased rapidly thereafter. Bombay was not only the most important port in the Western Indian Ocean, but also the provincial capital. While the pathogenic organism was known, the mode of transmission was still disputed among experts. The local government created a special committee with wide powers. The Plague Committee then attempted to enforce the complete segregation of those who had been in contact with a patient, the removal of the sick to special hospitals (where the great majority died), the burning of clothing and ‘disinfection’ of dwellings and alleys. Units of British soldiers accompanied teams of “sweepers” to enforce these measures. Such draconian enforcement was resisted and evaded. Riots broke out in several places.

Finally, fear of these arbitrary measures as well as of the disease itself led to an exodus from the city. The population was counted at 822,000 in the 1891 Census and was estimated to be 846,000 in October 1896. Initially, relatively small numbers left the city, about 20,000 in October. But as cases and deaths increased and coercive sanitary measures extended to much of the city, there was a veritable panic. In November and December, the authorities calculated that 171,500 left and in January 1897, another 187,400. By that time, the resident population was almost halved; perhaps only 450,000 were left in the city. The recorded plague deaths from 23 September 1896, when the first one was reported, to March 31, 1897 were 9,142. The Commission pointed out however that the city had recorded 19,843 excess deaths if calculated against the average of the previous 5 years. But these latter deaths occurred when the underlying population was in fact much smaller as a result of the mass flight. This extraordinary excess mortality therefore meant that many plague deaths had passed unrecorded as such (Plague Commission 1898: vol. 5, p. 11). It also testifies that families preferred to live in close proximity to the sick rather than send them to hospitals and themselves suffer ham-handed measures of ‘sanitation’. (It is clear therefore that the modern fear of contagion that has sometimes resulted in doctors being asked to vacate their flats did not exist at the time.) Then in June 1897, semi-starved victims of the great famine just beginning in the interior of peninsular India began coming into the city seeking a livelihood. Many then succumbed to plague and other diseases.
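The Commission's inference can be restated as simple arithmetic. A minimal sketch in Python, using the figures quoted in the text (the gap between recorded and excess deaths is the Commission's point; the per-capita adjustment for the exodus is an illustrative assumption, not a computation from the source):

```python
# Figures from the Plague Commission (1898: vol. 5, p. 11) as quoted in the text.
recorded_plague_deaths = 9_142    # 23 Sept 1896 to 31 March 1897
excess_deaths = 19_843            # deaths above the previous 5-year average

# Deaths attributable to the epidemic but never recorded as plague:
unrecorded = excess_deaths - recorded_plague_deaths
print(unrecorded)  # 10701

# The exodus roughly halved the resident population, so the same
# absolute death toll implies a much higher per-capita rate.
pop_october_1896 = 846_000   # pre-flight estimate
pop_early_1897 = 450_000     # after the exodus, per the text
rate_multiplier = pop_october_1896 / pop_early_1897
print(round(rate_multiplier, 2))  # 1.88
```

In other words, more than half the epidemic's toll was hidden under other causes of death, and the true death rate among those who stayed was nearly double what the raw counts against the pre-flight population would suggest.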

The city authorities realized too late that the specialized sanitary workers (called halalkhor) on whom street cleaning and general sanitation depended might also flee. That would not only negate the Committee’s efforts but quickly make the city uninhabitable. The Committee was abolished and community leaders recruited to develop more acceptable measures. Meanwhile the plague spread inland, following the railroad lines.

Bombay city gradually became accustomed to seasonal epidemics of the plague. The highest weekly totals came in February or March of each year, with over 2000 plague deaths recorded weekly for many years. Even in 1909–10, 1152 plague deaths were registered in the deadliest single week (Gazetteer 1910: vol. 3, 174–77). But business life returned to normal in this most business-minded of Indian cities: the coercive imposition of sanitary measures had damaged the economy more than the disease ultimately did.

Mechanized cotton spinning and weaving factories were established in the suburbs of Bombay after 1854. The opening of railway lines into the cotton-growing districts of Central India and Gujarat by the late 1860s sharply reduced the cost of transporting raw cotton to Bombay and expanded interior markets for the cotton yarn used by weavers there. The further opening of the Chinese market following the Anglo-French attack on China in 1856–60 led to a surge in yarn exports there. All this stimulated mill production in Bombay, aided by the steady depreciation of silver-based currencies like the rupee, until the Government of India stepped in to “fix” the exchange against gold-backed currencies after 1893, revaluing the rupee in the process. After that, the mill industry struggled to retain the silver-based Chinese market against Japanese competition.

Then the massive flight of mill-workers resulting from plague control measures forced many factories to shut down for several months in 1897. The Times of India’s annual review noted the great fall in Bombay exports to Chinese ports, which declined by 60% between January and May. The direct export loss was calculated at Rs. 1.11 crores. It was only partly compensated by the rise of cotton yarn exports from Eastern Indian mills shipping via Calcutta. But the shortfall in Bombay exports especially benefited the emerging Japanese factories, which established a footing in the Chinese market that they never subsequently lost (Eduljee 1898). Plague peaked countrywide in 1903–4 and then declined gradually (Dyson 2018: 141–46).

But apart from the loss of export earnings, there was obviously also a fall in production for the domestic market, and a loss of incomes for the many associated with the mill industry and for the shops and establishments that they patronized. Meanwhile, the plague bacillus migrated to the Americas and established a foothold in the Rocky Mountain states that it has never relinquished. In 2015, for instance, there was a cluster of 11 recorded cases and three deaths (ECDC 2016). The number of visitors to Yellowstone Park the next year was unaffected, increasing by 160,000 to 4.26 million, its peak for the decade (Statista 2020). Far more dramatic effects were experienced in India a few years earlier, in 1994.

The economic value of impression management: Surat plague of 1994

Bombay at the outset of its plague pandemic in 1896 had about one-third the population of the city of Surat in 1994. Surat in the 1990s was, like Bombay/Mumbai, deeply knotted into the world market. It had a strong informal sector and had grown into a major hub of diamond and gemstone cutting and polishing, as well as an important powerloom center. Vast numbers of migrants from eastern India provided low-cost labor for such businesses. Meanwhile, thanks to a century of monitoring, the use of new insecticides beginning with DDT, and perhaps immunity in the rat population, the plague had slipped from the headlines. But it still lurked among rodents and, in retrospect, it is thought that it may have spread to humans in Latur (Maharashtra) after the earthquake of 1993 forced rats out of their dens and into contact with humans.

Textiles and gemstones were in 1994 an important part of India’s exports as it strove to re-orient the economy after the near-default of 1991. Bookings for the winter tourist season had picked up with the devaluation of the rupee. The reforms just initiated by the Narasimha Rao government were receiving international attention, especially after the dramatic success of the Chinese economy following its reforms fifteen years earlier.

It was in this setting that Surat reported its first cases of the pneumonic variety of the plague on September 23, 1994. The news spread across India and worldwide. The general image of the country as dirty and backward encouraged panic. Many countries imposed restrictions or bans on travelers and goods from India (AP 1994). Meanwhile, panic spread among Surat residents, both at the disease and at rumors of compulsory quarantine detention. An estimated 700,000 persons fled the city within two weeks, scattering to all parts of India. Some of them were already ill. There were 57 deaths in all, 54 of them in Surat. The state and central governments acted swiftly. By October 5, an international delegation including Russian and American doctors certified that the plague had been extinguished in India (Dutt 2006). But the two-week panic had dramatic effects on the economy. Initial estimates had put lost exports at $1 billion. But the Commerce Minister, Pranab Mukherjee, later estimated the loss at approximately $700 million, not insignificant given that India’s total exports in 1994–95 were only about $25 billion (News-India 1994). Domestic income losses are impossible to estimate, but they would certainly have been several times the value of lost foreign earnings.
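The scale of the loss relative to total trade can be checked with a quick back-of-envelope calculation, using only the figures cited in the text:

```python
# Estimated export loss from the 1994 plague scare, as a share of total exports.
loss = 0.7e9            # ~$700 million (the later official estimate)
total_exports = 25e9    # India's total exports, 1994-95, ~$25 billion
print(f"{loss / total_exports:.1%}")  # prints 2.8%
```

Losing nearly 3 per cent of a year's export earnings to a two-week panic, with only 57 deaths nationwide, underlines the article's point that the economic damage flowed from impression management failures rather than from the disease itself.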

We may contrast this with the plague mortality of a relatively good year, 1917–1918. The monsoon rains of 1917 were exceptionally abundant and allowed a considerable increase in sown area. But the season also saw a resurgence of the bubonic plague as burrows were flooded. The government recorded a total of 587,000 deaths as being from this cause in 1917. Bombay province suffered 163,000 of these. But officials, businessmen and the public were now inured to this disease. Bombay was an important logistical base for British imperial armies operating in the Middle East and markets were booming. The next year, 1918, saw an increase in plague deaths (alongside the catastrophic influenza) to 621,000 across British India before the disease leveled off (Statistical Abstract 1926: Table 187).

The last great peak of mortality, certainly the highest recorded in the entire twentieth century, came with the great influenza pandemic of 1918 (Guha 2001: 68–94; Table 1 below). It arrived with troop movements as World War I was ending. But it came too quickly and was simply beyond official managerial capacities: it caused enormous mortality and then vanished.

Table 1.

Influenza Mortality 1918: Contemporary official estimate (Compiled from MoH 1921)

Province (ranked by influenza death rate) | Population, 1911 Census (000s) | Estimated total deaths up to Nov. 30, 1918 | Death rate per 1,000
CP and Berar | 13,916 | 790,820 | 56.8
Delhi | 417 | 23,175 | 55.6
Bombay | 19,587 | 900,000 | 45.9
Punjab | 19,337 | 816,317 | 42.2
North-West Frontier | 2,041 | 82,000 | 40.0
UP | 46,821 | 1,072,671 | 22.9
Madras | 40,006 | 509,667 | 12.7
Assam | 6,052 | 69,113 | 11.4
Bihar and Orissa | 34,490 | 359,482 | 10.3
Burmah | 9,856 | 60,000 | 6.0
Bengal | 45,329 | 213,098 | 4.7
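The death-rate column of Table 1 is internally consistent with the other two columns. A small sketch (illustrative only, using two rows of the table) recomputes it:

```python
# Recompute "death rate per 000" from population (in thousands) and deaths.
rows = {
    "CP and Berar": (13_916, 790_820),  # population (000s), estimated deaths
    "Bengal": (45_329, 213_098),
}
for province, (pop_000s, deaths) in rows.items():
    # Dividing deaths by population-in-thousands yields deaths per 1,000 persons.
    rate_per_1000 = deaths / pop_000s
    print(f"{province}: {rate_per_1000:.1f}")
```

Both values match the table (56.8 and 4.7), and the same check holds for the remaining provinces.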

The influenza of 1918–19 in India

An official report compiled soon after the end of the epidemic stated confidently that no part of the world suffered as acutely as India did from this epidemic (MoH 1920). The Sanitary Commissioner to the Government calculated that up to the end of November 1918 there had been five million deaths in British India, with a guess of one million more in the princely states. “The total mortality in India in October 1918 is without parallel in the history of disease” (MoH 1920: 383). This was also the conclusion reached by Mills (1986), who re-analyzed the data on a provincial basis. He also flagged the effects of the failure of the South-west monsoon in autumn 1918. It enhanced mortality generally and also led to famine migration from what is today Maharashtra and Gujarat into Bombay city. Since the flu had first been introduced there, probably by military shipping, this resulted in large numbers of the new arrivals succumbing to the disease. Morbidity was “estimated to be in excess of 50 per cent of the general population, and with the concentration of severe attacks in the most productive age range 20–40, the effect on agricultural production was extreme” (Mills 1986: 35). This combination of sickness and drought resulted in a 19 per cent decrease in the area under food crops and a 15 per cent decrease in the non-food area compared to the previous year (Mills 1986: 34–36). Table 1 shows estimated influenza deaths by major province of British India, up to November 30, 1918. The actual totals have been re-calculated by demographers to be in the range of 10–14 million deaths, or about 4 per cent of the 1911 population (Dyson 2018: 149).

In a still-unpublished paper, Donaldson and Keniston sought to estimate the effects of this great epidemic in order to test whether the late colonial economy was in a Malthusian trap. The data do not credibly permit such a test. As they realized, the effects of the 1918 epidemic were masked by the fact that it was also a drought year in several regions, including Punjab, Sind and Western and Central India (Donaldson and Keniston 2016). It is thus difficult to separate the reduction in sown area resulting from the shrinkage of the workforce owing to illness from the impossibility or pointlessness of ploughing and sowing rock-hard fields. It could be argued that the incapacitation from illness matched a corresponding fall in labor needs in agriculture. Indeed, the monsoon failure was so acute that famine was declared and relief measures implemented in the Central Provinces and UP in 1918 (Mills 1986: 34). The Moral and Material Progress Report for 1919 declared the monsoon failure of autumn 1918 to be the worst since the famine year of 1899–1900. Regional rainfall data bear this out, with the driest regions suffering the most. It had a dramatic effect on India’s international trade: measured at constant prices, exports fell 16 per cent and imports 6 per cent compared to 1917–18. “The effects of the failure of the monsoon are unmistakably evident from these figures” (1920: 83–85, 99–100, quote p. 84; Statistical Abstract 1926: Table 287).

Figure 1 is a scatter diagram of infant mortality rates per 1000 births, adjusted for deficient registration of births and deaths in British India (from Guha 2001). The 1918 influenza shows up as an extreme value, the highest recorded. But even the lowest rates are over 150 per thousand. This has been used to explain the high birth rates prevailing historically: families depended on their children for support, and several births might be needed to ensure the survival of a single son into adulthood. Great pandemics like the influenza could wipe out a whole generation of young men. But the psychological and economic costs of repeated pregnancy and lactation for often malnourished women should also be counted among the effects of high background and epidemic mortality. A British official commented on the influenza-affected plateau of eastern Maharashtra that in “thousands of villages of the plateau the decrease in the number of able-bodied males was appalling, but ‘heirs’ were at once forthcoming, often summoned from great distances…” (cited in Donaldson and Keniston 2016: 20). Paradoxically, it may be that the effects of the extremely severe drought of 1918 were masked by those of the epidemic.

Fig. 1 Infant mortality per 1000 live births in India

The decay of the information apparatus

The newly emerged Republic of India faced many challenges from the moment of its birth (Guha 2017). It nonetheless launched several successful public health measures and extended the rudimentary infrastructure inherited from earlier times. Some of these challenges were met with successful technological fixes, of which the Green Revolution is the best known. Coming between the droughts of 1965–7 and those of 1972–3, it probably averted a disastrous famine in 1973–4, at a time when oil prices rose dramatically and drained India’s foreign exchange reserves. A less noted but very important advance was the suppression of malaria and, as a collateral benefit, of the bubonic plague through the widespread use of DDT in the 1950s (Dyson 2018: 192–93). Many Punjab refugees were successfully rehabilitated as farmers in hitherto malaria-infested sub-Himalayan regions of UP. The population decline in hilly North Karnataka caused by malaria and plague combined in 1921–41 was reversed. But industrialization, mass migration, the availability of individualized medical care and simple governmental overstretch then led to neglect of disease monitoring and death registration. This is reflected in the quality of statistical publications (excepting the decennial Censuses). I have discussed this extensively (Guha 2001: 156–173). I give a short excerpt as a warning to other researchers who may try to use these sources without intensive scrutiny:

“There is hardly a year in which all the states sent in their reports [of notifiable diseases]: for example the 1986 Statistical Abstract has figures for 1985 for Himachal, 1958 and 1970 for Bihar, 1971 for J & K, 1983 for Kerala, 1978 for Gujarat, 1976 for UP, and 1970 for West Bengal. Again, particular diseases seem to be omitted or reported as the whim of state governments takes them. For example, UP has no cases of hookworm after 1971, although it continues to be found in other states. Rabies is eradicated even sooner: no cases reported after 1966. No cases of accident, poisoning and violence are reported after 1971 from that state… the unfortunate Union Territory of Arunachal has 278,000 cases of accidents, poisoning and violence in 1965, and 211,000 in 1966: possibly this reduces the population sharply, because the next highest figure is 23,802 in 1977.”

It is possible also that confidence in the curative powers of modern medicine led to a neglect of surveillance and recording. It was in this setting that the Surat plague outbreak caught the Central and State governments by surprise. Neither disease management nor impression management was taken in hand before the Indian economy was seriously damaged.

Conclusion

This article has given the reader a historical horizon from which to consider the current COVID pandemic. The pandemic is one of the unexpected consequences of globalization. This article has introduced the historical conditions for the emergence of global outbreaks of disease. Human diseases, however, are hardy and ever-changing. The travels of plague and cholera illustrate how they can spread with the most rudimentary forms of transport. I have also related pandemics and their effects to the information apparatus that generates knowledge about them. My article has shown how quarantine measures could cause serious economic disruption and force the retreat of even the powerful British colonial state. It has related the rise of the information apparatus to the institutionalization of state responsibilities in modernizing economies. Finally, it has shown that information is a double-edged sword: disease management and impression management are both important sides of the economic life of pandemics.

Compliance with ethical standards

Conflict of interest

The author has no known conflicts of interest.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. AP (1994). Many nations take precautions to avoid India’s plague. Associated Press Worldstream, September 30.
  2. Babur (2002). The Baburnama: Memoirs of Babur, Prince and Emperor. Edited and translated by Wheeler M. Thackston. New York: The Modern Library.
  3. Bernier, F. (1914). Travels in the Mogul Empire, A.D. 1656–1668. Translated by A. Constable, 2nd ed. revised by V. A. Smith. London: Oxford University Press.
  4. Brend, W. (1907). Bills of mortality 1907. https://upload.wikimedia.org/wikipedia/commons/2/2d/Bills_of_Mortality.pdf, downloaded April 29, 2020.
  5. HMSO (1919). Condition of India 1917–18. Statement Exhibiting the Moral and Material Progress and Condition of India, ordered by the House of Commons to be printed. London: HMSO.
  6. HMSO (1920). Condition of India 1919.
  7. Crosby, A. W. (2004). Ecological imperialism: The biological expansion of Europe, 900–1900. 2nd ed. Cambridge: Cambridge University Press.
  8. Donaldson, D., & Keniston, D. (2016). Dynamics of a Malthusian economy: India in the aftermath of the 1918 influenza. Unpublished first draft, available at https://pdfs.semanticscholar.org/0804/e2d07a55892cdeba72f8f15df241698a16f7.pdf.
  9. Dutt, A. K., Akhtar, R., & McVeigh, M. (2006). Surat plague of 1994 re-examined. Southeast Asian Journal of Tropical Medicine and Public Health, 37(4), 755–760.
  10. Dyson, T. (2018). A population history of India: From the first modern people to the present day. Oxford: Oxford University Press.
  11. ECDC (European Centre for Disease Prevention and Control) (2016). Annual Epidemiological Report 2016–Plague. http://ecdc.europa.eu/en/healthtopics/plague/Pages/Annual-epidemiological-report-2016.aspx.
  12. Eduljee (1898). Mr. P. Eduljee’s Annual Bombay Yarn Report. Times of India, February 28, via ProQuest Historical Newspapers.
  13. Etheridge, A. T. (1868). Compiled Report on Past Famines in the Bombay Presidency. Bombay: Education Society’s Press.
  14. Gazetteer (1910). Gazetteer of Bombay Town and Island. Bombay: The Times Press.
  15. Government of India (1918). Report on the Moral and Material Progress of India, 1917–18.
  16. Guha, R. (2017). India after Gandhi: The history of the world’s largest democracy. New York: Pan Macmillan ebook.
  17. Guha, S. (1999). Environment and Ethnicity in India, 1200–1991. Cambridge: Cambridge University Press.
  18. Guha, S. (2001). Health and population in South Asia from earliest times to the present. Delhi: Permanent Black.
  19. Guha, S. (2015). Rethinking the economy of Mughal India: A lateral view. Journal of the Economic and Social History of the Orient, 58(November), 532–575. doi:10.1163/15685209-12341382.
  20. Jahangir (no date). Tuzuk-i-Jahangiri or Memoirs of Jahangir. Translated by Henry Beveridge. Calcutta: Asiatic Society of Bengal.
  21. Kingsland, S. E. (1995). Modeling nature: Episodes in the history of population ecology. 2nd ed. with new afterword. Chicago: University of Chicago Press.
  22. MoH (1920). Ministry of Health. Report on the pandemic of influenza, 1918–19.
  23. McNeill, W. H. (1976). Plagues and peoples. Garden City: Anchor Press.
  24. Mills, I. D. (1986). The 1918–1919 influenza pandemic: The Indian experience. Indian Economic and Social History Review, 23(1), 1–40. doi:10.1177/001946468602300102.
  25. Nair, A. (2009). ‘An Egyptian Infection’: War, plague and the quarantines of the English East India Company at Madras and Bombay, 1802. Hygiea Internationalis: An Interdisciplinary Journal for the History of Public Health, 8(1), 7–29. doi:10.3384/hygiea.1403-8668.09817.
  26. News-India (1994). Plague impact on exports less than feared. News India–Times (New York, N.Y.), November 18, 1994: 20. Via EBSCO.
  27. Orata, F. D., Keim, P. S., & Boucher, Y. (2014). The 2010 cholera outbreak in Haiti: How science solved a controversy. PLoS Pathogens, 10(4), e1003967. doi:10.1371/journal.ppat.1003967.
  28. Plague Commission (1898). The Plague in India, 1896, 1897. Compiled for the Government of India by R. Nathan. Simla.
  29. Pollitzer, R. (1957). Cholera studies No. 10. Bulletin of the WHO, 16, 783–857.
  30. Roe, T. (1899). Embassy of Sir Thomas Roe. Edited from contemporary records by William Foster. London: Hakluyt Society.
  31. Rosenberg, C. E. (1987). The Cholera Years: 1832, 1849 and 1866. Chicago: University of Chicago Press.
  32. Rosenberg, C. E. (1966). Cholera in nineteenth-century Europe: A tool for social and economic analysis. Comparative Studies in Society and History, 8(4), 452–463. doi:10.1017/S0010417500004229.
  33. Sholapur Gazetteer (1880). Gazetteer of the Bombay Presidency. Vol. XX: Sholapur. Bombay: Government Press.
  34. Smith, R. S. (1936). Barcelona ‘Bills of Mortality’ and population, 1457–1590. Journal of Political Economy, 44(1), 84–93. doi:10.1086/254886.
  35. Statista (2020). https://www.statista.com/statistics/254231/number-of-visitors-to-the-yellowstone-national-park-in-the-us/, copied May 24, 2020.
  36. Statistical Abstract (1926). Statistical Abstract for British India, 1915–16 to 1924–25, ordered by the House of Commons to be printed. London: HMSO.
  37. Times of India (Bombay), accessed via ProQuest.
  38. Wall Street Journal (2020). https://www.wsj.com/articles/covid-19s-exact-toll-is-murky-though-u-s-deaths-are-up-sharply-11589555652?mod=hp_lead_pos7, copied May 15, 2020.
  39. Washington Post (2020). Retrieved May 8, 2020 from https://www.washingtonpost.com/opinions/the-uns-neglect-of-another-recent-pandemic-stains-its-legacy/2020/05/05/c22ace5e-8be1-11ea-ac8a-fe9b8088e101_story.html.
  40. Zinsser, H. (1935). Rats, Lice and History. Boston: Little, Brown.

Articles from Indian Economic Review are provided here courtesy of Nature Publishing Group