The Milbank Quarterly. 2003 Dec;81(4):603–626. doi: 10.1046/j.0887-378X.2003.00296.x

“Evil Habits” and “Personal Choices”: Assigning Responsibility for Health in the 20th Century

Howard M. Leichter

The 20th century began and ended with many of the nation's more affluent and better-educated citizens in a near-frenzied pursuit of better health through lifestyle modification. Americans sought and received health-enhancing and disease-preventing advice in books, newspapers, popular magazine articles, and government studies. They followed exercise regimes and bought self-improvement paraphernalia from “medicine balls” to elliptical machines in order to learn “How to Avoid Heart Troubles” (Groedel 1901) and, even more ambitiously, to stay Forever Young (Berger 1989). Upton Sinclair alerted early 20th-century Americans to the appalling conditions in the meatpacking industry (1906), and nearly a century later Eric Schlosser warned of the “Dark Side of the All-American Meal” (2001). Finally, San Franciscans, fretting about the nutritional dangers to their children's health, began the last century by banning “roving pie vendors” who catered to the “habitual pie-eating” habits of schoolchildren (Habitual Pie-Eating Ruining Health of Children 1910) and began this century by prohibiting the sale of soft drinks and other unhealthy snacks on school campuses (Delgado 2003).

At the center of all the health promotion admonition, advice, and advocacy was this question: To what extent were morbidity and premature mortality self-inflicted, the result of uninformed, careless, and avoidable personal behavior? Although many experts emphasized the impact of such collective factors as racism, declining moral standards, capitalism, and urbanization or suburbanization on people's health, they placed much of the blame for seemingly avoidable morbidity and premature mortality on Americans' allegedly careless and imprudent personal lifestyle choices. The persistence of holding the individual responsible for his or her own health status has its genesis in one of the most distinctive historical features of American culture and politics, namely the extraordinary emphasis on individual rights and responsibilities. According to Michael S. Goldstein, “The veneration of the individual is a hallmark of classic American views of religion, society, and medicine” (1992, 23). And in his book on fitness movements in the United States, Harvey Green refers to “the persistent American tendency to equate success [in becoming fit] with individual effort and failure with individual responsibility” (1986, 322).

The 20th century began and ended with many of the nation's health policymakers and opinion shapers blaming individuals for their own ill health. Despite this common theme, assumptions about the character, causes, and consequences of personal irresponsibility in health matters differed in important respects, particularly with regard to Americans' freedom to choose healthier lifestyles, the role of science and medicine in promoting health, the alleged dangers of personal irresponsibility, and the moral authority on which these debates were based.

Because so much of the early 20th-century literature emphasized the collectivist and institutional origins of the nation's ills, including its public health problems, I begin with an overview of these origins and then discuss the alternative view that many of the nation's health problems were self-inflicted, the result of uninformed, imprudent, or immoral personal decisions.

Social or Collectivist Explanations for Illness and Premature Death

In the two decades before World War I, often known as the Progressive Era, “a great reform spirit shook American society” (Resek 1967, xi). During this period many social reformers—Upton Sinclair, Jane Addams, Lincoln Steffens, and Ida Tarbell—attributed the nation's ills to a variety of social conditions and institutions, such as industrialization, urbanization, secularization, and the breakdown of the extended family. James T. Patterson, a historian at Brown University, summarized the institutional or collective perspective held by many social workers:

The basic causes of poverty were not personal weakness but two deeper problems: an economy insufficiently abundant to provide subsistence for all the able-bodied, and a social order that inequitably distributed what wealth there was. In these fundamental ways poverty was then and continued to be a structural, not a moral, matter.

(Patterson 2000, 6)

Other social critics focused on the pathologies of capitalism. Samuel Hopkins Adams, the managing editor of McClure's, a popular “muckraking” magazine, characterized the influence of American business leaders on the public's health as “malignant” and “incredibly stupid” (Adams 1908, 246–7), and presidential candidate and New Jersey governor Woodrow Wilson, hardly a political radical, spoke of the “heartless” American economic system (Williams 1973, 390). Indeed, the very nature of an urban-based, industrial capitalist marketplace was itself seen as endangering the health of Americans. As one contemporary observer noted:

The stress and strain, worry and anxiety attendant on fierce competition in business and professional life is enervating and devitalizing. It embarrasses or suspends organic function and lessens resistance to morbid influences. As a consequence we fall easy victims to almost any disease, such as pneumonia, tuberculosis, typhoid fever, cancer and grave kidney lesions.

(Rucker 1906, 1840)

Social scientific analyses confirmed the anecdotal and subjective critique of the health-harming consequences of American capitalism. Typical of this analysis was the 1907/8 Pittsburgh Survey, a multivolume study of working conditions in that city's steel industry: “The health of the workers … is affected by social conditions and industrial environment, wages paid, hours of work required, and the nature of the occupation itself” (Butler 1911, 358). The study documented how dangerously long, arduous workdays, along with dust and debris in overcrowded and inadequately ventilated workplaces, contributed to digestive, pulmonary, nervous system, and infectious diseases. Other empirical studies attributed the high death and disease rates to overcrowded and unsanitary housing, the unsafe handling of food, poor ventilation in places of public entertainment (e.g., movie theaters), and inadequate public sanitation and human waste disposal (East Chicago Department of Public Health 1916). Progressive Era reformers argued that all these changes conspired to undermine the health of the American people as well as their social and moral fabric.

Death, Disease, Disability, and the Individual

While many Progressive Era critics underscored the social etiology of disease, an alternative—at times complementary—explanation also enjoyed widespread support. In this view, through ignorance, irresponsibility, indifference, or immorality, individuals were at least complicit in their own physical ills and misfortunes. One particularly noteworthy turn-of-the-century version of this perspective was W.E.B. Du Bois's classic sociological analysis, The Philadelphia Negro: A Social Study (1899). Du Bois found that black mortality rates from consumption (TB) and pneumonia were double those of the white population. Like many Progressive Era scholars, Du Bois attributed the high death rate in part to the working and living conditions of black Americans: “Bad ventilation, lack of outdoor life for women and children, poor protection against dampness and cold are undoubtedly the chief causes of this excess death rate” (Du Bois 1967, 152).

Du Bois, however, did not relieve his fellow African Americans of complicity in their own ill health.

Negroes are as a mass ignorant of the laws of health. One has but to visit a Seventh Ward church on Sunday night and see an audience of 1500 sit two or three hours in the foul atmosphere of a closely shut auditorium to realize that long formed habits of life explain much of Negro consumption and pneumonia.

He concluded that “in habits of personal cleanliness and taking proper food and exercise, the colored people are woefully deficient” (Du Bois 1967, 161). Both individual irresponsibility and social conditions were responsible for the unhealthy circumstances of Philadelphia's African-American population (Du Bois 1967, 160).

Du Bois was not alone in recognizing the interaction between personal imprudence and social conditions in the etiology of disease and early death. Sheila Rothman has observed that several Progressive Era analysts implicated both dangerous and unhealthy social conditions (e.g., tenement living) and risky personal behavior (e.g., unsanitary personal hygiene) in spreading tuberculosis among the urban poor. Referring to immigrants, a favorite target of reformist critics, Rothman noted that “explanations as to why the immigrant population was most prone to tuberculosis … focused equally on underlying social conditions and on personal moral failings” (1994, 184). This view was also held by S.A. Knopf, a German-born physician and professor of medicine at the New York Post-Graduate Medical School, who explained the prevalence of TB among urban tenement dwellers in terms of their being “badly housed, underfed [and] overworked” as well as “weakened by disease, intemperance and excesses” (1906, 1680, italics added).

Not just the poor but Americans in general engaged in excesses in their daily lives that undermined their own health. In 1908 one of the country's foremost “nerve” specialists enumerated some of Americans' ill-informed and harmful habits: “Too much smoking, too much drinking, too much worrying, too much working, will enfeeble the nerve vitality, depreciate the corpuscles, and make a man old before his time” (Baily 1908).

In sum, although collectivist explanations for the nation's health problems were important to the Progressive reformers' indictment of American society, many social commentators argued that people were not merely prisoners of the social and economic circumstances in which they found themselves. Instead, as a result of ignorance, laziness, immorality, or lack of willpower, too many Americans made foolish, self-harming, and socially costly behavioral choices.

We Are Our Own Worst Enemies

The 20th century ended with near-universal agreement among public health officials and academic observers that in regard to ill health, we are our own worst enemies. In 1990, for example, John Iglehart, editor of the influential journal Health Affairs, expressed the view of Americans' health that prevailed at the close of the century: “Most illness and premature death are caused by human habits of living that people choose for themselves” (Iglehart 1990, 4). The notion that irresponsible and seemingly avoidable lifestyle choices were the main causes of Americans' morbidity and premature mortality received the official imprimatur of the federal government in the 1979 report Healthy People: The Surgeon General's Report on Health Promotion and Disease Prevention: “Personal health habits play critical roles in the development of many serious diseases and in injuries from violence and automobile accidents. Many of today's most pressing health problems are related to excesses—smoking, drinking, faulty nutrition, overuse of medications, fast driving, and relentless pressure to achieve” (U.S. Dept. of Health, Education and Welfare 1979, 14).

Nearly a quarter of a century later, the nation's top health policy administrator was still calling for the improvement of individual and national health through more prudent lifestyle choices. According to Secretary of Health and Human Services Tommy Thompson, “To stem the epidemic of preventable diseases that threaten too many Americans, we need to move from a health care system that treats disease to one that avoids disease through wiser personal choices” (Pear 2002). Although many public health advocates and government officials identified environmental, genetic, socioeconomic, and cultural causes of illness and premature death, the national debate over health at the century's end was dominated by the belief that most of the bad things that happen to us are the result of poor judgment and foolish behavior (see, e.g., Rosner and Markowitz 1991, 6). The assertion by Joseph Califano, the former secretary of the Department of Health, Education and Welfare, that “we have met the enemy and they are us” appeared unassailable in light of the evidence linking the major causes of illness and premature death with various personal habits and daily behavioral choices (Califano 1986, 188). In fact, one study attributed eight of the nine leading causes of death, accounting for nearly 1 million deaths per year, to lifestyle-related decisions (McGinnis and Foege 1993).

The result of this concern with Americans' irresponsible lifestyle decisions was an outpouring of state and federal legislation to educate citizens about the dangers of their daily habits and, if necessary, to force them to behave more prudently. Thus the last decades of the century witnessed, among other public policies, the adoption of mandatory seat-belt use laws; restrictions on the advertising, sale, purchase, and use of alcohol and tobacco products; the requirement that food manufacturers prominently display the nutritional content of their products; the placement of health warning labels on everything from beer to electric appliances; revisions to the school curriculum relating to health awareness, physical fitness, and sexual activity; the banning of “junk food” from school cafeterias; and the widespread use of “sin taxes” to discourage certain kinds of harmful behavior, as well as to raise revenues (see Leichter 1991).

The private sector, too, joined this war on personal irresponsibility. Insurance companies gave preferential rates to nonsmokers and teetotalers (e.g., risk rating); restaurant menus offered “healthy choices” (endorsed by the American Heart Association); some employers required prospective and current employees to adopt healthier lifestyles as a condition of employment, retention, and eligibility for employee benefits (e.g., CNN will not hire smokers); some companies established wellness programs, opened exercise facilities for their workers, and funded smoking-cessation programs; and some companies (including U-Haul and Hershey Foods Corporation) raised overweight employees' health insurance copayments (Leichter 1997, 362).

Thus the 20th century ended much as it had begun, with Americans being exhorted to eat more wisely and drink more moderately, exercise more often, and lose more weight. Although the overall message of the two periods was similar, it also differed in important ways.

Choice and Chance at the Beginning and End of the Century

The exhortation to adopt a healthier lifestyle is predicated on the assumption that people have the capacity and freedom to make wiser choices. Implicit in this view is the Aristotelian notion that a person is responsible only for those acts that he or she freely and voluntarily chooses. Thus people who act involuntarily, through either coercion or unavoidable ignorance, should not be held responsible or blamed for their actions. Free and informed choice is, in turn, a function of two factors: (1) an individual's personal circumstances (e.g., socioeconomic status, level of education, type of occupation, and residence) and (2) the larger scientific and technological environment in which the individual lives.

At least since the mid-19th century, public health experts have known that the poor are less free than the affluent to make wise lifestyle choices. As one early 20th-century source put it: “The lesser cost of damaged goods is a fearful temptation of the slender purse of the ignorant women of the tenements; the stores where she buys her food-supplies offer but little choice for well or ill” (Godfrey 1909, 272). Ninety years later, an article in the New York Times entitled “As the Rich Get Leaner, the Poor Get French Fries” noted, “Whether by limited finances or personal preference, health concerns [with regard to nutrition] are a great divider between the haves and have nots” (O'Neill 1992). In fact, Meredith Minkler's end-of-century observation about the relationship between poverty and health has almost certainly always been valid:

Indeed, a voluminous body of evidence has demonstrated that social class is one of the major, and perhaps even the major risk factor for disease. Studies have in fact shown that there is a clear gradient in social class and mortality rates: Not only do people in the highest socioeconomic groups have the lowest mortality rates, but these rates increase at each correspondingly lower rung of the socioeconomic ladder.

(Minkler 1989, 20)

The poor have always had fewer opportunities to choose where they live and work, what foods they can eat, and how to spend their leisure time. But from a public health perspective they had even fewer choices at the beginning than at the end of the century because there were proportionally far more poor people in 1900 than in 2000. In his book America's Struggle against Poverty in the Twentieth Century, James T. Patterson estimated that about 40 percent of Americans in 1900 lived in poverty, compared with a little more than 11 percent in 2000 (Patterson 2000, 12). The problem was compounded by the clear relationship between educational achievement and risky behavior. For example, in 1999, 40 percent of those people who had not graduated from high school smoked cigarettes, compared with just 13 percent of those with an undergraduate degree. And just as there were proportionately more people living in poverty at the beginning than at the end of the century, so too at that time there were more poorly educated Americans. In 1910 just 13 percent of the population had graduated from high school, compared with 83 percent in 1999.

In part because of how they earned their living, the poor and poorly educated typically also had less latitude and ability to make wise lifestyle choices. As one early 20th-century commentator explained, “We know to-day that persons habitually engaged in hard indoor work present a higher mortality than persons more favorably situated, and that the character of occupations influences to a great extent not only the average expectation of life, but also the prevalence of certain diseases” (Kober 1901). Once again, the data suggest that people living at the beginning of the century were far more likely to be employed in risky jobs than were those at the end of the century, and therefore they were far more susceptible to work-related diseases and premature death. In 1900, for example, 42 percent of male workers were engaged in primary occupations such as forestry, mining, fishing, and farming, compared with just 4 percent in 1998.

In sum, certain objective conditions limit the ability of individuals to make choices that will enhance their prospects of living healthy lives. Poverty and poorer education left a larger proportion of Americans at the beginning of the 20th century less informed about, and less financially able to make, health-promoting lifestyle choices than at the end of the century.

The second factor affecting whether individuals are free to choose wisely is related to the development and dissemination of medical and scientific information. The choices of even affluent and well-educated people are only as good as the information on which they base their lifestyle decisions. Although rapidly expanding, this information was still quite limited at the beginning of the 20th century. In this regard, consider the seemingly trivial example of the middle-class women who placed their families at risk by buying the asbestos dining-room table pads advertised in a 1902 issue of Good Housekeeping (Asbestos Pad for Dining Tables 1902). No one would suggest that these women behaved irresponsibly, since neither the manufacturer nor the public knew about the health dangers posed by asbestos. Ignorance of risk, unless it is intentional, does not constitute irresponsible behavior.

Consider another example. In 1900, heart disease was the fourth leading cause of death in the United States, yet unlike other major diseases, there was remarkably little discussion of it in either the medical journals or the popular press (for an exception, see Groedel 1901). The reason for this relative silence was that, unlike influenza/pneumonia, TB, and gastritis, heart disease had complex etiological origins about which health experts knew little. Contrast this with a late 20th-century study in a medical journal that identified literally hundreds of coronary risk factors (Hopkins and Williams 1981).

Moreover, people cannot be held responsible for contracting illnesses or diseases that they do not even know exist, and in 1900 medical science had not yet identified many diseases familiar to us today. The 1899 Bertillon Classification of Causes of Death, the forerunner of the International Classification of Diseases (ICD), listed 160 categories of diseases, whereas the current ICD contains 999. Some scholars would attribute part of this to the “disease inflation” resulting from the “medicalization” of certain life-cycle and behavioral phenomena. Such “conditions” as menopause, childbirth, hyperactive child behavior, and aging now are viewed, and often treated, as “diseases,” whereas in the past they would have been accepted as natural, and in most instances untreatable, facts of life (Goldstein 1992, 23). Nonetheless, many diseases were not yet known because the technology was not yet available to identify them. According to Leon Eisenberg, the relationship between knowledge and prudent choice in health matters is clear: “The assumption is that in earlier days and in ‘simpler’ societies, individuals and families were ‘free’ to make their own choices. To the contrary, the difference between past and present is a matter of having more information on which to make choices.” Eisenberg also suggested another information-based difference between the two ends of the last century with regard to the relative capacity of individuals to make health-promoting, and avoid health-harming, decisions: “The present differs from the past only in two respects. The pace of change is much more rapid now and the systematic collection of public health information enables us to detect the effect of the changes and to design methods to reverse those which are undesirable” (Eisenberg 1987, 103, 104).

In a real sense, then, more Americans at the end of the last century were better able to make informed choices about how to improve their health and to prevent disease than at the beginning of the century. Closely related to this capacity for choice was the relative role of science in informing and directing those choices.

The Promise and Perils of Science

At the start of the 20th century, educated Americans were almost euphoric about the promise and performance of modern science in general and of medical science in particular. W.T. Sedgwick, a turn-of-the-century public health expert and professor of biology at the Massachusetts Institute of Technology, observed in 1908, “Before 1880 we knew nothing; after 1890 we knew it all; it was a glorious ten years” (quoted in Freymann 1975, 531–2). No germ would escape the reach of scientists (Hendrick 1909/10, 653). Readers of The American Magazine were assured that “within the lifetime of men full-grown to-day, consumption will lose its old terrors, pneumonia will give up its secret, and typhoid will go the way of smallpox and malaria” (Hirshberg 1906, 660). Burton J. Hendrick, a newspaper editor and frequent contributor to popular magazines, was even more optimistic: “Medical science seems pointed fairly toward the goal which half a century ago would have seemed as unattainable as another golden age—the elimination, from civilized society, of all contagious diseases” (Hendrick 1909/10, 653).

At the beginning of the century, the guiding hand of science promised to help middle-class Americans control their own physical destiny. To be credible, health-promoting and disease-preventing advice had to be clothed in the garb of modern science and medicine. Thus, Irving Fisher of Yale University and Eugene Lyman Fisk of the Life Extension Institute provided Americans with “rules for healthful living based on modern science” (Fisher and Fisk 1916, italics added). Throughout the first decade of the 20th century, “men of science” writing in both popular magazines and scientific journals gave their imprimatur to a whole host of health-promoting activities. Readers were urged to stand up for 20 minutes after eating, gradually change from winter- to summer-weight underwear, ride bicycles, thoroughly flush out their colons, eat only raw nuts and fruits, apply cocoa butter or cod liver oil to their breasts, abstain from alcohol, moderate their sexual appetites, fast on a regular basis, and listen to music each day to relieve stress. All this wellness and fitness advice was presented as if it were based on sound scientific analysis.

Whereas Americans at the beginning of the century embraced self-help advice in part because it was linked to modern science and medicine, Americans at the end of the century embraced health promotion through lifestyle modification in part because of their disenchantment with modern science and medicine. Many Americans at the end of the century had become disillusioned with modern medicine's ability to deal with their health concerns. Instead of being the instruments for reducing the risk of death and disease, science and technology—including nuclear energy, automobiles, petrochemicals, and electromagnetic fields—were widely viewed as being the source of many of these risks. Michael S. Goldstein traced Americans' national disenchantment with science and technology to the end of World War II:

The cherished notion that science and technology will inevitably improve our lives has been under attack ever since the development and use of the atomic bomb. More recently, the accidents at Three Mile Island and Chernobyl, as well as the unremitting tide of pollution in our air and water, remind us that even the peaceful use of technology can be dangerous.

(Goldstein 1992, 10)

Americans now view science and medicine as contributing not only to their current health woes but also to their confusion over the nature and cause of these problems. Depending on which scientific study you believe, cellular telephones may or may not cause cancer; alcohol may or may not be good for your heart; sunscreens may or may not protect you from melanomas; margarine may or may not be healthier than butter; and hormone replacement treatment may be either good for women (e.g., it reduces the risk of osteoporosis) or harmful (e.g., it increases the risk of heart disease and breast cancer). The confidence in and certainty about the benefits of medicine that prevailed at the beginning of the last century have been, in the minds of many, replaced with skepticism and even hostility.

At the end of the 20th century, confusion was only part of the problem. Americans also discovered that incompetence and carelessness by hospital personnel are significant causes of death in this country. According to the Institute of Medicine of the National Academy of Sciences, medical errors may be the cause of as many as 98,000 hospital deaths per year (Pear 2000). The empirical evidence of the disenchantment with modern medicine is striking. In a review of Americans' attitudes toward health, Blendon and Benson found that Americans' faith in health care professionals plummeted between 1966 and 2000: the proportion reporting confidence in health providers fell from 73 percent to 44 percent (Blendon and Benson 2001, 39). In addition, in a review of three major national surveys that examined the public's confidence in the medical profession from the mid-1960s to the late 1990s, Schlesinger concluded, “These surveys suggest that over a 30-year period, American medicine went from being perhaps the most trusted to being one of the least trusted social institutions” (2002, 189).

In the late 20th century, many educated Americans were convinced that they could do more than any doctor, hospital, or piece of medical technology to live longer and healthier lives. As a result, they took their health into their own hands, consuming billions of dollars' worth of vitamins and mineral supplements, trying an endless number of weight-loss programs, attending yoga and Pilates sessions, and compulsively exercising. The evidence of the self-help craze is impressive (see Eisenberg et al. 1998). The percentage of Americans reporting that they engaged in some daily activity to keep physically fit increased from 21 percent in 1961 to 67 percent in 1990 and 78 percent in 1995. The Wall Street Journal reported that in 1999, 30 million Americans belonged to health clubs, up from 24 million in 1995, and the Sporting Goods Manufacturers Association announced that sales of sports equipment, apparel, and footwear rose to $46.5 billion in 1999, a 54 percent increase since 1990 (Kulish 2000, R16). Unlike Americans in the early 20th century who accepted responsibility for their own health because medical science showed them how they could, Americans in the late 20th century accepted responsibility for their own health because medical science seemed to have failed them.

The Externalities of Imprudence: National Survival versus Balancing Budgets

What we eat, how much we weigh, with whom and how we engage in sex, whether we smoke cigarettes or abuse alcohol, and whether we exercise obviously affect our personal well-being. These lifestyle decisions and conditions also have social implications. They are, to use the terminology of John Stuart Mill, not merely “self-regarding” but “other-regarding” activities as well. In fact, Americans at both ends of the last century were often reminded by government officials and business leaders of the social consequences of their putative personal recklessness. The two periods differed, however, in the particular externalities that were emphasized.

Few domestic issues so dominated public and private policy debates in the last two decades of the 20th century as the explosive rise in health care costs. Between 1980 and 2001, annual national health care spending rose from $247.3 billion ($1,052 per person, or 8.9 percent of the gross domestic product [GDP]) to $1.424 trillion ($5,035 per person, or 14.1 percent of GDP). Perhaps most important, health care inflation exceeded the overall inflation rate in 17 of the last 20 years of the 20th century. Public policymakers, especially those at the state level, worried that uncontrolled health care costs were preventing them from serving the needs and demands of their citizens in other areas such as education, public safety, mass transit, and environmental protection. Similarly, corporate leaders feared that rapidly rising health care costs (e.g., health insurance) were adding to the price tag of American products and harming their competitiveness in the global marketplace.

It would be incorrect, of course, to conclude that high and rapidly increasing national health care costs were solely the result of irresponsible lifestyle choices. Other factors, including an aging population, overall inflation, costly advances in medical technology, and the very structure of health care financing have all contributed to the problem. Nevertheless, many policymakers agreed in principle and practice with the conclusion of John H. Knowles, former president of the Rockefeller Foundation, that “the greatest portion of our national [health] expenditures goes for caring for the major causes of premature, and therefore preventable, death and disability” (Knowles 1977, 75). Although no comprehensive and accurate price tag can be placed on the costs associated with risky behavior, they are certainly substantial. The Centers for Disease Control, for example, estimated that from 1995 to 1999, cigarette smoking, which was the leading cause of premature death in the United States, resulted in an annual cost of $150 billion in health-related economic losses (National Center for Chronic Disease Prevention and Health Promotion 2002). In addition, the National Institute on Alcohol Abuse and Alcoholism calculated that the economic cost of alcohol abuse in 1998 was $185 billion (National Institute on Alcohol Abuse and Alcoholism 2001).

Although there obviously were additional negative results of harmful personal habits—for example, drunken drivers kill others as well as themselves, and nonsmokers suffer adverse health effects from smokers' cigarette smoke—the focus of the national debate over the consequences of reckless personal choices was the substantial economic burden they imposed on society. As John Knowles put it: “The cost of sloth, gluttony, alcoholic intemperance, reckless driving, sexual frenzy, and smoking is now a national, not an individual responsibility. This is justified as individual freedom—but one man's freedom in health is another man's shackle in taxes and insurance premiums” (Knowles 1977, 59).

At the beginning of the last century, policymakers, academics, and opinion shapers also calculated the economic cost to society of risky behavior. In 1909, for example, an economist at Cornell University estimated that the cost to society from the “overfatigue” of workers due to “carelessness in diet or unnecessary loss of sleep” exceeded $1 billion per year (Loss from Sickness Huge 1909, 3), and some years later the Roosevelt Conservation Commission on National Vitality put the financial loss due to preventable disease and premature mortality at about $1.5 billion per year (reported in Fisher and Fisk 1916, 136). It was not the immediate material costs of individual irresponsibility that most worried public officials and social commentators at the beginning of the century, however, as much as the very survival of the nation itself. The first years of the century were marked by an extraordinary national concern about race deterioration, degradation, and suicide. A fairly characteristic, if somewhat hyperbolic, conclusion was that “the unanimous opinion of all foreign and most native observers is that the American race is degenerating, becoming lank, nervous, dyspeptic, frivolous and immoral” (Hutchinson 1909). Much of this concern, especially on the part of eugenicists, had to do with the “contamination” and “evisceration” of the Anglo-Saxon race as a result of the huge immigration into the United States of “inferior sorts” from central, eastern, and southern Europe. (Between 1900 and 1914 more than 13 million immigrants came to this country.)

The “deterioration” of the race was not merely a function of racial “contamination” and “evisceration.” It also had a great deal to do with the personally irresponsible and, in the case of alcohol abuse and prostitution, immoral behavior of both the immigrant and nonimmigrant populations. No less an early 20th-century public health luminary than Luther H. Gulick warned that “those nations which devoted their leisure time to re-creating health and building up beautiful bodies have tended to survive, while those nations which turned, in their marginal hours, to dissipation have written for us the history of national downfall” (Gulick 1909, 33). In a similar vein, Irving Fisher and Eugene Lyman Fisk connected individual “healthful living” to the future well-being of the nation by reminding Americans that they were “the trustees of the racial germ plasm that we carry” and “that we have no right, through alcoholic or other unhygienic practises, to damage it; but that, on the contrary, we are under the most solemn obligation to keep it up to the highest level” (Fisher and Fisk 1916, 165).

Nowhere was the concern about the early 20th-century relationship between lifestyle and “race suicide” more dramatically drawn than in the area of prostitution and sexually transmitted diseases (STDs). The problem was twofold. First, prostitution and STDs contributed to both the deterioration of the population and the undermining of the family. In self-indulgent and socially irresponsible fashion, men were transmitting STDs from harlots to innocent brides. The result was that “society paid for the neglect in wrecked homes, childless marriages, invalidism, blindness and insanity” (Stokes 1923, 5). A second implication of sexually transmitted diseases was that they threatened not only the nation's social fabric but also its national defense, as STDs undermined the military's ability to recruit a healthy fighting force. Indeed, during World War I, American recruiters had to reject as physically unfit a significant proportion of prospective recruits because of sexually transmitted diseases. In addition, they worried that even those who were free of disease when they entered the armed forces might fall prey to temptation and contract an STD either in training centers in this country or when they went abroad. In 1917 Secretary of the Navy Josephus Daniels urged a medical group to help the military deal with the problem because “we are fighting for the safety of democracy. Victory is jeopardized by the preventable diseases which destroy the fighting strength of armies and navies” (Daniels 1917, 16).

Although an examination of the policy implications of and responses to the negative consequences of lifestyle choices is beyond the scope and purpose of this article, it should be noted that in both the early and late 20th century, governments and citizens intervened to limit the freedom of individuals to make putatively foolish decisions. Examples include forbidding red-light districts adjacent to military training centers; prohibiting the production, sale, and consumption of alcoholic beverages; requiring drivers and passengers to wear seat belts; and barring smoking in most public places. In any case, Americans at both ends of the last century accepted that when their personal choices harmed others, the state had the right to limit their freedom in the name of the social good.

Sacred versus Secular Morality

Health and morality, as Allan Brandt and Paul Rozin noted, have always been “deeply and fundamentally entangled” (1997, 2). For example, moral value systems have historically been enlisted in health promotion campaigns as a way of identifying, legitimizing, and authenticating prescribed and proscribed health-related behaviors. Thus the presence of diseases such as gonorrhea or AIDS may be viewed as divine retribution for sinful behavior, and good health as a gift from God in recognition of living a virtuous and moral life. Wellness movements have always relied on such value systems as both a road map to and a barometer of healthy living. The difference between the early and late 20th century is not in the moralization of health but in the canonical source on which healthy-living advocates relied. It is my view that in the earlier period the text was sacred, and in the later period it was mainly secular.

As historian Robert M. Crunden explained, Progressives were a socially, demographically, and politically polyglot group with one thing in common: “In general they shared moral values and agreed that America needed a spiritual reformation to fulfill God's plan for democracy in the New World” (Crunden 1982, ix). For many Progressive commentators, these values grew largely out of a rather severe, northern European, Protestant background and were clearly evident in the moral indignation that suffused such health promotion campaigns as the “social hygiene movement” or the “purity crusade” against prostitution and sexually transmitted diseases, as well as the prohibitionists' battle against “demon rum.” Two examples illustrate the use of Christian moral values in the campaign to improve the nation's individual and collective health.

The first was the so-called Muscular Christianity movement, which began in the 19th century but took on a more “scientific” form early in the 20th century. This movement emphasized that individuals were obligated as Christians to maintain a healthy body. As one of the movement's proponents explained it:

Round shoulders and narrow chests are states of criminality. The dyspepsia is heresy. The headache is infidelity. It is truly a man's moral duty to have good digestion, and sweet breath, and strong arms, and stalwart legs, and an erect bearing, as it is to read his Bible, or say his prayers, or love his neighbor as himself.

(quoted in Whorton 1982, 281)

Muscular Christianity gained academic legitimacy at the turn of the century when it was incorporated into the emerging, university-based field of physical education. Like other Progressive Era reforms, the teaching and learning of physical education were partly driven by the need to halt the nation's perceived physical and spiritual deterioration. “Indeed,” historian James C. Whorton has observed, “much of the impetus for the advance of physical education derived from the Progressive conviction that American society had degenerated to the point where it was facing a physical and moral emergency” (Whorton 1982, 248). Thus women were urged to exercise to improve the national stock, and young men took required physical education courses to “find a motor outlet for their pent-up feelings and emotions,” particularly the corrupting and sinful temptations of sex, gambling, and alcohol (Sargent 1909/10, 14).

A second Christian-based movement was less concerned with moral and bodily reclamation through physical fitness than it was with personal willpower and Christian faith to overcome disease and illness. This particular effort was the Emmanuel Church movement, an approach to health promotion and disease prevention that combined religion, psychology, and positive thought. The movement's founders, two Boston ministers, preached the power of mind over body. Typical of their reported successes was a “hopeless alcoholic” who, after just a few sessions with Dr. Samuel McComb, one of the movement's founders, never returned to his “evil habits” (Baker 1908/9, 202). As McComb exhorted his readers: “Believe in freedom, in the power to lead a healthy, well-balanced harmonious life, and you will experience what you believe” (McComb 1908, 55).

Early 20th-century advocates of health promotion and disease prevention through lifestyle modification often relied on Christian values to legitimize and strengthen their message. Living a more healthy life was not only in the national interest, it was also the Christian (i.e., “the morally right”) thing to do. Although some people at the end of the century, especially those on the Christian right, also relied on a biblically inspired message, the moral underpinnings of the late-century health movement had a more subtle and secular moral text. Good health, while obviously prized in its own right, came to represent more than merely a state of physical (and perhaps mental) well-being. For many Americans, especially the more affluent, it symbolized a secular state of grace. As such, good health constituted affirmation of a life lived virtuously. In effect, a canon defined those who were the “chosen” people—as well as those who were not. Like the religious leaders of the beginning of the century, the keepers of the canon—the self-appointed moral elite who exhorted others to follow low-fat, high-fiber diets, not to smoke, to exercise regularly, to consume alcohol only in moderation, to practice yoga, and to drink at least eight glasses of designer-label water per day—helped determine Americans' social and economic success.

This new secular morality is part of a long historical tradition in which a priestly class establishes rituals relating to personal habits involving food and drink, sexual relations, personal hygiene and grooming, clothing, birth, and death. The purpose, or at least the interpretation, of these rituals has varied. One traditional view of the Judaic dietary laws outlined in Leviticus 11 and Deuteronomy 14 and advocated by the medieval physician Maimonides was that they were primitive rules of public health and personal hygiene. Hence, it was long assumed that such admonitions and prohibitions dealing with diet (e.g., do not eat pork or shellfish) were based solely on the dangers that these foods posed to human health. More recent interpretations, however, suggest that this analysis is incomplete (Levine 1989, 248). Thus, the biblical scholar Baruch Levine has argued that the abominations of Leviticus are part of an elaborate set of rules and rituals intended to establish the distinctiveness and, hence, holiness of the people of Israel. This, I think, is made clear in Leviticus 20:24–26:

I am the Lord your God: I have made a clear separation between you and the nations, and you are to make a clear separation between clean beasts and unclean beasts and between unclean and clean birds. You must not contaminate yourselves through beast or bird or anything that creeps on the ground, for I have made a clear separation between them and you, declaring them unclean. You must be holy to me, because I the Lord am holy. I have made a clear separation between you and the heathen, that you may belong to me. (Revised English Bible, 1989, italics added)

In effect, the highly detailed rituals and elaborate paraphernalia associated with the late 20th-century wellness movement in the United States serve much the same social purpose as did the biblical dietary and sexual prescriptions and proscriptions that were less public health measures than rules for defining the path to spiritual salvation and distinguishing the ancient Israelites from their neighbors (Leichter 1997). In other words, what emerged at the end of the 20th century was a health and wellness movement that helped set and protect social boundaries by defining acceptable and unacceptable lifestyles. Just as “pre-modern societies patrolled their boundaries with dramatic rituals of inclusion and exclusion” (Turner 1984, 224), so too do modern societies. Referring to cigarette smoking, Sherwin Feinhandler observed: “People tend to evaluate their personal association with others according to whether the others are on the inside or outside of various social boundaries… Tobacco has served to exclude people from, or to distinguish, groups—to maintain boundaries” (Feinhandler 1986, 183).

Health promotion and disease prevention at the beginning and end of the last century carried moral, and implicitly exclusionary, implications for society. The difference between the two periods lay in the nature of the moral authority on which advocates and adherents of each wellness movement relied.

Conclusion

In 1908 a physician, Dr. Pearce Baily, castigated Americans for “too much smoking, too much drinking, too much worrying, too much working” (Baily 1908). Seven decades later another physician, Dr. John Knowles, lit into us for “sloth, gluttony, alcoholic intemperance, reckless driving, sexual frenzy, and smoking” (1977, 59). Americans at both the beginning and the end of the 20th century were assigned much of the responsibility for their own ill health and premature death as the result of some personal weakness. Aside from the empirical question of whether such blame was deserved—although careful analysis at the end of the century leaves little doubt on this score—I believe that such an assignment of responsibility was all but inevitable given the individualistic default in the American political culture, that is, a predisposition to venerate the individual and his or her rights and responsibilities over those of the social, economic, racial, or religious group.

Despite the common ideology of personal responsibility, or victim blaming, that characterized elite views of health issues at both the beginning and end of the 20th century, there were differences in both the substance of the debate and the objective reality in which it took place. I think a case can be made that more people at the end of the century were, and could be, better informed about, more able to participate in, and therefore more accountable for their own physical well-being than those at the beginning of the century. It is ironic that elite evaluations of the perceived efficacy of modern science and medicine became, in quite different ways and for different reasons, the justification for the self-help movements during these two eras.

Finally, I think it is important to underscore the close relationship between morality and health, on the one hand, and individual responsibility, on the other. At the start and conclusion of the 20th century, the failure to follow a responsible lifestyle with regard to one's exercise, diet, alcohol and tobacco use, and sex life carried with it moral opprobrium. While one might argue that the potential costs of irresponsible behavior in the first decades of the century (e.g., eternal damnation, or at least divine disfavor) were theoretically greater than at century's end (e.g., social ostracism), the point is that lifestyle choice took on social meaning and value beyond simply the assessment of a person's health status.

Acknowledgments

The author wishes to thank the Milbank Memorial Fund for its generous support of this project. Thanks also go to Aaron Ray, Jean Caspers, and Carol McCulley for their superb research assistance. Two anonymous readers for this journal made invaluable suggestions. Any errors of fact or analysis that remain are solely those of the author.

References

  1. Adams SH. Guardians of the Public Health. McClure's Magazine. 1908. pp. 241–52. May–October.
  2. Asbestos Pad for Dining Tables. Advertisement. Good Housekeeping. 1902;34:75. January–June.
  3. Baily P. Dr. Pearce Baily Points Out the Signs That Tell of Too Much Work. New York Times. 1908. p. 1. March 20, sec. 5.
  4. Baker RS. The Spiritual Unrest. American Magazine. 1908/9. pp. 192–205. November–April.
  5. Berger SM. Forever Young: 20 Years Younger in 20 Weeks. New York: Morrow; 1989.
  6. Blendon RJ, Benson JM. Americans' Views on Health Policy: A Fifty-Year Historical Perspective. Health Affairs. 2001. pp. 33–46. March–April.
  7. Brandt AM, Rozin P. Introduction. In: Brandt AM, Rozin P, editors. Morality and Health. New York: Routledge; 1997. pp. 1–11.
  8. Butler EB. Women and the Trades: Pittsburgh, 1907–1908. New York: Charities Publication Committee; 1911.
  9. Califano J. America's Health Care Revolution. New York: Random House; 1986.
  10. Crunden RM. Ministers of Reform: The Progressives' Achievement in American Civilization, 1889–1920. New York: Basic Books; 1982.
  11. Daniels J. Men Must Live Straight If They Would Shoot Straight. Washington, D.C.: Navy Department; 1917.
  12. Delgado R. S.F. Schools Join War on Obesity, Ban Junk Food. San Francisco Chronicle. 2003. p. 1. January 15.
  13. Du Bois WEB. The Philadelphia Negro: A Social Study. New York: Schocken Books; 1899/1967.
  14. East Chicago Department of Public Health. Public Health in East Chicago, Indiana: A Study of Life Wastage from Preventable Disease and a Plea for an Adequate Health Department. East Chicago: 1916.
  15. Eisenberg DM, Davis RB, Ettner SL, Appel S, Wilkey S, Van Rompay M, Kessler RC. Trends in Alternative Medicine Use in the United States, 1990–1997. Journal of the American Medical Association. 1998. pp. 1569–75. November 11.
  16. Eisenberg L. Value Conflicts in Social Policies for Promoting Health. In: Doxiadis S, editor. Ethical Dilemmas in Health Promotion. Chichester: Wiley; 1987. pp. 99–116.
  17. Feinhandler SJ. The Social Role of Smoking. In: Tollison RD, editor. Smoking and Society: Toward a More Balanced Assessment. Lexington, Mass.: Lexington Books; 1986. pp. 167–87.
  18. Fisher I, Fisk EL. How to Live: Rules for Healthful Living Based on Modern Science. New York: Funk & Wagnalls; 1916.
  19. Freymann JG. Medicine's Great Schism: Prevention vs. Cure: An Historical Interpretation. Medical Care. 1975. pp. 525–36. July.
  20. Godfrey H. The Food of the City Worker. Atlantic Monthly. 1909. pp. 267–77. January–June.
  21. Goldstein MS. The Health Movement: Promoting Fitness in America. New York: Twayne; 1992.
  22. Green H. Fit for America: Health, Fitness, Sport and American Society. Baltimore: Johns Hopkins University Press; 1986.
  23. Groedel JM. How to Avoid Heart Troubles. Good Housekeeping. 1901. pp. 39–42. January–June.
  24. Gulick LH. Popular Recreation and Public Morality. Annals of the American Academy of Political and Social Science. 1909. p. 33. July–December.
  25. Habitual Pie-Eating Ruining Health of Children. San Francisco Chronicle. 1910. p. 3. April 4.
  26. Hendrick BJ. Some Modern Ideas on Food. McClure's Magazine. 1909/10. pp. 653–69. November–April.
  27. Hirshberg LK. Popular Medical Fallacies. American Magazine. 1906. pp. 655–60. May–October.
  28. Hopkins PN, Williams RR. A Survey of 246 Suggested Coronary Risk Factors. Atherosclerosis. 1981;40:1–52. doi: 10.1016/0021-9150(81)90122-2.
  29. Hutchinson W. Evidence of Race Degeneration. Annals of the American Academy of Political and Social Science. 1909. p. 43. July.
  30. Iglehart JK. From the Editor. Health Affairs. 1990;9(2):4–5. doi: 10.1377/hlthaff.9.3.4.
  31. Knopf SA. Relation of the Medical Profession in the Twentieth Century to the Tuberculosis Problem. Journal of the American Medical Association. 1906. pp. 1679–83. January–June.
  32. Knowles JH. The Responsibility of the Individual. In: Knowles JH, editor. Doing Better and Feeling Worse: Health in the United States. New York: Norton; 1977. pp. 57–80.
  33. Kober GM. Review of Scientific Books. Science. 1901. p. 730. July–December.
  34. Kulish N. Putting a Price on Health. Wall Street Journal. 2000. p. R16. May 1.
  35. Leichter HM. Free to Be Foolish: Politics and Health Promotion in the United States and Great Britain. Princeton, N.J.: Princeton University Press; 1991.
  36. Leichter HM. Lifestyle Correctness and the New Secular Morality. In: Brandt AM, Rozin P, editors. Morality and Health. New York: Routledge; 1997. pp. 359–78.
  37. Levine BA. Leviticus. New York: Jewish Publication Society; 1989.
  38. Loss from Sickness Huge. New York Times. 1909. p. 3. October 22.
  39. McComb S. Heredity and Will Power. Good Housekeeping. 1908. pp. 53–5. January–June.
  40. McGinnis JM, Foege W. Actual Causes of Death in the United States. Journal of the American Medical Association. 1993;270(19):2207–12.
  41. Minkler M. Health Education, Health Promotion and the Open Society: An Historical Perspective. Health Education Quarterly. 1989;16(1):20. doi: 10.1177/109019818901600105.
  42. National Center for Chronic Disease Prevention and Health Promotion. MMWR—Annual Smoking Attributable Mortality, Years of Potential Life Lost, and Economic Costs—United States, 1995–1999. 2002. Available at http://cdc.gov/tobacco/research (accessed June 30, 2003).
  43. National Institute on Alcohol Abuse and Alcoholism. Alcohol Alert. 2001. (January). Available at http://www.niaa.nih.gov/pulicationsaa51.htm (accessed June 30, 2003).
  44. O'Neill M. As the Rich Get Leaner, the Poor Get French Fries. New York Times. 1992. pp. C1–C6. March 18.
  45. Patterson JT. America's Struggle against Poverty in the Twentieth Century. Cambridge, Mass.: Harvard University Press; 2000.
  46. Pear R. Clinton to Order Steps to Reduce Medical Mistakes. New York Times. 2000. p. A15. February 22.
  47. Pear R. Emphasize Disease Prevention, Health Secretary Tells Insurers. New York Times. 2002. p. A14. January 22.
  48. Resek C. Introduction. In: Resek C, editor. The Progressives. Indianapolis: Bobbs-Merrill; 1967. pp. xi–xxxiii.
  49. Rosner D, Markowitz G. Deadly Dust: Silicosis and the Politics of Occupational Disease in Twentieth-Century America. Princeton, N.J.: Princeton University Press; 1991.
  50. Rothman SM. Living in the Shadow of Death: Tuberculosis and the Social Experience of Illness in American History. Baltimore: Johns Hopkins University Press; 1994.
  51. Rucker ST. The Strenuous Life and Its Effects in Disease. Journal of the American Medical Association. 1906. pp. 1839–40. January–June.
  52. Sargent DA. The Future of Physical Education. McClure's Magazine. 1909/10. pp. 14–20. October–April.
  53. Schlesinger M. A Loss of Faith: The Sources of Reduced Political Legitimacy for the American Medical Profession. Milbank Quarterly. 2002;80(2):185–235. doi: 10.1111/1468-0009.t01-1-00010.
  54. Schlosser E. Fast Food Nation: The Dark Side of the All-American Meal. Boston: Houghton Mifflin; 2001.
  55. Sedgwick WT. The Call to Public Health. Science. 1908. pp. 193–202. July–December.
  56. Sinclair U. The Jungle. New York: Vanguard Press; 1906.
  57. Stokes JH. Today's World Problem in Disease Prevention. Ottawa: F.A. Acland; 1923.
  58. Tomes N. The Gospel of Germs: Men, Women and the Microbe in American Life. Cambridge, Mass.: Harvard University Press; 1998.
  59. Turner BS. The Body and Society: Explorations in Social Theory. Oxford: Blackwell; 1984.
  60. U.S. Department of Health, Education and Welfare. Healthy People: The Surgeon General's Report on Health Promotion and Disease Prevention. Washington, D.C.: U.S. Government Printing Office; 1979.
  61. Whorton JC. Crusaders for Fitness: The History of American Health Reformers. Princeton, N.J.: Princeton University Press; 1982.
  62. Williams WA. The Contours of American History. New York: New Viewpoints; 1973.
