Abstract
“Undone science” refers to areas of research that are left unfunded, incomplete, or generally ignored but that social movements or civil society organizations often identify as worthy of more research. This study mobilizes four recent studies to further elaborate the concept of undone science as it relates to the political construction of research agendas. Using these cases, we develop the argument that undone science is part of a broader politics of knowledge, wherein multiple and competing groups struggle over the construction and implementation of alternative research agendas. Overall, the study demonstrates the analytic potential of the concept of undone science to deepen understanding of the systematic nonproduction of knowledge in the institutional matrix of state, industry, and social movements that is characteristic of recent calls for a “new political sociology of science.”
Keywords: undone science, research agendas, social movements, environmental health, science policy
Since the 1980s, the modern university has undergone a well-recognized diversification from publicly funded research to an increasing emphasis on private funding sources, technology transfer, and economic competitiveness (e.g., Kleinman and Vallas 2001; Slaughter and Rhoades 2004). A corresponding diversification in science and technology studies (STS) has led to renewed attention to the role of extrainstitutional factors such as states, industries, and social movements in the shaping of scientific research fields and technological design choices (Klein and Kleinman 2002; Frickel and Moore 2006a, 2006b). Among the changes that this “new political sociology of science” brings to STS is a shift of attention from the microsociological accounts of how knowledge and technologies are constructed to the mesosociological and macrosociological political and institutional organization of scientific knowledge and science policy. Here, analytical concern centers on distributional inequalities in technoscience and the ways that formal and informal manifestations of power, access to resources, relations among organizations, and procedures for rule making create losers as well as winners and explain both institutional stasis and change. For example, why does science pay dividends more often to some groups than to others? What explains the selection of certain areas of scientific research and technological design choices and the neglect of others? This shift in focus to the institutional politics of knowledge and innovation brings into sharper relief the problem of “undone science,” that is, areas of research identified by social movements and other civil society organizations as having potentially broad social benefit that are left unfunded, incomplete, or generally ignored.
This article brings together four recent studies to elaborate the concept of undone science and move forward the more general project of a political sociological approach to the problem of research priorities and scientific ignorance. Three of the four studies cluster in the area of environmental science and technology: the development of alternatives to chlorinated chemicals, better understanding of toxic exposure to air pollution through alternative air monitoring devices, and the environmental etiology of cancer. The fourth study is based on interviews with scientists from a wide range of academic disciplines about forbidden knowledge. Taken together, the research demonstrates the analytic potential of undone science to extend and deepen the new political sociology of science by providing a political sociological perspective on the problem of research agendas and more general issues of the construction of knowledge and ignorance. We begin with a brief review of the existing literature. Our discussion highlights some of the basic contours that the case studies reveal about undone science and that in turn can guide future research.
1. Background
The concept of undone science locates the systematic nonproduction of knowledge in the institutional matrix of governments, industries, and social movements characteristic of the political sociology of science. Specifically, Hess (2007) has been concerned with the absences of knowledge that could have helped a social movement or other civil society organization to mobilize the intellectual resources needed to confront an industrial and/or political elite that, from the perspective of the challenging organization, is supporting policies that are not broadly beneficial, either to the general society and environment or to the historically disempowered groups (Woodhouse et al. 2002; Hess 2007). Because elites set agendas for both public and private funding sources, and because scientific research is increasingly complex, technology-laden, and expensive, there is a systematic tendency for knowledge production to rest on the cultural assumptions and material interests of privileged groups. However, it is only a tendency. Given the opportunities created by a diversity of funding sources, divisions among elites, differences among social movement organizations, the limited and partial autonomy of the scientific field (Bourdieu 2004), and the potential for some research projects to be completed without extramural funding, there is some room for research that supports social movement perspectives on research agendas, even when the research conflicts with the interests of elites. Nevertheless, because research fields themselves are constituted by agonistic relations between dominant and nondominant networks, even when “undone science” is completed, the knowledge may become stigmatized and the credibility and standing of scientists who produce it may suffer (Hess 2007).
Contemporary discussions of undone science have various precedents. In some ways, Marx’s critique of political economy and his effort to develop an alternative research field of Marxist political economy was an early exploration of undone science, in that Marx both critiqued the assumptions of mainstream economics and developed a framework for alternatives within the field (Marx 1967). In a similar vein, feminist research and multicultural science studies have highlighted the systematic lack of attention paid to gender, race, and related issues in science. Feminist research has also described how gender-laden assumptions shape the development of research programs and, like Marxist scholarship, has proposed alternative research frameworks and programs (e.g., Haraway 1989; Harding 1998; Forsythe 2001).
Historical research highlights the institutional constraints on completing undone science. Of particular relevance to the new political sociology of science is the study of how the contours of entire disciplines or research programs have been shaped by military and industrial funding priorities, and consequently how some subfields have been left to wither on the vine while others have been well tended by government and industrial funding sources (e.g., Noble 1977; Forman 1987; Markowitz and Rosner 2002). Historians and others have also offered detailed investigations of the dynamics of intellectual suppression and purposeful policy decisions to avoid some areas of research, usually research that would challenge powerful industrial interests (MacKenzie and Spinardi 1995; Zavestoski et al. 2002; Martin 2007). In the emerging literature on the social production of ignorance, or what some historians have called “agnotology” (Proctor and Schiebinger 2008), additional studies of particular relevance examine the industrial funding of contrarian research to generate a public controversy and scientific dissensus (Proctor 1995), the role of the government and industry in rendering knowledge invisible by producing classified knowledge and trade secrets (Galison 2004), and problems of imperceptibility for chemically exposed groups (Murphy 2006).
Functionalist and constructivist sociologies of science have also contributed indirectly to the understanding of undone science, primarily through discussions of the epistemic status of ignorance and uncertainty. Merton (1987) identified “specified ignorance” as researchers’ explicit recognition of what is not yet known but deserves further inquiry. Zuckerman (1978) also noted that theoretical commitments, or what Kuhnians would call “paradigms,” could result in decisions by scientists to characterize some areas of specified ignorance as not worth studying. The sociology of scientific knowledge also examined the role of uncertainty and interpretive flexibility in the generation and resolution of controversies, both within the scientific field and in broader public fora (e.g., Collins 1985, 2002). In critical analyses of risk assessment and statistical analysis, STS scholars have also brought out the unanticipated consequences of broader forms of ignorance that are not considered within the horizon of standard risk assessment practices (Hoffmann-Riem and Wynne 2002; Levidow 2002). Sociologists have also examined the production of the “unknowable,” as occurred when claims were made that an accurate count of ballots for the 2000 U.S. presidential election was impossible (Hilgartner 2001), and “regulatory knowledge gaps,” which are among the unintended consequences of the U.S. Environmental Protection Agency’s (EPA) environmental testing program in New Orleans following Hurricane Katrina (Frickel 2008; Frickel and Vincent 2007). Gross (2007, 2009) has drawn on the general sociology of ignorance to distinguish various forms of scientific ignorance, including nonknowledge, or known unknowns that are considered worth pursuing; negative knowledge, or knowledge deemed dangerous or not worth pursuing; and “nescience,” or a lack of knowledge about the unknown, a form of ignorance that is a precondition for a surprise because it is an unknown unknown.1 In Gross’s terms, undone science is a type of nonknowledge when viewed from the perspective of social movements, but from the perspective of some research communities and elites, it may be viewed as negative knowledge.
In an effort to map in more detail the concept of undone science, this study summarizes four research projects. The four studies are based primarily on semistructured interviews and/or participant-observation, which are appropriate methodological choices given the exploratory nature of the research and the need, at this stage, to understand the dimensions and features of undone science. The following sections summarize how each of these four independently designed research projects encountered the phenomenon of undone science. Because social movement and other civil society organizations have frequently encountered a deficit of research on health and environmental risks associated with exposure to industrial pollutants, it is not surprising that three of the cases considered here focus on the health and environmental sciences. The question of generalizability across various scientific research fields cannot be resolved in this study; our goal is the preliminary one of mapping and exploring undone science.
2. Regulatory Paradigms, Dyads, and the Undoable
Howard’s research on the “chlorine sunset” controversy is based on interviews and document analysis. He conducted twenty-seven semistructured interviews, lasting an hour on average, with staff members of federal regulatory agencies in the United States and Canada, staff members of the International Joint Commission (IJC), members of the Great Lakes Science Advisory Board, staff members or individuals otherwise associated with nongovernmental organizations (NGOs), academic or governmental members of the industrial ecology or green chemistry communities, and industrial chemists in industry and academia. A number of transcripts were supplemented with additional information from follow-up correspondence. Documents analyzed included (1) reports, press releases, Web documents, and other materials published by NGOs, the chemical industry, and federal agencies; (2) articles and commentaries in newspapers and popular and trade magazines; (3) research articles and commentaries in scholarly anthologies and peer-reviewed scholarly journals; (4) books written by key actors; and (5) transcripts of Congressional testimony.
A little-studied controversy involving one of the major branches of industrial chemistry documents a striking example of undone science and illustrates the role it can play in structuring conflict between competing regulatory paradigms. Much of the controversy has centered on the Great Lakes region, where extensive chemical manufacturing and contamination have occurred; where scientists have documented threats to wildlife and humans from persistent, toxic, industrial chlorinated pollutants; where extensive citizen activism has emerged around this threat; and where a quasigovernmental advisory body has assumed a leadership role in addressing this concern (Botts et al. 2001). A number of environmental and health advocates have argued, based both on fundamental toxicology and on long historical experience with chlorinated synthetic chemicals (e.g., DDT and PCBs), that the entire class of thousands of such substances should be tentatively presumed dangerous and that the chemical industry accordingly should wean itself from most major uses of chlorine (Thornton 1991, 2000; International Joint Commission [IJC] 1992; see Howard 2004). The analysis offered here briefly considers the character and function of undone science in the debate provoked by proposals for a “chlorine sunset.”
The chlorine sunset controversy revolves around conflict between two sharply contrasting regulatory paradigms: risk and precaution (Thornton 2000; Howard 2004). The powerful chemical industry has coevolved with, supports, and is supported by the dominant U.S. and Canadian environmental regulatory regime, which restricts chemical industry decision making only to the extent that detailed calculation of risk indicts individual chemical substances. Meanwhile, Greenpeace, a marginalized, reputedly radical environmental NGO, and the IJC, a prominent but marginalized binational advisory organization, argued for a regulatory regime based on the precautionary principle (see Tickner 2003), which in their view justified governmental action against an entire class of industrial chemicals. The dominant paradigm assumes the unit of analysis to be the individual substance and places the burden of proof on the public to prove harm; in contrast, the challenger paradigm allows, even requires, the primary unit of analysis to be the entire class of substances and places the burden of proof on corporate officials. Within this matrix of political and epistemological conflict, the political economy and political sociology of undone science can be seen to revolve around a series of three dyads, each paradigm implying parallel formulations of “done science” and undone science. The three dyads are summarized in Table 1.
Table 1. Dyads of Done, Undone, and Undoable Chlorine Science in Dominant and Challenger Paradigms.

| Regulatory Paradigm | What Is Done or Would Be Done? | What Remains Undone? |
| --- | --- | --- |
| Risk (dominant) | Ad hoc identification of unsafe chlorine chemicals (explicit role for government) | Systematic identification of unsafe chlorine chemicals (implicit role for government) |
| Risk (dominant) | Systematic development of chlorine chemicals (explicit role for industry) | Systematic development of nonchlorine alternatives (implicit role for government) |
| Precaution (challenger) | Systematic development of nonchlorine alternatives (explicit role for industry) | Ad hoc identification of essential and safe chlorine chemicals (explicit role for industry) |
One dyad appears in the context of health impacts research. Industry and federal officials operating in the risk paradigm hold that the legitimate goal of health impacts research performed or mandated by government is ad hoc identification of individual chlorinated chemicals that cannot be safely manufactured and used. In this paradigm, chlorine chemistry itself is seen as immune to fundamental interrogation; the role of public science is limited to documenting the odd substance that can be definitively proven harmful and, on that basis, restricted. “We’ve made the point over and over again that you have to look at each product’s physical and chemical characteristics to draw conclusions about what it is going to do in the environment,” argued Brad Lienhart, of the Chlorine Chemistry Council. To do otherwise would be to “[make] non-science—or nonsense—into science” (quoted in Sheridan 1994, 50).
Beginning in the early 1990s, “sunset” proponents vigorously argued that such research is incapable of preventing a long series of chlorinated “Pandora’s poisons” from entering the environment and human tissues long before their deleterious effects are documented. Inevitably remaining undone, they argued, is science capable of systematically identifying unsafe chemicals from among tens, perhaps hundreds, of thousands of chlorinated industrial substances, by-products, and breakdown products, a scope of research that the risk paradigm is sometimes assumed to provide but, owing to the sheer enormity of the undertaking, cannot. The government’s effort to identify unsafe chlorinated chemicals is ad hoc precisely because it cannot, in any meaningful sense, be systematic; not only are available resources insufficient, but the enterprise is technically infeasible. Viewed in this light, the science is undoable. The IJC argued:
There is a growing body of evidence that [suggests that] these compounds are at best foreign to maintaining ecosystem integrity and quite probably persistent and toxic and harmful to health. They are produced in conjunction with proven persistent toxic substances. In practice, the mix and exact nature of these various compounds cannot be precisely predicted or controlled in production processes. Thus, it is prudent, sensible and indeed necessary to treat these substances as a class rather than as a series of isolated, individual chemicals. (IJC 1992, 29)
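The scale argument behind this claim of undoability lends itself to a rough back-of-the-envelope calculation. The sketch below is illustrative only: the substance counts, assessment rates, and laboratory numbers are hypothetical assumptions introduced for this example, not figures from the chlorine sunset debate, but they show how chemical-by-chemical assessment runs out of time and resources once the unit of analysis becomes the whole class and its mixtures.

```python
# Back-of-envelope sketch of the "undoable science" scale argument.
# All figures below are hypothetical assumptions introduced for illustration;
# they are not data from the chlorine sunset controversy.

from math import comb


def years_to_assess(n_substances: int,
                    assessments_per_lab_per_year: float,
                    n_labs: int) -> float:
    """Rough years needed to complete one full risk assessment per substance."""
    assessments_per_year = assessments_per_lab_per_year * n_labs
    return n_substances / assessments_per_year


# Hypothetical inputs: tens of thousands of chlorinated substances, a few full
# assessments per laboratory per year, and a fixed pool of qualified labs.
print(years_to_assess(n_substances=70_000,
                      assessments_per_lab_per_year=2,
                      n_labs=50))          # -> 700.0 years

# Substance-by-substance assessment also ignores mixtures: even pairwise
# combinations of 70,000 hypothetical substances run into the billions.
print(comb(70_000, 2))                     # -> 2,449,965,000 pairs
```

Under any plausible choice of such inputs, the conclusion is the same one the sunset proponents drew: assessment of individual substances cannot, even in principle, keep pace with the class as a whole, which is why they characterize the dominant paradigm’s promise of systematic identification as undoable rather than merely undone.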
A second dyad appears in the risk paradigm’s stance on innovation. Industry has systematically pursued the development of chlorine chemistry, developing chlorinated chemicals and expanding markets for them; meanwhile, advocates of chlorine precaution have pointed to the need to systematically develop nonchlorine alternatives. This is in part science that the risk paradigm has long left undone—historical research and development trajectories that could have led to a wider range of nonchlorine chemicals and processes being available today. The implication of the historical analysis offered by a leading sunset proponent (Thornton 2000; see also Stringer and Johnston 2001) is that over the past century the technological, economic, and political momentum of chlorine chemistry has to some extent bent the overall industry research and development agenda toward chlorine and away from nonchlorine alternatives. Here undone science consists of a body of nonchlorine chemicals and processes that might now exist but for the long dominance of research and development predicated on chlorine. It is a point seemingly acknowledged by a confidential IJC informant who did not support the commission’s sunset recommendation: “There’s no reason why we couldn’t, as a global society, live a non-chlorine lifestyle. It’s just, you know <laughs>, that ain’t gonna happen, because that is not our history! We’re kind of, in a way, captives of our past.”
In the risk paradigm, with its laissez-faire orientation, such research and development need not be undertaken by the industry but instead is tacitly left to whichever agency or organization might care to undertake it. Viewed from the vantage point of the industry, with its adamantine conception of chlorine chemistry as technologically and economically inevitable, the only conceivable motivation for conducting such research and development would be some kind of ideological fetish (see, e.g., Chlorine Chemistry Council n.d.). It would represent “a veiled attempt to return to a pre-industrial Eden,” one industry supporter suggested (Amato 1993). Crucially, although this agenda would have been and would now be technically feasible, such research would be hobbled by the absence of a sizable cadre of technoscientists devoted to the project and by a lack of financial resources to sustain the effort.
A third dyad occurs within the challenger, precautionary paradigm and directly counters the values and priorities of the dominant paradigm’s dyads. Paired with precaution advocates’ assertion of the need for research to systematically develop nonchlorine alternatives—here seen as industry’s responsibility rather than the public’s—is an explicit assertion that industry should assume the burden of making the case for any specific chlorinated chemicals (or chemical processes) that can be demonstrated to be both essential (i.e., nonsubstitutable) and capable of being manufactured and used in ways that (to some as yet unstated standard) pose no significant environmental hazard. Industry’s motivation for undertaking this latter effort would, of course, be profit. And owing to the presumably quite limited number of substances to be evaluated, it would be both technically feasible and, given the industry’s substantial financial and technical resources, affordable.
The chlorine sunset controversy is now effectively dormant. In the face of bitter industry resistance and U.S. and Canadian governmental intransigence, the IJC and Greenpeace ceased promoting their sunset recommendations in the mid-1990s (Howard 2004). Thornton’s book, which appeared in 2000, reawakened (and in significant ways deepened) the debate, but it did so only briefly. The sunset proposals have not visibly shifted policy at any level in North America. A major international treaty on persistent organic pollutants signed in 2001 represented an important victory for activists, but it also underscored the lingering, unresolved character of the chlorine debate: all twelve of the “dirty dozen” substances it required to be phased out are chlorinated compounds, and each was targeted on the basis of its discrete, well-documented characteristics. Meanwhile, thousands of far less extensively studied chlorinated chemicals—and chlorine chemistry as a whole—remain unregulated.
This analysis of the chlorine sunset controversy illustrates how regulatory regimes influence the construction and articulation of research priorities. In this case, advocates of the risk and precaution paradigms, on the basis of competing understandings of the appropriate unit of regulatory analysis and appropriate regulatory burden of proof, promote competing conceptualizations of science both done and undone. More specifically, the case suggests that done and undone science in such a controversy can be understood as occurring in dyadic pairs and that a major role for challenger discourses is making the implicit undone portion of dyads within the dominant paradigm visible and explicit. This analysis also highlights an important category of undone science in technoscience controversies—undoable science—that improves understanding of how regulatory regimes constrain the identification of undone science. Here, close examination of precautionary advocates’ critique of the risk paradigm clarifies the process through which conventional regulatory structures veil undoable science in the form of systematic research for which insufficient resources and insufficient technical means are available.
3. Standards as Solutions to and Sources of Undone Science
Ottinger’s research on community-based air monitoring as a strategy for producing knowledge about environmental health hazards is based primarily on participant-observation in two environmental justice NGOs: Communities for a Better Environment (CBE) in Oakland, California, and the Louisiana Bucket Brigade in New Orleans, Louisiana (Ottinger 2005). As part of her ethnographic fieldwork, she devoted ten hours per week as a technical volunteer (Ottinger has a background in engineering) for each organization during two consecutive years between 2001 and 2003. At both organizations, her participation involved researching a variety of air monitoring strategies and developing tools for interpreting results from those methods. Her study is also informed by semistructured interviews of one to two hours each. She interviewed thirteen scientist-activists, community organizers, and community residents in California and more than forty activists, regulators, and petrochemical industry representatives in Louisiana. The interviews addressed organizing and community-industry relations, broadly defined, and frequently touched on issues related to ambient air monitoring techniques, with about one-third taking air monitoring as a primary theme.
The case of community-friendly air monitoring involves issues of undone science and regulatory politics similar to those discussed for the chlorine controversy, but at a grassroots, community level. In communities adjacent to refineries, power plants, and other hazardous facilities, known as “fenceline communities,” residents suspect that facilities’ emissions of toxic chemicals cause serious illnesses. However, there is a dearth of scientific research that could illuminate, in ways credible to residents, the effects of industrial emissions on community health (Tesh 2000; Allen 2003; Mayer and Overdevest 2007). The use of air sampling devices known as “buckets” provides one avenue for addressing issues of undone environmental health science. With the low-cost, easy-to-operate devices, fenceline community residents and allied environmental justice organizers measure concentrations of toxic chemicals in the ambient air, collecting data about residents’ exposures that are necessary (though not sufficient) for understanding chemical health effects. Designed in 1994 by a California engineering firm and adapted for widespread dissemination by the Oakland-based nonprofit CBE, the buckets “grab” samples of air over a period of minutes. By taking short samples, buckets can document chemical concentrations during periods when air quality is apparently at its worst—when a facility is flaring or has had an accident, for example—providing otherwise unavailable information about residents’ exposures during pollution peaks.
Both activists’ strategies for air monitoring and experts’ responses to activist monitoring are significantly shaped by agreed-upon procedures for collecting and analyzing air samples and interpreting their results. When measuring levels of toxic chemicals in the ambient air, regulatory agencies and chemical facilities routinely use stainless steel Summa canisters to collect samples, which are then analyzed using a method specified in the Federal Register as Federal Reference Method (FRM) TO-15. Although the canisters can be used to take short-term samples, when regulators want to represent air quality broadly, samples are taken over a twenty-four-hour period every sixth day. Where they exist, regulatory standards for air quality form the context for interpreting the results. Louisiana, one of only two U.S. states with ambient air standards for the individual volatile organic chemicals measured by FRM TO-15, specifies eight-hour or annual averages that ambient concentrations are not to exceed; monitoring data are compared to these standards to determine whether air quality poses a potential threat to public health.2
Specifying how air toxics data are to be collected and interpreted, these formal (e.g., FRM TO-15) and informal (e.g., the twenty-four-hour, sixth day sampling protocol) standards shape how bucket data are received by regulatory scientists and chemical industry officials. First, they act as a boundary-bridging device; that is, the standards help to render activists’ scientific efforts recognizable in expert discourses about air quality and monitoring.3 Although activists and experts collect their samples with different devices—buckets for activists, Summa canisters for experts—both strategies rely on air sampling to characterize air quality and both use FRM TO-15 to analyze the samples. The shared analytical method makes the results of individual bucket samples directly comparable to those of canister samples. Moreover, because activists use the FRM, an EPA laboratory in California was able to conduct quality assurance testing early in the bucket’s development, allowing activists to refute charges that chemicals found in bucket samples were somehow an artifact of the sampling device and to claim, more generally, that the bucket was an “EPA-approved” monitoring method.
To the extent that the standards, particularly the FRM, serve a boundary-bridging function, they help undone science get done: they allow data from an alternate method of measuring air quality, bucket monitoring, to circulate with some credibility among experts and, consequently, to address questions of pressing concern to community members but hitherto ignored by experts. Activists’ monitoring with buckets has even prompted experts to undertake additional monitoring of their own. For example, in Norco, Louisiana, where resident-activists used buckets to document very high concentrations of toxic compounds in their neighborhood, Shell Chemical in 2002 began an extensive ambient air monitoring program (Swerczek 2000).4
Simultaneously, however, standards for air monitoring serve a boundary-policing function: the same suite of regulatory standards and routinized practices that give buckets a measure of credibility also give industrial facilities and environmental agencies a ready-made way to dismiss bucket data. Specifically, ambient air standards are typically expressed as averages over a period of hours, days, or years.5 Bucket data, in contrast, characterize average chemical concentrations over a period of minutes. Environmental justice activists nonetheless compare results of individual samples to the regulatory standard—asserting, for example, that a 2001 sample taken near the Orion oil refinery in New Sarpy, Louisiana, showed that “the amount of benzene in the air that day was 29 times the legal limit” (Louisiana Bucket Brigade 2001)—but experts vehemently reject such claims. In a 2002 interview, Jim Hazlett, part of the Air Quality Assessment division of the Louisiana Department of Environmental Quality, complained about activists’ inaccurate use of bucket data:
You can’t really take that data and apply it to an ambient air standard . . . . So we see a headline, the citizen group over here found a, took a sample and found benzene that was 12 times the state standards. Well, it’s not true. I’m sorry, but that’s not what it was.
In the view of Hazlett and other experts, only the average concentrations of regulated chemicals can be meaningfully compared to the standards and thus contribute to determining whether air pollution might pose a threat to human health.
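The disagreement over averaging times can be made concrete with a minimal sketch. The figures below are hypothetical assumptions, not Louisiana measurements or actual state standards: the point is only that a minutes-long grab sample fixes the concentration during a peak but, by itself, leaves the rest of the averaging period unknown, which is why activists and regulators can draw opposite conclusions from the same number.

```python
# Minimal sketch of the averaging-time mismatch between a short "bucket" grab
# sample and an average-based ambient air standard. All numbers are
# hypothetical assumptions for illustration, not Louisiana measurements or
# actual regulatory limits.

def time_weighted_average(samples: list[tuple[float, float]],
                          period_minutes: float) -> float:
    """Average concentration over the full period, given (minutes, ppb) samples.
    Time not covered by a sample is assumed here to sit at a background of 0."""
    measured = sum(minutes * ppb for minutes, ppb in samples)
    return measured / period_minutes


STANDARD_PPB = 10.0      # hypothetical 8-hour (480-minute) average standard
PERIOD_MINUTES = 480.0

# A hypothetical 5-minute grab sample of 90 ppb taken during a flaring event,
# first assuming clean air for the rest of the period, then assuming a
# sustained 12 ppb for the remaining 475 minutes.
peak_only = [(5.0, 90.0)]
sustained = [(5.0, 90.0), (475.0, 12.0)]

for label, samples in [("peak only", peak_only), ("peak plus sustained", sustained)]:
    avg = time_weighted_average(samples, PERIOD_MINUTES)
    print(label, round(avg, 2), "ppb; exceeds assumed standard:", avg > STANDARD_PPB)
# -> peak only 0.94 ppb; exceeds assumed standard: False
# -> peak plus sustained 12.81 ppb; exceeds assumed standard: True
```

The grab sample alone is consistent with either outcome relative to the assumed standard; what happens during the unmonitored remainder of the period decides it, and that is precisely the information that routine fenceline monitoring does not collect.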
Ambient air standards, and the average-oriented air sampling protocols that they require, thus prove to be a mechanism for policing the boundary between activists’ and experts’ claims about air quality, marking experts’ data as relevant and activists’ data as irrelevant to the assessment of overall air quality, to the determination of regulatory compliance, and to discussions of chemical plants’ long-term health effects. As boundary-policing devices, standards circumscribe activists’ contributions to doing undone science. To the extent that bucket monitoring has resulted in increased enforcement activity by regulators (O’Rourke and Macey 2003) or additional ambient air monitoring by industrial facilities, the additional monitoring has been undertaken to confirm activists’ results, track the causes of the chemical emissions, and fix what are assumed to be isolated malfunctions but usually not to query the possibility that routine industrial operations might pose systematic threats to community health. Even Shell’s program in Norco, which collects rare data on chemical concentrations in a fenceline community, is oriented to long-term averages and thus does not shed light on the potential effects of the pollution spikes that occur with regularity as a result of flaring and other unplanned releases.
As in the chlorine sunset controversy case, the example of bucket monitoring demonstrates how regulatory systems shape conflicts over undone science, even at the local level of community-based research and activism. In this instance, efforts by neighborhood activists (and other outsiders to science) to see undone science done in their own backyards illustrate the asymmetrical operation of regulatory standards and standardized practices. Air monitoring standards function as boundary-bridging devices that enable activist use of an alternative, more cost-effective method and therefore help address an aspect of environmental health science left undone by experts. However, standards also serve as boundary-policing devices. These reinforce experts’ authority to define how health risks in fenceline communities should be evaluated, shutting down debates over fundamental research questions and associated methodological approaches—debates, for example, over whether average or peak concentrations of air toxics are most relevant to determining their health effects. Because it is exactly these debates that activists would, and must, provoke to shift scientific research priorities, the standards’ boundary-policing aspect tends to dominate most locally organized attempts to address undone science. At the same time, this case also illustrates the importance of standards’ boundary-bridging aspects that enable community activists to actually and forcefully enact shifts in research priorities, rather than merely advocate for alternative scientific agendas.
4. Diversity Within Movements and Research Fields
Gibbon’s research is based on ethnographic fieldwork, ongoing since 1999, that examines the social and cultural context of developments in breast cancer genetics in the United Kingdom. The larger study addresses how the knowledge and technologies associated with breast cancer genetics are put to work inside and outside clinical settings, at the interface with a culture of breast cancer activism (see Gibbon 2007). The discussion presented here draws on fieldwork conducted in a leading high-profile U.K. breast cancer research charity between 1999 and 2001 and again in 2005–2006. The fieldwork involved the analysis of promotional documents produced by the organization, participant-observation of a range of events, and more than forty-five in-depth semistructured interviews and five focus groups with the organization’s fundraisers, advocates, scientists, and staff.
Given the exponential growth in lay/patient and public activism in relation to breast cancer in the last twenty to thirty years (Klawiter 2004; Gibbon 2007), this would seem to be an arena where we might expect to see challenges related to undone science. In one sense, the rapid expansion in breast cancer activism has done much to reduce the space of undone science in breast cancer. Like AIDS activism in the 1990s, so-called breast cancer activism is often held up as an exemplary instance of successful collective lay/public/patient mobilization that has helped to raise awareness of the disease, promote a discourse of female rights, and redress gendered inequities in scientific research and health provision (e.g., Anglin 1997; Lerner 2003). From this perspective, breast cancer activism would seem to be a clear example of epistemic modernization, in which research agendas may be opened up to the scrutiny of lay/patient/public communities (Hess 2007).
Yet paradoxes abound in an arena where growing collective awareness of the disease also helps ensure that the management of risk and danger is the burden of individual women (Kaufert 1998; Fosket 2004; Klawiter 2004). The situation reflects what Zavestoski et al. (2004) have referred to as the “dominant epidemiological paradigm” of breast cancer, one that strongly informs the parameters of scientific research and medical intervention by focusing on lifestyle and/or the genetic factors of individuals and that has engendered some resistance from civil society groups. In the United States, for example, recent lobbying efforts to draw attention to alternative strategies for breast cancer have involved collaborations between specific cultures of breast cancer activism and broader environmental justice movements (Di Chiro 2008) in pursuit of what Brown and colleagues term a “lab of one’s own” (2006). Nevertheless, breast cancer activism is characterized by diverse cultures, and consequently, the issue of undone science is also disjunctured and differentiated within national arenas and across international ones. Despite the growth of health activism around breast cancer research, environmental risk factors in breast cancer etiology remain one domain of undone science that continues to be marginalized in mainstream discourse.
The particular institutional parameters that serve to sustain the space of undone science in breast cancer are illustrated by examining the predominant culture of patient and public activism in the United Kingdom. In this context, understanding how breast cancer activism operates to preserve undone science requires paying attention not only to the marginalization of environment-focused breast cancer activism (Potts 2004) but also to an institutionalized culture of cancer research, where breast cancer activism can reference and symbolize quite different activities (Gibbon 2007). Since the early part of the twentieth century, cancer research in the United Kingdom has been rooted in an institutional culture of first philanthropic donation and then charitable fundraising, helping to ensure a public mandate that influences patterns of research in cancer science (see Austoker 1988). Like earlier public mobilization around the so-called wars on tuberculosis and polio, the “war” fought by the cancer charity establishment in the United Kingdom has proved not only a resilient cultural metaphor (Sontag 1988) but also a reflection of ongoing public support and investment in cancer research. As a result, cancer research in the United Kingdom is mostly sustained as a modernist project waged by a scientific community, focused on a cure (Löwy 1997) and supported by cancer charities that are funded significantly by public resources in the form of voluntary donations.
The influences of this project on undone breast cancer science are visible within a high-profile breast cancer research charity, where narratives of involvement and identification reveal the scope of activism, the ways that this institutional culture informs the parameters of civic engagement, and how activists’ engagement with research is limited to certain areas of activity. In one instance, for example, a group of women responded to the meaning of “involvement” in ways that mixed the morality of fundraising with campaigning work and also with moral sentiments such as “giving something back,” “helping make a difference,” or somehow “being useful,” as this excerpt illustrates:
I was in the middle of treatment, chemotherapy, and I just happened to read—it was October—and I happened to read an article in a magazine, I think the launch of their [the charity’s] £1,000 challenge. And at that point I was feeling [a] sort of a wish, a need, to put something back . . . . And I got the certificate and I got invited to the research center … there was something that drew me to it . . . . So [it] was mainly fundraising, but I could feel something could develop there. So at one point I said to one of the girls on the fundraising team, “Can I help in a voluntary way? I’ve got skills I’m not using, particularly proofreading, editing, language leaflets, making things clear.” And then it seemed to be very useful, from a “Joe public” point of view. And it’s developed into almost like a little job; it’s given me a whole new life … and I feel like I’m putting something back. And my life has value . . . . So, it’s terrific. Really, it’s terrific.
Although it is often difficult to tease apart fundraising as a form of activism from the charity’s highly successful marketing strategies, narratives such as the one above suggest that lay/civic engagement in breast cancer research does little to challenge a traditional expert/lay dynamic. Instead, women became “involved” mostly in ways that reproduced and sustained traditional parameters of scientific expertise.
Such activism has been constituted through “heroic” acts of fundraising, which were in turn wedded to the pursuit of basic science genetic research, collectively situated as a form of “salvationary science” (Gibbon 2007, 125). This continues to be a salient motif for engagement in the charity, with very few women seeing their involvement in terms of influencing a research agenda or affecting the research priorities of the charity. Although a number of women interviewed spoke of being involved in a charity in terms of “campaigning” or being active around the “politics of health care,” their narratives exhibited a general lack of interest in influencing scientific research and a strong feeling about the inappropriateness of “stepping on the toes of the scientists.” As two interviewees put it:
I don’t think any of us would push it in any way, because we can’t appreciate if you’re a nonscientist. I don’t … appreciate the process sufficiently to be able to direct it in a particular direction and say, “Hey, why don’t you look at this?”
I don’t think laypeople can make a significant contribution to what we should study. I know that a lot of people would agree with me on that.
While some interviewees observed that the whole point of being an advocate for those with breast cancer is, as one woman explained, “You’re not a scientist,” others noted that the research undertaken by the charity was widely perceived in terms of a “gold standard.” Many, including those who identified more strongly as “advocates” than as “fundraisers,” also believed that the standard of expertise might potentially be threatened or undermined by training a wider community of people affected by breast cancer to have a say in scientific research.6
Overall, interview data suggest that despite thirty years of growing activism around breast cancer and a much more open concern with implementing, developing, and identifying with advocacy, a particular institutional context continues to sustain, color, and influence the lay/patient and public mobilization around the disease. The morality of fundraising and the faith in the expertise of scientific research expressed by these women cannot be abstracted from the institution of cancer charities in the United Kingdom. The complex and diverse nature of breast cancer activism here and elsewhere shows that understanding the dynamic space of undone science in breast cancer requires a careful mapping and analysis of the nexus of interests that coalesce at particular disease/science/public interfaces (Epstein 2007; Gibbon and Novas 2007). The dense imbrication of some segments of the breast cancer movement with various institutions of scientific research in the United Kingdom means that undone science appears only to a segment of the advocacy community that has itself been historically marginalized within the larger breast cancer movement. Thus, unlike the two previous cases, which examine industrial and government elites in conflict with social movement actors, the case of breast cancer research demonstrates conflicting notions of undone science within movements.
Additionally, however, support for research into environmental etiologies of cancer may yet come from within institutional cultures of science. Postgenomic researchers have increasingly begun to explore what is described as “gene/environment interaction,” in which a seemingly broader context of molecular interaction is becoming important (Shostak 2003). As such, researchers examining social movements must be attentive to subtle shifts around the space of undone science of breast cancer from within and outside mainstream science as different configurations of health activism interface with seemingly novel targets of scientific inquiry in contrasting national contexts. As this study shows, undone science demarcates a highly dynamic cultural space characterized by interorganizational and intraorganizational competition mediated by advances in technoscientific research and clinical practice.
5. Movements as Sources of Undone Science
Kempner’s research is based on an interview study that examines “forbidden knowledge,” a term used to capture scientists’ decisions not to produce research because they believe it to be taboo, too contentious, or politically sensitive (a type of negative knowledge in the terminology introduced above). In 2002–2003, she and colleagues conducted ten pilot and forty-one in-depth, semistructured telephone interviews with a sample of researchers drawn from prestigious U.S. universities and representing a diverse range of disciplines, including neuroscience, microbiology, industrial/organizational psychology, sociology, and drug and alcohol research (Kempner, Perlis, and Merz 2005). Those fields were chosen to gauge the range, rather than the prevalence, of experiences with forbidden knowledge. Interviews lasted between thirty-five and forty-five minutes and were audiotaped, transcribed, coded, and analyzed according to the principles of grounded theory (Strauss and Corbin 1990).
While many social movements organize around the identification and completion of undone science, others devote themselves to making sure that some kinds of knowledge are never produced. They are not alone. The idea that some knowledge ought to be forbidden is deeply embedded in Western cultures and appears in literature through the ages, from Adam and Eve’s expulsion in Genesis to Dr. Frankenstein’s struggle with a monster of his own creation (Shattuck 1996). Mertonian rhetoric aside, most people agree that some science poses unacceptable dangers to research subjects or to society at large. The widely accepted Nuremberg Code, for example, places strict limits on human experimentation, in an effort to ensure that some science—such as Nazi human experimentation in World War II—is never done again.
Determining which knowledge ought to remain undone can often be contentious, as illustrated by current high-profile public debates surrounding the ethics and implications of stem cell research and cloning technologies. Nevertheless, as in research agenda-setting arenas (Hess 2007), debates and decisions about what knowledge should remain off limits to the scientific community typically occur among elites: legislators and federal agencies perennially issue guidelines and mandates regarding which research should not be conducted, setting limits on everything from reproductive and therapeutic cloning to studies of the psychological effects of Schedule I drugs, like heroin and marijuana. Scientists and the lay public both have limited opportunities to voice their opinion in these discussions. In dramatic cases, scientists have attempted to preempt mandates via self-regulation, as was the case in 1975 when scientists meeting at Asilomar called for a moratorium on certain kinds of recombinant DNA research (Holton and Morrison 1979).
According to the forty-one elite researchers interviewed for this case study, these formal mechanisms account for only a portion of the limitations that can produce undone science (Kempner, Perlis, and Merz 2005). More often, researchers described how their research had been hamstrung by informal constraints—the noncodified, tacit rules of what could not be researched or written. Yet researchers were very clear about what constituted “forbidden knowledge” in their respective fields. The boundaries of what could not be done had been made known to them when either their own or a colleague’s work had been targeted for rebuke—in essence, their work had breached an unwritten rule. The management of forbidden knowledge thus worked much as Durkheim said it would: once someone’s research had been identified as especially problematic by, for example, a group of activists, their work became a “cautionary tale,” warning others “not to go there” (Kempner, Bosk, and Merz 2008).
In this way, social movement organizations and activists are able to play an important role in debates about what ought to remain undone, whether or not they are invited to the table. Besides their influence on shaping research agenda-setting arenas, social movements can and do influence individual researchers’ decisions not to pursue particular types of studies. In recent decades, for example, animal rights organizations have had an enormous influence on the kinds of research that scientists choose not to produce. We found that the researchers in our sample who work with animal models took seriously the threat posed by those organizations. They spoke of “terrorist-type attacks” and told stories of colleagues who received “razor blades in envelopes” and “threatening letters.” Others faced activists who staked out their houses. Researchers learned from these cautionary tales and, in many cases, said that they had self-censored as a result. One researcher, for example, explained that he would not work with primates—only “lower order” animals like mice and drosophila—because:
I would like to lunatic-proof my life as much as possible … I, for one, do not want to do work that would attract the particular attention of terrorists …
The paranoia was acute. One researcher refused to talk to the interviewer until she proved her institutional affiliation: “For all I know, you are somebody from an animal rights organization, and you’re trying to find out whatever you can before you come and storm the place.”
Over time, the overt interventions of animal rights organizations in the production of research have redefined the ethics of animal research, ushering in legislation like the 1985 amendments to the Animal Welfare Act, which require research institutions that receive federal funding to maintain “Institutional Animal Care and Use Committees” (Jasper and Nelkin 1992). However, lay groups do not need to use such directly confrontational tactics to influence researchers’ decisions, especially if the groups are successful in their attempts to reframe a particular social problem. For example, substance abuse researchers argued that their research agendas were limited by the success of Alcoholics Anonymous’s campaign to define treatment for alcoholism as lifelong abstinence from drink. Although these researchers would like to conduct “controlled drinking” trials, in which alcoholics are taught to drink in moderation, they argued that “There’s a strong political segment of the population in the United States that without understanding the issues just considers the goal of controlled alcohol abuse to be totally taboo.” The mere threat of interference from the grassroots was enough to keep many researchers from conducting certain studies. Several drug and alcohol researchers described great unwillingness to conduct studies on the health benefits of “harm reduction” programs, such as those that distribute free condoms in schools or clean needles in neighborhoods, because they might attract unwanted controversy from lay groups who oppose such public health interventions.
Thus, in some contrast to the role that social movement organizations and lay experts/citizen scientists play in exposing undone science and encouraging knowledge creation in chemical, air monitoring, and breast cancer research, this study shows that the same actors can also play a powerful role in determining which knowledge is not produced. Moreover, conflicts over the direction of funding streams, while critically important to the politics of research agenda setting, do not solely determine what science is left undone. Rather, social movements are also effective beyond research agenda-setting processes that occur at the institutional level; this study provides evidence that they also shape the microlevel interactional cues and decision-making processes of individual scientists. Although more research is needed to understand the circumstances under which researchers decide to self-censor in response to pressure from outside groups, this case suggests that social movements may have much greater potential to thwart research than originally thought. The implications are intriguing and deserve greater attention. On one hand, disempowered groups may leverage these techniques to gain a voice in a system of knowledge from which they are typically excluded. On the other hand, it is troubling to learn that the subsequent “chilling effect” happens privately, often without public discussion and in response to intimidation and fear.
6. Discussion
The diverse cases provide an empirical basis for moving forward the theoretical conceptualization of undone science in relation to a new political sociology of science and that program’s concern with how research agendas are established. Perhaps the most significant general observation is that the identification of undone science is part of a broader politics of knowledge, wherein multiple and competing groups—including academic scientists, government funders, industry, and civil society organizations—struggle over the construction and implementation of alternative research agendas. To a large extent, our case studies focus on attempts by civil society or quasigovernmental organizations to identify areas they feel should be targeted for more research. However, the identification of undone science can also involve claims about which lines of inquiry should warrant less attention than they currently receive, either because there are decreasing social returns on continued investments in heavily researched areas or because the knowledge is deemed not worth exploring and possibly dangerous or socially harmful—what Gross (2007) calls “negative knowledge.” Examples of the latter include the research programs and methods targeted by animal rights groups and research on chlorinated chemicals targeted by Greenpeace. There are many other cases that would fit this role for civil society organizations, including calls for research moratoria on weapons development, genetically modified food, nuclear energy, and nanotechnology.
Five more specific insights follow from and add complexity to this general observation. First, while we see undone science as unfolding through conflict among actors positioned within a multiorganizational field, as Gibbon’s case shows, definitions of undone science may also vary significantly within different organizational actors, coalitions, or social movements. Some portions of the movement may be captured by mainstream research, and consequently advocacy is channeled into support for the experts’ prioritizations of research agendas. Thus, a research topic such as environmental etiologies of breast cancer may represent undone science to a marginalized segment of breast cancer advocates and their allies in the scientific community, but it may represent negative knowledge to the majority of breast cancer advocates and the dominant cancer research networks. To further complicate the picture, rapid developments and changes within the scientific field, such as the development of genomic research to better pinpoint environmental or epigenetic factors, may result in shifts in research priorities that can open up opportunities for research in areas of undone science. Here, one sees that internal changes and differences among both researchers and civil society advocates interact to define shifting coalitions of research priorities.
Second, the dynamic nature of coalitions and alliances that emerge around undone science suggests that the articulation of research priorities is often a relatively fluid process; even when civil society groups target some areas of scientific research as deserving low or no priority, their views may in turn lead to the identification of other areas of research deserving higher priority. For example, the position of an animal rights group may begin with opposition to some types of animal research but lead to support for more “humane” forms of animal research that have been reviewed by animal research committees. Likewise, the position of an organization such as Greenpeace in opposition to chlorinated chemicals is linked to an articulation of the need for research on green chemistry alternatives. As these examples suggest, the identification of undone science can be viewed as the multifaceted outcome of coalition and conflict among diverse groups representing various social categories, each promoting a mix of topics seen as deserving more, less, or no attention from the scientific community.
Third, making sense of the complex processes that produce undone science requires attending to the distributions of power, resources, and opportunities that structure agenda setting within the scientific field. An important element of field structure is the role of regulatory regimes in shaping definitional conflicts over research priorities. Howard’s work suggests that dyads of done and undone environmental science can be a key expression of the regulatory paradigm in which they occur and are intimately linked to the way expertise is conceptualized and deployed within that paradigm. Furthermore, he proposes that until mainstream science faces a challenger, important forms of undone science within the dominant paradigm can remain implicit and unarticulated. In other words, undone science may take the form of a latent scientific potential that is suppressed through the “mobilization of bias” (Lukes 2005; see also Frickel and Vincent 2007). Ottinger (2005) also notes the important role of regulatory standards in defining opportunities for activists who attempt to get undone science done largely with their own resources. In the case of air monitoring devices, an alternative research protocol and a data-gathering device operated by laypeople provide a basis for challenging official assurances of air quality safety. Rather than advocating for shifts in a research agenda, these activists simply enact the shift. In Howard’s terms, the lay research projects also dramatize the implicit and unarticulated bias in the dominant method of air quality monitoring. Ottinger’s (2005) focus on the double role of standards as enabling and constraining factors in establishing both the conditions and limitations of undone science is intriguing, and it remains for future research to examine the efficacy of such tactical dynamics in relation to the structural constraints encountered across a range of regulatory and research contexts.
Fourth, while access to financial resources is an implicit focus of efforts to identify undone science, Kempner’s research demonstrates that the interaction of civil society and research priorities is not restricted to the broad issue of funding. Although civil society organizations can affect research funding allocations, as we have seen especially in environmental and health research priorities, Kempner notes that other mechanisms can also produce such shifts. Her work suggests that efforts to study the problem of undone science should also consider the role that moral economy plays in shaping scientists’ decisions about which research programs they will and will not pursue (Thompson 1971; on moral economy in science, see Kohler 1994). Furthermore, even if scientists do not accept in principle the notion that certain knowledge should remain undone, they may simply decide not to invest in some areas of research because of intense direct pressure from civil society organizations such as animal rights groups. As a result of such individual decisions not to engage in an area of research, a field’s research agenda can change even when funding allocations do not shift dramatically.
Finally, sometimes structural constraints such as limited access to resources coincide with practical constraints to produce “undoable science.” In the case of the chlorine sunset provisions, precaution advocates see governmental programs for screening individual chemicals as obscuring a plain fact: the sheer number of chemicals and their complex interactions with ecological and biological systems make it impossible to predict whether a given concentration of a given chemical will, in any meaningful sense, be “safe” or a risk. As a result of this “wicked problem” (Rittel and Webber 1973), the articulation of undone science as a goal for research prioritization and funding—in this case, the standard assumption of a need for ever more research on the environmental, health, and safety implications of new chemicals—turns against itself, because the call for research into specific chemicals tacitly supports a regulatory framework that systematically generates policy failure (see Beck 1995).
7. Conclusion
This study demonstrates some of the ways in which the analysis of undone science can enrich empirical understandings of research agenda-setting processes. The considerable variation we find in just four cases suggests that one promising avenue for future research lies in developing more systematic comparisons across academic, government, industry, and community settings. Doing so will further elaborate the ways in which the institutional contexts of research—including different sets of political and economic pressures, normative expectations, resource concentrations, and sizes and configurations of research networks—shape the articulation of undone science and the successful or failed implementation of alternative research agendas.
Our broader aim in seeking to give undone science higher visibility within STS is to broaden the foundations for a new political sociology of science. Much like feminist and antiracist science studies, the political sociology of science situates questions about the uneven distribution of power and resources in science at the center of the STS project while remaining attentive to how knowledge and its inverse, ignorance, are socially shaped, constructed, and contested. As we have argued here, one of the crucial sites where questions of power, knowledge, and ignorance come together is the domain of research agenda setting, where coalitions are forged and conflicts waged over access to the limited resources that ultimately shape what science is done and what remains undone.
Biographies
Scott Frickel is associate professor of sociology at Washington State University, where he studies science, environment, and social movements. He is author of Chemical Consequences: Environmental Mutagens, Scientist Activism, and the Rise of Genetic Toxicology (Rutgers University Press, 2004) and coeditor with Kelly Moore of The New Political Sociology of Science: Institutions, Networks, and Power (University of Wisconsin Press, 2006).
Sahra Gibbon is a research fellow in the Anthropology Department at University College London. She is author of Breast Cancer Genes and the Gendering of Knowledge (Palgrave Macmillan, 2007) and coeditor with Carlos Novas of Biosocialities, Genetics and the Social Sciences: Making Biologies and Identities (Routledge, forthcoming).
Jeff Howard is assistant professor at the University of Texas at Arlington School of Urban and Public Affairs. His research focuses, in part, on the problematic role of experts and expert knowledge in environmental decision making—an interest rooted in his experience as a Greenpeace staff member in the 1980s (prior to the case examined here).
Joanna Kempner is assistant professor of sociology at Rutgers University and member of the Institute for Health, Health Care Policy and Aging Research. Her research examines the intersection of medicine, science, politics, and gender.
Gwen Ottinger is a research fellow in the Environmental History and Policy Program at the Chemical Heritage Foundation. Her work explores how expertise is constructed in the everyday interactions of engineers, scientists, residents, and activists at an oil refinery’s fenceline.
David J. Hess is professor of Science and Technology Studies and director of the Ecological Economics, Values, and Policy Program at Rensselaer Polytechnic Institute. His research focuses on the social studies of science, technology, health, the environment, and social movements. His most recent books are Alternative Pathways in Science and Industry (MIT Press, 2007) and Localist Movements in a Global Economy (MIT Press, 2009).
Notes
The term “negative knowledge” originally comes from Knorr-Cetina (1999), but our usage follows Gross’s amplification (2007).
North Carolina also has ambient air standards for this class of pollutants. The federal government has not set such standards; only total levels of volatile organic chemicals, in addition to five other “criteria pollutants,” are regulated by the Clean Air Act.
A significant body of work in social studies of science demonstrates how standards and standardized practices help coordinate scientific work across heterogeneous communities and distant research sites (see for example Star and Griesemer 1989; Fujimura 1996).
In presenting the program to Norco residents, one chemical engineer representing Shell even acknowledged the legitimacy of activists’ data, reiterating the claim that the buckets were EPA-approved.
Louisiana is not alone in this; the National Ambient Air Quality Standards, for example, regulate one-hour, eight-hour, twenty-four-hour, or annual averages of criteria pollutants.
A few women did acknowledge that they would want to have more training in the field of scientific research to enable them to be, as they put it, more “credible” and “not be discounted.” They sought to become, as one woman said, “an informed layperson as opposed to somebody who can be dismissed.” It was clear that there were boundaries placed on what this might mean in relation to informing or influencing scientific research.
References
- Allen BL. Uneasy alchemy: Citizens and experts in Louisiana’s chemical corridor disputes. Cambridge, MA: MIT Press; 2003.
- Amato I. The crusade against chlorine. Science. 1993;261(5118):152–4. doi: 10.1126/science.8327884.
- Anglin MK. Working from the inside out: Implications of breast cancer activism for bio-medical policies and practices. Social Science and Medicine. 1997;44(9):1403–15. doi: 10.1016/s0277-9536(96)00321-8.
- Austoker J. A history of the Imperial Cancer Research Fund 1902-1986. New York: Oxford University Press; 1988.
- Beck U. Ecological politics in an age of risk. Weisz A, translator. Cambridge: Polity; 1995.
- Botts L, Muldoon P, Botts P, von Moltke K. The Great Lakes water quality agreement. In: Hisschemöller M, Hoppe R, Dunn W, Ravetz J, editors. Knowledge, power, and participation in environmental policy analysis. New Brunswick, NJ: Transaction; 2001. pp. 121–43.
- Bourdieu P. Science of science and reflexivity. Chicago: University of Chicago Press; 2004.
- Brown P, McCormick S, Mayer B, Zavestoski S, Morello-Frosch R, Gasior Altman R, Senier L. “A lab of our own”: Environmental causation of breast cancer and challenges to the dominant epidemiological paradigm. Science, Technology and Human Values. 2006;31(5):499–536.
- Chlorine Chemistry Council. Pandora’s poison: Putting political ideologies ahead of public health—and hope. n.d. http://www.pandoraspoison.org/industry_views/ccc_statement.html (accessed November 11, 2000).
- Collins H. Changing order: Replication and induction in scientific practice. Beverly Hills, CA: Sage; 1985.
- Collins H, Evans R. The third wave of science studies: Studies of expertise and experience. Social Studies of Science. 2002;32(2):235–96.
- Di Chiro G. Living environmentalisms: Coalition politics, social reproduction, and environmental justice. Environmental Politics. 2008;17(2):276–98.
- Epstein S. Patient groups and health movements. In: Hackett EJ, Amsterdamska O, Lynch M, Wajcman J, editors. New handbook of science and technology studies. Cambridge, MA: MIT Press; 2007. pp. 499–539.
- Forman P. Behind quantum electronics: National security as basis for physical research in the United States, 1940-1960. Historical Studies in the Physical and Biological Sciences. 1987;18(1):149–229.
- Forsythe D. Studying those who study us: An anthropologist in the world of artificial intelligence. Stanford, CA: Stanford University Press; 2001.
- Fosket J. Constructing ‘high risk women’: The development and standardization of a breast cancer risk assessment tool. Science, Technology & Human Values. 2004;29(3):291–313.
- Frickel S. On missing New Orleans: Lost knowledge and knowledge gaps in an urban hazardscape. Environmental History. 2008;13(4):634–50.
- Frickel S, Moore K, editors. The new political sociology of science: Institutions, networks, and power. Madison, WI: University of Wisconsin Press; 2006a.
- Frickel S, Moore K. Prospects and challenges for a new political sociology of science. In: Frickel S, Moore K, editors. The new political sociology of science: Institutions, networks, and power. Madison, WI: University of Wisconsin Press; 2006b. pp. 3–31.
- Frickel S, Vincent MB. Katrina, contamination, and the unintended organization of ignorance. Technology in Society. 2007;29:181–8.
- Fujimura J. Crafting science: Standardized packages, boundary objects, and ‘translation’. In: Pickering A, editor. Science as practice and culture. Chicago: University of Chicago Press; 1996. pp. 168–211.
- Galison P. Removing knowledge. Critical Inquiry. 2004;31(autumn):229–43.
- Gibbon S. Breast cancer genes and the gendering of knowledge: Science and citizenship in the cultural context of the ‘new’ genetics. Basingstoke, UK: Palgrave Macmillan; 2007.
- Gibbon S, Novas C, editors. Bio-socialities, genetics and the social sciences. London: Routledge; 2007.
- Gross M. The unknown in process: Dynamic connections of ignorance, non-knowledge, and related concepts. Current Sociology. 2007;55:742–59.
- Gross M. Ignorance and surprise: Science, society, and ecological design. Cambridge, MA: MIT Press; in press.
- Haraway DJ. Primate visions: Gender, race, and nature in the world of modern science. New York: Routledge; 1989.
- Harding S. Is science multicultural? Postcolonialisms, feminisms, epistemologies. Bloomington, IN: Indiana University Press; 1998.
- Hess D. Alternative pathways in science and industry: Activism, innovation, and the environment in an era of globalization. Cambridge, MA: MIT Press; 2007.
- Hilgartner S. Election 2000 and the production of the unknowable. Social Studies of Science. 2001;31(3):439–41.
- Hoffmann-Riem H, Wynne B. In risk assessment, one has to admit ignorance. Nature. 2002 Mar;416:123. doi: 10.1038/416123a.
- Holton G, Morrison RS, editors. Limits of scientific inquiry. New York: W. W. Norton & Company; 1979.
- Howard J. Toward intelligent, democratic steering of chemical technologies: Evaluating industrial chlorine chemistry as environmental trial and error. PhD Diss., Rensselaer Polytechnic Institute; Troy, NY: 2004. Proquest no. 845710461.
- International Joint Commission (IJC) Sixth biennial report on Great Lakes water quality. Washington, DC: IJC; 1992.
- Jasper JM, Nelkin D. The animal rights crusade: The growth of a moral protest. New York: Free Press; 1992.
- Kaufert P. Women, resistance and the breast cancer movement. In: Lock M, Kaufert P, editors. Pragmatic women and body politics. Cambridge: Cambridge University Press; 1998. pp. 287–309.
- Kempner J, Bosk CL, Merz JF. Forbidden knowledge: The phenomenology of scientific inaction. 2008. Unpublished manuscript.
- Kempner J, Perlis CS, Merz JF. Forbidden knowledge. Science. 2005;307:854. doi: 10.1126/science.1107576.
- Klawiter M. Breast cancer in two regimes: the impact of social movements on illness experience. Sociology of Health and Illness. 2004;26(6):845–74. doi: 10.1111/j.0141-9889.2004.00421.x.
- Klein HK, Kleinman DL. The social construction of technology: Structural considerations. Science, Technology, and Human Values. 2002;27(1):28–52.
- Kleinman DL, Vallas SP. Science, capitalism, and the rise of the ‘knowledge worker’: The changing structure of knowledge production in the United States. Theory and Society. 2001;30:451–92.
- Knorr-Cetina K. Epistemic cultures: How the sciences make knowledge. Cambridge, MA: Harvard University Press; 1999.
- Kohler RE. Lords of the fly: Drosophila genetics and the experimental life. Chicago: University of Chicago Press; 1994.
- Lerner B. The breast cancer wars: Hope, fear, and the pursuit of a cure in twentieth-century America. Oxford: Oxford University Press; 2003.
- Levidow L. Ignorance-based risk assessment? Scientific controversy over GM food safety. Science as Culture. 2002;11(1):61–7.
- Louisiana Bucket Brigade. Land sharks: Orion Refining’s predatory property purchases. New Orleans: Inkworks Press; 2001.
- Löwy I. Between bench and bedside: Science, healing and interleukin-2 in a cancer ward. Cambridge, MA: Harvard University Press; 1997.
- Lukes S. Power: A radical view. 2nd ed. New York: Palgrave Macmillan; 2005.
- MacKenzie D, Spinardi G. Tacit knowledge, weapons design, and the uninvention of nuclear weapons. American Journal of Sociology. 1995;101:44–99.
- Markowitz G, Rosner D. Deceit and denial: The deadly politics of industrial pollution. Berkeley: University of California Press; 2002.
- Martin B. Justice ignited: The dynamics of backfire. Lanham, MD: Rowman & Littlefield; 2007.
- Marx K. Capital. Vol. 1 New York: International Publishers; 1967.
- Mayer B, Overdevest C. Bucket brigades and community-based environmental monitoring. Paper presented at the annual meeting of the Society for Social Studies of Science; Montreal. 2007.
- Merton R. Three fragments from a sociologist’s notebook: Establishing the phenomenon, specified ignorance, and strategic research materials. Annual Review of Sociology. 1987;13:1–28.
- Murphy M. Sick building syndrome and the problem of uncertainty: Environmental politics, technoscience, and women workers. Durham, NC and London: Duke University Press; 2006.
- Noble D. America by design: Science, technology, and corporate capitalism. New York: Alfred A. Knopf; 1977.
- O’Rourke D, Macey GP. Community environmental policing: Assessing new strategies of public participation in environmental regulation. Journal of Policy Analysis and Management. 2003;22(3):383–414.
- Ottinger G. Grounds for action: Community and science in environmental controversy. PhD Diss., University of California; Berkeley: 2005.
- Potts L. An epidemiology of women’s lives: The environmental risk of breast cancer. Critical Public Health. 2004;14(2):133–47.
- Proctor RN. Cancer wars: How politics shapes what we know and don’t know about cancer. New York: Basic Books; 1995.
- Proctor RN, Schiebinger L, editors. Agnotology: The making and unmaking of ignorance. Stanford, CA: Stanford University Press; 2008.
- Rittel H, Webber M. Dilemmas in a general theory of planning. Policy Sciences. 1973;4:155–69.
- Shattuck R. Forbidden knowledge. New York: Harcourt Brace and Company; 1996.
- Sheridan J. Chlorine chemistry: An endangered species? Industry Week. 1994 Jan 3;:49–50.
- Shostak S. Locating gene-environment interaction: At the intersections of genetics and public health. Social Science and Medicine. 2003;56:2327–42. doi: 10.1016/s0277-9536(02)00231-9.
- Slaughter S, Rhoades G. Academic capitalism and the new economy: Markets, states, and higher education. Baltimore, MD: The Johns Hopkins University Press; 2004.
- Sontag S. Illness as metaphor and AIDS and its metaphors. New York: Doubleday; 1988.
- Star SL, Griesemer JR. Institutional ecology, ‘translations’ and boundary objects: Amateurs and professionals in Berkeley’s Museum of Vertebrate Zoology, 1907-39. Social Studies of Science. 1989;19:387–420.
- Strauss A, Corbin J. Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park: Sage Publications; 1990.
- Stringer R, Johnston P. Chlorine and the environment: An overview of the chlorine industry. Boston: Kluwer Academic; 2001.
- Swerczek M. Orion promises air samples. The Times Picayune, New Orleans. 2000 Sep 29;:B1–2.
- Tesh SN. Uncertain hazards: Environmental activists and scientific proof. Ithaca: Cornell University Press; 2000.
- Thompson EP. The moral economy of the English crowd in the eighteenth century. Past and Present. 1971;50:76–136.
- Thornton J. The product is the poison: The case for a chlorine phase-out. Washington: Greenpeace USA; 1991.
- Thornton J. Pandora’s poison: Organochlorines and health. Cambridge, MA: MIT Press; 2000.
- Tickner JA, editor. Precaution, environmental science, and preventive public policy. Washington, DC: Island; 2003.
- Woodhouse EJ, Hess D, Breyman S, Martin B. Science studies and activism: Possibilities and problems for reconstructivist agendas. Social Studies of Science. 2002;32(2):297–319.
- Zavestoski S, Brown P, Linder M, McCormick S, Mayer B. Science, policy, activism, and war: Defining the health of Gulf War veterans. Science, Technology, & Human Values. 2002;27(2):171–205.
- Zavestoski S, Morello-Frosch R, Brown P, Mayer B, McCormick S, Gasior R. Embodied health movements and challenges to the dominant epidemiological paradigm. Research in Social Movements, Conflict and Change. 2004;25:253–78.
- Zuckerman H. Theory choice and problem choice in science. Sociological Inquiry. 1978;48(3-4):65–95.