Abstract
When scientific research collides with social values, science’s right to self-governance becomes an issue of paramount concern. In this article, I develop an account of scientific autonomy within a framework of public oversight. I argue that scientific autonomy is justified because it promotes the progress of science, which benefits society, but that restrictions on autonomy can also be justified to prevent harm to people, society, or the environment, and to encourage beneficial research. I also distinguish between different ways of limiting scientific autonomy, and I argue that government involvement in scientific decision-making should usually occur through policies that control the process of science, rather than policies that control the content of science.
Keywords: scientific values, social values, autonomy, freedom, responsibility, censorship, public oversight, Lysenko
I. Introduction
In July 2003, the House Energy and Commerce Committee held hearings to consider whether to cut funding for four grants that supported research relating to human sexuality and HIV/AIDS risks associated with drug use. The grants had already been approved by a study section (i.e. a peer review committee) at the National Institutes of Health (NIH) that evaluates research proposals based on their scientific merit and potential contribution to public health. The four grants were culled from a list of 198 grants targeted by the Traditional Values Coalition, a conservative political interest group that lobbies Congress and donates money to political candidates. In January 2004, the Commerce Committee decided to continue to fund the grants after hearing testimony from NIH Director Elias Zerhouni (Kaiser 2004).
The debate about the funding of these NIH grants represents a clash between scientific and social values, one of the oldest problems in the philosophy of science and still one of the most important issues today (Longino 1990, Shrader-Frechette 1994, Rosenberg 1995, Resnik 1998, Goldman 1999, Kitcher 2001). The study section members believed that the disputed research had scientific merit and would make a significant impact on public health; the interest groups that opposed the grants believed that the research was contrary to important social values. Though many writers have reflected on the clash between science and social values (e.g. Guston 2000, Kitcher 2001), few have developed a systematic account of how to adjudicate this conflict. In this article, I will use the concept of autonomy to examine the relationship between scientific and social values. I will argue that scientists, scientific groups, and scientific organizations should be granted autonomy within their domain of expertise, under the rubric of public oversight. The public, acting through the government, should regulate the process of science but should avoid regulating the content of science.
The article will proceed as follows. Section II will distinguish between scientific and social values. Section III will develop several arguments for the autonomy of science, drawing on insights from the history, philosophy and sociology of science. Section IV will explore moral and political arguments for limitations on the autonomy of science. Section V will distinguish between government restrictions on the process of science and government restrictions on the content of science. Section VI will apply the framework developed in Section V to several case studies.
II. Scientific vs. Social Values
Science, like law or business, is a human institution that embodies certain values. Scientific values are the goals and norms that govern the conduct of scientists (Longino 1990, Hull 1988, Kuhn 1977, Merton 1973). The norms of science are justified in terms of the goals of science. Although some commentators have attempted to characterize science in terms of a single goal, such as the pursuit of knowledge or truth, the weight of argument shows that science has many different goals (Resnik 1998). Suppose, for example, that truth were the only goal of science. If this were the case, then scientists should be just as concerned about acquiring insignificant truths, such as the number of carpet fibers in the White House, as significant ones, such as the number of species on the planet Earth. Yet scientists are (and should be) more concerned with significant truths than with insignificant ones (Kitcher 1993). Since many different factors determine the significance of a true statement, including its ability to explain different phenomena and make reliable predictions, the goals of science include not only truth but also explanation and prediction (Kitcher 1993). Furthermore, applied sciences, such as medicine or engineering, have practical goals in addition to epistemic ones. Engineers, for example, are interested in solving practical (design) problems, and physicians are interested in treating diseases and relieving human suffering.
The norms of science justify and shape the methods, rules, procedures, and traditions of science. Science has epistemological norms, such as testability, simplicity, precision, objectivity, consistency, and novelty, as well as practical (or ethical) norms, such as honesty, openness, community, freedom, fairness, integrity, and accountability (Resnik 1998, Kuhn 1977, Merton 1973). Due to the social nature of epistemology and our dependence on other people for information, many of science’s norms, such as honesty, openness, community, and freedom, have both an ethical and an epistemological dimension (Resnik 1998). Epistemological and ethical norms govern hypothesis generation; experimental design; recording, storing, sharing, analyzing, and interpreting data; peer review; publication; collaboration; credit allocation; interactions with the media; and intellectual property. For example, the randomized controlled trial (RCT) is a method used in biomedical research for testing the efficacy of a new treatment. RCTs embody objectivity, precision, testability, and other epistemic norms. Authorship guidelines promote honesty, fairness, accountability, and other ethical norms (Shamoo and Resnik 2003). Scientific norms prescribe, but do not necessarily describe, scientific conduct: they are ideal standards that individual researchers or research communities sometimes fail to live up to (Resnik 1998).
The values that constitute good scientific practice can be distinguished from other values in society, such as personal, cultural, moral, political, business, legal, and religious values (Longino 1990). At one time, many scientists, scholars, and lay people viewed science as objective (i.e. free from the influence of social values). Since the 1960s, historians, philosophers, and sociologists of science have compiled empirical evidence and developed conceptual arguments that undermine this image of science. Social values can influence virtually all aspects of the research process, ranging from problem selection, to hypothesis formation, to data analysis and interpretation, to concept formation and theory acceptance (Longino 1990, Ziman 2000). Since there is now widespread agreement among scholars and scientists that social values frequently have a significant impact on science, the debate has shifted to an examination of the extent to which science is influenced by social values (Haack 2003). The influence of values on science may depend on the context of science: researchers working for industry or the military face greater pressures to compromise scientific values than researchers working in academia (Ziman 2000). Also, some sciences may be more value-laden than others. For example, social values have a greater impact on the social sciences than on the physical sciences (Rosenberg 1995).
This article will not explore these important questions concerning the influence of social values on the practice of science. Even if one grants, as I will, the thesis that science is often affected by social values, it is still an open question as to whether science ought to be affected by social values. I will acknowledge, on the one hand, that actual science, as practiced by human beings, is often shaped by social values; yet I will also maintain, on the other hand, that scientists should try to minimize the impact of social values on their research projects and other activities. Scientists should strive for objectivity, even if they often fall short of this goal: objectivity is a regulative ideal (Shrader-Frechette 1994, Kitcher 1993, 2001, Haack 2003; Resnik 2007).
One might object that objectivity cannot be a regulative ideal because it is an unreachable goal. Science will always be inundated by social values, and attempts to eliminate some values will only succeed in introducing others. Objectivity is therefore an illusion and the pursuit of objectivity is a pipe dream (Latour and Woolgar 1986). The reply to this objection is that the fact that a goal is unreachable does not imply that it is not worth pursuing. A goal may guide conduct, even if it is not achievable. There will never be a perfectly just state, but governments should still seek justice. No human being will ever be perfectly virtuous, but people should still strive for virtue. As long as it is possible to make some progress toward objectivity, pursuing this goal will not be in vain (Kitcher 1993, 2001).[1]
III. The Autonomy of Science
As noted earlier, scientific and social values sometimes conflict. When this happens, questions concerning the governance and control of science arise. Should scientists control their own work, or should society have some control over science? How much control should society have over science? Though most philosophers and scientists have followed Francis Bacon (1561–1626) in maintaining that science should serve society, few have moved beyond this cliché to spell out the precise relationship between scientific and social values.
One way of viewing this relationship is through the lens of autonomy, because the concept of autonomy sets boundaries for self-governance and external governance. The concept of autonomy plays an important role in moral, political, and social philosophy. In thinking about this concept, it is important to distinguish between the capacity for self-governance and the right of self-governance (Dworkin 1988). An autonomous person is someone who is capable of performing the cognitive tasks, such as reasoning and judgment, necessary for sound decision-making (Buchanan and Brock 2004). The right to autonomy is a right to make decisions pertaining to one’s body, mind, property, and relationships. A person might be capable of making decisions yet face significant limitations on his or her autonomy. For example, a man who is serving a life sentence for murder may be perfectly capable of making decisions, but his right to make decisions may be severely limited, due to his incarceration. Conversely, a person may have a right to autonomy, yet be incapable of making decisions, due to loss of consciousness, mental illness, etc.
What does it mean for science to be autonomous or to have a right of autonomy? In the philosophy of science, autonomy is usually equated with methodological, epistemological, or metaphysical self-governance. For example, the claim that biology is autonomous is often understood as the claim that biology has its own forms of explanation and analysis, or that there are biological properties that are not reducible to physical properties (Rosenberg 1985). In this essay, I will focus on science’s moral, political, or social autonomy, rather than its methodological, epistemological, or metaphysical autonomy.
So how could a social institution, such as science, have moral, social, or political autonomy? Although science does not make decisions, science is composed of people who do. Individual scientists, groups (such as research teams or laboratories), and organizations (such as journals, universities, or professional associations) make decisions relating to research, education, and other scientific activities (Ziman 2000).[2] It is important to include all of these different levels of decision-making in our analysis because they interact with each other in important ways: individuals are affected by government restrictions on groups and organizations and vice versa. For example, a biochemist cannot publish a paper about a particular topic if the government has enacted laws banning journals from publishing papers on that topic, and a professional association of biochemists cannot allow members of a particular race to join the association if the government has enacted laws banning members of that race from joining scientific associations.
Since individuals, groups, and organizations all make choices pertaining to science, scientific autonomy should encompass these different levels of decision-making. The capacity for autonomy in science can therefore be defined as the ability of individuals, groups, and organizations to make decisions relating to scientific activities, such as research, education, and publication. The right to autonomy in science can be defined as the right of individuals, groups, and organizations to make decisions related to scientific activities. These definitions will be useful in examining arguments for and against scientific autonomy.
Now that we have a better sense of what scientific autonomy is, we can explore arguments in favor of autonomy in science. There are two distinct ways of justifying a right to autonomy: a deontological strategy and a utilitarian (or instrumentalist) strategy. The deontological strategy defends autonomy in science by appealing to the individual’s political, moral, or legal rights. One could argue that individuals have a right to engage in scientific activities because they have moral or political rights to freedom of thought and expression (Donnelly 2002; Dworkin 1988). Alternatively, one might argue that individuals have a right to engage in scientific activities because they have Constitutionally protected legal rights to freedom of speech, freedom of thought, and freedom of association (Robertson 1977). After establishing the individual scientist’s right to autonomy, the deontological strategy then argues for the right to autonomy at the level of the group or organization, since individuals cannot exercise their rights unless they are allowed to associate, communicate, and collaborate with other individuals.
The utilitarian justification for autonomy takes a very different tack. This strategy begins with the observation that science benefits society through its practical applications in technology, engineering, agriculture, medicine, communication, and other human endeavors (Bush 1945). Science cannot produce the knowledge that yields these impressive results unless individual scientists, scientific groups, and organizations are allowed to make their own decisions. Thus, the utilitarian strategy moves in the opposite direction from the deontological one, because it starts with the justification of autonomy at the level of the social institution (science) and works its way down to the justification of autonomy at the level of the individual.
Though both strategies play an important role in defending the autonomy of science, I will emphasize the utilitarian strategy in this article. There are several reasons why I have chosen this focus. First, the deontological strategy does not, in my view, represent a conflict between scientific and social values, because it defends the autonomy of science on the basis of political, moral, or legal considerations. As such, it represents a conflict among social values (e.g. freedom of speech vs. national security), not a conflict between scientific values and social values (e.g. truth vs. profit). Second, the deontological strategy may not convince people who do not agree with the political, moral, or legal values that support freedom of thought and expression. A fascist government, with little regard for freedom of speech, would still be concerned about supporting scientific research in order to advance practical goals, such as national security and public health and safety. Third, over the years, the utilitarian strategy has played a more important role in science policy debates than the deontological strategy. Science policy arguments frequently appeal to the importance of science for national defense, economic competitiveness, and public health, and rarely mention the value of freedom of thought and expression (Bush 1945, Guston 2000).
To help clarify the utilitarian argument for scientific autonomy I will restate it as follows:
1. Scientific research generates knowledge with many practical applications in technology, engineering, agriculture, medicine, and other fields.
2. The practical applications of scientific research are valuable (i.e. worth having or attaining).
3. To generate knowledge, individual scientists, groups, and organizations must be allowed to make decisions relating to their activities, such as problem selection, hypothesis formation, experimental design, data analysis, data interpretation, publication, peer review, theory acceptance, and education.
4. Therefore, scientists, scientific groups, and organizations should be allowed to make decisions relating to their activities; they should be granted a right to autonomy.
The argument consists of one normative premise (premise 2) and two descriptive ones (premises 1 and 3). I will not examine the normative premise in this article, since I will assume that most people would agree that the practical applications of science are valuable. I also will not examine premise 1, because I will assume that there is overwhelming evidence that science yields practical results (Guston 2000, Ziman 2000). While there may be some disagreement about whether any particular scientific discipline, such as astrophysics or cultural anthropology, yields practical applications, there is near unanimous agreement that science, as a whole, produces many useful results. However, I would like to comment further on the third premise.
There is historical evidence that government interference in scientific decision-making can stunt or retard the growth of science (Sheehan 1993). The example I will discuss in this article is the negative effect of government control of science in the former Soviet Union, where biology suffered the effects of Marxist ideology from the 1930s to the 1960s. Following the Russian revolution of 1917, the All Union Communist Party (a.k.a. the Bolsheviks) demanded that all social institutions, including science, conform to Marxist political theory. Members of the Party opposed scientific ideas they regarded as the product of Bourgeois thought, such as free market economics and Mendelian genetics. They also favored scientific ideas that supported the idea of re-engineering human society along Marxist lines. In the 1920s, Trofim D. Lysenko (1898–1976) developed a theory of inheritance that found favor among powerful members of the Party. Lysenko developed a process, known as vernalization, which involved soaking and chilling seeds from summer crops for winter planting. Lysenko claimed that vernalization could improve agricultural productivity, when, in fact, it could not. Scientists and politicians accepted Lysenko’s ideas, even though he had little evidence to support them, he did not keep good research records, and he manipulated the data by not reporting negative results (Sheehan 1993).
In 1930, the Ukrainian Commissioner of Agriculture created a vernalization department at a genetics institute in Odessa (Sheehan 1993). Lysenko proposed a theory to explain vernalization phenomena: one can alter the development of a plant by changing its environment because plants have different needs at different stages of development. Lysenko and I.I. Prezent, a member of the Communist Party, proposed a new environmental theory of heredity that stood in sharp contrast to Mendel’s theory of inheritance. The theory found favor with other members of the Communist Party, because it implied that human behavior can be changed through environmental manipulation, making it possible to overcome greed, selfishness, and possessiveness to create a communist state. Proponents of Mendelian genetics objected to the environmental theory as unscientific and unsound, but their criticisms could not overcome the theory’s political appeal (Sheehan 1993). Lysenko soon won the support of Joseph Stalin (1878–1953), the General Secretary of the Communist Party. In 1938, Lysenko was appointed President of the Lenin Academy for Agricultural Sciences, and in 1940 he became Director of the Department of Genetics at the Soviet Academy of Science (Hossfeld and Olsson 2002).
By 1948, Lysenkoism became the official view of the Communist Party, and the Soviet government began to repress Mendelian genetics. Soviet scientists who attacked Lysenkoism or endorsed Mendelianism were denounced, declared mentally ill, imprisoned, exiled, or even murdered. Soviet scientists were not allowed to teach Mendelian ideas or conduct research in Mendelian genetics until the 1960s, when the period of official repression ended (Joravsky 1986). The suppression of ideas that contradicted Marxist ideology had a devastating effect on Soviet genetics, but many other disciplines also suffered, including zoology, botany, evolutionary biology, agronomy, and economics (Sheehan 1993). Before the 1940s, some of the world’s leading geneticists, such as Theodosius Dobzhansky (1900–1975), lived in the Soviet Union, but by the 1960s, genetics and many other scientific disciplines in the Soviet Union were decades behind Western science (Joravsky 1986).
Lysenkoism is an extreme example of what can happen when the government restricts the autonomy of individuals, groups, and organizations; yet, it still offers us some important lessons that apply to situations where science is not as politicized. The Soviet Union’s repression of views that contradicted Lysenkoism undermined the progress of science in several different ways. First, the Soviet government’s actions interfered with the objectivity of science. Theories of inheritance were accepted or rejected for political reasons, not epistemological ones. Scientists were forced to ignore the evidence against Lysenkoism and the evidence in favor of Mendelianism. Second, the actions of the government interfered with communication among scientists and the sharing of ideas. Honest, open communication is vital to scientific inquiry, criticism, and debate (Burke 1995); yet, the Soviet government stifled the exchange of information concerning some topics. Scientists were, rightfully, afraid to criticize Lysenkoism in public or to discuss or teach Mendelian theory.
Third, the repression of Mendelian genetics nearly extinguished creativity in many areas of science. Creativity flourishes only when scientists are free to explore new ideas, theories, and methods and to challenge existing ones (Kantorovich 1993). The Soviet government violated the freedom of many citizens, and scientists were no exception. The government dictated the areas of science and the scientific ideas that would and would not be studied. It established a rigid research program geared toward promoting Marxist ideology. The government interfered with freedom of scientists, scientific groups, and scientific institutions.
Fourth, the Soviet government’s restrictions had a widespread impact. Many different research disciplines were directly or indirectly affected by the government’s repressive policies. The plague of Lysenkoism spread throughout the research community and affected many different scientists, scientific groups, and scientific institutions. Even people working in fields of research far removed from human genetics were apprehensive about potential intimidation, harassment, or repression (Joravsky 1986).
The moral of Lysenkoism is that governments should be very wary of interfering with scientific decisions. Scientists (and scientific groups and organizations) should be granted autonomy within their domain of practice and expertise. The progress of science depends on its independence from government control and authority.
IV. Limitations on Scientific Autonomy
So far, I have argued that the progress of science requires that scientists, scientific groups and organizations be allowed to make decisions pertaining to their work, free from outside, governmental interference. However, autonomy, including scientific autonomy, is not an absolute right. Autonomy may be restricted to prevent harm to other individuals, groups, society, or the environment. For example, freedom of speech does not give one the right to slander a neighbor, divulge confidential information about one’s patients, or yell “fire” in a crowded theater (Feinberg 1987). Likewise, scientific autonomy may be restricted to prevent scientists, scientific groups, organizations, or disciplines from harming individuals, groups, society, or the environment (Kitcher 2001, Shamoo and Resnik 2003). Some situations where scientific research may be restricted or controlled to prevent harm include: 1) research involving human subjects, which may threaten human welfare or rights; 2) research involving animal subjects, which could lead to needless animal pain or suffering; 3) research with dangerous, controlled substances, such as cocaine or heroin, which could lead to drug abuse or drug diversion; 4) research with biological, chemical, radiological or other materials that could be used to make weapons of mass destruction. In each of these situations, society has enacted laws and regulations designed to prevent scientists from causing harm.
In addition to restrictions on autonomy imposed by specific laws or regulations, there are restrictions on autonomy resulting from contractual obligations. Scientists, scientific groups, organizations, or disciplines may voluntarily restrict their conduct via contracts or other agreements. When individuals enter contracts, they decide to limit their freedom in order to obtain the benefits promised in the contracts (Calamari and Perillo 1998). For example, if X agrees to mow Y’s lawn on Saturday morning for $25, X has made a decision to limit his own freedom. If X wants to earn the $25, he must perform his part of the contract. Scientists also voluntarily restrict their own freedom by entering contracts with universities, private companies, or granting agencies. In these contracts, scientists promise to do specific things, such as teach, consult, lecture, or conduct research, in order to receive specific benefits, such as salary, honoraria, funding for research, etc. The parties in these contracts (e.g. the scientists, the university, and the company) are free to negotiate the terms and conditions. For example, if a company decides to conduct a clinical trial on the efficacy of drug X, it may offer a contract to clinical researchers interested in conducting the trial. The researchers (and their universities) are free to accept or reject this offer, or negotiate terms and conditions.
Contracts with research sponsors play an important role in problem selection in science. Most scientists require large sums of money to conduct their research. To obtain funding, scientists often must decide to work on a problem that a sponsor is willing to pay to have studied. A scientist may become interested in a particular research problem, but the scientist usually cannot work on that problem unless he can find someone to pay for it. Research almost always follows the money. Research sponsors usually follow the advice of scientific experts in deciding whether to sponsor a particular study or pursue a particular domain. Government agencies, for example, use scientists to help set funding priorities and to serve on peer review panels that evaluate funding proposals. The public can also participate in government science funding decisions by serving on review panels, helping to set priorities within agencies, or lobbying legislators, who have some oversight authority over these agencies. The public has a right and a duty to help decide how government agencies allocate research funds (Dresser 2001, Resnik 2007).
Research sponsors can influence many scientific decisions other than problem selection, such as research design, record keeping, data analysis, data sharing, and publication. While government agencies usually require scientists to share data and publish results, private companies usually place restrictions on data sharing and publication. In some instances, private companies have prevented researchers from publishing results that were unfavorable to their products (Resnik 2007). Research grants with government agencies usually include a wide variety of restrictions, such as prohibitions on discrimination or harassment, laboratory safety standards, protection of animal and research subjects, and rules pertaining to dealing with allegations of research misconduct (Shamoo and Resnik 2003).
Scientists, scientific groups, and organizations also voluntarily restrict their autonomy through an implicit contract with society, in addition to explicit contracts with employers or funding agencies (Shrader-Frechette 1994, Resnik 1998). Society provides scientists, scientific groups, organizations, and disciplines with education, training, money, equipment, administrative support, and other resources. In return, scientists (and groups or organizations) have an obligation to benefit society by conducting research, teaching, giving expert advice, and engaging in other valuable activities. This obligation to benefit society, also known as social responsibility, constitutes an ethical, not a legal, restriction on scientific autonomy, since scientists are not legally obligated to do good for society. For example, Rachel Carson’s Silent Spring (1962) helped to launch the environmentalist movement in the United States by warning people about the dangers of DDT and other chemicals. Carson published the book out of a sense of social responsibility. Other scientists have followed their sense of social responsibility to try to stop nuclear proliferation, to develop vaccines for infectious diseases, to report fraud and corruption, and so on (Shamoo and Resnik 2003).
V. Making Decisions about Science
To summarize the article to this point, I have argued for the autonomy of science with some limitations. Scientists, scientific groups, and organizations should be free to conduct research and engage in other scientific activities, but this freedom can be restricted to prevent harm to people, society, or the environment, or to promote social goods. I do not claim to be the first person to have defended these claims. Indeed, many other writers (e.g. Shrader-Frechette 1994, Guston 2000, Kitcher 2001, Goldman 1999) have said nearly the same thing. But very few writers have attempted to specify exactly how scientific autonomy should be restricted. The remainder of this article will focus on this problem.
The citizens within a particular society can use the government to restrict or control social institutions, such as science. The different branches of government—legislative, executive, and judicial—can influence social institutions through statutes, regulations, and legal rulings. Government action can take place at the federal, state, or local level. While there are many different ways that the government can control social institutions, I will focus on two types of control in this essay: 1) restrictions on processes related to those institutions; and 2) restrictions on the content of those institutions.
To illustrate the difference between process restrictions and content restrictions, consider government control of speech. Process restrictions on speech include laws governing the time, place, and manner of speech (Nowak and Rotunda 2004). For example, a city might enact an ordinance requiring a permit for a parade, or a state might pass a law requiring protesters to be at least 100 feet away from an abortion clinic. Content restrictions on speech are rules that govern what speech can be about. For example, a law making it a crime to urge a crowd to commit acts of violence would be a content-based restriction on speech. Generally, a law concerning the content of speech constitutes the greatest threat to freedom of speech, because it can have a chilling effect not only on the speech that is specifically regulated but also on other forms of speech. Content-based restrictions are legitimate only to serve compelling government interests, such as the need to protect individuals or society from harm (Nowak and Rotunda 2004).
Applying this analysis to science, we can say that government restrictions may affect the process of science (i.e. how science is conducted) or the content of science (i.e. what science is about). There are many different laws and regulations that pertain to the process of science, such as rules concerning research with human or animal subjects, fabrication or falsification of data, intellectual property, allocation of research funds by government agencies, disclosure of conflicts of interest, and laboratory safety. Restrictions on the content of science include laws forbidding researchers from disclosing classified information or trade secrets, and restrictions on the use of government funds for specific types of research, such as research on human embryonic stem cells (Shamoo and Resnik 2003).
In general, restrictions on the content of science constitute a greater threat to scientific autonomy than restrictions on the process of science, because restrictions on content can adversely affect creativity, communication, and collaboration. Scientists who are wary of restrictions on the content of their work may refrain from defending, discussing, or communicating controversial or politically unpopular ideas or theories in order to avoid conflict or outside control. This can have a chilling effect on the research environment because scientists may not be willing to risk talking or writing about subjects that could lead to political repercussions. The Soviet government’s repression of Mendelian genetics was a restriction on the content of science.
Though content restrictions are usually more burdensome than process restrictions, some content restrictions are more burdensome than others. There is a difference between controlling the content of science by restricting the use of government funds, and controlling the content of science by restricting scientific communication, such as censorship or the classification of research. Restrictions on the use of government funds are not as burdensome as restrictions on communication, because scientists can often find non-government sources of funding for their research. For example, in August 2001, President George W. Bush placed significant restrictions on the use of NIH funds for research on human embryonic stem cells, but private companies and state governments have helped to overcome this funding shortage by investing hundreds of millions of dollars in this research. Other countries, such as Germany, have banned human embryonic stem cell research altogether (Resnik 2007). The Soviet government used both types of content restriction in its repression of Mendelian genetics.
There are also important differences among process restrictions: some are more burdensome than others. General rules (or guidelines) are usually less restrictive than specific rules, because general rules must be interpreted when applied to particular situations, and interpretation allows for a degree of freedom. If a rule says, “minimize risks to human subjects,” one is free to choose among many different ways of adhering to this rule. Specific rules usually do not permit this degree of freedom, because they are designed to provide detailed guidance concerning the different situations that may arise. Thus, specific rules often need to be very detailed and complicated, and this complexity can create enormous administrative burdens. For example, the Department of Health and Human Services (DHHS) has developed regulations for research with human subjects, which have been adopted by 17 federal agencies. The rule concerning the confidentiality of human research data simply states that “When appropriate, there are adequate provisions to protect the privacy of subjects and to maintain the confidentiality of data” (DHHS 2005, 45 C.F.R. 46.111(a)(7)). This rule is far simpler and easier to administer than the Privacy Rule found in the Health Insurance Portability and Accountability Act (HIPAA), which became effective in 2003. The “simplified” version of the rule contains thousands of definitions and provisions and is 84 pages long (DHHS 2006).[3]
Table 1 (below) summarizes the types of restrictions on science discussed in this article. In general, controls on the content of communication pose the greatest threat to the autonomy of science, while general rules concerning the process of science pose the smallest threat to the autonomy of science. To avoid undermining the autonomy of scientists, scientific groups, and scientific organizations, the government should influence science through process-based restrictions and avoid content-based restrictions. When the government imposes content-based restrictions, it should focus on restrictions on the use of government funds and avoid restrictions on communication. However, restrictions on communication can be justified by a very compelling government interest, such as the need to prevent a serious threat to public health or safety. For example, the government is justified in preventing the disclosure or discovery of information that would pose a significant threat to national security.
Table 1.
Government Control of Science
- Process restrictions
  - General rules for the conduct of science
  - Specific rules for the conduct of science
- Content restrictions
  - Control of the use of government funds
  - Control of communication
VI. Case Studies
I will now apply the views developed in Section V to case studies that illustrate the types of restrictions on science.
Protections for Human Subjects in Research
Since the 1970s, the United States has had regulations governing the conduct of research with human subjects. These rules pertain to research that is supported by federal agencies, such as the DHHS, as well as research that is submitted to the Food and Drug Administration (FDA) to support an application for approval of a new drug, biologic or medical device (Shamoo and Resnik 2003). These regulations require that research projects be reviewed by an Institutional Review Board (IRB). The IRB must ensure that the research adheres to seven different requirements, including reasonable relation of risks to benefits, minimization of risks, appropriate selection of subjects, adequate provisions for informed consent, documentation of consent, confidentiality protections, and data and safety monitoring (if needed). The regulations were adopted in response to a series of scandals involving research with human subjects, most notably, the Tuskegee Syphilis Study, in which the subjects (poor black men) were not told that they were in an experiment to observe the etiology of untreated syphilis. The subjects also did not receive treatment for syphilis (penicillin) when it became available. Many other countries have adopted laws or regulations that govern the conduct of research with human subjects. There are also several international codes of ethics (Shamoo and Resnik 2003).
As noted earlier, the DHHS regulations concerning protections for confidentiality and privacy are general rules that give scientists considerable latitude. Most of the other human research regulations are also general rules. Although scientists sometimes complain that the rules are too restrictive, and consumer and patient advocates complain that the rules are not restrictive enough, these rules have done a good job of protecting human research subjects while allowing valuable research to go forward. Making the regulations more complex and detailed would probably significantly hamper research with human subjects without greatly increasing the protection for human subjects. The regulations are at the appropriate level of generality.
Challenging Scientific Peer Review
For the second case, let’s return to the discussion of the House Energy and Commerce Committee’s examination of NIH grants. Although this attempt to cut funds was not successful, it was a troubling departure from the normal funding process, and it was a significant threat to the autonomy of science. There are ample opportunities for public input into NIH funding decisions through normal channels. The NIH includes public representatives and laypeople on committees that set funding priorities within the institute. Peer review committees include laypeople to provide a public perspective on the social significance of proposed research studies. The NIH holds periodic town meetings to solicit opinions from the public about research funding. Additionally, the NIH Director and Directors of different NIH institutes frequently testify before Congressional committees concerning NIH’s plans for addressing different diseases and public health problems. Holding hearings on specific grants that have already been approved by the NIH circumvents the normal funding process, disrupts approved projects, and can have a chilling effect on scientists and scientific groups. Scientists who are aware of political difficulties with some types of research, such as research on HIV and drug abuse or prostitution, may decide not to submit grant proposals in these areas, to avoid potential controversy, holdups, and so on. The attempt to rescind these grants was an unjustifiable content-based restriction on science.
Censorship of Government Science
Since 2001, administrators at United States government science agencies have censored or attempted to censor government scientists on numerous occasions (Mooney 2005). For example, in June 2003, administrators at the Environmental Protection Agency (EPA) attempted to change a report on the environment. The administrators tried to remove data pertaining to global temperatures for the last 1,000 years, eliminate any references to humanity’s role in climate change, and delete claims that global warming can have dire consequences for human health and the environment. The administrators also attempted to soften the impact of the report by inserting qualifiers, such as “might” and “may,” in various places (Union of Concerned Scientists 2004). In January 2006, officials at the National Aeronautics and Space Administration (NASA) tried to prevent NASA scientist James Hansen from communicating with the public concerning global climate change. Hansen also charged that officials at the National Oceanic and Atmospheric Administration (NOAA) had censored scientists who had attempted to discuss global climate change with the public. In February 2006, a spokesman for NASA, George Deutsch, allegedly rewrote comments by NASA scientists to downplay the seriousness of global climate change (Eilperin 2006).
These efforts to censor government scientists were not very successful, since the scientists managed to get their message across concerning global climate change. Even so, attempts at censorship are a serious threat to scientific autonomy because they interfere with freedom of expression, and, ultimately, freedom of thought and creativity. Censorship is a restriction on the content of scientific communication. Rather than engaging in censorship, government agencies should establish rules and policies for public communications. For example, most agencies require scientists to include a disclaimer that they do not represent the views of the agency or the United States government. Most agencies also have rules pertaining to submitting articles for publication, communicating with the media, and using one’s institutional affiliation. Censorship is an unjustified departure from these policies.
To be fair, one should point out that there are important differences between controlling a government scientist’s communications with the public, and the type of censorship that occurred in the Soviet Union. The scientists working at NASA and the NOAA did not face political persecution or imprisonment. They were still at liberty to express their views, if they were willing to risk loss of employment. When they agreed to work for the government, the scientists also agreed to abide by the regulations and policies set by the government, including rules relating to communications with the public. One might argue, therefore, that government scientists do not and should not enjoy the type of freedom that academic scientists enjoy. Government scientists are similar, in some ways, to scientists who work for industry, because they may face restrictions stemming from their employment contracts.[4]
While I agree that government scientists should abide by rules for communications with the public and that working for the government is different from working at a university, this does not justify the type of censorship of scientists that has been practiced by some administrators working for the George W. Bush Administration. Government scientists are not bureaucrats or hired guns: they are independent researchers who are part of the research community. They should be able to interact with other scientists on equal terms and express their views in public debates about controversial issues involving science and public policy. Clearly, government agencies also have the right to express their own opinions and send a unified message. When government scientists write or speak about issues, they should be careful to state that their opinions do not represent those of the agency for which they work, or of the United States government. If an agency disagrees with the opinions expressed by one of its scientists, then the agency is free to convey its divergent views through publications, press releases, and other forms of communication.
Threats to the Milk Supply
In November 2004, Lawrence Wein and Yifan Liu submitted an article to the Proceedings of the National Academy of Sciences (PNAS) describing a mathematical model for contaminating the U.S. milk supply with botulinum toxin. The article estimated the amount of toxin necessary to kill several hundred thousand people and identified several measures for improving the safety and security of the milk supply (Wein and Liu 2005). The editors of PNAS approved the authors’ final version of the manuscript on April 20, 2005, with an expected publication date of May 30, 2005. On May 27, 2005, the PNAS editors received a letter from Stewart Simonson, Assistant Secretary for Public Health Emergency Preparedness of the DHHS, expressing concern about the national security implications of the article and requesting that PNAS delay publication of the article. On June 7, 2005, members of the PNAS editorial team met with officials from the DHHS to discuss the article. The editors decided to go ahead with their plans to publish the article with only minor copyediting changes. The article appeared in print on June 28, 2005, along with an editorial explaining PNAS’ reasons for publishing it. According to the editorial, publication of the article was justified because the article contained important information that can be used to improve biodefense and it did not provide terrorists with any information that was not already available on the World Wide Web (Alberts 2005). The article also contained information, ideas, and methods useful to researchers who develop computer models of disease epidemics.
Although the DHHS did not succeed in stopping the publication of Wein and Liu’s article, the agency tried to do so. The government’s involvement in this decision was an attempt to restrict the content of scientific communication, the greatest type of threat to scientific autonomy. Were the government’s actions justifiable? Should the DHHS have simply ignored the article? In this case, I think the government’s involvement was justifiable, due to the potentially harmful effects of the research. Normally, the government should refrain from any involvement concerning publication decisions, but this was an exceptional case, due to the seriousness of the harm. Moreover, there are some good things that came from this case. First, the DHHS did not attempt to stop publication of the article; it only sought to delay publication. The agency could have attempted to make the research classified, but it did not. Second, the DHHS and the NAS had a productive dialogue about the issues, which may be useful to other journals. It is also worth noting that the government has taken some steps at the macro-level to deal with national security issues raised by research in the biomedical sciences. In 2004, the DHHS formed the National Science Advisory Board for Biosecurity (NSABB), which is charged with helping researchers to develop:
A system of institutional and federal research review that allows for fulfillment of important research objectives while addressing national security concerns; guidelines for the identification and conduct of research that may require special attention and security surveillance; professional codes of conduct for scientists and laboratory workers that can be adopted by professional organizations and institutions engaged in life science research; materials and resources to educate the research community about effective biosecurity; strategies for fostering international collaboration for the effective oversight of dual use biological research (NSABB 2005).
The NSABB’s role is advisory and supportive: it does not issue any mandates to researchers.
VII. Conclusion
When scientific research collides with social values, science’s right to self-governance becomes an issue of paramount concern. In this article, I have argued for the autonomy of science within a framework of public oversight. It is important to allow scientists, scientific groups, and organizations to make decisions pertaining to their activities, because excessive government control can threaten the progress of science. However, it is also important to place restrictions on scientific autonomy to prevent harm to people, society, or the environment, and to encourage research that benefits society. Government control of scientific decision-making should usually take place through restrictions on the process of science, rather than through restrictions on the content of science. When the government places restrictions on the content of science, it should avoid controlling scientific communication and should focus instead on controlling the allocation of research funds.
In closing, it is worth mentioning that scientists can avoid excess government intrusion into their work by taking responsibility for their actions and regulating their own conduct. Self-regulation is almost always preferable to regulation by an outside authority. Some strategies for self-regulation include teaching science students about their ethical, social, and legal responsibilities in research, engaging in continuing education activities that address the responsible conduct of research, and developing ethics codes and guidelines (National Academy of Sciences 2002).
Acknowledgments
This research was supported by the intramural program of the National Institute of Environmental Health Sciences (NIEHS), a part of the National Institutes of Health (NIH). It does not represent the views of the NIEHS or NIH. I am grateful to Darlene Switalski for help in preparing the manuscript.
Biography
David B. Resnik, JD, PhD, is a Bioethicist at the National Institute of Environmental Health Sciences, National Institutes of Health. He has published six books and 140 articles on ethical and philosophical issues in science and medicine.
Footnotes
1. Can science make progress toward the goal of objectivity? I believe that it can, although I will not provide a thorough discussion of this point in this article. See Kitcher (1993), Haack (2003).
2. For more on decision-making by groups and institutions, see McMahon (2001).
3. For more on the HIPAA privacy rule as it relates to research with human subjects, see Harrelson and Falletta (2007).
4. The role of government scientists is a fascinating topic that I do not have space to address in detail here. While government scientists aspire to be a part of the greater community of academic scientists, they are also civil servants who must serve the public and abide by rules set by the government.
References
- Alberts B. Modeling attacks on the food supply. Proceedings of the National Academy of Sciences. 2005;102:9737–38. doi: 10.1073/pnas.0504944102.
- Buchanan A, Brock D. Deciding for Others. 2nd ed. Cambridge: Cambridge University Press; 2004.
- Burke J. The Day the Universe Changed. 2nd ed. Boston: Back Bay Books; 1995.
- Bush V. Science: The Endless Frontier. Washington: U.S. Government Printing Office; 1945. Available at: http://www.nsf.gov/about/history/vbush1945.htm. [Accessed: January 10, 2007].
- Calamari J, Perillo J. The Law of Contracts. 4th ed. St. Paul, MN: West Group; 1998.
- Carson R. Silent Spring. New York: Fawcett Crest; 1964.
- DHHS. 45 C.F.R. 46. Protection of Human Subjects. 2005.
- DHHS. HIPAA Administrative Simplification. 2006. Available at: http://www.hhs.gov/ocr/AdminSimpRegText.pdf. [Accessed: March 16, 2007].
- Donnelly J. Universal Human Rights in Theory and Practice. Ithaca, NY: Cornell University Press; 2002.
- Dresser R. When Science Offers Salvation: Patient Advocacy and Research Ethics. New York: Oxford University Press; 2001.
- Dworkin G. The Theory and Practice of Autonomy. Cambridge: Cambridge University Press; 1988.
- Eilperin J. Censorship is alleged at NOAA. Washington Post. 2006 February 6:A7.
- Feinberg J. Harm to Others. New York: Oxford University Press; 1987.
- Goldman A. Knowledge in a Social World. New York: Oxford University Press; 1999.
- Guston D. Between Science and Politics. Cambridge: Cambridge University Press; 2000.
- Haack S. Defending Science Within Reason. New York: Prometheus Books; 2003.
- Harrelson J, Falletta J. The privacy rule (HIPAA) as it relates to clinical research. Cancer Treatment Research. 2007;132:199–207. doi: 10.1007/978-0-387-33225-3_10.
- Hossfeld U, Olsson L. From the modern synthesis to Lysenkoism, and back? Science. 2002;297:55–56. doi: 10.1126/science.1068355.
- Hull D. Science as a Process. Chicago: University of Chicago Press; 1988.
- Joravsky D. The Lysenko Affair. Reprint ed. Chicago: University of Chicago Press; 1986.
- Kaiser J. Sex studies ‘properly’ approved. Science. 2004;303:741. doi: 10.1126/science.303.5659.741a.
- Kantorovich A. Scientific Discovery. Albany, NY: SUNY Press; 1993.
- Kitcher P. The Advancement of Science. New York: Oxford University Press; 1993.
- Kitcher P. Science, Truth, and Democracy. New York: Oxford University Press; 2001.
- Kuhn T. The Essential Tension. Chicago: University of Chicago Press; 1977.
- Latour B, Woolgar S. Laboratory Life: The Social Construction of Scientific Facts. Princeton, NJ: Princeton University Press; 1986.
- Laudan L. Science and Values. Berkeley, CA: University of California Press; 1986.
- Longino H. Science as Social Knowledge. Princeton: Princeton University Press; 1990.
- McMahon C. Collective Rationality and Collective Reasoning. Cambridge: Cambridge University Press; 2001.
- Merton R. The Sociology of Science. Chicago: University of Chicago Press; 1973.
- Mooney C. The Republican War on Science. New York: Basic Books; 2005.
- National Academy of Sciences. Integrity in Science. Washington, DC: National Academy Press; 2002.
- National Science Advisory Board for Biosecurity. Welcome. 2005. Available at: http://www.biosecurityboard.gov/index.asp. [Accessed: November 21, 2006].
- Nowak J, Rotunda R. Constitutional Law. 7th ed. St. Paul, MN: West Group; 2004.
- Resnik D. The Ethics of Science. New York: Routledge; 1998.
- Resnik D. Openness vs. secrecy in scientific research. Episteme. 2006;2:135–147. doi: 10.3366/epi.2005.2.3.135.
- Resnik D. The Price of Truth: How Money Affects the Norms of Science. New York: Oxford University Press; 2007.
- Robertson J. The scientist’s right to research: a Constitutional analysis. Southern California Law Review. 1977;51:1203–78.
- Rosenberg A. The Structure of Biological Science. Cambridge: Cambridge University Press; 1985.
- Rosenberg A. Philosophy of Social Science. 2nd ed. Boulder, CO: Westview Press; 1995.
- Shamoo A, Resnik D. Responsible Conduct of Research. New York: Oxford University Press; 2003.
- Sheehan H. Marxism and the Philosophy of Science. Amherst, NY: Humanity Books; 1993.
- Shrader-Frechette K. Ethics of Scientific Research. Lanham, MD: Rowman and Littlefield; 1994.
- Union of Concerned Scientists. Scientific Integrity in Policymaking. Washington: Union of Concerned Scientists; 2004. Available at: http://www.ucsusa.org/scientific_integrity/interference/reports-scientific-integrity-in-policy-making.html. [Accessed: June 30, 2006].
- Wein L, Liu Y. Analyzing a bioterror attack on the food supply: the case of botulinum toxin in milk. Proceedings of the National Academy of Sciences. 2005;102:9984–89. doi: 10.1073/pnas.0408526102.
