Abstract
Background
As part of an empirical study investigating how life scientists think about ethical and societal implications of their work, and about life science research in general, we sought to elucidate barriers that scientists might face in considering such implications.
Method
Between 2005 and 2007, we conducted a study consisting of phone interviews, focus groups, and a national survey of life scientists at biomedical research institutions. The study population included graduate students, postdoctoral fellows, faculty, clinical instructors, and research staff. We analyzed data through qualitative and quantitative methods.
Results
In analyzing the data, we found that life scientists do, in fact, face barriers to considering ethical and societal implications of research. We categorized these barriers as falling into four broad domains: (1) lack of awareness of ethical and societal implications; (2) lack of relevance of such concerns to their specific research; (3) self-confidence in their ability to resolve such concerns; and (4) aspects of the daily practice of science itself.
Conclusions
Life science researchers experience elements inherent in their training and in the conduct of science as barriers to thinking about ethical and societal implications related to their work. These findings suggest areas in which research ethics educators, bioethicists, and the scientific community can focus their efforts to improve social and ethical accountability in research.
Keywords: Bioethics, qualitative methods, quantitative methods, empirical research
INTRODUCTION
There have been increasing appeals for scientists to integrate social and ethical considerations into their research and for their training to include guidance on how to do so. Many of these appeals come from scientists themselves (Alberts 2000; Eisen and Berry 2003; Kitcher 2004; Lane 1997; Lane 1998; Leshner 2003; Leshner 2005; Luria 1972; Reddy 2009; Rotblat 1999). Although such calls to action on the part of researchers have been voiced for decades, these requests remain important as the boundary between science and society increasingly blurs, placing research activities in a larger “context of implication” that extends beyond the laboratory and immediate applications (Gibbons 1999). Additionally, although the public is supportive of research and federal funding of that research (National Science Board 2008, 2010), some have argued that maintaining and even increasing public trust in the research enterprise requires continual attention to the ethical, policy, and social implications of research as well as the public’s concerns and hopes for research, not just from science advocates but from scientists themselves (Rogers 2003).
Scholars have argued that social responsibility should be explicitly taught to young scientists beginning at the graduate level of training (Beckwith and Huang 2005; Pimple 2002; Ziman 2001). Federal research funders (Neal, Smith, and McCormick 2008, Ch. 19) and policymakers (Boehlert 2007; Brown 1995) have also placed increased emphasis on instilling values of social accountability, collaboration, and basic ethical conduct and integrity in the research they fund. For instance, since 1992, the National Institutes of Health (NIH) has required its funded pre- and postdoctoral trainees to participate in Responsible Conduct of Research (RCR) courses (Steneck and Bulger 2007). Some scholars have drawn attention to this issue specifically in the context of clinical research ethics. Emanuel and colleagues articulated a widely cited set of seven requirements for ethical clinical research (Emanuel, Wendler, and Grady 2000), and many view this as an ideal framework for training researchers (and their institutions) to consider the broader social responsibilities of clinical researchers.
In 2005, as part of the “bench to bedside” movement, the NIH launched the Institutional Clinical and Translational Science Awards (CTSA) program; a critical part of the program’s request for applications is a research ethics component (RFA-RM-06-002). The resulting consortium of academic research ethics scholars has helped develop and implement initiatives that incorporate more attention to research ethics in training curricula and the formation of more research ethics consultation services. Moreover, as detailed in the 2007 America COMPETES Act, the National Science Foundation (NSF) has taken steps to implement an ethics education requirement for all of the trainees it supports (America COMPETES 2007). Finally, in 2009, the NIH issued guidelines asking institutions to incorporate interactive elements into the standard lecture-based courses used for RCR instruction. The guidelines emphasized the importance of in-person or face-to-face interaction with RCR faculty, adding that a solely web-based curriculum was not sufficient.
However, efforts at ethics education for researchers have met with limited success (Anderson, Horn, Risbey, et al. 2007; Antes, Wang, Mumford, et al. 2010; Smith-Doerr 2006). Not all institutions require all of their trainees (graduate students and postdoctoral fellows) to take RCR courses, and the goals of such research ethics education vary widely from institution to institution (Kalichman 2007; Kalichman and Plemmons 2007). Additionally, studies assessing the effectiveness of the NIH-required RCR courses suggest that while there is a statistically significant acquisition of knowledge about regulations and compliance, there is little improvement in ethical decision-making skills, attitudes, and behaviors (Plemmons, Brody, and Kalichman 2006; Powell, Allison and Kalichman 2007).
In addition to the shortcomings of educational efforts to develop biomedical scientists’ ethical reasoning skills, researchers themselves are often reluctant to engage in discussions about the societal implications of their work (Wolpe 2006). Scholar Paul Root Wolpe has proposed eight reasons why scientists are not keen to engage with ethical issues, ranging from feeling that they are not trained in ethics and believing that their scientific work has little to do with ethics, to thinking that others will make ethical decisions and that ethicists mostly say ‘no’ to new technologies (Wolpe 2006). Furthermore, a recent study suggested that one reason why scientists may be reluctant to discuss such concerns is the research ethics training itself; this study found that RCR courses can be detrimental to participants’ willingness to seek help in problem-solving (Antes, Wang, Mumford, et al. 2010).
Our work expands on these previous studies of RCR courses and ethical discussions by exploring, through interviews, focus groups, and surveys, whether and how life science researchers engage with ethics. We previously reported findings revealing that many life scientists do in fact find that their work raises ethical and societal concerns or questions (McCormick, Boyce and Cho 2009). Slightly more than half of the researchers included in the national survey indicated they have had concerns about social or ethical issues in their research, whereas approximately a third could foresee having such questions. Less than a fifth indicated that they anticipated definitely having ethical and societal concerns (McCormick, Boyce and Cho 2009).
We then conducted a secondary analysis of our data sets and identified an emergent broad theme, a theme that we have labeled “barriers”. Further analysis of this theme involved determining what barriers scientists may perceive or experience that make it challenging for them to include ethical and societal considerations in their daily research activities. Here, we present the results of this secondary analysis, which suggest life scientists do in fact perceive or experience a range of barriers. Our findings provide empirical evidence to support Wolpe’s suggestions as to why researchers have difficulty engaging in these issues and suggest specific mechanisms through which to encourage life scientists to incorporate ethical and societal considerations into their research.
METHODS
We combined the individual data sets into a single, larger set for this secondary analysis (Figure 1). Our preliminary secondary analysis of this larger data set identified an unexpected, common topic across all phases of our study: “barriers” our participants perceive or actually face in considering the ethical and social implications of research. Given the relevance of this topic to the broader research ethics community, we proceeded to thoroughly analyze the combined qualitative data using a standard grounded theory approach.
Figure 1.
Schematic Collection of Data from the Life Scientists, Science, and Society Project
Key: Each box represents one phase of the parent study; arrows indicate how analysis from each phase informed the next phase. The boxes are staggered in the order each phase was conducted. The final lower box represents the secondary analysis reported here. No researcher took part in more than one phase of the study.
The methods we employed have been described elsewhere (McCormick, Boyce and Cho 2009), but a brief summary is provided here. We conducted an empirical study involving interviews, focus groups, and surveys. In December 2005, we conducted a pilot survey of Stanford University scientists conducting “genetics” research. We identified these individuals by searching the websites of faculty with laboratories in the basic life sciences, examining the self-reported research focus of each laboratory for terms related to genetics, genomics, evolution, gene therapy, gene transfer, or microarray. The pilot survey focused largely on determining (1) how researchers who identified themselves as doing genetic research view ethics in science and (2) how responsive they might be to a research ethics consultation service. The aim was to gauge receptiveness to a research ethics consultation service as well as to obtain preliminary information regarding how a small sample of researchers at our institution think about ethics in the context of their research. We targeted this group of life scientists in large part because genetic and genomic research raises a number of challenging ethical, policy, and social considerations that researchers should take into account. A total of 150 surveys, each consisting of 9 questions, were mailed to Stanford graduate students, postdoctoral fellows, faculty, and research staff identified through our search of departmental websites. Most of the questions were closed-ended, but we did solicit and obtain responses to two open-ended questions.
One goal of the pilot survey was to inform the development of a semi-structured interview guide, which we designed using the data generated in this survey, particularly the responses to the open-ended questions. Interviewees were randomly selected from a larger sampling population comprised of Stanford life science researchers using a database created specifically for this study (McCormick, Boyce and Cho 2009). An invitation letter, along with a $10 bookstore gift card, was mailed to the selected graduate students, postdoctoral fellows, research staff, and faculty. Two of the authors (JM and AB), along with a research assistant, conducted the semi-structured telephone interviews, which lasted between 15–45 minutes. All interviews were audio-recorded and transcribed.
We were interested in learning whether the interviewees ever thought about ethical and societal implications related to the life sciences and, if so, what their particular thoughts on these issues were. Specific questions in the final semi-structured interview guide included the following:
Tell me about the research you do.
What are some of the ethical, social, and policy implications related to your research that you think about?
To whom do you talk about these kinds of issues?
To whom might you go for advice if you had an ethical or societal concern or question related to your research?
The interviewees’ responses to these questions often mentioned barriers faced by the participants themselves, and scientists generally, in trying to incorporate social, policy, and ethical considerations into their daily work and research activities.
The rich data generated from the interviews prompted us to expand our qualitative data set. We opted to use focus groups because this method would allow us to obtain data from researchers in conversation with their peers about the ethical and social considerations of their work. Four two-hour focus groups were conducted in September 2006. (One two-hour pilot focus group was also conducted and included in this secondary analysis.) One hundred twenty researchers were invited to participate in these focus groups; these researchers were selected by stratified random sampling from the same database developed for the telephone interviews. All participants received lunch and a $75 gift card, and they provided consent to have the focus groups audio-recorded and transcribed.
The questions in the moderator’s guide were informed by the data from our previous telephone interviews with individual researchers as well as the open-ended questions of the pilot survey. A sample of the questions is shown in Figure 2. We also explored the focus group participants’ thoughts on the role of bioethics and controversy in research (Figure 2). We modified the opening question to help stimulate the start of the discussion.
Figure 2.
Sample of Questions Used in the Focus Groups*
* The moderator used additional questions to probe responses as appropriate
To determine the generalizability of our findings from the single institution study, we conducted a national survey. The survey instrument was informed by data collected from the pilot survey, interviews, and focus groups. Drawing on the previously collected qualitative data, we were able to craft responses to closed-ended questions that reflected the language likely to be used by the researchers (respondents). Seven research universities were selected for the national survey, including Stanford and six others, chosen from 98 universities identified from a publicly available list of the top 100 NIH university awardees in 2004.
Two thousand paper surveys were mailed to faculty, research staff, instructors, postdoctoral and clinical fellows, and graduate students. We limited our sample to life science researchers from five departments at each of the seven institutions: biochemistry, biological sciences, genetics (non-clinical), pathology, and psychiatry/behavioral sciences (clinical). As we were particularly interested in the perspectives of researchers at Stanford about bioethics, bioethicists, and research ethics, we oversampled for this population, mailing 500 surveys to this group and 250 surveys to researchers at each of the remaining six institutions. A $5 gift card to a local coffee shop was enclosed with the national survey, along with a letter inviting recipients to participate in the study.
The national survey consisted of four sections that focused on issues surrounding the relationships between life science research and society; science communication and public engagement; politics and policy-making in the sciences; and demographic data. The questions were designed to identify researchers’ perceptions and attitudes toward social, ethical, and policy implications of research; bioethicists; and research ethics consultation services. Respondents were asked whether they would speak to bioethicists about a question or concern they might have; how useful they thought a research ethics consultation service might be; the relevance of discussions on the ethical, social, and policy implications of research for scientists; and what kinds of societal and ethical questions they might have or anticipate encountering in their research. This last query was designed to elicit a general sense of what respondents were considering as potential ethical, social, and policy implications for their research and science in general. The format of the survey questions was fairly diverse and included multiple choice, Likert scales, ordered category items, and ranking of choices. Several open-ended questions were designed to elicit narrative responses concerning what scientists considered to be the ethical, social, and policy issues related to their own work or research generally; these responses are not included in our secondary analysis.
For the secondary analysis reported here, the qualitative data were analyzed using consensus coding and grounded theory, an inductive process in which hypotheses emerge from the data themselves, and those same data are then systematically used to test, refine, and validate the hypotheses (Corbin and Strauss 2008). Thematic saturation was reached by the time we completed our analysis of the combined qualitative data. Transcribed recordings were imported into MaxQDA (VERBI GmbH, Germany), a qualitative data analysis software program, to facilitate systematic consensus coding by two members of the research team, as described elsewhere (Sankar, Cho, Wolpe, et al. 2006). Qualitative data from all four stages of the study were analyzed collectively in this process.
Using basic descriptive statistics, we analyzed our quantitative data in Stata v10 (StataCorp, TX), a statistical analysis application. Quantitative data from the pilot survey were analyzed separately from those of the national survey, due to differences in format and question type. The quantitative data are not reported here.
We received approval from the Stanford Institutional Review Board (IRB) for all aspects of this research.
RESULTS
For the qualitative elements of our research reported here, the response rates were as follows: 64/150 (43%) researchers responded to the pilot survey; 16/20 (80%) invited researchers were interviewed; 29/120 (24%) participated in focus groups; and 856 responded to the national survey (50% response rate)1. No researcher took part in more than one phase of the study. Descriptions of participants’ basic characteristics (e.g., position, area of research) were included in a previous publication of our results, which focused on researchers’ attitudes about the value of a research ethics consultation service as a resource (McCormick, Boyce and Cho 2009).
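As an illustrative check only (the study’s own quantitative analysis was conducted in Stata), the whole-percent response rates above follow directly from the reported counts. The short Python sketch below recomputes the three phases with explicit numerators and denominators; the national survey’s denominator is reported only as a 50% rate, so it is not recomputed here.

```python
# Illustrative sketch: recompute the whole-percent response rates for the
# three phases with explicit counts (pilot survey, interviews, focus groups).
# The national survey's denominator is given only as a 50% rate, so it is
# omitted. This is not the study's actual analysis, which used Stata.
phases = {
    "pilot survey": (64, 150),
    "interviews": (16, 20),
    "focus groups": (29, 120),
}

for phase, (responded, invited) in phases.items():
    rate = round(100 * responded / invited)  # whole-percent rate, as reported
    print(f"{phase}: {responded}/{invited} = {rate}%")
```

Each computed rate matches the figure quoted in the text (43%, 80%, and 24%, respectively).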
We report here on a theme identified in the secondary analysis of our research, namely what researchers perceive or experience as barriers to thinking about ethical and societal implications of research and to actively incorporating ethical and social considerations into their work. Researchers’ perspectives on these barriers clustered into four overarching categories: (1) lack of awareness about ethical and societal issues; (2) belief that societal and ethical concerns are not relevant to one’s specific research; (3) confidence in being able to independently handle any issues that arise; and (4) inability to meld ethical and societal considerations into the day-to-day realities of scientific practice. This last obstacle is multifaceted, comprising many elements of scientific culture, including the pressure to produce and publish results, the hierarchical structure of research laboratories, and interactions with institutional bureaucracy. Although these barriers overlap and interrelate, categorizing them in this way facilitates a more fine-grained and instructive discussion of the challenges researchers face.
Lack of Awareness
Our analysis of the qualitative data suggests that some biomedical researchers may not contemplate the ethical and societal implications of their work due to a general lack of awareness of such issues, especially as related to their own work, or because they believe that such issues rarely occur. Some respondents expressed the idea that life scientists may not be cognizant that their research potentially impacts the broader community and are unaware that their work does in fact have ethical and societal implications. As one professor offered,
“I would venture to guess that a significant percentage of researchers potentially may not be aware of the moral or ethical implications of what they are doing.” (Interview 15)
A postdoctoral fellow explained that this apparent lack of awareness may stem from confusion about what exactly qualifies as such a concern:
“I think a major issue is that many scientists, myself included, don’t have a clear idea of what an ethical issue necessarily is, and if it applies to them.” (National survey, open-ended response)
These data suggest that biomedical scientists may not reflect on ethical and societal concerns simply because they are unaware that such broader considerations may be tied to their laboratory research. Unacquainted with how to define or clearly identify such considerations, researchers may neglect the social effects of their work or too narrowly conceptualize ethical implications, and presume that their work does not raise such considerations.
Our data also support the view that, particularly in the context of one’s own research, ethical and social conundrums rarely occur. A graduate student noted,
“Something like this [research ethics consultation] would definitely be useful if the school was continuously encountering ethical questions, but it just seems that neither myself nor the one hundred or so researchers I know have run into ethical dilemmas.” (Pilot Survey, open-ended response)
The view that ethical dilemmas are extraordinary and infrequent events suggests that some researchers may conclude that the attention, time, and resources needed to consider the ethical and societal implications of their work are better directed elsewhere.
Lack of Relevance
Our data indicate that judgments about relevance may preclude some life scientists from being aware of, and thus thinking about, ethical and societal implications in the context of their own work. Issues of relevance and awareness are closely related. Indeed one may be unaware of the relevance of the research enterprise to the larger social and political world. Or, an individual may be very aware of the social and ethical implications of research generally, but believe for her particular work such considerations are not relevant.
Although over four-fifths (686/835) of those who responded to the national survey agreed that “for the most part,” ethical and societal issues are relevant to life science research, fewer than one-third (246/836) of these surveyed life scientists agreed that their own research “has direct ethical and social implications.” As an instructor we interviewed stated,
“Ethically we’re far away. We’re working on invertebrates, they don’t [even] have eyes or anything.” (Interview 10)
A research staff member echoed this view, stating consideration of societal and ethical implications was,
“not such a big deal for me because I’m working on fish and stuff; people generally don’t think of fish in the same way as they think of mammals.” (Interview 12)
In other words, many life scientists, especially those not engaged in human subjects research or those whose work is not immediately translatable into human applications, may not believe that their work affects the broader community. Scientists may be aware of ethical and social issues related to particular types of research, but assume that they do not have to worry about these matters. On the other hand, some researchers who have never engaged in clinical research, research using human subjects, or in research on socially sensitive topics (e.g., genetics of skin color pigmentation in rodents or of brain development in zebra fish) may have a very limited understanding of the ethical and social implications of any type of research.
Our data also indicate that the notion of relevancy, or lack thereof, is associated with the idea of controversy. Some scientists view “controversial” research, such as human embryonic stem cell research and genetic testing, as biological research in which there are clear societal implications, in part because of the public debate and socially sensitive nature of these topics.
High Personal Thresholds and Self-Confidence
Still, a number of respondents expressed an overwhelming sense of confidence that they themselves were well equipped to appropriately identify and address ethical and social issues related to their research. Approximately thirty-four percent (285/831) of our nationally surveyed researchers agreed with the statement that they “are capable of handling any ethical or societal issue related to [their] research on [their] own,” suggesting that some have a high bar, or threshold, for what qualifies as a “big” dilemma requiring outside help.
Our qualitative data support this finding; a small but notable portion of researchers reported a sense of confidence that they themselves are well equipped to appropriately identify and address ethical and social issues. As one graduate student noted,
“I guess I haven’t met a lot of scientists who were unsure about the ethics of what they were working on.” (Interview 3)
Another graduate student concurred in the response to an open-ended question in the pilot survey:
“I feel I am able to determine and resolve ethical and societal concerns pertaining to my research on my own.” (Pilot Survey, open-ended response)
In further analysis of all our qualitative data, we found that those respondents expressing this high level of confidence were primarily graduate students.
Our qualitative data do, however, support a belief in limits to an individual’s capabilities. Our respondents, including graduate students, tempered their strong confidence with an appreciation that external expertise may at times be necessary to confront ethical dilemmas. When this expertise is needed depends on a personal threshold above which a problem becomes complex enough to warrant outside help. As one graduate student expressed,
“If I didn’t think something was a very big issue…I would think that it would be something I could handle myself.” (Interview 1)
This statement not only reflects varying levels of confidence, but also suggests that feelings of self-confidence may be intertwined with notions of a personal threshold of what constitutes “big issues” as well as the relevance of ethical and social implications to one’s work.
Our data also indicate that the threshold level is very much a personal determination, differing not only between researchers, but also between life scientists and those individuals, such as ethicists and policymakers, who are not as invested in the research and may be able to view it in a broader context. As a postdoctoral fellow explained,
“Whether I could separate myself completely and look at [my research] completely in an unbiased way, I don’t think so.… I think it’s good that there are ethical centers and that there are people who can look at what we do from the outside, because when you’re in it, it is a little hard to always take a step out of it and understand the ethical compli-[cations]…” (Interview 6)
We postulate that scientists, who are accustomed to troubleshooting experiments at the bench, may first try to independently tackle ethical dilemmas, accordingly adjusting the stringencies of their thresholds. As this graduate student candidly noted,
“… sometimes people can be stubborn and want to figure out things themselves…” (Interview 1)
Incongruence with Scientific Practice
Our analysis of the data further indicates that the daily practice of science itself might serve as an obstacle to thinking about the ethical and social implications of research. In addition to expressing a lack of time and energy to meet their scientific responsibilities while also addressing broad social concerns, some of the researchers admitted fears that ethical considerations would hinder the scientific process. By extension, researchers were apprehensive of ethicists themselves. As one professor noted,
“My experience with some ethicists [is] finding problems where they don’t really exist.” (Pilot Survey, open-ended response)
Another professor concurred with the opinion that ethics interferes with scientific practice, stating that ethicists,
“…would come in and say, oh you can’t do that, and potentially wreck someone’s project…[it] would not be in the individual’s best interest.” (Interview 15)
The scientists in our studies seem to perceive ethical considerations as fertile ground for obstruction of research, impeding the discovery of novel findings. Some believe that the lack of expertise or familiarity with the actual process of scientific research contributes to ethicists’ tendencies “to restrict.”2 As one professor stated,
“Very quickly you get into a situation where other people…can’t really understand exactly what’s going on…the nitty-gritty of a project that one’s doing…People who aren’t actively doing the research, it’s very difficult for them to get an idea of what’s being done.” (Interview 14)
Our data suggest that some life scientists view an IRB’s review of their protocols as the only place and time for incorporating thoughtful discussions on possible ethical and social considerations. As one professor described,
“If the IRB…saw what I was planning on doing and they said it’s okay, and then somebody comes along and says, oh, no, how could you possibly do that, I’d say…I passed it by the IRB, which is what I’m supposed to do and that’s a panel of people that …aren’t gonna benefit from my research and they thought it was fine so how can I be criticized [on the ethics of it]?” (Interview 17)
Furthermore, based on our data, scientists appear to conflate ethics and the consideration of societal implications with compliance and the following of regulations. Another professor, half-jokingly but also quite seriously, further elaborated on the content of the ethical discussions that do take place:
“Usually it’s [the conversation] laced with expletives and the letters IRB.” (Focus Group 5)
In fact, researchers often feel overwhelmed by the increasing regulatory bureaucracy they must deal with daily. Our quantitative data show that approximately 53% (434/819) of researchers disagreed with the statement that “the current regulatory environment does not place too much burden [on them].” As a professor explained,
“I’m at the point where I’m feeling like if I have to fill out one more form to submit a grant that I’m just not gonna do it.” (Focus Group 5)
Beyond the burden of regulations, our respondents viewed another aspect of scientific culture, the hierarchical structure of research laboratories, as a barrier to engaging in discussions about ethics. In attempting to address an ethical dilemma, our participants did not want to overstep a more senior scientist, fearing negative repercussions. As any ethical dilemma would inherently be intertwined with the scientific experiments at hand, a postdoctoral fellow remarked,
“I would…go to my boss because that’s still fundamentally a scientific discovery.…I could potentially get into trouble with my boss if I just shared results without having consulted him first.” (Interview 6)
A second postdoctoral fellow echoed this sentiment, noting a potential hurdle to incorporating recommendations that might come from consulting a research ethicist:
“PIs [Principal Investigators] do not like/discourage involvement of outside bodies when or once ethical issues arise (a fact, not a hypothetical concern).” (Pilot Survey)
These qualitative comments on potential negative repercussions are consistent with our quantitative findings that more life scientists would seek out advice from a colleague (61% or 517/847) than advice from a trained bioethicist (26% or 217/847). Although bioethicists should possess the appropriate expertise for advising on ethical and societal questions, biomedical scientists are more comfortable asking for a fellow researcher’s assessment of a dilemma, especially if speaking to a bioethicist would bypass a senior colleague.
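The survey proportions quoted throughout this Results section (e.g., the 61% versus 26% comparison above) derive from simple counts reported in the text. The following Python sketch, illustrative only since the study’s analysis used Stata v10, recomputes each reported fraction as a percentage:

```python
# Illustrative sketch (not the study's actual Stata analysis): recompute the
# descriptive percentages reported in the Results from (numerator, denominator)
# counts taken directly from the text.
survey_items = {
    "ethical/societal issues relevant to life science research": (686, 835),
    "own research has direct ethical and social implications": (246, 836),
    "capable of handling any ethical issue on one's own": (285, 831),
    "disagree that regulations do not place too much burden": (434, 819),
    "would seek advice from a colleague": (517, 847),
    "would seek advice from a trained bioethicist": (217, 847),
}

def pct(numerator: int, denominator: int) -> float:
    """Percentage, rounded to one decimal place."""
    return round(100 * numerator / denominator, 1)

for item, (n, d) in survey_items.items():
    print(f"{item}: {n}/{d} = {pct(n, d)}%")
```

Each recomputed value is consistent with the rounded figure reported in the corresponding passage (e.g., 517/847 yields 61.0% and 217/847 yields 25.6%, matching the 61% and 26% quoted above).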
Finally, the fact that scientists by nature tend to be reductionists may impede the inclusion of ethical and societal considerations within daily scientific routines. While researchers may acknowledge implications broadly, they have a tendency in their daily research activities to focus narrowly on their own data. As this postdoctoral fellow quite bluntly commented,
“We’re informaticists. We… think in… numbers, and…, we see all this very much as numbers, and it’s true that, and maybe we should spend more time on this, we don’t always think of the ethical implications of where this is going because we see this much more as data and we are removed from it [ethical and social implications].” (Interview 6)
For many biomedical researchers, the everyday practice and culture of science itself is seen as incongruent with thinking about ethical and societal implications of research. Between the pressure to produce and the structure of laboratories, researchers may have little time to engage with ethical considerations, and few means of doing so with which they are comfortable.
DISCUSSION
The findings from the secondary analysis of the qualitative data are consistent with our previously published analysis of the quantitative data, indicating that although some researchers are aware of social and ethical implications, and do recognize that they will encounter issues in this regard, a good proportion believe otherwise (McCormick, Boyce and Cho 2009). Moreover, the barriers of lack of awareness and lack of relevance together provide further qualitative support for Wolpe’s second deterrent to thinking about ethics: “My scientific work has little to do with ethics” (Wolpe 2006).
Whether unaware of the ethical and societal consequences of research or conceptualizing such implications too narrowly, scientists may assume that they themselves have not and will not encounter such issues. Similarly, believing that their work does not immediately or directly affect society, scientists may discard ethical considerations as irrelevant. In other words, scientists who do not view ethical considerations as concerns closely connected to their research excuse themselves from considering such issues within the frame of their scientific planning. This barrier to thinking about ethical issues can also be conceptualized in terms of what we have called “perceived proximity,” or how close scientists perceive they are to public concerns, which in turn affects the emphasis they place on contemplating social issues. We have further elaborated on this theory of perceived proximity and social accountability elsewhere (Ladd, Lappe, McCormick, et al. 2009).
Conversely, self-confidence represents another barrier to thinking about ethical and societal implications of research. Some of our researchers were highly confident that they could handle ethical dilemmas independently, perhaps because they are accustomed to troubleshooting scientific problems. Our quantitative data suggest this attitude is prevalent across all career positions, while our qualitative data highlight in particular the self-confidence of graduate students. The small sample of interviewees and focus group participants does not allow us to generalize or draw definitive conclusions about graduate students. However, the strong statements made by these students might stem from a lack of experience with complex ethical or social issues raised by research.
This attitude within our sample may be grounded both in the young students’ inexperience and in the very nature of graduate schooling. A Ph.D. dissertation requires a deep, but narrow, focus on specific hypotheses within the laboratory without the same emphasis on exploring the broader social context of research. Thus, some graduate students may not give much thought to the societal implications related to their work, assuming any such concerns that might arise are minor. This potential “trivialization” may foster a sense of confidence that they, on their own, can resolve any dilemmas that might occur, whether in the short term or stemming from eventual downstream applications of their research. These findings are consistent with a recent study suggesting that the ethical training component(s) of scientists’ careers could in fact lead to overconfidence, “self-enhancement bias,” and a decrease in help-seeking behavior (Antes, Wang, Mumford, et al. 2010). Another study indicated that researchers believe that intuition is an appropriate guide to solving moral dilemmas, and that their own moral compasses are well calibrated (Deming, Fryer-Edwards, Dudzinski, et al. 2007).
Our secondary analysis of the qualitative data revealed that this self-confidence is tempered by recognition that some ethical concerns may be so complex that outside advice is necessary. At what point a dilemma qualifies as “big” depends on an individual threshold, a level at which the researcher is hesitant to independently tackle the ethical problem. This concept of thresholds echoes scholar Nicholas Steneck’s proposed bell curve of research behaviors: at either tapered end are problems which are generally agreed upon as misconduct or acceptable, while the largest area in the center of the distribution represents “questionable research practices” (Steneck 2006). In the case of thinking about ethical dilemmas, the largest area of the curve houses problems that may be considered to be either “easily solvable” by the individual scientist (i.e., independently) or as necessitating a second opinion, depending on the researcher’s personal threshold for addressing ethical debates. The course of action also depends on what researchers view as within the scope of actionable issues (e.g., perceiving control over issues of research misconduct, but not over the broad social impacts of research once it moves beyond the confines of the laboratory).
Nonetheless, even if an issue is deemed to merit outside expertise, there is an additional barrier to seeking bioethicists’ advice. Our respondents worried, as Wolpe suggested, that “ethicists mostly say ‘no’ to new technology” (Wolpe 2006). In other words, ethics itself was viewed as a burden, as a detriment to their scientific experiments, by our participants. The quantitative findings also support our respondents’ verbalized wariness of ethicists impeding scientific projects and imply the importance of trust and established collegial relationships.
Our data situate this obstacle to contemplating ethical issues within the framework of scientific practice itself: aspects of scientific culture comprise a barrier to thinking about ethics. In the highly competitive atmosphere in which many researchers work, there is ever-mounting pressure to produce results and publish novel findings. Our respondents felt that ethics and bioethicists might stall their experiments, hindering their ability to be successful, productive members of the scientific community. Believing that bioethicists will delay or preclude research, scientists are reluctant to consult them in a fast-paced environment that rewards the production of novel results over solving ethical dilemmas. The former leads to scientific publications, which are an inherent part of securing degrees, jobs, and promotions; obtaining funding for experiments; and thus surviving in an often competitive research environment (Kennedy 1997 Ch 1; Neal, Smith and McCormick 2008 Ch 6; Rubin 2006).
Furthermore, the perception that ethics is synonymous with regulatory compliance creates a situation in which many researchers may narrowly construe the ethical and social implications of research. With this in mind, the role of ethicists and ethical, legal, and social implications (ELSI) researchers is viewed as watching over scientists, to catch anyone doing something “wrong,” rather than to help them think through broader concerns or how to do things better: how scientific findings are situated in a larger context, what the potential positive and negative implications of the findings might be, how to tailor research designs to minimize harms and maximize benefits, or even how to enable the research to go forward more smoothly and quickly.
Additionally, our respondents noted that the established hierarchy of research laboratories may discourage consultation with anyone with expertise beyond the domain of the life sciences. In other words, junior scientists are wary of negative repercussions if they bypass senior-level scientists. The theme of hierarchy and its impact on conduct was also noted in a recent study of biomedical researchers (Geller, Boyce, Ford, et al. 2010). Therefore, discussions of social concerns remain within a limited scientific context rather than moving to a platform focused on ethical frameworks and thinking. This scientific context encourages life science researchers to be reductionists, to focus narrowly on particular hypotheses with the goal of finding a solution using the tools of science and the scientific method. The specialized expertise and knowledge required to be successful often leaves little room for considering ethical and societal implications related to scientific questions, processes, or potential outcomes. This is not a fault of individual scientists, but rather a product of how they are traditionally trained.
Our conclusions have important implications for research ethics education, consultation services within academic institutions, and perhaps even policies governing metrics for professional success. Our findings support Wolpe’s observations of scientists’ relationship with ethics (Wolpe 2006). However, our results further suggest that an increased educational effort framing ethical and social considerations as an integral part of scientific work could be incorporated and more strongly emphasized, not only from the very start of the academic career trajectory, but throughout scientists’ professional development. In an ongoing review of data from our national survey, we found that a majority of researchers (66% or 551/835) agreed that more training would be useful to “deal with” ethical and societal implications of their research.
In addition, trainees and junior scientists may need to be given “permission” to consider ethical and social issues through modeling by senior scientists, and even senior scientists might require some educational awareness or “re-training” in the context of ethical and social implications related to research. To firmly integrate social responsibility and consideration of ethical issues into the scientific process, both institutionally and at large, the scientific community and academic leadership could formally recognize such efforts by faculty as essential for a successful scientific career and for the success of science in general. These educational and institutional changes can help to create socially responsible (Kitcher 2004) or “civic scientists”, (Lane 1997; Lane 1998), who are both cognizant of the effects of their work on society and familiar with engaging with external experts to dissect ethical issues. Early integration and sustainment of this societal focus may also help to make junior scientists more comfortable with speaking to bioethicists and make senior scientists more amenable to such consultations.
These findings are relevant to the role of research ethics consultation services, which are increasing in number, especially at institutions engaging in clinical research (Cho, Tobin, Greely, et al. 2008; Cho, Tobin, Greely, et al. 2008; De Melo-Martín, Palmer and Fins 2007). As more Clinical and Translational Science Awards (CTSAs) are created over the next few years, it is likely that such services will take on more prominence in the clinical and translational research arena. As part of the CTSA program, NIH appears to be placing an increased emphasis on community engagement and inclusion of research ethics considerations in the scientific process (CTSA Consortium’s Community Engagement Key Function Committee, CTSA Community Engagement Workshop Planning Committee 2009; National Center for Research Resources, National Institutes of Health, U.S. Department of Health and Human Services; CTSA Consortium; NCRR PROGRESS). Based on what is presented here and in previous work (Ladd, Lappe, McCormick, et al. 2009; McCormick, Boyce and Cho 2009), we suggest that while life scientists generally see a need for research ethics consultation services and do agree that ethical and social issues can arise as a result of life science research, there are definite barriers to directly tackling these issues in their own work. Additional data from our national survey suggest that there are life scientists (34% or 280/831) who recognize that they cannot necessarily handle ethical and social questions arising from their work on their own.
For research ethics consultation services to succeed in helping life scientists become proactive problem solvers with regard to ethical and societal concerns, those providing these services should be cognizant of our findings concerning potential barriers, many of which may be culturally ingrained in the scientific community. Research ethics consultants and their institutions may therefore need to devise approaches that work in concert with the current constraints of scientific culture in order to more effectively facilitate scientists’ incorporation of ethical and social considerations at every stage of the research pipeline. For instance, the research ethics community could couple research ethics consultation and other mechanisms for increasing research awareness with study design, biostatistics, and informatics consultation services. As researchers are already accustomed to utilizing biostatistics consultation, such integration would facilitate researchers’ (a) ability to identify ethical and social issues relevant to their own work; (b) familiarity with ethical and societal issues commonly associated with life science research; and (c) incorporation of ethical and social considerations into their regular research process. Successful integration as proposed here would require institutional support and acceptance of the view that successful scientific research is based not only on sound study design and analyses, but also on thoughtful consideration of the potential ethical, social, and policy implications.
Limitations
Our sample did not include all academic research institutions in the United States or all scientific disciplines, and it was limited by our use of publicly available websites for participant selection. However, given the representation of a range of positions and areas of research, and, for our national survey, different types of institutions, we believe that our findings provide initial data on what scientists perceive and experience as barriers to ethical and societal considerations. Finally, three of our four research instruments included explicit queries about research ethics consultation services and the likelihood of using them. As such, an alternative interpretation of our findings may be that these barriers are only encountered in the context of using such a service. Cognizant of this limitation, we focused only on responses for which the surrounding context strongly supported an interpretation that the statements reflected consideration of ethical and societal implications more generally.
Conclusions
The parent multi-stage study was designed to explore how life scientists think about the ethical and societal implications of their own work or of biomedical research in general. A secondary analysis of the data from our parent study, treated as a single combined unit, revealed a specific theme important to engaging researchers in discussions about ethical, policy, and social considerations relevant to the research they do. Here, we have presented what life scientists report to be several barriers to thinking about such issues. We categorized these obstacles as: (1) lack of awareness; (2) lack of relevance; (3) high threshold and self-confidence; and (4) aspects of scientific practice and culture itself.
These data provide a starting point for further investigation and may serve as a guide in designing additional research instruments. For example, a next step might be to conduct a large national survey focused on querying life scientists about the categories of barriers we have identified. Given the global nature of science, determining how researchers from other countries experience (or do not experience) these barriers would be very insightful. Finally, exploring each of the barriers we describe in more detail with different disciplines within the life sciences would provide empirical data that could help inform the creation of mechanisms to facilitate the ability of researchers to actively consider ethical, policy, and social issues of their research and take steps to address and minimize negative implications of this research.
Acknowledgments
The authors wish to thank Ravi Garg, Anny Lin, Mariel Bailey, Cassia Wells, and Sujana Bhattacharyya. The authors also wish to acknowledge support by grant #P50 HG003389 from the US National Institutes of Health, National Human Genome Research Institute. Finally we thank the anonymous reviewers and editorial staff for their insightful comments.
Footnotes
Of the 2,000 surveys that were mailed, 293 were not delivered, leaving 1707 surveys that were assumed to have reached the intended recipient. In addition, not every participant responded to all questions.
“I would have a worry that you would say no, you can’t do this, or in order to do this you must do this or that or the other thing, and the function would be to restrict the possibilities of what a researcher could do. That’s my feeling… because I think what you would be interested in doing is saying, you’ve got an ethical problem here or dilemma here.…If I had perceived that that was the situation previously,…I guess I already would have understood that. But if it was something I hadn’t really understood or something I disagree with as an ethical problem or I personally didn’t have an ethical problem with it, then I might think, well, gee, now I’ve got a problem with doing the project that I didn’t see any kind of problem with.” (Interview 17, Professor).
Contributor Information
Jennifer Blair McCormick, Mayo Clinic.
Angie M. Boyce, Cornell University.
Jennifer M. Ladd, Stanford University.
Mildred Cho, Stanford University.
References
- U.S. Department of Health and Human Services. RFA-RM-06-002: Institutional Clinical and Translational Science Award. (http://grants.nih.gov/grants/guide/rfa-files/RFA-RM-06-002.html). Released October 12, 2005. Expired March 28, 2006. Accessed April 27, 2011.
- America COMPETES Act, Public Law No. 110–69, Title VII, Sec. 7009. http://ecip.loc.gov/cgi-bin/bdquery/z?d110:HR02272:@@@L&summ2=m&2007
- Alberts B. Science and Human Needs. National Academy of Sciences’ 137th Annual Meeting; Washington, D.C.; 2000.
- Anderson MS, Horn AS, Risbey KR, et al. What Do Mentoring and Training in the Responsible Conduct of Research Have to Do with Scientists’ Misbehavior? Findings from a National Survey of NIH-Funded Scientists. Academic Medicine. 2007;82(9):853–860. doi:10.1097/ACM.0b013e31812f764c.
- Antes A, Wang X, Mumford M, et al. Evaluating the Effects That Existing Instruction on Responsible Conduct of Research Has on Ethical Decision Making. Academic Medicine. 2010;85:519–526. doi:10.1097/ACM.0b013e3181cd1cc5.
- Beckwith J, Huang F. Should We Make a Fuss? A Case for Social Responsibility in Science. Nature Biotechnology. 2005;23(12):1479–1480. doi:10.1038/nbt1205-1479.
- Boehlert SL. The Role of Scientists in Policymaking. Paper presented at: AAAS-CSPO S&T Policy Review: Highlights of the 2007 Forum on S&T Policy; August 21, 2007.
- Brown GE. What Is the Future for the Physical and Mathematical Sciences? Paper presented at: AAAS Annual Meeting; August 6, 1995; Atlanta, GA.
- Cho MK, Tobin SL, Greely HT, et al. Research Ethics Consultation: The Stanford Experience. IRB: Ethics & Human Research. 2008;30(6):1–6.
- Cho MK, Tobin SL, Greely HT, McCormick JB. Strangers at the Benchside: Research Ethics Consultation. American Journal of Bioethics. 2008;8(3):4–13. doi:10.1080/15265160802109322.
- Corbin J, Strauss A. Basics of Qualitative Research. 3rd ed. Thousand Oaks, CA: Sage Publications; 2008.
- CTSA Consortium’s Community Engagement Key Function Committee, CTSA Community Engagement Workshop Planning Committee. Researchers and Their Communities: The Challenge of Meaningful Community Engagement. 2009. http://www.ctsaweb.org/uploadedfiles/Best%20Practices%20in%20Community%20Engagement_Summary_2007-2008.pdf
- De Melo-Martín I, Palmer LI, Fins JJ. Viewpoint: Developing a Research Ethics Consultation Service to Foster Responsive and Responsible Clinical Research. Academic Medicine. 2007;82(9):900–904. doi:10.1097/ACM.0b013e318132f0ee.
- Deming N, Fryer-Edwards K, Dudzinski D, et al. Incorporating Principles and Practical Wisdom in Research Ethics Education: A Preliminary Study. Academic Medicine. 2007;82:18–23. doi:10.1097/01.ACM.0000250028.51329.6b.
- Eisen A, Berry RM. The Absent Professor: Why We Don’t Teach Research Ethics and What to Do About It. American Journal of Bioethics. 2003;2(4):38–49. doi:10.1162/152651602320957556.
- Emanuel EJ, Wendler D, Grady C. What Makes Clinical Research Ethical? JAMA. 2000;283(20):2701–2711. doi:10.1001/jama.283.20.2701.
- Geller G, Boyce A, Ford D, Sugarman J. Beyond “Compliance”: The Role of Institutional Culture in Promoting Research Integrity. Academic Medicine. 2010;85:1296–1302. doi:10.1097/ACM.0b013e3181e5f0e5.
- Gibbons M. Science’s New Social Contract with Society. Nature. 1999;402(6761 Suppl):C81–C84. doi:10.1038/35011576.
- Kalichman MW. Responding to Challenges in Educating for the Responsible Conduct of Research. Academic Medicine. 2007;82(9):870–875. doi:10.1097/ACM.0b013e31812f77fe.
- Kalichman MW, Plemmons DK. Reported Goals for Responsible Conduct of Research Courses. Academic Medicine. 2007;82(9):846–852. doi:10.1097/ACM.0b013e31812f78bf.
- Kennedy D. Academic Freedom, Academic Duty. In: Academic Duty. Cambridge, MA: Harvard University Press; 1997.
- Kitcher P. Responsible Biology. BioScience. 2004;54(4):331–336.
- Ladd JM, Lappe MD, McCormick JB, Boyce AM, Cho MK. The “How” and the “Whys” of Research: Life Scientists’ Views of Accountability. Journal of Medical Ethics. 2009. doi:10.1136/jme.2009.031781.
- Lane N. Double Helixes and Double-Edged Swords. 9th International Conference on Genes, Gene Families and Isozymes; 1997.
- Lane N. The Civic Scientist and Science Policy. 23rd Annual AAAS Colloquium on Science and Technology Policy; Washington, D.C.; 1998.
- Leshner AI. Public Engagement with Science. Science. 2003;299(5609):977. doi:10.1126/science.299.5609.977.
- Leshner AI. Where Science Meets Society. Science. 2005;307(5711):815. doi:10.1126/science.1110260.
- Luria SE. Slippery When Wet: Being an Essay on Science, Technology, and Responsibility. Proceedings of the American Philosophical Society. 1972;116(5):351–356.
- McCormick JB, Boyce AM, Cho MK. Biomedical Scientists’ Perceptions of Ethical and Social Implications: Is There a Role for Research Ethics Consultation? PLoS ONE. 2009;4(3):e4659. doi:10.1371/journal.pone.0004659.
- National Center for Research Resources, National Institutes of Health, U.S. Department of Health and Human Services. Clinical and Translational Science Awards, Advancing Scientific Discoveries Nationwide to Improve Health: Progress Report 2006–2008. NIH Publication No. 09-7404. Washington, DC: US Government Printing Office; July 2009.
- National Science Board. Science and Engineering Indicators 2008. Vol. 1. NSB-08-01. Arlington, VA: National Science Foundation; 2008.
- National Science Board. Science and Engineering Indicators 2010. Vol. 1. NSB-10-01. Arlington, VA: National Science Foundation; 2010.
- Neal HA, Smith TL, McCormick JB. Grand Challenges for Science and Society. In: Beyond Sputnik: U.S. Science Policy in the Twenty-First Century. Ann Arbor, MI: The University of Michigan Press; 2008.
- Pimple KD. Six Domains of Research Ethics: A Heuristic Framework for the Responsible Conduct of Research. Science and Engineering Ethics. 2002;8(2):191–205. doi:10.1007/s11948-002-0018-1.
- Plemmons DK, Brody SA, Kalichman MW. Student Perceptions of the Effectiveness of Education in the Responsible Conduct of Research. Science and Engineering Ethics. 2006;12(3):571–582. doi:10.1007/s11948-006-0055-2.
- Powell S, Allison M, Kalichman MW. Effectiveness of a Responsible Conduct of Research Course: A Preliminary Study. Science and Engineering Ethics. 2007;13(2):249–264. doi:10.1007/s11948-007-9012-y.
- Reddy C. Scientist Citizens. Science. 2009;323(5920):1405. doi:10.1126/science.1173003.
- Rogers PG. Scientists, It’s Time to Speak Up. The Scientist. 2003;17(15):8.
- Rotblat J. Science and Humanity in the Twenty-First Century. (http://nobelprize.org/nobel_prizes/peace/laureates/1995/presentation-speech.html). Delivered September 6, 1999. Accessed April 27, 2011.
- Rubin GM. Janelia Farm: An Experiment in Scientific Culture. Cell. 2006;125(2):209–212. doi:10.1016/j.cell.2006.04.005.
- Sankar P, Cho M, Wolpe P, Schairer C. What Is in a Cause? Exploring the Relationship between Genetic Cause and Felt Stigma. Genetics in Medicine. 2006;8:33–42. doi:10.1097/01.gim.0000195894.67756.8b.
- Smith-Doerr L. Learning to Reflect or Deflect? U.S. Policies and Graduate Programs’ Ethics Training for Life Scientists. In: Frickel S, Moore K, editors. The New Political Sociology of Science: Institutions, Networks, and Power. Madison, WI: University of Wisconsin Press; 2006.
- Steneck NH. Fostering Integrity in Research: Definitions, Current Knowledge, and Future Directions. Science and Engineering Ethics. 2006;12(1):53. doi:10.1007/pl00022268.
- Steneck NH, Bulger RE. The History, Purpose, and Future of Instruction in the Responsible Conduct of Research. Academic Medicine. 2007;82(9):829–834. doi:10.1097/ACM.0b013e31812f7d4d.
- Wolpe PR. Reasons Scientists Avoid Thinking About Ethics. Cell. 2006;125(6):1023–1025. doi:10.1016/j.cell.2006.06.001.
- Ziman J. Getting Scientists to Think About What They Are Doing. Science and Engineering Ethics. 2001;7(2):165–176. doi:10.1007/s11948-001-0038-2.