Author manuscript; available in PMC: 2015 Sep 22.
Published in final edited form as: Acad Med. 2010 Mar;85(3):519–526. doi: 10.1097/ACM.0b013e3181cd1cc5

Evaluating the Effects that Existing Instruction on Responsible Conduct of Research Has on Ethical Decision Making

Alison L Antes 1, Xiaoqian Wang 2, Michael D Mumford 3, Ryan P Brown 4, Shane Connelly 5, Lynn D Devenport 6
PMCID: PMC4578657  NIHMSID: NIHMS717604  PMID: 20182131

Abstract

Purpose

To examine the effects that existing courses on the responsible conduct of research (RCR) have on ethical decision making by assessing the ethicality of decisions made in response to ethical problems and the underlying processes involved in ethical decision making. These processes included how an individual thinks through ethical problems (i.e., meta-cognitive reasoning strategies) and the emphasis placed on social dimensions of ethical problems (i.e., social–behavioral responses).

Method

In 2005–2007, recruitment announcements were made, stating that a nationwide, online study was being conducted to examine the impact of RCR instruction on the ethical decision making of scientists. Recruitment yielded contacts with over 200 RCR faculty at 21 research universities and medical schools; 40 (20%) RCR instructors enrolled their courses in the current study. From those courses, 173 participants completed an ethical decision-making measure.

Results

A mixed pattern of effects emerged. The ethicality of decisions did not improve as a result of RCR instruction and even decreased for decisions pertaining to business aspects of research, such as contract bidding. Course participants improved on some meta-cognitive reasoning strategies, such as awareness of the situation and consideration of personal motivations, but declined for seeking help and considering others’ perspectives. Participants also increased in their endorsement of detrimental social–behavioral responses, such as deception, retaliation, and avoidance of personal responsibility.

Conclusions

These findings indicated that RCR instruction may not be as effective as intended, and in fact, may even be harmful. Harmful effects might result if instruction leads students to overstress avoidance of ethical problems, be overconfident in their ability to handle ethical problems, or overemphasize their ethical nature. Future research must examine these and other possible obstacles to effective RCR instruction.


Responsible conduct of research (RCR) education, the label commonly applied to research ethics instruction for scientists, is receiving increasing attention from the scientific community as a potential remedy for research misconduct. This attention is evident in the growing number of resources available from organizations such as the Responsible Conduct of Research Education Committee (RCREC)1 and the Office of Research Integrity (ORI),2 as well as in mandates for RCR instruction from funding agencies, including the National Institutes of Health and National Science Foundation.3,4 Another sign of the increasing focus on RCR instruction is the convening of university administrators and RCR instructors and researchers at the 1st Biennial RCR Education, Instruction, and Training conference, held in April 2008 and sponsored by the ORI and Washington University.5 One of the most common concerns expressed at this conference was the reality that, although RCR instruction is being adopted to manage the integrity of scientific work, little evidence is available to demonstrate the effectiveness of such instruction. Needless to say, evaluation evidence is critical for determining whether these training efforts are indeed fruitful and for guiding future directions in instructional development. Therefore, the purpose of this study was to provide some initial evidence bearing on the effectiveness of existing RCR courses.

Not only is evaluation evidence limited, but when evaluation studies are conducted, outcome measures vary by study, making it difficult to compare conclusions about effectiveness across studies.6 Recent meta-analyses of scientific and business ethics courses concluded that ethics instruction as currently conducted is minimally effective.6,7 These meta-analyses, however, suffer from the limitation that instructional effects were aggregated across a wide variety of outcome measures. The present study examines the effects of existing RCR instruction using a single index across courses.

Evaluation efforts must first specify the intended outcomes of instruction. Without clear delineation of the desired outcomes, measuring those outcomes and interpreting them with respect to course effectiveness are not possible.8 A number of outcomes might be suggested: (1) knowledge, particularly of RCR principles and guidelines; (2) skills, such as solving problems and making ethical decisions; and (3) attitudes, such as believing that research ethics are important and that researchers have a responsibility to behave ethically.9 Ultimately, however, the goal of ethics instruction is to change the behavior of those trained.

Because of the distal nature of behavioral outcomes and the difficulty of measuring real-world behaviors, behavioral outcomes are rarely assessed as training outcomes.10 In fact, one commonly applied assessment approach is to ask participants to rate their liking of, or satisfaction with, the course. Although this assessment approach is relatively easy and provides useful information about trainee reactions, it does not directly assess actual learning. Thus, we proposed measuring ethical decision making, as this is a proximal outcome that will likely influence real-world behavior. This assessment approach captures whether and how trainees apply the learned knowledge and skills needed to make decisions in response to ethical problems.11 Hence, the current study examined several components of ethical decision making, including the ethicality of decisions made in four domains of research and two types of underlying social–cognitive processes driving ethical decision making, specifically meta-cognitive reasoning strategies and social–behavioral responses.

Prior research has identified four key behavioral domains of research ethics: data management, study conduct, professional practices, and business practices.12 Data management pertains to handling, storing, sharing, and reporting data. Study conduct relates to the treatment of human and animal subjects, adherence to institutional review board guidelines, and maintenance of confidentiality and anonymity. Professional practices concern adherence to professional commitments, mentoring, and treatment of collaborators. Finally, business practices pertain to contract and grant bidding, the use of physical resources, conflicts of interest, and laboratory management. Because the effectiveness of RCR instruction might vary across these domains, it is important to consider the multidimensionality of ethical decision making in research and to assess the effectiveness of instruction across these domains.

In addition, examining social–cognitive process variables underlying ethical decision making can provide additional insights into the effectiveness of RCR instruction and ultimately provide more specific feedback regarding the impact of instruction. In the present study, we assessed seven meta-cognitive reasoning strategies known to be critical to ethical decision making (see Table 1).11,13–15 Meta-cognitive reasoning strategies are the mental processes engaged in by an individual to actively think about the situation and work through the ethical problem.13,14 Use of these strategies involves focused analysis of the problem, reflection on and consideration of individuals involved, and anticipation of possible immediate and long-term consequences. Specifically, one might question one’s own perceptions and motivations in a situation or anticipate the likely impact of one’s decisions on oneself and others involved. It is particularly important to consider the impact of instruction on these meta-cognitive reasoning strategies because they are transferable training outcomes that can be applied across situations and ethical problems.

Table 1.

Social–Cognitive Processes Underlying Ethical Decision Making*

Social–cognitive variables Construct definition
Meta-cognitive reasoning strategies
  Recognizing circumstances Awareness of relevant principles, individuals involved, key goals, and critical causes of the problem
  Seeking help Asking for advice from an objective individual, seeking institutional resources, or considering what others have done in similar situations
  Questioning one’s judgment Considering that one’s interpretation of the problem and potential decisions might be biased or based on faulty assumptions
  Anticipating consequences Considering possible outcomes, including the likely short- and long-term consequences of possible decision alternatives
  Managing emotions Assessing and regulating emotional responses to the problem that can hinder objectivity
  Analyzing personal motivations Considering deeply rooted personal motivations, values, and goals and how they might affect decision making in the situation
  Considering others Recognizing and being mindful of others’ perceptions and concerns and the likely impact of one’s actions on others
Social–behavioral responses
  Involvement of others Choosing responses that require others to be involved in decision making or implementing a decision
  Retaliation Responding in an aggressive, vengeful, or spiteful manner
  Deception Misleading or hiding the truth from others
  Active involvement Active engagement in responding to the situation rather than passively doing nothing or waiting it out
  Avoidance of responsibility Diffusing, avoiding, or deflecting personal responsibility for actions or decisions
  Selfishness Responding in a way to promote personal gain or aggrandizement
  Closed-end decision making Responding in such a way as to curtail the possibility for subsequent options
*

As presented in the literature.11,13–15

In addition to influencing how people think through ethical problems, RCR instruction might affect how individuals approach the social dimensions of ethical problems, such as whether an individual accepts personal responsibility for his or her actions or whether an individual is honest with others. Thus, we examined seven social–behavioral responses that are common to ethical problems (see Table 1). The importance of considering social dimensions becomes particularly clear when one considers the social nature of scientific research, which requires interaction with many other people, from collaborators to research participants.14 In fact, many ethical guidelines emphasize the social dimensions of scientific work. Overall, the intent of this study was to use a reliable and valid measure across existing RCR courses to assess the effectiveness of RCR instruction in terms of critical learning outcomes, including the ability to reason through ethical problems, navigate the social dimensions of ethical problems, and ultimately make ethical decisions.

Method

Courses and participants

The University of Oklahoma’s institutional review board reviewed and approved this study. Participation in this study was open to any RCR course being conducted at any institution in the United States. Participant recruitment began in the fall of 2005 and continued concurrently with data collection throughout the spring of 2007. Recruitment announcements stated that a nationwide, online study was being conducted to examine the impact of RCR instruction on the ethical decision making of scientists. Instructors of RCR courses at all levels (e.g., graduate student and faculty) in the biological, health, or social sciences were encouraged to contact the research team via a specified email address to acquire additional information and enroll in the study.

The study recruitment effort yielded over 200 contacts with faculty at 21 institutions across the nation. About 40 (20%) instructors agreed to enroll in the study, and ultimately 21 of these courses yielded data from course participants, for a total of 173 RCR students. With the exception of four courses at private universities, the RCR courses took place at public universities. According to the Carnegie Foundation classifications,16 with the exception of one small medical school and one medium four-year research university, the universities were large four-year research universities. Additionally, these research universities were designated as institutions with high or very high research activity. In terms of location, the 21 responding institutions were distributed across regions of the U.S., including four on the West Coast, six on the East Coast, and several from the Midwest (n = 3), South (n = 4), and Southwest (n = 4).

Twelve instructors from the 21 courses returned the background data questionnaire. On average, the instructors were 57 (SD = 7.1) years of age, and the majority of instructors reported having PhD degrees, with the remaining having MD, DVM, or EdD degrees. Most instructors reported that their current field of study was biology or medicine, and all instructors, with the exception of two, reported teaching between 2 and 15 research ethics courses in the past. All instructors but one reported being awarded at least one contract or grant during their career and attending at least one conference within the last two years. With the exception of one instructor, all reported authoring at least one publication, with the majority authoring between 14 and 21 publications. Thus, the instructors were active mid- to senior-level faculty.

Instructors from all courses completed a 68-item course content questionnaire surveying the key characteristics of their courses (for instance, the target audience and instructional approach). This survey revealed that the courses were typically semester-long, required courses aimed primarily at graduate students in biomedical fields. Instructors reported that their courses were moderately time intensive, and all reported covering the nine RCR topics (e.g., human subjects and publication practices) recommended by the ORI.2 Instructors reported using moderately complex cases and both individual and group activities. They also reported discussing, to a moderate extent, approaches for solving ethical problems or making ethical decisions. Differences across courses in scores on these scales were not statistically significant (p > .05). Thus, the courses were quite similar in their content and delivery.

The course participants consisted of 173 individuals, with 131 participants identifying themselves as graduate students pursuing PhD or MD degrees. The remaining participants identified themselves as junior-level (n = 10), mid-level (n = 1), or senior-level (n = 6) professionals, and 25 participants did not report their status. This composition of course participants is typical for RCR courses, as they generally focus on PhD-level graduate students. The course participant background information also suggested that the average course participant was reasonably representative of a typical RCR course participant at U.S. research universities.

Specifically, the average course participant was 27 years old (SD = 4.7); the majority were female (58%), 35% were male, and 7% did not report gender. Sixty percent of participants identified themselves as Caucasian, 14% as Asian, 5% as Hispanic, 4% as African American, and 8% as Middle Eastern, Native American, or Other (9% did not report). The majority of students reported studying biological science (84%), and the remaining reported health (11%) and social science (5%). Thus, our sample represented biomedical course participants fairly well but was limited in terms of social scientists. This participant composition reflects the emphasis on RCR instruction in the biomedical sciences and the more limited emphasis in the social sciences. Seventy-one percent of participants reported being required to complete the course as a degree requirement, and 15% reported that it was an elective (the remaining were unreported).

Procedure

We distributed the recruitment message via four primary outlets: (1) posting the announcement on the ORI’s website; (2) contacting groups involved in ethics or RCR education (e.g., the Poynter Center and the RCREC) and asking them to post the message online and/or to provide the message to their list of email contacts; (3) contacting coordinators of major conferences, workshops, or other events focusing on RCR instruction and asking them to provide the announcement as a flyer to conference participants; and (4) contacting the compliance officers and/or the vice presidents for research at institutions classified by the Carnegie Foundation16 as doctoral research universities-extensive and medical schools (n = 205) and asking them to forward the announcement to individuals involved in RCR instruction at the institution, or to provide instructor contact information directly to the research team.

Upon contacting the research team, instructors were asked to complete and return electronically an informed consent form, a background form, and a survey about the content of their course. Next, we provided instructors with a unique instructor identification code that would allow their students access to the online measures. We also provided instructors with a series of messages that they could either email to their students or pass out in class to introduce the purpose of the study and to note that the instructor had volunteered to make the study available to students. This information emphasized that student participation was voluntary, that responses would be anonymous, and that the instructor would not be aware of who had or had not participated. Students who chose to participate were then able to read the procedures for accessing the online website.

Upon accessing the online website, students created a unique username and password, which allowed linkage of pre- and post-course measures but did not allow identification of specific students by the researcher or the instructor of the course. Participants then indicated their field (social, health, or biological) so that the measure applicable to their scientific field would be provided to them. The participant instructions indicated that participants should log on to the website within the first two weeks of the course to complete the pre-course materials: (1) an informed consent form, (2) a background form, and (3) the pre-course ethical decision-making measure. The instructions also indicated that, within two weeks of the conclusion of the course, participants should log on to the website to complete the post-course materials, which included the post-course ethical decision-making measure and a debriefing document. Several short emails were sent to course instructors at the start and conclusion of their course asking them to provide a reminder message to course participants.

Measures

The outcome measure (Chart 1) used to evaluate the effectiveness of RCR courses was the ethical decision-making measure developed and validated by Mumford and colleagues.11,15,17 The measure assesses ethical decision making in relation to four major dimensions of ethical behavior, specifically data management, study conduct, professional practices, and business practices.12

Chart 1.

Example Item from Ethical Decision-Making Measure (Biological Science)*

Scenario
Bowers’s laboratory investigates mechanisms of synaptic plasticity using long-term potentiation (LTP) in hippocampal slices. The mechanisms under study include pre- and postsynaptic processes and a newer possibility, modulation by glial cells. He maintains a highly competitive atmosphere, rewarding the most productive members of the team with authorships and fellowship bonuses. All three postdoctoral students in the lab are productive, but Stanek has scored the greatest number of successes, including the linking of astrocyte membrane depolarizations to the establishment and persistence of LTP.
Ethical Problem
Stanek has experienced a few disasters recently—an inability to maintain viable slices, breakdowns in recording equipment—that he is unable to explain. He has no proof, but suspects that another post doc, Clements, has tampered with the preparations when he was out of the lab. Clements seems envious of Stanek’s success and he has heard rumors that Clements sabotaged his colleagues as a graduate student. One of the other postdocs, Minnis, who herself has had a run of low productivity, has urged Stanek to take some sort of action. What should he do?
Response Options
Choose two from the following:
  1. Confront Clements about the situation face-to-face

  2. Confront Clements in the weekly lab meeting

  3. Relate his suspicions to Bowers and agree on how to proceed

  4. Together with the other postdocs, lay a trap to catch Clements in the act

  5. Try to push Clements out of the lab through innuendo and rumor

  6. Retaliate in kind by contaminating the glutamate analogues Clements uses in his research

  7. Try to obtain further documentation of Clements’s tactics by writing to his Ph.D. advisor, inquiring if there could be any truth to past rumors and current suspicions

  8. Be cautious and wait for more convincing evidence to appear before acting

*

This example is one of 36 research scenarios used as part of the ethical decision-making measure developed and validated by Mumford and colleagues.11,15,17 The measure assessed ethical decision making in relation to four major dimensions of ethical behavior, specifically data management, study conduct, professional practices, and business practices.12 All case events and characters are fictitious.

Headings added for clarification.

This ethical decision-making measure consists of 12 research scenarios for each of the social, health, and biological sciences, so that participants receive scenarios relevant to their respective fields. Each scenario provides the context for three ethical problems that follow from the original scenario and map onto the four domains of ethical conduct. Thus, the measure consists of 36 ethical problems, with 18 constituting the pre-measure and 18 constituting the post-measure. For each ethical problem, participants are provided with eight potential responses that reflect low (1), moderate (2), or high (3) levels of ethicality, as determined by field experts using field-relevant norms and guidelines. Participants were instructed to choose the two of the eight options that they thought were most appropriate for addressing the ethical problem. Decision ethicality scores were obtained by averaging the point values associated with the participant’s two chosen response options for each problem and then aggregating across the problems subsumed under each of the four dimensions of ethical behavior.
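To make this scoring procedure concrete, the sketch below is a minimal illustration in Python; the expert ratings, problem labels, and option choices are invented for the example and are not the actual scoring key.

```python
from statistics import mean

# Hypothetical expert ethicality ratings (1 = low, 2 = moderate, 3 = high)
# for the eight response options of two problems in one dimension.
option_ethicality = {
    "problem_1": [3, 2, 3, 1, 1, 1, 2, 2],
    "problem_2": [2, 3, 1, 2, 3, 1, 2, 1],
}

# The two options (0-indexed) a participant chose for each problem.
chosen = {
    "problem_1": (0, 2),
    "problem_2": (1, 4),
}

# Average the point values of the two chosen options per problem,
# then aggregate (average) across the problems in the dimension.
problem_scores = [mean(option_ethicality[p][i] for i in chosen[p]) for p in chosen]
dimension_score = mean(problem_scores)
print(round(dimension_score, 2))  # 3.0 for this toy example
```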

In addition to assessing ethicality of decisions made in response to ethical problems, the measure provides scores for the seven meta-cognitive reasoning strategies and seven social–behavioral responses underlying the choices endorsed by each participant. This scoring of the measure was developed by three expert judges who evaluated the response options according to the extent to which each response option reflected the application of the meta-cognitive reasoning strategies and the endorsement of the social–behavioral responses using a 7-point scale (0 = to no extent, 6 = to a great extent). Participants’ scores on the meta-cognitive reasoning strategies and social–behavioral dimensions were obtained by averaging the scores from the two responses endorsed per problem and then aggregating across all problems.
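A parallel sketch for these process scores (again with invented judge ratings; the strategy name and values are illustrative only) averages the expert ratings of the two endorsed options per problem and then aggregates across problems.

```python
from statistics import mean

# Hypothetical judge ratings (0 = to no extent, 6 = to a great extent) of how
# strongly each of the eight response options reflects one strategy,
# e.g., "seeking help," for two problems.
seeking_help_ratings = {
    "problem_1": [1, 0, 5, 2, 0, 0, 3, 1],
    "problem_2": [4, 1, 0, 2, 5, 0, 1, 2],
}

# The two options (0-indexed) the participant endorsed for each problem.
chosen = {"problem_1": (0, 2), "problem_2": (1, 4)}

# Average the ratings of the two endorsed options per problem, then
# aggregate across all problems to obtain the participant's strategy score.
strategy_score = mean(
    mean(seeking_help_ratings[p][i] for i in chosen[p]) for p in chosen
)
print(round(strategy_score, 2))
```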

Data analysis

The course content data were analyzed by conducting analyses of variance in which course was treated as the independent variable and course characteristic scores obtained in the course content survey were treated as dependent variables. The course content analysis revealed no significant differences in course characteristics across courses. Thus, for the subsequent analyses, the data from all course participants were treated as one group.
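As a rough illustration of this check (not the authors’ analysis code; the course labels and scale scores are invented), a one-way ANOVA treating course as the grouping factor for one course-characteristic scale might look like this:

```python
from scipy import stats

# Hypothetical ratings on one course-characteristic scale, grouped by course.
course_a = [4.1, 3.8, 4.0, 4.2]
course_b = [3.9, 4.2, 4.1, 4.0]
course_c = [4.0, 3.7, 4.3, 3.9]

f_stat, p_value = stats.f_oneway(course_a, course_b, course_c)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")  # p > .05 would indicate no course differences
```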

Next, to examine the effects of RCR instruction, we tested for pre–post changes in decision ethicality scores for the four types of decisions, the seven meta-cognitive reasoning strategies, and the seven social–behavioral responses using dependent-sample t tests comparing pre-test means to post-test means on all dependent variables. For this analysis, we used data from participants who completed both the pre- and post-course measures (n = 53). Before doing so, however, participants who completed only the pre-test (n = 86) or only the post-test (n = 34) were compared to those who completed both in order to determine whether any systematic differences might exist between these groups. This analysis revealed no systematic differences between the groups in terms of the participants, instructors, course characteristics, or main dependent variables of interest. Therefore, the pre–post group is a reasonable representation of the larger group of participants.
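A minimal sketch of such a dependent-sample (paired) t test, using invented pre- and post-course scores purely for illustration, is shown below.

```python
from scipy import stats

# Hypothetical pre- and post-course scores on one dependent variable for the
# same participants (values invented; the real analysis used n = 53).
pre  = [2.4, 2.1, 2.5, 2.3, 2.2, 2.6]
post = [2.2, 2.1, 2.3, 2.4, 2.0, 2.5]

t_stat, p_value = stats.ttest_rel(post, pre)  # paired comparison of post vs. pre
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```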

Finally, to examine the magnitude of the effects of instruction, Cohen’s d statistic for repeated measures was computed. This effect size estimate takes into account the standardized difference between the pre- and post-test means. Cohen’s d values of approximately .20 are considered small, .50 moderate, and .80 large.18
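The exact repeated-measures variant is not specified in the text; one common formulation, which appears consistent with the tabled values, standardizes the mean change by the root mean square of the pre- and post-test standard deviations:

```latex
d = \frac{\bar{X}_{\mathrm{post}} - \bar{X}_{\mathrm{pre}}}
         {\sqrt{\left(SD_{\mathrm{pre}}^{2} + SD_{\mathrm{post}}^{2}\right)/2}}
```

For example, applying this formula to the business practices row of Table 2 gives (2.29 − 2.42) / √((.27² + .44²)/2) ≈ −0.36, which matches the reported −0.37 within rounding.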

Results

First, the analysis revealed no significant changes in the ethicality of data management, t(51) = −1.26, p = .21, study conduct, t(51) = 0.31, p = .76, or professional practices decisions, t(52) = 0.84, p = .41. The ethicality of business practices decisions, however, decreased significantly following RCR instruction, t(52) = −2.04, p < .05. Table 2 provides the means, standard deviations, and effect sizes for the ethicality of decisions.

Table 2.

Pre–Post Changes in Decision Ethicality Scores for Participants in 21 Courses on Responsible Conduct of Research, 2005–2007

Pre-test Post-test Effect size
M* SD M* SD Cohen’s d
Data management 2.31 .32 2.25 .24 −0.21
Study conduct 2.22 .29 2.24 .29 0.05
Professional practices 2.26 .16 2.29 .19 0.17
Business practices 2.42 .27 2.29 .44 −0.37
* M = mean; SD = standard deviation.
The decrease for business practices was significant at p < .05; no other pre–post changes were statistically significant.

Next, we examined changes in meta-cognitive reasoning strategies (see Table 3). The analysis revealed significant increases in four strategies: recognizing circumstances, t(52) = 4.61, p < .01, questioning one’s judgment, t(52) = 5.90, p < .01, managing emotions, t(52) = 2.14, p < .05, and analyzing personal motivations, t(52) = 3.54, p < .01. In contrast, seeking help, t(52) = −9.33, p < .01, decreased significantly following instruction, and anticipating consequences, t(52) = −1.53, p = .13, and considering others, t(52) = −1.66, p = .10, also decreased but did not reach statistical significance at the p < .05 level. Thus, mixed findings emerged with respect to meta-cognitive strategies.

Table 3.

Pre–Post Changes in Meta-Cognitive Reasoning Strategy Scores for Participants in 21 Courses on Responsible Conduct of Research, 2005–2007

Pre-test Post-test Effect size
M* SD M* SD Cohen’s d
Recognizing circumstances 3.66 .27 3.89 .26 0.86
Seeking help 1.23 .25 0.77 .27 −1.79
Questioning one’s judgment 3.36 .29 3.66 .32 0.99
Anticipating consequences 3.64 .31 3.56 .27 −0.28
Managing emotions 3.52 .35 3.64 .30 0.38§
Analyzing personal motivations 3.20 .31 3.38 .31 0.56
Considering others 3.43 .30 3.34 .29 −0.29
* M = mean; SD = standard deviation.
p < .01 for recognizing circumstances, seeking help, questioning one’s judgment, and analyzing personal motivations.
§ p < .05 (managing emotions).

Finally, Table 4 presents the results for changes in the social–behavioral responses. Following training, participants increased their endorsement of responses to ethical problems involving retaliation, t(52) = 7.79, p < .01, deception, t(52) = 8.20, p < .01, avoidance of responsibility, t(52) = 4.45, p < .01, and closed-end decision making (closing off the possibility for future decisions or actions), t(52) = 3.73, p < .01. In terms of responses characterized by active engagement, participants’ scores decreased following instruction, t(52) = −1.88, p = .07, but this change did not reach statistical significance at the p < .05 level. Changes for involving others in the decision-making process, t(52) = 1.09, p = .28, and selfishness, t(52) = .61, p = .55, were minimal and not statistically significant.

Table 4.

Pre–Post Changes in Social–Behavioral Response Scores for Participants in 21 Courses on Responsible Conduct of Research, 2005–2007

Pre-test Post-test Effect size
M* SD M* SD Cohen’s d
Involvement of others 1.91 .23 1.96 .31 0.19
Retaliation 2.05 .32 2.40 .27 1.16
Deception 1.33 .21 1.70 .28 1.53
Active involvement 3.74 .22 3.68 .14 −0.35
Avoidance of responsibility 1.55 .26 1.76 .29 0.78
Selfishness 1.78 .24 1.81 .30 0.10
Closed-end decision making 3.37 .33 3.54 .24 0.59
* M = mean; SD = standard deviation.
p < .01 for retaliation, deception, avoidance of responsibility, and closed-end decision making.

Discussion

This study revealed mixed effects of RCR instruction on ethical decision making. The ethicality of decisions made with respect to data management, study conduct, and professional practices did not improve or decline following instruction. However, the ethicality of decisions pertaining to business practices decreased. Changes in meta-cognitive reasoning strategies indicated that instruction encouraged the use of certain helpful strategies, such as recognizing the elements of the situation and analyzing personal motivations, as individuals think through ethical problems.

However, the use of other critical strategies, such as seeking help, diminished following RCR instruction. Moreover, the negative effects observed for the social–behavioral responses, which indicated that course participants were more deceptive, retaliatory, closed, and neglectful of personal responsibility in their responses following instruction, countered the positive effects on the meta-cognitive reasoning strategies. Overall, the pattern of effects indicated that, although course participants considered and analyzed situational elements more skillfully, with regard to the social elements of ethical problems, they were more internally focused and closed off when making decisions. Given these countervailing effects for underlying ethical decision-making processes, the lack of improvement, and even decrement, in the ethicality of decisions is not surprising.

Overall, these findings point toward an unexpected conclusion: a real need exists to reassess a common assumption about RCR instruction. Specifically, we typically take for granted the benefit of RCR instruction. That is, we assume that training in ethics leads to increases in ethical decision making or other integrity-related outcomes. Even those who question its efficacy would probably assume, at the very least, that such training produces no effects (that is to say, it "does no harm"). However, the present data lead to an important question: Is it possible that ethics training could have detrimental effects? The present findings suggest that not only might RCR instruction not be beneficial, but it might actually be a harmful endeavor. Thus, we must consider why RCR instruction may not work and how it could produce negative effects.

First and foremost, RCR instruction might encourage self-protective behavior.19 After RCR instruction, participants’ responses reflected inappropriate closure, less consideration of others, and defensive response patterns, including failure to take personal responsibility, deception, and retaliation. Ethics instruction may induce this internally focused, socially closed-off response to ethical problems if course participants internalize one or both of two messages: first, that ethical situations cause serious trouble and ruin people’s careers, and second, that other researchers are unethical and thus untrustworthy.14 Instruction emphasizing the nature of severe forms of misconduct (i.e., fabrication, falsification, and plagiarism) and the serious consequences associated with it is especially likely to exacerbate this focus on protecting oneself from ethical situations and from others who might engage in misconduct.

In addition to inducing self-protection, RCR instruction might be ineffective, or even detrimental, if it plays into people’s biases about themselves. People believe that they are better than the average person both cognitively and socially, and such a self-enhancement bias, especially with respect to ethical behavior, could make it difficult for course participants to view ethics instruction as relevant to them.20 Indeed, instructional effectiveness hinges on student receptiveness and engagement, which is tied to how self-relevant students perceive the training to be.21 Of note here are research findings showing high-performing scientists to be highly self-confident, dominant, and even arrogant, suggesting that they may be especially susceptible to self-enhancement biases and thus likely to view ethics training as irrelevant to them.22

Not only might natural self-enhancement tendencies render instruction ineffective, but ethics instruction might actually encourage self-enhancement. First, overconfidence could result from completing an ethics course if an individual subsequently believes that he or she knows how to handle all ethical issues and thus is invulnerable to unethical behavior. Such a mindset could promote disengagement from ethical problems and careless decision making. This explanation seems particularly plausible because self-enhancement biases operate especially when people make predictions about their likely future behavior.23 Thus, when thinking about how they might behave when dealing with a future ethical problem, people assume they will behave ethically. In fact, participation in an RCR course may encourage people to affirm their ethical virtues, which may actually allow for misbehavior later because this affirmation promotes unconscious rationalization of one’s questionable behavior.24 This phenomenon is particularly likely if instruction overemphasizes extreme case examples where, by comparison, an RCR participant could only view himself or herself as highly ethical, thus enhancing the perception that he or she would never engage in such extremely unethical behavior.

Finally, students’ attitudes towards their RCR course present another plausible explanation for detrimental effects of RCR instruction.25 For instance, an attitude regarding ethics instruction as just another "pesky requirement" needed to obtain a degree or to engage in research might block the effectiveness of instruction. In fact, students might view just sitting through the course week after week as a hindrance to their careers, as they could otherwise spend that time on research endeavors. Clearly, future research must examine the explanations for ineffective or harmful instruction proposed here, in addition to other potential risks of RCR instruction and how these obstacles might be overcome.

In future research examining where ethics instruction might go wrong, it is critical to consider that ethics instruction is fundamentally different from instruction in traditional academic courses. First, ethics instruction is not as fact-based as standard courses. Instead, in addition to basic knowledge of rules and principles, it is important to learn strategy and process.15,26 Thus, ethics courses must teach participants how to think through ethical problems. Research examining how instruction might accomplish this objective is in high demand. The social nature of ethics instruction is another unique characteristic, in that the ethical behavior of scientists has direct implications for their field and even for society. Moreover, RCR instruction itself is a highly social endeavor, and this social dimension of instruction can produce powerful dynamics (e.g., social comparison processes, conformity) that can either enhance or undermine the goals of ethics instruction. Thus, research examining the unique social elements of ethics instruction is an area ripe for future investigation.

Future research should also focus on developing models for evaluating the effectiveness of RCR instruction. The unique nature of RCR instruction raises important questions about how best to assess its effectiveness. Commonly, assessment of RCR course effectiveness centers on knowledge of ethical principles and rules and/or student reactions to the course, typically liking of the course or subjective ratings of its effectiveness. Although knowledge and perceptions of the course are important, they do not address whether people can apply knowledge in context. Thus, ethical decision-making measures, which require people to make decisions about complex, realistic ethical problems, offer one way for course participants to apply knowledge in a more realistic, performance-oriented manner.11 Other assessment tools that require realistic performance (such as problem solving or handling social conflict) and that capture the complex, ambiguous nature of real-world ethical problems might also be developed, and research along these lines is greatly needed to further the field of RCR instruction. These assessment tools should take into account the multidimensional nature of research ethics and consider the importance of social–cognitive processes in ethical decision making and behavior.

Before concluding, we should note the limitations of the current study. One primary limitation was the lack of control over the test-taking environment given the nature of online studies. In addition, missing data led to a limited sample size for the pre–post comparison. Moreover, the voluntary nature of the study (at both the instructor and student levels) allows for the possibility that those who chose to participate differ from those who did not. Fortunately, courses included in the study came from universities across the U.S., and the available course participant information suggested that they were reasonably representative of RCR course participants in the U.S. Nevertheless, the voluntary nature of the study leaves open the possibility that this sample was not representative of the broader population of RCR courses or course participants. As a result, the generalizability of these findings to more senior-level trainees and to trainees in fields beyond the biomedical sciences is limited by the nature of the present sample.

Furthermore, we did not assess individual-difference and situational variables likely to affect responding on the ethical decision-making measure. A common issue in studies of educational interventions is the lack of control over external factors that might influence performance on the outcome measures. For instance, poor mentoring or observation of misconduct could influence outcomes associated with training. Finally, this study examined instructional effectiveness using only one measure. Although the measure is valid, and consistent dependent variables are desirable for comparing different RCR courses within the same effort, other outcomes could be examined, and additional ethics measures should be developed and included in future examinations of RCR course effectiveness.

Conclusion

In conclusion, our exchanges with RCR instructors revealed a great desire for information about what constitutes effective ethics instruction and considerable concern that RCR instruction may not be working. This study suggests that not only might RCR instruction be ineffective as currently conducted, but it could be harmful. Thus, the findings presented here ultimately raise more questions than they answer, but they underscore that the social–cognitive mechanisms underlying ethical decision making and ethics instruction are critical areas needing further investigation. Although we typically seek to understand what promotes effective instruction, it may be just as important to ask what might hinder it. Ultimately, the current study offers a line of research that may facilitate our understanding of research ethics and ethical decision making and how we might construct effective RCR instruction.

Acknowledgements

The authors would like to thank Drs. Stephen Murphy and Ethan Waples for their contributions during the early stages of this project. Caleb Murphy is also thanked for his technical assistance in designing the online data collection website.

Funding/Support: XXX.

Footnotes

Other disclosures: XXX.

Ethical approval: Not applicable.

Contributor Information

Alison L. Antes, Department of Psychology and a research assistant, Center for Applied Social Research at the University of Oklahoma, Norman, Oklahoma.

Xiaoqian Wang, Department of Psychology and a research assistant, Center for Applied Social Research at the University of Oklahoma, Norman, Oklahoma.

Michael D. Mumford, Center for Applied Social Research, and professor of psychology, University of Oklahoma, Norman, Oklahoma.

Ryan P. Brown, psychology, University of Oklahoma, Norman, Oklahoma.

Shane Connelly, Center for Applied Social Research, and associate professor of psychology, University of Oklahoma, Norman, Oklahoma.

Lynn D. Devenport, psychology, University of Oklahoma, Norman, Oklahoma.

References

1. Kalichman MW. Responding to challenges in educating for the responsible conduct of research. Acad Med. 2007;82:870–875. doi: 10.1097/ACM.0b013e31812f77fe.
2. Office of Research Integrity. Responsible conduct of research education. Available at: http://ori.dhhs.gov/education/. Accessed November 20, 2009.
3. Dalton R. NIH cash tied to compulsory training in good behaviour. Nature. 2000;408:629. doi: 10.1038/35047242.
4. NSF funding requires RCR and ethics training. Office of Research Integrity Newsletter. Vol 15. Rockville, MD: Department of Health and Human Services; 2007:1.
5. Washington University School of Medicine. First biennial ORI conference on responsible conduct of research education, instruction, and training (RCR-EIT); April 17–19, 2008; St. Louis, MO. Available at: http://epi.wustl.edu/epi/rcr2008.htm. Accessed November 20, 2009.
6. Antes AL, Murphy ST, Waples EP, et al. A meta-analysis of ethics instruction effectiveness in the sciences. Ethics Behav. 2009;19:28–52. doi: 10.1080/10508420903035380.
7. Waples EP, Antes AL, Murphy ST, et al. A meta-analytic investigation of business ethics instruction. J Bus Ethics. 2009;87:133–151.
8. Goldstein IL, Ford JK. Training in Organizations: Needs Assessment, Development, and Evaluation. 4th ed. Belmont, CA: Wadsworth; 2002.
9. Kalichman MW, Plemmons DK. Reported goals for responsible conduct of research courses. Acad Med. 2007;82:846–852. doi: 10.1097/ACM.0b013e31812f78bf.
10. Kirkpatrick DL. Evaluating Training Programs: The Four Levels. 2nd ed. San Francisco, CA: Berrett-Koehler; 1998.
11. Mumford MD, Devenport LD, Brown RP, et al. Validation of ethical decision-making measures: Evidence for a new set of measures. Ethics Behav. 2006;16:319–345.
12. Helton-Fauth WB, Gaddis B, Scott G, et al. A new approach to assessing ethical conduct in scientific work. Account Res. 2003;10:205–228. doi: 10.1080/714906104.
13. Kligyte V, Marcy RT, Sevier ST, et al. A qualitative approach to responsible conduct of research (RCR) training development: Identification of metacognitive strategies. Sci Eng Ethics. 2008;14:3–31. doi: 10.1007/s11948-007-9035-4.
14. Antes AL, Brown RP, Murphy ST, et al. Personality and ethical decision-making in research: The role of perceptions of self and others. JERHRE. 2007;2:15–34. doi: 10.1525/jer.2007.2.4.15.
15. Mumford MD, Connelly S, Brown RP, et al. Sensemaking approach to ethics training for scientists: Preliminary evidence of training effectiveness. Ethics Behav. 2008;18:315–339. doi: 10.1080/10508420802487815.
16. Carnegie Foundation. Carnegie Foundation's classification of 3,941 institutions of higher education: Facts and figures. Available at: http://chronicle.com/stats/carnegie. Accessed November 20, 2009.
17. Mumford MD, Murphy ST, Connelly S, et al. Environmental influences on ethical decision-making: Climate and environmental predictors of research integrity. Ethics Behav. 2007;17:337–366.
18. Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum; 1988.
19. Coulehan J, Williams PC. Vanquishing virtue: The impact of medical education. Acad Med. 2001;76:598–605. doi: 10.1097/00001888-200106000-00008.
20. Robins RW, Beer JS. Positive illusions about the self: Short-term benefits and long-term costs. J Pers Soc Psychol. 2001;80:340–352. doi: 10.1037/0022-3514.80.2.340.
21. Brophy J. Motivating Students to Learn. 2nd ed. Mahwah, NJ: Lawrence Erlbaum; 2004.
22. Feist GJ. The influence of personality on artistic and scientific creativity. In: Sternberg RJ, ed. Handbook of Creativity. New York, NY: Cambridge University Press; 1999:273–296.
23. Epley N, Dunning D. The mixed blessing of self-knowledge in behavioral prediction: Enhanced discrimination but exacerbated bias. Pers Soc Psychol Bull. 2006;32:641–655. doi: 10.1177/0146167205284007.
24. Monin B, Miller DT. Moral credentials and the expression of prejudice. J Pers Soc Psychol. 2001;81:33–43.
25. Baldwin TT, Magjuka RJ, Loher BT. The perils of participation: Effects of choice of training on trainee motivation and learning. Personnel Psychol. 1991;44:51–65.
26. Kligyte V, Marcy RT, Waples EP, et al. Application of a sensemaking approach to ethics training for physical sciences and engineering. Sci Eng Ethics. 2008;14:251–278. doi: 10.1007/s11948-007-9048-z.
