Author manuscript; available in PMC 2015 Jul 30.
Published in final edited form as: J Law Med Ethics. 2013 Winter;41(4):829. doi: 10.1111/jlme.12093

Subversive Subjects: Rule-Breaking and Deception in Clinical Trials

Rebecca Dresser
PMCID: PMC4520402  NIHMSID: NIHMS708702  PMID: 24446941

Scientific reports about clinical research appear objective and straightforward. They describe a study’s findings, methods, subject population, number of subjects, and contribution to existing knowledge. The overall picture is pristine: the research team establishes the requirements of study participation and subjects conform to these requirements. Readers are left with the impression that everything was done correctly, by the book.

In other places, however, one finds a different and messier picture of clinical research. In this picture, research subjects deviate from the prescribed plan. One author contrasted the “tidy graphics” and “crisp prose” of the New England Journal of Medicine’s HIV/AIDS trial publications with reports that subjects shared medications and broke other trial rules. Awareness of this behavior, he wrote, could lead insiders to “conclude that knowledge was resting on something rather less solid than bedrock.”1

When their personal interests conflict with the demands of participation, some subjects surreptitiously break the rules. These subjects are subversive – they undermine the research endeavor. Subversive subjects do not necessarily intend to compromise the quality of research. Instead, like investigators who surreptitiously interfere with assignment sequences in randomized clinical trials, subversive subjects elevate their own interests and concerns above the trial requirements that are imposed to produce good data.2

Subjects themselves, and the journalists and scholars who write about them, describe subversive behavior by normal volunteers and patients enrolled in clinical trials. Researchers also report finding various forms of rule-breaking by subjects. But because relatively few studies adopt measures to detect rule-breaking, the behavior often goes unnoticed. Commentators say researchers often overestimate subjects’ adherence to trial rules. According to one review, “Clinical investigators often assume that medications are taken correctly.”3 Another commentator observed that researchers tend to focus on study design and results, “often viewing adherence as a nuisance and a somewhat tangential concern.”4

Even researchers who look for subversive behavior do not detect all of it. Some rule-breaking is, as a practical matter, undetectable. Researchers often have no other option than to rely on subjects’ accounts of their health histories and compliance with trial requirements. In some instances, independent verification is impossible. In others, verification would require researchers to adopt costly and intrusive monitoring procedures that few subjects would tolerate.

Although the full extent of subversive behavior is unknown, its existence is undeniable. The pristine picture of clinical research is inaccurate. Clinical trial subjects are not passive followers of researchers’ orders; they are active agents living their own lives and promoting what they see as their own interests. In rejecting the constraints research imposes, however, subversive subjects diminish the value of research results.5 Their behavior presents risks to their own health and the health of others.

Scholars and policymakers have devoted extensive attention to the ethical issues raised by so-called deception research – studies in which investigators deceive subjects to acquire otherwise unobtainable data.6 But few have considered the reverse situation, in which subjects deceive investigators to advance their own agendas. Subversive subjects are a reality in clinical research, and subjects’ deception merits ethical scrutiny, too.

From one vantage point, subversive subjects behave unethically. By engaging in rule-breaking, they not only create risks, they ignore ethical responsibilities to observe research agreements and tell the truth. At the same time, subversive subjects expose ethical problems with the design and conduct of clinical trials. Features of the research environment create fertile ground for subject subversion. Researchers often turn to intensified policing and guidance strategies to reduce subject subversion, but collaborative reforms are more consistent with the partnership model of clinical research.

Evidence of Subversive Behavior

Normal Volunteers in Phase I Studies

Evidence of subversive behavior in clinical trials comes from subjects’ first-hand accounts, interviews with subjects, case reports, and empirical studies. Professional guinea pigs tell the most vivid stories. These individuals make at least part of their living from the payments they receive for participating in phase I trials evaluating the effects of investigational drugs in healthy individuals. Some have published essays about their experiences. From 1996 to 2008, a professional guinea pig named Robert Helms published Guinea Pig Zero, a print and online journal reporting on this population’s research experiences.7 A few academics have also written about this population. Medical anthropologist Roberto Abadie’s in-depth ethnography is a particularly rich account of how some professional guinea pigs perceive their “work.”8

Volunteers focused on earning money through research participation adopt a variety of deceptive practices to gain admission to studies. One thing they lie about is past trial participation. Volunteers must wait at least thirty days after a phase I drug trial ends before enrolling in another one.9 Mandatory waiting periods are based on the time it takes for drugs to be eliminated from the body. Researchers want to avoid drug interactions that can compromise data quality and put subjects at risk. But volunteers employ different strategies to avoid the wait. They seek out studies at new locations that have no record of their recent trial participation. To prepare for study screening tests, they “cleanse” their bodies by following certain diets and using substances like cranberry juice, water, goldenseal, marigold flowers, and other herbal remedies. They also take iron supplements to counter the effects of multiple blood samples taken in previous trials.10

Volunteers lie about other things in their quest for trial admission. Studies have a number of eligibility requirements designed to promote data quality and protect subjects from harm. Repeat volunteers know about these requirements and plan accordingly. Besides using the cleansing remedies described earlier, they supply false information about their use of alcohol, medication, recreational drugs, cigarettes, and caffeine. They also lie about their age, medical history, dietary practices, and exercise habits. Some participate in more than one trial at the same time, which is usually not permitted.11

The deception does not end once volunteers gain admission to trials. To earn full compensation, subjects are told they must conform to study requirements. But study participation can demand a lot from subjects. Some chafe against the restrictions and devise ways to evade them without being detected. This sort of rule-breaking is easier to achieve in “outpatient” studies than in the onsite studies that require subjects to live at test facilities for days or weeks. But even volunteers in onsite trials (one subject called them “lockdowns”12) get away with some misbehavior.

Standardizing subjects’ diets and other environmental conditions is a way to increase the odds that test results are produced by the study intervention instead of other factors. But to get around trial diet requirements, subjects have smuggled prohibited food into facilities and broken into locked pantries for forbidden treats. Although meals are monitored in onsite trials, subjects say they hide and later discard the unappetizing food they are supposed to eat. Phase I drug trials often exclude vegetarians, but some gain admission by concealing this information. Once in the trial, they maintain their diet by trading meat for other subjects’ vegetables.13 There are also reports of subjects drinking alcohol and taking illegal drugs during trials.14

Volunteers disregard rules about medication use, too. Subjects who concealed their medication history to gain trial admission find ways to continue taking the prohibited medications.15 Volunteers in offsite trials do not always take their study drugs as directed, and even onsite trial subjects are at times able to discard pills.16 Participants sometimes fail to report symptoms that could be related to study drugs, because they fear they will be removed from the study for health reasons.17

Professional guinea pigs portray these actions as a reasonable response to researchers’ unrealistic expectations. As one trial subject put it, “The perfect volunteer they require doesn’t exist. Everybody lies about complying. I lied about my family medical history, yeah, about drug use, taking medicine.”18 Experienced volunteers know that truthful answers could result in their exclusion from studies and loss of the payments they hope to receive. They also doubt the value and quality of drug company studies and in turn feel little obligation to play by drug study rules.

Patient-Subjects in Clinical Trials

The above reports paint a disturbing picture of subversion among the repeat volunteers who populate the world of phase I drug testing. By their own admission, they fabricate and conceal information about past trial experience, medical history, adherence to trial requirements, and intentions to cooperate with those requirements. But subversion is not limited to normal volunteers in phase I drug trials. Subversive behavior also occurs in the later-phase studies that evaluate the effects of drugs and other interventions in patients. In this context, the data come primarily from empirical studies of subjects’ noncompliant behavior (sometimes referred to as nonadherence).19 Case reports and interviews supply additional evidence of noncompliance in later-phase trials.

Research sponsors typically do not pay patients for research participation. The customary view is that potential therapeutic benefits to patients in trials substitute for the financial rewards offered to normal volunteers. This view is changing, however, and some trials now offer relatively large payments to patient-subjects. Like normal volunteers in phase I trials, patient-subjects focused on financial rewards may falsify information related to their trial eligibility and conceal their noncompliance once they are admitted.20

More often, it is the potential medical benefit offered by later-phase trials that creates an incentive for patient-subjects to lie. Many individuals coping with debilitating or life-threatening illnesses enter trials with the goal of improving their health. If trial requirements interfere with this goal, individuals may deviate from the rules and conceal this from the research team. The desire to please researchers may also lead subjects to exaggerate their compliance.21

Although subject noncompliance in later-phase trials often goes unnoticed, some investigators use methods that can detect it. These methods allow researchers to compare subjects’ reports about their study behavior with objective measures of that behavior. In one example, investigators in an asthma study gave subjects inhalers that (unbeknownst to the subjects) were equipped to record the date and time that medication was released. Almost one-third of the subjects “dumped” all or most of the medication at least once before meeting with study staff, revealing their failure to use the inhalers as directed. Subjects also failed to disclose this behavior to the researchers.22 In another example, investigators attributed the failure of an HIV prevention trial to surreptitious nonadherence among subjects. Women in the trial had reported using preventive interventions 90 percent of the time, but later blood analysis showed the actual adherence rate was just 30 percent.23

Subversive behavior can be a particular problem in later-phase trials comparing investigational interventions with interventions already in medical use. Patients entering trials are often dissatisfied with their current treatments and hope they will be assigned to receive a new and potentially better one. And treatment trials with placebo-control groups are especially unattractive to such patients.24 As one researcher put it, “People allocated to less desirable control conditions where they feel deprived of their preferred treatment…may lose heart, or act up.”25 This phenomenon is most likely to occur in nonblinded trials, but “even when participants do not know their treatment group, they often guess or suspect, correctly or incorrectly,” the group to which they were assigned.26

Patients who see trial participation as a means to obtain better treatment commit a variety of subversive acts. Some enter trials with the specific intent to drop out early if their symptoms do not improve within a certain time.27 Subjects have also been known to share drugs to ensure that each person receives at least some of the preferred one. During the early years of the HIV/AIDS epidemic, desperate patients admitted they engaged in both kinds of behaviors, as well as “frequent cheating, even bribery, to gain entry to studies.”28 Researchers have observed similar behavior in trials focused on other conditions. So-called “contamination” occurs when “participants assigned to a control condition try to gain access to or adopt elements of the intervention condition.”29 In one study, for example, researchers learned that cancer survivors assigned to the control group of an exercise study were actually exercising at the same level as the exercise intervention group.30

Other studies have documented similar forms of noncompliance among patient-subjects. According to one review, “Research demonstrates that many clinical trial participants are overestimating their adherence and not providing the study investigators with honest self-reports.”31 Estimates are that up to 30 percent of trial subjects fail to take study drugs as directed. One expert suggests that the rates could be even higher: “there is considerable anecdotal evidence, if not hard data, that most patient-subjects are not fully compliant.”32

Participants in Other Human Studies

Subversive behavior isn’t limited to drug and other treatment studies. Participants in other kinds of research also report such behavior. For example, a journalist writing about his experience as a study participant revealed his own rule-breaking. The study he joined was designed to evaluate the effects of a Stone Age diet. As the study progressed, the journalist-subject grew weary of the monotonous diet he was expected to follow. When he lost weight, researchers wanted him to eat more. He failed to comply, however. Instead, he “sneaked extra clothes on the scale to avoid the growing portions of pork and pineapple they heaped on my plate when my weight fell.”33

Subversive subjects also create problems in deception research. Deception is used most commonly in psychology research, but is sometimes used in clinical trials, too. For example, studies of placebo effects in medicine often deceive subjects about study aims, and drug trials sometimes fail to inform subjects that they will all receive placebos during a baseline assessment period.34

The problems arise when subjects suspect or know that a study involves deception. Such awareness can affect subjects’ responses, distorting study findings. In post-experimental briefings, researchers typically ask subjects whether they knew or suspected they were being deceived. Using computerized questionnaires and other methods, researchers have discovered that some subjects conceal their suspicion or awareness of deception in a trial. In an ironic twist, subjects deceive the investigators who sought to deceive them. Deceptive subjects say they are concerned that a truthful answer will jeopardize their payment or course credit, get them into trouble, or upset the investigators.35

No one knows how many subjects fail to follow the rules, but it is clear that rule-breaking happens. Some subjects freely admit to noncompliance, but much of the bad behavior is hidden. Although researchers discover some of it, an unknown amount remains undiscovered. To advance their personal desires and concerns, rule-breaking subjects undermine the research enterprise.

What’s Wrong with Subversive Behavior

From a societal perspective, subversive subjects present a serious threat. Their behavior reduces the validity of trial results, with potentially harmful consequences to patients and to subjects participating in subsequent trials. Subversive subjects also expose themselves to heightened research risks. They disregard ethical principles governing promise-keeping and truth-telling, as well.36

Distorted Study Findings

Subversive conduct can lead to inaccurate scientific conclusions about research interventions. For example, undetected noncompliance contributes to mistaken judgments about the safety and proper dosage of investigational drugs. When subjects discard study medications and conceal this action, researchers record a higher level of drug use than actually occurred. The data could lead researchers to conclude that a drug is safe and effective at the recorded rather than actual level of use. But this recommended dosage could be less safe than the lower amount to which subjects were actually exposed.37 Similarly, when subjects fail to report symptoms that might be caused by an investigational drug, researchers may conclude that the drug is safer than it actually is.38 And when subjects share study medications during trials, it becomes more difficult to discern the positive and negative effects of those medications.39 According to some experts, many drugs have been approved at unnecessarily high dosages that were eventually lowered after “we overdosed a whole lot of patients.”40

Subjects entering trials on false pretenses also jeopardize study findings. Researchers fooled into thinking that subjects qualified for drug studies may attribute to a study drug effects actually caused by the subjects’ health conditions, exposure to alcohol, or other factors unrelated to the intervention. In turn, a “potentially useful medication [could be] discarded because of an adverse event falsely attributed to the drug.”41

Although there are methodological strategies aimed at reducing the impact of subject noncompliance, the strategies are imperfect. Researchers and regulators rely on two types of analysis in assessing a drug’s safety and efficacy. One is called intent-to-treat analysis, which includes data from all subjects initially enrolled in a study, including those who later dropped out. This approach supplies information about a drug’s likely clinical impact, because in ordinary medical practice a certain number of patients will likewise stop taking their drugs. The other approach includes data from only those subjects who completed the trial. This supplies information about the drug’s effects in individuals exposed to the full drug regimen.42 It is much easier to adjust the analysis when noncompliance is detected and subjects are withdrawn from a study than when noncompliant subjects conceal their behavior and remain in a study. Covert noncompliance presents a problem in both analytic approaches, for researchers cannot adjust the analysis to take into account behavior of which they are unaware.43

The bottom line is that rule-breaking subjects reduce the accuracy of research findings. Inaccurate findings create risks for subjects in subsequent studies based on those findings, and for patients taking drugs approved based on faulty data. Inaccurate findings can prematurely halt drug development, too, depriving patients of potentially beneficial new drugs. More broadly, subversive behavior can have a negative impact whenever health recommendations rely on the flawed data that result from such behavior.

Risks to Self

Subjects concealing or fabricating information about their medical histories and prior trial experiences put themselves at risk, too. Certain medical conditions and past drug exposures can increase a person’s susceptibility to harm from trial interventions. In a few documented cases, concealing this information proved deadly to subjects.44 For instance, a young woman participating in a healthy volunteer study at the NIH concealed a history of cardiac arrest and died after an arrest presumably related to drugs administered in the study.45 Undoubtedly more common are less serious acute effects among subjects who withheld relevant health information. One research group described cases in which study volunteers experienced ill effects after failing to tell investigators about food allergies, diabetes, and cardiac abnormalities.46 Possible long-term effects of inappropriate research exposures are another concern, especially effects on repeat volunteers participating in numerous phase I drug trials.47

One could argue that subjects should be free to decide whether to expose themselves to undue risk, but it is not clear that all deceptive subjects understand the risks they are taking. In one survey, for example, a few healthy volunteers admitted they did not realize that an inaccurate health history could elevate the risks of study participation.48 Moreover, there is general agreement that subjects should not be exposed to high risk in research even if they consent to such risk.49

Ethical Violations

Besides risking harm, subversive subjects fail to meet their moral responsibilities “to be truthful and to abide by the terms of participation.”50 When subjects consent to join studies, they consent to observe study requirements. If participation proves too burdensome, they are free to withdraw. But subjects who remain in a study while violating its terms breach their research agreements. Patients entering studies with fixed plans to withdraw if they are assigned to an unwanted treatment group or their condition does not improve also fail to meet their ethical responsibilities. Individuals falsifying or concealing information to gain admission to trials, and those concealing their noncompliant behavior during trials, are engaging in unjustified deception.51

Individuals are not entitled to manipulate the research system for personal gain. As ethicist John Arras has written, “No one has a duty to become a research subject in the first place, but by entering a protocol, subjects enter into a moral relationship with researchers by promising or ‘contracting’ to abide by certain restrictions for the benefits of participation.”52 Indeed, some professional guinea pigs share this view. Just Another Lab Rat!, a website created by a repeat volunteer, lists in a mission statement its goal of “refer[ring] better educated and prepared volunteers to clinics who will be more likely to follow through with studies” and “advocating for responsible and ethical behavior from research volunteers.”53

In sum, it is easy to fault subjects for their subversive behavior. They risk harm to themselves and others, violate agreements, and deceive researchers. But subversive subjects are not the only ones at fault. Some subversive behavior is the unsurprising side effect of a research system with ethical deficiencies.

How Research Contributes to Subversion

Much of the contemporary rhetoric about clinical research highlights the essential contribution subjects make to the research endeavor. The literature is full of statements like the following:

Clinical investigators, research institutions, and funding agencies were indispensable to [the past century’s medical] advances. Equally important were the millions of individuals who agreed to participate in the research that proved the effectiveness of the interventions that worked and, no less importantly, the ineffectiveness of those that did not.54

But subversive subjects expose a gap between the platitudes and the reality. Subversive subjects present a different and less egalitarian vision of human research. In this vision, subjects are treated with disrespect and discourtesy; their interests and contributions are devalued. They are regarded as servants rather than partners, and in turn, feel little allegiance to the research mission.

Subjects trace their disillusionment to several kinds of mistreatment. Repeat trial volunteers complain about substandard conditions in some research units. They tell tales of poorly organized trials, inattentive staffs, and silly rules like mandatory bedtimes in onsite studies. To call attention to problems like these, Helms published “report cards” on different research units, assigning low grades for deficiencies like bad food, mediocre staff, and excessive security, waiting times, tests, and follow-up visits.55

Helms’s report card grades took into account the quality of the consent process as well. Helms and other study volunteers describe several problems in this area. Research team members explain studies poorly and are unprepared to answer subjects’ questions. Copies of study protocols are difficult to obtain, and researchers alter consent forms after subjects have signed them.56

Subjects complain that researchers are sometimes dishonest, too. A website called Guinea Pigs Get Paid reports that staff members at some research units inflate the number of subjects needed for trials, misleading individuals about their odds of being accepted. This allows the staff to be highly selective about who is actually chosen, wasting the time of the others who show up. Staff members also tell would-be volunteers that they have been admitted to trials, when they are actually alternates enlisted in case others fail to appear.57 Research coordinators mislead patient-subjects into thinking that it is compliance, rather than study group assignment, that largely determines whether they will benefit from trial participation.58 One investigator told a subject that he was not allowed to withdraw from a study, a clear violation of the federal research regulations.59

When subjects experience disrespectful treatment like this, relationships with researchers can become adversarial. Professional guinea pigs use stark language to express their resentment. They use “images of torture, sex work, or prostitution when describing their activities.”60 They see themselves as “meat puppets” and “brain sluts.”61 One wrote of “sitting around like an animal in the zoo,” with people observing her through windows and delivering food to her room.62

Subjects attribute at least some of their rule-breaking to researchers’ misbehavior. According to Roberto Abadie, professional guinea pigs use “everyday forms of resistance” to oppose “conditions that dehumanize, alienate, and exploit them.”63 This phenomenon is not limited to the professional volunteers — patient-subjects react negatively to unsatisfactory study conditions, too. Noncompliance is more likely in studies involving unreasonable time demands, unfriendly staff, and unpleasant surroundings.64

At times, researchers are also complicit in subversive behavior. According to Helms, “everyone,” including the research staff, knows that guinea pigs lie to get into high-paying trials.65 Sociologist Jill Fisher observed study coordinators who recruited patients for trials by stressing their freedom to drop out if they were assigned to a placebo group.66 One volunteer wrote about a study director who told her that alcohol and certain medications were prohibited during the trial, then winked and said, “But hey, we’re not always going to be looking over your shoulder.”67 Researchers who fail to take study requirements seriously cannot protest when subjects follow their lead.

Hostility and cynicism are not the only detrimental psychological responses fostered by the research environment. Noncompliant subjects who see researchers as authority figures may keep quiet out of a reluctance to let down or anger the team. According to one researcher, people appearing on paper as “super-compliers” are actually among the least compliant. “They are very nice people who do not want to disappoint you….”68 Similarly, subjects in deception research don’t admit to suspecting the deception because “they are worried that they might ruin the study” and thus upset the investigator.69 The researcher’s power to remove subjects from studies is probably the biggest factor in concealed noncompliance. Subjects hoping for full payment, medical benefit, or course credit have strong incentives to cover up their deviations from study requirements.

Researchers often portray subjects as equal partners in the research endeavor. But not every subject sees things this way. At least some believe their role is devalued, their agency overlooked. These subjects say they are at times treated as mere data sources whose personal needs and concerns get in the way of the research process. Other subjects say they are ashamed or afraid to admit when they fall short of researchers’ expectations. Subjects who see researchers as authority figures may respond in ways that damage the research mission.

Responses to Subversion

People worried about subversive subjects propose a variety of measures to address the problem. They endorse three general approaches: more vigilant policing of subjects’ behavior; intensified efforts to guide subjects toward compliance; and increased collaboration with subjects to develop mutually acceptable research conditions.

Researchers attempting to reduce rule-breaking through better policing offer an array of strategies. One is to increase external monitoring of subjects. A variety of “assays, devices, tests, and biochemical measures” enable researchers to detect whether subjects comply with bans on smoking, alcohol, and recreational drugs.70 With electronic monitoring devices, such as the inhalers used in the asthma study described earlier, researchers can determine subjects’ true compliance rates. Experts predict that the future will bring more external monitoring tools. Scientists are developing “smart pills” with microchips that send a computer alert when the pill is swallowed.71 Indeed, two analysts report that researchers and physicians “are on the threshold of having an armamentarium of ‘big brother’ strategies to determine who is noncompliant.”72

Other policing strategies focus on enforcing trial eligibility criteria. To prevent repeat volunteers from violating waiting period and other eligibility requirements, some countries, as well as a few U.S. research institutions, have created centralized databases that store information about each study in which an individual participates. Researchers screening potential subjects for new trials can consult the database to determine a candidate’s prior record and then reject the ineligible individuals.73

Researchers also use screening measures to identify and exclude individuals unlikely to comply with study requirements. According to one review, “the best predictor of future adherence behavior is past adherence behavior.”74 Researchers assess adherence by conducting a brief preliminary study in which they evaluate potential subjects’ performance on tasks like returning phone calls and showing up for visits on time. Subjects who perform poorly are then excluded from the main study.75

Assessing short-term adherence is a relatively costly measure, however, so researchers have tried to find simple demographic criteria that would allow them to separate probable rule-breakers from rule-followers. They haven’t had much success. For example, researchers could find no significant differences between subjects who were and were not compliant in the asthma study described earlier.76

Another policing strategy is to threaten and impose monetary penalties on subjects who fail to follow study rules or drop out “for apparently trivial nonmedical reasons.”77 Supporters of the approach say that research agreements should be regarded as contracts and that subjects should be legally responsible for fulfilling their side of the bargain. Although critics raise legal and practical questions about the usefulness of such an approach, defenders say it would deter subjects from misbehaving and send a strong message about the moral responsibilities of individuals enrolled in trials.78

Researchers favoring intensified guidance over policing believe that communication and persuasion are more ethical and effective ways to reduce rule-breaking.79 On the assumption that noncompliance is often due to ignorance or forgetfulness, experts urge researchers to distribute consent forms and informational handouts that clearly describe what subjects are expected to do. Other guidance techniques are designed to make compliance easier. Electronic diaries and cell-phone alerts prompt subjects to take study medications. “Compliance packaging” for study drugs includes clear messages and graphics highlighting essential information about study drug regimens.80

Researchers adopt other guidance measures to “reeducate” subjects during trials. At one-on-one meetings and over the telephone they remind subjects of study requirements and subject responsibilities.81 Subjects who miss appointments, appear unenthusiastic, or exhibit other behaviors suggesting they are “at risk” for noncompliance “become targets to work on.”82 According to one optimistic expert, “Once study participants understand why it is important for them to take the study medications as prescribed, they will feel ‘safer’ providing honest feedback to the study team.”83

Policing and guidance strategies can be effective in detecting and deterring certain forms of subversion, and it is possible to use them in ethically acceptable ways. For example, external monitoring devices can be acceptable if they are used with subjects’ awareness and permission, rather than secretly, as they were in the asthma study described earlier. Although guidance and persuasion strategies can be patronizing and intrusive, this needn’t be the case. Many subjects will welcome reminders, clear information, and other aids that help them reconcile the demands of research participation with the demands of ordinary life.

At bottom, however, policing and guidance approaches reinforce the research hierarchy. Communication and persuasion may be more collegial than policing measures, but both approaches treat subjects more as subordinates than as partners in the research endeavor.

The collaborative approach is a more respectful response, for it regards subversive subjects as agents with their own legitimate concerns and values. Researchers adopting collaborative strategies recognize that features of the research system contribute to rule-breaking and deception. They also see punitive measures and moral censure as arrogant and often ineffective. From their perspective, subversive subjects, like computer hackers, have something important to teach the authorities. Instead of regarding rule-breakers as the enemy, they say, we ought to see them as potential allies in the effort to conduct ethical and scientifically sound clinical trials.

Supporters of the collaborative approach believe that “[m]otivation and commitment by subjects to fulfill their end of the bargain may hinge more on the appreciation of the research subject as a valued member of the research team than as a hired hand.”84 Thus, the best way to proceed is to replace practices that devalue subjects’ contributions with practices that demonstrate appreciation for what they do. And to learn what changes are needed, researchers need only consult the subjects themselves: “Who better than they to advise us on what makes for a ‘rewarding’ research experience?”85

In articles, interviews, and empirical studies, subjects have already said a lot about what could be done to increase their commitment to trial requirements. Simple quality-of-life upgrades could go a long way toward improving the situation. Research facilities should be clean and comfortable. Subjects in onsite drug studies ought to have decent housing, good-quality food, and activities to lessen the boredom that some consider the worst part of the experience. Restrictions on their freedom should be limited to those required by study protocols or the demands of communal living. All subjects should receive free care and compensation for personal losses if they are injured in research.86

The research staff also plays a major role in subjects’ commitment to play by the rules. Subjects have high praise for studies conducted by personable and efficient teams.87 Subjects are grateful when staff members pay attention and respond to their concerns. Subjects appreciate sincere expressions of thanks for the pain, discomfort, and disruption they endure. Subjects value convenience, too. In one survey of patient-subjects, staff flexibility in scheduling research visits was the number-one concern for a majority of the respondents.88 Another study found that subjects were more cooperative and committed when staff members maintained regular contact and supplied information on study progress.89

Another collaborative strategy is to adjust study requirements in ways that promote cooperation. Researchers adopted this strategy in the early years of the HIV/AIDS epidemic, when patients and activists rebelled against strict study rules. Activists eventually persuaded researchers and Food and Drug Administration officials to become more flexible about trial methodology. For example, activists successfully challenged rigid trial eligibility criteria and strict rules governing subjects’ medication use during trials.90

On a broader scale, the move to “adaptive” trial designs is a response to patients’ reluctance to enroll in and complete studies that fail to offer them direct medical benefit. Besides the compliance problems I have described, researchers face a general shortage of patients willing to participate in clinical trials.91 In an attempt to make trials more attractive to patients, researchers are developing study designs that allow randomization ratios, drug dosages, and other trial requirements to be adjusted during the trial, based on accumulating trial data. These designs allow more participants to be assigned to groups with superior outcomes.92 Another relatively new approach is the “preference trial,” which is designed to give at least some participants an opportunity to choose which study intervention they will receive.93 The reasoning is straightforward: “Subjects will simply find it easier to abide by the terms of protocols that pose less restrictive alternatives and require fewer personal sacrifices.”94

Similar collaborative moves could promote compliance in phase I drug trials and other normal volunteer studies.95 Is it really necessary, for example, to exclude vegetarians from phase I drug studies? Are other restrictive eligibility criteria applied more out of habit than genuine scientific need? Could some study tests, visits, dietary requirements, and other measures be eliminated without threatening the quality of the data? The research establishment should be open to potential changes that would make drug safety and other early-phase trials more subject-friendly while preserving study quality.

Community engagement is another collaborative development that could reduce subject subversion. More researchers are asking prospective participants and their communities for advice on how to design and conduct trials. Community engagement can be used to ascertain how representatives of subject populations perceive risks, burdens, and other dimensions of proposed studies. By working with such representatives, research teams can obtain information that allows them to design subject-friendly studies. Subjects are more likely to cooperate with studies that take their needs and interests into account.96 As one researcher commented, “participation that is actually enjoyable and interesting to research participants…has a higher likelihood of retaining them.”97

Conclusion

Years ago, the philosopher Hans Jonas considered a fundamental moral question in human research: under what conditions is it acceptable to put some individuals at risk for the benefit of others? According to Jonas, a human study is most acceptable when it involves subjects who fully identify with and understand the purpose of the research. Jonas argued that “the appeal for volunteers should seek this free and generous endorsement, the appropriation of the research purpose into the person’s own scheme of ends.”98

Subversive subjects occupy a category distant from Jonas’s ideal volunteer. No one, including members of the research community, should be shocked that many – perhaps most – subjects care more about their personal needs and circumstances than about the knowledge-seeking objectives of research. The question is what to do about subversive behavior. Big-brother monitoring may enable researchers to detect and deter rule-breaking, but it is unlikely to increase subjects’ commitment to the research mission. Vigorous guidance and instruction can also be effective, and in some cases will convince subjects to embrace research goals. But a more respectful way to strengthen their personal commitment is to minimize research practices that encourage the subversive mindset.99

Robert Helms founded Guinea Pig Zero “to rescue the value of the contribution that human subjects make to further biomedical research.”100 At times, even the most cynical and jaded professional guinea pigs express pride in the health advances they help bring about. Subjects may participate in trials for personal gain, but many are also altruistic.101 In a hospitable and appreciative research environment, subjects may be more willing to put up with the requirements necessary to generate the good data that lead to medical progress.

Researchers seeking to reduce subversion should listen to people like Robert Helms. They are the people who can describe both the origins of subversive conduct and the changes that would diminish this behavior. They are the people who can help researchers develop trials that are reasonable and humane. With the assistance of subversive subjects, researchers can develop a system that genuinely values study participants and strengthens their commitment to the research endeavor.

Acknowledgements

Thanks to Jill Fisher, Carl Elliott, and colleagues at the Washington University Schools of Law and Medicine for helpful comments on an earlier draft. Preparation of this article was supported by National Institutes of Health CTSA Grant # UL1 TR000448. The contents of this article are solely the responsibility of the author.

References

  • 1.Epstein S. Impure Science: AIDS, Activism, and the Politics of Knowledge. Berkeley: University of California Press; 1996. at 204.
  • 2.Researcher Kenneth Schulz observed that because requirements like randomization “annoy human nature,” people involved in research may be tempted to evade those requirements. See Schulz KF. Subverting Randomization in Controlled Trials. Journal of the American Medical Association. 1995;274(18):1456–1458.
  • 3.Smith DL. Patient Nonadherence in Clinical Trials: Could There Be a Link to Postmarketing Patient Safety? Drug Information Journal. 2012;46(1):27–34. at 28.
  • 4.Shumaker SA, Dugan E, Bowen DJ. Enhancing Adherence in Randomized Controlled Clinical Trials. Controlled Clinical Trials. 2000;21(5):226S–232S. doi: 10.1016/s0197-2456(00)00083-0. at 226S.
  • 5.Some rule violations have a more serious impact on research findings than do other violations. But any violation of a rule designed to strengthen the validity of data presents a threat to study quality.
  • 6.See Wendler D, Miller FG, et al. Deception in Clinical Research. In: Emanuel EJ, editor. Oxford Textbook of Clinical Research Ethics. Oxford: Oxford University Press; 2008. at 315–324.
  • 7.Articles from the publication are collected in Helms R, editor. Guinea Pig Zero: An Anthology of the Journal for Human Research Subjects. New Orleans: Garrett County Press; 2002. The publication is available at <http://www.guineapigzero.com> (last accessed November 4, 2013).
  • 8.Abadie R. The Professional Guinea Pig: Big Pharma and the Risky World of Human Subjects. Durham, North Carolina: Duke University Press; 2010.
  • 9.Resnik DB, Koski G. A National Registry for Healthy Volunteers in Phase 1 Clinical Trials. Journal of the American Medical Association. 2011;305(12):1236–1237. doi: 10.1001/jama.2011.354.
  • 10.See Abadie, supra note 8, at 81–82. See also Solow B. The Secret Lives of Guinea Pigs. Independent Weekly. 2000 Feb 9;
  • 11.See Tishler CL, Bartholomae S. Repeat Participation Among Normal Healthy Research Volunteers. Perspectives in Biology and Medicine. 2003;46(4):508–520. doi: 10.1353/pbm.2003.0094.; Apseloff G, Swayne JK, Gerber N. Medical Histories May Be Unreliable in Screening Volunteers for Clinical Trials. Clinical Pharmacology & Therapeutics. 1996;60(3):353–356. doi: 10.1016/S0009-9236(96)90063-6.; Hermann R, Heger-Mahn D, Mahler M, Seibert-Grafe M, Klipping C, Breithaupt-Grogler K, de Mey C. Adverse Events and Discomfort in Studies on Healthy Subjects: The Volunteer’s Perspective. European Journal of Clinical Pharmacology. 1997;53(3–4):207–214. doi: 10.1007/s002280050364.; McHugh J. Drug Test Cowboys: The Secret World of Pharmaceutical Trial Subjects. Wired. 2007 Apr 24; available at <http://www.wired.com/wired/archive/15.05/feat_drugtest.html> (last accessed November 4, 2013).; Patriquin M. Inside the Human Guinea Pig Capital of North America. Maclean’s. 2009;122(33); Solow, supra note 10. A University of Pennsylvania School of Medicine official observed, “We ask subjects to disclose if they’re participating in other trials–but if someone wants to lie, I won’t necessarily know if they’re simultaneously doing a trial across town.” Glenn D. Inside the Risky World of Drug-Trial ‘Guinea Pigs’. Chronicle of Higher Education. 2010 Jul 11;
  • 12.See Solow, supra note 10.
  • 13.See Abadie, supra note 8, at 60–61; T. Dulce, “Spanish Fly Guinea Pig: PPD Pharmaco, Where Slackers Refuel,” in Helms, supra note 7, at 34; C. Elliott, “Guinea-Pigging,” The New Yorker, January 7, 2008; Solow, supra note 10.
  • 14.See Patriquin, supra note 11; Solow, supra note 10.
  • 15.See Patriquin, supra note 11; Apseloff, Swayne, and Gerber, supra note 11.
  • 16.See Abadie, supra note 8, at 60–61; Dulce, supra note 13, at 37.
  • 17.See Hermann, supra note 11; Cohen LP. Stuck for Money: To Screen New Drugs for Safety, Lilly Pays Homeless Alcoholics. Wall Street Journal. 1996 Nov 14;
  • 18.See Abadie, supra note 8, at 24.
  • 19.These terms are used interchangeably. Both terms are criticized on grounds that they “subtly exaggerate the importance of the clinician,” suggesting a hierarchical rather than egalitarian relation between the medical professional and layperson. Steiner JF, Earnest MA. The Language of Medication-Taking. Annals of Internal Medicine. 2000;132(11):926–930. doi: 10.7326/0003-4819-132-11-200006060-00026.; Holm S. What Is Wrong With Compliance? Journal of Medical Ethics. 1993;19(2):108–110. doi: 10.1136/jme.19.2.108.
  • 20. Fisher JA. Medical Research for Hire: The Political Economy of Pharmaceutical Clinical Trials. New Brunswick, NJ: Rutgers University Press; 2009. at 187.; Abadie R. Tracking Professional Guinea Pigs. 2010 Oct 15; available at <http://www.thehastingscenter.org/Bioethicsforum/Post.aspx?id=4933&blogid=140> (last visited November 4, 2013). For a description of payment practices in studies offering payment to both healthy volunteers and patient-subjects, see Grady C, Dickert N, Jawetz T, Gensler G, Emanuel E. An Analysis of U.S. Practices of Paying Research Participants. Contemporary Clinical Trials. 2005;26(3):365–375. doi: 10.1016/j.cct.2005.02.003.
  • 21.See Smith, supra note 3, at 30.
  • 22.Simmons MS, Nides MA, Rand CS, Wise RA, Tashkin DP. Unpredictability of Deception in Compliance with Physician-Prescribed Bronchodilator Use in a Clinical Trial. Chest. 2000;118(2):290–295. doi: 10.1378/chest.118.2.290.
  • 23.Cohen J. Human Nature Sinks HIV Prevention Trial. available at <http://news.sciencemag.org/sciencenow/2013/03/human-nature-sinks-hiv-preventio.html> (last visited November 4, 2013).
  • 24.See Fisher, supra note 20, at 189.
  • 25. Bradley C. Designing Medical and Educational Intervention Studies. Diabetes Care. 1993;16(2):509–518. doi: 10.2337/diacare.16.2.509. at 511. See also Brewin C, Bradley C. Patient Preferences and Randomised Clinical Trials. BMJ. 1989;299(6694):313–315. doi: 10.1136/bmj.299.6694.313. (“despite having full information and giving consent, patients may still find themselves allocated to non-preferred treatments, which lowers their motivation to make the treatment work”).
  • 26.Moyer A. Psychomethodology: The Psychology of Human Participation in Science. Journal of Psychology of Science and Technology. 2009;2(2):59–72. at 64.
  • 27. Finn R. Cancer Clinical Trials: Experimental Treatments & How They Can Help You. Sebastopol, CA: O’Reilly; 1999. at 18, 31. Subjects have a protected right to withdraw from research despite their earlier consent to participate. See 45 C.F.R. § 46.116(a)(8) (2011). At the same time, individuals ought to enter trials with a good faith intention to remain unless participation becomes too burdensome.
  • 28.See Epstein, supra note 1, at 228. See also Arras JD. Noncompliance in AIDS Research. Hastings Center Report. 1990;20(5):24–32.
  • 29.See Moyer, supra note 26, at 64.
  • 30.Id.
  • 31.See Smith, supra note 3, at 29.
  • 32.See Fisher, supra note 20, at 192.
  • 33.See McHugh, supra note 11.
  • 34.See Mann H. Deception in the Single-Blind Run-In Phase of Clinical Trials. IRB: Ethics & Human Research. 2007;29(2):14–17.; Miller F, Wendler D, Swartzman L. Deception Research on the Placebo Effect. PLoS Medicine. 2005;2(9):853–859. doi: 10.1371/journal.pmed.0020262.
  • 35.Blackhart GC, Brown KE, Clark T, Pierce DL, Shell K. Assessing the Adequacy of Postexperimental Inquiries in Deception Research and the Factors that Promote Participant Honesty. Behavior Research Methods. 2012;44(1):24–40. doi: 10.3758/s13428-011-0132-6.
  • 36.Resnik DB, Ness E. Participants’ Responsibilities in Clinical Research. Journal of Medical Ethics. 2012;38(12):746–750. doi: 10.1136/medethics-2011-100319.
  • 37.See Smith, supra note 3.
  • 38.A study of normal volunteers showed that only about two-thirds of them promptly informed investigators about adverse events; the remainder withheld information temporarily or permanently. See Hermann et al., supra note 11.
  • 39.See Epstein, supra note 1, at 204. Subjects in the first placebo-controlled trial of AZT for HIV/AIDS admitted to sharing pills, but the trial still found that the drug was beneficial. Epstein reports that the subjects’ noncompliance did not have a major impact on the study’s outcome: “Noncompliance effectively blurred the differences between the treatment arm and the placebo arm, so the demonstration of a statistically significant difference became all the more impressive.” Id., at 238.
  • 40. Mitka M. FDA and Pharma Seek Better Ways to Assess Drug Safety, Efficacy in Clinical Trials. JAMA. 2012;307(24):2576–2577. doi: 10.1001/jama.2012.6684. at 2576. For specific cases in which initially approved dosages were later lowered due to safety concerns, see Smith, supra note 3.
  • 41.See Apseloff et al., supra note 11, at 356.
  • 42.See Fisher, supra note 20, at 182–183; Mitka, supra note 40, at 2576.
  • 43.See Rice S, Trafimow D. Known versus Unknown Threats to Internal Validity. American Journal of Bioethics. 2011;11(4):20–21. doi: 10.1080/15265161.2011.560344.
  • 44.See Tishler and Bartholomae, supra note 11; Apseloff et al., supra note 11.
  • 45.See Apseloff et al., supra note 11.
  • 46.Id.
  • 47.See Abadie, supra note 8, at 74, 158; Shamoo A, Resnik DB. Strategies to Minimize Risks and Exploitation in Phase One Trials on Healthy Subjects. American Journal of Bioethics. 2006;6(3):W1–W13. doi: 10.1080/15265160600686281.
  • 48.See Hermann et al., supra note 11.
  • 49.Miller FG, Wertheimer A. Facing Up to Paternalism in Research Ethics. Hastings Center Report. 2007;37(3):24–34. doi: 10.1353/hcr.2007.0044.
  • 50.De Ville K. The Case Against Contract: Participant and Investigator Duty in Clinical Trials. American Journal of Bioethics. 2011;11(4):16–18, at 17. doi: 10.1080/15265161.2011.560349.
  • 51.See Arras, supra note 28.
  • 52.Id., at 25. See also Resnik and Ness, supra note 36.
  • 53. Just Another Lab Rat! Mission Statement. 2013; available at <jalr.org/mission.html> (last visited November 15, 2013). The website is a project of Paul Clough, a man who earns his living through clinical trial participation. See O’Meara A. Chasing Medical Miracles: The Promise and Perils of Clinical Trials. New York: Walker & Company; 2009. at 111–112.
  • 54.Schaefer GO, Emanuel EJ, Wertheimer A. The Obligation to Participate in Research. Journal of the American Medical Association. 2009;302(1):67–72. doi: 10.1001/jama.2009.931. at 68.
  • 55.R. Helms, “The What, Why, and How of the GPZ Grading System,” in Helms, supra note 7, at 3–4. See also “22 Nights and 23 Days: Diary of #1J, Drug Study Subject,” 2006, available at <http://www.guineapigzero.com/23days.html> (last accessed April 26, 2013); Hellard ME, Sinclair MI, Forbes AB, Fairley CK. Methods Used to Maintain a High Level of Participant Involvement in a Clinical Trial. Journal of Epidemiology and Community Health. 2001;55:348–351. doi: 10.1136/jech.55.5.348. In a telling incident, after Harper’s Magazine published some of Helms’s report cards, a facility receiving a bad grade sued Helms for libel. See Abadie, supra note 8, at 52–53; Elliott, “Research Volunteers Wanted. Earn Up to $7000,” Tin House, Spring 2008, at 103–06, 104.
  • 56.See Abadie, supra note 8, at 57, 139; Finn, supra note 27, at 119; McHugh, supra note 11; Public Responsibility in Medicine and Research. In Their Own Voices: A Discussion with Research Subjects Who Also Work in the Field of Subject Protection. 2010 Dec 7; available at <http://www.meetingproceedings.com/2010/aerc/contents/index.asp> (last accessed April 26, 2013; restricted access).
  • 57.Guinea Pigs Get Paid. Tips for Clinical Trials and Clinical Study Volunteers. 2009; available at <http://www.gpgp.net/tips.html> (last accessed November 4, 2013).
  • 58.See Fisher, supra note 20, at 196–97.
  • 59.Public Responsibility in Medicine and Research. What Do Research Subjects Have to Say about Informed Consent? 2011 Dec 3; available at <http://www.eventscribe.com/2011/PRIMR/SearchByDay.asp?day=12/3/2011> (last visited April 26, 2013; restricted access).
  • 60.See Abadie, supra note 8, at 10–11.
  • 61.See Elliott, supra note 55, at 104.
  • 62.See “22 Nights and 23 Days,” supra note 55. In another sign of depersonalization, a volunteer reported that test site staff called him by his trial number instead of his name. See Abadie, supra note 8, at 29.
  • 63.See Abadie, supra note 8, at 157.
  • 64.See Fisher, supra note 20, at 184–85.
  • 65.Public Responsibility in Medicine and Research. A Discussion with Research Subjects and Their Advocates. 2009 Nov 15; available at < http://www.meetingproceedings.com/2009/aerc/contents/index.asp> (last visited November 4, 2013).
  • 66.See Fisher, supra note 20, at 190.
  • 67.E. Elliot, “Panic at Penn,” in Helms, supra note 7, at 29–33, 29.
  • 68.Shelton DL. Patients in Clinical Trials Don’t Always Follow the Program. American Medical News. 2000 Sep 11;
  • 69.See Blackhart, supra note 35, at 36.
  • 70.Rand CS, Sevick MA. Ethics in Adherence Promotion and Monitoring. Controlled Clinical Trials. 2000;21(5):241S–247S, at 245S. doi: 10.1016/s0197-2456(00)00085-4.
  • 71.Redfearn S. Smart-Pill Technology Could Monitor Patient Compliance While Improving Clinical Trial Data Quality. 2011 Apr 4; available at < http://www.centerwatch.com/news-online/article/1338/> (last visited November 4, 2013).
  • 72.See Rand and Sevick, supra note 70, at 245S.
  • 73.Other nations have such registries, and some U.S. research institutions do, as well. To be effective in our mobile society, registries need to cover a wide geographic area. See Resnik and Koski, supra note 9. For this reason, a private U.S. venture called Verified Clinical Trials is attempting to establish a worldwide registry. “Verified Clinical Trials,” available at <http://www.verifiedclinicaltrials.com> (last visited November 4, 2013).
  • 74.See Shumaker et al., supra note 4, at 228S.
  • 75.Id.
  • 76.See Simmons et al., supra note 22, at 294. See also Shumaker et al., supra note 4.
  • 77.Edwards S. Assessing the Remedy: The Case for Contracts in Clinical Trials. American Journal of Bioethics. 2011;11(4):3–12. doi: 10.1080/15265161.2011.560340. at 3.
  • 78.Robertson J. Contractual Duties in Research, Surrogacy, and Stem Cell Donation. American Journal of Bioethics. 2011;11(4):13–14. doi: 10.1080/15265161.2011.560343.
  • 79.See Resnik and Ness, supra note 36.
  • 80.See Smith, supra note 3, at 32.
  • 81.See Fisher, supra note 20, at 193–198.
  • 82.See Shumaker et al., supra note 4, at 229S.
  • 83.See Smith, supra note 3, at 30. Education will not always do the trick, however. Research coordinators told Jill Fisher that subjects who understand the scientific justification for placebo-controlled trials are not necessarily more accepting of assignment to a placebo group. See Fisher, supra note 20, at 189–190.
  • 84.Reame NK. Treating Research Subjects as Unskilled Wage Earners: A Risky Business. American Journal of Bioethics. 2001;1(2):53–54. doi: 10.1162/152651601300169103.
  • 85.Id., at 54.
  • 86.Elliott C. Justice for Injured Research Subjects. New England Journal of Medicine. 2012;367(1):6–8. doi: 10.1056/NEJMp1205623.; Dresser R. Aligning Regulations and Ethics in Human Research. Science. 2012;337(6094):527–528. doi: 10.1126/science.1218323.
  • 87.See, for example, R. Helms, “La Crème de la Crème: Thomas Jefferson University,” in Helms, supra note 7, at 8–9; Donno, “Awake with a Vengeance,” in id., at 22–27. Abadie reports that in recent years, competition among research organizations has produced improved conditions in some locales but not others. He also learned that some guinea pigs do not like the “fancy sites” because they are too large and impersonal. See Abadie, supra note 8, at 22–23. The comments on staff behavior bring to mind Michael Kahn’s plea for more emphasis on basic etiquette in medical training: “The very notion of good manners may seem quaint or anachronistic, but it is at the heart of the mission of other service-related professions. The goals of a doctor differ in obviously important ways from those of a Nordstrom’s employee, but why shouldn’t the clinical encounter similarly emphasize the provision of customer satisfaction through explicit actions?” Kahn M. Etiquette-Based Medicine. New England Journal of Medicine. 2008;358(19):1988–1989. doi: 10.1056/NEJMp0801863. at 1988.
  • 88.McDonald D, Lamberti MJ. The Psychology of Clinical Trials: Understanding Physician Motivation and Patient Perception. Centerwatch Research Brief. 2006 Oct 4; available at < http://www.centerwatch.com/news-online/article/566/the-psychology-of-clinical-trials-understanding-physician-motivation-and-patient-perception> (last visited November 4, 2013).
  • 89.See Hellard et al., supra note 55.
  • 90.See Epstein, supra note 1, at 208–264; Dresser R. When Science Offers Salvation: Patient Advocacy and Research Ethics. New York: Oxford University Press; 2001. at 21–43.
  • 91.Weisfeld N, English RA, Claiborne AB. Envisioning a Transformed Clinical Trials Enterprise in the United States: Establishing an Agenda for 2020. Washington, DC: National Academies Press; 2012.
  • 92.Meurer W, Lewis R, Berry D. Adaptive Clinical Trials: A Partial Remedy for the Therapeutic Misconception? Journal of the American Medical Association. 2012;307(22):2377–2378. doi: 10.1001/jama.2012.4174.
  • 93.See Floyd A, Moyer A. Effects of Participant Preferences in Unblinded Randomized Controlled Trials. Journal of Empirical Research on Human Research Ethics. 2010;5(2):81–93. doi: 10.1525/jer.2010.5.2.81.; Janevic M, Janz NK, Dodge JA, Lin X, Pan W, Sinco BR, Clark NM. The Role of Choice in Health Education Intervention Trials: A Review and Case Study. Social Science and Medicine. 2003;56(7):1581–1594. doi: 10.1016/s0277-9536(02)00158-2.
  • 94.See Arras, supra note 28, at 31. See also Finn, supra note 27, at 117, describing subject’s success in convincing researchers to require a lower number of biopsies in a trial.
  • 95.Researchers addressing subversive subjects in deception research note that there are “several ethical and methodological reasons why researchers should use deception sparingly.” The inability to accurately detect subjects’ awareness of deception is “yet another reason” for reducing their use of this technique. See Blackhart et al., supra note 35, at 36.
  • 96.Lynch JA. ‘Through a Glass Darkly’: Researcher Ethnocentrism and the Demonization of Research Participants. American Journal of Bioethics. 2011;11(4):22–23. doi: 10.1080/15265161.2011.560351.; Marsh V, Kamuya D, Rowa Y, Gikonyo C, Molyneux S. Beginning Community Engagement at a Busy Biomedical Research Programme: Experiences from the KEMRI CGMRC-Wellcome Trust Research Programme. Social Science & Medicine. 2008;67(5):721–733. doi: 10.1016/j.socscimed.2008.02.007.
  • 97.See Moyer, supra note 26, at 68.
  • 98.Jonas H. Philosophical Reflections on Experimenting with Human Subjects. Daedalus. 1969;98(2):219–247. at 236.
  • 99.Empirical evidence should be collected on the effectiveness of different strategies to reduce rule-breaking.
  • 100.See Abadie, supra note 8, at 51. In a panel presentation to researchers and Institutional Review Board staff and committee members, Helms called on the audience to value what healthy guinea pigs do for modern medicine. “Don’t think about us as couch potatoes who just take money,” he asked. See Public Responsibility in Medicine and Research, supra note 65.
  • 101.See Abadie, supra note 8, at 41; Stunkel L, Grady C. More Than the Money: A Review of the Literature Examining Healthy Volunteer Motivations. Contemporary Clinical Trials. 2011;32(3):342–352. doi: 10.1016/j.cct.2010.12.003.; Hermann et al., supra note 11.
