Abstract
Clinical nutrition research has played a pivotal role in establishing causality between diet or nutrient intake and health outcome measures and in the determination of dietary requirements and levels of supplementation to achieve specific outcomes. Because the studies are performed with humans, clinical nutrition research can be readily translated into public health messages. However, there are many challenges and considerations unique to the field, such as the baseline nutritional status of study participants, defining appropriate control groups, effective blinding of participants and investigators, the evolving ethics of randomized controlled trials, and a tension in a priori decisions regarding inclusion of nutritionally vulnerable participants versus representative samples of general populations. Regulatory approval requirements have grown dramatically in recent years, placing increasing burdens on the ability of investigators to carry out and complete research protocols. There is much room for improved efficiency in the approval and reporting processes aimed at protecting volunteers and providing transparency to the public. Decreased redundancy would have a direct benefit to clinical nutrition research and investigators. Despite these challenges, the information to be gained and the rewards of clinical nutrition research remain high.
Keywords: clinical nutrition research, dietary interventions, dietary supplements, human study ethics, human study regulations, institutional review boards, randomized controlled trials
INTRODUCTION
Clinical nutrition research involves the study of the effects of dietary interventions on one or more biological or health-related endpoints in human participants. Such research is foundational to providing evidence for dietary guidance and public health messaging. Experimental dietary modifications may include several components of a diet, often involving changes in whole food dietary patterns and, consequently, changes in multiple macronutrients and micronutrients simultaneously. A prime example is the Dietary Approaches to Stop Hypertension (DASH) diet, which has been shown to reduce blood pressure in adults.1 Increasing understanding of the influence of dietary patterns on health is a focus of the most recent iteration of the Dietary Guidelines for Americans (2015–2020).2 Alternatively, experimental dietary modifications may focus on the addition of a single nutrient to, or its removal from, the diet. Good examples are the randomized controlled trials (RCTs) that showed periconceptional folic acid supplements reduce the incidence of neural tube defects, which led to the now widespread practice of folic acid fortification around the world.3 Clinical nutrition studies have historically been key to determining nutrient requirements. This is especially true for micronutrients, where a single nutrient can be manipulated while other potential dietary confounders are held constant. For example, the level of calcium intake shown to optimize calcium accretion in adolescence4 is used as the recommended dietary allowance for calcium for adolescents.4
Basic principles for designing, managing, and conducting clinical research studies are available in the literature.5 The goal of this perspective paper is to provide an overview of some of the newer and more specific challenges associated with conducting clinical nutrition research. It is hoped that these shared experiences will help researchers who are new to the field be better prepared to navigate the obstacles.
SPECIAL ISSUES WITH DIET/NUTRIENTS AS THE INTERVENTION
The gold standard for clinical research is the double-blind RCT. Randomized controlled trials reduce confounding and allow causation to be inferred. In clinical nutrition studies, the whole diet can be controlled, as in metabolic balance studies, or a single nutrient or bioactive compound can be withheld or provided as a supplement alongside an otherwise self-selected diet. Randomized controlled trials of single micronutrients or bioactive ingredients are much easier to accomplish than macronutrient, whole food, or dietary pattern studies. For the more complex interventions, the design of menus; the procurement, storage, and transfer of the intervention to the participant; and the participant’s handling of the intervention, including storage, preparation, and protocol compliance, can be daunting. Even when a study is conducted at a clinical research site, there are logistical challenges, such as parking availability, the number of subjects that can be accommodated at a given time, storage concerns, and so forth. Moreover, it is difficult, if not impossible, to blind participants to the forms of whole foods or macronutrients that make up their study meals. In contrast, micronutrients or other bioactive compounds may be provided in the form of pills that are identical in appearance to placebo, thus allowing for true blinding, similar to pharmaceutical drug trials. However, there are important differences between diet/nutrient RCTs and drug RCTs that greatly influence the design and conduct of such trials as well as their interpretation.
Challenges with nutrition interventions that are not found in drug trials are summarized in Table 1. For compliance with a dietary pattern or macronutrient feeding study, the diet or food has to be well tolerated; there is also a higher bar for a dietary intervention to be appealing than for a drug meant to treat a specific health problem, for which motivation to participate and comply with study protocols may be higher because of the immediacy of the condition (eg, cancer, heart disease, Alzheimer’s disease). In most studies, a food or diet must be acceptable across a wide range of tastes and cultural preferences. Furthermore, unlike with drugs, the whole diet may shift when the aim is to study 1 dietary component. This is especially true for macronutrients (ie, protein, fat, and carbohydrates): when 1 is manipulated, so is another. This raises the question: Was the response due to increased protein or fat, for example, or to a decrease in the displaced macronutrients? And this assumes that the alterations in macronutrients are made while maintaining equal overall caloric intake (a worked example follows Table 1). Alternatively, a specific macronutrient may be altered without changing intake of other macronutrients, but this imposes the confounding factor that overall caloric intake has been changed. Moreover, proteins, fats, and carbohydrates are general categories of nutrients with subcategories that may have very different dietary and metabolic properties. For example, when designing a low-fat experimental diet to be compared with a high-fat diet, what are the forms and distribution of fats (saturated, monounsaturated, polyunsaturated) that will be used in each diet? In addition, when changing whole foods in a diet, it is very difficult to maintain equal intakes of micronutrients and bioactive compounds because the levels of these substances can vary dramatically among different foods.
Table 1. Challenges with nutrition interventions that are not found in drug trials

| Study design factor | Issues for consideration |
|---|---|
| Test nutrient or dietary pattern | Are other nutrients being displaced, and will this confound interpretation of the study? Is the dietary formulation appealing and conducive to participant compliance? |
| Background status | What is the baseline dietary status of the test population? Will this affect the effectiveness of the intervention? |
| Delivery of test diet or supplement | How do the form, matrix, other components, and processing of the test diet or nutrient supplement affect the conduct of the trial and the participants’ ability to follow the study protocol? |
| Controls | Is there an ethical and meaningful comparator? |
| Length of study | Is the intervention sufficiently long for an effect on a biomarker or health endpoint to be observed? |
| Blinding | How can a food or diet pattern be blinded to both the participants and the investigators? |
| Study population | Is the priority to study the population likely to benefit from the intervention or to achieve generalizable results? |
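To make the macronutrient substitution issue concrete, the following is a purely illustrative calculation (a hypothetical 2000 kcal/d diet, not data from any study discussed here) showing that, when total energy is held constant, an increase in one macronutrient is unavoidably a substitution for another.

```python
# Illustrative arithmetic only: a hypothetical isocaloric diet in which protein
# is raised from 15% to 25% of energy. Under the isocaloric constraint, the
# added protein energy must displace carbohydrate and/or fat energy.
TOTAL_KCAL = 2000  # assumed daily energy intake
KCAL_PER_G = {"protein": 4, "carbohydrate": 4, "fat": 9}

def grams(pct_energy, nutrient, total_kcal=TOTAL_KCAL):
    """Convert a percentage of total energy into grams of a macronutrient."""
    return total_kcal * pct_energy / 100 / KCAL_PER_G[nutrient]

added_protein_g = grams(25, "protein") - grams(15, "protein")   # 50 g
added_protein_kcal = TOTAL_KCAL * (25 - 15) / 100               # 200 kcal

print(f"Added protein: {added_protein_g:.0f} g ({added_protein_kcal:.0f} kcal)")
# Those 200 kcal must come out of carbohydrate and/or fat, so any observed
# response could reflect more protein, less carbohydrate or fat, or both.
print(f"Displacement equivalent: {added_protein_kcal / KCAL_PER_G['carbohydrate']:.0f} g "
      f"carbohydrate or {added_protein_kcal / KCAL_PER_G['fat']:.0f} g fat")
```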
Another confounding factor is that the background intake and status of a nutrient of interest can greatly influence the response being studied. In drug trials, unlike in nutrition trials, there is an absence of the drug at baseline. This is almost never true for nutrient studies, thus requiring consideration of the baseline status of the nutrient in question in the study population. Our current ability to assess usual intake of a nutrient, much less a bioactive constituent, is poor,6,7 and only some nutrients have a good biochemical status indicator (eg, a blood or urinary analyte). Moreover, if the baseline level of the intervention substance is already adequate, little change in outcome can be expected. Many nutrients have threshold intakes—that is, the enzyme, carrier, or receptor becomes saturated. The majority of studies do not consider the starting status of participants being recruited or even assess intake or status at the start of the study. On the other hand, recruiting only those with an intake level below a certain threshold or those most at risk for the outcome being measured limits the generalizability of findings. In addition, withholding a nutrient in an RCT from individuals known to be low or deficient in that nutrient may create an ethical dilemma.
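As a generic illustration of this saturation behavior (a standard saturable-carrier form offered only as a sketch, not a model taken from the studies discussed here), absorption or utilization of such a nutrient can be written as

$$ A(I) = \frac{A_{\max}\, I}{K + I}, $$

where $I$ is intake, $A_{\max}$ is the maximal amount absorbed or utilized, and $K$ is the intake at which the response is half-maximal. When baseline intake is already well above $K$, additional intake changes $A(I)$ very little, which is one reason an intervention in an already replete population may show no measurable effect.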
As with many drugs, the form a nutrient or bioactive food component takes, the matrix that it is in, and the dose can influence its bioavailability. For example, synthetic folic acid is more bioavailable than natural folates found in foods,8 and the bioavailability of vitamin B12 varies among natural food sources9 and decreases with increasing dose due to physiological limits on absorption.10 Processing of food can render the compound of interest more or less bioavailable through exposure to temperature, change in pH, exposure to other constituents, and so on. Some nutrients share the same transporters and compete for absorption (eg, zinc and copper), whereas absorption of some nutrients may be affected by other dietary constituents (eg, the chelation of minerals such as magnesium, calcium, zinc, and iron by phytates).
Of special interest for dietary guidance are acute effects that may differ from chronic exposure. For example, whey proteins enhance calcium absorption acutely but not when they are consumed chronically.11 Similarly, vitamin C enhances iron absorption acutely but not chronically.12 Consumption of supposedly bioactive ingredients purified from whole food can also have unexpected results. One of the most famous examples is from the Alpha-Tocopherol, Beta Carotene Cancer Prevention Study.13 Fruits and vegetables have long been associated with reduced risk of lung and other cancers. But when >29 000 Finnish male smokers were given rather high doses of alpha-tocopherol (50 mg/d) and beta carotene (20 mg/d) alone or together for 5–8 years, a higher incidence of lung cancer occurred in men who received beta carotene than in those who did not. This finding suggests that the presumption of benefit or lack of potential harm, which is often ascribed to vitamins, may not apply to all individuals. This may be particularly true for individuals with initiated cancers, where specific nutrients or dietary patterns may promote progression. This phenomenon of “feeding” initiated cancers may underlie the findings of the Alpha-Tocopherol, Beta Carotene Study, as well as the controversial finding that colorectal cancers might have been temporarily promoted in the United States and Canada after initiation of government-mandated folic acid fortification policies.14,15
What is an ethical control? The concept of using standard of care as the control in medicine has been adopted in dietary supplement studies. In osteoporosis research, the typical control group receives calcium and vitamin D supplements rather than a placebo. Is this appropriate for nutrition research? Classic depletion/repletion studies were a staple of nutrition research in previous decades, and they revealed much of our fundamental knowledge about the metabolic and physiologic effects of nutrient deficiencies. However, deliberately depleting participants of a nutrient to the point of harm—or even to the point of biochemical impairment without overt clinical or physiological consequence—is now typically considered unacceptable. In recent years, institutional review boards (IRBs) have extended this ethical concept to RCTs in which deficiency is not being induced by the study protocol per se but a portion or all of the study population is low or deficient in a specific nutrient at baseline based on their own selected dietary patterns or other circumstances. It could be argued that there is value to understanding the effectiveness of an intervention in people in their natural condition, and to their credit, IRBs often allow study of subjects whose usual intakes, diet patterns, or nutrient status are insufficient. However, in some cases, IRBs have concluded that because those conducting the trial are aware that some or all of the study sample has low or deficient status, it is unethical to conduct a placebo-controlled trial because a portion of the participants will, by receiving placebo, go untreated for their deficiency. Continuing with this line of reasoning could lead to the conclusion that, because nutritional deficiencies are known to exist all over the world, nutritional research should not be performed at all and that researchers should, instead, dedicate their time to finding deficient individuals and repleting them.
Of course, it is not being suggested that individuals in current or imminent danger from nutritional deficiencies be denied intervention. Rather, it is being pointed out that part of the problem may be semantic and of the research community’s own making. Individuals tend to be labeled “deficient” if they have a blood level of a nutrient below a specific cutoff value. However, often these individuals have no overt, clinical signs of deficiency and, therefore, may more accurately be considered to have “low” or “suboptimal” status. For these individuals, it may be ethical to deny treatment (ie, by including them in a placebo group) based on the concept of “equipoise.”
Equipoise, in medicine, refers to the uncertainty around whether a treatment will be effective.16,17 With appropriate consideration of risks and benefits, the principle of equipoise may be used to justify a particular placebo-controlled intervention study. Consider the experience of 1 of the present authors (J.W.M.): He was part of a research team that had found, as others had reported in the literature, that a particular patient population had low circulating levels of a particular nutrient but no overt clinical signs or symptoms directly ascribable to the “deficiency.” The research team submitted a proposal to the National Institutes of Health (NIH) to perform an RCT in this patient population to determine whether supplements of the nutrient might be beneficial. A reviewer of the proposal commented that the study was unethical because treatment would be withheld from deficient individuals. In rebuttal, it was argued that disallowing the RCTs meant that it would never be known whether supplements of the nutrient would be beneficial, and without credible RCTs, medical practice (which did not include supplements of the nutrient in these patients) would not be changed. Therefore, by the principle of equipoise and the state of the evidence known at the time, it was argued that it would be unethical not to perform the study because there was a chance that patients might benefit in the future. This argument was successful, and the study was funded.
Making successful ethical arguments in favor of nutritional research strategies and protocols is essential to the future of nutrition and health. In particular, it must be recognized that, as knowledge accumulates, nutritional recommendations change over time. An example comes from clinical bone research. Supplementation with calcium and vitamin D is considered standard of care, and it has been the practice to include these supplements across groups in bone research studies, including the placebo group. Thus, knowledge of the efficacy of diet and drugs on bone outcome measures without calcium and vitamin D supplements is lacking. However, recent questions surrounding the safety of these supplements have led to a decline in sales. Researchers must, therefore, be able to carry out nutrition research on bone outcomes and other topics using strategies and protocols that are ethical and that can improve understanding in ways that allow evidence-based health care decisions to be made.
Another issue with nutrition research is the difficulty of intervening for a sufficient length of time to investigate chronic disease outcomes. An inherent limitation is that funding periods are typically ≤5 years. Obesity and the chronic diseases of most concern today (eg, cancer, vascular disease, neurodegenerative disease, osteoporosis) have long latency periods. Partly because it is difficult to hold intellectual property around a diet or dietary components, there is much less funding available to support long-term nutrition studies than drug trials. It is also more feasible for volunteers to take a drug daily than to change their diet for a prolonged period of time. Providing a prescribed diet for a lengthy study period is labor intensive and operationally difficult. To illustrate, an ongoing study involves 4 diets (ie, DASH-high sodium, DASH-low sodium, usual-high sodium, and usual-low sodium) at 5 energy levels, yielding 20 different menu preparations, for >500 adolescents for 25 days each. Blood pressure and serum lipid changes can be monitored in this time frame but not disease outcomes.
Blinding, a study design element intended to reduce bias, is difficult to achieve in most nutrition studies. For example, in the aforementioned DASH/sodium dietary study, the kitchen staff involved in menu preparation has to know the intervention, and it is difficult to disguise from the participants whether a diet is rich in fruits, vegetables, and dairy or low in these foods, or whether the food is salty. Nevertheless, some principles encompassed in the philosophy of blinding can be applied. For example, staff collecting primary outcome measures can be blinded to the intervention, and interventions can be coded so that statisticians analyzing the data are unaware of intervention assignment.
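The coding step can be illustrated with a minimal sketch (hypothetical file name and codes, not a procedure taken from the DASH/sodium study itself) in which each intervention arm is mapped to an uninformative label so that statisticians and outcome assessors never see the true assignments.

```python
# Minimal sketch of coding intervention arms for blinded analysis.
# Only the unblinded coordinator keeps the key; analysts receive data sets in
# which arm labels have been replaced by codes, and the key is opened only
# after the primary analysis is locked.
import csv
import random

def make_blinding_key(arms, seed=None):
    """Map each study arm to an arbitrary, uninformative code such as 'ARM-407'."""
    rng = random.Random(seed)
    codes = rng.sample(range(100, 1000), k=len(arms))
    return {arm: f"ARM-{code}" for arm, code in zip(arms, codes)}

arms = [
    "DASH-high sodium",
    "DASH-low sodium",
    "usual-high sodium",
    "usual-low sodium",
]
key = make_blinding_key(arms)

with open("blinding_key.csv", "w", newline="") as f:  # stored securely by the coordinator
    writer = csv.writer(f)
    writer.writerow(["arm", "code"])
    writer.writerows(key.items())
```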
Deciding on a study population is also a challenge, especially with limited funding. There is a tension between achieving generalizable findings and achieving sufficient power to detect an effect. Because of the typically small effects of diet on physiologic responses and the long latency of effects of diet on disease outcomes, there is strong motivation to select a homogeneous, narrow population to reduce the sample size required to see an effect. Selection of a group most likely to be responsive (inadequate nutrient status, high-risk population, etc) and a homogeneous population to reduce variance (narrow age range, same life stage, same sex, same ethnicity, etc) is more likely to produce a positive outcome. However, it comes at a cost to the generalizability of results, which is important for establishing public health guidelines. Seldom are nutrition RCTs sufficiently funded to recruit enough volunteers to represent a population and also to include enough volunteers in the subpopulation that is likely to experience an outcome during the time frame of the trial. A more recent concern is defining a healthy population, considering that more than half of American adults have at least 1 chronic condition.
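The sample-size arithmetic behind this tension can be sketched with a back-of-the-envelope calculation (illustrative numbers only, not drawn from any study cited here): recruiting a homogeneous group shrinks the outcome’s standard deviation, targeting a responsive subgroup enlarges the expected effect, and either change sharply reduces the number of volunteers required per arm.

```python
# Approximate per-arm sample size for a two-arm parallel trial comparing means,
# using the standard normal-approximation formula.
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Participants per arm to detect a mean difference delta given an outcome sd."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    return ceil(2 * (z * sd / delta) ** 2)

# Hypothetical outcome: a difference in systolic blood pressure (mm Hg).
print(n_per_group(delta=2, sd=10))  # heterogeneous sample: ~393 per arm
print(n_per_group(delta=2, sd=6))   # narrower, homogeneous sample: ~142 per arm
print(n_per_group(delta=4, sd=10))  # responsive subgroup (larger effect): ~99 per arm
```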
The tension between generalizability and effect may be decided by impact goals. For public health messaging, generalizability is critical because the target audience is the masses. With an increasing interest in personalized nutrition, identifying subgroups or individuals who can benefit from an intervention becomes the larger goal.
COMPLEX EFFECTS
With the increased ability to process complex data, nutrition research is expanding the outcomes being considered from single pathways or tissues to multiple outcomes, be they beneficial or harmful. For example, the gut, with its microbiome, is being viewed as an organ that actively participates in processing and synthesizing nutrients together with the host to impact health. Bone is being considered in conjunction with muscle and adipose tissue in new ways.
There is also concern about the possible harmful consequences of dietary supplements or fortification programs designed to fill gaps between nutrient intake levels compared with requirements. A current debate is over folic acid. In the United States, it was mandated that wheat flour be fortified with folic acid by 1998; subsequently, the incidence of neural tube defects decreased dramatically between 1998 and 2004, by 19%–32%, justifying the mandate because it achieved its intended purpose of reaching women of childbearing age, who comprised the vulnerable subgroup.18,19 However, concern over an associated increased risk of cancer, especially colorectal cancer, with folic acid fortification and folic acid supplement use has caused some to question the practice for the entire population.14,15,20 Moreover, there is increasing, although circumstantial, evidence that an imbalance of low vitamin B12 status with high folate status can have negative effects on development,21 cognition in older adults,22–24 and response to B12 supplementation.25 Nevertheless, >80 countries have adopted mandatory folic acid fortification of at least 1 cereal grain (http://www.ffinetwork.org/index.html).
Overall, fortification of foods is declining. Demand for unfortified ready-to-eat cereals is concerning because consumption of such cereals could widen the nutrient gap between intake levels and recommendations, especially for children and the elderly. Fortified ready-to-eat cereals are a major source of vitamin B12 in a bioavailable form for the elderly, and fortification in general helps most Americans meet recommendations.26 However, concern over safety and lack of benefit of dietary supplements, including calcium, vitamin D, vitamin E, and multivitamins, has received a great deal of media attention.
COPING WITH BIAS
Scientists routinely have to navigate bias, both that of others and their own. Important examples of the former include the biases of reviewers of grant applications and manuscripts, as well as public and professional perceptions. External assumptions of bias can be particularly acute when the research is funded by industry, which has become a growing issue as federal funding declines and industry funding is sought to fill the void and maintain research programs. Biases and misinformation in the media, especially in opposition to industry-supported research, can be particularly strong and widespread. Examples of individual bias include the desire for respect and recognition among peers, the academic imperative to “publish or perish,” a personal history of supporting a specific position, personal passions, ideologies or philosophies, religious or ethical orientations, nationality, ethnicity, and financial conflicts of interest.
In addition, nutrition scientists encounter some unique biases. At a basic level, there is no agreement about the best approach to study the role of nutrition in health and disease. Basic scientists prioritize finding a molecular mechanism for what a nutrient does or how nutrient status influences molecular machinery. Without that, they are not convinced of the phenomenon. Critics of this approach disagree. What is learned from in vitro studies may not represent the human condition and may very well be an artifact of the manipulated environment. Animal models provide the distinct advantage of allowing long-term controlled diet designs with disease outcomes. However, no animal model is a completely satisfactory model of a human disease. Randomized controlled trials in humans are relevant, allow causal inference, and minimize confounding but typically suffer from poor compliance, are of inadequate duration to have disease outcome measures, and are criticized for being artificial compared with the human experience. Epidemiology attempts to find relations in the context of usual behavior and, thus, may fulfill the desire to study steady-state phenomena. On the other hand, results are associational and not causal. Teasing out the role of 1 nutrient or food or a diet pattern from the milieu of confounders is a daunting task. Moreover, the methodologies to capture what individuals eat remain crude. Each line of evidence provides insights, but none are perfect or ideal in nutrition research.
Biases arise even within each approach to studying nutrition. When reviewing a report of an RCT, evaluating compliance with the intervention is subject to one’s experience. If the paper reports 100% compliance, a reviewer could be concerned about coercion. For example, if the study was conducted in an African village that required approval of the village chief, the culture may be that everyone follows the chief’s decision. Similar differences in cultural perceptions exist for subject reimbursements. Some cultures find reimbursements coercive, and others find them ethical to offset subject burden. Some international IRBs will not review applications with budgets below some set amount, which could be viewed as inappropriate to the purpose of an IRB by others.
Each study environment has its own unique circumstances to be considered. For example, in the United States, controlled feeding studies in adolescents have been conducted at summer research camps; eleven such camps have investigated calcium metabolism (eg, Jackman et al.27). Going away to camp is a well-accepted American experience, but the concept is met with curiosity and sometimes skepticism outside the United States. Summer camp environments alter the concept of subject burden. Many more measurements can be taken over time than is feasible during a study visit to a clinic, where each added measurement lengthens the visit. Time is rarely a consideration at a camp, where the challenge is to fill time with activities that are enjoyable to the participants.
All types of research can contribute to our understanding of nutrients and health. Knowledge is needed of whether there is an effect, at what dose and for whom the effect occurs, the mechanism involved, and whether the effect can feasibly be translated into practice. It is prudent to weigh all evidence and evaluate it critically.28
EXPANSION OF REGULATORY APPROVALS
The growth in regulatory approvals required for clinical nutrition research over the last 3 decades has made such research daunting, especially for junior scientists who can ill afford the months to years that may be required to get a study launched (Box 1). Streamlining the cumbersome approval process for conducting human research would be a worthwhile endeavor to stimulate this valued type of research contribution. Without clinical trials, there can be no updated systematic reviews to provide the evidence needed to set guidelines.
Box 1. Regulatory approvals and reporting requirements for clinical nutrition research

- Institutional review board training, approvals, and reporting
- Good clinical practice training
- Clinical trials registries
- Sponsor audits and progress reports
- Conflict-of-interest training and reporting
- Data sharing plans
- Data safety and monitoring plans and reports
- Investigational new drug applications or requests for exemptions
- Isotopic tracer approvals and protocols
Approval from an IRB to begin a research study involving human subjects has been required for many years. As institutional programs around IRBs have grown, the amount of effort required by investigators to annually update IRB approvals and to apply for and receive permission for slight modifications to the recruitment process or study design has multiplied. This effort is compounded when >1 institution is engaged in the research, and the approval of >1 IRB is required. Attempts are being made for reciprocity across IRBs, but there is still much room for improved efficiency.
For some trials, an additional oversight burden is placed on investigators when sponsors provide monitoring of the study. All oversight reports, including monitor assessments, data safety reports, and reports from data safety and monitoring boards, must be provided to the IRB. Although the goal of a single oversight entity is laudable, the research enterprise has grown so complex that today there is no such entity. The multiplicity of reviews is a threat to efficient and effective protection of subjects.
Human subject protection training for investigators and key research personnel came into vogue in the early 2000s. The burden for institutions to prepare and deliver these trainings resulted in the development of the Collaborative Institutional Training Initiative in 2000. The content has significantly expanded over the years and now encompasses a series of 14 different trainings, with various parts and subparts. Each research institution establishes which members of the research team require training, which series are required, and which modules within a series are required, making the demonstration of compliance with training requirements problematic when >1 institution is engaged or when researchers change institutional affiliations.
Nevertheless, the IRB is a salient safeguard for subject protection. Streamlining the process should be imperative across all sectors. It is important that new investigators build substantial time for training and IRB approval into protocols in the development stage.
In 1997, the US Congress passed a law (the Food and Drug Administration Modernization Act) requiring registration of clinical trials. In 2000, NIH launched a publicly available clinical trial registry, ClinicalTrials.gov. The requirements for its use have continued to expand, such that in 2005 the International Committee of Medical Journal Editors began requiring trial registration as a condition of publication. In 2006 the World Health Organization stated that all clinical trials should be registered, and in 2007 the World Health Organization launched the International Clinical Trials Registry Platform. Also in 2007 Congress passed the Food and Drug Administration Amendments Act, expanding the requirements for submission to ClinicalTrials.gov and imposing civil monetary penalties for noncompliance. In 2015, NIH modified the definition of “clinical trial,” again expanding the net to bring more studies into these registries. Beginning in 2017, all investigators who design, oversee, manage, or conduct clinical trials will be required to complete online training in good clinical practice. Clinical trial grants submitted to NIH will need to include plans for registering the trial in ClinicalTrials.gov.
This evolution directly impacts investigators. As an example, top journals such as the New England Journal of Medicine, JAMA, and the American Journal of Clinical Nutrition will not consider manuscripts reporting studies that were not registered, and when registration must occur is a question of concern. In this shifting environment, registration was initially required prior to manuscript submission, but NIH and journals now require registration prior to recruitment of volunteers. In addition, editors check the registration sites to make sure the manuscripts represent the a priori primary aims of the study; if not, the author(s) must declare that a finding is being reported that was not initially a primary aim.
At this point, there are multiple levels of oversight and reporting requirements for research whose aim is to improve the health and welfare of people. To ensure that there is objectivity in this research, financial conflicts of interest must be declared at several levels (eg, to the university when a grant or renewal is submitted, to the journal when submitting a manuscript, to some journals when reviewing a manuscript, and to the audience when presenting research in a public venue). Training on conflicts of interest is another professional requirement for investigators. Conflict-of-interest training is not unique to clinical research and is required for many professions, such as law. Disclosing financial conflicts of interest is important for transparency to garner trust from the public.29 While there is no evidence that redundancy in conflict-of-interest training has improved the quality of the science produced or protected human subjects, the training does involve significant financial and human resources. And worse, even with transparency, some people have a bias that privately funded research yields only results that are favorable to the funder's interests.
Looking ahead, data sharing for clinical research is now expected by federal sponsors, but mechanisms for sharing data are not yet publicly available. In some areas of science, public databases are valuable (eg, GenBank for DNA sequence sharing). The approaches that have been used to date, including online supplementary information with journal articles and authors’ personal webpages, are unsatisfactory. Libraries will likely help with solutions in the near future.
INVESTIGATIONAL NEW DRUGS
In September 2013, a guidance document for clinical investigators, sponsors, and IRBs was released by the US Food and Drug Administration (FDA) to determine whether human research studies can be conducted without an investigational new drug (IND) application.30 In effect, researchers working on food and nutrition intervention studies with health outcomes beyond nutritional deficiencies were expected to apply to the FDA to determine whether they were exempt from requiring an IND. The process was managed the same as for drugs, and the application process was unclear, even as to whom the inquiry should be directed. The Center for Food Safety and Applied Nutrition, for example, has no staff to review IND applications. This non–legally binding guidance led many industry sponsors to take their clinical research overseas to avoid delays in conducting domestic research. In addition, it left industry in a regulatory dilemma: if a company were to file an IND to perform a clinical trial, would the product be considered a food or a drug? In the latter case, both extensive monetary investment and approval processes would be required. The guidance compromised clinical nutrition research in the United States, especially research on bioactive foods and ingredients, and stalled the productivity of untenured faculty, thus endangering the food research structure of the country.
In response to a national protest in the form of letters to the FDA signed by >70 nutrition and food science administrators and professional societies, the FDA issued a notice of stay (for parts of the guidance), which was published in the Federal Register on October 30, 2015 (80 FR 66907). Investigational new drug regulations [21 CFR 312.2(b)] state that clinical investigations of a biologic product lawfully marketed in the United States are exempt from IND application requirements if they meet the 5 designated criteria outlined in Box 2. Still, NIH will not allow self-interpretation of whether these requirements are met. Thus, if the research is sponsored by NIH, investigators must apply for exempt status from the IND requirement.
Box 2. Criteria for exemption from investigational new drug (IND) application requirements [21 CFR 312.2(b)]

All 5 of the following criteria must be met for a clinical investigation of a biologic product to be exempt from the requirements for an IND application:

1. The investigation is not intended to be reported to the FDA as a well-controlled study in support of a new indication for use, nor intended to be used to support any other significant change in the labeling for the drug.
2. The investigation is not intended to support a significant change in the advertising for a prescription drug product.
3. The investigation does not involve a change in route of administration, dosage level, or patient population, or any other factor that significantly increases the risk (or decreases the acceptability of risk) associated with use of the drug product.
4. The investigation is conducted in compliance with the requirements for institutional review (21 CFR 56) and informed consent (21 CFR 50).
5. The investigation is conducted in compliance with the requirements of 21 CFR 312.7 (ie, the drug may not be represented as safe or effective, nor may it be commercially distributed) for the purposes for which it is under investigation.
USE OF ISOTOPIC TRACERS
Use of radioactive isotopic tracers also requires an IND application if the tracers are used for immediate therapeutic, diagnostic, or similar purposes or otherwise to determine the safety and efficacy of the product. Exemptions may be allowed for basic research in which these conditions do not apply (21 CFR 361). However, use of radioisotopes in humans under such exemptions must be approved by a Radioactive Drug Research Committee, which is itself approved by the FDA. Stable isotopes typically do not require an IND application.
Intravenous isotope preparation requires sterile techniques and standard pharmaceutical compounding protocols. Careful documentation, including the source and lot of all chemicals used and the aliquot scheme, as well as pyrogenicity and sterility testing, is necessary for stored preparations. United States Pharmacopeia chapter <797> on sterile compounding covers some of the materials, but there are no specific regulations for stable isotopes in humans.
CONCLUSION
Clinical nutrition researchers encounter many hurdles, including difficulties with recruiting volunteers, navigating a complex maze of approvals, and coping with myriad biases. Special scientific issues involved with clinical nutrition research include study designs that increase or decrease the status of a nutrient, food, or bioactive agent but often do not compare presence with absence of the compound (as is typical in drug trials); ethical issues regarding withholding of a nutrient from participants who are low or deficient in that nutrient; study populations that may already be sufficient in the compound of interest and, thus, may not show benefit of supplementation; interventions that are difficult to blind to both the subjects and the investigators; and a tension between studying subgroups most likely to respond versus recruiting a representative and, therefore, generalizable sample. Nevertheless, clinical nutrition research is an essential endeavor that provides the evidence base underlying dietary requirements and public health messages. Despite its intricacies, clinical nutrition research can have a profound impact on both individuals and populations, thus justifying the effort.
Acknowledgments
Funding/support. C.M.W. is supported by a grant from the National Heart, Lung, and Blood Institute (U01 HL117835).
Declaration of interest. The authors declare no conflicts of interest relevant to the content of this article.
References
- 1. Appel LJ, Moore TJ, Obarzanek E, et al. A clinical trial of the effects of dietary patterns on blood pressure. DASH Collaborative Research Group. N Engl J Med. 1997;336:1117–1124.
- 2. US Department of Health and Human Services, US Department of Agriculture. 2015–2020 Dietary Guidelines for Americans. 8th ed. December 2015. http://health.gov/dietaryguidelines/2015/guidelines/. Accessed May 28, 2017.
- 3. MRC Vitamin Study Research Group. Prevention of neural tube defects: results of the Medical Research Council Vitamin Study. Lancet. 1991;338:131–137.
- 4. Institute of Medicine. Dietary Reference Intakes for Calcium and Vitamin D. Washington, DC: National Academies Press; 2011.
- 5. Griel AE, Psota TL, Kris-Etherton PM. Chapter 9: Designing, managing and conducting a clinical nutrition study. In: Monsen ER, Van Horn L, eds. Research: Successful Approaches. 3rd ed. Chicago, IL: Academy of Nutrition and Dietetics; 2008:101–116.
- 6. Dhurandhar NV, Schoeller D, Brown AW, et al. Energy balance measurement: when something is not better than nothing. Int J Obes (Lond). 2015;39:1109–1113.
- 7. Subar AF, Freedman LS, Tooze JA, et al. Addressing current criticism regarding the value of self-report dietary data. J Nutr. 2015;145:2639–2645.
- 8. Suitor CW, Bailey LB. Dietary folate equivalents: interpretation and application. J Am Diet Assoc. 2000;100:88–94.
- 9. Watanabe F. Vitamin B12 sources and bioavailability. Exp Biol Med. 2007;232:1266–1274.
- 10. Chanarin I. The Megaloblastic Anemias. Oxford, UK: Blackwell Scientific Publications; 1969.
- 11. Zhao Y, Martin BR, Wastney ME, et al. Acute versus chronic effects of whey proteins on calcium absorption in growing rats. Exp Biol Med. 2005;230:536–542.
- 12. Hunt JR, Gallagher SK, Johnson LK. Effect of ascorbic acid on apparent iron absorption by women with low iron stores. Am J Clin Nutr. 1994;59:1381–1385.
- 13. The Alpha-Tocopherol, Beta Carotene Cancer Prevention Study Group. The effect of vitamin E and beta carotene on the incidence of lung cancer and other cancers in male smokers. N Engl J Med. 1994;330:1029–1035.
- 14. Mason J, Dickstein A, Jacques PF, et al. A temporal association between folic acid fortification and an increase in colorectal cancer rates may be illuminating important biological principles: a hypothesis. Cancer Epidemiol Biomarkers Prev. 2007;16:1325–1329.
- 15. Vollset ES, Clarke R, Lewington S, et al. Effects of folic acid supplementation on overall and site-specific cancer incidence during the randomised trials: meta-analyses of data on 50 000 individuals. Lancet. 2013;381:1029–1036.
- 16. Freedman B. Equipoise and the ethics of clinical research. N Engl J Med. 1987;317:141–145.
- 17. London AJ. Equipoise in research: integrating ethics and science in human research. JAMA. 2017;317:525–526.
- 18. Crider KS, Bailey LB, Berry RJ. Folic acid food fortification—its history, effect, concerns, and future directions. Nutrients. 2011;3:370–384.
- 19. Miller JW. Folic acid fortification. In: Herrmann W, Obeid R, eds. Vitamins in the Prevention of Human Diseases. Berlin, Germany: De Gruyter; 2011:273–293.
- 20. Smith AD, Kim YI, Refsum H. Is folic acid good for everyone? Am J Clin Nutr. 2008;87:517–533.
- 21. Yajnik CS, Deshpande SS, Jackson AA, et al. Vitamin B12 and folate concentrations during pregnancy and insulin resistance in the offspring: the Pune Maternal Nutrition Study. Diabetologia. 2008;51:29–38.
- 22. Morris MS, Jacques PF, Rosenberg IH, et al. Folate and vitamin B-12 status in relation to anemia, macrocytosis, and cognitive impairment in older Americans in the age of folic acid fortification. Am J Clin Nutr. 2007;85:193–200.
- 23. Morris MS, Selhub J, Jacques PF. Vitamin B-12 and folate status in relation to decline in scores on the Mini-Mental State Examination in the Framingham Heart Study. J Am Geriatr Soc. 2012;60:1457–1464.
- 24. Moore EM, Ames D, Mander AG, et al. Among vitamin B12 deficient older people, high folate levels are associated with worse cognitive function: combined data from three cohorts. J Alzheimers Dis. 2014;39:661–668.
- 25. Brito A, Verdugo R, Hertrampf E, et al. Vitamin B-12 treatment of asymptomatic, deficient, elderly Chileans improves conductivity in myelinated peripheral nerves, but high serum folate impairs vitamin B-12 status response assessed by the combined indicator of vitamin B-12 status. Am J Clin Nutr. 2016;103:250–257.
- 26. Fulgoni V, Keast DR, Bailey RL, et al. Foods, fortificants, and supplements: where do Americans get their nutrients? J Nutr. 2011;141:1847–1854.
- 27. Jackman LA, Millane SS, Martin BR, et al. Calcium retention in relation to calcium intake and postmenarcheal age in adolescent females. Am J Clin Nutr. 1997;66:327–333.
- 28. Blumberg J, Heaney RP, Huncharek M, et al. Evidence-based criteria in the nutritional context. Nutr Rev. 2010;68:478–484.
- 29. Rowe S, Alexander N, Clydesdale F, et al. Funding food science and nutrition research: financial conflicts and scientific integrity. Nutr Rev. 2009;67:267–272.
- 30. US Food and Drug Administration. Guidance for clinical investigators, sponsors, and institutional review boards on investigational new drug applications—determining whether human research studies can be conducted without an investigational new drug application; availability. Fed Regist. 2013;78:55262–55263. https://federalregister.gov/a/2013-21889.