Advances in Nutrition. 2022 Jul 8;13(4):1324–1393. doi: 10.1093/advances/nmac043

Valuing the Diversity of Research Methods to Advance Nutrition Science

Richard D Mattes 1, Sylvia B Rowe 2, Sarah D Ohlhorst 3, Andrew W Brown 4, Daniel J Hoffman 5, DeAnn J Liska 6,2, Edith J M Feskens 7, Jaapna Dhillon 8, Katherine L Tucker 9, Leonard H Epstein 10, Lynnette M Neufeld 11, Michael Kelley 12, Naomi K Fukagawa 13, Roger A Sunde 14, Steven H Zeisel 15, Anthony J Basile 16, Laura E Borth 17, Emahlea Jackson 18
PMCID: PMC9340992  PMID: 35802522

ABSTRACT

The ASN Board of Directors appointed the Nutrition Research Task Force to develop a report on scientific methods used in nutrition science to advance discovery, interpretation, and application of knowledge in the field. The genesis of this report was growing concern about the tone of discourse among nutrition professionals and the implications of acrimony on the productive study and translation of nutrition science. Too often, honest differences of opinion are cast as conflicts instead of areas of needed collaboration. Recognition of the value (and limitations) of contributions from well-executed nutrition science derived from the various approaches used in the discipline, as well as appreciation of how their layering will yield the strongest evidence base, will provide a basis for greater productivity and impact. Greater collaborative efforts within the field of nutrition science will require an understanding that each method or approach has a place and function that should be valued and used together to create the nutrition evidence base. Precision nutrition was identified as an important emerging nutrition topic by the preponderance of task force members, and this theme was adopted for the report because it lent itself to integration of many approaches in nutrition science. Although the primary audience for this report is nutrition researchers and other nutrition professionals, a secondary aim is to develop a document useful for the various audiences that translate nutrition research, including journalists, clinicians, and policymakers. The intent is to promote accurate, transparent, verifiable evidence-based communication about nutrition science. This will facilitate reasoned interpretation and application of emerging findings and, thereby, improve understanding and trust in nutrition science and appropriate characterization, development, and adoption of recommendations.

Keywords: evidence base, methods, nutrition science, precision nutrition, research, translation

Introduction

The strongest, most reliable, and actionable knowledge stems from confirming research findings by applying dissimilar methods and approaches to study common problems. Hence, embracing the varied methodologies used by nutrition scientists within the American Society for Nutrition (ASN) is critical to the conduct of nutrition science and the formulation of healthful and equitable dietary recommendations. However, recent events have tended to polarize, rather than harmonize, perspectives among ASN members, as evidenced by increasing advocacy of certain points of view at the expense of reasoned consideration of alternate interpretations of the evidence base. This comes with the risk of compromising the quality of science, member productivity, preeminence of the ASN, and credibility of science itself.

The current climate also threatens the translation and application of nutrition science. Trust in science, researchers, and health care practitioners has historically been high (1), and a recent poll by the Pew Research Center indicates a high proportion of the public continues to believe scientists act in the public's interest (2). However, there are growing threats to this status. The credibility of scientists and the work they undertake is increasingly attacked by public leaders, misreported by the media, and questioned, to varying degrees, by segments of the population based on their political views, economic status, ethnicity, and social standing. This has been exacerbated by caustic public controversies that extend beyond normal scientific debate among scientists themselves. This threatens the advancement of science and acceptance and adoption of credible recommendations to improve health and well-being. The issue is particularly salient for nutrition science. It has been reported that only

  • 10% of Americans say they know “a lot” about what nutrition scientists do, and 26% say they know nothing at all,

  • 12% of Americans believe nutrition scientists are transparent about conflicts of interest while 37% believe nutrition researchers rarely or never are transparent about conflicts of interest,

  • 24% of Americans indicate that nutrition scientists provide fair and accurate accounts of their work, but 43% think research misconduct among nutrition scientists is a moderate or very big problem, and

  • 28% of Americans report that nutrition scientists do a good job in conducting research most or all the time, although 41% believe that nutrition researchers do not take responsibility for professional mistakes (2).

Most Americans derive their knowledge of what nutrition scientists do from traditional and social media. Thus, it is critical that information from these sources be clear, unbiased, and transparent. This has been increasingly difficult to ensure with growing polarization of views by nutrition scientists on issues such as the roles of different methodologies in addressing important research questions. Importantly, Americans who are more knowledgeable about science have greater confidence that scientists act in the public's interest. Thus, efforts to promote interdisciplinary work and recognition of and respect for cross-functional skills, and to improve the accuracy and transparency of communication about nutrition science to the media, are warranted.

Charge

In May 2020, the ASN Board of Directors commissioned a task force to prepare a white paper to highlight how the breadth of methods used in nutrition science can and should harmonize to advance discovery, interpretation, and application of knowledge in the field. This ASN white paper addresses this charge by discussing the strengths and limitations of a wide array of methodologies used in nutrition research, as well as the appropriate scope of interpretation of the evidence each yields. This paper is not intended to be a compendium of all methodologies; rather, the goal is to highlight how different approaches and methods can be used together to build strong science. Greater collaborative efforts within the field of nutrition science will require an understanding that each method or approach has a place and function that should be valued and used together to create the strongest nutrition evidence base possible.

Although the primary audience for this report is nutrition researchers and other nutrition professionals, a secondary aim is to develop a document useful for the various audiences that translate nutrition research, including journalists, clinicians, and policymakers. The intent is to promote accurate, transparent, verifiable evidence-based communication about nutrition science. This will facilitate reasoned interpretation and application of emerging findings and, thereby, improve understanding and trust in nutrition science and appropriate characterization, development, and adoption of recommendations. This action is part of a longer-term strategic planning vision for ASN's centennial, ASN 2028, to be more outward facing as well as more relevant to the field of nutrition research and policy.

In 2016, ASN commissioned the report “Best practices in nutrition science to earn and keep the public's trust” (3). A blue-ribbon panel reviewed the literature and other publicly available information related to 1) conflict of interest and objectivity, 2) public benefit, 3) standards of scientific rigor and reproducibility, 4) transparency, 5) equity, 6) information dissemination (education, communication, and marketing), and 7) accountability. Recommendations from that report are being implemented within the Society to create resources for nutrition scientists (4). These include a model conflict-of-interest disclosure form and guiding principles for managing and conducting nutrition research funded by entities at interest. The present white paper builds on this effort by focusing more on nutrition science itself, emphasizing how the varied methods used by nutrition scientists combine to build the strongest evidence base to support the advancement of basic science, clinical practice, and the development of health and nutrition policies and programs.

Approach

Richard Mattes and Sylvia Rowe were identified as co-chairs of the Nutrition Research Task Force to draft the white paper, with Sarah Ohlhorst as the primary ASN support staff. To constitute the task force, ASN initiated a call for applications and nominations through its various communication channels. The goal was to assemble a task force with expertise that represented the scope of nutrition research including, but not limited to, areas such as ’omics approaches, cell culture, animal models, biomarkers, clinical/human trials, population studies, food and nutrition policy, statistics, big data, mechanisms, translation, community/public health interventions, implementation research, behavior, education, dietary assessment, precision nutrition, and life stages. It was also important that the diverse spectrum of ASN membership be represented, including early-career professionals and established researchers with experience serving in many different roles, including clinical practice, academia, government, and industry. To ensure these requirements were met, additional members were invited directly. Task force members were not selected to advocate for specific areas of nutrition research; rather, the goal was for members to focus on the complementarities and synergies among methods and approaches to advance nutrition science. The task force was established in July 2020 with 15 members in addition to the 2 co-chairs.

The task force began work in August 2020 with a series of virtual meetings. To generate the scope of topics to be included in the report, task force members were first asked to consider the various nutrition research interfaces they encounter in their work and to identify related areas. Next, each member identified 3 to 5 key challenges the field of nutrition would likely encounter in the near future. To stimulate thinking, members were asked to consider several approaches that have, on occasion, generated friction within the field of nutrition science. These included observational studies versus clinical trials, basic versus applied research, animal models versus human testing, acute versus chronic trials, qualitative versus quantitative methods, and surrogate endpoints (biomarkers) versus established clinical endpoints.

Next, the task force considered the organizational structure of the report. An initial concept was to present findings according to a hierarchical research pyramid. However, it was decided that this approach might emphasize differences between methods and rank orders of importance instead of providing the view that each contributes in its own way to a whole. A matrix format was also considered, whereby nutrition issues would be juxtaposed with the methods that may be used to address them. However, this resulted in concern about redundancy. Finally, a consensus emerged to organize the report around methods applicable to precision nutrition.

Precision nutrition was identified as an important emerging nutrition topic by the preponderance of task force members, and the relevance of the topic was reinforced by the recent NIH precision nutrition research initiative “Nutrition for Precision Health, powered by the All of Us Research Program” and the related Common Fund Request, “Data Science Challenges and Opportunities in the Field of Precision Nutrition.” According to the NIH, an underlying assumption of precision nutrition is that everyone responds to diet and nutrition interventions differently. Therefore, for the purposes of this report, precision nutrition is defined as the development of nutrition recommendations relevant to both individuals and population subgroups using a framework that considers multiple, synergistic influencers including dietary habits, genetic background, health status, microbiome, metabolism, food environment, physical activity, socioeconomics, psychosocial characteristics, and environmental exposures (5).

This theme was adopted for the report because it lent itself to integration of many approaches in nutrition science. However, it was recognized that there are topics of equal importance (e.g., issues related to population health) that would require some methods and approaches that differ from those applicable to addressing questions related to precision nutrition. It is also important to recognize that precision nutrition is a complex concept. A person's nutritional needs must reflect issues such as diverse backgrounds, health status, and access to resources. Thus, it must be clearly noted that this report is meant to emphasize the value of integrating the many methods required and used by nutrition scientists. Precision nutrition is used only as a way to organize nutrition research methods rather than the topic of this report.

The task force was divided into 7 writing groups, each with a different scope of methods to consider. Each group prepared an outline of the topics they proposed to cover, which were reviewed and revised by the full task force to minimize redundancies and gaps. The resulting content in each area is as follows:

  • Health disparities

It has long been recognized that contextual factors, such as economic, social, cultural, behavioral, and environmental factors, have profound impacts on the extent to which evidence-based health and nutrition interventions improve outcomes in diverse populations. These core factors drive inequities in access to nutritious and safe food and to health and other services, resulting in pervasive health disparities. Precision nutrition approaches and successful program design and adaptation through implementation science research facilitate an understanding of the nutrition issues in varying populations and the contextual factors that impact dietary behaviors and choices. In this section, we highlight several research methodologies that have been used to 1) describe and understand health disparities and their determinants, 2) translate that knowledge for better program design, and 3) measure program progress and impact. These include community-based participatory research, focused ethnographic studies, and impact pathway analysis. For each methodology, we discuss examples of application and explore strengths and limitations. We also include a review of measures of socioeconomic status (SES) and the food environment, 2 measures that are particularly important in health disparities research. This overview is intended to be illustrative, not comprehensive, and concludes with a discussion of several remaining opportunities and challenges to improve research design, rigor, and potential for uptake to enhance precision nutrition approaches and address health disparities.

  • Cognitive performance and behaviors

The methodologies in this section fall into 5 areas: 1) the initiation and cessation of eating, 2) appetite, 3) sensory attributes of foods, 4) effects on cognitive performance, and 5) brain imaging and diagnostic questionnaires. Some of the methodologies are based on both objective and subjective measures, whereas others utilize one or the other. Concern about the interpretation of results is a primary issue for each methodology area. For example, questionnaires developed to reflect decision processes, such as the decision to begin or cease eating, must be reliable and predictive of an outcome. Other questionnaires provide ranges of scores to reflect the impact of different conditions or a change in condition. Objective measures, such as brain imaging, necessarily entail methods of transformation and/or reporting of magnitude or quality, often by software associated with the technology. For results to be interpretable, there is a need for values to be standardized and relatable to selected outcomes. This section seeks to provide a general description of major methodologies in these 5 areas, as well as their key strengths and limitations. The methodologies in this section are considered useful for research purposes but are not yet able to provide diagnostic information to allow an individual to adjust their food-related behaviors or strategies in relation to precision nutrition recommendations.

  • Dietary assessment

Assessment of individual dietary intake is essential for precision nutrition. The most common methods include 24-h recall (24HR) and food-frequency questionnaires (FFQs). Strengths of the 24HR include open-ended questions, allowing for diverse dietary patterns. Limitations include underreporting of intake due to forgotten foods and portion underestimation, and the need for multiple days to reflect usual intake. Strengths of the FFQ include efficient estimation of usual long-term intake. Limitations include reliance on a food list and assumptions tailored for specific populations, with biased estimation for subgroups. Although self-administered questionnaires offer cost-savings, dietary assessment is best administered by trained interviewers, and this is essential for low-literate people. Biomarkers are useful adjunct measures to validate self-reporting and/or to combine with self-reporting to better estimate exposure. Statistical techniques have been explored for biomarker calibration of intake in study subsets or with feeding studies, but generalizability to populations outside of those used for the calibration remains a concern. New technologies include using photographs to document portion size and smart devices to measure swallowing or arm movements, but these remain limited and require participant cooperation and investigator review. As precision nutrition moves forward, investment in improving dietary assessment is critically needed. This includes updating nutrient databases and enhancing and/or developing more efficient and inclusive tools.
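To make the biomarker calibration idea above concrete, the following is a minimal sketch (illustrative only, not a method prescribed in this report) of regression calibration in a hypothetical substudy in which FFQ-reported protein intake is calibrated against a recovery biomarker; all values and variable names are hypothetical.

```python
# Minimal sketch of regression calibration of self-reported intake against a
# recovery biomarker in a hypothetical calibration substudy. Illustrative only;
# all values and variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Simulated "usual" protein intake (g/d), an approximately unbiased recovery
# biomarker, and an attenuated, biased FFQ report of the same quantity.
true_intake = rng.normal(75, 15, size=200)
biomarker = true_intake + rng.normal(0, 5, size=200)
ffq = 0.6 * true_intake + 20 + rng.normal(0, 12, size=200)

# Fit the calibration equation biomarker ~ a + b * FFQ by ordinary least squares.
b, a = np.polyfit(ffq, biomarker, deg=1)  # returns [slope, intercept]

# Apply it to a main-study participant who has only an FFQ report.
new_ffq_report = 55.0
calibrated = a + b * new_ffq_report
print(f"intake ≈ {a:.1f} + {b:.2f} × FFQ; FFQ = {new_ffq_report:.0f} g/d → "
      f"calibrated ≈ {calibrated:.0f} g/d")
```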

  • Genetics and epigenetics

Metabolic heterogeneity—the variation in processing nutrients via metabolic pathways—arises, in part, from genetic and epigenetic variations between people that can alter metabolic and signaling pathways. Furthermore, the effects of genetic or epigenetic variations may only become apparent when metabolism is challenged by disease or nutrient deficiency or excess. Thanks to recent advances, it is now possible to completely sequence a person's genome, revealing deletions, mutations, and even single nucleotide polymorphisms (SNPs) that may underlie differences in nutrient needs. These differences are at the core of precision nutrition.

Beyond the genome, epigenetic variations derived from inheritance and differences in life exposures, experiences, and diets can change the expression of genes in a way that can, in turn, alter nutritional metabolic and signaling pathways. The challenge for precision nutrition will be to expand the evaluation of individual gene, epigenetic, and transcript variations to study patterns of variations that predict metabolic heterogeneity and differences in outcome/responses to dietary interventions. To accomplish this approach, the field will need to develop appropriate data and better computational models and tools.
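As one hedged illustration of the kind of analysis this implies (a sketch on simulated data, not a tool described in the report), the snippet below fits a simple SNP-by-diet interaction model; real analyses would involve many variants, covariates, and more sophisticated computational tools.

```python
# Sketch of a single SNP x diet interaction model on simulated data, one
# building block for relating genetic variation to differential responses to
# dietary exposure. Column names and effect sizes are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "snp": rng.integers(0, 3, size=n),   # additive coding: minor-allele count
    "diet": rng.normal(0, 1, size=n),    # standardized dietary exposure
})
# Outcome simulated with a genotype-dependent response to diet.
df["outcome"] = (0.2 * df["snp"] + 0.5 * df["diet"]
                 + 0.3 * df["snp"] * df["diet"] + rng.normal(0, 1, size=n))

# The snp:diet coefficient asks whether the diet-outcome slope differs by genotype.
model = smf.ols("outcome ~ snp + diet + snp:diet", data=df).fit()
print(f"snp x diet interaction: beta = {model.params['snp:diet']:.2f}, "
      f"p = {model.pvalues['snp:diet']:.3g}")
```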

  • Microbiome

The human microbiome shows high interindividual variability and is influenced by numerous factors including age, sex, diet, environment, and circadian rhythms. From a precision nutrition perspective, recent studies suggest that specific microbiome patterns are associated with higher health risks and differential responses to certain foods and nutrients. In this section, the following experimental methods for investigating the microbiome in the context of precision nutrition are briefly reviewed: dynamic in vitro multicomponent fermentation systems, animal models, human clinical interventions, and cohort studies. Overall, due to the complexity of the diet and the considerable interindividual variability and complexity of the microbiota, combining different experimental designs is necessary to understand the relationships among diet, the microbiome, and health. Further, the increasing number of diet–microbiome studies and various approaches to study design warrant consideration of key factors, such as standardization of sample collection and analysis workflows, host–microbiome interactions, influences of other biological and environmental factors, and integration with ’omics data to ensure that data collected are robust and reporting is sufficiently complete to enhance replicability. While the field is extremely promising, we must be cautious about overinterpreting findings from diet–microbiome studies.

  • Nutritional status

Determining the nutritional status of an individual is an important part of nutrition research because understanding how diet and dietary patterns influence nutrient absorption and metabolism is fundamental to supporting sound policies on nutrition and health. Among the many methods used to assess nutritional status, the use of dietary surveys to measure dietary intake is the primary approach for monitoring the diet of people and populations. There are limitations to these methods at the individual level, but at the population level, dietary intake methods are useful for accurately assessing dietary trends and patterns. Also, measuring energy expenditure and body composition of individuals provides data that can be used at both individual and population levels to understand how energy balance may promote health or prevent disease. While current Dietary Reference Intakes are largely estimated from nutrient levels in limited studies in adults, additional research is needed to identify homeostatically regulated nutrient biomarkers, and to apply these measures in studies across all ages and genders, including assessment of genetic variation. This section discusses the application of these methods using precision nutrition as a model for exploring strengths and weaknesses when applied to individual or population studies.

  • Cross-cutting considerations

The sections described so far discuss different domains that are important in establishing an evidence base for precision nutrition guidance. Yet, the task force emphasizes that the science of nutrition is not a disjointed collection of disciplinary silos. Thus, this section describes cross-cutting methods, themes, and issues that acknowledge the overlap among the sections and lays the groundwork for the future of nutrition research. Methodological considerations have arisen at the interface of multiple domains of nutrition science, such as ’omics, big data, data mining, and artificial intelligence (AI)/machine learning; challenges in capturing the relevant nutritional measurement of interest; and application of statistical approaches to enable the use of observational data to address questions of causality and improvement in study designs that will answer pragmatic questions and enhance the ability to provide precision nutrition guidance. The section concludes by focusing on cross-cutting principles to advance the science of nutrition: 1) training in interdisciplinarity as a skill; 2) selecting designs, measurements, and communication approaches with an explicit focus on fit for purpose; and 3) creating a field-wide dedication to rigorous, reproducible, and transparent science through open science practices.

The format adopted for preparing each section was as follows:

  • Writing Section Topic Introduction (∼1 page) (for example, genetics/epigenetics):

    • Method 1

    • Method 2

    • Method 3

  • Strengths of Each Method (using Precision Nutrition topical theme) (2–5 pages)

  • Limitations of Each Method (using Precision Nutrition topical theme) (2–5 pages)

  • Key Considerations in Interpretation of Data Generated by These Methods (2–5 pages):

    ○ What conclusions can and cannot be drawn from these types of methods?

    ○ How do findings from these methods interface with other methods?

  • In Closing (∼1 page)

Each section has undergone an internal review process in which each group critically reviewed at least 1 other writing group's section. Comments were provided to the primary authors and revisions were made. The compiled report was then edited by a science writer and shared with the full task force for an additional review. The present report incorporates the latest editorial and content suggestions of the task force and is presented to the ASN Board of Directors for consideration.

Health Disparities

Introduction

Precision nutrition is usually understood to focus on the genetic and other biological relationships between diet and health, with the aim to optimize health outcomes through interventions adapted to unique individual needs. Achieving this aim, however, requires taking into consideration the psychosocial, social, and economic circumstances that directly and indirectly influence health outcomes, behaviors, ability and willingness to adhere to treatments, and responses to health and nutrition interventions (6). The latter set of social factors, often referred to as social determinants of health (SDOH) (7), may be defined at the individual (e.g., gender, education, nutrition literacy), societal (e.g., economic, social stability, educational attainment), or environmental (e.g., food production, proximity and nature of food environment and markets, access to clean air and water, physical safety) levels. Social determinants are core drivers of inequities in access to nutritious and safe food and to health care and other services. They result in profound health disparities. “Health disparities” are defined as differences in health outcomes and their determinants between segments of the population, as defined by social, demographic, environmental, and geographic attributes. Disparities are closely linked to economic, social, and/or environmental disadvantage. By contrast, “health inequities” are defined as systematic, avoidable differences in health outcomes and their determinants between segments of the population based on race/ethnicity, gender and other demographics, SES, and geography (8).

This section will highlight critical advances in precision nutrition and its potential to understand diet and health relationships, to ultimately enhance the potential of nutrition and health interventions to improve outcomes of individuals. Globally, most health and nutrition interventions, and the research that informs them, are designed and implemented at the population level. The “one size does not fit all” principle central to precision nutrition must be applied to research that informs nutrition and health interventions at the population level. We frame research of this nature under the umbrella of implementation research, or research that has the explicit purpose of informing decision-making needs for population health and nutrition programs (9). This approach is particularly important to addressing disparities in health outcomes in the United States and globally. Figure 1 shows an overview of the types of decisions that need to be made during nutrition intervention program design, delivery, improvement, and scaling-up. Rigorous research should inform these decisions to ensure that priority nutrition-related issues are identified, that evidence-based programs are adequately adapted to the unique context to address those needs, and that a continual cycle of improvement to design and delivery realizes the programs' potential to improve nutrition and related health outcomes.

FIGURE 1. Overview of the types of program-relevant questions that are asked to adapt nutrition interventions to context, and implementation research methodologies to address them (9).

In this section, we review research methodologies used to inform program decision making in nutrition, highlighting several examples from Figure 1. It is important to note that research methodology does not define implementation research, as many methodological approaches may be considered. The defining difference is the selection and utilization of a methodology to answer a question required for program decision making and, as a result, the need to engage with the eventual users of that evidence in ways that would rarely occur in scientific discovery research. Analogous to precision nutrition's intent to enhance the quality of evidence related to individual nutrition risk and the potential of interventions to mitigate such risks, implementation research allows unique features of context to be brought to bear to improve the potential for nutrition programs at the population level. With this context, this section has 2 objectives: 1) to present examples of several research methodologies in implementation research for nutrition, and examples of their application, and 2) to explore some particularly challenging constructs to measure that are critically important to ensure that nutrition programs are adequately adapted to context. Because of their salience for health disparities, we focus on measures of economic well-being and measures of the food environment.

Research methodologies used in implementation research

Community-based participatory research

Because of historical disenfranchisement within certain groups, the issue of power must be considered in health disparities research. Community-based participatory research (CBPR) is a promising approach to deconstruct hierarchies and balance power dynamics by engaging community members in the design of tailored dietary interventions for communities experiencing health disparities. Through authentic engagement with community members, community input is integrated throughout all stages of the research process from design and implementation to evaluation.

CBPR is a health disparities research method that can be used to identify options for action in the form of policy and programs (10). For nutrition, the CBPR approach has been used in studies to promote healthy diet and nutrition and weight management in historically disenfranchised communities (11–13). For example, in recent decades, a growing body of research has used CBPR approaches to promote healthy eating and weight control in African–American communities, which experience high rates of overweight, obesity, and other diet-related health outcomes (11). A variety of factors account for these differences, including attitudes and preferences related to food, socioeconomic factors, targeted advertising of unhealthy foods and beverages to African Americans, and environmental factors that create barriers to limiting intake of foods high in fat and sugar (14). Neighborhood deprivation and residential racial segregation also play a role (15). Recognizing these contextual factors and the historical distrust among African Americans rooted in the lingering legacy of the Tuskegee experiment, the CBPR approach has been used to initiate grass-roots efforts and coalitions of community-based organizations to address health behaviors. Given the complexity of factors that influence dietary behaviors within and among populations, multicomponent CBPR studies are particularly beneficial when interventions are aimed at multiple levels of the socioecological model.

The collaborative approach of CBPR involves partnerships among academics, community organizations, and community members to increase the perceived value and importance of the research product for all partners. CBPR is participant centered and includes the goal of acknowledging and implementing participants' needs, behaviors, and beliefs concerning their well-being (16, 17). To this end, CBPR involves the participation of those whose life or work is the subject of the research and includes involvement at all stages of the research process, including formulating the research question and goal, developing a research design, selecting appropriate methods for data collection and analysis, implementing the research, interpreting results, and disseminating findings. Such engaged participation is the core defining principle of CBPR, distinguishing this approach from others in the health field.

A key strength of this research method is its ability to address unequal power dynamics that may exist between academia and disenfranchised communities. Given the historical disenfranchisement of communities that experience diet-related health inequities and pervasive distrust in societal systems, the equitable involvement of all partners in the research process aims to address these historically unequal power dynamics (18). Community and stakeholder involvement can inform the development of socially and culturally appropriate nutrition interventions that are tailored for context. Additionally, unlike other frameworks used within health disparities research, this approach steers away from a deficit perspective. Instead of focusing on problems and deficits, particularly of marginalized communities, the method focuses on strengths and assets, collective knowledge, and insights that community partners bring to framing health problems and developing solutions (19). This approach can, therefore, be empowering for communities experiencing disparities.

A limitation of these methods is the time and resources needed. CBPR calls for equitable partnerships that require long-term commitments from researchers and communities. Given the intentionality required to build authentic, longstanding relationships with community stakeholders, the CBPR approach requires unique skillsets that may not be a part of the traditional research training. Poorly implemented CBPR could lead to tokenistic engagement instead of authentic, mutually beneficial relationships between community and academia. Consequently, the researchers must begin building these partnerships well before grant funding is received.

Focused ethnographic studies

Nutrition interventions, regardless of whether they aim to address malnutrition directly (e.g., distribution of nutritional supplements), change food choice and dietary habits, or influence the determinants of these (e.g., by shifting access to food), require in-depth knowledge about the populations and communities in which interventions will be implemented. The importance of individual factors that may determine biological potential to respond to an intervention (e.g., baseline nutritional status) and implementation-related factors (e.g., health or food delivery systems) is well documented (20). Focused ethnographic studies (FESs) seek to bring such quantifiable information together with an emic, or insider perspective (21), from individuals or communities in which interventions are planned (22). As is common in ethnographic research, FESs are a mixed-methods approach. A well-designed FES will start with a theoretical framework and review of relevant existing literature. Additional data collection may include common quantitative or semi-quantitative measures relevant for the research purpose (e.g., socioeconomic, dietary) and approaches to understand society, culture, behaviors, values, perceptions, and other aspects of context. The latter methods can include cognitive mapping, key informant and in-depth interviews, and focus groups. The idea of focus in FESs refers to the application of the studies to a specific research topic. In this regard, FESs differ from classic ethnography, which sought to be comprehensive in studying peoples and cultures (22). To further illustrate, we draw on recent publications of FESs in research on infant and young child feeding (IYCF) in several low- and middle-income countries.

Undernutrition, characterized by stunting, wasting, anemia, and micronutrient deficiency, is still highly prevalent in many low- and middle-income countries (23). While malnutrition is multi-causal, inadequate feeding in the first 2 y of life is a common cause of most forms of childhood malnutrition (24). Ample research has focused on developing nutritional supplements for distribution within programs, and many interventions have been designed to shift breast and complementary feeding practices towards global recommendations (25). Designing programs that are actionable within communities requires information not only about what children eat but why and the values, perceptions, cultural, economic, and other factors that influence caregivers’ decisions. FESs have been used in several country settings to gain emic understanding of child feeding practices and what drives them as well as to explore barriers and opportunities for change. Results illustrate that in many low- and middle-income countries, economic barriers to purchasing foods often recommended as part of behavior change campaigns are a critical challenge, even when knowledge and willingness exist. That said, social conditions and cultural values and beliefs also guide IYCF behaviors, and other factors such as women's time can be a constraint to change (26, 27).

With their mixed-methods approach, FESs are flexible and can include the set of data-collection tools that most appropriately fit the research question and context. The primary disadvantage of this approach is the relative resource intensity, although this can be mitigated given the flexibility. By design, FESs are intended to explore a specific health or nutrition issue in a specific context, so findings cannot be extrapolated beyond that context. However, using a common approach across contexts can reveal important insights into the commonality of barriers and opportunities for change, as was illustrated through the application of FESs to IYCF practices (28).

Impact pathway analysis

Impact evaluation is critical for measuring program results, which, in turn, provides accountability to donors and governments. With this purpose in mind, several published evaluations focus solely on the primary intended outcome from the program (29). To serve the program decision-making requirements, illustrated in Figure 1, and to inform course correction if needed, evaluations must go beyond a sole focus on primary outcomes. They must also provide information on why (or why not) programs work in specific contexts and whether benefits are accrued equitably among subgroups within the targeted population (30). These considerations are critical to assess whether programs have been appropriately adapted to contexts, whether they have potential to address health inequities, and, ultimately, to guide decisions related to continuity, expansion, or modification of the program. Evaluations must also assess social determinants to permit equity analyses, measure program processes and the fidelity and quality of delivery, and explore factors that may facilitate or impede quality, coverage, and utilization of program products and services.

Many evaluation designs can be used, depending on resources and context. As with all research, the types of causal inferences that can be made depend on the design selected (30). Regardless of the design used, program evaluation should begin with mapping an impact pathway. The impact pathway provides a visual depiction of program activities, what they intend to achieve, and the process by which they will achieve results, and is accompanied by a list of assumptions that underpin these activities (31). Data-collection tools and processes can then be developed to ensure a comprehensive evaluation of program components as well as intended and unintended intermediate and ultimate outcomes. The appropriate analytical approach for primary outcomes will depend on the evaluation design (e.g., randomized effectiveness trial, nonrandomized matched control groups, theory-based evaluation) and should always follow scientific standards [e.g., CONSORT (Consolidated Standards of Reporting Trials) guidelines if a randomized controlled trial (RCT)]. Beyond primary outcomes, the impact pathway analysis explores intended and, if needed, unintended intermediate outcomes and the processes through which any changes may have occurred. In this manner impact pathway analysis follows the principles of theory-based evaluation (32).

In recent years, an increasing number of nutrition program evaluations have adopted the impact pathway approach. As examples, it was used in Vietnam (33) and Bangladesh (34) to evaluate interventions intended to improve IYCF practices. For each program, mapping was performed for intended interventions (e.g., health worker training, development and distribution of behavior change materials), service delivery approach (e.g., utilization of health clinics at specific child ages), the mechanism through which these interventions would modify IYCF practices (e.g., beliefs about IYCF, self-efficacy related to IYCF), the intended intermediate outcomes (breast and complementary feeding practices), and long-term impacts (reduced stunting). The impact pathway analysis in Vietnam, for example, found that health care providers exhibited increased capacity because of the program, resulting in higher-quality IYCF counseling and better breastfeeding knowledge and practice in intervention than comparison communities. However, the authors also found that program utilization, particularly the number of clinic visits per mother/child, was likely to be a barrier to further impact. The close relationship developed between evaluation and program teams, while developing the impact pathway, facilitated the discussion of these results and their translation into specific recommendations to adapt and strengthen the program (33). The impact pathway was used in a similar fashion to measure progress and propose modifications to conditional cash transfer programs to improve their impact on nutrition outcomes in several countries (35–37).

Impact pathway analysis approaches to evaluation have several limitations. First, robust data are needed across many aspects of program implementation, data that are sometimes overlooked in impact evaluations. Second, impact pathway analysis itself does not allow direct causal attribution between individual actions and results across the pathway. These limitations, however, also highlight the strengths of the approach. Impact pathway analysis can be applied regardless of the evaluation design. It starts from a clear theoretical framework of the program, which obligates the evaluator to work closely with program implementers to gain a profound understanding of the intended programmatic approach and processes. Through mapping and studying the pathway and the assumptions that underlie it, the approach reveals what is inside the black box of program implementation and permits insights as to whether programs work, while also being useful for generating and testing hypotheses related to why or why not a program works. This type of evaluation is fundamental for determining whether a program was adequately adapted to context or whether additional adaptations are required to realize impact. Discussing these evaluation findings with program implementers can produce concrete and feasible solutions to improve potential for impact (38).

Measuring constructs for adapting nutrition interventions to context

Well-designed implementation research can generate the evidence needed to adapt program design to context. Attention to disparities is an integral part of implementation research; insufficient attention may lead to interventions that, at minimum, do not address disparities or, at worst, perpetuate them. The methodologies described above are well suited to identify and understand health disparities. Researchers, however, must also pay close attention to the measures they use and the extent to which they are able to capture contextual differences and nuances, which may be lost through inadequate adaptation and translation (if needed) of research tools, questionnaires, and resulting indices. Because of their salience for health disparities, we focus on measures of SES and the food environment.

Measures of SES

The influence of SES inequities on health is a major public health concern and an important driver of disparities (39, 40). SES is particularly important for nutrition and health disparities research because it often dictates access to services as well as access to and control of material and social resources in a society. In the case of nutrition, SES influences income (and time) available to purchase and prepare healthy food. SES has been conceptualized in several ways, which typically seek to estimate income, wealth (income plus assets), or a broader construct of resources that usually includes a combination of measures, both economic and social (e.g., housing materials, physical belongings, education, marital status, family size and arrangement, among others), or a single construct reflective of these (e.g., education) (41–43). Simple proxy measures may also be used, such as education alone (44), or US Census-tract SES based on zip code (45). Measures are often converted into indices to provide an absolute measure of wealth or poverty in comparison to an absolute value (e.g., proportion of the population living on less than 1 dollar a day) or a national benchmark [e.g., the poverty-income ratio (46)]. Other measures provide a relative ranking of individuals or households within a population (e.g., SES indices used by the Demographic and Health Surveys).
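As an illustrative sketch (not drawn from the report), the snippet below shows how a relative, DHS-style wealth index is commonly constructed: household asset indicators are reduced to a score with principal component analysis and households are ranked into quintiles. The asset list and data are hypothetical.

```python
# Sketch of a relative SES (wealth) index in the spirit of asset-based indices:
# the first principal component of household asset indicators is used to rank
# households into quintiles. Asset variables and data are hypothetical.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
assets = pd.DataFrame({
    "owns_refrigerator": rng.integers(0, 2, 300),
    "owns_vehicle": rng.integers(0, 2, 300),
    "improved_water_source": rng.integers(0, 2, 300),
    "rooms_per_person": rng.uniform(0.2, 2.0, 300),
})

# Wealth score = first principal component of the standardized asset indicators.
scores = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(assets))
assets["wealth_quintile"] = pd.qcut(scores.ravel(), 5, labels=[1, 2, 3, 4, 5])
print(assets["wealth_quintile"].value_counts().sort_index())
```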

There is no single correct way to measure SES, and the choice of measures and resulting indices should be informed by the relevance to the population and the research question for which the information will be used. Unfortunately, insufficient attention has been paid to these issues in nutrition. As a result, research studies have found conflicting results because of the indices used (47), inconsistent associations between SES and health disparities in the United States (43), and between specific SES measures and health outcomes among varying ethnic groups (48, 49). Measurement issues are further complicated when comparing multiple country contexts, although several recent efforts have attempted to address this challenge (44).

In addition to these general challenges, implementation or health disparities research requires that particular attention be paid to adapting measurement to context. The choice of measure and indices must be informed by the study's objective while taking into account additional contextual considerations. For example, reporting accuracy will be influenced by how well the measure fits the range of SES likely in the population, the social and cultural meaning given to various constructs (e.g., individual vs. extended family income/wealth), and the population's willingness to respond to such questions. In the United States, common measures include the poverty-income ratio, annual household income (46), Census-tract–level SES based on zip code (45), and several poverty indices. These measures may not account for informal income, safety net resources, and assets and wealth that could potentially buffer socioeconomic stressors linked to poor health outcomes.
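For instance, the poverty-income ratio is simply reported household income divided by the poverty threshold for that household's size, as in the brief sketch below; the threshold values shown are placeholders for illustration, not official figures.

```python
# Minimal sketch of a poverty-income ratio (PIR): household income divided by
# the poverty threshold for the household size. Thresholds below are
# illustrative placeholders, not official guidelines.
POVERTY_THRESHOLDS = {1: 14_500, 2: 19_700, 3: 24_900, 4: 30_000}  # USD/year, hypothetical

def poverty_income_ratio(household_income: float, household_size: int) -> float:
    """Return income as a multiple of the poverty threshold (PIR < 1 means below poverty)."""
    return household_income / POVERTY_THRESHOLDS[household_size]

print(f"PIR = {poverty_income_ratio(45_000, 4):.2f}")  # 1.50, i.e., 150% of the threshold
```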

Measures of food environment

Given the recognition that the spaces in which people acquire food (through production, gathering, purchasing) influence dietary choices and ultimately nutritional status, research is critical to understanding the influence of the food environment on dietary behaviors and how inequities in food environments may influence and perpetuate health disparities. From existing evidence, we know that food environments vary vastly among and within countries. There are multiple definitions of the food environment, and little consistency and clarity on the constructs to measure it (50). In simplest terms, the food environment represents the interface between food systems and diets (51). Turner et al. (50) propose a detailed framework and associated set of potential data sources to measure the food environment. The authors make a distinction between the external food environment (food availability, price, vendor and market properties, marketing, and regulation) and the personal food environment (accessibility, affordability, desirability, and convenience). Applying such a framework to measuring the food environment is critical to understanding pathways by which food environments may influence diets, identifying appropriate points of intervention, and developing interventions through the precision nutrition lens that account for these unique social and community circumstances.

Several measures of food environments have been developed and used to assess the external food environment and personal food environment. Some were developed specifically to capture the context of low-resourced communities in the United States (52), such as the Nutrition Environment Measures Survey. This survey is based on established criteria assessing the relative healthfulness of food and beverages offered (e.g., <800 kcal, <30% kcal from fat, <10% kcal from saturated fat for restaurant meals) and has been used to assess the food environment in grocery and convenience stores (53), restaurants (54), and vending machines (55). There is also growing recognition of the need for food environment measures that capture the broad diversity of ways and places food is procured across diverse contexts globally. For example, Downs et al. (56) provide a framework and suggestions for measurements that recognize the importance of natural food environments (e.g., lakes, rivers, forests) that are still an important source of food for many people. Recent publications also suggest measures for capturing formal and informal market food environments (56, 57). A plethora of research has described food environments and analyzed associations between the food environment and diet and health outcomes, and some research has measured the impact of interventions to modify those environments. Much work is still needed to develop and validate indicators that reflect the various constructs of the food environment by adapting to the unique context of food environments while maintaining common constructs that can be compared across contexts.
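As a toy illustration of how such threshold-based criteria operate (the snippet is not part of the survey instrument, and real NEMS scoring is more extensive), the code below applies the example cutoffs cited above to hypothetical restaurant meals.

```python
# Toy classifier applying the example thresholds cited above for "healthier"
# restaurant meals (<800 kcal, <30% of kcal from fat, <10% of kcal from
# saturated fat). Meal data are hypothetical; real NEMS scoring is broader.
def meets_healthier_criteria(kcal: float, fat_g: float, sat_fat_g: float) -> bool:
    """Return True if a meal meets all three example thresholds."""
    fat_pct_kcal = fat_g * 9 / kcal * 100        # 9 kcal per gram of fat
    sat_fat_pct_kcal = sat_fat_g * 9 / kcal * 100
    return kcal < 800 and fat_pct_kcal < 30 and sat_fat_pct_kcal < 10

meals = {
    "grilled chicken salad": (520, 14, 3),   # kcal, total fat (g), saturated fat (g)
    "double cheeseburger": (940, 52, 22),
}
for name, (kcal, fat, sat) in meals.items():
    verdict = "meets" if meets_healthier_criteria(kcal, fat, sat) else "does not meet"
    print(f"{name}: {verdict} the example criteria")
```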

Methods that apply geographic information system (GIS) technology for assessing exposure are also used to characterize the external food environment and food types available within communities, including urban, rural, and suburban settings (58). In the United States, GIS approaches are commonly used to determine store density or proximity to nearest food store types (e.g., supermarkets, convenience stores, fast-food outlets, and other types of stores) to operationalize food access (59). Aligned with the “health is determined by your zip code” concept, measuring food availability and access to food at the neighborhood level in health disparities research is useful because of pervasive neighborhood-level racial/ethnic segregation, and can help inform policy development at local and regional levels (60). Limitations include the assumption that food store proximity is a proxy for an individual's food environment and dietary intake, and the inability to account for an individual's mobility throughout the day (61). With expanding smartphone usage, even in many low- and middle-income countries, GIS technology holds promise for mapping food environment exposure of individuals over the course of a day. Although still in its infancy, this approach may foster a better understanding of food environment exposures and, ultimately, inform more optimally tailored dietary interventions (62).
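To make the proximity metric concrete, the sketch below computes the straight-line (haversine) distance from a household to its nearest supermarket; the coordinates are hypothetical, and real analyses typically rely on validated store databases and often on street-network rather than straight-line distances.

```python
# Sketch of one common GIS-style access metric: straight-line (haversine)
# distance from a household to the nearest supermarket. Coordinates are
# hypothetical; network distances are often preferred in practice.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # Earth radius ≈ 6371 km

household = (40.4237, -86.9212)  # hypothetical residence (lat, lon)
supermarkets = [(40.4406, -86.9125), (40.3910, -86.8720), (40.4595, -86.9390)]

nearest_km = min(haversine_km(*household, lat, lon) for lat, lon in supermarkets)
print(f"Distance to nearest supermarket: {nearest_km:.2f} km")
```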

Key considerations in data interpretation

We describe several research methodologies and measures important for health disparities research, through the frame of implementation science. The examples described are illustrative only; implementation research is defined by its purpose to generate actionable evidence for decision making related to policies and programs in context, and many different research methodologies may be appropriate. The choice of research methodology must be driven by the best feasible fit-for-purpose approach. There are, however, several considerations in research design and interpretation of data generated, as follows:

  • Approaches that use mixed methods are particularly salient for research related to health inequities and disparities, complementing common nutrition measures with information that can be collected in ways that are sensitive to social and cultural contexts. Quantitative and qualitative data obtained from FESs, for example, including quantitative and semi-quantitative surveys, key informant interviews, and focus groups, are critical to obtaining the insider perspective from individuals or communities where interventions are to be implemented.

  • To ensure that disparities are not inadvertently perpetuated in research, proactive consideration of disparities is critical for all forms of research. For example, social and cultural dynamics should be considered in the recruitment strategy for clinical nutrition trials to ensure diversity in study populations, thereby improving the study's external validity.

  • In research designed to understand context (e.g., CBPR, FES), there will always be tensions in balancing external and internal validity. Using common methodological approaches adapted to varying contexts but measuring similar constructs can be useful to provide insights into trends and patterns in different settings and communities.

  • Regardless of the context, research tools and measures require some level of adaptation, validation, and, possibly, translation. It is critical to verify that the fundamental constructs the research seeks to measure are not lost in that process.

  • The time intensity of the proposed approaches should also be noted. When working to better understand communities that experience health disparities, time is necessary to conduct the formative research needed to adequately identify community needs and adapt interventions accordingly. This can create tensions with programmatic needs for rapid generation of evidence. Working closely with program implementers from the outset of implementation research is critical to navigate these tensions so that scientific rigor is maintained while ensuring programmatic relevance.

In closing

Population-based programs to improve nutrition outcomes and address health disparities must account for the social, cultural, economic, and other factors that influence health, diet, and the willingness and ability to make behavioral changes. We describe several methodologies and measures that can be used to generate such evidence, using the framing of implementation research. This approach is important regardless of research context, whether addressing disparities in health outcomes in the United States or in any context globally. Similarly, precision nutrition research must explore not only the biological and genetic underpinnings of health but the same set of disparate social, cultural, economic, and other factors that directly and indirectly influence health outcomes, behaviors, ability to adhere to treatments, and responses to health and nutrition interventions (6). Social and community circumstances, including the food environment and SES, are core drivers of the inequities in access to nutritious and safe food and to health and other services that contribute to health disparities (8), yet there is much room for improvement in their assessment. Generating high-quality evidence and using fit-for-purpose research methodologies are critical steps to understanding the need for designing and improving interventions to address health disparities and advance health equity in all contexts in the United States and globally.

Cognitive Performance and Behaviors

Introduction

This section encompasses methodologies for the study of eating behaviors, the effect of nutrients on cognitive performance, and effects on regions of the brain that influence behavior, mood, and performance. The applications of these lines of study to precision nutrition are almost self-evident. As the areas of study and methods are described, the reader will, no doubt, be reminded of the great variety of behaviors and responses toward foods observed during social occasions, at family or solo mealtimes, and in dining establishments.

What accounts for the tremendous variability observed between individuals, groups, cultures, and other manners of grouping individuals? Clearly, there is a role for the entrainment of habits, desires, favorite characteristics, and any number of other ways of interacting with foods. Is this due to characteristics of the food? If so, then food characteristics can be manipulated in the hope of providing an optimal nutrition experience. But how much of the variability we observe has its origin in the individual? This is a central question of precision nutrition.

It is important to realize that a dichotomy exists for all, or nearly all, areas covered. Many aspects of eating behavior are entrained and are, at least theoretically, amenable to re-training to some extent. At the same time, genetic variation—which accounts for differences in height, hair color, bone characteristics, and other observable traits—has many counterparts in systems related to the intake, digestion, absorption, and metabolic use of nutrients.

Folate is a well-documented example of the variation that exists in systems related to nutrition and behavior. Like many nutrients, folate plays multiple roles, including an important role in the generation and repair of DNA. The methylenetetrahydrofolate reductase (MTHFR) gene codes for a protein of the same name that is critical to an individual's ability to process folate in ways necessary for its functions. Some variations in MTHFR alter a person's ability to utilize dietary folate, but typically are not a threat to health. Other, rare, variations in MTHFR cause profound alterations in the utilization of folate, which during pregnancy, can result in neural tube defects. Supplementation with folic acid to achieve an intake of 400 μg/d during pregnancy reduces the incidence of neural tube defects overall. Folate availability for metabolism is not the sole cause of neural tube defects, but this provides a clear, although rare, example of how individual gene variation can affect nutrient utilization in a person who may otherwise appear healthy.

It is important to keep in mind that variation between individuals is one of the primary considerations when selecting a research method. If all individuals were the same or similar, relatively few studies would be required to determine relationships, and necessary interventions would be well defined.

Measuring eating behavior

The very broad field of assessing eating behavior encompasses a variety of methods including basic neurobiology, behavioral observation, task development, questionnaire construction, and self-report.

Sensory characteristics of food

The sensory, perceptual, and physical attributes of foods influence their ingestion. In fact, a food's sensory qualities are among the most important determinants of food choice and consumption. Sensory systems transduce an array of food-derived stimuli (e.g., light, pressure waves, chemicals) into electrical signals that are conveyed via neurons to specialized centers of the brain to be decoded and acted upon. There is marked individual variability in sensitivity and responsiveness to sensory stimulation. High responsiveness to 1 stimulus or of 1 sensory system is not predictive of high sensitivity to other stimuli or sensory systems. Thus, individuals can perceive the physical world quite differently. Each sensory system is activated by a distinct set of stimuli and receptive processes and the internal signal is conveyed by dedicated, unique nerves to different brain centers. Thus, at the anatomical and physiological level, each sensory system is independent, although the input they provide is integrated at higher brain levels.

People differ in their sensory response to food in many ways. One classic difference is the perception of bitter taste related to phenylthiocarbamide or propylthiouracil. Research suggests that approximately 25% of the population are non-tasters for this bitter taste while 50% are medium tasters and 25% supertasters. Researchers have used these differences in taste to explore consumption of a wide variety of foods, including sweet foods, alcoholic beverages, and vegetables, as well as nonfood consumption that involves sensory mechanisms such as cigarette smoking. Likewise, there has been extensive work on sensory thresholds for sweetness and fattiness, which has led to hypotheses that individual differences in perception may relate to obesity. For example, people with obesity may be less sensitive to sweetness or fattiness, which could result in consumption of more sugar- or fat-containing foods to get the same degree of pleasure. Sugar and fat may also have combined effects, so that some people prefer foods with a specific mixture of sugar and fat in a given food matrix to achieve maximal pleasure. There have also been hypotheses for other qualities and health disorders, such as salt taste and risk for hypertension, sweet taste and diabetes, bitter taste and thyroid disease, or sour taste and renal disorders. However, in no case has a strong predictive relation been documented. Whether this is related to measurement issues or the large array of other determinants of health disorders is uncertain.

There are also wide individual differences in olfactory function, although this area has not been as widely explored in relation to consumption of different types of foods. Visual cues play another key role in estimating portion size and in modifying behavior based on variations in the way food is served. For example, people will serve themselves more food and, thus, consume more if it is served in larger bowls and will consume more if served larger portions. However, it is important to recognize that smell and taste combine with all other sensory properties (appearance, somatosensation, and sound) of a food or beverage to yield the perception of “flavor.”

Sensory measurement methods

Threshold sensitivity measures subjective responses to an array of stimulus concentrations in a given medium (e.g., water, food). In addition to being highly subject to experimental conditions, such as familiarity of the judge with testing procedures, extraneous distractions, and fatigue, thresholds are relatively demanding of a judge's time and attention. In general, thresholds hold limited predictive power for food choice, except where the sensation is disagreeable with respect to hedonic impression or perceived health threat. That is, people are more inclined to reject an unpalatable food than ingest one that is palatable.
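
Threshold estimation is often automated with adaptive (staircase) procedures. The sketch below is a minimal, hypothetical 1-up/1-down staircase run against a simulated judge whose true threshold is known; the starting concentration, step size, noise level, and reversal-averaging rule are illustrative assumptions rather than a standard sensory protocol.

```python
# Minimal sketch of a 1-up/1-down staircase for estimating a detection
# threshold, tested against a simulated observer. All parameter values
# (start, step, noise, number of trials) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
true_threshold = 1.0          # arbitrary concentration units
concentration = 4.0
step = 0.5
reversals, last_direction = [], None

for _ in range(40):
    # Simulated judge: detects when concentration exceeds a noisy threshold.
    detected = concentration > true_threshold + rng.normal(0, 0.2)
    direction = -1 if detected else +1        # down after a hit, up after a miss
    if last_direction is not None and direction != last_direction:
        reversals.append(concentration)       # record concentration at each reversal
    concentration = max(0.1, concentration + direction * step)
    last_direction = direction

# Average the last few reversal points as the threshold estimate.
estimate = np.mean(reversals[-6:]) if len(reversals) >= 6 else np.mean(reversals)
print(f"Estimated threshold: {estimate:.2f} (true value {true_threshold})")
```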

Scaling, or intensity, judgments can be obtained for suprathreshold concentrations of stimuli. Selecting concentrations that reflect those encountered in the food supply should yield more behaviorally relevant responses. The relevance of intensity reports will also be determined by the vehicle used to deliver the stimuli (e.g., in model systems or complex foods). Multiple methods are commonly used to assign numeric scores to sensation (not a familiar task to most individuals), with varying demands on an individual's facility with the task. Ranking stimuli eliminates the need for numeric responses but yields only ordinal-level data. Assessment methods are generally more rapid than threshold testing but are still subject to sensory fatigue, especially if it is difficult to clear stimuli from receptors between successive presentations. Intensity ratings can yield insights about the functionality of the sensory system but hold limited predictive power for food choice. The desired intensity level of a stimulus (e.g., saltiness, viscosity) is highly context specific and a response in 1 food system is not necessarily generalizable to a total diet.

Quality ratings may be easily obtained with threshold or scaling procedures or independently. They are heavily influenced by memory and experience. Depending on the goal of the assessment, it is possible to obtain a label for an isolated quality or the proportional contributions to the totality of the stimulus. Findings from assessments can provide insights on basic sensory function, but the impact of responses on food choice will depend on the valence of how the quality is perceived, which reflects individual experience, health concerns, and other nonsensory inputs.

Understanding the sensory capabilities of an individual and their hedonic impressions will complement the utility of nutrition interventions aimed at optimizing health based on a person's physiological status and lifestyle. Dietary recommendations are more likely to be followed if the foods and beverages that are encouraged are palatable. The degree to which palatability creates health risk is not as well established.

Measuring appetite

Appetitive sensations guide food choice as well as patterns and bouts of eating. Appetitive sensations that drive eating can be influenced by the food presented in a meal and food environment, including food availability, food stimuli, ready-to-eat foods versus prepared foods, cooking smells, and observing other people eating. Addressing appetitive sensations is critical for long-term adherence to, and therefore the success of, any dietary recommendation. While there is no universally agreed-upon set of appetitive sensations, several may be important for initiation of eating, including hunger, craving or the desire to eat, and thirst. Hunger motivates the initiation of an eating event and can be, but is not always, driven by acute shifts in energy metabolism (63, 64). It is a primary determinant of eating frequency. Perhaps the most important situation that can influence hunger is food deprivation, or the amount of time since the last eating bout. Hunger can vary by time of day, pattern of eating, and types of foods or beverages consumed. Missing meals may increase hunger and stimulate overeating at the next meal, as can occur in binge eating.

The sensations of craving and the desire to eat may also prompt an ingestive event but are based more on cognitive and sensory drivers (65–67). It is common to have a desire to eat in the absence of hunger. Although thirst theoretically reflects hydration needs, temporal properties of the complex systems regulating hydration status make the relation with the sensation of thirst less than straightforward (63, 68). The fact that many beverages now contribute energy to the diet further confounds the role of thirst in ingestive behavior.

It is important to consider individual differences in appetitive cues when developing precision nutrition guidance/recommendations, because there is so much variability in response to subjective experiences like hunger. Dietary recommendations that consider an individual's biology and lifestyle should include an assessment of impact on appetite.

Appetitive sensations are subjective and typically rated on visual analog or category scales (69). At any point in time, individuals can be asked to introspect on their appetitive state and provide a rapid response. No invasive procedures are required. However, these sensations are in continual flux, so the timing of ratings is critical for their interpretation. Ratings are highly sensitive to expectation effects (large changes over an eating event and reciprocity between sensations like hunger and fullness). Rarely are individuals trained in the lexicon used by researchers, resulting in a high risk of confounding reports across sensations, but training and acknowledging individual differences in use of rating scales can improve the quality of these ratings (70). Appetitive and food diaries can be used to assess hunger and time since last eating. Although hunger may vary with food deprivation, it is a subjective experience and should be measured using visual analog scales (63, 64). The Three-Factor Eating Questionnaire provides a measure of susceptibility to hunger cues—rather than naturally occurring hunger—that occur after different periods of food deprivation (71).
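
As one illustration of how such ratings are commonly summarized, the sketch below computes a trapezoidal area under the curve (AUC) for hypothetical hunger ratings collected around an eating event; the time points, rating values, and the choice of total versus baseline-corrected AUC are assumptions for illustration, not a prescribed protocol.

```python
# Minimal sketch: summarizing visual analog scale (VAS) appetite ratings
# collected at several time points around a test meal. Values are hypothetical.
import numpy as np

time_min = np.array([0, 30, 60, 90, 120, 180])   # minutes relative to meal start
hunger_mm = np.array([72, 35, 28, 40, 55, 68])   # 0-100 mm VAS hunger ratings

def trapezoid_auc(y, x):
    """Area under the curve by the trapezoidal rule."""
    return float(np.sum((y[1:] + y[:-1]) / 2 * np.diff(x)))

auc_total = trapezoid_auc(hunger_mm, time_min)                        # mm*min
auc_incremental = trapezoid_auc(hunger_mm - hunger_mm[0], time_min)   # vs. baseline

print(f"Total hunger AUC: {auc_total:.0f} mm*min")
print(f"Incremental hunger AUC (relative to baseline): {auc_incremental:.0f} mm*min")
```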

Given the challenges in interpreting appetitive ratings, it is possible to supplement self-report with measured biological changes that occur prior to eating. The most commonly measured of these are cephalic phase responses that prepare the body for food ingestion, including salivation, gastric secretion and motility, or conditioned insulin responses (72, 73). Cephalic phase responses can be conditioned to nonfood stimuli, such as time of the day, or cues associated with a common place to eat (74, 75), and can be related to appetitive cues to eat only with careful control of the environment, time of day, and time since the last meal. Salivation has also been used as a biological measure of craving (76, 77), and liking can be measured using observational indices of facial affect or by electromyographically assessed facial muscle activity consistent with positive affect (78, 79). These measures are still considered experimental, require special equipment, and are presently not likely to be commonplace in precision nutrition.

With regard to environmental variables that may influence eating, availability of food in the home can be assessed using a Home Food Inventory, in which foods stored in cabinets, refrigerators, on counters, and elsewhere are photographed and rated by observers, or by collecting receipts for foods brought into the home (80). People can report how often they eat alone or with whom they eat as part of an eating habit diary. Measuring the influence of many environmental cues, such as food aromas and the sight of food in the natural environment, is very challenging.

Classical or associative conditioning

Classical or associative conditioning can have important sensory influences on ingestive behavior (81–83). Sensory cues such as taste, smell, or visual characteristics, paired with positive ingestive experiences, become conditioned cues for eating those foods. Thus, presentation of these cues can initiate food consumption even in the absence of hunger (84). Likewise, if a person has a negative experience after consuming a specific food, they may develop an aversion to that food (85). Positive or negative experiences that occur during ingestion play an important role in food selection. Hedonic, reinforcing, or the reward value of food (discussed in the next section) also play important roles in eating, and these combine with associative conditioning to influence food selection and motivation or drive to eat.

Conditioned cues can stimulate eating or cessation of eating. Environmental cues, including time of day, context of eating, smell of food, and visual cues of food, can stimulate eating, even if a person is not hungry. Flavor can serve as a conditioned stimulus in flavor–flavor conditioning or variants of flavor conditioning such as flavor–color conditioning (86, 87). Nutrient properties of food can also lead to conditioned responses and pairing a flavor with differences in energy density can differentially influence rate of eating and the satiety cues associated with cessation of eating and thus the amount of food consumed (88, 89).

The role of conditioned cues in stimulating eating can only be determined using a behavioral task that assesses behavioral or physiological responses to individual cues. For example, presenting a visual, olfactory, or taste cue can lead to eating even in someone who reports not being hungry, and these cues can stimulate cephalic phase physiological responses such as increased salivation or conditioned glucose response that prepare the body for food ingestion. Associative conditioning methods provide an unusual window into factors that may drive eating but require access to eating laboratories and may take considerable time to implement. While there is some evidence that classically conditioned physiological responses are important aspects of weight and metabolic control (74, 90), these methods are not likely to become commonly used in precision nutrition.

Reinforcing value, reward value, and liking, or hedonics, of food

Food reinforcement is one of the most important behavioral constructs that influences eating. Reinforcing value refers to how hard someone is willing to work for food, or in other words, how motivated they are to obtain food (91). This is typically assessed using an operant laboratory task in which a participant is given the opportunity to make responses to earn specific foods; the harder they are willing to work, the more reinforcing the food is considered to be. People who find food more reinforcing eat more food than those who are less motivated to eat, and they are more likely to have obesity (92). Foods differ in their reinforcing value, with animal and human research focusing on sugar content or the glycemic index of foods (93–96). Foods with high sugar content or glycemic index reliably increase concentrations of brain dopamine—an important neurotransmitter related to food reinforcement—and activate brain reward centers (95).

There are several questionnaire measures that tap into the motivation to eat. For example, the food choice questionnaire, which takes only minutes to complete, is designed to assess the reinforcing value of food by asking people how many responses they would make for a portion of food (97). Likewise, the reinforcing efficacy questionnaire uses behavioral economic theory to assess the demand for food (98). This asks people how much money they would spend for a serving of food; by varying the price, a demand curve can be created.
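
To make the demand-curve idea concrete, the sketch below summarizes invented purchase-task responses with three commonly reported descriptors: intensity (consumption at minimal price), breakpoint (the first price at which purchasing stops), and a crude elasticity estimate from a log-log slope. The prices, responses, and summary choices are assumptions for illustration, not the scoring rules of the cited questionnaire.

```python
# Minimal sketch of a hypothetical food purchase task analysis. All values
# are invented; real tasks and their published scoring may differ.
import numpy as np

prices = np.array([0.00, 0.25, 0.50, 1.00, 2.00, 4.00, 8.00])   # price per portion ($)
portions = np.array([10, 9, 8, 6, 4, 1, 0])                      # portions "purchased"

intensity = portions[0]                            # demand at (near-)zero price
breakpoint_price = prices[portions == 0].min()     # first price with zero purchases

# Crude overall elasticity: slope of log(consumption) on log(price),
# using only points with positive price and positive consumption.
mask = (prices > 0) & (portions > 0)
slope, _ = np.polyfit(np.log(prices[mask]), np.log(portions[mask]), 1)

print(f"Intensity: {intensity} portions; breakpoint: ${breakpoint_price:.2f}; "
      f"elasticity (log-log slope): {slope:.2f}")
```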

Based on Robinson and Berridge's (99–101) important theoretical work on wanting and liking of reinforcers, questionnaires have been developed to assess wanting versus liking of food (102). Wanting is related to the motivation to eat while liking is related to the hedonic response to food. Finally, there is the Power of Food Scale, which provides a measure of the appetitive drive for food, or hedonic hunger in the absence of a homeostatic need to eat (103). As noted above, appetitive characteristics of food represent an important set of cues that drive eating behavior. Several studies have shown that the reinforcing value of food is a better predictor of food consumption than liking of food (104).

There is wide interest, and controversy, about generalizing the concept of addiction to food (95, 105, 106). Both the reinforcing value of food and food reward can motivate people to eat, and there is a Food Addiction Scale (107) that directly taps into the construct of food addiction. The Reward Based Eating Drive Scale is another way to assess how much eating is controlled by reward value of the food (108). The Power of Food Scale assesses how much food availability, food stimuli, and the taste of food can drive eating and lead to the feeling that food is controlling you (103).

Using food as a reward can also modify the value of food (109). This is generally considered in relationship to child feeding/eating practices and can be measured using the Child Feeding Questionnaire (110, 111). However, adults can also use food as a reward, which can motivate people to eat. This may be assessed by the Reward Based Eating Drive Scale (108).

While the operant reinforcing value of food task is the gold standard for assessing food reinforcement, it is quite lengthy, sometimes taking 45 min or longer to complete. The act of eating is often framed in terms of homeostatic versus hedonic drivers: homeostatic eating refers to eating in response to the biological need to eat or drink, whereas hedonic eating is driven by the pleasure derived from eating.

For this reason, many reinforcing value studies feed participants before engaging them in the reinforcing value task to remove homeostatic reasons to eat and isolate eating driven by the motivational characteristics of food (91). The questionnaire approaches to reinforcing value or reward are quite easy to complete. They tap into a similar construct as the laboratory tasks and can assess a wide variety of foods in a short period of time. The separate but related constructs of reinforcing value and behavioral demand are independently predictive of body weight. Each construct can be assessed for the absolute or relative reinforcing value or demand for food, as research has shown that alternatives to food can be an important determinant of eating (98). Having access to alternative nonfood reinforcers can reduce eating (112). This research is consistent with that on environmental enrichment, which has shown that providing access to nondrug alternatives for animal and human studies can reduce drug self-administration (113–115).

Wanting, liking, and appetitive drive to eat can be assessed by validated questionnaires discussed above (95, 103, 107, 108), and liking can be assessed by Likert-type scales. It is important to differentiate liking (how palatable a stimulus is in an absolute sense), preference (which stimulus is preferred over another), and free choice, in which the person selects food to be consumed. Individual differences in food addiction can be assessed, but it is important to consider that the concept of food addiction is controversial (95, 105, 106), as someone can find something to be reinforcing without being addicted to that commodity. Many of these questionnaires appear to assess similar constructs, so care is needed in the choice of the questionnaire. The use of food as a reward by parents is typically assessed by parental report, as measurement of this behavior in the natural environment would be challenging and costly, but concerns about parental report are realistic.

In all these laboratory or questionnaire measures, the conditions of the task must be controlled, including the participant's degree of hunger (or time since last eating) and dietary restraint, which can influence a person's willingness to work for food, reported liking, or motivation to eat. It is critical to control the presence of food in the testing environment and the lingering smell of food from previous assessments unless reactivity to food cues is being assessed. It may be important to consider the eating environment and whether the study is implemented individually or in groups, as the presence of other people influences food intake in variable ways. It may be challenging to assess these variables at baseline before a person starts an intervention as the participant may have a positive response bias to show their willingness to change. Overall, the likelihood a food will be avoided or rejected is high if it is regarded as unpalatable, whereas the likelihood of its ingestion because it is palatable is modest because of the many nonsensory factors that determine food choice and the wide array of acceptable food options available to most people.

The reward value of food can be assessed using fMRI (discussed later in this section), which assesses how the sight, smell, or taste of food influences activation of brain reward centers. fMRI studies can assess individual brain sites or patterns of brain activation. In addition, hedonic value of the smell, taste, or even appearance of food can be assessed using subjective rating scales.

Psychological factors

A wide variety of psychological factors can moderate eating and should be considered to assess individual differences in eating when designing a precision nutrition program. Failure to consider these factors may lead to poor adherence and the decision that the diet did not work when, in fact, it was not adhered to. A very popular questionnaire is the Three-Factor Eating Questionnaire, which assesses dietary restraint, disinhibition, and hunger (71). Each of these can independently or interactively influence eating. Dietary restraint relates to voluntary attempts to restrict food intake. This is important to know for any studies in which eating behavior is assessed and for questionnaire approaches assessing reasons for eating. Disinhibition is related to food reinforcement (92) and may be related to other measures of food reward as well as choice of unhealthy foods, poor success in weight-control programs, and weight regain after weight loss (116). Finally, the Hunger scale assesses individual differences in susceptibility to hunger cues, not current hunger.

Another important psychological moderator of eating is emotional, or affective, state. Emotions can reliably influence eating in many people, and the Emotional Eating Scale is a valid and reliable way to measure who may be more likely to eat under emotional situations (117). The effect of emotions on eating is variable, and emotions can increase food consumption in some people and reduce it in others. Emotions can redirect attention from cues normally responsible for initiation or cessation of eating and lead to attentional bias that over- or undervalues food. Depression is well known to influence eating, but with wide variability (118). Depression reduces eating in some people yet increases it in others. Stress can also influence eating. A consistent body of research has shown that stress increases eating, and that stress can make food more reinforcing, consistent with the idea of stress and comfort food consumption (119, 120). Depression can be measured using the Beck Depression Inventory (121). Psychological factors may also cause eating disorders, which can influence the amount and types of food consumed. Eating disorders—including bulimia, anorexia, and binge eating disorder—are best screened for using the Eating Disorders Examination (122) or the Eating Disorders Examination Questionnaire (123).

Each construct for measuring psychological factors uses well-validated questionnaires. Some constructs are related to an increased motivation to eat, such as stress, while others come with a reduced motivation to eat, such as dietary restraint. Some constructs have effects that vary across people. Questionnaires may be very useful to better understand factors that influence eating and are easy to administer and score. It is important to note that measures related to psychiatric diagnoses, such as the Beck Depression Inventory (121) or the Eating Disorders Examination Questionnaire (123), are not designed to diagnose a disease but to alert the investigator to potential problems that should be followed up on with structured diagnostic interviews by a trained professional. In general, people do not overreport these psychological issues, but may sometimes underreport to appear psychologically healthier than they are. Cortisol can be used as a biological measure of stress to validate self-report measures of stress or as the measure of stress, bypassing any self-report bias. As cortisol has reliable rhythms, these need to be considered in its measurement.

Delay of gratification and delay discounting for food

When people are strongly motivated to eat, they want food as quickly as possible. This can be heightened by food deprivation (124, 125), but there are also individual differences in the degree to which people can delay gratification associated with eating. Many are aware of the famous “one marshmallow now, two marshmallows later” study that provided young children the choice of 1 marshmallow now or 2 later. The investigators studied how these individual differences were related to cross-sectional and prospective outcomes (126, 127). Not surprisingly, further research has shown that children who have trouble delaying gratification are more likely to become obese than those better at delaying gratification (128). There are a number of age-appropriate approaches to test delay of gratification in children.

A distinct, but related, construct is delay discounting, which assesses the preference for a smaller immediate reward over a larger, delayed reward (129). This task is derived from behavioral economic research, as opposed to the marshmallow task, which is derived from child development literature. Extensive research has shown that delay discounting is related to obesity, weight gain (130–132), and increases in glycated hemoglobin (HbA1c) (130), as well as drug abuse, activity levels (133), and a wide variety of preventive health behaviors (133). As delay discounting is related to so many disorders and health behaviors, it is considered a trans-disease process (134). In addition, delay discounting moderates the effect of reinforcing value (135, 136). In other words, a person who finds something very reinforcing and who discounts the future is at greater risk for obesity than someone who only shares one of these characteristics. The combination of reinforcing value and delay discounting is labeled as reinforcement pathology and is relevant to understanding eating and obesity (131, 137, 138).

Delay of gratification tasks assess whether a child can engage in self-control and delay receipt of a reward, generally with the option of a larger reward later if they do not take the initial smaller reward. These tasks are obviously sensitive to recent eating and what the reward is. An additional consideration is whether the child trusts the tester, because if they do not it does not make sense to wait for a larger reward that may never come (139). Research has shown the original delay of gratification task is sensitive to family background and SES (140).

Delay discounting tasks are a series of hypothetical questions in which people are asked whether they would prefer a small amount of a reward now or a larger amount later, with the amounts and the temporal distance between choices manipulated. There is a questionnaire version, but the usual method involves changing the reward amounts, a task that takes about 5 min (141). There is also a 5-question adjusting delay task that correlates very highly with the longer task. The most common approach uses various amounts of money as the reward, and monetary delay discounting is related to obesity, prediabetes, and diabetes. Researchers have adapted the construct for food-specific delay discounting tasks (142). These tasks can be implemented in the person's natural environment by using smartphones to assess usual decision making (143). Delay discounting may be particularly relevant to assessing motivation to prevent a disease, as prevention, by definition, requires engaging in a healthy behavior now for later benefits. People who strongly discount the future will have challenges engaging in healthy behaviors to prevent a disease.
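
As a concrete illustration, the sketch below fits the widely used hyperbolic discounting model, V = A/(1 + kD), to hypothetical indifference points from a monetary task; the delays, values, and starting estimate are assumptions, and real tasks differ in how indifference points are obtained.

```python
# Minimal sketch: fitting a hyperbolic discounting model V = A / (1 + k*D)
# to hypothetical indifference points from a monetary delay discounting task.
import numpy as np
from scipy.optimize import curve_fit

amount = 100.0                                        # delayed reward ($)
delays_days = np.array([1, 7, 30, 90, 180, 365])
indifference = np.array([95, 85, 60, 40, 28, 15])     # present value ($), hypothetical

def hyperbolic(delay, k):
    """Present value of the delayed reward under hyperbolic discounting."""
    return amount / (1.0 + k * delay)

(k_hat,), _ = curve_fit(hyperbolic, delays_days, indifference, p0=[0.01])
print(f"Estimated discount rate k = {k_hat:.4f} per day")
# Larger k values indicate steeper discounting of delayed rewards.
```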

Sensations of fullness, satiety, and satiation

Just as appetitive cues signal initiation of eating, fullness can signal cessation of a meal or termination of eating. Because fullness relates to cessation, not initiation, of a meal, it most strongly influences portion size, not eating frequency (144). Satiation refers to changes in appetite during a meal that are associated with the end of a meal. Satiety is the sensation that most closely associates with the intermeal interval (i.e., time between eating events) (145).

The nutrient composition of foods consumed can influence how full people feel during and after eating. Research suggests that protein is the most satiating nutrient, followed by fiber and complex carbohydrates (146, 147). Studies also suggest that solid foods are more satiating than liquids (148). In addition, cessation of eating is more strongly related to volume of food consumed rather than energy content of the food (149, 150). The general thought is that people stop eating when they are full or are in a state of biological homeostasis—a point when they don't need any more kilocalories or nutrients. However, this idea is too simplistic, as many people continue to eat after their biological needs for food have been met.

A simple approach to assess fullness is to ask how full someone is, which can be accomplished by a visual analog scale. Bartoshuk and colleagues (70, 151) have shown that such subjective ratings can be unreliable, particularly when compared across individuals, and have provided insights into how to improve subjective assessment of sensory ratings. Cardello and colleagues (152) developed a perceived satiety scale using the principle of magnitude scaling. Fullness can also be rated based on how full the stomach is, which is primarily used with young children (153). Degree of fullness can be rated during a meal or after consuming a specific amount of food. Fullness ratings are also sensitive to expectations and are often incorrectly considered the opposite of hunger.

The most objective measure of satiation is the amount of food consumed in a meal, whether that is kilocalories, nutrients, or volume of food. However, as noted above, satiation can be influenced by the sensory aspects of food, so that someone may stop eating after eating 1 type of food, and then start when a new food is provided. This is the basis for consuming desserts after you cannot eat another bite of your entrée (154, 155). Satiety is assessed by the duration of time from the last meal to the next bout of eating. This can be observed in controlled settings. Unfortunately, it is hard to extract satiety from electronic food diaries because the recording times are not precisely related to the end of the previous meal and the initiation of the next meal.

Attempts have been made to identify biomarkers to more objectively classify factors influencing the end of a meal. The most common indices are gut peptides (156, 157). While they tend to change in concert with appetitive sensations (e.g., reported satiation hormones like cholecystokinin and glucagon-like peptide 1 increase during an eating event as do fullness ratings in study participants), the preponderance of evidence does not support a reliable predictive association within individuals. Stronger evidence is derived by nonphysiological administration of peptides to evoke changes of sensation (158, 159). Commonly, an array of sensations is concurrently rated along with measurement of multiple gut peptides and any noted associations are reported without control for multiple tests or a clear hypothesis. For example, an association between changes in a peptide reported to hold satiation properties (determinant of portion size) with hunger or a desire to eat, which may drive eating frequency, is of questionable meaning.

Habituation and sensory-specific satiety

Research on habituation and sensory-specific satiety has produced some of the clearest evidence that meal cessation is not completely due to reduction in energy needs, stomach distention, or other biologically plausible reasons to stop eating. Habituation is a general learning process in which behavioral and physiological responses decline after repeated presentations of a stimulus, as occurs for food stimuli when eating a meal (160). While reduction in response alone is not evidence for habituation, the fact that a person can recover responding to the same stimulus after a novel stimulus is presented provides strong evidence that the decrease is related to habituation. This is called dishabituation (160). The rate of habituation can also be influenced by presenting a variety of foods rather than 1 food (161). Food stimuli can be gustatory, olfactory, or visual cues, or a combination of these (160). The rate of habituation is related to the amount of food consumed in a meal (162, 163), with slow habituators consuming more food and being more likely to have obesity (164, 165) than faster habituators. In primate studies, single-cell recordings in brain areas related to eating have shown reduced response to repeated presentations of a food, while the same neuron recovers responses when a new food is presented. Interestingly, activation of the taste cortex continues throughout the presentations, but responses in the hypothalamus demonstrate habituation (166–169). Long-term habituation has also been shown; for example, effects of consuming the same food can last over meals or days (164, 170). Research on habituation and reinforcing value has shown that each predicts about 30% of variance in kilocalories consumed in an ad libitum meal (171). Given that habituation rate is related to obesity and the amount of food consumed, an important area of research is discovering characteristics of foods that slow or speed up habituation.
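
A minimal sketch of how within-session habituation might be quantified is shown below: the slope of responding across repeated presentations of the same food, with recovery of responding to a novel food taken as evidence of dishabituation. The trial values and the recovery criterion are hypothetical.

```python
# Minimal sketch: quantifying habituation as the slope of responding across
# repeated presentations of one food, with recovery to a novel food as a
# check for dishabituation. Trial values (e.g., responses or salivation per
# trial) and the recovery criterion are invented for illustration.
import numpy as np

same_food = np.array([24, 20, 15, 12, 9, 7, 6, 5])   # trials 1-8, same food
novel_food_response = 21                              # trial 9, new food

# Habituation rate as the linear slope of response over trial number;
# more negative slopes indicate faster habituation.
trials = np.arange(1, len(same_food) + 1)
slope, _ = np.polyfit(trials, same_food, 1)

recovered = novel_food_response > same_food[-1] * 1.5  # arbitrary recovery criterion
print(f"Habituation slope: {slope:.2f} responses/trial; "
      f"dishabituation observed: {recovered}")
```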

Sensory-specific satiety is a similar phenomenon, in which liking of a repeatedly presented food declines, while liking of foods not consumed does not (155, 172, 173). As with habituation, presenting a variety of foods slows the decline in liking compared with circumstances in which only 1 food is presented. Sensory-specific satiety can arise from even small differences in food characteristics, such as shape. Long-term sensory-specific satiety has also been shown.
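
In the same spirit, the sketch below expresses sensory-specific satiety as the change in liking of the eaten food relative to the average change for uneaten foods, using hypothetical pre- and post-meal hedonic ratings; the foods, values, and index are illustrative only.

```python
# Minimal sketch: a sensory-specific satiety index computed from hypothetical
# pre- and post-meal liking ratings (0-100 scale). All values are invented.
pre = {"pasta": 80, "yogurt": 70, "apple": 65, "crackers": 60}
post = {"pasta": 48, "yogurt": 66, "apple": 63, "crackers": 58}
eaten = "pasta"

delta_eaten = post[eaten] - pre[eaten]
delta_uneaten = sum(post[f] - pre[f] for f in pre if f != eaten) / (len(pre) - 1)

# More negative values indicate a stronger selective drop in liking of the eaten food.
sss_index = delta_eaten - delta_uneaten
print(f"Change for eaten food: {delta_eaten}; mean change for uneaten foods: "
      f"{delta_uneaten:.1f}; sensory-specific satiety index: {sss_index:.1f}")
```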

Given that a bout of eating begins with cues that initiate eating and ends with cues that terminate eating, overeating can result from an excess motivation to eat or the failure to stop eating. Understanding how habituation and sensory-specific satiety are related to satiety and satiation is important because some people overeat because of individual differences in these factors, not only the drive to eat. Others may overeat due to an excessive drive to eat, and some have problems with excessive drive to eat and the failure to develop satiation (174). As with the motivation to eat, basic behavioral and neurobiological research is making great advances in understanding the role of hormones and brain reactivity in why people start and stop eating.

Both habituation and sensory-specific satiety involve repeated presentation of food stimuli to assess individual differences in these variables. To be confident that a decrease in response is due to habituation, it is important to present novel stimuli and to observe dishabituation, or to present a variety of food stimuli and slow down the rate of habituation. Sensory-specific satiety is defined by changes in the hedonic response to food after repeated presentations. For habituation and sensory-specific satiety, it is important to control previous eating, the types of foods studied, and the food environment. In habituation studies it is also very important to control extraneous presentation of nonfood stimuli, as any sensory stimulus can result in dishabituation to food. This is, in part, why people can eat so much while watching television or movies (175).

Measuring patterns of eating

Wide individual differences in eating are relevant to the amount of food and energy consumed in a meal. For example, people may eat at faster or slower rates (175), take smaller or bigger bites (176), or change their eating rate during a meal, eating faster as the meal starts and slower as the eating bout lengthens. They may eat and then drink, drink first and then eat, eat all of 1 food before starting a second and then a third food, or eat foods in different orders. They may also combine flavors or eat each food separately.

There can also be different patterns of eating throughout a day. Some people eat 3 meals a day and a snack while others regularly miss breakfast and eat 2 meals. New dietary approaches include time-limited feeding in which people attempt to eat only within a limited number of hours (177). There is also intermittent fasting, where people either do not eat or eat a limited amount of energy for 1 or more days a week.

Physical activity can influence eating patterns in many ways. Because physical activity expends energy, any assessment of overall energy balance needs to account for it. Physical activity can affect appetite and the reinforcing value of food (178). Different types, durations, and temporal relations of physical activity can increase or reduce appetite. Additionally, the pattern of physical activity may be important. Given that exercise alters energy expenditure and activates brain reward processes (179, 180), specific eating and exercise patterns may be important. Exercise before eating may stimulate appetite, particularly in someone who is in energy balance. Exercise after eating can influence the process of satiation, which can shorten or lengthen the time until the next meal. These relations have not been well documented.

The most straightforward way to measure patterns of eating is by direct observation, usually accomplished by videotaping or observing eating directly, and then having coders rate characteristics of interest. More technologically creative ways to measure eating may be less intrusive than knowing someone is watching you eat. For example, plates attached to pressure transducers can quantify how much food is removed from the plate in real time. While this does not measure eating behavior directly, it can provide an index of the rate of eating and the amount of food consumed at each bite. More biologically oriented measures include caps that are placed on the teeth to register bites, audio recordings that can be used to interpret bites, or neck electromyography for assessing swallowing. These can be used to assess eating in the natural environment rather than in the laboratory. Disadvantages include the burden of direct observation, which is time consuming and costly, and lack of access to standardized and well-validated biologically oriented measures of eating patterns.
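
As an illustration of what such instrumented measures can yield, the sketch below derives mean and interval-by-interval eating rates from a hypothetical cumulative intake curve of the kind a plate-mounted pressure transducer could record; the timestamps and gram values are invented.

```python
# Minimal sketch: deriving eating rate from a hypothetical cumulative intake
# curve (grams removed from the plate over time). Values are invented.
import numpy as np

time_s = np.array([0, 60, 120, 180, 240, 300, 360, 420])
grams_eaten = np.array([0, 55, 105, 145, 175, 195, 205, 210])

# Mean eating rate over the whole meal and rate within each recording interval.
mean_rate = grams_eaten[-1] / time_s[-1] * 60                 # g/min
interval_rates = np.diff(grams_eaten) / np.diff(time_s) * 60  # g/min per interval

print(f"Mean eating rate: {mean_rate:.1f} g/min")
print("Per-interval rates (g/min):", np.round(interval_rates, 1))
# A declining interval rate across the meal is consistent with the typical
# deceleration of eating toward meal termination.
```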

Patterns can also be measured using electronic food diaries that provide a time stamp to ensure the responses are recorded at expected times. There is extensive research on challenges with having people record their own eating, with underreporting that varies with body weight (181, 182). An advancement in recording may be to use biological variables that change during eating; for example, continuous glucose monitors provide time-stamped blood glucose values that change, in part, with eating. A continuous glucose monitor can also provide a general idea of the composition of foods but, given wide individual differences in how people respond to the same foods, cannot be used to validate food records. Extreme caution is needed when using continuous glucose monitors to validate eating patterns, as changes in activity or acute stress can also influence blood glucose. Another invasive approach is the use of electromyography to measure chewing and eating (183).

The relation between exercise and eating patterns can be assessed using accelerometers, which provide valid measures of time-stamped activity and can be reliably related to eating pattern if used with a time-stamped electronic food diary. Accelerometers are easy to use, with little burden on the participant. Combining accelerometers with food diaries to estimate total energy intake and energy balance can help detect underreporting and, in combination with weight loss, can provide an index of reporting accuracy (184).
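
A minimal sketch of one way such a check might be implemented is shown below: the ratio of reported energy intake to expenditure estimated from accelerometry is used to flag possible underreporting. The values and the 0.76 flagging threshold are illustrative assumptions, not a validated cutoff for any particular study.

```python
# Minimal sketch: flagging possible energy intake underreporting by comparing
# reported intake (from a time-stamped food diary) with total energy
# expenditure estimated from accelerometry. The threshold is an assumption.
def plausibility_ratio(reported_kcal_per_day: float,
                       estimated_tee_kcal_per_day: float) -> float:
    """Ratio of reported energy intake to estimated total energy expenditure."""
    return reported_kcal_per_day / estimated_tee_kcal_per_day

ratio = plausibility_ratio(reported_kcal_per_day=1650,
                           estimated_tee_kcal_per_day=2400)
flagged = ratio < 0.76  # illustrative cutoff: ratios well below 1 in weight-stable adults
print(f"EI/TEE = {ratio:.2f}; possible underreporting: {flagged}")
```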

Measuring cognitive function

Cognitive function (CF) refers to “multiple mental abilities, including learning, thinking, reasoning, remembering, problem solving, decision making, and attention” (185). Objective tasks or tests, as well as subjective methods, can be used to measure CF. Objective methods typically have the person complete a task or test and measure various attributes of the performance (e.g., completion time, speed, accuracy, recall). Subjective methods collect self-reported data and can be used to measure mood, mental energy, memory, etc. These methods can be used in various study designs: 1) acute [e.g., participant eats food and CF is measured during or shortly after (186, 187)], 2) long-term diet change [e.g., pre- and post-diet assessments (188–190)], or 3) cross-sectional or longitudinal studies (191–193).

Interest in the effects of nutrition on CF has been growing. It has become clear that at least some nutrients (and nonnutrients), alone or in combination, can affect acute measures of cognitive performance, such as short-term memory, reaction time, or vigilance. Within studies that demonstrated improvements in CF lie data describing people who did not respond as well as others who responded positively. Some studies have assessed declines in CF relative to participants' previous norms. Both situations represent targets for precision nutrition: identifying the individuals who might benefit and the nutrient regimens that appear most effective for them.

Objective CF methods

Common CF measurements include immediate- and long-term (60-min delay) recall tests, which are used to measure short-term and long-term memory, respectively. For example, study participants are read a list of words and then repeat back as many words as they can remember (194, 195). Similarly, participants can be shown a drawing and asked to recall and re-draw the image at a later time [e.g., Rey-Kim Memory Test–Complex Figure Test (196)]. In addition, the ability to complete tasks can be measured with the Trail Making Test, in which a person draws lines to sequentially connect a set of points [e.g., 23 numbers distributed on a piece of paper, or alternating numbers and letters (197)]; the total time and number of errors are recorded to determine CF. A verbal fluency test gives participants 1 min to produce as many unique words as possible within a given category or that start with a given letter (198–200). The Stroop Color-word Reading Test is used to measure CF and involves color names printed in an incongruent ink color (e.g., the word “red” printed in green text); correctly identified words and colors are counted (196).
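
To illustrate how an objective recall test might be scored, the sketch below counts correctly recalled list words and intrusions for an invented word list and response set; real instruments have their own scoring manuals, which should be followed.

```python
# Minimal sketch: scoring an immediate free-recall trial by counting correctly
# recalled list words and intrusions. The word list and responses are invented.
presented = {"apple", "river", "candle", "tiger", "spoon", "garden",
             "window", "pencil", "cloud", "basket"}
recalled = ["river", "spoon", "cloud", "tiger", "ocean", "spoon"]  # participant output

unique_recalled = set(recalled)           # repetitions counted once
correct = unique_recalled & presented     # words from the studied list
intrusions = unique_recalled - presented  # words not on the studied list

print(f"Correct: {len(correct)}/{len(presented)}; intrusions: {len(intrusions)}")
```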

More complex tasks, such as logical reasoning and semantic processing, are also used. In logical reasoning, participants are shown letter pairs and statements about their ordering, ranging from simple active (e.g., “A follows B”; BA) to passive negative (e.g., “A is not followed by B”), and are asked to determine whether each statement correctly describes the letter pair (194). In semantic processing, participants retrieve information from general knowledge; they decide whether sentences are true or false [e.g., “Canaries have wings” or “Dogs have wings” (196)]. Additional objective tasks include the choice-reaction time task, visual-search task, egocentric mental-rotation task, the attention-switching task (201), and the n-back test (202).

Subjective CF methods

Subjective methods are essential to measuring CF. Mood can be measured by report on how the participant is feeling, typically with a scaled response (e.g., 0–100). The Bond-Lader Visual Analog Scales use adjective pairs (e.g., happy-sad, sociable-withdrawn, and calm-excited) to report subjective states, and a three-dimensional score (alert, content, and calm) is calculated based on responses (203). Similarly, the Profile of Mood States Questionnaire inventories mood and arousal states by having participants rate a series of adjectives with a 5-point scale, which can be factored into 6 mood subscales (tension, depression, anger, vigor, fatigue, and confusion) as well as overall total mood disturbance (204). The State-Trait Anxiety Inventory measures anxiety via a 4-point scale in response to statements (e.g., “I am calm”) (205). In addition, self-reported scale data can be collected before and after completing a task to determine reported level of difficulty, effort, and tiredness (201). Nutrition-specific subjective measurements have been created, such as the Caffeine Research Visual Analog Scales, which measure CF attributes related to caffeine (e.g., relaxed, alert, jittery, tired, tense, headache, and overall mood) on a scale of 0–100 (206). Self-reported data via subjective scales can be collected in relation to eating behaviors, such as the Barratt Impulsiveness Scale, the Dutch Eating Behavior Questionnaire-Restraint, and the Food Cravings Test (207). Subjective tests play an essential role in measuring the effect of nutrition on CF.
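
As an illustration of how subscale scores are combined, the sketch below computes a single mood summary in the spirit of a total mood disturbance score (negative subscales summed, vigor subtracted); the subscale values are hypothetical, and scoring of any specific instrument should follow its published manual.

```python
# Minimal sketch: combining hypothetical mood subscale scores into a single
# summary resembling a total mood disturbance score. Not the official scoring
# of any particular questionnaire.
subscales = {"tension": 8, "depression": 5, "anger": 4,
             "vigor": 14, "fatigue": 9, "confusion": 6}

negative_total = sum(v for k, v in subscales.items() if k != "vigor")
total_mood_disturbance = negative_total - subscales["vigor"]
print(f"Total mood disturbance (illustrative): {total_mood_disturbance}")
```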

Strengths and limitations

A major strength of CF research is the wide breadth of methods used. A combination of subjective and objective tests allows different types of data collection. As CF is complex, the use of numerous tests enables a wide range of ways to capture an effect for a given intervention. Methods can be tailored to specific interventions (e.g., mood instead of overall CF). Many of the CF methods described above have been validated (208, 209), appear to have consensus acceptance based upon wide use in published studies, and have been used to observe diminished, normal, and improved cognitive performance.

Numerous limitations exist for measuring CF. The CF methods described above are typically administered at discrete times (i.e., before and after an intervention), and more research on short-term (e.g., hour-to-hour, day-to-day) variation is needed. Moreover, what constitutes variation within a normal range is unknown for many of these methods. Because the reliability of these methods is largely unknown, it is unclear whether the same result/score would be produced with repeated administration.

Studies on nutrition and cognitive performance have been uneven in discussing the implications of a battery of tests that provided mixed results. It is currently unknown whether there is an optimal combination or number of tests needed to reveal CF changes. Many studies have been affected by design issues or failure to repeat or confirm observed results (210). With these limitations in mind, future research should include refining how results for nutrition effects on CF are studied and reported.

Key consideration in data interpretation

CF methods and studies described in this section represent a frontier in terms of applying findings. Many studies have reported improvements in 1 or more aspects of CF in response to interventions with nutrients and nonnutrients. In many cases, the improvements were statistically significant but numerically small. It is not clear how large an improvement must be to represent a meaningful change in CF. The link between improvement measured in a study and improvement in everyday life remains to be clearly detailed. Some areas of CF can be assessed by multiple methods. It is important to remember that many of the methods described were developed to measure cognitive performance and link the results to some aspect of function observed in daily living. The choice of methods is left to the researcher. While there is often agreement as to which methods measure which functions, including some standardized collections of tests, there is no consensus around which methods can or should be used consistently to reveal these relations.

Methods to assess cognitive performance in response to nutrition interventions represent an area with potential. To achieve this, there needs to be consensus regarding whether, and how, results can be translated to the nutrition of individuals and groups.

Brain imaging and responses related to food addiction

fMRI

Researchers have used fMRI to assess changes in brain activity in response to foods, particularly the sensory properties of foods and cues related to foods or eating. Images generated by fMRI provide detailed, recognizable images of brain cross-sections with clear distinctions between brain regions. The size, hue, and intensity of color in specific regions of the images represent activity changes to viewers.

fMRI has the advantages of being noninvasive and not exposing the patient to radiation. The images provide clear representations of changes over time. Once a laboratory has been set up for fMRI, the technique is relatively easy for a researcher to use. However, there are significant challenges and caveats associated with fMRI in nutrition research. These relate to the technology or analysis methods used, issues surrounding food intake, and the intersection of the technology and the intake of foods.

Issues related to technology and analysis

Recent examples can help illustrate technology and analysis issues with fMRI studies. Eklund et al. (211) used resting-state fMRI data from 499 healthy controls to conduct 3 million task-group analyses. They assumed that baseline values within the sample were normally distributed and that no consistent shifts in blood oxygenation level–dependent (BOLD) activity would be found, as participants were in the resting state. Under these conditions, they expected a false-positive rate of approximately 5%. Instead, using the 3 most common software packages for fMRI analysis employed at the time, they found rates of false positives as high as 70%. Factors involved in the erroneous results were related to how the software handled the incoming data and whether parametric or nonparametric methods were used.
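
The general statistical point, that many uncorrected tests inflate the chance of spurious findings, can be illustrated with a much-simplified simulation, shown below. This sketch treats "voxels" as independent one-sample t tests on null data and is not a reproduction of the cluster-level inference issues Eklund et al. examined; the sample sizes and thresholds are arbitrary.

```python
# Minimal sketch of the multiple comparisons problem: with many independent
# null tests, the chance of at least one "significant" result far exceeds 5%
# unless a correction is applied. A deliberately simplified illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_tests, n_subjects, n_experiments = 1000, 20, 200
family_wise_hits = 0

for _ in range(n_experiments):
    # Null data: no true group-level signal in any "voxel".
    data = rng.normal(size=(n_tests, n_subjects))
    _, p = stats.ttest_1samp(data, 0.0, axis=1)
    if (p < 0.05).any():                      # uncorrected threshold
        family_wise_hits += 1

print(f"Family-wise false-positive rate without correction: "
      f"{family_wise_hits / n_experiments:.2f}")
# A Bonferroni-corrected threshold of 0.05 / n_tests would bring this
# back toward the nominal 5% level.
```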

In earlier work, Sacchet and Knutsen (212) found that using different values for the spatial smoothing applied by fMRI software—each of which had been used in relevant published studies—could return different results for the localization of reward-based brain activity.

The reliability and reproducibility of fMRI results have been of concern. Bennett and Miller (213) covered a wide range of factors that could potentially affect the reliability of interpretations found in published studies. They provide a summary of their findings: “There is little agreement regarding the true reliability of fMRI results.” Chen et al. (214) also addressed reliability/reproducibility issues. They analyzed 4 independent datasets to assess 1) the test–retest reliability and replicability of resting state fMRI data, 2) how multiple comparison correction strategies impact reliability and reproducibility, and 3) how sample size might influence reliability as well as power and positive predictive value.

Brain imaging methods provide an intriguing counterpart to methods used for assessing cognitive performance. The former seek to identify the functioning areas and pathways of the brain, with exploration of associations with behaviors or cognitive performance. The latter seek to determine the outcomes of these functioning areas and pathways, and in some cases, seek associations between the 2. Research in these areas remains largely exploratory and the methods described have not been validated for creating a nutrition plan in the clinical or guidelines setting. Therefore, applying these methods and their findings to precision nutrition remains aspirational. However, findings have confirmed variation in responses between people and provide an important foundation for precision nutrition. It is common for studies to illustrate that some people either fail to respond to an intervention or respond to a much greater extent than people between these extremes, who most often form the majority of study participants. This suggests that a clinical approach that uses several potential interventions (or doses) in sequence could help determine which intervention would be useful.

In closing

NIH Director Dr. Francis Collins characterized precision nutrition as “…more targeted and effective diet interventions based on an individual's personal characteristics” (215). This will require specific knowledge of an individual's needs, based upon reliable and repeatable clinical tests or assessments. The methodologies reviewed here are considered to accurately identify factors that influence food and eating choices (i.e., eating behaviors) and aspects of cognitive performance, such as memory, attention, and the ability to maintain attention. These methods, along with newer ones, will continue to contribute to our knowledge. In view of current consumer and professional methods to detect changes in mood or metabolic parameters (e.g., rings with indicators and apps connected to smart watches), a similar push to develop real-time methods to detect one's eating behaviors and cognitive performance can be anticipated. Methodologies with the required accuracy, reliability, and repeatability will be developed in time.

Dietary Assessment

Introduction

As we move into a future focused on precision nutrition, measuring individual dietary behavior will continue to be necessary. At the same time, dietary intake is becoming increasingly complex, with new products entering the marketplace at unprecedented levels. Recent advances in biomedical techniques such as genomewide association studies, metabolomics, and proteomics, along with new analytic approaches using AI and machine learning, are providing potential new ways to examine complex processes simultaneously and will help scientists understand differences in response to dietary intake across groups and individuals. At the extreme level, precision nutrition offers the promise of prescribing individualized diets based on specific genetic and metabolic signatures and identified risk of disease outcomes. To get to that point will require precise data on dietary intake, not only of patterns, but of specific nutrients.

In recent years, there has been some investment in improving dietary assessment methods, but this has not matched the development of new biomedical measures. There are now new biomarkers that can be used to validate or enhance dietary intake reports, photographs can better capture portion size, and advances in computing are helping reduce the burden of data collection. Although many advances using cameras have been made, they remain burdensome and cannot identify the ingredients of an item and, therefore, must be coupled with self-report. At the present time, the most-used tools remain the 24HR (or, less so, the diet record) and the FFQ. Recent advances in computing, the ability to complete reports online, and the inclusion of skip patterns to reduce respondent burden have all contributed to improved precision in these measures, but much remains to be done.

Dietary assessment continues to be important in public health nutrition and, from that angle, the validity and precision of assessment methods for group-level estimates of population distributions are also important and may often be obtained with fewer assessments. For example, NHANES uses 2 nonconsecutive 24HRs from which statistical adjustments can be made to describe population intakes. In general, the validity of the resulting nutrient intakes for reflecting habitual intake depends on the type of data collection (interview, record, or questionnaire/apps), time frame of assessment (current vs. retrospective), portion-size estimation (weighed, models, standard, or none), and the food-composition database. The resulting measurement error can be systematic, which in monitoring studies leads to over- or underestimation of mean intake and of the proportion below or above a certain cutoff (such as the Estimated Average Requirement or relative non-centrality index), or random, generally caused by day-to-day variation in intake. Random variation leads to misclassification of individuals around the mean rather than a shift in mean intake, but it still distorts the estimated proportion below or above a cutoff. Importantly, in epidemiologic studies, it also increases the likelihood of underestimating associations with health outcomes. Random error can be reduced by increasing the number of daily intakes measured. Here, we briefly describe the main dietary assessment methods and their strengths and limitations, with a focus on precision nutrition. New developments in portion-size estimation and food recognition are also discussed.
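
To make the distinction concrete, the short simulation below (in Python, with entirely hypothetical intake values and error variances) illustrates how random day-to-day error leaves the group mean essentially unbiased while attenuating the correlation with an outcome, and how averaging more recall days recovers part of that association; it is a sketch, not an analysis of any study cited here.

```python
# A minimal simulation sketch (hypothetical values) of how random day-to-day
# error in 24HRs attenuates diet-outcome associations while leaving the group
# mean essentially unbiased, and how averaging more recall days helps.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
true_intake = rng.normal(70, 15, n)                   # habitual intake, e.g., protein g/d
outcome = 0.02 * true_intake + rng.normal(0, 0.5, n)  # outcome truly related to intake

for n_days in (1, 2, 7):
    # each recall day adds random within-person error (SD assumed equal to between-person SD)
    reported = true_intake[:, None] + rng.normal(0, 15, (n, n_days))
    mean_report = reported.mean(axis=1)
    r = np.corrcoef(mean_report, outcome)[0, 1]
    print(f"{n_days} recall day(s): group mean = {mean_report.mean():.1f}, "
          f"observed correlation = {r:.2f}")
```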

Dietary assessment approaches

Diet records

For many years, the weighed dietary record was considered the gold standard for obtaining accurate dietary data (216). This method relies on compliant participants using a weighing scale to record everything they eat or drink throughout a day or, more often, an entire week. For homemade recipes, they are asked to measure the ingredients and, if possible, the total volume and the proportion consumed. For all items, they are asked to measure the portion served and to subtract any leftover portion. When extreme precision is required, an observer may stay with the individual throughout the day to assist in the weighing and measuring (217). This observer method has been used frequently in low- and middle-income countries and with low-literate individuals (218). The record approach avoids concerns about memory during recall, as the foods are recorded in real time, providing accurate assessment of total energy and other nutrients consumed when carefully completed. Three- or seven-day self-recorded diet records have been used in several large studies, including the Baltimore Longitudinal Study of Aging and the multisite European Prospective Investigation into Cancer and Nutrition study (219, 220).

Despite their ability to provide valid quantified detail on dietary intake, diet records have lost favor due to numerous limitations. First, as with dietary recalls, data are obtained for a specific day or set of days, which comes with the potential for misclassification of usual (current) intake. The strict compliance required has often led studies using diet records to be restricted to highly educated and motivated participants to maximize internal validity (216). Over time, it has also been recognized that participants are less likely to be compliant with this demanding method, leading to poor completion rates, reduced study power, and compromised external validity (221, 222). Additionally, diet records tend to show lower energy intake relative to the 24HR, due to reductions in food intake and a tendency to avoid poor eating habits during this conscious observation (223). For these reasons, weighed diet records are now used less frequently than other methods.

Nonweighed diet records are currently in use by several commercial apps promoted for weight loss or health/fitness such as MyFitnessPal (developed by MyFitnessPal, Inc.) (224). However, the quality and completeness of the underlying food composition database are often not the same as established food tables (225) and the USDA database (226).

24HRs

A commonly used method of gathering nutritional intake data is the 24HR, wherein a participant or caregiver is prompted to report all food and beverage consumption from the prior 24-h period. Typically, this information is gathered by an interviewer in a structured setting; however, several self-administered versions of the 24HR now exist. Information from recalls is used to estimate nutrient intake, quantify intake adequacy, describe dietary patterns, and determine adherence to or deviation from diet recommendations or requirements. Data gathered are entered into a nutrient analysis software program (e.g., the University of Minnesota Nutrient Data System for Research, ESHA Research Food Genesis, MyFitnessPal, etc.), with varying levels of accuracy. Estimates using other quantification methods, such as the diabetes exchange system, may be used as well.

Advances in accuracy of the 24HR have been accomplished with the USDA's Automated Multiple-Pass Method (AMPM), which is used in the NHANES (227), with similar methodologies used in other dietary software, including the University of Minnesota Nutrient Data System for Research (228). The interviewer begins the 24HR assessment by initiating a first-round open-ended dialogue asking the participant to list all food and beverages consumed the prior day, recounting as many specific details as possible. After this free-form discussion, the interviewer returns to specific items and asks for more granular detail (e.g., condiments, brand, serving-size estimation, timing of consumption, preparation method, etc.) as well as any potentially missed items (e.g., water, added salt or sugars, supplements, small bites of foods consumed, etc.). The interviewer also uses reflecting/repetition to confirm accuracy of information received from the respondent. Other contextual information may be gathered, such as the time of day when a certain food item was consumed or who prepared each meal. In clinical and community settings, particularly in low- and middle-income countries, the 24HR is most commonly conducted using an interviewer.

Alternatively, participants can self-complete the 24HR with an automated program, such as the National Cancer Institute's (NCI's) Automated Self-Administered 24-Hour (ASA24) Dietary Assessment Tool (229). Like the USDA AMPM, the ASA24 involves an iterative 7-step process, wherein respondents are redirected to previous steps if they indicate having missed an item during reporting. The respondent reports the following: 1) a list of items typically consumed within normal meals (i.e., breakfast, lunch, snacks, and dinner), 2) a review of any gaps left in meals, 3) more granular details of items reported in steps 1 and 2, 4) a final review for accuracy, 5) a list of forgotten foods or situations where food may have been unreported, 6) a last chance to review and submit additional items, and 7) a question regarding whether the reported 24HR is similar to the respondent's usual intake. Studies have shown relatively good completion rates for the ASA24 relative to the interviewer-administered AMPM among literate, compliant individuals (230), although validity was somewhat lower than for the AMPM (231). Importantly, the validity of a self-reported versus interviewer-administered 24HR depends on participant education, literacy, and commitment to accurate completion.

In the context of an interviewer-led 24HR, the nature of the 24HR's flexible, free-form response format makes it appropriate and applicable for a variety of dietary patterns and cultural contexts. Because respondents are not limited to a preset list of foods, culturally relevant foods are less likely to be unreported. This is critical when the study population includes subgroups not considered in the food list for common FFQs and for more detailed consideration of health disparities. The 24HR has been validated and is regarded as an excellent measure of mean intakes of groups. It is not adequate, however, for estimating the usual intakes needed for precision nutrition as multiple recalls must be conducted for this purpose.

Although the 24HR provides the most complete information without bias in diverse groups, it also has several limitations in various research settings. The validity of data from a 24HR relies on several factors working in harmony: the interviewer technique, software adequacy (complete database, use of multiple-pass system, etc.), absence of respondent reactivity, accuracy of respondent recall, and appropriate use of gathered data for a given research objective, among others.

Significant training is critical for an interviewer-conducted 24HR to both minimize error and maximize completeness and detail. Both interviewer and respondent biases are possible. Respondents may be prone to reactivity bias (especially in the case of an anticipated 24HR) as well as recall bias. A major source of error in data gathered from 24HRs comes from inaccurate estimation of quantities consumed. The use of food models or pictures of food portions may assist in minimizing this bias, which is usually in the direction of underestimation (232). New technology, including photographs of meals, may also assist in minimizing this bias, as noted below. Further limitations are likely in the case of a caregiver or parent respondent, a respondent with a disability, or one who is not fluent in the language in which the interview is conducted.

The major limitation of the 24HR is that a single day does not capture usual intake, as any person may, on any single day, have an intake much lower or higher than their own average. The level of intra- and interindividual variation differs by cultural food pattern. Foods consumed in regular patterns, such as milk, coffee, or alcohol, show lower intra- and interpersonal variation than foods rarely consumed, such as organ meats. These food-level differences carry over to nutrients: nutrients supplied largely by consistently consumed foods, such as calcium, may require fewer days of assessment than nutrients concentrated in diverse, rarely consumed foods, such as vitamin A (222). This leads to serious misclassification of individuals for intake of most nutrients (233, 234), generally leading to attenuation of associations with health outcomes—which is important for precision nutrition.
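
As a rough illustration of how this within-person variation translates into the number of recall days required, the sketch below applies the classic approximation n = (Z × CVw / D0)²; the within-person CVs used are placeholders for illustration only, not values taken from this report or its references.

```python
# Illustrative sketch of the classic formula for the number of days of intake
# data needed to estimate an individual's usual intake within +/- D0 percent:
# n = (Z * CVw / D0)^2, where CVw is the within-person CV of daily intake.
# The CVw values below are placeholders, not measured values.
from math import ceil

def days_needed(cv_within_pct: float, d0_pct: float = 20.0, z: float = 1.96) -> int:
    """Days needed for the individual mean to lie within +/- d0_pct of usual
    intake with ~95% confidence."""
    return ceil((z * cv_within_pct / d0_pct) ** 2)

print(days_needed(cv_within_pct=30))    # a nutrient from regularly eaten foods -> ~9 d
print(days_needed(cv_within_pct=100))   # a nutrient from rarely eaten foods -> ~97 d
```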

Statistical techniques have been developed to “deattenuate” correlations and regression coefficients by correcting for the ratio of intraindividual to interindividual variation when at least 2 nonconsecutive 24HRs are available for a subset of the population (235). However, misclassification remains problematic unless numerous recalls are conducted over several seasons and averaged.
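
A minimal sketch of this deattenuation correction follows; the variance ratio, observed correlation, and number of recalls are entirely hypothetical and would in practice be estimated from the replicate 24HRs.

```python
# Minimal sketch of the standard deattenuation correction for a correlation
# between intake (mean of k nonconsecutive 24HRs) and an outcome:
# r_true ~= r_observed * sqrt(1 + lambda / k),
# where lambda is the within- to between-person variance ratio.
# All numbers below are hypothetical.
from math import sqrt

def deattenuate(r_obs: float, var_within: float, var_between: float, k: int) -> float:
    lam = var_within / var_between
    return r_obs * sqrt(1 + lam / k)

# e.g., observed r = 0.25, within-person variance twice the between-person variance, 2 recalls
print(round(deattenuate(0.25, var_within=2.0, var_between=1.0, k=2), 2))  # ~0.35
```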

A new development in the field of recalls is to limit the recall period to less than 24 h—for example, to 2 or 4 h—but to include more days, so that data covering full days are still collected. This is now feasible using smartphone push notifications (236). This may limit recall bias and possibly enhance participant compliance. Validation of this approach is currently under way.

FFQs

Most longitudinal cohort studies focus on long-term usual dietary exposure as the primary measure of risk associated with disease endpoints. Because of the 24HR's limitations in assessing usual intake, FFQs are generally used in large epidemiologic studies. FFQs are attractive because, in a single administration, they can provide estimated intakes for the past year. The earliest and most well-known FFQs in the United States are the Willett FFQ, developed and validated for the large Nurses’ Health Study (237), and the Block FFQ, developed at the NCI using national-level data (238). These questionnaires were originally developed with responses that could be optically scanned and then analyzed with algorithms that translate the food intake to nutrients. Since then, numerous additional FFQs have been developed for specific studies with cultural tailoring, including the Hawaii-California Multi-Ethnic Cohort Study (MECS) (239) and FFQs for other specific populations (240, 241).

There are notable differences in the development and presentation of various FFQs. The Willett FFQ was developed to rank people according to selected nutrient intakes with expert input from research dietitians. During testing, the decision was made to provide frequency measures with an assumed standard portion size noted because, in this population of US nurses, the addition of portion size did not add to the ranking of nutrient intakes enough to merit the extra burden, particularly after adjustment for total energy intake (242). The Block FFQ was developed using NHANES national data from 24HRs. Major food sources of nutrients were ranked to ensure that the most important foods were included on the food list. The questionnaire included small, medium, and large portion sizes to obtain more precise estimates of quantities consumed. The MECS was developed using 3-d diet records from people in each of 5 targeted ethnic groups to ensure that foods for each group were equally represented. Recognizing the importance of portion size due to important differences in eating behavior across the 5 ethnic groups, MECS takes quantification a step further by providing photographs of different portion sizes (243–245).

Most recently, the NCI and others have moved their FFQs online. This has allowed skip patterns to be introduced to reduce participant burden when certain foods or categories of foods are not reported. The newest version of the NCI FFQ, called the Diet History Questionnaire (DHQ) III (246), is freely available for use with adults aged 19 y or older. This version, consisting of 135 food and 26 supplement items, is based on 24HR data from the NHANES 2007–2014 and is available with and without portion size. The DHQ III can be used to request intake data for the past year or month. The DHQ, unlike other FFQs, is based on cognitive interviewing with the grid format abandoned in favor of individual questions for each item (247).

The FFQ has the advantage of requiring only a single administration to obtain usual intake patterns, making it the method of choice for most longitudinal studies. Repeated measures over time can increase the precision of long-term exposure. If used appropriately for the population for whom it has been designed, FFQs have been shown to rank usual intake of nutrients well and are, therefore, useful in identifying intakes for use in precision nutrition and prevention of disease outcomes (142).

Limitations of FFQs include the fact that they are population and time specific. Hence, new FFQs may need to be developed with differing populations and over time, as food patterns change. The primary limitation is that reporting is dependent on the food list. For this reason, many studies develop their own FFQs to suit the targeted population, making cross-study comparisons difficult. It is critical that FFQs include the major food sources of nutrients for all important cultural subgroups in studies to accurately assess the role of nutrition in health disparities. For compliant completion, the length of the questionnaire must be limited. This means that groups of related foods are reported together, which limits the ability to obtain true variation in some nutrients. For certain frequently consumed foods, follow-up questions can help in specifying more detail, such as type of milk, type of oils used in cooking, and so on, but these again must be limited to allow compliant completion. The most recent online questionnaires have the potential for improvement in this area with the use of strategic skip patterns to gather more detail on the foods commonly consumed by each individual, but much more work is needed in this area (248). Limitations of these online questionnaires include the likelihood of differential responses among those with limited education or English-language use, or with food patterns not represented by the food list. In the Netherlands, an automated system has been developed for this purpose, with an FFQ and corresponding script for calculating nutrient intakes. Input is based on the national food-consumption survey, and foods included are selected based on their contribution to absolute nutrient intakes and their contribution to the variation in intake in the target population (237). Importantly, FFQs must be used only for the population for which they were developed, and participants with differing dietary patterns are likely to be seriously misclassified without specific consideration for their inclusion (234).

With an increasingly diverse population, the study-specific nature of existing FFQs is a growing concern. The use of a general questionnaire with diverse subgroups not explicitly considered in its design will lead to bias and underestimation of intake in these subgroups, not only due to the lack of commonly consumed foods but also, in some cases, due to large differences in standard portion sizes and in food preparation (240). It is also important to note that the food supply and dietary patterns change over time, so a fixed food list may miss these changes in a longitudinal study. Further, most large-cohort studies ask participants to complete the FFQ themselves, usually online. However, to ensure inclusion of low-literate populations less familiar with this type of form, it is often necessary for an interviewer to administer the questionnaire. If this is not done, noncompletion bias will likely be differential by subgroup.

Another limitation of the FFQ is that, given the approximate nature of the foods listed and, particularly, the lack of detail on recipes and portion sizes, the data are considered semi-quantitative. This means that total energy intake is usually not well measured. For this reason, FFQs are not recommended for energy intake assessment. Rather, adjusting for energy will tend to improve the ranking of the other nutrients, as shown in validation studies against multiple records or biomarkers (237, 249–251).
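
The energy adjustment referred to here is commonly implemented with the residual method; the sketch below, using simulated placeholder data, shows the general idea of regressing a nutrient on total energy and carrying the residuals forward as the energy-adjusted intake. It is an illustration of the technique, not the procedure of any specific study cited here.

```python
# Hedged sketch of the residual method for energy adjustment with FFQ-type data.
# Data values are simulated placeholders.
import numpy as np

rng = np.random.default_rng(1)
energy = rng.normal(2200, 450, 500)               # kcal/d
protein = 0.04 * energy + rng.normal(0, 12, 500)  # g/d, partly driven by total energy

# Fit protein ~ energy by least squares and take residuals
slope, intercept = np.polyfit(energy, protein, 1)
residual = protein - (intercept + slope * energy)

# Energy-adjusted intake: residual plus expected intake at the mean energy level
energy_adjusted_protein = residual + (intercept + slope * energy.mean())
print(round(energy_adjusted_protein.mean(), 1))
```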

In some cases, such as when total dietary intake is not considered necessary, dietary screeners are used. These screeners, which are essentially short FFQs, can be focused on dietary quality (252), specific foods such as the NCI fruit and vegetable screener (253, 254), sugar-sweetened beverages (255), or specific nutrients such as calcium (256). These screeners help meet specific study needs but are limited by their inability to control for total energy or other nutrient intakes.

Further improvements are needed to better quantify details of individual intake, which is necessary to truly link exposures to outcomes, given other specific characteristics. As indicated earlier, statistical methods to adjust for measurement error have been developed (257). In the case of FFQs, validation studies on a subsample using multiple-day food records or 24HRs are often used, although these are not optimal because of correlation of measurement errors; the use of biomarkers is preferred (233).

With more investment into improving technology that can increase the details collected and improve validation studies, the FFQ will likely be useful for precision nutrition because it is currently the only cost-effective method of capturing long-term usual intake.

Biomarkers of intake, status, and metabolomics

Probably the most useful adjunct to dietary reporting is the concomitant use of biomarkers. The topic of biomarkers is discussed in greater detail in the section on Nutritional Status, but the main issues are briefly summarized here. The potential for biomarkers of intake for whole foods and dietary patterns and relevant new developments in metabolomics are also discussed.

In the field of nutrition, a biomarker is generally defined as “a biochemical indicator of dietary intake/nutritional status (recent or long term), an index of nutrition metabolism, or a marker of the biological consequences of dietary intake” (258, 259). In dietary assessment specifically, biomarkers are further classified based on their association with intake (258): 1) recovery biomarkers, based on the recovery of certain food compounds directly related to intake and not subject to substantial interindividual differences [i.e., doubly labeled water (DLW) and urinary nitrogen, potassium, and sodium]; 2) predictive biomarkers, which are sensitive, time dependent, and show a dose–response relation with intake, but whose overall recovery is lower than that of recovery biomarkers (such as urinary sucrose and fructose); 3) concentration biomarkers, whose concentration correlates with intake of corresponding foods or nutrients but often less strongly (<0.6) than expected for recovery biomarkers (>0.8) (such as serum vitamins or serum lipids); and 4) replacement biomarkers, closely related to concentration biomarkers but referring specifically to compounds for which information in food-composition databases is unsatisfactory or unavailable (such as urinary aflatoxin and epicatechin, or serum phytoestrogens).

Although well established for only a limited number of nutrients, recovery biomarkers and concentration biomarkers can be used to validate dietary methods and may be used to calibrate intake estimates. It may be possible to collect concentration biomarkers on a subset of study participants and extrapolate to a larger study, at least partially circumventing the issue of correlated errors in validation studies using self-reports (260).
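One common way to use such a biomarker subsample is regression calibration; the sketch below, using simulated data and hypothetical units, illustrates the general approach of building a calibration equation in the subsample and applying it to self-reports in the full cohort. It is a generic illustration, not the specific method of any study cited here.

```python
# Hedged sketch of regression calibration using a biomarker subsample.
# All data are simulated placeholders.
import numpy as np

rng = np.random.default_rng(2)

# Calibration subsample: both a self-reported intake and a concentration biomarker
self_report_sub = rng.normal(100, 25, 200)                        # hypothetical units
biomarker_sub = 30 + 0.6 * self_report_sub + rng.normal(0, 10, 200)

# Calibration equation from the subsample (simple least squares)
slope, intercept = np.polyfit(self_report_sub, biomarker_sub, 1)

# Full cohort: only self-report is available; predict intake on the biomarker scale
self_report_full = rng.normal(100, 25, 5000)
calibrated_intake = intercept + slope * self_report_full
print(round(slope, 2), round(intercept, 1), round(calibrated_intake.mean(), 1))
```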

Recovery biomarkers have been important in understanding the validity of dietary assessment for certain nutrients (232). Urinary nitrogen and potassium biomarkers are frequently used to investigate the validity of a self-report measurement such as an FFQ, and results show that intake of protein and potassium, as assessed by FFQ, is generally valid after adjustment for total energy intake, although underestimation of total protein intake amounted to 10–29% and of potassium to 5–6% (261, 262).
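
As an illustration of how such a recovery-biomarker comparison is typically made, the sketch below converts 24-h urinary nitrogen to an estimated protein intake (nitrogen × 6.25, with a commonly used allowance of about 2 g/d for extrarenal nitrogen losses) and compares it with an FFQ report; all numbers are hypothetical.

```python
# Hedged sketch: using 24-h urinary nitrogen as a recovery biomarker to check
# FFQ-reported protein intake. The 6.25 factor converts nitrogen to protein;
# the 2 g/d extrarenal-loss allowance is one commonly used approximation.
def protein_from_urinary_n(urinary_n_g_per_day: float, extrarenal_n_g: float = 2.0) -> float:
    return 6.25 * (urinary_n_g_per_day + extrarenal_n_g)

reported_protein_ffq = 72.0                       # g/d from an FFQ (illustrative value)
biomarker_protein = protein_from_urinary_n(12.0)  # g/d estimated from urinary nitrogen
underestimation_pct = 100 * (biomarker_protein - reported_protein_ffq) / biomarker_protein
print(round(biomarker_protein, 1), round(underestimation_pct, 1))
```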

Assessment of sodium intake is complicated, as intake is not only determined by sodium in foods, as listed in the food-composition tables, but also by use in cooking, at the table, and hidden in processed foods. Urinary sodium, therefore, appears to be a good marker and is preferred over assessment using self-reported intake, but multiple 24-h urine collections are needed to average day-to-day variability (263).

DLW has been especially helpful as a recovery biomarker method. It is more elaborately described in the section on Nutritional Status. It has been used to validate total energy intake as assessed by 24HRs or FFQs in various studies (264–266). Findings show that self-reported intake generally underestimates energy intake, and that this effect is more pronounced in those with higher BMI (267).

Recently, an updated classification for exposure (intake) biomarkers has been proposed, which is based on intended use: food compound intake biomarkers [nutrient intake biomarkers, nonnutrient intake biomarkers, food or food component intake biomarkers, and dietary pattern biomarkers (258)]. This aligns with the development of food-based dietary guidelines. Relevant nutrients for thorough assessment of the validity of nutrient intake biomarkers are the 6 micronutrients studied and reported by the Biomarkers of Nutrition for Development (BOND) initiative—iron (268), zinc (269), iodine (270), vitamin A (271), folate (272), and vitamin B-12 (273)—which are all important from a global public health perspective (274).

For iron status, serum ferritin is the usual choice for assessment, but measurements of plasma soluble transferrin receptor and hepcidin are also used. In general, iron status biomarkers are preferred over iron-intake assessment because of large differences in bioavailability across food sources. Algorithms to weigh heme versus nonheme iron from the diet are available (275, 276). The BOND Zinc Expert Panel recommends 3 measurements for estimating zinc status: 1) dietary zinc intake together with intake of phytate, to assess the amount of zinc available for absorption; 2) plasma zinc concentration; and 3) height-for-age of growing infants and children (269). Urinary iodine concentration is a reliable biomarker of recent iodine intake in populations across the full range of iodine intakes. For individuals, diurnal and day-to-day variation needs to be considered; capturing intraindividual variation requires at least 10 repeated 24-h urine collections. Intake assessments based on 24HRs or FFQs can be reliable if iodized salt is included but, as for sodium, the use of discretionary salt is difficult to quantify.

For vitamin D, dietary intake accounts for only part of the body nutrient stores because sunlight can be the more important source, depending on the geographic location (cutaneous synthesis is more efficient near the equator), skin tone (darker skin synthesizes less vitamin D than lighter skin for the same sun exposure) (277), season (278, 279), and age (cutaneous synthesis declines with age) (280). Hence, serum concentration of 25-hydroxyvitamin D is used for monitoring individuals and populations rather than intake assessment alone (281, 282).

When using food or food component intake biomarkers instead of nutrients, it is important to note that only serum carotenoids for fruit and vegetables (283) and n–3 fatty acids for oily fish (284) have been well validated. However, developments in this field are rapid. Using metabolomic techniques, it is now possible to measure thousands of metabolites at once, with platforms such as NMR spectroscopy, LC-MS, and GC-MS (285, 286). Samples can be derived from various tissues and biofluids, such as plasma, serum, erythrocytes and leukocytes, urine, saliva, feces, cerebrospinal fluid, and hair (287, 288). The food metabolome is defined as the part of the human metabolome directly derived from the digestion and biotransformation of foods and their constituents (289). Metabolomics not only allows identification of numerous biomarkers at once but also provides the opportunity to create combinations of biomarkers to assess past food intake. Combinations of markers have been explored to assess fruit and vegetable intake, for example (290). In a recent example, a panel of 3 biomarkers—proline betaine, hippurate, and xylose—was identified in a fruit intervention study using NMR to analyze urinary samples and validated as a combined biomarker in an observational study (291).

Similarly, several studies have examined the association between biomarkers and dietary patterns (292–294). For example, in the Jackson Heart Study (294), 327 metabolites were analyzed in fasting plasma, of which 14 were significantly (false discovery rate <0.05) associated with a meat and fast-food dietary pattern in the discovery sample. Nine of the 14 metabolites were associated with the meat and fast-food dietary pattern in a replication sample: indole-3-propanoic acid, C24:0 LPC, N-methyl proline, and proline betaine were inversely associated with the meat and fast-food dietary pattern; C34:2 phosphatidylethanolamine (PE) plasmalogen, C36:5 PE plasmalogen, C38:5 PE plasmalogen, cotinine, and hydroxyproline were positively associated with the meat and fast-food dietary pattern. When validated further, these could be useful as biomarkers of a Southern US dietary pattern.

With appropriate sampling and consideration of day-to-day variation, food metabolomics holds promise for use in validation and calibration studies and can be expected to support and enhance dietary assessment. However, metabolomics remains new, and often reveals patterns of association that are difficult to interpret. The extent to which these panels of food metabolome markers can be used for dietary assessment needs more clarification. To aim for a full library of metabolites covering all food intakes is probably unrealistic, given the cost, time, and effort needed for biomarker discovery and validation. In addition, it should be noted that many components and metabolites have not yet been identified. On the other hand, technology is rapidly developing, and cheaper methods based on target markers could be developed.

New technologies

The main goal of new approaches is to assess intake in a more objective way and to reduce reliance on self-report. Several new technologies are currently being explored with the potential to improve accuracy of dietary assessment. For diet records, 24HRs, and FFQs, key issues are food identification (what food or drink exactly did you use or are you habitually using?) and portion-size estimation (how much of this food or beverage are you generally using?). New statistical techniques are being explored for combining data from multiple methods to improve estimation. For example, the NHANES briefly used a propensity questionnaire (a list of infrequently used foods, such as liver) to supplement 24HR data (295). Combining various methods has potential, as differing approaches may balance the weaknesses and strengths of others and, from this perspective, the inclusion of biomarkers is also useful. Ideally, new or improved methods will be less time-consuming, less prone to socially desirable answers, rely less on memory, minimize misreporting, and maximize retention and completeness.

Photographic images

Using images may enhance the accuracy of self-reported dietary intake, particularly for assisting in portion-size estimation. In a review published in 2018 (296), 42 new tools were identified, of which 33% used digital images to help identify foods. Two approaches can be discerned: the image-assisted and the image-based approach (297). Image-assisted approaches can be useful as part of a retrospective method (i.e., 24HR or FFQ). The participant captures all food and drinks consumed through pictures, which subsequently assist the reporting of intake and associated portion sizes (297).

Image-based approaches can also be used as part of prospective methods such as food records. The participant records intake by taking pictures before and after all food and drink consumption. AI methods can then be used to automatically identify foods and estimate portion size (298). The use of images may increase accuracy of current food and nutrient intake estimates, although self-report remains essential to determine the content of the food photographed. Currently, the burden on both participants and researchers remains relatively high when images are used. Still, this method is promising for study populations such as children, who have limited skills in literacy, writing, and food recognition (299).

Conversational agents

Some automated systems are now using a conversational agent, or chatbot, to help collect information. Rather than a live interviewer, the chatbot can ask the user questions to complete missing data or assist with food recognition. This approach may be particularly useful for individuals with functional impairment (e.g., visual or motor impairment) or with limited health literacy (300, 301).

Sensors

Detecting food intake using sensor-based technology holds promise for assisting in objective measurements (299). So far, sensor use varies in technique and concept. Acoustic sensors can be used to detect food intake via sounds of chewing and swallowing (302), inertial sensors recognize wrist/arm motions (303), physiological sensors use skeletal muscle activity or skull vibrations (304), and piezoelectric sensors can detect changes in electric charge in response to chewing and swallowing (305). When Amft and colleagues (306) tested sound-based recognition for apple, potato chips, and lettuce, they found a 94% average accuracy of food classification based on chewing sequences, with the mean weight prediction error lowest for apples (19.4%) and largest for lettuce (31%). Sensing methods have not yet been integrated into dietary assessment; however, they do hold promise for future use.

Mobile applications

Mobile applications offer new opportunities for improving dietary assessment (307). More than 80% of the US population currently uses a smartphone (308), and use is increasing globally. Several programs have been developed for personal use, such as the Fitbit diet app (309), but these do not yet include sufficient detail and quality for research application. Access to this technology also remains limited in low-literate, low-income, and remote communities.

Pictures and conversational agents, such as chatbots, and other mobile applications have the potential to assist in improving the precision of food identification. Another new technique is using the smartphone's camera as a barcode scanner to identify foods used by the interviewee (310). Scanning the Universal Product Code (UPC) printed on food packages can be used as input in digital dietary records/food diaries to derive information about nutrient and energy contents of foods consumed (311, 312). This method holds promise for enhancing detail in variation of quality within food groups, particularly if used to identify staple, frequently used foods (e.g., type of breakfast cereal, type of cooking oil, bread, etc.). Due to its efficiency, barcode scanning could reduce the burden of food recording and coding (5, 6). Participants have evaluated the barcode scanner method as comprehensive, easy to use, and nonintrusive (7).
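
A minimal sketch of the barcode-lookup idea follows; the UPC, food item, and database fields are hypothetical and stand in for a branded-food composition database that a real application would query.

```python
# A minimal sketch (hypothetical database and fields) of mapping a scanned UPC
# to nutrient data in a digital food record.
from dataclasses import dataclass

@dataclass
class FoodItem:
    description: str
    kcal_per_100g: float
    protein_g_per_100g: float

# Hypothetical lookup table keyed by UPC; in practice this would be a branded-food database.
UPC_DATABASE = {
    "012345678905": FoodItem("Whole-grain breakfast cereal", 380, 9.0),
}

def log_scanned_item(upc: str, grams_consumed: float) -> dict:
    item = UPC_DATABASE.get(upc)
    if item is None:
        return {"upc": upc, "status": "not found; prompt user for manual entry"}
    factor = grams_consumed / 100
    return {
        "description": item.description,
        "kcal": item.kcal_per_100g * factor,
        "protein_g": item.protein_g_per_100g * factor,
    }

print(log_scanned_item("012345678905", grams_consumed=45))
```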

Another handheld methodology is based on near-infrared spectroscopy, an analytical tool that measures how light is emitted, absorbed, or scattered by a sample. These spectrometers, such as the handheld SCiO Near Infrared Micro Spectrometer by Consumer Physics, can be used to detect certain compounds in a food (313) and are currently used for inspecting food quality and safety. The role of the food matrix, the measurement range, and validity are currently being investigated (287). Another new tool for food identification is the hyperspectral camera, which collects and processes information from across the electromagnetic spectrum (313, 314). To date, however, these cameras have not been adequately validated for research application and use with mobile phones.

Despite considerable work toward integrating new technologies into dietary assessment, their limitations continue to exceed their benefits (315). The use of images, for example, has become more accurate for estimating portion size but remains, and will likely continue to remain, unable to clearly identify specific foods and ingredients without concomitant self-report. Most imaging methods still rely on investigator screening and time-consuming integration with self-report, making them infeasible for large studies (316). Compliance is a major issue: in addition to participants forgetting to use the camera consistently, changes in intake occur that are similar to those seen with a diet record (305). Sensing arm movements and chewing actions has been shown to correlate with energy and types of food intakes, but not at a level that improves on self-report (315). Although much progress has been made, more research and development are needed to make most of these new technologies useful and cost-effective.

Key considerations in data interpretation

For precision nutrition, which requires estimation of usual intake for individuals or at least key subgroups of the population, the FFQ, multiple 24HR, or some combination remain the best methods to assess dietary intake of a wide list of foods and nutrients. The addition of biomarkers of intake greatly enhances interpretation of these methods for some nutrients, but not others, and is likely to continue to advance our ability to quantify dietary intake in the future. All existing approaches currently have limitations, and it is important that more support and effort go into improving these and new approaches to better estimate usual dietary intake at the individual level.

Although FFQs are the most cost-effective and direct approach to assessing usual intake, existing forms are designed for specific populations and are often misused when diverse ethnicities are included in studies, leading to the potential for substantial bias (317). As research attempts to be increasingly inclusive, more effort is needed to design FFQs that are more generalizable. This may be possible with continuing advancements in computing and AI. At the same time, there is an important need to update nutrient databases (318) to include precise nutrient data for a greater variety of foods, as diet continues to change. Efforts to improve food databases globally include the International Network of Food Data Systems (319). Additionally, future work on the FFQ should allow better assessment of dietary quality within food groups. Whenever a new population is assessed, it is best to begin with a 24HR for inclusion of detail on foods, recipes, portion sizes, and dietary patterns that may differ from prior assumptions. Multiple 24HRs, when acceptable, continue to offer the most detail, including the ability to examine eating occasions and time of day.

Although statistical advances, including calibration equations, continue to be helpful, they are generally population specific with the potential for differential error and do not replace efforts to improve direct individual assessment.

In closing

In summary, much work is needed to advance our ability to capture individual usual dietary intake. It will always remain a moving target, as the food supply is constantly changing, demographics and cultural backgrounds within populations evolve, and individual behavior changes over time and throughout the life course. New computational approaches that combine data show exciting promise. To truly assess dietary intake with validity and precision requires dedication and commitment to allocate the resources to do it well. Improvements will require combinations of self-report, biomarkers, and computational advances. It is worth the effort as there is no doubt that dietary behavior is of central importance in the future of precision nutrition and health.

Genetics and Epigenetics

Introduction

People are metabolically heterogeneous, meaning they differ in the efficiency with which metabolic pathways process nutrients. This heterogeneity is, in part, due to genetic and epigenetic variation between people. People have millions of variations in their genetic code, with any one person having about 50,000 SNPs with functional consequences (320, 321). They also have gene copy number variations (322, 323) and tandem repeat (stretches of DNA that are highly variable in length) variations (324). Varying epigenetic marks on genes and chromosomes further modify gene expression and function (325). Because genetic code variations are inherited from ancient ancestors, they differ among people depending on their heritage (320, 321). Epigenetic variations are derived not only from inheritance but also from differences in life exposures and experiences, including dietary differences (325). Genetic and epigenetic variants can change the expression and function of enzymes, transporters, or receptors and their ligands (discussed later in the “Transcript methods” section), thereby altering nutritional metabolic and signaling pathways. Nutrition and energy metabolism are critical for survival and are, therefore, potent drivers of evolution and genetic modification (326). Some epigenetic mechanisms may have evolved so that humans could more rapidly sense and respond to changes in nutrient availability as diets change (e.g., the shift from a hunter-gatherer to an agricultural lifestyle) (327–329).

Examples of genetic and epigenetic variations that result in metabolic and nutritional heterogeneity abound. For example, the concentration of the vitamin D precursor 7-dehydrocholesterol is increased in the skin of people with a low-function variant of delta-7-sterol reductase (DHCR7 rs7944926), which enhances vitamin D synthesis (330). A common SNP (rs12325817-C) in phosphatidylethanolamine-N-methyltransferase (PEMT), the estrogen-responsive gene that enables de novo biosynthesis of phosphatidylcholine, reduces PEMT’s inducibility by estrogen. Women with this SNP have an increased dietary choline requirement and are 25 times more likely to develop liver or muscle damage when eating a low-choline diet (331, 332). Increasing EPA in the diet has a different effect on HDL cholesterol depending on whether people have a specific SNP in the gene that encodes a cholesterol efflux transporter [ATP-binding cassette subfamily A member 1 (ABCA1) rs2246293]. The expression of ABCA1 is decreased by DNA methylation at the site of ABCA1 rs2246293. In people with the ABCA1 rs2246293-GG genotype, EPA can increase DNA methylation of this site, which suppresses expression of ABCA1 and, thereby, lowers HDL concentration. People with the ABCA1 rs2246293-CC genotype have decreased methylation of the gene and higher plasma HDL cholesterol than people with the GG genotype (333). Although such individual SNP effects are known, we do not yet fully understand how to integrate the effects of multiple gene variants that interact across many pathways in metabolism.

Epigenetic differences as a source of metabolic heterogeneity

Although all our tissues have the same genetic code, they differ in phenotype and function because gene expression is regulated by epigenetic mechanisms mediated by noncoding RNAs and epigenetic marks on genes or histone tails. Diet- and environmentally induced epigenetic changes are now proposed to be responsible for a significant portion of normal and disease-related phenotypic differences that cannot be explained by differences in DNA sequence (325). Nutrients interact with epigenetic regulatory enzymes and induce or repress their activity (325).

One of the best understood epigenetic marks is DNA methylation, in which 5-methylcytosine forms primarily at CpG dinucleotides (334). DNA methylation usually suppresses gene expression, but exceptions do occur. S-adenosylmethionine, the methyl donor for DNA methyltransferases, is formed by pathways that metabolize a number of nutrients [methionine, 5-methyltetrahydrofolate, betaine (from choline), vitamin B-12, and vitamin B-6]. For this reason, DNA methylation is influenced by diet (325, 335–340). Diets high in methyl-group donors increase DNA methylation of specific genes (341) and can result in a permanent change in phenotype [e.g., coat color in the Agouti mouse (342) or twisted tails in Axin fused mice (336, 343)]. 5-Methylcytosine can be oxidized to 5-hydroxymethylcytosine and further oxidized derivatives. These DNA modifications are stable but less well understood (344).

Histone proteins H2A, H2B, H3, and H4 make up the nucleosome around which DNA is coiled (345), creating the chromatin structure (open/active vs. closed/inactive). These structural changes are modified by post-translational histone marks added to the amino-terminal tails of these proteins. Many of these histone modifications are sensitive to dietary intake (325). For example, histone methylation is modulated by intake of dietary methyl-donors (339, 346, 347), and histone demethylases are dependent on α-ketoglutarate and iron derived from diet and from nutrient metabolism (348–350). Microbial metabolism of the diet (such as the process that forms butyrate, which inhibits histone deacetylases) can also be important (351).

Diet can also modulate expression of noncoding RNAs that regulate gene expression and/or post-transcriptional activity (352, 353). MicroRNAs bind to messenger RNA (mRNA) that contains a targeting sequence for the microRNA and mark them for cleavage, degradation, or translational repression, depending on the gene target (354). There are specific microRNAs that regulate almost all gene products involved in metabolism. The genes for approximately 70% of microRNAs are, in turn, regulated by DNA methylation or histone modifications (325).

Genetics methods

An obvious component of putting precision nutrition into practice will be genetic testing. The genetic testing field is currently transitioning from gene-sequencing technology to chip-based analysis. Gene chips, or microarrays, use short sequences of complementary DNA (oligos) as hooks that bind to sequences of interest. Millions of oligos can be attached to a chip, which can be custom designed. Today, many commercially available chips detect known common polymorphisms that derive from a diverse group of ancestries and can detect many, but not all, variants relevant to nutritional heterogeneity. However, many chip makers offer custom versions that add additional oligos.

Gene expression microarray analysis

Gene chips are designed to profile expression levels of thousands of genes simultaneously. These chips typically immobilize 25-bp oligonucleotides positionally in arrays, in sets of probes that either perfectly match a target gene sequence or carry 1-bp mismatches to account for nonspecific binding. Thus, these arrays are dependent on the set of sequenced and annotated genes in a genome. The targeted genes and specific probe designs, which vary between manufacturers and products, typically include as many as 20 different probes for different regions of 1 target gene. The frame size per probe on the microarray has been reduced to squares of 5 μm or less, so that as many as 30,000 genes can be targeted (355, 356). There are technical considerations when using gene chips, and algorithms are used to correct for nonspecific binding and to determine average expression across the multiple probes used for each targeted gene. Chip genotyping has been slow to respond to the latest discoveries in genetics because of the cost and time required to update microarray chips with newly discovered genetic targets. It is difficult to select all the important candidate gene variants with functional effects on nutritional metabolism for a microarray because they are not all known. In addition, not all selected variants will work in a microarray assay, as there are limitations associated with the hybridization efficiency of the array. Chip genotyping is also not well suited for discovery because the genetic targets on a microarray are preselected.
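
As a simplified illustration of the probe-level processing described above (a generic sketch, not any specific vendor's algorithm), the code below corrects perfect-match intensities using their paired mismatch probes and averages across the probes targeting one gene; all intensities are illustrative.

```python
# Simplified sketch of summarizing a probe set: subtract mismatch (MM) signal
# from perfect-match (PM) signal to approximate nonspecific binding, then
# average across the probes targeting one gene. Not a vendor algorithm.
import numpy as np

def summarize_probe_set(pm: np.ndarray, mm: np.ndarray) -> float:
    """pm, mm: intensities for the probe pairs of one target gene."""
    corrected = np.clip(pm - mm, a_min=1.0, a_max=None)  # floor to avoid log of <= 0
    return float(np.mean(np.log2(corrected)))            # average expression on log2 scale

pm = np.array([850., 920., 760., 1010., 880.])  # illustrative intensities
mm = np.array([120., 150., 140., 130., 160.])
print(round(summarize_probe_set(pm, mm), 2))
```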

Whole-genome sequencing

While gene chips need to be constructed based on publicly available data on functional gene variants, whole-genome sequencing is not limited by this, allowing for high discovery potential. Because gene sequencing provides information about the entire genome, it can be used not only to identify known gene variants but also to discover previously unknown gene variants. Overall, this method has much greater flexibility than microarrays, as the hardware does not need to be redesigned to study each new gene variant. Today, gene sequencing is 2 to 3 times more expensive than chip technology but, as prices fall, it will likely make chip technology obsolete. Chip technology is currently used in almost all genetic testing that is available directly to consumers.

Transcript methods

Downstream of gene expression, or transcription, RNA is formed. Measuring RNA can be helpful for identifying functional effects of genetic and epigenetic variation. An important limitation of measuring RNA is that it is much less stable than DNA. RNases—enzymes that degrade RNA—are ubiquitous and, thus, the laboratory equipment and reagents for RNA isolation and analysis are typically separate from those used for other analyses. Samples are homogenized in buffer that quickly denatures RNase, and then total RNA is rapidly separated from DNA and proteins. Commercial buffers (e.g., TRIzol reagent made by Thermo Fisher Scientific) allow rapid isolation, including obtaining total RNA, protein, and DNA fractions from the same small sample.

Northern/slot/dot blotting

Specific RNA sequences are detected by blotting and hybridization analysis using techniques very similar to those originally developed for DNA. The original DNA blotting procedure was named Southern blotting after its developer, Edwin Southern. The RNA blotting procedure was analogously named Northern blotting, in part because RNA is drawn upward by capillary action to bind to a membrane (357). The following methods are used to study mRNA and noncoding RNAs, including microRNAs and small interfering RNAs. The core steps in the 5 major methods in Figure 2 can be applied for RNA analysis by in situ hybridization or in single cells or subcellular organelles, by measurement of mRNAs being actively translated on polysomes, or by analysis of nascent (newly formed) mRNAs (356). Techniques are also available to study RNA structure or RNA–protein interactions (356), such as in RNA interference or RNA silencing, in which small interfering RNAs target specific mRNAs for degradation (358).

FIGURE 2. Transcriptomics methods. Shown are schematic comparisons of the steps used in 5 methods starting with total RNA isolated from a sample. Shown in red are the steps where sequence-specific probes or primers are added to target individual transcripts. In each scheme, the final box and graphic indicate the method of detection. Subsequent computational analysis is needed to determine differential expression.

For Northern analysis, RNA is separated by size using gel electrophoresis. This must take place under denaturing conditions to prevent single-stranded RNA from forming secondary structures that do not separate reliably (Figure 2). The separated RNA is then transferred to a membrane and hybridized with a labeled probe. After detection, the bound labeled probe can be stripped from the membrane, and the membrane re-analyzed with a second probe specific for a control RNA or for an additional target RNA. This allows the level of the target mRNA to be expressed as a ratio to the level of a control mRNA (357).

Ribonuclease protection/gel retardation/electromobility shift analysis

Ribonuclease protection uses labeled single-stranded antisense RNA probes that hybridize to the target mRNA to form RNA duplexes, which are then treated with RNases that degrade only single-stranded RNA. The resulting labeled RNA duplexes are subsequently separated by electrophoresis and then detected just as for Northern blotting (Figure 2). Simultaneous analysis with multiple labeled probes of different lengths, resolved upon electrophoresis and blotting, allows detection and quantification of multiple transcripts and a control RNA in the same blot.

RT-PCR

The innovation utilized for RT-PCR is thermal cycling of the polymerase reaction to generate a million (20 cycles) to a million-million (40 cycles) copies, allowing quantitative detection of small initial samples. For RT-PCR, total RNA from a sample is reverse-transcribed by RNA-dependent DNA polymerase, generating a complementary DNA library containing a single copy of each mRNA in the original sample (Figure 2). Pairs of ∼20-base-long DNA primers are designed for ∼150–200 base-long sequences of the target mRNA. As practiced today, individual reactions are conducted in 96- to 384-well plates in thermocycler spectrophotometers that continuously monitor the number of copies at each cycle. Today's thermocyclers can complete analysis of a plate in 2 h, allowing analysis of over 100 samples in triplicate for an individual gene or over 100 genes in the same sample, or a mixture of the 2 (359).
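
Relative quantification from these cycle-by-cycle data is often summarized with the delta-delta-Ct calculation; the sketch below shows that arithmetic under the usual assumption of approximately 100% amplification efficiency (i.e., doubling per cycle), with illustrative Ct values rather than data from any study cited here.

```python
# Hedged sketch of the widely used delta-delta-Ct calculation for relative
# quantification from real-time RT-PCR cycle-threshold (Ct) values, assuming
# ~100% amplification efficiency. Ct values are illustrative.
def fold_change(ct_target_treated: float, ct_ref_treated: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to a reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# The target gene appears ~4-fold induced by the dietary treatment in this example.
print(round(fold_change(22.0, 18.0, 24.0, 18.0), 2))
```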

RNA sequencing

RNA sequencing advanced transcriptomics because it can sequence the full mRNA population in a sample, unlike earlier methods that required sequence information about the targeted genes. RNA sequencing is typically rapid and can handle multiple samples at a time; hence, it is often referred to as high-throughput sequencing. This technique depends on a second sophisticated refinement, in which each individual small nucleotide fragment from the sample is copied (so-called bridge PCR) to obtain sufficient identical fragments in a micro-cluster that can then be sequenced. The trade-off is that RNA sequencing requires sophisticated kits and equipment that are typically high cost (356).

Isolation of mRNA and preparation of double-stranded complementary DNA for RNA sequencing are the same as outlined for the earlier methods (Figure 2). Today, RNA sequencing typically yields expression values for close to 20,000 mammalian transcripts in an individual sample. Careful normalization of the expression data for each sample and correction for the false discovery rate (FDR) are needed to determine whether there are significant expression differences between diets, exposures, or treatments (356).
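
As a simple illustration of one normalization step (not a full analysis pipeline), the sketch below converts simulated raw counts to log2 counts per million so that samples with different sequencing depths become comparable before testing for differential expression.

```python
# Minimal sketch (simulated counts) of a simple RNA-sequencing normalization:
# counts per million (CPM) on a log2 scale, so that samples with different
# sequencing depths can be compared.
import numpy as np

counts = np.array([[500, 750],   # gene A: sample 1, sample 2
                   [20,   90],   # gene B
                   [0,     5]])  # gene C
library_sizes = counts.sum(axis=0)   # total reads per sample
cpm = counts / library_sizes * 1e6
log_cpm = np.log2(cpm + 1)           # +1 avoids taking the log of zero
print(np.round(log_cpm, 2))
```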

Epigenetic methods

A major consideration when measuring epigenetic modifications in people is the timing of the measurement relative to dietary and other exposures (336). Some epigenetic modifications are responsive to changes in diet and environment. Certain epigenetic marks remain modifiable throughout the lifespan, while others are modifiable only during specific windows of development. In people, infants born at different times of the year may be epigenetically different because they were exposed to different nutrients before birth. For example, seasonal variation in the methyl-group content of mothers’ diets is associated with different patterns of DNA methylation (340). Although changes in some epigenetic marks and in expression of noncoding RNAs can occur at any time during life, many are most susceptible during sensitive windows early in development and are then usually maintained by mechanisms that ensure faithful copying of DNA during cell replication.

Dietary exposures can induce epigenetic changes (335–340), and perhaps, sensitive windows provide the opportunity to retune metabolism if the infant is born into a dietary environment markedly different from that expected based on the environment in utero (325). The epigenome appears more susceptible to environmental factors during periods of extensive epigenetic reprogramming in early life, particularly during the prenatal, neonatal, and pubertal periods.

Other dietary components also can modify DNA methylation, including fat (13), protein restriction (25), and some bioactives (epigallocatechin-3-gallate, genistein, polyphenols, etc.). Therefore, information on diet composition is needed to interpret epigenetic data.

Methods for assessing epigenetic changes are relatively mature (342, 344); however, epigenetic marks and noncoding RNAs are usually tissue specific, and investigators do not always have access to the tissue of interest. Epigenetic analyses of DNA from lymphocytes are unlikely to show the same marks as DNA from the liver, muscle, brain, etc. Perhaps soon it will be possible to use circulating cell-free DNA released from such tissues into the bloodstream to assess epigenetic marks from hard-to-access tissues (360). Investigators are trying to determine whether epigenetic marks in blood-based or blood cell–based DNA have usable associations with epigenetic changes in less accessible tissues (361, 362). The development of chromatin immunoprecipitation methods, in 1984 (363), made it possible to identify histone proteins bound to DNA and enabled characterization of their role in epigenetic regulation of the genome.

Key considerations

Applying methods and interpreting data

Methods used for measuring genetic variations or epigenetic modifications are limited by the fact that the algorithms developed for interpreting relations between genetic variations or epigenetic modifications and health outcomes are no better than the data upon which they were developed. To date, datasets have been small, because they are often derived from people willing to volunteer for research studies (likely not reflective of diversity in the general population). Although dietary intake data are needed to provide context for the genetic and epigenetic data when developing an algorithm, this information is often missing or of questionable quality. These limitations likely cause greater errors than are inherent to the methods themselves.

The accuracy of genetic methods relies on the fidelity of base-pairing in hybridization. Northern blotting, RNase protection, RT-PCR, and microarray analysis depend on the sequence accuracy of the probe and thus are limited to analysis of annotated genes. RNA sequencing, in contrast, determines sequences using the mRNA in the sample. A real weakness of all genetic and transcript methods, and specifically microarray and RNA sequencing, is that these methods depend on the accuracy and completeness of the annotated and sequenced genes in the genome (364). RNA sequencing assembles, or maps, the short sequences to the annotated genome. Mis-annotated genes are likely to be missed or reported as 2 or more transcripts; other detected transcripts may match unannotated, uncharacterized genes, limiting further characterization of their biochemical functions. A recent comparison of RNA sequencing with microarrays found that RNA sequencing identified more differentially expressed protein-coding genes and provided a wider quantitative range of expression-level changes (365). In this study, approximately 78% of the differentially expressed transcripts identified by microarrays overlapped with those identified by RNA sequencing. RNA sequencing, however, also identified differential expression of noncoding RNAs.

Diet challenges

Sometimes the effects of genetic or epigenetic variations become apparent only when metabolism is challenged. For example, premenopausal women with the PEMT rs12325817-C variant described earlier have impaired endogenous synthesis of choline but can overcome this problem by eating more choline; their metabolic perturbation becomes obvious only when they eat a low-choline diet. Alternatively, metabolic inefficiencies caused by genetic or epigenetic variations are revealed only when people consume too much of a nutrient. For example, when people consume a diet low in saturated fat, the APOA2 rs5082-CC SNP has no effect, but when a diet high in saturated fat is consumed, people with the CC genotype have a higher BMI, whereas those with the TT and TC genotypes do not (366). Thus, measuring genotype alone is not sufficient to identify people who will experience an adverse outcome; dietary intake must be considered when interpreting the functional effects of genetic and epigenetic variation. Genomewide association studies have identified remarkably few gene variants associated with metabolic heterogeneity because most of these studies are based on datasets that do not include dietary intake.
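
As a minimal sketch of how such a gene × diet interaction can be tested, the code below fits a linear model with an interaction term. The column names (bmi, genotype, sat_fat) and the modeling choices are hypothetical and purely illustrative; the cited study (366) may have used a different model and covariates.

```python
# Minimal sketch of a gene x diet interaction test, assuming a pandas DataFrame
# with hypothetical columns: 'bmi', 'genotype' (e.g., 'CC', 'CT', 'TT'), and
# 'sat_fat' (saturated fat intake, % of energy). Illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

def test_gene_diet_interaction(df: pd.DataFrame):
    # The interaction term asks whether the slope of BMI on saturated fat
    # intake differs by genotype (i.e., a genotype-dependent diet effect).
    model = smf.ols("bmi ~ C(genotype) * sat_fat", data=df).fit()
    return model.summary()
```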

Today, transcriptomics studies most commonly involve comparisons between 1 treatment and 1 control, often resulting in a very large number of differentially expressed transcripts at P < 0.05 and frequently still a large number at q < 0.05. Care should be exercised to adjust for the FDR when reporting differential expression in bioinformatics-driven research.
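
A minimal sketch of an FDR adjustment is shown below, using the Benjamini-Hochberg procedure; the BH-adjusted P values are commonly reported as q values. The toy P values are hypothetical, and dedicated differential expression packages apply this step internally.

```python
# Minimal sketch of Benjamini-Hochberg FDR adjustment for raw P values from
# per-transcript tests. 'raw_pvalues' is a hypothetical toy example.
import numpy as np
from statsmodels.stats.multitest import multipletests

raw_pvalues = np.array([0.0001, 0.004, 0.03, 0.2, 0.6])
reject, qvalues, _, _ = multipletests(raw_pvalues, alpha=0.05, method="fdr_bh")
print(qvalues)   # BH-adjusted P values (often reported as q values)
print(reject)    # True where q < 0.05
```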

Experimental design

Careful assessment of overall health is critical for ensuring that reported differences are directly associated with the treatment variable rather than with downstream effects of poor growth, disease, or other conditions. Otherwise, subsequent functional analysis may report numerous significantly affected pathways and functions that are not directly caused by the treatment.

The 2-treatment design, however, may not accurately describe the overall biology; this limitation is especially important for interpreting nutritional transcriptomics studies. For instance, comparing nutrient excess with nutrient deficiency may identify a large set of differentially expressed transcripts that reflect the deficiency relative to adequate status but would not appear when comparing nutrient excess with adequate status. Thus, selection of the control treatment is very important.

Care should be exercised when results from 2-treatment designs in nutrition research are extrapolated and generalized. Completely different results in a separate study may indicate that other variables have far more influence than changes in the study nutrient. Use of multiple graded levels of a nutrient in a single study should yield cohesive results and is best employed before ascribing differences to general effects of the nutrient (367). The subsequent pathway and biological function analyses are also limited to examining only known processes (the streetlight effect) and do not evaluate yet-to-be-identified gene sets.

Data analysis and availability

The raw reads obtained from RNA sequencing can offer additional opportunities, because the resulting short sequence reads for both coding and noncoding RNA can be assembled de novo into transcript abundance data (e.g., with Cufflinks software) without dependence on the annotated genome (356). This de novo assembly can be especially useful for species with genomes that are less thoroughly annotated or that are distinct from human and rodent genomes.

Available computer programs can correct raw sequence expression data for background and normalize each sample to generate expression data (368). Often, transcripts found in low abundance in all samples are removed from further analysis.
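
A minimal sketch of these two steps (library-size normalization and low-abundance filtering) is given below. It uses a simple counts-per-million calculation for illustration; dedicated packages such as edgeR and DESeq2 apply more sophisticated normalization, and the thresholds shown are arbitrary assumptions.

```python
# Minimal sketch of counts-per-million (CPM) normalization and removal of
# low-abundance transcripts, assuming 'counts' is a transcripts x samples
# NumPy array of raw read counts. Thresholds are illustrative assumptions.
import numpy as np

def cpm_and_filter(counts: np.ndarray, min_cpm: float = 1.0, min_samples: int = 2):
    library_sizes = counts.sum(axis=0)                   # total reads per sample
    cpm = counts / library_sizes * 1e6                   # counts per million
    keep = (cpm >= min_cpm).sum(axis=1) >= min_samples   # expressed in enough samples
    return cpm[keep], keep
```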

Statistical analysis using open-source statistics software (e.g., edgeR) can identify differentially expressed transcripts, with P values adjusted for multiple testing to control the FDR (often reported as a q value), because a 1-in-20 threshold (P = 0.05) is not appropriate when analyzing datasets with thousands of data points (369). Usually, expression of an individual transcript is reported relative to its expression in a control treatment, known as the differential expression value (355, 356). The result is a list of individual transcripts that are differentially expressed relative to the control. Stark and colleagues (356) provide a summary of common software programs used for computational data analysis.

Additional functional analysis is often used to identify pathways and biological functions that are collectively enriched (up or down) in a dataset compared with a control dataset (341). Two such approaches are Ingenuity Pathway Analysis (370) and Gene Ontology analysis, which identify pathways, biological processes, and molecular functions (371). Again, an FDR of q < 0.05 is considered significant in Gene Ontology analysis. Gene set enrichment analysis is another approach that iteratively evaluates transcript expression data at the level of gene sets to detect changes in pathways and biological processes that are coordinated at a more subtle level than found by differential expression analysis of individual genes (372).
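
The core statistic behind a simple over-representation test is sketched below using the hypergeometric distribution. This is an illustrative, stripped-down version of what the dedicated tools cited above do; the gene sets are hypothetical, and real analyses add curated pathway databases and FDR control across many pathways.

```python
# Minimal sketch of a pathway over-representation test. All inputs are
# hypothetical sets of gene identifiers; real tools also correct across
# many pathways for the FDR.
from scipy.stats import hypergeom

def overrepresentation_p(de_genes: set, pathway_genes: set, background_genes: set) -> float:
    M = len(background_genes)                             # all genes measured
    n = len(pathway_genes & background_genes)             # pathway genes among measured
    N = len(de_genes & background_genes)                  # differentially expressed genes
    k = len(de_genes & pathway_genes & background_genes)  # overlap
    # P(X >= k): probability of seeing at least k pathway genes by chance
    return hypergeom.sf(k - 1, M, n, N)
```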

Most journals and NIH-funded studies require that microarray and RNA sequencing data be made available upon publication in a repository such as the Gene Expression Omnibus (373) or the National Center for Biotechnology Information's Sequence Read Archive (374). These repositories allow researchers to take advantage of previously conducted studies in their area.

In closing

Below is a list of current needs for genetic and epigenetic methods as related to personalized nutrition.

  • Catalog of individual gene variants and epigenetic modifications that result in functional differences in metabolism, nutrient requirements, and effects of diet on health outcomes. This requires collecting appropriate data from diverse populations, improved methods for collecting diet intake data, and better accuracy and completeness of annotated and sequenced genes. This will help provide insights into how functional effects are influenced by diet and other exposures.

  • Algorithms that integrate the combined functional effects of multiple genetic and epigenetic variations across the many pathways of metabolism.

  • Better methods for cross-correlating and validating genetic, transcriptomic, and epigenetic data with other omics data (e.g., metabolomic, microbiome). This will help reveal whether functional changes in genes correlate with perturbations in measured metabolites.

  • Better study designs and statistical and bioinformatics approaches for conducting studies in people that validate the use of information on genetic and epigenetic variation for the development of diet interventions and recommendations.

  • Better regulatory oversight to ensure that use of genetic and epigenetic variation data is not limited to the wealthy, that such data have appropriate privacy protections, and that health claims made for such testing are adequately supported by evidence.

  • Consideration as to what level of data and study design is needed to use knowledge of a group's common genetic and epigenetic variants to develop policy recommendations for that subgroup of the population. Will it be possible to eventually reduce the size of such subgroups to achieve almost-personal recommendations?

  • To deliver personalized nutrition, accelerated efforts should be made to integrate patient data on nutrient status into existing and future databanks, such as the Human Genome Project (375), Human Variome Project (376), 1000 Genomes Project (377), and 100,000 Genomes Project (378, 379).

Microbiome

Introduction

It is now recognized that the human microbiome influences interindividual variability in response to diet and the environment (380, 381). The microbiome comprises the genetic material of the microbes (predominantly bacteria, with lesser components such as fungi and parasites) found on the skin and in the urogenital system and gastrointestinal tract. The microbiome of a single person can contain more than 33 million genes, exceeding the number of genes in the human genome by more than 100-fold (382, 383). Microbiota composition shows intraindividual variability across different parts of the body (380), and most research has focused on the gut microbiome. This section focuses on the gut microbiome, given that the gastrointestinal tract is the major site for diet–microbiome interactions. However, other body sites can also be influenced by diet, and many of the considerations for gut microbiome research apply to studies of other sites.

The gut microbiome has been associated with numerous diseases, such as cardiometabolic conditions, inflammatory bowel disease, certain cancers, asthma, and some neurological disorders. Associations with markers of systemic inflammation such as lipopolysaccharides, cytokines, and C-reactive protein have also been reported (382, 384, 385). The gut microbiome also plays an important role in gut endocrine function and intestinal mucosal integrity (385). From a precision nutrition perspective, recent studies have indicated that the interindividual variability of the gut microbiome may show patterns across the population (386). Studies suggest that only 2% of this variability is related to genetics (387), and other influencing factors include age and life stage, sex, and health status (382, 384). Moreover, emerging animal studies have suggested a circadian rhythmicity of the gut microbiome (388). Dietary factors, such as macronutrient composition (e.g., protein vs. carbohydrate) and customary dietary patterns (389–391), are coming to the forefront as having major impacts on the composition and diversity of the gut microbiome. Recent studies have also suggested that the gut microbiome's makeup could be predictive of the effect of dietary influences on health outcomes in individuals, allowing personalization of advice (382). The dynamic nature of the microbiome, including effects from environmental cues and emerging evidence of circadian cycles, suggests that the optimum diet for an individual will likely change over the lifetime, and may even fluctuate throughout the day or week. For example, the 2020–2030 NIH Strategic Plan for Nutrition Research noted the role of diet in the microbiome–health relationship as a key objective for understanding and applying precision nutrition (392).

A full review of the microbiome and health is beyond the scope of this section. However, because of the complexity of the microbiota, the many factors affecting it, and its multifaceted relationship with health outcomes, a variety of approaches are necessary to understand the interplay among diet, nutrition, and the microbiome. The field is in its infancy and methods are still developing. Much effort is being made to harmonize methods and establish standard approaches, and several recent reports have been published. Notably, many authors have indicated that past studies lacked or included insufficient description of participants' diets, limiting the knowledge base on diet, the microbiome, and health (381, 393). Approaches for studying the microbiome include in vitro laboratory analyses and in silico models, animal models, and various types of human studies such as RCTs and observational studies. These approaches are discussed below in terms of their relevance to the microbiome and precision nutrition.

Strengths and limitations of approaches

Approaches ranging from dynamic in vitro multicomponent fermentation systems and animal models to human clinical interventions and cohort studies have been used to address questions on the microbiome, diet, and health. In vitro model systems can be helpful for assessing microbial metabolites generated from different macronutrients, micronutrients, and nondigestible substances, but are limited in addressing precision nutrition concepts. The most common approaches used to understand microbiome–health relationships have involved animal models, primarily gnotobiotic rodents (models in which all microbes are controlled or absent) with human fecal transplants, and human studies (e.g., RCTs or prospective cohort studies). Animal models allow more genetic, dietary, and environmental control and contribute to mechanistic understanding but are limited in translation to humans, whereas human microbiome intervention studies may be directly applicable to humans but are generally small and challenged by the high interindividual variability in responses to diet. Prospective cohorts can provide data from a broader population but, because of the number of confounding factors that influence the microbiome, are not definitive with respect to cause-and-effect relationships. In addition, the complexity of the human gut microbiome and individual differences in responses to the same interventions make it challenging to design studies that can be translated to clinical and public health applications.

A recent review of in vitro, animal, and human findings on how oat consumption influences the microbiome and health provides an example of using differing approaches and the types of information that can be gathered from each (394). The authors reported that in vitro studies contributed insights into the types of metabolites, including SCFAs, produced by microbiota from oat ingredients. In particular, the in vitro data provide an understanding of how the rate and extent of SCFA production are influenced by the oat's physicochemical properties, such as form (e.g., bran or a dehulled whole-grain flake), particle size, and molecular weight. These data can be helpful in defining targets to assess in human studies and for identifying possible effects of processing and food-source changes. Data from human studies provided important evidence that oats preferentially enriched certain beneficial bacteria, decreased fecal pH, and, in some studies, increased SCFA production. Overall, the authors noted that factors such as the wide variety of ingredients studied, differences in the amounts of ingredient tested, duration of interventions, and, in human studies, participant health and background diets made direct comparisons and definitive conclusions difficult. They called for investigators performing future studies to develop best practices and provide more detail, especially on the products studied, to give more insight into the relevance of the findings for diet and health.

No specific experimental technique has received consensus as the optimal approach to address a question; rather, challenges such as the complexity of the microbiome–diet relationship and limitations on sampling have led to a combination of different approaches being necessary to shed light on specific diet–microbiome–health relationships (395–397). In a commentary on the role and use of different experimental systems for microbiome research, Douglas (398) summed up the situation, stating, “Microbiome science benefits from the coordinated use of multiple systems, which is facilitated by networks of researchers with expertise in different experimental systems.”

Methods that can contribute to the overall understanding of the microbiome, diet, and health relationship are discussed in more detail below. The most recent publications on best practices are included where available.

In vitro laboratory models

Early studies on the microbiota used static methods to simulate digestion and fermentation and provided important details on the biochemical aspects of digestion and fermentation of many carbohydrates. These systems were designed to mimic physiological conditions with respect to temperature, agitation, pH, and enzyme and chemical composition, but lacked other aspects of digestion such as shearing, mixing, hydration, and peristalsis (399). In addition, these systems were challenged by the ability to culture bacteria and generally represented only the aerobic bacteria of the distal colon. The 1990s and 2000s saw the development of several dynamic in vitro systems that simulated digestion from the stomach through proximal and distal fermentation, including anaerobic compartments that attempted to include fermentation of the small intestine and proximal large intestine. These models include compartments for the stomach, small intestine, and large intestine, and mimic digestion with acid and agitation, digestive enzymes, and the addition of other constituents such as bile salts and pancreatin; the small and large intestine compartments are inoculated with batch human fecal matter to mimic fermentation. Major models used today include the Simulator of the Human Intestinal Microbial Ecosystem, the TNO Gastrointestinal Models (TIM and TIM-2), and the computer-controlled SIMulator Gastro-Intestinal model (400–403).

In vitro fermentation systems can provide helpful information on microbiome–diet connections, particularly metabolites and preferential substrates for specific bacteria (394). For example, these models can reveal differential and rapid effects of high-protein compared with high-carbohydrate diets on microbial composition and the resulting differences in metabolite generation (404). This allows valuable information to be obtained when comparing different substrates. However, the inoculum used varies across studies with these models, and the ability to address precision nutrition questions has not been shown. Although these models currently use batch microbial mixtures or well-defined microbe blends, it is possible that, as more knowledge on the different patterns of microbial populations is gained, precision nutrition questions could be addressed with in vitro fermentation systems.

In vitro systems allow examination of controlled interactions between defined gut microbe mixtures and substrates such as nondigestible carbohydrates and polyphenolics that are known to escape digestion, while also providing valuable information on metabolites formed from these components. A recent review comparing the primary in vitro simulator approaches for studying prebiotics highlights another major challenge with these models: the absence of small intestinal hydrolytic conditions (405). The limitation of not emulating the mucosal environment, which includes the brush border, is an important consideration in assessing use of these models given that most carbohydrates are hydrolyzed in the small intestine.

A physiological model of the gastrointestinal tract requires numerous cell types, including absorptive and secretory epithelial cells, Paneth cells, and goblet cells. In the last decade, multicellular organoid (e.g., gut-on-a-chip) technologies that include multiple cell types have been developed. Given the important role of the gastrointestinal tract in immune function (more than 70% of immune cells line the intestinal tract), these organoid systems can play an important role in increasing understanding of the effects of dietary components and factors such as food form on metabolism, nutrient transport, and barrier function. Methods for including gut microbiota in these organoid systems are being explored, including microbes cultured under anaerobic conditions (406, 407) or cultured from individual human intestinal biopsies, which supports their use in precision nutrition (408). However, these technologies are in early-stage development, and standardization for replicability and robustness of data is needed (407).

Animal models

Animal, or preclinical, models have provided important mechanistic insights into the microbiome's involvement in, and treatments for, conditions such as inflammatory bowel disease, Clostridioides difficile infection, allergic diseases, metabolic syndrome, and diabetes (409–411). The benefits of different animal models and the challenges in extrapolating findings to humans have been reviewed extensively (393, 396, 410, 412). A variety of animal models have been used, including zebrafish, Drosophila, Caenorhabditis elegans, pigs, and dogs (393, 398). However, most studies on diet, the microbiome, and health use rodent models, primarily mice. Mouse models include genetic sister strains, knockout disease models, and surgical interventions in which a specific disease is targeted. Specific to the microbiome is the gnotobiotic mouse model, in which human microbiota are transplanted into germ-free animals or into mice pretreated with antibiotics to remove the native mouse microbiome.

Animal studies offer flexibility, easy sample collection, and the ability to use procedures not available in humans. Many environmental, developmental, and biological (e.g., genetic, epigenetic) factors can be controlled, or at least semi-controlled, in animal studies, which allows study of mechanistic and metabolic relationships between changes in microbiota profiles and disease development and progression. For example, gnotobiotic mouse models have led to significant insights into conditions such as pregnancy-induced increases in adiposity and asthma (413). Animal studies are lower in cost and can be conducted over a shorter time, and some findings have been reproduced in humans. Another advantage is that the diets, which account for ∼60% of the variation in the gut microbiome of mice, can be specifically defined and tightly controlled (393). Light–dark cycles can also be controlled.

As with other areas of science, extrapolation of findings from animal models must be done cautiously because there are many biological differences between these models and humans. Many findings in animals have not been replicated in humans; for example, an estimated 80% of therapeutics shown to be safe and effective in animals fail in humans (411). One reason is that the mouse gastrointestinal tract is anatomically different from that of humans (409, 410). Additionally, in contrast to mice, diet has been proposed to explain only around 10% of the interindividual variation in the gut microbiome of people (393). Genetic alteration of animals also introduces metabolic changes that may affect outcomes. In particular, germ-free animals have altered immune systems that can complicate extrapolation of findings (384, 396). In many animal studies, it is not known whether human microbiota were successfully transplanted or whether the disruption or dysbiosis patterns observed in humans were replicated (413). In addition, most human studies use fecal samples, whereas mouse studies usually involve cecal sampling (410).

Although mouse models of human diseases have been invaluable for mechanistic studies, thanks to the ability to control numerous factors, animal handling and conditions can vary greatly across laboratories. The use of inbred rodent strains, standardized environments, and semi-purified diets has helped reduce experimental variability, allowed investigations from different laboratories to be compared, and improved replication. However, the diets used for animals are not comparable to those of humans with respect to the range of foods and components, such as fiber (393, 411). Emerging evidence also suggests that certain volatile compounds present in bedding can influence microbial output; therefore, the environment in which rodents are kept may represent a confounding factor (414).

Harmonization of design and reporting is necessary to improve replicability and translation to humans. Several recent reviews identified important factors for designing and reporting animal studies (396, 411–413). In addition, identifying optimal diets for comparison with human studies is still an emerging area. A summary of animal model considerations published in recent key reviews on studying the effects of diet on the microbiome is provided in Table 1; attention to these factors in design and reporting should improve the replicability of animal models.

TABLE 1.

Compiled considerations for animal model studies on diet–microbiome interactions

Parameter Considerations and recommendations
Study design, randomization
  • Use age-matched animals (396)

  • Consider and report type of randomization used and document whether littermates were randomized across groups (396)

Sample size
  • For humanized mice, the experimental unit is the human donor; the mice are only replicates (413)

Acclimation period
  • Document and report the time between receipt of the animals and the start of experiments (396)

  • Document and report the time between microbiome transplant and the start of experiments (396)

Background strain
  • Strains differ in their microbiota composition or metabolic patterns; report background strain (415)

Cohort
  • Report the number of cohorts in which experimental findings were replicated, as well as the suppliers and animal house materials used (396)

  • Keep a record of litters (396)

Controls
  • Conventional mice should be used as a control in gnotobiotic diet-microbiome studies (413)

  • Report details for controls [e.g., genetic models, surgical sham, vehicle controls (396)]; with genetic models, it is recommended to use ≥2 distinct strains/models because separate breeding of strains over time can result in microbiota differences (396)

Diet/food intake
  • Describe energy density and nutrient composition of diets (396)

  • Confirm/define whether diet is consistent across animal facilities (396)

  • Control-chow versus defined-control diet; chow is not standardized (396)

  • If diets are changed, report changes in food consumption (396)

  • Consider coprophagia in design and reporting (409)

  • Length of diet-acclimation period should be considered and reported (396)

  • Add human diet components to increase translatability (411)

  • Semi-purified diets usually only have cellulose as fiber source (411), and defined rodent diets have casein as primary protein source (393)

  • If using chow, use open formula because it does not vary in ingredient composition (393)

  • AIN-93 does not have fermentable substrate, which leads to reduced microbial diversity over generations (393)

  • Purified diet requires some fermentable substrate (393)

  • Acidified water decreases infections but changes the microbiome (393)

Housing
  • Phenotypes can disappear after a mouse house renovation (415)

  • Clean facilities limit microbes, and each mouse house harbors a distinct pool of microbes (410)

  • Report cohousing versus individual housing (avoid eating feces, energy expended to keep warm, cage effects) (396)

Germ-free mice
  • Confirm gnotobiotic status with 16s rRNA sequencing (396)

  • Avoid antibiotics in water (396)

  • Consider that factors such as diet, lifestyle, phenotype, genotype that impact the microbiome in human donors may be absent or express differently in the recipient mice (413)

  • Prevalent microbial taxa of the human gut may not successfully colonize the mouse gut (413)

  • Conventional mice may be a better model for human–diet–microbiome interactions, as humans also have an adapted microbiome (413)

  • Consider the differences in immune function in different mouse strains (412)

Microbiome transplant
  • In humanized mice, microbiota should be sequenced to determine if microbiota patterns are replicated (413)

Fecal sample
  • Report pellets versus cecum material (396)

  • Clearly document and report form (oral vs. other routes), frequency, and duration of dosing with respect to fecal sampling (396)

Circadian rhythm, environmental
  • Unlike humans, mice eat mostly at night, exposing their gut to different microbes throughout the day, which can affect their circadian rhythm (415).

  • Microbiota from wildling mice show seasonal shifts in the gut microbiome, possibly related to diet transitions from insect- to seed-based diets; captive mice are not subject to these shifts because they consume a similar diet over time (415)

  • Results can be impacted by location, including factors like temperature, humidity, and altitude (412)

Human studies

Human studies, including intervention studies (e.g., RCTs), observational cross-sectional studies, case-control studies, and prospective cohort studies, are important sources of evidence on the efficacy and effectiveness of interventions. RCTs, in particular, can provide semi-mechanistic or cause-and-effect data. Cross-sectional and case-control studies allow identification of associations of dietary composition and metabolic or clinical profiles with microbiome composition and diversity patterns. For example, ethnographic studies have shown the effect of culture and food patterns on microbiome composition and diversity, and differences have been noted between vegans and omnivores as well (386, 416, 417). Although not cause-and-effect, these studies have identified factors that should be reported and/or controlled for in RCTs. For example, cross-sectional studies have reported associations between microbiota patterns and demographic factors such as age, BMI, sex, disease/health status, ethnicity and cultural identification, geographic location, living structure, and socioeconomic environment (397, 418, 419). These studies can help form hypotheses for further research.

RCT intervention studies are the gold standard for establishing efficacy and cause-and-effect relationships. Clinical trial standards have been developed for designing pharmaceutical interventions for disease. However, human nutrition RCTs come with unique considerations, in part because of the challenges in defining what constitutes a healthy population and the small changes (i.e., low signal-to-noise ratio) in nondisease outcomes. Controlling diets and addressing confounders are more challenging in nutrition studies because test foods and dietary components must be studied in the context of background diets and because dietary practices are influenced by socioeconomic and cultural factors as well as religious experiences and beliefs. To improve the rigor, consistency, and reliability of nutrition intervention data, a series of publications on the design, conduct, documentation, clinical data management, analysis, and reporting of nutrition RCTs was recently published under the auspices of the Tufts and Indiana Clinical and Translational Science Institutes (420–422). In addition, guidance for conducting RCTs that investigate relationships between the microbiome and specific health outcomes has also been published (396).

Longitudinal prospective cohorts provide assessment of larger populations over longer times. Given the complexity of our diet, which includes an estimated 26,000 unique food chemicals and more than 9,000 unique foods, cohorts allow representation of the real-world variability in dietary intake (395, 423). However, dietary intake data collection that can identify this range of foods is necessary and can be difficult in a cohort study. In addition, detail on food preparation, food form, and composition is important to obtain. Given the challenges in controlling interindividual variation, it has been recommended to obtain multiple consecutive microbiome samples per study time point or phase (395). Considerations for gut microbiome assessment in epidemiological studies have also been published (424). Community-based clinical or longitudinal clinical studies are designed to help with this challenge by obtaining data in real-world settings, while controlling some confounders such as diet. However, these approaches can be expensive and tracking diet in detail is still challenging.

Several recent publications have addressed study design, diet data collection, and other considerations specific to microbiome–diet studies for precision nutrition (395, 425, 426). However, most human studies do not sufficiently collect or report dietary intake data. When designing new studies, it is important to keep in mind that food components and dietary composition can affect both the microbiota profile and the metabolites produced by the microbiome; therefore, both effects should be considered. Microbiota can respond rapidly to dietary changes, sometimes in as little as 24 h. Also, because of gastrointestinal transit time, the diet consumed 2–3 d before sample collection can affect microbiome data; therefore, dietary intake data should be collected for a minimum of 48 h before starting fecal collection.

A major challenge in human studies is addressing interindividual variability in the microbiome and controlling for numerous confounders. Some interindividual differences relate to documentable participant characteristics such as age, sex, history, medication use (particularly antibiotics), and presence of diseases or conditions such as infections or immune dysfunction. Confounders such as physical activity, lifestyle factors, and circadian effects also need to be documented and controlled or monitored. Other factors, such as history of being breastfed and early solid-food feeding, may also affect microbiome patterns, but it is difficult to obtain such detailed information retrospectively. A specific challenge with longitudinal cohorts is that environmental changes and events that occur over the life course, including varied exposures to microorganisms, occurrence of disease, and hormonal and age-related changes, affect the gut microbiota (384). Discerning causation from association can be difficult even in RCTs, which tend to be shorter and smaller, making it difficult to capture seasonal variation in a person's microbiome and to address interindividual variability. For human studies, only limited collection strategies are available: stool sampling, mucosal tissue sampling of the distal gastrointestinal tract, and lavage and swab sampling. Considerations for human studies investigating the effects of diet on the microbiome, compiled from recent reviews, are provided in Table 2.

TABLE 2.

Compiled considerations for human studies on diet–microbiome interactions1

Parameter Considerations and recommendations
Study design
  • Double-blind, placebo-controlled RCTs, whether parallel or crossover designs, are the most rigorous and suitable for human studies. Crossover studies allow each participant to act as their own control, mitigating some interindividual effects and enabling investigation of responder vs. non-responder status. Parallel studies require less commitment from study participants and less complex data analysis (427)

  • A crossover study with a washout period between treatments is an ideal design (428)

  • Controlled-feeding studies are best for measuring biological effects of diets. However, these studies are expensive and not applicable for real-world conditions (393).

  • Include lead-in periods before the study start and during the end of the washout period to stabilize lifestyle factors (428)

  • The washout period in crossover studies should be at least a couple of weeks, but the optimal duration is not known (393)

  • Report study length and washout periods (396)

Sample size
  • Case-control and cross-sectional studies should include large sample sizes (e.g., 400–500) based on detecting 5–9% differences in major taxon abundances (395). However, it may not be possible to calculate sample size when the effect of a particular dietary intervention on specific bacterial taxa is unknown for the target population (427)

  • Current EFSA guidance does not specify effect sizes for biological relevance. An increase of around 0.5 units of Shannon diversity index may be biologically relevant based on data from obese and lean people (427)

  • Crossover studies have smaller sample sizes than parallel studies due to using participants as their own control. This helps with the issue of interindividual variation in the microbiome (427)

Participant microbiome and bowel habit considerations
  • Epidemiological studies have reported that 20% to >80% of study participants return stool samples. Consider participant response rate and representativeness before initiating a population-based microbiome study, and allow timing flexibility for shipping samples after stool collection (424)

  • Consider the baseline microbiome profile as a possible stratification or exclusion criterion where a 1–2-wk turnaround from collection to sequencing is possible, particularly for parallel and longitudinal studies, or use an interspersed treatment design that includes the baseline microbiome (427, 428)

  • Stool consistency, transit time, and timing of fecal sampling can influence the microbiome and should be considered in managing background noise in data from intervention studies (427)

Participant demographics
  • Collect and report age and sex, and whether participants were randomized based on these factors (396, 428)

  • Collect and report ethnicity, birth location, immigration history, and cultural identification. These factors may be relevant to background diet and lifestyle (396, 428)

  • Discrete age ranges should be considered because infants, adults, and the elderly exhibit differences in microbiota composition, but there is no agreement on when the microbiota becomes adult (427).

  • Gender differences in gut microbial composition and in microbial response to dietary components may exist (427)

Participant socioeconomic and environmental characteristics
  • Collect and report geographic location (i.e., city/town, country), living structure (e.g., parents with children, single, shared housing, multigenerational) as well as socioeconomic and rural/urban environment factors (396)

  • The variation across people's microbiomes arises in infancy, and a person's microbiome responses may be due to variation in GI biogeography, complex community interactions, and random events (428)

  • Approximately 20% of the overall microbiota variation is due to diet, anthropometry, and medication (427)

  • Record environmental factors including social-environmental conditions such as crowding, family composition, and family size for interpretation of results (427)

  • Household pets are a rich source of microbes for children (427)

  • Hygiene and antimicrobial product usage practices can influence microbial exposure (427)

Participant health/physical characteristics
  • Report BMI and whether there was a difference between treatment groups before and after intervention (396)

  • Report or exclude comorbidities that may serve as confounding variables (396)

  • Consider excluding participants with history of ileostomy or colectomy; who are on dialysis; or have undergone recent chemotherapy treatment, radiation therapy, endoscopy procedures, or recent irrigation/cleansing of the large intestine (424)

  • Record health status and medication use for interpretation of results (427). Consider excluding participants on corticosteroid hormones, prescription weight-loss drugs, insulin, or thyroid medications (<6 mo) (424)

  • Inclusion and exclusion criteria should consider recent infection and vaccinations, including flu shot (<1 mo) (424, 428)

  • Inclusion and exclusion should consider alcohol consumption (428)

  • Factors such as stress, smoking, and coffee consumption can affect microbiota (393)

  • Metabolic factors, exercise level, and hydration status can be significant covariates and should be recorded. Participants planning on modifying lifestyle habits including exercise should be excluded (427, 428)

  • Pregnant and lactating participants are typically excluded, but in some cases may be included (428)

  • Bowel habits should be tracked at baseline and throughout the study (428)

  • Energy intake restriction can change the microbiota; hence, consider excluding people on weight loss programs and those with substantial weight change (e.g., >20 pounds in past 6 mo) (424, 427)

  • Special diets such as vegan or diets with limited food groups (e.g., Paleo diet, gluten-free) have been reported to change the microbiota (427)

  • Microbiome variation across study participants may not be avoidable despite strict eligibility criteria (427)

Temporal factors and considerations
  • In prospective cohort studies, single time-point sample collection may not adequately characterize a person's microbiome, and multiple measures may be needed (424)

  • Variation in metabolite phenotype can be seen in the adult gut microbiome. However, microbial DNA is expected to be representative within a person in comparison to other people over time (424)

  • Report temporal factors such as season or time of the day/week of sample collection as these factors can impact the microbiome (396)

  • Sleep cycles and menstrual cycles should be considered as covariates in longitudinal studies (428)

  • Circadian rhythms cannot be typically controlled in free-living human studies. Hence, fecal collection time can be used as a covariate for elucidating interactions with meal timing and fed/fasting intervals (428)

  • Long duration travel and jet lag can lead to diurnal fluctuations in the microbiome (427)

Test intervention and control
  • In intervention studies, report whether food is supplied, when it is supplied, and the amount consumed (396)

  • Dietary interventions should use placebos and blinding when possible (428)

  • Dietary products evaluated must be well characterized. In addition to macronutrients, consider components that can act as microbial modulators including fibers, specific micronutrients, polyphenols, or probiotics (427)

  • A suitable control product should also be selected based on its minimal effect on the microbiome. Maltodextrin is commonly used as a control, but it can alter microbiota composition. Microcrystalline cellulose has minimal energy and in comparison to alternative control fibers is less fermented by the gut microbiome (427)

Compliance and blinding
  • It is recommended to report compliance assessment and blinding procedures for study staff, participants, and diets/food components (396)

  • In addition to other considerations, fecal sampling burden (collection and storage) may affect compliance (427)

Background diet and control
  • Dietary intake should be stabilized, rather than standardized, to control variation and guarantee consistency of individual diet during intervention studies. This can be done by asking participants to maintain their habitual diets or designing a diet based on the participants’ recent dietary intake data (428)

  • Habitual diets, especially fiber intake, should be considered and documented. Evidence suggests that baseline dietary fiber can influence responsiveness to interventions (427)

Diet information collection
  • If stool transit time is known, dietary intake collection can be optimized relative to stool collection time; when transit time is unknown, then dietary intake can be collected 3 d prior to stool sample collection (428)

  • Report baseline diet of subjects, dietary intake assessments, length of time, recording time, and nutrients that are relevant to the study objectives, health outcome, and gut microbiota (396)

  • In addition to recording dietary macronutrient intake, food choices should be recorded to capture the complexity and diversity of the diet (393, 428)

  • Consider including multiple 24-h dietary recalls or 3-d diet records, in addition to FFQ (428).

  • Ideally, numerous axes of dietary intake should be controlled, including meal timing, length or duration, and location—these all affect the food environment. Consider recording appetite dimensions when relevant (428)

  • Assessing food intake is a major challenge, but is key, and the main research question will guide the dietary methods for the study. All self-reporting methods are prone to error, such as random day-to-day variability and systematic error, and knowing the basis of the error can aid in interpretation of results (427)

Antibiotic use
  • Antibiotic use, due to its specific impact on microbes, should be considered independently from other medications and supplements. It is important to specifically address antibiotic use in human studies, given the effect of antibiotics on microbiota

  • Consider excluding participants with recent (<6 mo) antibiotic use in oral or IV form (424, 429)

  • Recent antibiotic exposure should be considered as part of the inclusion and exclusion criteria (428)

Medication and supplements
  • Medication use should be recorded (428)

  • Participants consuming prebiotics, probiotics, or supplements may be excluded or may need to maintain consistent use throughout the study period (428)

Fecal sample frequency
  • Describe repeated fecal sampling protocol, if any (396)

  • Consider day-to-day variation by taking multiple consecutive microbiome samples per study time point or phase (e.g., for 3 consecutive days). Ideally, collect more fecal samples (e.g., for up to 7 consecutive days) per time point or use daily sampling (428)

  • Multiple measurements within the same person may be required for accuracy; however, this is not always practical (393, 427)

Fecal sample collection
  • Important to use the same methodology throughout the study (427)

  • The collection process dictates temperature and transport parameters. Methodologic variation typically is less than interindividual variation (428)

  • Participant self-collection of a few grams of feces can provide enough sample for sequencing or metabolomics without excessive burden. However, this can over- or underrepresent specific taxa (428)

  • If samples are immediately frozen after home collection, then temperature must be maintained during transport as freeze–thaw cycles can alter microbial composition (426)

  • Fecal samples may show uneven distribution of microbes due to variation in the gut environment [e.g., differing pH levels from the proximal to distal colon, higher concentrations of oxygen near the mucosa relative to other areas; homogenization of whole stool may provide a more uniform sample; collection of an entire bowel movement fecal sample, however, may not be feasible for many studies] (424, 428)

  • Considerations for gut biopsies, if available, include that the gut mucosal and luminal microbiomes are not similar, and microbial populations differ across biopsies collected at different locations of the GI tract (424)

  • Sample transport or storage temperature can affect microbial community structure, but temperature-induced variation has been found to be less than that arising from interindividual differences. However, sample storage (without preservative) at room temperature for more than 24 h can impact the microbiome (424)

  • Use of a nucleic acid storage solution [e.g., 95% ethanol, RNAlater® (Ambion)] aids in preservation; however, some studies have found reduced yield and purity of bacterial DNA and possible alterations in bacterial phyla compared with frozen samples (424, 428)
1 EFSA, European Food Safety Authority; FFQ, food-frequency questionnaire; GI, gastrointestinal; RCT, randomized controlled trial.

Key considerations in data interpretation

The increasing number of diet–microbiome studies being conducted highlights the need for consideration of key factors such as experimental design, standardization of sample collection and analysis workflows, host–microbiome interactions, influences of other biological and environmental factors, and integration with ’omics data to ensure that data collected are robust for addressing interindividual variability.

Strengthening experimental design

The best practices in experimental design for diet–microbiome studies have not been extensively reviewed. However, Knight et al. (429) present typical issues related to the experimental design of microbiome studies that are applicable to diet–microbiome studies. They note that the study design should be appropriate for addressing the question under consideration. For example, data from cross-sectional studies are typically fraught with confounders; hence, results should be stratified by potential confounding factors such as age, sex, lifestyle, and environmental factors. Longitudinal studies are important for assessing changes in microbial communities over time. In animal studies, the influence of coprophagy and co-housing must be considered when evaluating effects of diet on the microbiome. In addition, numerous dietary factors affect the microbiota, including nondigestible carbohydrates, polyphenolics such as tannins, unabsorbed vitamins and minerals (e.g., iron), and undigested fat and protein (∼5% and 8% of ingested amounts, respectively) (393). Controlled-feeding trials provide the best opportunity to define dietary intake; however, given the challenges and cost of controlled dietary clinical studies, Johnson et al. (395) recommend that diets be stabilized rather than standardized. In addition, any dietary change will alter an individual's customary microbiome, so to gain insight into precision nutrition and into subgroups of people with similar microbiome patterns, using the individual's customary diet and lifestyle patterns with each person as their own control may be the optimal approach.

Fecal sampling is the most established method used in human studies as it is readily available, noninvasive, and can be conducted by participants in their own homes. It most commonly involves either taking a single swab or scoop of a stool or collecting a full stool sample that is homogenized and aliquoted later. These samples represent the densely populated microbes of the luminal intestinal gut microbiota, and because they are static samples, do not differentiate permanent from transient strains (393). Fecal samples are considered high in biomass (i.e., high microbial density); hence, small quantities are required for sufficient DNA extraction. Low-biomass samples from other body sites and tissues necessitate greater consideration of sample quantity and appropriate controls. Sample preservation depends on the analysis to be conducted. For example, for assessing microbial community composition, samples should be stored at ultra-low temperatures, ideally immediately after collection, to avoid outgrowth of microbes. For meta-transcriptomics analyses, the use of RNase inhibitors is required. Overall, standardizing sample processing is important to control for variation introduced by collection methods, kits, and storage conditions.

Standardizing analysis pipelines

Microbiome data are generated from marker gene, metagenomic, or metatranscriptomic sequencing. The pros and cons of each method have been reviewed (430).

For marker gene analysis, diet–microbiome studies typically use 16S rRNA sequencing, which amplifies the 16S rRNA gene of bacteria and archaea. This approach is preferred because of its relatively low cost, and large public datasets of marker-gene sequences are readily available for comparison. However, species- and strain-level resolution is not available for most data, the choice among primers and variable regions to target in the 16S gene can introduce bias, and the functions in which the microbes are involved cannot be accurately inferred from marker gene analysis.

In contrast to marker gene sequencing, metagenomic analysis sequences whole microbial genomes and hence requires greater sequencing depth and is more expensive. It is also subject to host DNA contamination. However, this analysis offers higher resolution (i.e., to the strain level) and is more accurate for inferring the relative abundance of functional genes.

Meta-transcriptomic analysis is used to study microbial gene expression. In contrast to the above 2 methods, this approach can distinguish between live, dead, and actively transcribing microbes and provides better insight into functional microbial activity. However, it is the most expensive and complex method and is subject to host RNA contamination. An ideal diet–microbiome study would combine marker gene or metagenomic analysis with meta-transcriptomic analysis to profile the microbial abundance as well as functional activity of those microbes.

After 16S sequencing, data are resolved into operational taxonomic units (OTUs) or amplicon sequence variants (ASVs). The resulting feature abundance data are typically used for differential abundance analyses, which profile changes in microbial taxa by study group. For metagenomic and meta-transcriptomic analyses, after host contaminants are removed, the sequencing data can be analyzed using read-based profiling or assembly-based analyses. For all methods, alpha and beta diversity analyses can be conducted to find overall patterns in microbiome variation. Given the variability in analysis pipelines, analytical methods, and statistical techniques used, meaningful comparisons among diet–microbiome studies are limited unless pipelines are standardized.
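
As a minimal sketch of the alpha and beta diversity step, the code below computes the Shannon index per sample and a pairwise Bray-Curtis distance matrix from a hypothetical samples × taxa count table. Dedicated microbiome packages (e.g., QIIME 2, scikit-bio) provide many more metrics and handle rarefaction and normalization; this is illustrative only.

```python
# Minimal sketch of alpha (Shannon) and beta (Bray-Curtis) diversity from a
# samples x taxa NumPy count table. Illustrative only.
import numpy as np
from scipy.spatial.distance import braycurtis

def shannon_alpha(counts: np.ndarray) -> np.ndarray:
    p = counts / counts.sum(axis=1, keepdims=True)        # relative abundances
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(p > 0, p * np.log(p), 0.0)        # treat 0*log(0) as 0
    return -plogp.sum(axis=1)                              # one value per sample

def braycurtis_beta(counts: np.ndarray) -> np.ndarray:
    n = counts.shape[0]
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d[i, j] = d[j, i] = braycurtis(counts[i], counts[j])
    return d                                               # pairwise distance matrix
```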

Host–microbiome interactions and environmental influences

The human microbiota transforms ingested dietary components into products that can influence metabolism and biological functions. The byproducts of microbial digestion that have nutritive value include SCFAs generated from fermentation of nondigestible carbohydrates, synthesis of certain vitamins (e.g., vitamin K), and generation of metabolites from other nondigestible dietary components including polyphenols and nondigested proteins and fats (381, 382, 384, 393, 431). Gut bacteria are also involved in converting primary bile acids to secondary bile acids and in the generation of trimethylamine (TMA) from dietary choline, carnitine, and betaine found in red meat, fish, and other animal sources. TMA can be converted into trimethylamine N-oxide (TMAO), a compound that has been linked to cardiovascular disease risk (382, 384).

There are also circadian and seasonal influences on the diet–microbiome association. Host circadian rhythms are synchronized by a central clock in the hypothalamus, which receives environmental cues and transmits them to peripheral tissues. Circadian rhythms are key to maintaining numerous physiological processes. Diurnal variations in the gut microbiota can affect host circadian rhythms (432), and their interaction can influence host metabolism, including nutrient digestion, micronutrient synthesis in the gut, and immune health. Moreover, the timing of food intake can influence microbiome rhythms and, hence, host metabolism (433). Seasonal effects on the diet–microbiome relationship have been minimally explored in humans, although free-ranging animal models have shown changes in microbial community composition and function (434). These variations in microbiome rhythms necessitate consideration of seasonal, diurnal, and nocturnal controls in precision nutrition studies. Considering the high interindividual and circadian variation of the gut microbiome, longitudinal diet–microbiome studies would benefit from collecting time-sequenced samples.

Integrating microbiome data with ’omics data

Assessment of microbial community functions is an active and challenging area of research. ’Omics analyses can provide insight into the metabolite signatures and protein expression associated with a microbial community (435). Comparing metabolomics data from cultured isolates with data obtained from clinical studies can help confirm microbially produced metabolites (427, 430). Advanced statistical techniques are important computational tools and include co-occurrence networks that associate microbial genes with metabolites and machine learning models that can classify subject states based on microbiome data. Overall, collective consideration of data from in vitro, animal, cross-sectional, longitudinal, and clinical trial studies will aid in the mechanistic interpretation of dietary effects on the microbiome and on microbially produced metabolites. A detailed summary of analytical considerations published in recent key reviews is provided in Table 3.
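
A minimal sketch of two of the approaches mentioned above follows: a Spearman co-occurrence matrix relating taxa abundances to metabolite levels, and a cross-validated random-forest classifier of subject state from microbiome features. All inputs are hypothetical, and real analyses would add FDR control for the correlations and careful validation of the classifier.

```python
# Minimal sketches of (1) taxa-metabolite co-occurrence and (2) classification
# of subject state from microbiome features. Inputs are hypothetical arrays.
import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def taxa_metabolite_cooccurrence(taxa: np.ndarray, metabolites: np.ndarray):
    # taxa: samples x taxa; metabolites: samples x metabolites
    rho, pval = spearmanr(taxa, metabolites)
    n_taxa = taxa.shape[1]
    # block of the correlation matrix linking taxa (rows) to metabolites (cols)
    return rho[:n_taxa, n_taxa:], pval[:n_taxa, n_taxa:]

def classify_subject_state(features: np.ndarray, labels: np.ndarray):
    # features: samples x microbiome features; labels: subject state per sample
    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    return cross_val_score(clf, features, labels, cv=5)   # accuracy per fold
```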

TABLE 3.

Overall analytical considerations for diet–microbiome studies1

Parameter
Storage for any type of sample from which DNA/RNA can be extracted
Temperature
  • Samples collected without preservatives need to be temperature-controlled until storage at −20°C or −80°C (428)

  • Samples stored at room temperature for more than 24 h need preservative (424, 428)

  • Long-term storage should be at −80°C (428)

Storage solution
  • For epidemiological studies, use of a nucleic acid storage solution would be most feasible (424)

  • For meta-transcriptomics RNase inhibitor should be used (429)

  • For metabolomics, sample preservation should not interfere with metabolite extraction (429)

DNA and RNA extraction
Amount
  • For high-biomass samples, a swab is sufficient while low-biomass samples may require larger amounts (429)

  • Improve DNA/RNA yields with mechanical disruption and/or chemical/enzymatic lysis (424)

  • Removal of nonmicrobial contaminants (429)

  • Controlling technical variability (429)

  • Use same reagent kits for all samples in a study (429)

  • Take multiple baseline samples for longitudinal studies (429)

  • Use blanks during sampling and DNA extraction (429)

  • Use reference samples with known composition for standardizing analyses (429)

  • Add internal standards (424)

Library preparation and sequencing: Marker gene amplification and sequencing
  • Use 16S rRNA for bacteria and archaea and ITS for fungi (429)

  • Use well-tested and cost-effective methods (429)

  • Low-resolution view of microbial community (429)

    • Most taxa cannot be reliably defined at the species level (427, 429)

  • Choice of region can influence findings

    • V6-V9 region of 16S rRNA gene has higher error rate compared to V1-V3 and V3-V5 regions (424)

  • Not susceptible to host DNA contamination (429)

Whole-genome shotgun metagenomics
  • Sequences all microbial genomes (429)

  • Provides more detailed genomic information than marker gene sequencing (429)

  • Taxonomic resolution to species or strain level is possible (429)

  • Host DNA contamination should be addressed (429)

  • Can be expensive (429)

  • Analysis is complicated and computationally intensive (424)

  • Not widely used in diet studies yet (427)

Shallow shotgun sequencing
  • Cheaper than deep sequencing (428)

  • Can provide resolution up to species (428)

Other quantification methods (427)
  • FISH analysis

    • Direct histological localization of microbes

    • Samples must be prepared while fresh

  • qPCR

    • Rapid and sensitive DNA-based method

    • Quantification at different levels of taxonomic resolution

  • Microarrays (DNA arrays)

    • Comprehensive and sensitive method

    • Simultaneous detection and quantification of complex microbial community in a sample using molecular probes

    • Incomplete coverage of microbial ecosystem

Meta-transcriptome analysis
  • Gene expression and active functional information (429)

  • Biased towards microbes with higher rates of transcription (429)

  • Host RNA contamination should be addressed (429)

Data processing and bioinformatics: Quality filtering and low-abundance filtering (424)
Choosing appropriate normalization of sequence counts
  • Total sum scaling (i.e., relative abundance), cumulative sum scaling, rarefying, etc. (429)

  • Tools such as the Bioconductor packages DESeq2 and edgeR

16S rRNA sequencing
  • Deblur or DADA2 to resolve sequence data into sOTUs (429)

Metagenomics and meta-transcriptomics
  • Preprocessing to remove host DNA/RNA (429)

  • Analysis using read-based profiling tools such as Kraken, Metagenome Analyzer (MEGAN) or HMP Unified Metabolic Analysis Network (HUMAnN), or assembly-based tools such as metaSPAdes and MEGAHIT (429)

Statistical analysis
  • Individual taxon level

    • Univariate models to test group differences in taxon abundance (424)

    • Univariate associations with outcomes (424)

  • Higher-level analyses for overall patterns in microbiome variation

    • Alpha and beta diversity (429)

    • Insight on relationships between microbial community structure and outcomes: PCoA and NMDS, but these do not quantify potential associations (424)

    • Quantifying associations: multivariate tests such as nonparametric MANOVA, or the kernel-based regression association test of MiRKAT (424)

    • Identifying taxa associated with outcomes: penalized regression with high-dimensional predictors, such as ridge regression, LASSO (424), and SPARCC

Advancements of reference databases are essential (427)
Address challenges with statistical power and effect size for microbiome studies (429).
Metadata: Record all details of collection process to account for potential variability (429)
Animal studies
  • Record stratification of experimental groups into multiple cages (429)

  • Record mouse of origin in the metadata (429)

Record environmental and physiological conditions for human data (429)
Reproducibility/data repositories: The Genome Standards Consortium standards enable comparisons across datasets (429)
  • MIxS for MIMARKS and MIMS

Bioinformatics tracking (429)
  • Commands and software should be tracked

    • Jupyter Notebooks or R Markdown and storing in GitHub

    • QIIME 2 and Galaxy, automatically track via data provenance tracking system

  • Use meta-analysis and data archiving tools such as Qiita and EBI

Data should be deposited in public repositories (429).
1

FISH, fluorescent in situ hybridization; HUMAnN, HMP Unified Metabolic Analysis Network; ITS, internal transcribed spacer; LASSO, least absolute shrinkage and selection operator; MIMARKS, marker genes; MIMS, metagenomes; MIxS, minimum information standards; NMDS, non-metric multidimensional scaling; PCoA, principal coordinates analysis; sOTU, suboperational taxonomic unit; SPARCC, sparse correlations for compositional data.

In closing

We provide a summary of the overall methods with respect to selected research interests in Table 4. Below, we provide comments on some needs in the diet–microbiome field that can be addressed now (395) and a few that can only be addressed with advancements in ’omics sciences and dietary assessment tools.

TABLE 4.

Summary of methods in the context of specific research interests1

Topic areas of interest Methods well suited Strengths Other factors to consider
Identifying microbial generated metabolites
  • In vitro

  • Directly measure metabolites

  • Easily manipulate cells

  • Controlled interaction between substrates and microbes

  • Isolated from host influence

  • Not all microbes are easily cultured

  • Does not recapitulate mucosal environment

  • Animal

  • Can use invasive procedures

  • Culture microbes of interest in gnotobiotic mice

  • Gnotobiotic mice have altered immune systems

  • Animal findings are not always replicated in humans

  • RCTs (metabolomics of human samples)

  • Integrates host influence

  • Restricted time points and accessibility of tissues

  • Colon versus cecum samples

Impact of microbiome–diet-relationship on health/disease outcomes
  • Animal

  • Well-established, disease-specific models

  • Lower complexity; diet confounders can be controlled

  • Shorter lifespan to study long-term diet/healthy aging

  • Genetically altering animals may change metabolism

  • Test diets do not reflect human diet

  • RCT

  • May establish causality in humans

  • Stabilized diet

  • Time duration for assessing changes in microbial communities

  • Cost

  • Computational and prediction models

  • Able to integrate data from diverse study types

  • Requires sophisticated skills in multivariate data analysis

Influence of host genetic diversity on microbiome mediated response to foods
  • Animal

  • Genetically diverse animal models provide high diversity with lower sample sizes

Effects of dietary patterns/foods/nutrients on microbiome changes
  • Epidemiology/observational

  • Correlation/hypothesis-generating

  • Can study effects of population genetic diversity

  • Small changes can be difficult to identify in healthy populations

  • Difficult to control confounders

  • RCT/crossover studies

  • Show causal relationship

Impact of food properties on microbiome metabolites
  • In vitro

  • Animal

  • RCT/crossover studies

  • Can test many variations of specific foods (e.g., particle size, form, amount of fiber)

Effect of culture, food patterns, socioeconomic considerations, background, and other potential confounders on diet–microbiome interaction
  • Epidemiological/observational

  • Easily identify many factors and associations to generate hypotheses

Circadian and seasonal influences on the microbiome
  • Animal

  • Controlled

  • More accurate collection of time-sequence samples

  • Ease of invasive procedures

  • Restrictions on multiple time point collection

  • Crossover/RCTs (longitudinal)

  • More realistic

Identify responder/nonresponder phenotype to microbial responses to diet
  • Crossover/RCTs (longitudinal)

  • Intraindividual differences can be captured

1

RCT, randomized controlled trial.

Many reviews noted that, for cross-sectional studies, large sample sizes are warranted, and for longitudinal studies, crossover designs are most suitable to assess interindividual variability in response to dietary interventions because study participants can serve as their own controls. Obtaining multiple samples at each time point will minimize variability introduced by sample collection. Given the high variability in microbiota among different body sites (e.g., oral, skin, colon, fecal, and urogenital), diet–microbiome studies need to move beyond fecal samples and assess changes in other body sites that are associated with health outcomes. In tandem with strengthening designs for microbiome-related outcomes and minimizing laboratory-to-laboratory variability, stronger dietary assessment methodologies for capturing variations in food intake are needed.

Because microbial responses to dietary interventions may reveal responder versus nonresponder phenotypes for specific outcomes, stratifying participants by baseline microbiome composition or responder type in human studies may be warranted. The gut microbiome itself could prove to be a promising biomarker for predicting responsiveness to a specific diet (436). Dietary intake assessment could be strengthened further by measuring established biomarkers for intake (e.g., metabolomics markers in tissues), although to date, few biomarkers exist (426). Finally, from a regulatory perspective, classification of microbiome-directed foods (i.e., foods that specifically alter the gut microbiota) as distinct from or similar to conventional foods, dietary supplements, or medical foods needs to be discussed (437).

Overall, while the field is extremely promising, we must be cautious about overinterpreting findings from current diet–microbiome studies for precision nutrition purposes, particularly when it comes to prescribing diets based on the microbiome. While this theory has been popularized in the media, it does not have sufficient supporting evidence for clinical applications. New ways of monitoring diet–host–microbiome interactions and funding priorities, such as the NIH's call for the development of tools for sampling of the gastrointestinal tract, will provide opportunities for understanding the full nature of the human gut microbiome (430). Due to the direct interaction of food components with the gut microbiome and the potential of managing chronic disease risk and life-long health through the microbiome, diet–microbiome studies are promising for helping inform targeted precision nutrition approaches that could improve individual and public health.

Nutritional Status

Introduction

Assessing nutritional status is important for monitoring of growth in infants and children, for example, and often includes relative measures of body size or specific measures of body composition and energy expenditure as markers of health or disease risk. The application of specific techniques to assess nutritional status and energy balance varies by the primary objective, such as yearly health assessments, planning for nutrition interventions, or research at the clinical or epidemiological level. Each technique has specific purposes and associated strengths and limitations and provides valuable data that can support policy recommendations and the concepts of precision nutrition in terms of health promotion and disease prevention.

Energy intake and energy expenditure

A variety of methods are available for measuring dietary intake, and these are discussed in greater detail in the “Dietary Assessment” section. Briefly, at the individual/clinical level, dietary records and 24HR interviews are the most accurate because they include detailed data on specific foods consumed and the intake of energy, macronutrients, and micronutrients. These methods also have relatively few limitations related to age, gender, and race/ethnicity, and all foods consumed are recorded and used in the final calculations. At the population level, FFQs are the most common because they involve a simple questionnaire, for which the answers are used to calculate usual intake. Finally, stable isotopes can be used to assess total energy expenditure in free-living individuals, which can be used as a proxy to estimate daily energy intake.

Total daily energy expenditure

A well-validated technique (438, 439) known as the DLW method is used to measure total daily energy expenditure in the free-living state over 7 to 14 d, using isotopically labeled water (²H₂¹⁸O). DLW can objectively measure energy intake, as energy intake is the sum of total daily energy expenditure and change in body energy during the measurement period, which can be estimated from change in body weight or body composition. DLW is almost completely objective (i.e., independent of participant-introduced bias) and offers substantial advantages over other methods for assessing energy intake including self-reported diet, which is influenced by participant reporting (440), and the intake-balance method on a metabolic ward, which necessarily imposes lifestyle conditions very different from usual life.

Experimentally, the DLW method is relatively low burden for participants. At baseline, 2 urine samples are collected before the participant drinks a dose of ²H₂¹⁸O. Then, post-dose urine samples are collected in the fasting or nonfasting state (441) on the day of dosing and at intervals of up to 14 d. Samples are shipped to the isotope laboratory for analysis. Outpatient collection and shipping of samples by participants is a routine procedure (442). The calculated isotope elimination rates reflect the participant's CO₂ production and are converted to total daily energy expenditure using an energy equivalent of 1 L of CO₂ of 3.815/respiratory quotient (RQ) + 1.2321, where RQ is based on the dietary composition reported by the participant (443). This self-reported information has only a small effect on calculated energy expenditure [e.g., an RQ of 0.90 vs. 0.88 introduces a 2% error (444)]. This is because RQ is calculated from the balance of nutrients, which is more accurately reported than energy intake, thus allowing the method to be largely, if not entirely, free from participant bias.
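As a minimal illustrative sketch of the conversion described above (not the authors' exact protocol), the snippet below converts a CO₂ production rate to total daily energy expenditure using the 3.815/RQ + 1.2321 coefficient, assumed here to be in kcal per liter of CO₂, and shows that moving the reported RQ from 0.90 to 0.88 shifts the estimate by roughly 2%. The CO₂ production value is hypothetical.

```python
# Illustrative sketch of the DLW energy-expenditure conversion described above.
# Assumptions: the 3.815/RQ + 1.2321 coefficient is in kcal per liter of CO2,
# and rCO2 (CO2 production) is a hypothetical value in liters per day.

def energy_equivalent_kcal_per_L_co2(rq: float) -> float:
    """Energy released per liter of CO2 produced, as a function of RQ."""
    return 3.815 / rq + 1.2321

def total_energy_expenditure_kcal(rco2_L_per_day: float, rq: float) -> float:
    """Total daily energy expenditure from CO2 production and dietary RQ."""
    return rco2_L_per_day * energy_equivalent_kcal_per_L_co2(rq)

rco2 = 450.0  # hypothetical CO2 production, L/d, derived from isotope elimination rates
tee_rq_090 = total_energy_expenditure_kcal(rco2, 0.90)
tee_rq_088 = total_energy_expenditure_kcal(rco2, 0.88)

print(f"TEE at RQ 0.90: {tee_rq_090:.0f} kcal/d")
print(f"TEE at RQ 0.88: {tee_rq_088:.0f} kcal/d")
print(f"Relative difference: {abs(tee_rq_088 - tee_rq_090) / tee_rq_090:.1%}")
```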

DLW can also be used as an indirect estimate of total energy intake. Thus, for precision nutrition, this approach can detect changes in total energy intake over a period of weeks or months, especially when accompanied by measures of body weight to record any weight gain or loss. The DLW technique is also suitable for small or large studies, although the fiscal expense does make it prohibitive for large-population studies. It is important to note that, while DLW is much more accurate than activity monitors for capturing daily energy expenditure, it does not discriminate among types of activity and reflects an average total energy expenditure over 7 to 14 d, regardless of sedentary or activity patterns.

Tri-axial activity monitors are a relatively low-cost alternative to DLW. These monitors use physical characteristics (sex, height, weight, and age) combined with continuous body movements to estimate resting metabolism and activity, which are then used to estimate total energy expenditure. These monitors have been validated in several studies (445–447) and are relatively comparable to DLW, depending on the model. These activity monitors can capture almost minute-to-minute activity of an individual, and such measures are highly responsive to minor and acute changes in physical activity.

Activity monitors are not a perfect replacement for DLW because the data are based on proprietary algorithms that cannot be modified to suit specific study needs or questions. While most of the algorithms are generally accurate, there are issues when used with people with low or high fat mass. Because the algorithms use only body weight to estimate resting energy expenditure, resting energy expenditure may be underestimated for a person with low body fat or overestimated for those with excess body fat mass. Thus, these monitors produce the most valid results in samples whose body composition is close to the population average.
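To illustrate why estimates that ignore body composition can be biased at the extremes, the sketch below contrasts a weight-, height-, and age-based prediction of resting energy expenditure (the published Mifflin–St Jeor equation) with a fat-free-mass-based prediction (the published Cunningham equation, REE ≈ 500 + 22 × FFM). The two hypothetical individuals and their body compositions are invented for illustration, and neither equation is the proprietary algorithm of any particular monitor.

```python
# Sketch contrasting two published resting-energy-expenditure (REE) equations
# to show how ignoring body composition can bias estimates.
# The individuals below are hypothetical; neither equation is the proprietary
# algorithm of any commercial activity monitor.

def ree_mifflin_st_jeor(weight_kg, height_cm, age_y, male=True):
    """Weight/height/age-based REE prediction (kcal/d)."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_y
    return base + 5 if male else base - 161

def ree_cunningham(fat_free_mass_kg):
    """Fat-free-mass-based REE prediction (kcal/d)."""
    return 500 + 22 * fat_free_mass_kg

# Two hypothetical men: same weight, height, and age but very different body fat.
lean = {"weight": 80, "height": 180, "age": 40, "ffm": 72}      # ~10% body fat
high_fat = {"weight": 80, "height": 180, "age": 40, "ffm": 52}  # ~35% body fat

for label, person in [("lean", lean), ("higher fat", high_fat)]:
    weight_based = ree_mifflin_st_jeor(person["weight"], person["height"], person["age"])
    ffm_based = ree_cunningham(person["ffm"])
    print(f"{label}: weight-based REE = {weight_based:.0f} kcal/d, "
          f"FFM-based REE = {ffm_based:.0f} kcal/d")
```

In this toy comparison, the weight-based estimate is identical for both individuals, whereas the fat-free-mass-based estimate differs by several hundred kcal/d, mirroring the under- and overestimation described above.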

Body size and composition

Techniques are available for studying how body size and composition influence health and disease in both individuals and populations. The easiest, but least precise, techniques include anthropometric measures of height, weight, and skinfold thicknesses, as well as body circumferences. More precision is offered by secondary methods that rely on assumptions regarding body mass and hydration such as bioelectrical impedance analysis (BIA) and plethysmography. Imaging techniques, such as DXA, MRI, and computed tomography (CT) can provide accurate assessments of body composition overall or in various compartments as well as organ size. Stable isotopes used to assess total body water can be extrapolated to estimate body composition with less than minimal risk. Finally, emerging techniques using 3D scanning are being developed to estimate body composition from detailed scans of key anthropometric landmarks and are promising for predicting fat and lean mass with relative accuracy (448–451).

Strengths

Methods used for measuring energy balance and body composition each have inherent strengths that make each more or less appropriate for different types of investigations, such as clinical interventions or large population studies. In terms of precision nutrition, it is possible to examine the responsiveness of any particular outcome relative to an intervention or program. This section will present the strengths of each method in this light, focusing on the applications of each method and their potential for measuring outcomes at the personal or population level.

Energy and nutrient intake and expenditure

Dietary intake is influenced by various physiological, behavioral, social, or environmental factors. For example, psychiatric disorders such as anorexia or bulimia nervosa can manifest in eating behaviors that are distinctly out of line with healthy eating (452, 453). At the same time, eating in isolation or with a large group both influence dietary intake in ways that can decrease or increase normal intake. Environmental factors such as odors or temperature may also change dietary habits. Measuring these various influences on dietary intake in clinical or free-living conditions is challenging and using the best-suited approaches can improve one's ability to understand dietary patterns and food consumption that promote or degrade health. Most of these methods are discussed in greater detail in the section on “Dietary Assessment” methodologies, but as they are relevant to measures of energy expenditure and body composition, it is important to present them within the context of energy balance.

For individual or clinical assessments, 24HR interviews can provide a fairly reliable and generally accurate assessment of the total energy a person consumes and an adequate profile of their diet's macronutrient composition. The 24HR is logistically intensive because a trained interviewer must thoroughly discuss details of a study participant's record, including types of foods consumed, approximate portion sizes, and the composition of mixed dishes. At the same time, the 24HR is flexible enough to obtain data of the same quality from many diverse individuals, regardless of factors such as race/ethnicity, religious practices, and SES. Thus, in terms of precision nutrition, the 24HR is highly responsive to major changes in a person's dietary habits and can detect changes easily, provided the interviewer is adequately trained. Although the 24HR is not as prone to database limitations as the FFQ, there are wide disparities in accuracy that can be attributed to variations in interviewer expertise; the education, motivation, and memory of the participant; and the database used to calculate energy and macronutrient intake. Most important, there is considerable day-to-day variation in dietary intake, and a single day, or even several days, may misclassify individuals relative to their usual intake. The relative accuracy of the 24HR is generally low and has been criticized for simply being inadequate to generate data that can be used to make population-level generalizations or to form sound nutrition policies (454, 455).

For large epidemiologic/population studies, the primary alternative to the 24HR is the FFQ, which provides generally accurate estimates of usual macronutrient and micronutrient intakes. The FFQ is less labor intensive and logistically challenging than the 24HR and can be used remotely or through mailings, allowing hundreds or thousands of participants to participate. The FFQ's ease of use is a major strength for estimating the usual intake of a population with far less expense than multiple 24HRs. The database used to provide nutrient estimation is updated on a regular basis so that trends in dietary patterns of populations can be assessed. The FFQ's repeatability is similar to the 24HR when assessing micronutrient intake, but it is less accurate in capturing total energy intake, explained in more detail below. Although there is great value in using the FFQ because key changes in food patterns are detectable, it is important to recognize that the list of “common” foods in this approach may not be appropriate for some minority populations or immigrant groups.

Body size and composition

To assess nutritional status using body size and body composition, different methods are necessary to support personal and population approaches. The strengths of different methods and approaches vary but remain consistent in their ability to assess body fat mass as a primary predictor of disease risk. However, as with any research approach, caveats exist for different techniques.

The primary strength of anthropometric indices such as height, weight, and body site circumferences is the relative ease and low cost of measurement. Regardless of the setting, anthropometrics provides a fundamental assessment of body size and growth. Skinfold measures and waist circumference measurements performed by a skilled investigator provide valuable data on body size and composition with minimal cost. The response of anthropometric measures to acute changes in energy balance is limited, but chronic changes can be assessed and tracked. These strengths extend to the use of anthropometric methods in population studies and are accessible for investigators working in low- and middle-income countries.

Improving the accuracy of anthropometry involves an increase in expense, but the use of BIA is only a moderate step in this direction. A number of instruments are available, and the features of each must be established prior to deciding on a particular instrument. For example, some electronic scales and handheld devices are equipped with BIA, but these instruments measure only regional body fatness and rely on proprietary algorithms that are not adaptable outside of the sample used to create them (456–458). However, for personal use, and to detect changes in body composition, use of the same instrument for serial measures limits some of these problems. Other instruments accurately assess total and regional body composition and are acceptable for both personal and population assessments (459–461).

Air displacement plethysmography is an accurate method for determining body composition in adults and children, with models available for infants (457, 462–464). The Bod Pod has been shown to be accurate and reproducible and is typically only used for clinical studies as the unit is not portable and may be cost-prohibitive. Still, with the adult Bod Pod and the pediatric PeaPod versions, this method can be used across the lifespan with minimal risk. However, unlike BIA or DXA, the Bod Pod only measures fat mass and lean body mass with no estimate of body fat distribution, limiting the knowledge that can be gained from prospective or intervention studies.

Stable isotopes are a safe and flexible method to assess total body water and body composition across the lifespan and among those with acute or chronic diseases (465–467). Perhaps most importantly, stable isotopes can be used in clinical or remote settings, making the method useful for research work in wealthy regions as well as low- and middle-income countries. Briefly, a dose of deuterium (²H) or oxygen-18 (¹⁸O) is provided to an individual and a urine or saliva sample is taken 4 to 6 h later to estimate the isotope enrichment in the body pool. This provides an accurate estimate of total body water that is used to calculate fat mass.
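As a simplified, hedged illustration of the dilution principle (not a laboratory protocol), the sketch below converts a measured total body water value into a two-compartment estimate of fat-free mass and fat mass, assuming the commonly cited hydration coefficient of fat-free mass (≈0.73). The numeric values are hypothetical, and corrections applied in practice (e.g., for isotope exchange and dilution-space overestimation) are omitted.

```python
# Simplified sketch of the two-compartment body-composition calculation from
# a total body water (TBW) measurement by isotope dilution.
# Assumptions: fat-free mass is ~73% water (hydration coefficient 0.732);
# corrections for isotope exchange and dilution-space overestimation are omitted.
# All numeric values are hypothetical.

HYDRATION_OF_FFM = 0.732  # commonly cited adult hydration coefficient

def body_composition_from_tbw(total_body_water_kg: float, body_weight_kg: float):
    """Return (fat-free mass, fat mass, percentage body fat) from TBW."""
    fat_free_mass = total_body_water_kg / HYDRATION_OF_FFM
    fat_mass = body_weight_kg - fat_free_mass
    percent_fat = 100 * fat_mass / body_weight_kg
    return fat_free_mass, fat_mass, percent_fat

ffm, fm, pct = body_composition_from_tbw(total_body_water_kg=36.0, body_weight_kg=70.0)
print(f"Fat-free mass: {ffm:.1f} kg, fat mass: {fm:.1f} kg, body fat: {pct:.1f}%")
```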

Finally, advanced imaging, such as DXA, MRI, and CT, provides accurate assessments of body composition overall or in various compartments as well as organ size. Given the high accuracy and precision of these techniques, even discrete changes in body composition or body fat distribution can be assessed, making such methods excellent for clinical studies and even population studies, such as NHANES. The primary strength for imaging methods is the precision of measuring body compartments that can be only indirectly measured using anthropometrics or are not assessed in methods like air displacement plethysmography.

Limitations

The application of any of these techniques to human nutrition research has inherent limitations, but more so when considering the aims of studying precision nutrition. While some of these potential limitations can be addressed in the research design stage, some cannot be minimized, and another method may be more suitable. The limitations of methods for assessing nutritional status tend to fall into 1 of 2 categories, either the technical expertise required to use the method effectively or the logistics/expense of a particular method. We will briefly address these limitations and provide alternative approaches within the context of precision nutrition.

Energy intake and expenditure

The choice of method for assessing energy intake and expenditure depends on whether a study can support a more expensive, technically demanding method or requires a simpler but less accurate one. The use of a 24HR is logistically and technically difficult in that each day of intake takes between 30 and 45 min to be completed well, and a trained enumerator is essential to ask probing questions to clarify general responses so that detailed information can be gleaned from the participant's responses. Dietary records may be less technically involved, but the data quality tends to suffer unless the trained enumerator clarifies questionable or unclear responses. The FFQ may minimize some of these limitations but is not ideal for identifying specific ethnic or cultural differences in diet as it was developed using commonly consumed foods for a select population. Essentially, the limitations of any of these techniques will be determined by the study design. For example, the FFQ can be more easily used in large populations and is thus the obvious choice for large-cohort studies. On the other hand, if the research question is related to specific food patterns and needs to allow for various ethnic foods, using a smaller sample size allows multiple 24HRs to be used without negatively affecting data quality.

Ultimately, the most important question surrounding estimates of energy intake is how accurate they are compared with the most accurate approaches, such as DLW. There is a well-established trend of underreporting energy intake in dietary recalls (468, 469). Differences in energy intake revealed through DLW comparison were found for low-income women (470) and countries (471) as well as for both Black and non-Hispanic White adults (472). Overall, energy intake determined through dietary assessment methods (e.g., automated self-administered 24HR, 4/7-d food records, FFQ, semiquantitative FFQ) all underestimated energy intake compared with DLW energy expenditure (473, 474). While inaccurate for determining intake compared with DLW methods, dietary intake data provide essential information for nutrition research. Future work should aim to improve dietary assessment methods with the goal of matching DLW-determined intake as closely as possible without introducing bias into the protocol.

As discussed earlier, there are many advantages of using an activity monitor in healthy individuals, such as lower cost, logistical and technical ease, and the fact that proprietary algorithms estimate total energy expenditure using minimal physical characteristics. However, the validity of such monitors decreases as body composition of individuals is farther from what the algorithm considers normal. Thus, while these monitors may be advantageous for larger studies where the cost of DLW becomes prohibitive, or in lower-income countries where cost is a limiting factor, the fact that the monitors are less reliable for underfat or overfat persons is a major limitation that cannot be overcome without modifying the algorithm or adjusting raw values for body composition.

Body size and composition

Similar methodological challenges are found when considering which technique to use for measuring body composition. Technically, the quality of the data improves as methods progress from anthropometrics to plethysmography to stable isotopes and imaging. The major weaknesses of anthropometric measures include relatively poor precision, high need for technical expertise, and time required to collect data. For example, skinfold measurements have a bias that underestimates body fat percentage compared with deuterium oxide dilution (475, 476), ¹⁸O dilution (477, 478), and DXA (479) methods.

While BIA is an improvement in assessing body composition over anthropometry, it is not without limitations. BIA often requires proprietary algorithms that estimate total body water using age, sex, height, weight, and the raw impedance and resistance values. Without knowing the specifics of the sample from which these algorithms were developed, it is unclear if they can be applied to a broader population. For example, the algorithm may have been developed in a sample with significantly different body proportions, which would limit its ability to accurately estimate total body water in people with different torso-to-leg ratios or significantly different lean tissue hydration or leg or arm length. While most BIA equipment is generally accurate, there are differences among racial or ethnic groups. In some settings, these limitations have been overcome by developing population-specific prediction equations to apply to the raw BIA data (480, 481). In conclusion, BIA is a validated method; however, more research is needed to validate BIA calculations across ethnic and age groups.
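One common way to build such population-specific equations is to regress a criterion measure of total body water (e.g., from deuterium dilution) on the impedance index (height²/resistance) plus covariates such as weight and sex. The sketch below illustrates this approach on simulated data; the coefficients, sample characteristics, and noise level are entirely hypothetical and are not taken from any published equation.

```python
# Hedged sketch: deriving a population-specific BIA prediction equation by
# regressing criterion total body water (TBW) on the impedance index
# (height^2 / resistance) plus simple covariates. Data are simulated and
# the fitted coefficients are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 120

height_cm = rng.normal(165, 9, n)
resistance_ohm = rng.normal(550, 60, n)
weight_kg = rng.normal(70, 12, n)
sex = rng.integers(0, 2, n)                    # 0 = female, 1 = male (hypothetical coding)
impedance_index = height_cm**2 / resistance_ohm

# Simulated "criterion" TBW, e.g., from deuterium dilution, with measurement noise.
tbw_kg = (0.55 * impedance_index + 0.10 * weight_kg + 2.0 * sex + 3.0
          + rng.normal(0, 1.5, n))

# Ordinary least squares: TBW ~ impedance index + weight + sex + intercept
X = np.column_stack([impedance_index, weight_kg, sex, np.ones(n)])
coefs, *_ = np.linalg.lstsq(X, tbw_kg, rcond=None)
predicted = X @ coefs
rmse = np.sqrt(np.mean((tbw_kg - predicted) ** 2))

print("coefficients (impedance index, weight, sex, intercept):", np.round(coefs, 3))
print(f"root-mean-square error: {rmse:.2f} kg")
```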

More precise methods to assess body composition are limited by available funding or technical experience. For imaging, the initial cost of equipment is significant, yet the quality and breadth of data collected with these methods are much greater than with less expensive methods. Although imaging collects a greater amount of data that are also more accurate, compared with other body-composition techniques, these advantages are often offset by the expense and lack of portability of the equipment, which presents challenges to using these techniques in low- and middle-income countries. DXA and CT scans also involve some, albeit minimal, radiation exposure, which raises questions about their use in certain populations, such as infants, children, older adults, and the ill.

Key considerations in data interpretation

The methods reviewed in this section include approaches that range from simple measures of nutritional status (weight, fat mass, etc.) to more refined measures of total body water to specific concentrations of nutrients in the body. Given the myriad options available to assess nutritional status, the strength of the conclusions that can be drawn from a study depends on the single method or combination of methods used. As with any research method, the choice of method or technology to use is constrained by the overall objective of the study along with available funding and logistical issues. To address these issues, we will describe the range of options based on the primary outcome assessed.

Energy intake and expenditure

One of the most pressing issues facing nutrition science is energy balance, given the continued high prevalence of undernutrition, the high prevalence of overweight and obesity, and the emerging challenge of the double burden of malnutrition. To best understand these problems and advance interventions or policies to reverse these imbalances, it is important to appreciate the nuances of methods for assessing energy balance and nutritional status. The section on Dietary Assessment methods covers the pros and cons of 24HRs and FFQs for estimating individual nutrient intake for clinical and epidemiological studies. While these methods are not as accurate as many would prefer, they do play an important role in understanding temporal changes in diet as well as changes in dietary patterns. The need for more accurate techniques is great, and many innovative technologies such as digital photos of food (482–484), internet-based applications, and food records (296, 485) are emerging as potentially useful.

Many methods used to assess energy intake and expenditure do not accurately respond to personal changes in diet and health, factors that are hallmarks of precision nutrition. For dietary intake, most methods do not have the accuracy of laboratory studies of diet, direct calorimetry, or DLW. Thus, one would not expect the outcomes of these methods—energy intake and macro- and micronutrient composition—to change significantly with minor changes in actual diet. Although major changes, such as adopting a low-carbohydrate diet, may elicit detectable shifts in the outcome indices assessed by these methods, such changes may be lost due to the lack of accuracy and never be detected at the individual level. New and more accurate methods that are more objective, such as the Veggie Meter (486), may provide some allowance for precision nutrition, but these are generally acceptable for studying only 1 nutrient or class of nutrients.

On the other hand, using DLW to assess energy intake is objective and influenced primarily by a person's diet and activity. Overall, having an accurate estimate of total energy expenditure provides a solid foundation for assessing energy balance and dietary intake. As DLW relies on respiratory exchange rates of oxygen and carbon dioxide, any physiological response that increases CO₂ production will be reflected in the total energy expenditure, making DLW extremely responsive to individual changes. However, DLW does not discriminate between the types of energy expenditure, such as basal metabolic rate or energy expenditure for physical activity.

The greatest value comes from combining methods. Using an FFQ with DLW and an activity monitor, while increasing the logistical complexity of a study to a modest but meaningful degree, greatly increases the amount and value of data collected. For example, use of an FFQ with DLW provides a level of validation that can be incorporated into statistical analyses for the dietary data. Using DLW with an activity monitor not only validates the monitor's algorithm but also provides valuable activity data that can be analyzed in conjunction with total energy expenditure. The complementary features of each method do not strengthen weaker methods but, until more accurate dietary intake methods are available, allow internal validations and cross-comparisons with other studies that may prove beneficial for making policy decisions from dietary studies.

Body composition

Methods to measure body composition range from simple anthropometrics to more technical methods. The simple methods, such as BMI and skinfold measures, are adequate for assessing large groups of people to determine the prevalence of overweight or underweight. Although there has been considerable discussion about BMI's effectiveness as a proxy for adiposity, BMI has consistently been shown to have a high correlation with body fat mass in large samples. Even so, BMI is not well suited as an assessment tool for individuals because it does not measure any component of body mass other than weight and height and is not a direct measure of body fatness. However, when used in a sample of average adults, BMI can discriminate between high and low fat mass relative to height and is useful for tracking changes in populations to determine changes in nutritional status, especially given the very low cost of measuring height and weight.

More precise measures of body composition, from BIA to advanced imaging, better inform investigators regarding body fat mass as a key outcome related to obesity-related comorbidities. Perhaps more important is that these measures can assess regional adiposity, providing key data on body fat distribution as a marker for pathogenic adipose tissue, such as visceral fat mass. Options for investigators remain broad as relatively inexpensive BIA equipment has been validated for estimating total and regional fat mass (487). Such methods are acceptable and available for high- and low-income settings. Advanced imaging from methods like DXA and CT provides the most precise estimates of body composition, but costs and infrastructure generally limit such equipment to large clinical settings. Finally, deuterium dilution is a relatively easy method to use in any setting, be it clinical or field; however, stable isotopes are limited to a 2-compartment model of fat mass and lean tissue. This method does, however, allow total body water to be assessed, which is a useful parameter for some disease states.

The body-composition measures discussed have limited interface with other methods due to the technical differences and specific outcomes of each method. However, combining less accurate methods with more precise methods, such as combining skinfold measures with total body fat mass assessed by deuterium dilution, can expand the data collected with minimal changes to protocols. Perhaps the most challenging aspect of body-composition measures is the expense and logistics of imaging in low-income settings. For these settings, it may be necessary to tolerate less accurate methods, such as BIA or skinfold thickness, and to sacrifice precision to gain compliance or improve logistics. The use of stable isotopes is a flexible approach to collect several pieces of data (i.e., energy expenditure, total body water, body composition, and tissue hydration) from 1 protocol. Adding BIA then allows body fat distribution to be assessed as an additional outcome, which is superior to anthropometrics alone.

As the objective of precision nutrition is to provide sound and accurate nutrition guidance that can be assessed using precise and responsive outcomes, it is of interest to consider how the techniques described to assess body composition fit with this approach. Broadly, the main outcomes of body-composition measures, such as fat mass and lean body mass, do not change appreciably in response to dietary or activity changes that persist for only a few days. Therefore, depending on the rate of responsiveness, these methods are not appropriate for precision nutrition if acute changes are of interest. However, with sufficient time, such as 2 wk or more, dietary and activity changes can begin to be detected as body weight changes, and such methods, including BMI and anthropometrics, are highly suitable for precision nutrition. More importantly, the use of stable isotopes and imaging allows very precise detection of changes in adipose tissue mass and adipose tissue distribution.

In closing

There is great interest in and need for highly accurate methods to assess dietary intake, energy expenditure, and body composition. The continued high global prevalence of obesity as well as the double burden of disease are among the high-priority areas of research that warrant accurate research methods. In addition, the development of national and global nutrition policy based on the best research available demands that methods be accurate, reproducible, and usable by investigators throughout the world, independent of wealth. Indeed, the coronavirus disease 2019 (COVID-19) pandemic has revealed stark disparities between countries in terms of responding to the pandemic as well as acquiring and distributing vaccines. The spillover effect of the pandemic on food security and nutritional status is problematic, and the ability of any 1 country to investigate these effects should not be limited to the wealthiest.

Although the methods available to date do fulfill these larger needs, the nutrition community has relied on dietary intake methods that are not highly accurate. The ability to measure total daily energy expenditure and physical activity remains in the hands of those with the technical expertise and financial ability to use DLW or activity monitors. National surveys continue to use BMI as a primary marker of nutritional status despite the fact that it is a proxy for body composition and cannot be used for more detailed research on body composition. On the other hand, the cost of using imaging or stable isotope methods may hamper cross-country comparisons.

Despite these challenges, there is a history of cooperation and collaboration between rich and poor countries, and this history clearly shows that important research questions can be addressed regardless of national income. What has not been solved is how to improve the accuracy of dietary methods to the level of methods used in energy expenditure and body composition. Developing new, more accurate dietary intake methods remains a goal for many nutrition scientists across the globe. At the same time, current methods are reliable for studying temporal changes within communities and can be used with some accuracy to study the influence of shifts of foods, diets, or dietary patterns on health. What remains for the future is to find the best ways to combine existing technology, such as phone apps, with highly accurate methods like DLW to develop methods that are inexpensive, accurate, and transferable. This will allow nutrition policies to be based on sound evidence with minimal bias.

Cross-cutting Considerations

Introduction

The sections thus far have outlined some of the many methodological domains that nutrition research encompasses. Yet, despite many aspects being shared in separate sections, there are several overlapping, cross-cutting considerations that apply to the advancement of nutrition science over the next few decades. In this section, we highlight some cross-cutting methods and principles to emphasize that nutrition science is not, and cannot be, a siloed discipline if it is to advance our understanding of the causes and correlates of nutrition and health.

Cross-cutting methodological considerations

Precision nutrition is, at its root, a causal proposition: if a person consumes a particular food or diet, will it result in better health, help avoid disease development, and/or lead to a reduction in morbidity and mortality from diet-related diseases? Will it do so within one's geopolitical, economic, or social context? These causal questions are implicit at the heart of much research on nutrition and health but are not always stated clearly. The methods used often look at average causal effects and associations, rather than the expectation of what would happen to a given person because of a specific intervention. Indeed, the expectations (in both the statistical and colloquial sense) hinge on the exchangeability principle that the expected effect of an intervention is identical across people, with random noise explaining any differences among them. Yet, in many studies, there are individuals who appear to respond to a treatment or diet while others do not respond (or are adversely affected). Rather than try to measure average effect, attention should be paid to the differences in responses between individuals and in identifying outcome patterns that could provide insight into this variability. Early advances in precision nutrition recommendations have built on repeated refinements of this exchangeability by whittling down subgroups: perhaps men respond differently than women or older people respond differently than younger adults or children.

At the extreme of precision nutrition is the idea that everyone is idiosyncratic, such that exact predictions are dependent entirely on an individual's circumstances. Although this is likely to be technically true, we can most likely find some degree of exchangeability for which the idiosyncrasies result in minor deviations from a broader prediction of individualized effect. For instance, dietary intervention A may work slightly less well for person 1’s diabetes than for person 2, but that difference is an order of magnitude less variable than the effect of dietary intervention A versus B. These factors require moving beyond average causal effects, which are the common endpoints of many randomized trials and observational studies, and instead investigating whether deviations around the average really are random noise or whether the differences have informative, actionable causes. Individual data points previously disregarded as outliers are as valuable as those that approach a mean effect.

Below is a discussion of several cross-cutting methodological considerations that may help identify, predict, and test factors that can lead to a more personalized understanding of the effects of nutrition on health.

’Omics, big data, data mining, machine learning, and AI

Throughout several of the above sections, ’omics and other large, multidimensional data are referenced. The coherent use of multimodal data, in which data of different forms and origins are synthesized, is of increasing interest. Utilizing so-called “big data” involves a series of considerations, quite a few of which start with V, including volume (the amount of data), velocity (how much data are generated per unit time), variety (harmonization and structure of data, or lack thereof), and veracity (quality of data), among others. The challenges facing nutrition science from big data, machine learning, and AI-based inference are multifold.

The quality of dietary data, or its veracity, has been discussed thoroughly over the decades as well as within this document and will not be rehashed here other than to say that many approaches rely on weak proxies (e.g., self-report, food disappearance data, as-of-yet unvalidated biomarkers) that may, in turn, be cross-tabulated with incomplete nutrient databases. Similarly, measurements of other factors of potential interest to precision nutrition (e.g., older methods of DNA sequencing, unstandardized physiological and biochemical analyses, evolving microbiome technology, and varied sociodemographic evaluations) may have important limitations. Any predictions built on data, regardless of the amount and size of the dataset, inherit those limitations.

Consistency and generalizability are other important considerations. The consistency of data within a cohort may be sufficient to show similar associations, but if using different dietary assessment methodologies (for instance, FFQ vs. 24HR), the time horizon, diet captured, and other model inputs will vary. Thus, a beautiful prediction model could be constructed within a dietary methodology, but that model may be fragile in response to changes in definitions of dietary or characterization variables. A model built on one dataset is unlikely to meaningfully and quantitatively be reproduced with another dataset if the variables are built on heterogeneous definitions. Regardless, models should be appropriately validated, and yet many examples exist in the literature without appropriate model validation (488). This may become more challenging in cases of “black box” approaches sometimes common to AI or machine learning, where the starting conditions of the model may influence the fit, with the end user—and sometimes the developers—not fully understanding the inner workings of how the computer reached its conclusion.
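One practical safeguard is out-of-sample validation: fitting the model on part of the data and judging it on data it has never seen, ideally followed by external validation on an independent cohort. The sketch below illustrates internal validation with k-fold cross-validation on simulated data; the "dietary" features, the outcome, and the ridge-regression model are placeholders for illustration, not a recommendation of any particular algorithm.

```python
# Sketch of internal model validation with k-fold cross-validation.
# The simulated "dietary" features and outcome are placeholders; in practice,
# external validation on an independent cohort is also needed.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_subjects, n_features = 200, 25

X = rng.normal(size=(n_subjects, n_features))   # e.g., intake estimates, biomarkers
true_effects = np.zeros(n_features)
true_effects[:3] = [0.8, -0.5, 0.3]             # only a few features matter
y = X @ true_effects + rng.normal(scale=1.0, size=n_subjects)

model = Ridge(alpha=1.0)

# Apparent (in-sample) fit versus cross-validated performance
model.fit(X, y)
apparent_r2 = model.score(X, y)
cv_r2 = cross_val_score(model, X, y, cv=5, scoring="r2")

print(f"apparent R^2: {apparent_r2:.2f}")
print(f"5-fold cross-validated R^2: {cv_r2.mean():.2f} (+/- {cv_r2.std():.2f})")
```

The gap between the apparent and cross-validated fit is one simple signal of overfitting; external validation on a cohort with independently defined variables remains the stronger test.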

Establishing causation is also a challenge. Once a robust model is defined, there is still the challenge of unpacking whether the model is causally related to the outcome of interest. Blood cholesterol is a longstanding example of this. For decades, blood cholesterol has been known to be associated with, or to predict, a higher likelihood of cardiovascular disease, yet arguments ensued as to whether it was causally related to cardiovascular disease. With much research, the advent of statins, and advanced observational analysis techniques like Mendelian randomization, the causal evidence has continued to mount and be refined (e.g., moving from total cholesterol to implicating particular lipoprotein subclasses). Note that this effort was all aimed at investigating a single biomarker. To establish the causal relationship of composites of machine-selected biomarkers will be more challenging still. Nonetheless, if prediction is what is first sought (e.g., who will have a coronary event or develop cancer) rather than how to intervene, then these models will likely be more powerful than the past approach of single-variable selection and dimension-reduction techniques. However, prediction then needs to be communicated as such, rather than implying causation (see “Fit for purpose” section).

Measurement

Nutrition science should work to lead the implementation of advanced methods to improve prediction and causal inference in a discipline where many of the pressing questions involve understanding a complex exposure with long-term (sometimes lifetime or intergenerational) outcomes. The dichotomy between observational and randomized trials fails to capture the complexities of nutrition science and the tools at our disposal (489). There are improvements and advances in experimental and observational designs and analyses that can better answer pressing questions. Moving beyond ordinary association tests or single-paradigm parallel-arm trials will allow stronger inference that is more appropriate to the question at hand (490) (see also the “Fit for purpose” section). It is important to keep in mind that exposure to food spans the lifespan and that people need to eat; it is not a choice as it might be for other exposures such as drugs or smoking. This adds to the complexity of studying food intake or dietary patterns and their impact on health.

Observational studies

Observational studies continue to be an integral part of nutrition research, especially as cohorts of individuals expand in number and are observed over longer periods of time. Results from these studies are valuable for laying the groundwork for developing hypotheses that can be tested in cells, animals, or humans, and for providing general information about populations that may help with targeting interventions or changing policies. Concerns arise when information obtained from observational studies is uncritically used to imply causality—especially when the sample size is so large that a relationship is deemed statistically significant—without taking into consideration confounding, variability, and potential clinical relevance. Rather than always seeking bigger samples, targeting specific subpopulations or applying greater focus on representative samples may be appropriate.

Experimental design

Improvements in experimental design depend on the nature of the proposed intervention and the outcome. When outcomes have chronic symptoms (e.g., pain in arthritis), are characterized by a manipulable measurement (e.g., weight change for obesity), or when there are potentially strong biomarkers of disease progression (e.g., lipoprotein profiles for cardiovascular disease, HbA1c for diabetes control), it is possible to attempt multiple interventions for the same person. Conversely, for chronic binary outcomes (e.g., some cancers), such manipulation of conditions is less feasible. In a sense, this repeated assignment occurs in practice in the clinical setting, with physicians taking people on and off medications to see how symptoms progress. It is important to recognize that, in clinical practice, the health care provider is focused on an n-of-1, the patient, and applies a treatment based on the body of evidence available at that time. Therefore, rigorous clinical research is important to contribute to a sound body of evidence and recommendations for implementation in the clinical care setting. The order in which people are taken on and off medications, the way dietitians recommend diet changes, and the way any other repeated intervention is applied are rarely structured so that the effects can be reliably determined. However, this is not necessarily the responsibility of a practitioner unless a study is being conducted. A clinician uses science, intuition, and experience to provide the best treatment possible but may never know what actually worked. That is, changing a medication may co-occur with spontaneous remediation of symptoms or with another lifestyle change that actually explains the resolution of symptoms.

Better understanding of the differences between providing health care and generating data to inform decisions is needed. Newer experimental designs can cut through this noise, both for precision nutrition and for determining what orders of interventions to try.

Single-case designs

Precision nutrition acknowledges that how people respond to different foods exhibits wide individual differences that can vary based on genetic makeup, biology, lifestyle, environment, and exposures. This variability in response to an exposure such as diet is known as heterogeneity of treatment effect (HTE; also called treatment response heterogeneity). RCTs provide an unbiased estimate of the average effect of a diet on outcomes. However, these average effects have error terms because of random noise as well as the potential for HTE (491). Unfortunately, random noise and HTE are conflated in a typical RCT. That is, it is unclear whether someone with an extreme change responded differently (i.e., HTE), or if their change in outcome was different by random chance. To determine whether, and to what extent, people respond differently to interventions, different designs need to be used. Research on HTEs has not only shown outcome variability in RCTs but has revealed that, in some studies, there may be very few people who actually gain the desired treatment effect (492). Research can assess moderators of treatment response to better isolate individual differences that may predict differential response to a diet. However, even this approach may not limit response variability or reduce variability for subgroups in comparison to the whole group. Although not realistically obtainable, it would be ideal to identify unique patient groups that respond to unique interventions with very limited variability between people in each group.
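One simple, hedged way to probe whether apparent individual differences reflect more than random noise in a parallel-group trial is to compare the variability of outcome changes in the intervention arm with that in the comparator arm: if the treatment effect were identical for everyone, the two variances should be similar, and any excess in the intervention arm gives a crude estimate of the SD of individual responses. The sketch below applies this comparison to simulated data; it is illustrative only and does not replace the designs discussed next.

```python
# Sketch: crude check for heterogeneity of treatment effect (HTE) in a
# parallel-group trial by comparing the SD of outcome change between arms.
# Simulated data; the "true" individual-response SD is set by the simulation.
import numpy as np

rng = np.random.default_rng(7)
n_per_arm = 150

measurement_noise = 1.5        # within-person/random variation (both arms)
mean_effect = -2.0             # average treatment effect
individual_response_sd = 1.0   # true HTE in the simulation

control_change = rng.normal(0, measurement_noise, n_per_arm)
treatment_change = (mean_effect
                    + rng.normal(0, individual_response_sd, n_per_arm)  # HTE
                    + rng.normal(0, measurement_noise, n_per_arm))      # noise

sd_tx = treatment_change.std(ddof=1)
sd_ctl = control_change.std(ddof=1)
excess_var = sd_tx**2 - sd_ctl**2
estimated_hte_sd = np.sqrt(excess_var) if excess_var > 0 else 0.0

print(f"SD of change, treatment arm: {sd_tx:.2f}")
print(f"SD of change, control arm:   {sd_ctl:.2f}")
print(f"estimated SD of individual responses: {estimated_hte_sd:.2f}")
```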

Specific experimental methods are needed to determine how to optimize the diet for an individual (493–495). These designs can be borrowed from personalized n-of-1 designs used in medicine and behavioral sciences to identify an optimal treatment regimen for an individual and then to test whether outcome changes are, in fact, due to the dietary approach being studied (494). These designs belong to the family of single-case experimental designs (SCDs), which have been used for decades in behavioral sciences and medicine (496, 497). They are also considered the pinnacle of evidence for individual cases (498), even more so than RCTs, due to the challenge of HTE. However, they are seldom, if ever, used in nutrition research (495). We will briefly discuss the theory of these designs and their strengths and weaknesses for nutritional science.

SCDs are randomized, multiphase experimental designs in which a great deal of data are collected on 1 person under a variety of experimental conditions. While many types of SCDs can be used in nutrition research, we focus on reversal designs and multiple baseline designs. These 2 designs focus on demonstrating experimental control of the diet in relationship to outcomes but differ in the number of participants needed to show treatment effects. They both require repeated measurements until a stable outcome estimate is observed or the outcome begins moving away from the desired results. Reversal and multiple baseline designs both replicate the intervention to ensure confidence that the dietary intervention was the cause for the change (497, 499). These designs can also require clinically relevant, as opposed to statistically relevant, changes (500). In SCDs, the desired clinical effect can be prespecified and considered when interpreting the effect's relevance for clinical practice. Rather than focusing on 1 outcome, SCDs enable simultaneous measurements of multiple dependent outcomes and assessment of the dietary effects on these outcomes (500). SCDs can be used to test whether a variable mediates the change in the dependent outcome variables (501, 502). Finally, statistical approaches can be applied to ensure that changes observed between experimental phases are unlikely to result from random variation or to aggregate data across many n-of-1 trials to assess generalizability across a broader population sample (497, 503–506).

Reversal SCDs

A reversal design collects outcome data (e.g., biological, gene expression, microbiome, affect, taste) in at least 2 phases: a baseline, or usual, diet phase (A) and the experimental phase (B), when the experimental diet is administered. At least 3 replications of effect are needed to demonstrate experimental control. The experimental design A1B1A2B2 constitutes 3 replications (A1 vs. B1, B1 vs. A2, A2 vs. B2). Each phase is carried out until a prespecified endpoint is met (e.g., absence of trends in the outcome in the direction of the desired effect; a particular clinically relevant outcome; a prespecified amount of time). This design can be extended by comparing multiple diets. The diet order should be randomized, especially if the goal is to combine n-of-1 experiments across participants.
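The replication-of-effect logic for an A1B1A2B2 sequence can be made explicit. The following sketch, with hypothetical phase means, simply counts how many adjacent-phase comparisons move in the hypothesized direction; the function name and the decision rule (requiring all 3 comparisons to be consistent) are illustrative assumptions rather than a prescribed analysis.

```python
def replications_of_effect(phase_means, direction="decrease"):
    """Count adjacent-phase comparisons consistent with the hypothesized effect in an
    A1-B1-A2-B2 reversal sequence (A = usual diet, B = experimental diet).

    With 4 phases there are 3 adjacent comparisons (A1 vs. B1, B1 vs. A2, A2 vs. B2);
    demonstrating experimental control conventionally requires the effect to replicate
    in all 3.
    """
    labels = ["A1", "B1", "A2", "B2"]
    consistent = 0
    for i in range(1, len(labels)):
        prev, curr = phase_means[i - 1], phase_means[i]
        entering_b = labels[i].startswith("B")
        if direction == "decrease":
            as_expected = curr < prev if entering_b else curr > prev
        else:
            as_expected = curr > prev if entering_b else curr < prev
        consistent += int(as_expected)
    return consistent

# Hypothetical mean outcome per phase (e.g., average daily energy intake in kcal)
print(replications_of_effect([2300, 1950, 2280, 1900]))  # prints 3
```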

While most dietary experiments using food cannot be adequately blinded, experiments testing supplements can be implemented in a double-blind fashion, as occurs in placebo-controlled SCDs in medicine. Even if the experimenter and participant are not blinded to the dietary component being manipulated, it is good practice to blind the person collecting the data to the manipulations and expected hypotheses. Objective continuous measures, such as continuous glucose monitors or actigraphy, also make it easier to blind the data collector and thereby reduce sources of bias.

One advantage of the SCD reversal design is its ability to experimentally show that a particular diet was functionally related to a particular change in an outcome variable for an individual person. This represents the core principle of precision nutrition: identifying an optimal diet for a person. These designs work well for studying effects on individual people and for studying the effect of nutritional interventions on rare diseases (494), for which it is difficult to collect enough participants with similar characteristics for an RCT. A final strength is the opportunity for the nutritional scientist or clinical nutritionist delivering clinical care to translate basic science findings or new findings from RCTs into beneficial treatments for their patients (494).

However, there are limitations. First and foremost, the SCD reversal design does not immediately contribute generalizable knowledge when used on a single individual unless processes are standardized and harmonized across replicate individuals. In addition, a critical requirement of a reversal design is that, when the dietary intervention is removed, the outcome returns to baseline levels. If the effect is not quickly reversible, then these designs are not relevant. If the half-life of the effect on the outcome is known, a washout period can be used to provide time for the reversal between phases. Another limitation is that the dietary intervention must have a relatively immediate effect on the outcome. If it takes many weeks to months for the dietary change to show effects, the reversal design may not be optimal unless the investigator can plan a very long study. Because this study design depends on comparing stable data across conditions, it is not appropriate if stability cannot be achieved, which can occur for a variety of biological or environmental reasons. Finally, for a clinical situation in which a nutrition scientist is attempting to identify whether a particular diet works for a very ill patient, a reversal design is not optimal given the need to return to placebo or baseline conditions.

Multiple baseline SCDs

An alternative to a reversal design is the multiple baseline design, which involves testing a new intervention, or comparing interventions, across people. In the basic case, conditions are not reversed but, rather, baselines last for different amounts of time. If an outcome does not change until an intervention is presented, and this is replicated over several people, the change is likely due to the experimental condition. In some ways, this is analogous to a stepped wedge design, in which the crossover is only in 1 direction: no intervention in period A to intervention in period B. For example, 4 participants could be studied over different baseline lengths. The baseline length must be long enough to show stability, with no trend toward improvement until the intervention is implemented. For generalizable estimates of causation, assignment of participants to baselines of different lengths should be randomized. If all baselines are stable before the intervention is introduced and each person's outcome(s) change after the intervention across the 4 different staggered baselines, it is reasonable to conclude the intervention was the reason for the change. The number of people (replicates) needed to draw conclusions about diet effects depends on the effect size. For multiple baseline designs, replications are across, rather than within, participants. Multiple baseline designs can incorporate reversals, which combines the strengths of reversal and multiple baseline designs (500). Multiple baseline designs can also compare the effectiveness of more than 1 diet and assess the incremental effects of adding additional dietary changes or nondietary changes such as exercise programs. When studying more than 1 diet, it is ideal to randomize which diet comes first (497). Data from multiple baseline designs can also be aggregated across participants and studies to examine generalizability.
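The core logic of a multiple baseline design (randomized, staggered baseline lengths and a check that each participant's outcome shifts only after that participant's intervention begins) can be sketched as follows. The participant identifiers, baseline lengths, stability threshold, and outcome values are all hypothetical.

```python
import random
import statistics

def assign_baseline_lengths(participant_ids, lengths=(4, 6, 8, 10), seed=3):
    """Randomly assign staggered baseline lengths (here, weeks of repeated
    measurement) to participants in a multiple baseline design."""
    rng = random.Random(seed)
    shuffled = list(lengths)
    rng.shuffle(shuffled)
    return dict(zip(participant_ids, shuffled))

def changed_only_after_intervention(series, baseline_len, min_shift=1.0):
    """Check that an outcome was stable during baseline and shifted only after the
    intervention started (a simple level-change criterion)."""
    baseline, intervention = series[:baseline_len], series[baseline_len:]
    stable_baseline = statistics.pstdev(baseline) < min_shift
    shifted = abs(statistics.mean(intervention) - statistics.mean(baseline)) >= min_shift
    return stable_baseline and shifted

print(assign_baseline_lengths(["P1", "P2", "P3", "P4"]))
# Hypothetical weekly outcome for one participant with an 8-week baseline
series = [5.1, 5.0, 5.2, 5.1, 5.0, 5.1, 5.2, 5.0, 3.8, 3.6, 3.7, 3.5]
print(changed_only_after_intervention(series, baseline_len=8))
```

Repeating the second check for each participant, each at their own randomized baseline length, is what allows the change to be attributed to the intervention rather than to time.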

Multiple baseline and reversal designs can be useful for adapting basic science findings to clinical interventions and for early-phase translational research (often called pilot studies) as a step towards a fully powered RCT. These approaches may have advantages over typical small-sample pilot RCTs in providing specificity and inherently addressing HTEs.

Of course, there are limitations to multiple baseline designs. First, they may keep people in baseline or control conditions for extended periods. This is not unique to multiple baseline designs, because many RCTs keep people in these conditions for the entire study. When implementing multiple baseline designs, it is possible to begin the intervention with the first person or group of people who show stability of baseline data and then implement the diet sequentially as more people show baseline stability. This creates an inherent bias, which can be reduced if the order of implementation across the different staggered baselines is randomized.

There are limitations common to both reversal and multiple baseline designs. For example, some dependent measures may change with repeated testing, or are reactive to repeated testing, independent of the intervention. If a measure changes with repeated testing in the absence of an intervention, it is not useful for an SCD. Given that the effects of diets are evaluated over time, systematic environmental changes or maturation could influence the relationship between diet and outcome, obscuring the effect of the dietary intervention. Because SCDs rely on repeated measures and a detailed study of the relationship between treatment and outcome, studies that use dependent measures that cannot be sampled frequently are not candidates for SCDs. Likewise, failure to identify a temporal relationship between treatment introduction and initiation of change in the outcome can make it challenging to attribute change to the diet. There is always the possibility that a confounding variable is associated with introduction or removal of the diet, which may lead to inappropriate conclusions about the effects of the diet. Dropout or uncontrolled events that occur to individuals can also introduce confounding. These problems are not unique to SCDs and can occur with RCTs.

While SCDs have been used for decades to identify effective interventions across disciplines, they have not yet found a home in nutritional sciences (495). Research suggests that it can take more than 15 y for new developments in medicine to trickle down to practicing physicians, and it is likely that many important developments in nutrition are not implemented rapidly enough, resulting in fad diets taking the place of science (501). The ability to test new developments in science or even fad diets using scientific principles could speed up translation of many important developments into practice. SCDs offer a unique experimental approach to identify optimal dietary approaches for individual cases because they can be accumulated in meta-analytic fashion to generalize effects across people (505, 507) and represent a flexible approach to early phase (pilot) translational research (500). The use of SCDs may speed up discovery of dietary approaches for specific outcomes to improve public health, leading to the next generation of dietary interventions with maximal effects for individual cases, as well as to early-phase translational research by gleaning generalizable principles that apply to population subgroups.
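As one way to aggregate n-of-1 results in meta-analytic fashion, per-person effect estimates and their variances can be pooled with a standard random-effects model. The sketch below uses the DerSimonian-Laird estimator with hypothetical effect estimates; in practice, the per-person effects and variances would come from each trial's own analysis.

```python
def pool_n_of_1_effects(effects, variances):
    """Random-effects (DerSimonian-Laird) pooling of per-person effect estimates
    from a series of n-of-1 trials, to ask how well the effect generalizes.

    effects: within-person effect estimates (e.g., B-phase minus A-phase means)
    variances: their estimated sampling variances
    Returns (pooled effect, between-person variance tau^2).
    """
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c) if c > 0 else 0.0
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

# Hypothetical per-person diet effects (mg/dL change) and their sampling variances
print(pool_n_of_1_effects([-8.0, -3.0, -10.0, 1.0], [4.0, 6.0, 5.0, 4.5]))
```

A large tau^2 relative to the pooled effect would itself be evidence of the treatment response heterogeneity that motivates these designs.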

Adaptive designs

One challenge to traditional, parallel-arm, randomized controlled designs is that people are kept on assigned treatments even with no evidence of improvement in a condition. For instance, in a weight-loss trial, individuals may not lose weight or may even gain weight on a particular intervention, and thus participants, clinicians, and researchers may be uncomfortable keeping the participants on the intervention. This discomfort stems from an individual being considered a nonresponder (that is, someone for whom the intervention appears to have had no causal effect). As outlined above, it can be impossible to determine in a standard RCT whether someone's unfavorable outcome after an intervention is because of random noise or HTE. Nonetheless, theory around causal inference is no solace for people who are expecting success, whether in the clinic or in a trial.

This desire to address an unfavorable outcome can be met with Sequential Multiple Assignment Randomized Trial (SMART) designs (508). A SMART design often begins like a typical RCT, in which a group of people are randomized to competing conditions (e.g., diet A vs. diet B). The individuals are followed long enough to classify them as responders or nonresponders (although these terms can be misnomers, given that a nonresponder might have fared even worse without the treatment). When the a priori classification point is reached, the subsequent intervention allocation is conditional on the first-phase results. For example, responders could be assigned to continue their assigned diet from the first phase or be randomized to competing diets or other therapies, whereas nonresponders could be randomized to alternative therapies.
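The two-stage allocation logic of a SMART design can be expressed compactly. In the sketch below, the first-stage diets, the second-stage options, and the response rule are hypothetical placeholders; an actual trial would prespecify the response classification and the full set of adaptive options in its protocol.

```python
import random

def smart_allocation(participants, responded, seed=11):
    """Sketch of a 2-stage SMART allocation.

    Stage 1: randomize each participant to diet A or diet B.
    Stage 2: responders are re-randomized to continue or switch within first-line
    options; nonresponders are re-randomized to alternative (rescue) options.
    `responded` is a caller-supplied function of (participant, stage1_arm) -> bool,
    standing in for the prespecified response classification.
    """
    rng = random.Random(seed)
    plan = {}
    for p in participants:
        stage1 = rng.choice(["diet A", "diet B"])
        if responded(p, stage1):
            stage2 = rng.choice([f"continue {stage1}", "switch to other first-line diet"])
        else:
            stage2 = rng.choice(["alternative diet C", "add behavioral therapy"])
        plan[p] = (stage1, stage2)
    return plan

# Toy response rule: pretend even-numbered IDs respond to their first-stage diet
def toy_response(pid, arm):
    return pid % 2 == 0

print(smart_allocation(range(6), toy_response))
```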

SMART designs can be valuable both for generating generalizable knowledge of effects and for establishing clinical decision flows. For example, in the 2-stage design outlined above, the head-to-head average treatment effects of diet A versus diet B would be tested in the first phase, similar to a parallel-arm RCT. If most (or all) people on diet B were nonresponders, this diet may have low clinical utility and may not be recommended. Because the second phase depends on the responder or nonresponder status from the first phase, this design takes personalized information into consideration when making decisions for the next phase, making subsequent decisions more personalized.

Formally testing predictive algorithms

Predictive algorithms provide some information a priori about which nutritional intervention may be beneficial for a given individual. However, predicting that someone with a given set of characteristics is likely to benefit from a particular diet is not the same as demonstrating that someone given that diet actually improves. Causality, rather than prediction, is preferable for guiding policy and dietary recommendations (509). In a simplistic example, Chiolero (509) laid out the relationship between changing obesity and changing disease: people may be able to decrease obesity through changes in diet or physical activity, but those same interventions may also directly affect disease, raising the question of whether changing obesity was necessary. The counterfactual is also essential: increased smoking will decrease obesity but is also known to increase multiple diseases. Thus (in this example), changing obesity is not necessary to change disease. The same reasoning might apply to any of the many predictive elements related to nutrition. Food insecurity predicts some outcomes, yet an intervention providing food or resources in Mexico resulted in greater weight gain (510). Certain microbiota profiles predict outcomes, but altering the microbiota in preferred ways can sometimes be difficult, and the question remains whether the effect is due to the microbiota or to the means used to change the microbiota.

Understanding causal pathways is important, but ultimately what is needed is a formal test of any recommendation. These predictive algorithms can be formally tested, potentially through adaptation of standard RCT designs. Consider a predictive algorithm that suggests people with characteristic type A may fare better on a high-carbohydrate diet, while those with characteristic type B may fare better on a high-fat diet. In such a case, what individuals are randomized to would depend on the question at hand. If the question is whether the algorithm is superior to standard dietary advice, then 1 group of people can be randomized to the algorithm and the other to standard dietary advice. Even though people in the algorithm group may be receiving different interventions, the question is about the utility of using the algorithm for superior outcomes. If the question is whether the algorithm classification scheme predicts better outcomes within people, then individuals could first be stratified by the algorithmic decision (that is, people are first divided into types A and B) and then randomized to either match or mismatch with the predicted diet. Indeed, this is related to the prediction tested by the Diet Intervention Examining The Factors Interacting with Treatment Success (DIETFITS) trial, which compared the effects of low-fat and low-carbohydrate diets in people with genotypic or insulin-secretion profiles predicted to respond differently to the diets (511).
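The stratify-then-randomize (match vs. mismatch) logic described above can be sketched as follows. The biomarker, its threshold, the diet labels, and the cohort are hypothetical and stand in for whatever prediction an actual algorithm would provide; only the overall allocation structure is the point.

```python
import random

def match_mismatch_assignment(cohort, predict_type, seed=5):
    """Sketch of a trial that formally tests a predictive algorithm.

    Participants are first stratified by the algorithm's prediction (here, type A ->
    high-carbohydrate diet and type B -> high-fat diet, purely for illustration),
    then randomized within strata to receive either the matched or the mismatched
    diet. Better outcomes with matching than mismatching would support the
    algorithm's clinical utility.
    """
    rng = random.Random(seed)
    predicted_diet = {"A": "high-carbohydrate", "B": "high-fat"}
    other_diet = {"high-carbohydrate": "high-fat", "high-fat": "high-carbohydrate"}
    assignments = []
    for person in cohort:
        ptype = predict_type(person)          # algorithm output: "A" or "B"
        matched = rng.random() < 0.5          # randomize to match vs. mismatch
        diet = predicted_diet[ptype] if matched else other_diet[predicted_diet[ptype]]
        assignments.append({"id": person["id"], "type": ptype, "matched": matched, "diet": diet})
    return assignments

# Toy algorithm: classify by a single hypothetical biomarker threshold
def toy_algorithm(person):
    return "A" if person["insulin_30"] < 60 else "B"

cohort = [{"id": i, "insulin_30": v} for i, v in enumerate([45, 72, 58, 90])]
print(match_mismatch_assignment(cohort, toy_algorithm))
```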

Although we covered only a few examples of atypical, randomized designs, it is clear that large, parallel RCTs are not the only means of determining causation and that causal effects of dietary factors can be probed at levels more specific than population-wide average effects. Whether through stratification by important factors (e.g., genetics, ethnicity, sex, age), sequential intervention steps based on individual-level responses, predictive subgrouping through algorithms, or even repeated individual experimentation through n-of-1 studies, increasing personalization can be obtained with stronger causal interventional evidence. These approaches can even be combined. An algorithm could be built based on short-term n-of-1 results, with the first phase of recommendations being dependent on algorithm results followed by adaptation if the individual does not respond favorably to the predicted diet. Unfortunately, much like other evidence of average causal effects, the longer it takes for an outcome to develop or change, the more difficult it is to establish evidence for personalized effects.

Cross-cutting principles

All the nutrition science methods and domains of inquiry presented thus far will be more impactful if requisite experts are incorporated in an interdisciplinary endeavor; science is conducted rigorously and openly; and research is conducted and communicated in a way appropriate to its purpose. We present these as 3 cross-cutting principles: Interdisciplinarity as a Skill; Rigorous, Reproducible, and Open Nutrition Science; and Fit for Purpose.

Interdisciplinarity as a skill

A common refrain when a limitation in science is identified is to suggest more training: more statistics, more biochemistry, more food science, more psychology, more sociology, more study design, and others. However, more training is not always an efficient solution because programs are already overloaded and training someone to work in isolation does not advance interdisciplinary science.

The science, technology, engineering, arts, and math (STEAM) approach to education works to integrate knowledge across domains within and beyond science, technology, engineering, and math (STEM). That is, rather than expecting every domain to be a silo (or even STEM siloed from non-STEM), knowledge and ways of thinking can be integrated. Instead of having people learn more in each domain, “better preparing students to operate in a truly interdisciplinary team may alleviate the need for deep knowledge of everything” (512).

In the spirit of STEAM, embracing team science is important and may require a significant culture shift in education, promotion and tenure, proprietary agreements, and ranges of disciplines engaged. As former US President Harry Truman is credited with saying, “It is amazing what you can accomplish if you do not care who gets the credit.” Yet, credit is needed—at a minimum as a motivator and for determining responsibility. In publication, acknowledgment of contribution allows individuals to be recognized for what they did to make a publication a reality, giving credit where credit is due (513). This contrasts starkly with the oft-used standards in biomedical disciplines where authorship (often first, last, and corresponding) is used as a proxy for academic credit, leaving contributions to be inferred or uncritically counted based on author order. Complex work may be conducted by a team of dozens, leaving authors who are not listed as first, last, or corresponding to explain their contributions through other means.

Another key will be harmonizing datasets and storage so that data scientists can easily work across studies, particularly when a discipline has matured to the point that common variables are well established. The Accumulating Data to Optimally Predict Obesity Treatment (ADOPT) standards in obesity research, for instance, are a set of variables that many studies could collect to allow cross-study synthesis (514). Interdisciplinary teams will be essential to meet the rising expectations of quality, sharing, and specificity in nutrition science. Complying with FAIR (Findable, Accessible, Interoperable, and Reusable) principles to make nutrition science open will require data science knowledge; designing and analyzing personalized studies appropriately will require statistical expertise; adequately defining nutritional interventions will require nutrition and food science experience; adequately operationalizing outcome variables will require expertise in the outcomes of interest; and so forth. It is important to have team members with cross-cutting expertise to serve as interlocutors across domains and experts with deeper domain-specific expertise on each key part of a study.
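A small illustration of what harmonizing datasets to a shared, ADOPT-style schema involves is mapping each study's variable names and units onto common names and units. The study identifiers, variable names, and example records below are hypothetical; only the pound-to-kilogram and kilojoule-to-kilocalorie conversion factors are standard.

```python
# Study-specific variable names and units mapped onto a shared, ADOPT-style schema
STUDY_MAPS = {
    "study_1": {"wt_lb": ("body_weight_kg", lambda x: x * 0.453592),
                "kcal": ("energy_intake_kcal", lambda x: x)},
    "study_2": {"weight": ("body_weight_kg", lambda x: x),
                "energy_kj": ("energy_intake_kcal", lambda x: x / 4.184)},
}

def harmonize(record, study_id):
    """Rename and unit-convert one participant record into the shared schema."""
    out = {}
    for raw_name, value in record.items():
        common_name, convert = STUDY_MAPS[study_id][raw_name]
        out[common_name] = round(convert(value), 1)
    return out

print(harmonize({"wt_lb": 180, "kcal": 2100}, "study_1"))
print(harmonize({"weight": 75.0, "energy_kj": 8800}, "study_2"))
```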

Rigorous, reproducible, and open nutrition science

Considerations for rigor, reproducibility, and open science are advancing across many domains. In a 2018 report, the National Academies stated, “Open science aims to ensure the free availability and usability of scholarly publications, the data that result from scholarly research, and the methodologies, including code or algorithms, that were used to generate those data” (515). Efforts to make data and science open have occurred across sectors. The USDA FoodData Central is evolving to provide more transparent data on components, including nutrients, in foods so people understand that a single value does not necessarily represent what is in a consumed food. NIH has explicitly made efforts and expressed interest in enhancing rigor, reproducibility, and transparency. The NIH National Heart, Lung, and Blood Institute's (NHLBI's) Biologic Specimen and Data Repository Information Coordinating Center includes data from multiple cohorts. The Transparency and Openness Promotion guidelines have worked to create common standards across journal publications (516). These efforts will help science in general but are also increasingly important as larger data, more complex algorithms, and more refined questions are necessary to answer the many personalization questions tackled by the nutrition science community.

Transparent use of definitions is also key. Ludwig et al. (517) recommend to “Define diets more precisely when feasible (e.g., with quantitative nutrient targets and other parameters, rather than qualitative descriptors such as Mediterranean) to allow rigorous and reproducible comparisons.” Disagreements around definitions occur in nutrition research for all but chemically defined nutrients, and even there distinctions can extend down to the weight of isotopes (consider C3 and C4 plants for 12C and 13C carbon isotope ratios, or hypotheses that heavy water influences metabolism). Thus, terms like breakfast, processed foods, Mediterranean diet, time-restricted feeding, and their components must be adequately and transparently described. Distinctions among definitions can be substantial. For instance, discussions of purported dietary lipid effects (e.g., low-fat vs. high-fat diets) are further broken down into questions of fatty acid saturation, poly- versus mono-unsaturation, and, further still, positional and geometric isomerization of fatty acids. Just as key is the operationalization of outcomes, including meaningful definitions of success for obesity, diabetes, cardiovascular disease, nutrient sufficiency, and other endpoints.
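In the spirit of the recommendation to define diets with quantitative targets rather than qualitative labels, a diet definition can be written down as an explicit, machine-readable object against which observed intakes are checked. The cut points in the sketch below are purely illustrative and are not drawn from any specific protocol.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DietDefinition:
    """Quantitative, reproducible diet definition instead of a qualitative label."""
    name: str
    carbohydrate_pct_energy: tuple  # (min, max) percent of total energy
    fat_pct_energy: tuple
    protein_pct_energy: tuple
    fiber_g_per_1000_kcal: float

# Illustrative targets only; actual cut points should come from the study protocol.
LOW_CARB = DietDefinition("low-carbohydrate", (5, 25), (50, 70), (20, 30), 10.0)

def meets_definition(diet, intake):
    """Check whether an observed intake satisfies the quantitative definition."""
    lo_c, hi_c = diet.carbohydrate_pct_energy
    lo_f, hi_f = diet.fat_pct_energy
    lo_p, hi_p = diet.protein_pct_energy
    return (lo_c <= intake["carb_pct"] <= hi_c
            and lo_f <= intake["fat_pct"] <= hi_f
            and lo_p <= intake["protein_pct"] <= hi_p
            and intake["fiber_per_1000_kcal"] >= diet.fiber_g_per_1000_kcal)

print(meets_definition(LOW_CARB, {"carb_pct": 20, "fat_pct": 55, "protein_pct": 25,
                                  "fiber_per_1000_kcal": 12}))
```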

The need for transparency in nutrition data and the science behind dietary recommendations is increasingly valued. Although important limitations exist with respect to how data are made available (e.g., participant confidentiality), sharing algorithms, even when data cannot be shared, would improve a reader's understanding of how conclusions were reached. This includes explaining how dietary variables were constructed or transformed, how statistical analyses were conducted, and other information. The default should move toward open data and code, with careful consideration of how to evaluate past research built on closed methods or data and thoughtful investment in moving open research forward.
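As a small example of the kind of transparency in variable construction described above, the sketch below documents exactly how an energy-adjusted dietary variable is built (here, the nutrient-density method, amount per 1000 kcal). The function name and example values are illustrative; the point is that the transformation is stated explicitly and is reproducible from shared data.

```python
def energy_adjusted_density(nutrient_amount, energy_kcal, per_kcal=1000):
    """Transparent construction of an energy-adjusted dietary variable.

    Uses the nutrient-density method (amount per 1000 kcal); reporting exactly this
    choice, rather than a generic 'energy-adjusted intake', lets readers reproduce
    the variable from shared data and code.
    """
    if energy_kcal <= 0:
        raise ValueError("energy_kcal must be positive")
    return nutrient_amount * per_kcal / energy_kcal

# Example: 28 g fiber on a 2200-kcal day -> fiber density per 1000 kcal
print(round(energy_adjusted_density(28, 2200), 2))  # 12.73
```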

Fit for purpose

The methods and tools outlined in prior sections can be illuminating for the field, or they could lead us astray. Each one depends on whether the method—and subsequent communication—is fit for purpose.

As described in the interventions section above, for instance, individual changes are frequently miscommunicated as treatment effects, and examples abound of parallel-arm results being miscommunicated as showing treatment response heterogeneity. Similarly, an algorithm may predict outcomes well for a person, but without evidence of counterfactuals (that is, evidence of what happens when a person with 1 set of characteristics changes their diet to something else), causation, and thus the strength of recommendations, is difficult to establish. Part of the philosophy of fit for purpose is to consider appropriate and responsible scientific communication.

Study designs used to establish more personalized nutritional recommendations must be carefully chosen, and communication around the results of personal- and population-focused research must also be appropriate for the study design. Headlines touting certain diets or foods abound and are based on average causal (or associative) research. That is, the studies are relevant to what a population may expect to happen, on average, under a particular nutritional paradigm, but an individual's response may vary. The converse also has issues: if an individualized algorithm is developed (or even studies of demographic substrata), communication about what the population should do is also likely an inappropriate extrapolation. Engagement with thoughtful scientific communicators is essential to navigate the nuances among these designs and the implications of their results.

In closing

Nutrition research is at a crossroads, needing to blend traditional approaches with new technologies, incorporate a rapid proliferation of available data, and synthesize information across multiple disciplines to impact precision nutrition. The steps forward include methodological innovation, adaptation, and adoption; data security, integrity, and validation; valuing interdisciplinarity as a skill; and a commitment to open, rigorous science. The evolution of nutrition science will continue the values and norms of science more broadly: an open, consensus endeavor dedicated to self-correction and growth as new data and approaches become available. Nutrition science can build from the advances made within our own scientific community and borrow from other disciplines' advances to enable investigators to make precision nutrition discoveries. In this way, our science can enable those who rely on scientific data for health recommendations or for developing policy to identify points where precision nutrition will—and, just as importantly, will not—advance health.

Summary

The primary aims of this report were to highlight the multiple approaches used in nutrition research, the strengths and limitations of each, and the value of complementary approaches in deriving the strongest evidence to address nutrition issues. The matrix of approaches may vary for given issues but, in all cases, multiple inputs will be required, because solutions will have to consider the social, behavioral, and biological dimensions that contribute to the etiology of a given problem, its manifestations, and clinical and policy management approaches. To support this view, the task force identified 7 broad categories of nutrition science research methods and summarized how each provides important, unique information. However, it also identified limitations in the methods and in the conclusions that can be drawn from different research approaches. These limitations do not negate the value of the information they provide; instead, they underscore that no one approach is sufficient to comprehensively address any question in nutrition science, and they indicate where, and of what type, complementary data are most needed.

This was enumerated throughout the text. For example, in the Health Disparities section, it was argued that the food environment—that is, the spaces where people acquire food (through production, gathering, or purchasing)—influences dietary choices, nutritional status, health, and disparities.

In the Cognitive Performance and Behaviors section, it is noted that variation between individuals is one of the primary considerations when selecting a research method. If all individuals were the same or similar, relatively few studies would be required to determine relationships, and necessary interventions would be well defined. The section emphasizes that there are numerous social, behavioral, and biological determinants of food choice and eating patterns, as well as of behavioral and physiological responses to foods. Thus, it will be necessary to draw knowledge from all of the areas identified in this report, as well as many traditionally viewed as outside the scope of nutrition science, such as food science, psychology, sensory science, and neuroscience, to meaningfully address questions related to feeding.

In the Dietary Assessment section, it is noted that recent advances in biomedical techniques such as genomewide association studies, metabolomics, and proteomics, along with new analytic approaches using AI and machine learning, are providing potential new ways to examine complex processes simultaneously and will help scientists understand differences in response to dietary intake across groups and individuals. Moreover, there are now new biomarkers that can be used to validate or enhance dietary intake reports, photographs that can better capture portion size, and advances in computing that are helping reduce the burden of data collection. Further, with appropriate sampling and consideration of day-to-day variation, food metabolomics holds promise for use in validation and calibration studies and can be expected to support and enhance dietary assessment.

In the Genetics and Epigenetics section, it is noted that dietary components, including fat, protein restriction, and some bioactives (epigallocatechin-3-gallate, genistein, polyphenols, etc.), can modify DNA methylation. Therefore, detailed information on diet composition is needed to interpret epigenetic data. Further, dietary intake data are needed to provide context for the genetic and epigenetic data when developing an algorithm. Additionally, measuring genotype is not sufficient to identify people who will experience an adverse outcome; it is important to consider dietary intake when interpreting the functional effects of genetic and epigenetic variation.

In the Microbiome section, it was noted that dietary factors, such as macronutrient composition and customary dietary patterns, are coming to the forefront as having major impacts on the composition and diversity of the gut microbiome. In addition, animal models allow more genetic, dietary, and environmental control and contribute to mechanistic understanding but are limited in translation to humans, whereas human microbiome intervention studies may be directly applicable to humans but are generally small and challenged by the high interindividual variability in response to diet. Moreover, confounders such as physical activity, lifestyle factors, and circadian effects also need to be documented and controlled or monitored to interpret changes in the microbiome. Further, it is argued that an ideal diet–microbiome study would combine marker gene or metagenomic analysis with meta-transcriptomic analysis to profile both the abundance and the functional activity of the microbes present.

In the Nutritional Status section, it is noted that the greatest value comes from combining methods. Using an FFQ with DLW and an activity monitor increases the value of data collected and provides a level of validation that can be incorporated into statistical analyses.

Finally, in the Cross-cutting Considerations section, it is noted that new computational methods are needed to better examine the interdependence of different contributors to nutritional health.

To enhance the clarity of this report's perspective, the decision was made to undertake this review through the lens of precision nutrition, as this has been identified as a high-priority future research direction. A more public health, population science orientation could have been used just as well, as the goal was to emphasize overarching concepts of the essentiality of data harmonization rather than speaking to a specific issue. However, to demonstrate the applicability of the task force's perspectives, the following summary highlights how the research methods in each of the categories could play a role in addressing weight management in individuals. Again, weight management is used only as an example, not the singular application of concepts in this review. Publications exploring and employing these concepts are now appearing in the literature (518–524).

Health disparities

The economic constraints of social disadvantage influence not only the ability to afford healthy foods promoted for obesity prevention and weight loss but also the valuable resource of time individuals can devote to food preparation. Psychosocial stressors and coping behaviors to deal with these stressors are also important influencers of dietary intake and of many other lifestyle behaviors (e.g., exercise, smoking, etc.). Consideration of these factors is critical for understanding the determinants of obesity risk for individuals and for generating evidence to inform actions to address it. Measuring factors such as SES and the neighborhood food environment can highlight mediating factors that influence the potential effectiveness of precision nutrition interventions. Broader sociopolitical circumstances must also be considered in the context of food systems and the increasing abundance of high-energy-dense and low-nutrient-dense foods. Developing an effective obesity-prevention or weight-loss intervention must account for these barriers and constraints and take advantage of unique contextual factors that may enhance intervention effectiveness. Utilizing methodologies such as CBPR, FESs, and impact pathway analysis is, therefore, critical for 1) gaining in-depth knowledge about the target individual and the population and community from which they are drawn, 2) understanding community resources and individuals' motivations, and 3) examining and measuring intervention results and the potential variability of success based on social constraints. The root causes of disparities in obesity are complex and, consequently, nutrition research should adequately account for these influences to inform the design of interventions and measure their impact.

Cognitive performance and behaviors

The utility of behavioral and cognitive performance measures for developing precision nutrition approaches to treat obesity or prevent weight gain can be conceptualized either as a big-data problem or as a problem for individualized, personalized n-of-1 designs. A big-data approach would use these measures to predict weight loss or prevention of weight gain from data contributed by many people and then test the prediction models on individuals to see whether factors that influence eating can be used to individualize treatment approaches. The alternative is to use personalized n-of-1 trials, which can draw on big data to identify candidate factors to manipulate and then test how different behavioral or cognitive practices improve weight loss or prevent weight gain for that person. There is considerable interest in the macronutrient characteristics of diets, patterns of eating, and how characteristics of foods drive the motivation to eat, and all of these factors may have important impacts on weight loss or weight change that need to be individualized to achieve the best results. While these approaches remain untested, it is likely that there are large individual differences in the factors that influence eating, the types of foods people eat, and the patterns of eating that are important in developing personalized nutrition programs.

Dietary assessment

As we move into a future focused on precision nutrition, measuring individual dietary behavior will continue to be necessary. Precision nutrition offers the promise of prescribing individualized diets based on specific genetic and metabolic signatures and identified risk of disease outcomes. Prevention of weight gain will differ across individuals, based on their unique genome, metabolism, and sociocultural circumstances. Dietary assessment is the first step in understanding baseline dietary patterns and may be accomplished with multiple dietary recalls or with FFQs, perhaps supplemented with new technology as it becomes available. Continued monitoring during the change process is also important to better understand what does and does not work as individualized plans are implemented.

Genetics and epigenetics

Putting precision nutrition for weight management into practice will require a more complete understanding of the interaction of metabolism with an individual's genetics and epigenetics, as well as their current and past diet and environmental exposures. Continuing advancements in the ability to integrate an individual's genome sequence and epigenetic markers with expression of both coding and noncoding transcripts will provide nutritional scientists the opportunity to combine this information with metabolite expression patterns to begin to explain some aspects of metabolic heterogeneity. Such studies are likely to identify improved biomarkers for both adequate and inadequate nutritional status. Both well-designed human studies and the collection of these datasets into population-wide databanks will be necessary. A key role for nutritional scientists will be to provide dietary information for individuals in projects that are compiling human genomic and epigenomic data and health records.

Microbiome

The human microbiome is complex and plays an important role in many health outcomes, including maintenance of a healthy weight. Studies have shown that different gut microbiome patterns are associated with overweight and obesity, compared with healthy weight. Further, microbiome changes are seen in individuals on weight-loss programs. These effects are attributed, in part, to the role of the microbiome in digestive and gastrointestinal function, as well as fermentation of nondigestible substrates, such as fibers, to energy-yielding metabolites. Other fermentation products may also play a role in weight management through effects on metabolic pathways. One of the most challenging and intriguing aspects of the microbiome is its high interindividual variability. This is a key focus in understanding precision nutrition, including why people respond differently to a given weight-management regimen. However, microbiome research is still in its early stages, particularly with respect to diet, and it is critical to develop harmonized methods and standardized approaches for replication of findings. Sufficient descriptions of diet and nutrient interventions are essential to further the field. In addition, because of the complexity of the microbiome and its interactions with the host, utilization of different experimental systems and approaches is needed. The microbiome field is exploding with new findings and shows great promise for advancing understanding of precision nutrition, but much more understanding is needed before definitive conclusions can be translated into effective clinical and public health recommendations.

Nutritional status

The research methods described in the section on energy balance and body composition can be applied in several different types of studies and complement many of the techniques described in other sections. For population surveys that require simple methods for a large sample, the use of anthropometrics or BIA can augment and address objectives that may not be fully met by dietary assessment. At the same time, more technical and clinical measures, including stable isotopes or imaging, are excellent methods that offer accuracy and precision that can extend methods to assess nutritional status and clinical studies of food intake. From the perspective of precision nutrition, weight-loss methods vary in flexibility of use, depending on the objective of the study. Clearly, weight loss can be measured accurately with a simple scale and acute changes are detectable with such simple approaches. However, for body-composition compartments associated with health risk—in particular, visceral adipose tissue or total fat mass—more precise methods, such as MRI, DXA, or BIA, may be better utilized. These methods clearly augment dietary and epigenetic studies associated with a metabolic outcome that is both measurable and part of a causal pathway. Thus, fitting the specific methodology to the study objective is necessary, and such examples of coupling a more precise approach to study energy expenditure and nutritional status with other broader approaches to diet or gut health are possible with a focus on precision nutrition.

Cross-cutting considerations

Personalized approaches to weight maintenance can be initiated by anyone acting as their own scientist through self-directed n-of-1 studies or, if the individual has underlying health concerns, under the care of a clinician. Whereas chronic diseases or mortality may be binary in nature, weight maintenance is amenable to individuals repeatedly trying different approaches. These attempts could be tailored by using metabolomic profiling, genetic analysis, microbiological assessment, behavior and cognitive typing, social and life circumstance characterization, and nutritional physiology, independently or in conjunction, to identify people more or less likely to succeed using a particular set of weight-management approaches. Indeed, these approaches could help identify individuals who may or may not have a concern about weight gain to begin with. Any approach borne from these measurements to predict weight-management success needs to be subjected to subsequent testing, rather than relying only on prediction to guide clinical and public health practice.

Studies undertaken to elucidate these patterns should be conducted transparently and preregistered where possible, with open algorithms and with data shared where appropriate according to common standards such as ADOPT. Such approaches need to be clearly communicated in a manner commensurate with the strength of the evidence. Scientists owe it to the public to present the personalized evidence for weight management with honesty.

The genesis of this report was growing concern about the tone of discourse among nutrition professionals and the implications of acrimony on the productive study of nutrition science and its translation to clinical practice and policy. Too often honest differences of opinion were cast as conflicts [e.g., epidemiology vs. clinical trials; basic vs. applied research; animal models vs. human testing; acute vs. chronic trials; research results vs. policy; qualitative vs. quantitative methods; food intake vs. surrogate endpoint (biomarkers) vs. established clinical endpoint; and in vitro vs. in vivo models] instead of areas of needed collaboration. Hopefully, recognition of the value (and limitations) of contributions from well-executed nutrition science derived from the various approaches used in the discipline, as well as appreciation of how their layering will yield the strongest evidence base, will provide a basis for greater productivity and impact.

ACKNOWLEDGEMENTS

The authors thank ex officio members of the task force, Drs. John E Courtney, ASN Chief Executive Officer, and Lindsay H Allen, 2019–2020 President of ASN, for their support. The authors also thank Dr. Alison GM Brown for her significant contributions to the Health Disparities section and Dr. Susan Roberts for her contributions to the Nutritional Status section. The authors’ responsibilities were as follows—all authors: contributed to the drafting of this report and read and approved the final manuscript.

Notes

RDM receives research support from the Almond Board of California and has other outside interests in the Grain Food Foundation and Mars, Wrigley Inc.; DJH receives research support from the Robert Wood Johnson Foundation; DJL received research support from Texas A&M AgriLife and has other outside interests in Biofortis, Merieux NutriSciences, and the Institute for the Advancement for Food and Nutrition Sciences; EJMF receives research support from Ausnutria; KLT receives research support from NIH; SHZ has financial, commercial, and outside interests in SNP Therapeutics, ByHeart, and Ingenuity Foods; AJB received research support from Arizona State University. All other authors report no disclosures.

The ASN Board of Directors appointed the Nutrition Research Task Force to develop a report on scientific methods used in nutrition science to advance discovery, interpretation, and application of knowledge in the field. This report was reviewed and approved by the ASN Board of Directors.

Abbreviations used: ABCA1, ATP-binding cassette subfamily A member 1; ADOPT, Accumulating Data to Optimally Predict Obesity Treatment; AI, artificial intelligence; AMPM, Automated Multiple-Pass Method; ASA24, Automated Self-Administered 24-Hour; BIA, bioelectrical impedance analysis; BOND, Biomarkers of Nutrition for Development; CBPR, community-based participatory research; CF, cognitive function; CT, computed tomography; DHQ, Diet History Questionnaire; DLW, doubly labeled water; FDR, false discovery rate; FES, focused ethnographic study; FFQ, food-frequency questionnaire; GIS, geographic information system; HbA1c, glycated hemoglobin; HTE, heterogeneity of treatment effect; IYCF, infant and young child feeding; MECS, Multi-Ethnic Cohort Study; mRNA, messenger RNA; MTHFR, methylenetetrahydrofolate reductase; NCI, National Cancer Institute; PE, phosphatidylethanolamine; PEMT, phosphatidylethanolamine-N-methyltransferase; RCT, randomized controlled trial; RQ, respiratory quotient; SCD, single-case experimental design; SES, socioeconomic status; SMART, Sequential Multiple Assignment Randomized Trial; SNP, single nucleotide polymorphism; STEAM, science, technology, engineering, arts, and math; STEM, science, technology, engineering, and math; TMA, trimethylamine; 24HR, 24-hour recall.

Contributor Information

Richard D Mattes, Purdue University, West Lafayette, IN, USA.

Sylvia B Rowe, SR Strategy, Washington, DC, USA.

Sarah D Ohlhorst, ASN, Rockville, MD, USA.

Andrew W Brown, Indiana University, Bloomington, IN, USA.

Daniel J Hoffman, Rutgers University, New Brunswick, NJ, USA.

DeAnn J Liska, Texas A&M AgriLife Research, College Station, TX, USA.

Edith J M Feskens, Wageningen University & Research, Wageningen, The Netherlands.

Jaapna Dhillon, University of Missouri, Columbia, MO, USA.

Katherine L Tucker, University of Massachusetts Lowell, Lowell, MA, USA.

Leonard H Epstein, University at Buffalo Jacobs School of Medicine and Biomedical Sciences, Buffalo, NY, USA.

Lynnette M Neufeld, Global Alliance for Improved Nutrition (GAIN), Geneva, Switzerland.

Michael Kelley, Michael Kelley Nutrition Science Consulting, Wauwatosa, WI, USA.

Naomi K Fukagawa, USDA Beltsville Human Nutrition Research Center, Beltsville, MD, USA.

Roger A Sunde, University of Wisconsin, Madison, WI, USA.

Steven H Zeisel, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA.

Anthony J Basile, Arizona State University, Tempe, AZ, USA.

Laura E Borth, University of Wisconsin, Madison, WI, USA.

Emahlea Jackson, University of Washington, Seattle, WA, USA.

References

  • 1. Carter J. The American public still trusts scientists, says a new Pew survey [Internet]. Sci Am. [cited 2021 Aug 2]. Available from: https://www.scientificamerican.com/article/the-american-public-still-trusts-scientists-says-a-new-pew-survey/. [Google Scholar]
  • 2. Funk C, Hefferon M, Kennedy B, Johnson C. Trust and mistrust in Americans’ views of scientific experts [Internet]. Pew Research Center Science & Society. 2019[cited 2021 Aug 2]. Available from: https://www.pewresearch.org/science/2019/08/02/trust-and-mistrust-in-americans-views-of-scientific-experts/. [Google Scholar]
  • 3. Garza C, Stover PJ, Ohlhorst SD, Field MS, Steinbrook R, Rowe Set al. Best practices in nutrition science to earn and keep the public's trust. Am J Clin Nutr. 2019;109(1):225–43. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4. Trust in nutrition science [Internet]. American Society for Nutrition. [cited 2021 Sep 20]. Available from: https://nutrition.org/trust/. [Google Scholar]
  • 5. Office of Strategic Coordination . NOT-RM-21–005: request for information: data science challenges and opportunities in the field of precision nutrition [Internet]. [cited 2021 Aug 2]. Available from: https://grants.nih.gov/grants/guide/notice-files/NOT-RM-21-005.html. [Google Scholar]
  • 6. Rodgers GP, Collins FS. Precision nutrition—the answer to “What to eat to stay healthy.” JAMA. 2020;324(8):735–6. [DOI] [PubMed] [Google Scholar]
  • 7. World Health Organization . A conceptual framework for action on the social determinants of health: debates, policy & practice, case studies [Internet]. 2010[cited 2021 Feb 24]. Available from: http://apps.who.int/iris/bitstream/10665/44489/1/9789241500852_eng.pdf [Google Scholar]
  • 8. Braveman P. What are health disparities and health equity? We need to be clear. Public Health Rep. 2014;129(1 Suppl2):5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9. Tumilowicz A, Ruel MT, Pelto G, Pelletier D, Monterrosa EC, Lapping Ket al. Implementation science in nutrition: concepts and frameworks for an emerging field of science and practice. Curr Dev Nutr. 2019;3(3):nzy080. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10. Wallerstein NB, Duran B. Using community-based participatory research to address health disparities. Health Promot Pract. 2006;7(3):312–23. [DOI] [PubMed] [Google Scholar]
  • 11. Coughlin SS, Smith SA. Community-based participatory research to promote healthy diet and nutrition and prevent and control obesity among African-Americans: a literature review. J Racial Ethnic Health Disparities. 2017;4(2):259–68. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12. Wieland ML, Weis JA, Palmer T, Goodson M, Loth S, Omer Fet al. Physical activity and nutrition among immigrant and refugee women: a community-based participatory research approach. Womens Health Issues. 2012;22(2):e225–32. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13. Arcan C, Culhane-Pera KA, Pergament S, Rosas-Lee M, Xiong MB. Somali, Latino and Hmong parents’ perceptions and approaches about raising healthy-weight children: a community-based participatory research study. Public Health Nutr. 2018;21(6):1079–93. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14. Grier SA, Kumanyika SK. The context for choice: health implications of targeted food and beverage marketing to African Americans. Am J Public Health. 2008;98(9):1616–29. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15. Larson NI, Story MT, Nelson MC. Neighborhood environments: disparities in access to healthy foods in the U.S. Am J Prev Med. 2009;36(1):74–81.e10. [DOI] [PubMed] [Google Scholar]
  • 16. Viswanathan M, Ammerman A, Eng E, Garlehner G, Lohr K, Griffith Det al. Community-based participatory research: assessing the evidence: summary. Evidence Report/Technology Assessment (Summary). 2004 Sep 1;18:1–8. [PMC free article] [PubMed] [Google Scholar]
  • 17. Holkup PA, Tripp-Reimer T, Salois EM, Weinert C. Community-based participatory research: an approach to intervention research with a native American community. Adv Nurs Sci. 2004;27(3):162–75. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18. Freudenberg N, Tsui E. Evidence, power, and policy change in community-based participatory research. Am J Public Health. 2014;104(1):11–4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19. Lazarus S, Bulbulia S, Taliep N, Naidoo AV. Community-based participatory research as a critical enactment of community psychology. J Community Psychol. 2015;43(1):87–98. [Google Scholar]
  • 20. Victora CG, Schellenberg JA, Huicho L, Amaral J, El Arifeen S, Pariyo Get al. Context matters: interpreting impact findings in child survival evaluations. Health Policy Plan. 2005;20(Suppl 1):i18–31. [DOI] [PubMed] [Google Scholar]
  • 21. Pelto PJ. Anthropological research: the structure of inquiry. 2nd ed. Cambridge (UK): Cambridge University Press; 1978. [Google Scholar]
  • 22. Tumilowicz A, Neufeld LM, Pelto GH. Using ethnography in implementation research to improve nutrition interventions in populations. Matern Child Nutr. 2016;11:55–72. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23. Victora CG, Christian P, Vidaletti LP, Gatica-Domínguez G, Menon P, Black RE. Revisiting maternal and child undernutrition in low-income and middle-income countries: variable progress towards an unfinished agenda. Lancet North Am Ed. 2021;397(10282):1388–99. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24. Bellamy C, UNICEF . The state of the world's children. New York: UNICEF; 1998. [Google Scholar]
  • 25. Bhutta ZA, Das JK, Rizvi A, Gaffey MF, Walker N, Horton Set al. Evidence-based interventions for improvement of maternal and child nutrition: what can be done and at what cost?. Lancet North Am Ed. 2013;382(9890):452–77. [DOI] [PubMed] [Google Scholar]
  • 26. Pelto GH, Armar-Klemesu M, Siekmann J, Schofield D. The focused ethnographic study ‘assessing the behavioral and local market environment for improving the diets of infants and young children 6 to 23 months old’ and its use in three countries. Matern Child Nutr. 2012;9:35–46. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27. Zobrist S, Kalra N, Pelto G, Wittenbrink B, Milani P, Diallo AMet al. Using cognitive mapping to understand Senegalese infant and young child feeding decisions. Matern Child Nutr. 2018;14(2):e12542. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28. Pelto GH. Applying focused ethnographic methods: examining implications of intracultural diversity for nutrition interventions. Nutr Rev. 2020;78(Suppl 2):71–9. [DOI] [PubMed] [Google Scholar]
  • 29. Banerjee A, Barnhardt S, Duflo E. Can iron-fortified salt control anemia? Evidence from two experiments in rural Bihar. J Dev Econ. 2018;133:127–46. [Google Scholar]
  • 30. Habicht JP, Victora CG, Vaughan JP. Evaluation designs for adequacy, plausibility and probability of public health programme performance and impact. Int J Epidemiol. 1999;28(1):10–8. [DOI] [PubMed] [Google Scholar]
  • 31. Habicht J-P, Pelto GH. From biological to program efficacy: promoting dialogue among the research, policy, and program communities. Adv Nutr. 2014;5(1):27–34. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32. White H. Theory-based evaluation: principles and practice. New Delhi (India): International Initiative for Impact Evaluation (3ie); 2009. Report No.: 3. [Google Scholar]
  • 33. Nguyen PH, Menon P, Keithly SC, Kim SS, Hajeebhoy N, Tran LMet al. Program impact pathway analysis of a social franchise model shows potential to improve infant and young child feeding practices in Vietnam. J Nutr. 2014;144(10):1627–36. [DOI] [PubMed] [Google Scholar]
  • 34. Avula R, Menon P, Saha KK, Bhuiyan MI, Chowdhury AS, Siraj Set al. A program impact pathway analysis identifies critical steps in the implementation and utilization of a behavior change communication intervention promoting infant and child feeding practices in Bangladesh. J Nutr. 2013;143(12):2029–37. [DOI] [PubMed] [Google Scholar]
  • 35. Savy M, Briaux J, Seye M, Douti MP, Perrotin G, Martin-Prevel Y. Tailoring process and impact evaluation of a “Cash-Plus” program: the value of using a participatory program impact pathway analysis. Curr Dev Nutr. 2020; 4(7):nzaa099. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36. Le Port A, Zongrone A, Savy M, Fortin S, Kameli Y, Sessou Eet al. Program impact pathway analysis reveals implementation challenges that limited the incentive value of conditional cash transfers aimed at improving maternal and child health care use in Mali. Curr Dev Nutr. 2019;3(9):nzz084. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37. García-Guerra A, Neufeld LM, Bonvecchio Arenas A, Fernández-Gaxiola AC, Mejía-Rodríguez F, García-Feregrino Ret al. Closing the nutrition impact gap using program impact pathway analyses to inform the need for program modifications in Mexico's conditional cash transfer program. J Nutr. 2019;149(Suppl 1):2281S–9S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38. Habicht J-P, Pelto GH. Program impact pathways and contexts: a commentary on theoretical issues and research applications to support the EsIAN component of Mexico's conditional cash transfer program. J Nutr. 2019;149(Suppl 1):2332S–2340S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39. Andresen EM, Miller DK. The future (history) of socioeconomic measurement and implications for improving health outcomes among African Americans. J Gerontol A Biol Sci Med Sci. 2005;60(10):1345–50. [DOI] [PubMed] [Google Scholar]
  • 40. Kahn J, Fazio E. Economic status over the life course and racial disparities in health. J Gerontol B. 2005;60(2):S76–S84. [DOI] [PubMed] [Google Scholar]
  • 41. Goldthorpe JH. Analysing social inequality: a critique of two recent contributions from economics and epidemiology. Eur Soc Rev. 2010;26(6):731–44. [Google Scholar]
  • 42. Geyer S, Hemström O, Peter R, Vågerö D. Education, income, and occupational class cannot be used interchangeably in social epidemiology. Empirical evidence against a common practice. J Epidemiol Comm Health. 2006;60(9):804–10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43. Shavers VL. Measurement of socioeconomic status in health disparities research. J Natl Med Assoc. 2007;99(9):1013–23. [PMC free article] [PubMed] [Google Scholar]
  • 44. Psaki SR, Seidman JC, Miller M, Gottlieb M, Bhutta ZA, Ahmed Tet al. Measuring socioeconomic status in multicountry studies: results from the eight-country MAL-ED study. Population Health Metrics. 2014;12(1):1–11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45. Keita AD, Casazza K, Thomas O, Fernandez JR. Neighborhood-level disadvantage is associated with reduced dietary quality in children. J Am Diet Assoc. 2009;109(9):1612–6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46. Zhu Z, Zhang D, Wang JH-Y, Qiao Y, Liu Y, Braithwaite D. Is education or income associated with insufficient fruit and vegetable intake among cancer survivors? A cross-sectional analysis of 2017 BRFSS data. BMJ Open. 2020;10(12):e041285. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47. Aaron GJ, Friesen VM, Jungjohann S, Garrett GS, Neufeld LM, Myatt M. Coverage of large-scale food fortification of edible oil, wheat flour, and maize flour varies greatly by vehicle and country but is consistently lower among the most vulnerable: results from coverage surveys in 8 countries. J Nutr. 2017;147(5):984S–94S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48. Braveman PA, Cubbin C, Egerter S, Chideya S, Marchi KS, Metzler Met al. Socioeconomic status in health research: one size does not fit all. JAMA. 2005;294(22):2879–88. [DOI] [PubMed] [Google Scholar]
  • 49. Hadden WC, Maury BB. The limits of contextual or multi-level analysis of health surveys. Statistical Policy Working Paper 30. 1999 Federal Committee on Statistical Methodology Research Conference: Statistical Policy Office, Office of Management and Budget. https://nces.ed.gov/FCSM/1999research.asp. [Google Scholar]
  • 50. Turner C, Aggarwal A, Walls H, Herforth A, Drewnowski A, Coates Jet al. Concepts and critical perspectives for food environment research: a global framework with implications for action in low- and middle-income countries. Global Food Security. 2018;18:93–101. [Google Scholar]
  • 51. Food and Agriculture Organization (FAO) . Influencing food environments for healthy diets, summary. Rome: FAO; 2016. [Google Scholar]
  • 52. Glanz K, Sallis JF, Saelens BE, Frank LD. Healthy nutrition environments: concepts and measures. Am J Health Promot. 2005;19(5):330–3. [DOI] [PubMed] [Google Scholar]
  • 53. Glanz K, Sallis JF, Saelens BE, Frank LD. Nutrition Environment Measures Survey in stores (NEMS-S): development and evaluation. Am J Prev Med. 2007;32(4):282–9. [DOI] [PubMed] [Google Scholar]
  • 54. Saelens BE, Glanz K, Sallis JF, Frank LD. Nutrition Environment Measures Study in restaurants (NEMS-R): development and evaluation. Am J Prev Med. 2007;32(4):273–81. [DOI] [PubMed] [Google Scholar]
  • 55. Voss C, Klein S, Glanz K, Clawson M. Nutrition environment measures survey-vending: development, dissemination, and reliability. Health Promot Pract. 2012;13(4):425–30. [DOI] [PubMed] [Google Scholar]
  • 56. Downs SM, Ahmed S, Fanzo J, Herforth A. Food environment typology: advancing an expanded definition, framework, and methodological approach for improved characterization of wild, cultivated, and built food environments toward sustainable diets. Foods. 2020;9(4):532. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57. Toure D, Herforth A, Pelto GH, Neufeld LM, Mbuya MNN. An emergent framework of the market food environment in low- and middle-income countries. Curr Dev Nutr. 2021;5(4):nzab023. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 58. Ghirardelli A, Quinn V, Foerster SB. Using geographic information systems and local food store data in California's low-income neighborhoods to inform community initiatives and resources. Am J Public Health. 2010;100(11):2156–62. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59. Charreire H, Casey R, Salze P, Simon C, Chaix B, Banos Aet al. Measuring the food environment using geographical information systems: a methodological review. Public Health Nutr. 2010;13(11):1773–85. [DOI] [PubMed] [Google Scholar]
  • 60. Life expectancy: could where you live influence how long you live?. [Internet]Robert Wood Johnson Foundation. 2020; [cited 2021 Feb 23]. Available from: https://www.rwjf.org/en/library/interactives/whereyouliveaffectshowlongyoulive.html. [Google Scholar]
  • 61. Caspi CE, Sorensen G, Subramanian SV, Kawachi I. The local food environment and diet: a systematic review. Health Place. 2012;18(5):1172–87. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62. Cetateanu A, Jones A. How can GPS technology help us better understand exposure to the food environment? A systematic review. SSM Population Health. 2016;2:196–205. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 63. Mattes RD. Hunger and thirst: issues in measurement and prediction of eating and drinking. Physiol Behav. 2010;100(1):22–32. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 64. Mattes RD, Friedman MI. Hunger. Dig Dis. 1993;11(2):65–77. [DOI] [PubMed] [Google Scholar]
  • 65. Boswell RG, Kober H. Food cue reactivity and craving predict eating and weight gain: a meta-analytic review. Obes Rev. 2016;17(2):159–77. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66. Meule A. The psychology of food cravings: the role of food deprivation. Curr Nutr Rep. 2020;9(3):251–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 67. Sun W, Kober H. Regulating food craving: from mechanisms to interventions. Physiol Behav. 2020;222:112878. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68. Mattes R. Fluid calories and energy balance: the good, the bad, and the uncertain. Physiol Behav. 2006;89(1):66–70. [DOI] [PubMed] [Google Scholar]
  • 69. Mattes RD, Hollis J, Hayes D, Stunkard AJ. Appetite: measurement and manipulation misgivings. J Am Diet Assoc. 2005;105(5):87–97. [DOI] [PubMed] [Google Scholar]
  • 70. Bartoshuk LM, Duffy VB, Hayes JE, Moskowitz HR, Snyder DJ. Psychophysics of sweet and fat perception in obesity: problems, solutions and new perspectives. Philos Trans R Soc B Biol Sci. 2006;361(1471):1137–48. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71. Stunkard AJ, Messick S. The Three-Factor Eating Questionnaire to measure dietary restraint, disinhibition and hunger. J Psychosom Res. 1985;29(1):71–83. [DOI] [PubMed] [Google Scholar]
  • 72. Lasschuijt MP, Mars M, de Graaf C, Smeets PAM. Endocrine cephalic phase responses to food cues: a systematic review. Adv Nutr. 2020;11(5):1364–83. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73. Wiedemann SJ, Rachid L, Illigens B, Böni-Schnetzler M, Donath MY. Evidence for cephalic phase insulin release in humans: a systematic review and meta-analysis. Appetite. 2020;155:104792. [DOI] [PubMed] [Google Scholar]
  • 74. Figlewicz DP, Benoit SC. Insulin, leptin, and food reward: update 2008. Am J Physiol Regul Integr Comp Physiol. 2009;296(1):R9–R19. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75. Woods SC. The eating paradox: how we tolerate food. Psychol Rev. 1991;98(4):488–505. [DOI] [PubMed] [Google Scholar]
  • 76. Lappalainen R, Sjödén P-O, Karhunen L, Gladh V, Lesinska D. Inhibition of anticipatory salivation and craving in response to food stimuli. Physiol Behav. 1994;56(2):393–8. [DOI] [PubMed] [Google Scholar]
  • 77. Nirenberg TD, Miller PM. Salivation: an assessment of food craving?. Behav Res Ther. 1982;20(4):405–7. [DOI] [PubMed] [Google Scholar]
  • 78. Sato W, Yoshikawa S, Fushiki T. Facial EMG activity is associated with hedonic experiences but not nutritional values while viewing food images. Nutrients. 2021;13(1):11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 79. Sato W, Minemoto K, Ikegami A, Nakauma M, Funami T, Fushiki T. Facial EMG correlates of subjective hedonic responses during food consumption. Nutrients. 2020;12(4):1174. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 80. Fulkerson JA, Nelson MC, Lytle L, Moe S, Heitzler C, Pasch KE. The validation of a home food inventory. Int J Behav Nutr Phys Activity. 2008;5(1):55. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 81. Booth DA. Learned ingestive motivation and the pleasures of the palate. In: Bolles RC, editor. The hedonics of taste. Hillsdale (NJ): Lawrence Erlbaum Associates; 1991. p. 29–58. [Google Scholar]
  • 82. Booth DA, Higgs S, Schneider J, Klinkenberg I. Learned liking versus inborn delight: can sweetness give sensual pleasure or is it just motivating?. Psychol Sci. 2010;21(11):1656–63. [DOI] [PubMed] [Google Scholar]
  • 83. Nederkoorn C, Jansen A. Cue reactivity and regulation of food intake. Eat Behav. 2002;3(1):61–72. [DOI] [PubMed] [Google Scholar]
  • 84. Cornell CE, Rodin J, Weingarten HP. Stimulus-induced eating when satiated. Physiol Behav. 1989;45(4):695–704. [DOI] [PubMed] [Google Scholar]
  • 85. Sclafani A. Learned controls of ingestive behaviour. Appetite. 1997;29(2):153–8. [DOI] [PubMed] [Google Scholar]
  • 86. Myers KP, Sclafani A. Development of learned flavor preferences. Dev Psychobiol. 2006;48(5):380–8. [DOI] [PubMed] [Google Scholar]
  • 87. Yeomans MR, Leitch M, Gould NJ, Mobini S. Differential hedonic, sensory and behavioral changes associated with flavor-nutrient and flavor-flavor learning. Physiol Behav. 2008;93(4-5):798–806. [DOI] [PubMed] [Google Scholar]
  • 88. Sclafani A, Ackroff K. Nutrient-conditioned flavor preference and incentive value measured by progressive ratio licking in rats. Physiol Behav. 2006;88(1–2):88–94. [DOI] [PubMed] [Google Scholar]
  • 89. Booth DA, Mather P, Fuller J. Starch content of ordinary foods associatively conditions human appetite and satiation, indexed by eating and eating pleasantness of starch-paired flavours. Appetite. 1982;3(2):163–84. [DOI] [PubMed] [Google Scholar]
  • 90. Figlewicz DP. Adiposity signals and food reward: expanding the CNS roles of insulin and leptin. Am J Physiol Regul Integr Comp Physiol. 2003;284(4):R882–92. [DOI] [PubMed] [Google Scholar]
  • 91. Epstein LH, Leddy JJ, Temple JL, Faith MS. Food reinforcement and eating: a multilevel analysis. Psychol Bull. 2007 Sep;133(5):884–906. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 92. Epstein LH, Lin H, Carr KA, Fletcher KD. Food reinforcement and obesity. Psychological moderators. Appetite. 2012;58(1):157–62. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 93. Epstein LH, Carr KA, Lin H, Fletcher KD. Food reinforcement, energy intake, and macronutrient choice. Am J Clin Nutr. 2011;94(1):12–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94. Avena NM, Rada P, Hoebel BG. Evidence for sugar addiction: behavioral and neurochemical effects of intermittent, excessive sugar intake. Neurosci Biobehav Rev. 2008;32(1):20–39. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 95. Gearhardt AN, Hebebrand J. The concept of “food addiction” helps inform the understanding of overeating and obesity: YES. Am J Clin Nutr. 2021;113(2):263–7. [DOI] [PubMed] [Google Scholar]
  • 96. Schulte EM, Avena NM, Gearhardt AN. Which foods may be addictive? The roles of processing, fat content, and glycemic load. PLoS One. 2015;10(2):e0117959. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 97. Goldfield GS, Epstein LH, Davidson M, Saad F. Validation of a questionnaire measure of the relative reinforcing value of food. Eat Behav. 2005;6(3):283–92. [DOI] [PubMed] [Google Scholar]
  • 98. Epstein LH, Paluch RA, Carr KA, Temple JL, Bickel WK, MacKillop J. Reinforcing value and hypothetical behavioral economic demand for food and their relation to BMI. Eat Behav. 2018;29:120–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 99. Robinson TE, Berridge KC. The psychology and neurobiology of addiction: an incentive-sensitization view. Addiction. 2000;95:S91–117. [DOI] [PubMed] [Google Scholar]
  • 100. Robinson TE, Berridge KC. Incentive salience and drug “wanting.” Psychopharmacology (Berl). 2004;171:352–3. [Google Scholar]
  • 101. Robinson TE, Berridge KC. The neural basis of drug craving: an incentive-sensitization theory of addiction. Brain Res Rev. 1993;18(3):247–91. [DOI] [PubMed] [Google Scholar]
  • 102. Finlayson G, King N, Blundell JE. Liking vs. wanting food: importance for human appetite control and weight regulation. Neurosci Biobehav Rev. 2007;31(7):987–1002. [DOI] [PubMed] [Google Scholar]
  • 103. Lowe MR, Butryn ML, Didie ER, Annunziato RA, Thomas JG, Crerand CEet al. The Power of Food Scale. A new measure of the psychological influence of the food environment. Appetite. 2009;53(1):114–8. [DOI] [PubMed] [Google Scholar]
  • 104. Temple JL. Factors that influence the reinforcing value of foods and beverages. Physiol Behav. 2014;136:97–103. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 105. Gearhardt AN, Hebebrand J. The concept of “food addiction” helps inform the understanding of overeating and obesity: debate consensus. Am J Clin Nutr. 2021;113(2):274–6. [DOI] [PubMed] [Google Scholar]
  • 106. Hebebrand J, Gearhardt AN. The concept of “food addiction” helps inform the understanding of overeating and obesity: NO. Am J Clin Nutr. 2021;113(2):268–73. [DOI] [PubMed] [Google Scholar]
  • 107. Gearhardt AN, Corbin WR, Brownell KD. Development of the Yale food addiction scale version 2.0. Psychol Addict Behav. 2016;30(1):113–21. [DOI] [PubMed] [Google Scholar]
  • 108. Epel ES, Tomiyama AJ, Mason AE, Laraia BA, Hartman W, Ready Ket al. The Reward-based Eating Drive Scale: a self-report index of reward-based eating. PLoS One. 2014;9(6):e101350. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 109. Birch LL, Birch D, Marlin DW, Kramer L. Effects of instrumental consumption on children's food preference. Appetite. 1982;3(2):125–34. [DOI] [PubMed] [Google Scholar]
  • 110. Birch LL, Fisher JO, Grimm-Thomas K, Markey CN, Sawyer R, Johnson SL. Confirmatory factor analysis of the Child Feeding Questionnaire: a measure of parental attitudes, beliefs and practices about child feeding and obesity proneness. Appetite. 2001;36(3):201–10. [DOI] [PubMed] [Google Scholar]
  • 111. Birch LL, Johnson SL, Grimm-Thomas K, Fisher JO. The Child Feeding Questionnaire (CFQ): an instrument for assessing parental child feeding attitudes and strategies. Operational definitions of factors, scoring and summing instructions. University Park (PA): Pennsylvania State University; 1998. [Google Scholar]
  • 112. Carr KA, Epstein LH. Choice is relative: Reinforcing value of food and activity in obesity treatment. Am Psychol. 2020;75(2):139–51. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 113. Bardo MT, Klebaur JE, Valone JM, Deaton C. Environmental enrichment decreases intravenous self-administration of amphetamine in female and male rats. Psychopharmacology (Berl). 2001;155:278–84. [DOI] [PubMed] [Google Scholar]
  • 114. El Rawas R, Thiriet N, Lardeux V, Jaber M, Solinas M. Environmental enrichment decreases the rewarding but not the activating effects of heroin. Psychopharmacology (Berl). 2009;203(3):561–70. [DOI] [PubMed] [Google Scholar]
  • 115. Solinas M, Thiriet N, Chauvet C, Jaber M. Prevention and treatment of drug addiction by environmental enrichment. Prog Neurobiol. 2010;92(4):572–92. [DOI] [PubMed] [Google Scholar]
  • 116. Bryant EJ, King NA, Blundell JE. Disinhibition: its effects on appetite and weight regulation. Obes Rev. 2008;9(5):409–19. [DOI] [PubMed] [Google Scholar]
  • 117. Arnow B, Kenardy J, Agras WS. The Emotional Eating Scale: the development of a measure to assess coping with negative affect by eating. Int J Eat Disord. 1995;18(1):79–90. [DOI] [PubMed] [Google Scholar]
  • 118. Konttinen H, van Strien T, Mannisto S, Jousilahti P, Haukkala A. Depression, emotional eating and long-term weight changes: a population-based prospective study. Int J Behav Nutr Phys Activity. 2019;16(1):28. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 119. Tomiyama AJ, Puterman E, Epel ES, Rehkopf DH, Laraia BA. Chronic psychological stress and racial disparities in body mass index change between black and white girls aged 10–19. Ann Behav Med. 2013;45(1):3–12. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 120. Adam TC, Epel ES. Stress, eating and the reward system. Physiol Behav. 2007;91(4):449–58. [DOI] [PubMed] [Google Scholar]
  • 121. Beck AT, Steer RA, Garbin MG. Psychometric properties of the Beck Depression Inventory: twenty-five years of evaluation. ClinPsychol Rev. 1988;8:77–100. [Google Scholar]
  • 122. Cooper, Z, Fairburn CG. The eating disorder examination: a semi-structured interview for the assessment of the specific psychopathology of eating disorders. Int J Eat Disord. 1987;6(1):1–8. [Google Scholar]
  • 123. Fairburn CG, Beglin SJ. Assessment of eating disorders: interview or self-report questionnaire?. Int J Eat Disord. 1994;16(4):363–70. [PubMed] [Google Scholar]
  • 124. Epstein LH, Truesdale R, Wojcik A, Paluch RA, Raynor HA. Effects of deprivation on hedonics and reinforcing value of food. Physiol Behav. 2003;78(2):221–7. [DOI] [PubMed] [Google Scholar]
  • 125. Raynor HA, Epstein LH. The relative-reinforcing value of food under differing levels of food deprivation and restriction. Appetite. 2003;40(1):15–24. [DOI] [PubMed] [Google Scholar]
  • 126. Schlam TR, Wilson NL, Shoda Y, Mischel W, Ayduk O. Preschoolers’ delay of gratification predicts their body mass 30 years later. J Pediatr. 2013;162(1):90–3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 127. Mischel W, Shoda Y, Rodriguez MI. Delay of gratification in children. Science. 1989;244(4907):933–8. [DOI] [PubMed] [Google Scholar]
  • 128. Francis LA, Susman EJ. Self-regulation and rapid weight gain in children from age 3 to 12 years. Arch Pediatr Adolesc Med. 2009;163(4):297–302. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 129. Bickel WK, Marsch LA. Toward a behavioral economic understanding of drug dependence: delay discounting processes. Addiction. 2001;96(1):73–86. [DOI] [PubMed] [Google Scholar]
  • 130. Bickel WK, Freitas-Lemos R, Tomlinson DC, Craft WH, Keith DR, Athamneh LNet al. Temporal discounting as a candidate behavioral marker of obesity. Neurosci Biobehav Rev. 2021;129:307–29. [DOI] [PubMed] [Google Scholar]
  • 131. DeHart WB, Snider SE, Pope DA, Bickel WK. A reinforcer pathology model of health behaviors in individuals with obesity. Health Psychol. 2020;39(11):966–74. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 132. Epstein LH, Salvy SJ, Carr KA, Dearing KK, Bickel WK. Food reinforcement, delay discounting and obesity. Physiol Behav. 2010;100(5):438–45. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 133. Epstein LH, Paluch RA, Stein JS, Quattrin T, Mastrandrea LD, Bree KAet al. Delay discounting, glycemic regulation and health behaviors in adults with prediabetes. Behav Med. Published online April 10, 2020. doi: 10.1080/08964289.2020.1712581. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 134. Bickel WK, Jarmolowicz DP, Mueller ET, Koffarnus MN, Gatchalian KM. Excessive discounting of delayed reinforcers as a trans-disease process contributing to addiction and other disease-related vulnerabilities: emerging evidence. Pharmacol Ther. 2012;134(3):287–97. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 135. Epstein LH, Jankowiak N, Fletcher KD, Carr KA, Nederkoorn C, Raynor HAet al. Women who are motivated to eat and discount the future are more obese. Obesity (Silver Spring). 2014;22(6):1394–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 136. Rollins BY, Dearing KK, Epstein LH. Delay discounting moderates the effect of food reinforcement on energy intake among non-obese women. Appetite. 2010;55(3):420–5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 137. Carr KA, Daniel TO, Lin H, Epstein LH. Reinforcement pathology and obesity. Curr Drug Abuse Rev. 2011;4(3):190–6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 138. Bickel WK, Tegge AN, Carr KA, Epstein LH. Reinforcer pathology's alternative reinforcer hypothesis: a preliminary examination. Health Psychol [Internet]. 2020. Available from: https://www.ncbi.nlm.nih.gov/pubmed/32969696. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 139. Ma F, Chen B, Xu F, Lee K, Heyman GD. Generalized trust predicts young children's willingness to delay gratification. J Exp Child Psychol. 2018;169:118–25. [DOI] [PubMed] [Google Scholar]
  • 140. Watts TW, Duncan GJ, Quan H. Revisiting the marshmallow test: a conceptual replication investigating links between early delay of gratification and later outcomes. Psychol Sci. 2018;29(7):1159–77. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 141. Koffarnus MN, Bickel WK. A 5-trial adjusting delay discounting task: accurate discount rates in less than one minute. Exp Clin Psychopharmacol. 2014;22(3):222–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 142. Rasmussen EB, Lawyer SR, Reilly W. Percent body fat is related to delay and probability discounting for food in humans. Behav Processes. 2010;83(1):23–30. [DOI] [PubMed] [Google Scholar]
  • 143. Bickel WK, Stein JS, Paluch RA, Mellis AM, Athamneh LN, Quattrin Tet al. Does episodic future thinking repair immediacy bias at home and in the laboratory in patients with prediabetes?. Psychosom Med. 2020;82(7):699–707. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 144. Ello-Martin JA, Ledikwe JH, Rolls BJ. The influence of food portion size and energy density on energy intake: implications for weight management. Am J Clin Nutr. 2005;82(1):236S–41S. [DOI] [PubMed] [Google Scholar]
  • 145. De Graaf C, Blom WA, Smeets PA, Stafleu A, Hendriks HF. Biomarkers of satiation and satiety. Am J Clin Nutr. 2004;79(6):946–61. [DOI] [PubMed] [Google Scholar]
  • 146. Gerstein DE, Woodward-Lopez G, Evans AE, Kelsey K, Drewnowski A. Clarifying concepts about macronutrients’ effects on satiation and satiety. J Am Diet Assoc. 2004;104(7):1151–3. [DOI] [PubMed] [Google Scholar]
  • 147. Stubbs J, Ferres S, Horgan G. Energy density of foods: effects on energy intake. Crit Rev Food Sci Nutr. 2000;40(6):481–515. [DOI] [PubMed] [Google Scholar]
  • 148. de Graaf C. Why liquid energy results in overconsumption. Proc Nutr Soc. 2011;70(2):162–70. [DOI] [PubMed] [Google Scholar]
  • 149. Rolls BJ, Bell EA, Waugh BA. Increasing the volume of a food by incorporating air affects satiety in men. Am J Clin Nutr. 2000;72(2):361–8. [DOI] [PubMed] [Google Scholar]
  • 150. Rolls BJ, Castellanos VH, Halford JC, Kilara A, Panyam D, Pelkman CLet al. Volume of food consumed affects satiety in men. Am J Clin Nutr. 1998;67(6):1170–7. [DOI] [PubMed] [Google Scholar]
  • 151. Bartoshuk LM, Duffy VB, Green BG, Hoffman HJ, Ko CW, Lucchina LAet al. Valid across-group comparisons with labeled scales: the gLMS versus magnitude matching. Physiol Behav. 2004;82(1):109–14. [DOI] [PubMed] [Google Scholar]
  • 152. Cardello AV, Schutz HG, Lesher LL, Merrill E. Development and testing of a labeled magnitude scale of perceived satiety. Appetite. 2005;44(1):1–13. [DOI] [PubMed] [Google Scholar]
  • 153. Keller KL, Assur SA, Torres M, Lofink HE, Thornton JC, Faith MSet al. Potential of an analog scaling device for measuring fullness in children: development and preliminary testing. Appetite. 2006;47(2):233–43. [DOI] [PubMed] [Google Scholar]
  • 154. Rolls BJ, van Duijvenvoorde PM, Rowe EA. Variety in the diet enhances intake in a meal and contributes to the development of obesity in the rat. Physiol Behav. 1983;31(1):21–7. [DOI] [PubMed] [Google Scholar]
  • 155. Rolls BJ, Rolls ET, Rowe EA, Sweeney K. Sensory specific satiety in man. Physiol Behav. 1981;27(1):137–42. [DOI] [PubMed] [Google Scholar]
  • 156. Zanchi D, Depoorter A, Egloff L, Haller S, Mahlmann L, Lang UEet al. The impact of gut hormones on the neural circuit of appetite and satiety: a systematic review. Neurosci Biobehav Rev. 2017;80:457–75. [DOI] [PubMed] [Google Scholar]
  • 157. Huda MS, Wilding JP, Pinkney JH. Gut peptides and the regulation of appetite. Obes Rev. 2006;7(2):163–82. [DOI] [PubMed] [Google Scholar]
  • 158. Hameed S, Dhillo WS, Bloom SR. Gut hormones and appetite control. Oral Dis. 2009;15(1):18–26. [DOI] [PubMed] [Google Scholar]
  • 159. Brennan IM, Little TJ, Feltrin KL, Smout AJ, Wishart JM, Horowitz Met al. Dose-dependent effects of cholecystokinin-8 on antropyloroduodenal motility, gastrointestinal hormones, appetite, and energy intake in healthy men. Am J Physiol Endocrinol Metab. 2008;295(6):E1487–94. [DOI] [PubMed] [Google Scholar]
  • 160. Epstein LH, Temple JL, Roemmich JN, Bouton ME. Habituation as a determinant of human food intake. Psychol Rev. 2009;116(2):384–407. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 161. Epstein LH, Robinson JL, Temple JL, Roemmich JN, Marusewski AL, Nadbrzuch RL. Variety influences habituation of motivated behavior for food and energy intake in children. Am J Clin Nutr. 2009;89(3):746–54. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 162. Myers MD, Epstein LH. The effect of dietary fat on salivary habituation and satiation. Physiol Behav. 1997;62(1):155–61. [DOI] [PubMed] [Google Scholar]
  • 163. Myers Ernst M, Epstein LH. Habituation of responding for food in humans. Appetite. 2002;38(3):224–34. [DOI] [PubMed] [Google Scholar]
  • 164. Epstein LH, Carr KA, Cavanaugh MD, Paluch RA, Bouton ME. Long-term habituation to food in obese and nonobese women. Am J Clin Nutr. 2011;94(2):371–6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 165. Epstein LH, Robinson JL, Roemmich JN, Marusewski A. Slow rates of habituation predict greater zBMI gains over 12 months in lean children. Eat Behav. 2011;12(3):214–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 166. Rolls ET, Murzi S, Yaxley S, Thorpe SJ, Simpson SJ. Sensory-specific satiety: food-specific reduction in responsiveness of ventral forebrain neurons after feeding in the monkey. Brain Res. 1986;368(1):79–86. [DOI] [PubMed] [Google Scholar]
  • 167. Rolls ET, Rolls BJ, Rowe EA. Sensory-specific satiety and motivation specific satiety for the sight and taste of food and water in man. Physiol Behav. 1983;30(2):185–92. [DOI] [PubMed] [Google Scholar]
  • 168. Williams RA, Roe LS, Rolls BJ. Assessment of satiety depends on the energy density and portion size of the test meal. Obesity (Silver Sprint). 2014;22(2):318–24. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 169. Yaxley S, Rolls ET, Sienkiewicz ZJ, Scott TR. Satiety does not affect gustatory activity in the nucleus of the solitary tract of the alert monkey. Brain Res. 1985;347(1):85–93. [DOI] [PubMed] [Google Scholar]
  • 170. Epstein LH, Fletcher KD, O'Neill J, Roemmich JN, Raynor H, Bouton ME. Food characteristics, long-term habituation and energy intake. Laboratory and field studies. Appetite. 2013;60(1):40–50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 171. Carr KA, Epstein LH. Relationship between food habituation and reinforcing efficacy of food. Learn Motiv. 2011;42(2):165–72. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 172. Rolls BJ. Sensory-specific satiety. Nutr Rev. 1986;44(3):93–101. [DOI] [PubMed] [Google Scholar]
  • 173. Rolls BJ, Hetherington M, Burley VJ. The specificity of satiety: the influence of foods of different macronutrient content on the development of satiety. Physiol Behav. 1988;43(2):145–53. [DOI] [PubMed] [Google Scholar]
  • 174. Epstein LH, Carr KA. Food reinforcement and habituation to food are processes related to initiation and cessation of eating. Physiol Behav. 2021;239:113512.doi: 10.1016/j.physbeh.2021.113512 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 175. Robinson E, Almiron-Roig E, Rutters F, de Graaf C, Forde CG, Tudur Smith Cet al. A systematic review and meta-analysis examining the effect of eating rate on energy intake and hunger. Am J Clin Nutr. 2014;100(1):123–51. [DOI] [PubMed] [Google Scholar]
  • 176. Burger KS, Fisher JO, Johnson SL. Mechanisms behind the portion size effect: visibility and bite size. Obesity (Silver Spring). 2011;19(3):546–51. [DOI] [PubMed] [Google Scholar]
  • 177. Hoddy KK, Marlatt KL, Cetinkaya H, Ravussin E. Intermittent fasting and metabolic health: from religious fast to time-restricted feeding. Obesity (Silver Spring). 2020;28(Suppl 1):S29–37. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 178. Blundell JE, King NA. Effects of exercise on appetite control: loose coupling between energy expenditure and energy intake. Int J Obes Relat Metab Disord. 1998;22(Suppl 2):S22–9. [PubMed] [Google Scholar]
  • 179. Flack K, Pankey C, Ufholz K, Johnson L, Roemmich JN. Genetic variations in the dopamine reward system influence exercise reinforcement and tolerance for exercise intensity. Behav Brain Res. 2019;375:112148. [DOI] [PubMed] [Google Scholar]
  • 180. Lynch WJ, Piehl KB, Acosta G, Peterson AB, Hemby SE. Aerobic exercise attenuates reinstatement of cocaine-seeking behavior and associated neuroadaptations in the prefrontal cortex. Biol Psychiatry. 2010;68(8):774–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 181. Champagne CM, Baker NB, DeLany JP, Harsha DW, Bray GA. Assessment of energy intake underreporting by doubly labeled water and observations on reported nutrient intakes in children. J Am Diet Assoc. 1998;98(4):426–33. [DOI] [PubMed] [Google Scholar]
  • 182. Fricker J, Baelde D, Igoin-Apfelbaum L, Huet J-M, Apfelbaum M. Underreporting of food intake in obese “small eaters.” Appetite. 1992;19(3):273–83. [DOI] [PubMed] [Google Scholar]
  • 183. Zhang R, Amft O. Monitoring chewing and eating in free-living using smart eyeglasses. IEEE J Biomed Health Informatics. 2018;22(1):23–32. [DOI] [PubMed] [Google Scholar]
  • 184. Goris AHC, Meijer EP, Kester A, Westerterp KR. Use of a triaxial accelerometer to validate reported food intakes. Am J Clin Nutr. 2001;73(3):549–53. [DOI] [PubMed] [Google Scholar]
  • 185. Fisher GG, Chacon M, Chaffee DS.Theories of cognitive aging and workIn: Baltes BB, Rudolph CW, Zacher H. Editor(s), Work across the lifespan, London: Academic Press; 2019;17–45. [Google Scholar]
  • 186. Stepan ME, Altmann EM, Fenn KM. Caffeine selectively mitigates cognitive deficits caused by sleep deprivation. J Exp Psychol Learn Mem Cogn. 2021;47(9):1371–82. [DOI] [PubMed] [Google Scholar]
  • 187. Wightman EL, Jackson PA, Spittlehouse B, Heffernan T, Guillemet D, Kennedy DO. The acute and chronic cognitive effects of a sage extract: a randomized, placebo controlled study in healthy humans. Nutrients. 2021;13(1):218. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 188. Rasmussen J. The Lipididiet trial: what does it add to the current evidence for Fortasyn connect in early Alzheimer's disease?. Clin Intervent Aging. 2019;14:1481–92. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 189. Bai D, Fan J, Li M, Dong C, Gao Y, Fu Met al. Effects of folic acid combined with DHA supplementation on cognitive function and amyloid-β-related biomarkers in older adults with mild cognitive impairment by a randomized, double blind, placebo-controlled trial. J Alzheimers Dis. 2021;81(1):155–67. [DOI] [PubMed] [Google Scholar]
  • 190. Valls-Pedret C, Sala-Vila A, Serra-Mir M, Corella D, de la Torre R, Martínez-González MÁet al. Mediterranean diet and age-related cognitive decline: a randomized clinical trial. JAMA Intern Med. 2015;175(7):1094–103. [DOI] [PubMed] [Google Scholar]
  • 191. Kesse-Guyot E, Andreeva VA, Jeandel C, Ferry M, Hercberg S, Galan P. A healthy dietary pattern at midlife is associated with subsequent cognitive performance. J Nutr. 2012;142(5):909–15. [DOI] [PubMed] [Google Scholar]
  • 192. Kesse-Guyot E, Andreeva VA, Ducros V, Jeandel C, Julia C, Hercberg Set al. Carotenoid-rich dietary patterns during midlife and subsequent cognitive function. Br J Nutr. 2014;111(5):915–23. [DOI] [PubMed] [Google Scholar]
  • 193. The SU.VI.MAX 2 Research Group . Mediterranean diet and cognitive function: a French study. Am J Clin Nutr. 2013;97(2):369–76. [DOI] [PubMed] [Google Scholar]
  • 194. Smith A. Effects of chewing gum on mood, learning, memory and performance of an intelligence test. Nutr Neurosci. 2009;12(2):81–8. [DOI] [PubMed] [Google Scholar]
  • 195. Brickman AM, Khan UA, Provenzano FA, Yeung L-K, Suzuki W, Schroeter Het al. Enhancing dentate gyrus function with dietary flavanols improves cognition in older adults. Nat Neurosci. 2014;17(12):1798–803. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 196. Park SK, Jung IC, Lee WK, Lee YS, Park HK, Go HJet al. A combination of green tea extract and l-theanine improves memory and attention in subjects with mild cognitive impairment: a double-blind placebo-controlled study. J Med Food. 2011;14(4):334–43. [DOI] [PubMed] [Google Scholar]
  • 197. Partington JE, Leiter RG. Partington's Pathways Test. Psycholog Service Center J. 1949;1:11–20. [Google Scholar]
  • 198. Mastroiacovo D, Kwik-Uribe C, Grassi D, Necozione S, Raffaele A, Pistacchio Let al. Cocoa flavanol consumption improves cognitive function, blood pressure control, and metabolic profile in elderly subjects: the Cocoa, Cognition, and Aging (CoCoA) study—a randomized controlled trial. Am J Clin Nutr. 2015;101(3):538–48. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 199. Desideri G, Kwik-Uribe C, Grassi D, Necozione S, Ghiadoni L, Mastroiacovo Det al. Benefits in cognitive function, blood pressure, and insulin resistance through cocoa flavanol consumption in elderly subjects with mild cognitive impairment: the Cocoa, Cognition, and Aging (CoCoA) study. Hypertension. 2012;60(3):794–801. [DOI] [PubMed] [Google Scholar]
  • 200. Shao Z, Janse E, Visser K, Meyer AS. What do verbal fluency tasks measure? Predictors of verbal fluency performance in older adults. Front Psychol. 2014;5:772. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 201. Giesbrecht T, Rycroft JA, Rowson MJ, De Bruin EA. The combination of L-theanine and caffeine improves cognitive performance and increases subjective alertness. Nutr Neurosci. 2010;13(6):283–90. [DOI] [PubMed] [Google Scholar]
  • 202. Kirchner WK. Age differences in short-term retention of rapidly changing information. J Exp Psychol. 1958;55(4):352–8. [DOI] [PubMed] [Google Scholar]
  • 203. Bond A, Lader M. The use of analogue scales in rating subjective feelings. Br J Med Psychol. 1974;47(3):211–8. [Google Scholar]
  • 204. McNair D, Lorr M, Droppleman L. Manual Profile of Mood States. Educational & Industrial Testing Service. 1971;756. [Google Scholar]
  • 205. Spielberger CD, Gorsuch RL, Lushene R, Vagg PR, Jacobs GA. STAI: Manual for the State-Trait Anxiety Inventory, Palo Alto, CA: Consulting Psychologists Press; 1968. [Google Scholar]
  • 206. Rogers PJ, Richardson NJ, Elliman NA. Overnight caffeine abstinence and negative reinforcement of preference for caffeine-containing drinks. Psychopharmacology (Berl). 1995;120(4):457–62. [DOI] [PubMed] [Google Scholar]
  • 207. Ray MK, Sylvester MD, Osborn L, Helms J, Turan B, Burgess EEet al. The critical role of cognitive-based trait differences in transcranial direct current stimulation (tDCS) suppression of food craving and eating in frank obesity. Appetite. 2017;116:568–74. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 208. Nuechterlein KH, Green MF, Kern RS, Baade LE, Barch DM, Cohen JDet al. The MATRICS consensus cognitive battery, part 1: test selection, reliability, and validity. Am J Psychiatry. 2008;165(2):203–13. [DOI] [PubMed] [Google Scholar]
  • 209. Julian LJ. Measures of anxiety: State-Trait Anxiety Inventory (STAI), Beck Anxiety Inventory (BAI), and Hospital Anxiety and Depression Scale-Anxiety (HADS-A). Arthritis Care Res. 2011;63(Suppl 11):S467–72. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 210. Armitage H. 5 Questions: John Ioannidis calls for more rigorous nutrition research, Stanford Medicine News Center, 2018, [cited 2021 Nov 2]. Available from: http://aemstage.med.stanford.edu/news/all-news/2018/07/john-ioannidis-calls-for-more-rigorous-nutrition-research.html. [Google Scholar]
  • 211. Eklund A, Nichols TE, Knutsson H. Cluster failure: why fMRI inferences for spatial extent have inflated false-positive rates. Proc Natl Acad Sci. 2016;113(28):7900. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 212. Sacchet MD, Knutson B. Spatial smoothing systematically biases the localization of reward-related brain activity. Neuroimage. 2013;66:270–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 213. Bennett CM, Miller MB. How reliable are the results from functional magnetic resonance imaging?. Ann NY Acad Sci. 2010;1191(1):133–55. [DOI] [PubMed] [Google Scholar]
  • 214. Chen X, Lu B, Yan C-G. Reproducibility of R-fMRI metrics on the impact of different strategies for multiple comparison correction and sample sizes. Hum Brain Mapp. 2018;39(1):300–18. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 215. ABC News . Precision nutrition: what this new field is trying to accomplish, and why current technology could hold us back [Internet]. ABCNews. [cited 2021 Nov 8]. Available from: https://abcnews.go.com/Health/precision-nutrition-field-accomplish-current-technology-hold-us/story?id=72653764. [Google Scholar]
  • 216. Carlsen MH, Lillegaard IT, Karlsen A, Blomhoff R, Drevon CA, Andersen LF. Evaluation of energy and dietary intake estimates from a food frequency questionnaire using independent energy expenditure measurement and weighed food records. Nutr J. 2010;9(1):37. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 217. Bingham SA, Day NE. Using biochemical markers to assess the validity of prospective dietary assessment methods and the effect of energy adjustment. Am J Clin Nutr. 1997;65(4):1130S–7S. [DOI] [PubMed] [Google Scholar]
  • 218. Thompson FE, Subar AF. Dietary assessment methodology. In: Coulston AM, Boushey CJ, Ferruzzi MG, eds. Nutrition in the prevention and treatment of disease. [Internet]. London: Elsevier; 2013, p. 5–46.. [cited 2021 May 16]. Available from: https://linkinghub.elsevier.com/retrieve/pii/B9780123918840000019. [Google Scholar]
  • 219. Riboli E, Hunt KJ, Slimani N, Ferrari P, Norat T, Fahey Met al. European Prospective Investigation into Cancer and Nutrition (EPIC): study populations and data collection. Public Health Nutr. 2002;5(6b):1113–24. [DOI] [PubMed] [Google Scholar]
  • 220. Ferrucci L. The Baltimore Longitudinal Study of Aging (BLSA): a 50-Year-Long journey and plans for the future. J Gerontol A Biol Sci Med Sci. 2008;63(12):1416–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 221. Ortega RM, Pérez-Rodrigo C, López-Sobaler AM. Dietary assessment methods: dietary records. Nutr Hosp. 2015;31(Suppl 3):38–45. [DOI] [PubMed] [Google Scholar]
  • 222. Sempos CT, Johnson NE, Smith EL, Gilligan C. Effects of intraindividual and interindividual variation in repeated dietary records. Am J Epidemiol. 1985;121(1):120–30. [DOI] [PubMed] [Google Scholar]
  • 223. De Keyzer W, Huybrechts I, De Vriendt V, Vandevijvere S, Slimani N, Van Oyen Het al. Repeated 24-hour recalls versus dietary records for estimating nutrient intakes in a national food consumption survey. Food Nutr Res [Internet], 2011, [cited 2021 May 16]; 55. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 224. Franco RZ, Fallaize R, Lovegrove JA, Hwang F. Popular nutrition-related mobile apps: a feature assessment. JMIR Mhealth Uhealth. 2016;4(3):e85. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 225. Teixeira V, Voci SM, Mendes-Netto RS, da Silva DG. The relative validity of a food record using the smartphone application Myfitnesspal. Nutr Diet. 2018;75(2):219–25. [DOI] [PubMed] [Google Scholar]
  • 226. FoodData Central [Internet][cited 2021 May 16]. Available from: https://fdc.nal.usda.gov/. [Google Scholar]
  • 227. Moshfegh AJ, Rhodes DG, Baer DJ, Murayi T, Clemens JC, Rumpler WVet al. The US Department of Agriculture Automated Multiple-Pass Method reduces bias in the collection of energy intakes. Am J Clin Nutr. 2008;88(2):324–32. [DOI] [PubMed] [Google Scholar]
  • 228. Feskanich D, Sielaff BH, Chong K, Buzzard IM. Computerized collection and analysis of dietary intake information. Comput Methods Programs Biomed. 1989;30(1):47–57. [DOI] [PubMed] [Google Scholar]
  • 229. Zimmerman TP, Hull SG, McNutt S, Mittl B, Islam N, Guenther PMet al. Challenges in converting an interviewer-administered food probe database to self-administration in the National Cancer Institute automated self-administered 24-hour recall (ASA24). J Food Compos Anal. 2009;22(Suppl 1):S48–51. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 230. Pannucci TE, Thompson FE, Bailey RL, Dodd KW, Potischman N, Kirkpatrick SIet al. Comparing reported dietary supplement intakes between two 24-hour recall methods: the automated self-administered 24-hour dietary assessment tool and the interview-administered Automated Multiple Pass Method. J Acad Nutr Diet. 2018;118(6):1080–6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 231. Kirkpatrick SI, Subar AF, Douglass D, Zimmerman TP, Thompson FE, Kahle LLet al. Performance of the automated self-administered 24-hour recall relative to a measure of true intakes and to an interviewer-administered 24-h recall. Am J Clin Nutr. 2014;100(1):233–40. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 232. Naska A, Lagiou A, Lagiou P. Dietary assessment methods in epidemiological research: current state of the art and future prospects. F1000Research[Internet]. 2017[cited 2021 May 3];6:926. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5482335/. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 233. Freedman LS, Schatzkin A, Midthune D, Kipnis V. Dealing with dietary measurement error in nutritional cohort studies. J Natl Cancer Inst. 2011;103(14):1086–92. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 234. Hörnell A, Winkvist A, Hallmans G, Weinehall L, Johansson I. Mis-reporting, previous health status and health status of family may seriously bias the association between food patterns and disease. Nutr J. 2010;9(1):48. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 235. Collins C, Kirkpatrick S. Assessment of nutrient intakes, [Internet]. 2017. [cited 2021 May 3]. Available from: https://torl.biblioboard.com/content/2d5fef42-3aea-4078-b2ae-ee0c927236e6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 236. Lucassen DA, Brouwer-Brolsma EM, van de Wiel AM, Siebelink E, Feskens EJM. Iterative development of an innovative smartphone-based dietary assessment tool: Traqq. J Visual Exp. 2021;Mar 19:(169)doi: 10.3791/62032. [DOI] [PubMed] [Google Scholar]
  • 237. Willett WC, Sampson L, Stampfer MJ, Rosner B, Bain C, Witschi Jet al. Reproducibility and validity of a semiquantitative food frequency questionnaire. Am J Epidemiol. 1985;122(1):51–65. [DOI] [PubMed] [Google Scholar]
  • 238. Block G, Hartman AM, Dresser CM, Carroll MD, Gannon J, Gardner L. A data-based approach to diet questionnaire design and testing. Am J Epidemiol. 1986;124(3):453–69. [DOI] [PubMed] [Google Scholar]
  • 239. Stram DO, Hankin JH, Wilkens LR, Pike MC, Monroe KR, Park Set al. Calibration of the Dietary Questionnaire for a Multiethnic Cohort in Hawaii and Los Angeles. Am J Epidemiol. 2000;151(4):358–70. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 240. Tucker KL, Bianchi LA, Maras J, Bermudez OI. Adaptation of a food frequency questionnaire to assess diets of Puerto Rican and non-Hispanic adults. Am J Epidemiol. 1998;148(5):507–18. [DOI] [PubMed] [Google Scholar]
  • 241. Tucker KL, Maras J, Champagne C, Connell C, Goolsby S, Weber Jet al. A regional food-frequency questionnaire for the US Mississippi Delta. Public Health Nutr. 2005;8(1):87–96. [PubMed] [Google Scholar]
  • 242. Willett W. Nutritional epidemiology, New York: Oxford University Press; 2017. [Google Scholar]
  • 243. Mohd Razif S, Abbe Maleyki MJ, Marhazlina M, Wee BS, Sakinah H, Engku Fadzli Hasan SAet al. Development of digital food photographs as a tool to visually estimate food portion size in multi-ethnic Malaysian adult [Internet]. Bali, Indonesia; 2019[cited 2021 May 3]. Available from: https://eprints.unisza.edu.my/2026/. [Google Scholar]
  • 244. Almiron-Roig E, Aitken A, Galloway C, Ellahi B. Dietary assessment in minority ethnic groups: a systematic review of instruments for portion-size estimation in the United Kingdom. Nutr Rev. 2017;75(3):188–213. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 245. Hotz C, Abdelrahman L. Simple methods to obtain food listing and portion size distribution estimates for use in semi-quantitative dietary assessment methods. PLoS One. 2019;14(10):e0217379. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 246. Diet History Questionnaire III (DHQ III)| EGRP/DCCPS/NCI/NIH [Internet]. [cited 2021 Mar 7]. Available from: https://epi.grants.cancer.gov/dhq3/. [Google Scholar]
  • 247. Subar AF, Thompson FE, Smith AF, Jobe JB, Ziegler RG, Potischman Net al. Improving food frequency questionnaires: a qualitative approach using cognitive interviewing. J Am Diet Assoc. 1995;95(7):781–88.; quiz 789–90. [DOI] [PubMed] [Google Scholar]
  • 248. Kristal AR, Kolar AS, Fisher JL, Plascak JJ, Stumbo PJ, Weiss Ret al. Evaluation of web-based, self-administered, graphical food frequency questionnaire. J Acad Nutr Diet. 2014;114(4):613–21. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 249. Tucker KL, Chen H, Vogel S, Wilson PW, Schaefer EJ, Lammi-Keefe CJ. Carotenoid intakes, assessed by dietary questionnaire, are associated with plasma carotenoid concentrations in an elderly population. J Nutr. 1999;129(2):438–45. [DOI] [PubMed] [Google Scholar]
  • 250. Tucker KL, Rich S, Rosenberg I, Jacques P, Dallal G, Wilson PWet al. Plasma vitamin B-12 concentrations relate to intake source in the Framingham Offspring Study. Am J Clin Nutr. 2000;71(2):514–22. [DOI] [PubMed] [Google Scholar]
  • 251. Tucker KL, Selhub J, Wilson PW, Rosenberg IH. Dietary intake pattern relates to plasma folate and homocysteine concentrations in the Framingham Heart Study. J Nutr. 1996;126(12):3025–31. [DOI] [PubMed] [Google Scholar]
  • 252. Schröder H, Benitez Arciniega A, Soler C, Covas M-I, Baena-Díez JM, Marrugat Jet al. Validity of two short screeners for diet quality in time-limited settings. Public Health Nutr. 2012;15(4):618–26. [DOI] [PubMed] [Google Scholar]
  • 253. Greene GW, Resnicow K, Thompson FE, Peterson KE, Hurley TG, Hebert JRet al. Correspondence of the NCI fruit and vegetable screener to repeat 24-H recalls and serum carotenoids in behavioral intervention trials. J Nutr. 2008;138(1):200S–4S. [DOI] [PubMed] [Google Scholar]
  • 254. Yaroch AL, Tooze J, Thompson FE, Blanck HM, Thompson OM, Colón-Ramos Uet al. Evaluation of three short dietary instruments to assess fruit and vegetable intake: the National Cancer Institute's food attitudes and behaviors (FAB) survey. J Acad Nutr Diet. 2012;112(10):1570–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 255. Lundeen EA, Park S, Dooyema C, Blanck HM. Total sugar-sweetened beverage intake among US adults was lower when measured using a 1-question versus 4-question screener. Am J Health Promot. 2018;32(6):1431–7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 256. Gilmore JME, Marshall TA, Levy SM, Stumbo PJ. Development of the Iowa bone nutrient food frequency questionnaire based on data from the US Department of Agriculture Continuing Survey of the Food Intake by Individuals. J Food Compos Anal. 2008;21:S60–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 257. Keogh RH, White IR. A toolkit for measurement error correction, with a focus on nutritional epidemiology. Stat Med. 2014;33(12):2137–55. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 258. Gao Q, Praticò G, Scalbert A, Vergères G, Kolehmainen M, Manach Cet al. A scheme for a flexible classification of dietary and health biomarkers. Genes Nutr. [Internet]2017; [cited 2021 May 16];12. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5728065/. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 259. Potischman N, Freudenheim JL. Biomarkers of nutritional exposure and nutritional status: an overview. J Nutr. 2003;133(3):873S–4S. [DOI] [PubMed] [Google Scholar]
  • 260. Freedman LS, Midthune D, Carroll RJ, Tasevska N, Schatzkin A, Mares Jet al. Using regression calibration equations that combine self-reported intake and biomarker measures to obtain unbiased estimates and more powerful tests of dietary associations. Am J Epidemiol. 2011;174(11):1238–45. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 261. Sluik D, Geelen A, Vries JHM, Eussen S, Brants HAM, Meijboom Set al. A national FFQ for the Netherlands (the FFQ-NL 1.0): validation of a comprehensive FFQ for adults. Br J Nutr. 2016;116(5):913–23. [DOI] [PubMed] [Google Scholar]
  • 262. Freedman LS, Commins JM, Moler JE, Arab L, Baer DJ, Kipnis Vet al. Pooled results from 5 validation studies of dietary self-report instruments using recovery biomarkers for energy and protein intake. Am J Epidemiol. 2014;180(2):172–88. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 263. Cobb LK, Anderson CAM, Elliott P, Hu FB, Liu K, Neaton JDet al. Methodological issues in cohort studies that relate sodium intake to cardiovascular disease outcomes: a science advisory from the American Heart Association. Circulation. 2014;129(10):1173–86. [DOI] [PubMed] [Google Scholar]
  • 264. Macena ML, Pureza I, Melo ISV, Clemente AG, Ferreira HS, Florêncio Tet al. Agreement between the total energy expenditure calculated with accelerometry data and the BMR yielded by predictive equations v. the total energy expenditure obtained with doubly labelled water in low-income women with excess weight. Br J Nutr. 2019;122(12):1398–408. [DOI] [PubMed] [Google Scholar]
  • 265. Black AE, Bingham SA, Johansson G, Coward WA. Validation of dietary intakes of protein and energy against 24 hour urinary N and DLW energy expenditure in middle-aged women, retired men and post-obese subjects: comparisons with validation against presumed energy requirements. Eur J Clin Nutr. 1997;51(6):405–13. [DOI] [PubMed] [Google Scholar]
  • 266. Montgomery C, Reilly JJ, Jackson DM, Kelly LA, Slater C, Paton JYet al. Validation of energy intake by 24-hour multiple pass recall: comparison with total energy expenditure in children aged 5–7 years. Br J Nutr. 2005;93(5):671–6. [DOI] [PubMed] [Google Scholar]
  • 267. Trijsburg L, Geelen A, Hollman PC, Hulshof PJ, Feskens EJ, Van't Veer Pet al. BMI was found to be a consistent determinant related to misreporting of energy, protein and potassium intake using self-report and duplicate portion methods. Public Health Nutr. 2017;20(4):598–607. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 268. Lynch S, Pfeiffer CM, Georgieff MK, Brittenham G, Fairweather-Tait S, Hurrell RFet al. Biomarkers of Nutrition for Development (BOND)—iron review. J Nutr. 2018;148(Suppl 1):1001S–67S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 269. King JC, Brown KH, Gibson RS, Krebs NF, Lowe NM, Siekmann JHet al. Biomarkers of Nutrition for Development (BOND)—zinc review. J Nutr. 2015;146(4):858S–85S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 270. Rohner F, Zimmermann M, Jooste P, Pandav C, Caldwell K, Raghavan Ret al. Biomarkers of Nutrition for Development—iodine review. J Nutr. 2014;144(8):1322S–42S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 271. Tanumihardjo SA, Russell RM, Stephensen CB, Gannon BM, Craft NE, Haskell MJet al. Biomarkers of Nutrition for Development (BOND)—vitamin A review. J Nutr. 2016;146(9):1816S–48S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 272. Bailey LB, Stover PJ, McNulty H, Fenech MF, Gregory JF, Mills JLet al. Biomarkers of Nutrition for Development—folate review. J Nutr. 2015;145(7):1636S–80S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 273. Allen LH, Miller JW, de Groot L, Rosenberg IH, Smith AD, Refsum Het al. Biomarkers of Nutrition for Development (BOND): vitamin B-12 review. J Nutr. 2018;148(Suppl 4):1995S–2027S. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 274. Wasantwisut E, Neufeld L. Use of nutritional biomarkers in program evaluation in the context of developing countries. J Nutr. 2012;142(1):186S–90S. [DOI] [PubMed] [Google Scholar]
  • 275. Armah SM, Carriquiry A, Sullivan D, Cook JD, Reddy MB. A complete diet-based algorithm for predicting nonheme iron absorption in adults. J Nutr. 2013;143(7):1136–40. [DOI] [PubMed] [Google Scholar]
  • 276. Hallberg L, Hulthén L. Prediction of dietary iron absorption: an algorithm for calculating absorption and bioavailability of dietary iron. Am J Clin Nutr. 2000;71(5):1147–60. [DOI] [PubMed] [Google Scholar]
  • 277. Nair R, Maseeh A. Vitamin D: the “sunshine” vitamin. J Pharmacol Pharmacother. 2012;3(2):118–26. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 278. Heidari B, Haji Mirghassemi MB. Seasonal variations in serum vitamin D according to age and sex. Caspian J Intern Med. 2012;3(4):535–40. [PMC free article] [PubMed] [Google Scholar]
  • 279. Klingberg E, Oleröd G, Konar J, Petzold M, Hammarsten O. Seasonal variations in serum 25-hydroxy vitamin D levels in a Swedish cohort. Endocrine. 2015;49(3):800–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 280. Gallagher JC. Vitamin D and aging. Endocrinol Metab Clin North Am. 2013;42(2):319–32. [DOI] [PMC free article] [PubMed] [Google Scholar]