Abstract
Interventions to address micronutrient deficiencies have large potential to reduce the related disease and economic burden. However, the potential risks of excessive micronutrient intakes are often not well characterized. During the Global Summit on Food Fortification, 9–11 September 2015, in Arusha, a symposium was organized on micronutrient risk–benefit assessments. Using case studies on folic acid, iodine and vitamin A, the presenters discussed how to maximize the benefits and minimize the risks of intervention programs to address micronutrient malnutrition. Pre‐implementation assessment of dietary intake, and/or of biomarkers of micronutrient exposure, status and morbidity/mortality, is critical for identifying the population segments at risk of inadequate and excessive intake. Dietary intake models make it possible to predict the effect of micronutrient interventions and their combinations, e.g. fortified food and supplements, on the proportion of the population with intakes below adequate and above safe thresholds. Continuous monitoring of micronutrient intake and biomarkers is critical to identify whether the target population is actually reached, whether subgroups receive excessive amounts, and to inform program adjustments. However, for many micronutrients the relation between regular high intake and adverse health consequences is not well understood, nor do biomarkers exist that can detect early adverse effects. More accurate and reliable biomarkers predictive of micronutrient exposure, status and function are needed to ensure effective and safe intake ranges for vulnerable population groups such as young children and pregnant women. Modelling tools that integrate information on program coverage, dietary intake distribution and biomarkers will further enable program planners to design effective, efficient and safe programs.
Keywords: micronutrient malnutrition, public health, nutritional interventions, food fortification, nutritional supplements, risk–benefit assessment
Introduction
More than 2 billion people globally are estimated to suffer from micronutrient deficiencies (von Grebmer et al. 2014). Deficiencies of iron, zinc, vitamin A, iodine, folate and other B vitamins are among the most widespread globally (Muthayya et al. 2013). Factors that contribute to poor micronutrient intake and absorption include poor dietary diversity, low nutrient density of staple‐based complementary foods, anti‐nutritional factors in plant‐based foods and environmental enteropathy. Metabolic requirements for micronutrients are especially high during early development, pregnancy and lactation. Disease conditions resulting from infections can further aggravate micronutrient deficiencies (Katona & Katona‐Apte 2008).
Strategies such as targeted supplementation with vitamin A, universal salt iodization, oil fortification with vitamin A, or flour fortification with iron or folic acid are often part of national strategies to address micronutrient malnutrition among vulnerable groups throughout the life cycle. Overcoming micronutrient deficiencies can benefit young child survival, growth and development, and can prevent intrauterine growth restriction and low birth weight (Black et al. 2013). Reviews such as the Copenhagen Consensus have consistently ranked micronutrient interventions as the most cost‐effective development interventions (Copenhagen Consensus Panel 2012). Economic analyses suggest that fortification and/or targeted supplementation with, for instance, iron, zinc, vitamin A, folic acid and iodine can be highly cost‐effective (Horton 2006; Horton et al. 2008; Edejer et al. 2005; Yi et al. 2011). Fortification will not reach all individuals but can be a cost‐effective strategy provided that a centrally processed, affordable fortified food vehicle is available and either the deficiency is widespread or only a small group is affected but the adverse effects of the deficiency are very costly (Horton 2006). Targeted micronutrient programs such as home fortification or periodic high‐dose supplements have the advantage of delivering the micronutrients of concern to the vulnerable subpopulation(s) without unnecessarily exposing other groups in the population. For supplementation to be cost‐effective, the defined target group should be readily reached with few compliance issues (Horton 2006).
When implementing micronutrient programs there is the risk of providing insufficient or excessive amounts, missing those at risk, or redundant coverage (coverage of individuals with adequate intake, or coverage by more programs than necessary), potentially resulting in inadequate or excessive intakes and poor cost‐effectiveness. Designing an effective and safe fortification or supplementation program poses the dual challenge of reaching the population groups at greatest risk of micronutrient deficiency while avoiding overexposure of individuals at the high end of the intake distribution (European Food Safety Authority (EFSA) 2010). This is particularly challenging for micronutrient interventions that exert benefits in the target population but unintended adverse effects in a subgroup of that target population with already high intake or status of the micronutrient (e.g. 'targeted' vitamin A supplementation) (European Food Safety Authority (EFSA) 2010). Increasing micronutrient intakes through universal fortification can deliver potential benefits to deficient population groups, but can sometimes also pose risks to other population groups when the micronutrient is consumed in excess (e.g. folic acid fortification to prevent neural tube defects in the unborn child versus masking of vitamin B12 deficiency in the elderly) (European Food Safety Authority (EFSA) 2010).
For each micronutrient, values have been set for the intake below which the risk of adverse health effects is likely to increase. The risk of an inadequate intake increases as an individual's intake falls further below the recommended intake (Food and Nutrition Board & Institute of Medicine (IOM) 2000). At an intake two standard deviations below the recommended intake, i.e. at the estimated average requirement (EAR), the risk that an individual's requirement is not met is approximately 50%. Adverse health consequences of a micronutrient deficiency increase in incidence and/or severity the further the usual intake falls below the EAR (Renwick et al. 2004). Therefore, reducing the number of individuals with inadequate micronutrient intakes can be expected to have clear benefits on micronutrient deficiency‐related morbidity, especially when the morbidity is severe and/or chronic.
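As a minimal illustration of this relation (all numbers are hypothetical and not taken from the report), the sketch below applies the EAR cut‐point method, in which the proportion of a group with usual intakes below the EAR approximates the prevalence of inadequate intake.

```python
# Minimal sketch (hypothetical values): the EAR cut-point method, where the
# share of a group with usual intake below the EAR approximates the
# prevalence of inadequate intake.
from statistics import NormalDist

ear = 320.0           # hypothetical estimated average requirement (mg/day)
recommended = 400.0   # hypothetical recommended intake, ~EAR + 2 SD of requirement

# Hypothetical usual-intake distribution observed in a dietary survey
usual_intake = NormalDist(mu=350.0, sigma=90.0)   # mean and SD in mg/day

prevalence_inadequate = usual_intake.cdf(ear)
print(f"Estimated prevalence of inadequate intake: {prevalence_inadequate:.1%}")

# At the EAR itself the risk that an individual's requirement is unmet is ~50%,
# consistent with the EAR lying ~2 SD of requirement below the recommended intake.
sd_requirement = (recommended - ear) / 2
print(f"Implied SD of requirement: {sd_requirement:.0f} mg/day")
```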
However, the adverse health consequences of deficiency are far better understood than the consequences of excess, if any. For many micronutrients, the 'excess' level of intake is neither well understood nor well defined, and no biomarkers exist that can detect early adverse effects. Considerable uncertainty exists for many micronutrients about the intake or biomarker 'threshold' above which the risk of adverse effects may increase, and about the dose–response relationship (Renwick et al. 2004). The tolerable upper intake level (UL) is the highest daily intake still considered safe for almost all healthy individuals in a specified group. Depending on the strength of the evidence, the UL is obtained by downward adjustment of the no‐observed‐adverse‐effect level (if known) or the lowest‐observed‐adverse‐effect level by applying an uncertainty factor. For a number of micronutrients this has resulted in a UL that is close to the recommended nutrient intake, because of the large uncertainty factors used to set the UL (European Commission 2006). For some micronutrients, no evidence of a risk of adverse effects has been established, and hence no UL is defined (i.e. biotin, pantothenic acid, chromium, vitamins B1, B2, B12 and K). For other micronutrients, the risk of exceeding the UL is low (i.e. vitamins B6, C, D and E, niacin, molybdenum, phosphorus and selenium). For some micronutrients, a potential risk of exceeding the maximum safe UL exists (i.e. vitamin A as preformed retinol, zinc, iron, iodine, copper and calcium). For the latter micronutrients, programs should be cautiously designed and monitored to ensure that 'at risk' population groups are reached without any appreciable risk of adverse effects in subgroups that consume high amounts of the micronutrient.
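As a worked illustration of this derivation (the no‐observed‐adverse‐effect level and uncertainty factor below are hypothetical, chosen only to show the arithmetic):

```latex
% Illustrative only; the NOAEL and uncertainty factor (UF) are hypothetical.
\[
\mathrm{UL} \;=\; \frac{\mathrm{NOAEL}}{\mathrm{UF}}
\;=\; \frac{3000\ \mu\mathrm{g/day}}{3}
\;=\; 1000\ \mu\mathrm{g/day}
\]
```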
Important elements of the program cycle include gathering information to assess the nutrition situation, development of a national policy and strategy, program design, implementation and monitoring, review and evaluation, and adjustment if necessary. Before implementing a micronutrient program, it is critical to assess the nutrition situation by collecting data in the different population groups across what can be referred to as the 'ABCD' of nutrition assessment: anthropometric assessments, biomarker or biochemical assessments, clinical assessments (morbidity and mortality) and dietary assessments. These data make it possible to understand the magnitude of the health problem and to identify the vulnerable population group(s) at risk of micronutrient deficiency. Information on the distribution of food and micronutrient intakes in the different population groups helps predict how micronutrient interventions will work in a variety of contexts. Another critical element in planning (cost‐)effective and safe programs is ongoing monitoring and evaluation of the nutrition situation through continued data collection. It provides an indication of the effectiveness and safety of the intervention and allows the type of intervention, the amount of micronutrients provided or the geographic distribution to be adjusted, if needed. Some countries have implemented one or multiple programs, focusing mostly on vitamin A, iodine and iron. However, data on biomarkers predictive of nutrient intake, status and adverse health effects (Combs et al. 2013; Raiten & Combs 2015) remain sparse, and few countries have conducted detailed nutrition surveys, as their collection can be time‐ and resource‐intensive. Often, multiple micronutrient programs are implemented to reach a greater proportion of the at‐risk population. Implementation of multiple micronutrient interventions in particular requires coordination and monitoring to ensure that these programs are complementary and not overlapping, which could result in unnecessary exposure of vulnerable groups. Dietary intake data and biomarkers can be used to adjust program designs to ensure that intake ranges in the different life‐stage groups are effective and safe. Dietary intake data can also be used in modelling software to predict the shift in micronutrient adequacy in different population groups for a selected food vehicle and fortification level.
The need for data modelling to predict efficacy and safety in micronutrient programs
Until recently, dietary intake modelling tools were not able to predict the effect on intake of combined fortification and supplementation programs. As addressed in this symposium report, a new simulation approach makes it possible to predict the effect of implementing fortification combined with supplementation on intakes below the EAR and above the UL, and the related program costs versus savings (Engle‐Stone et al. 2015). Other software tools have been developed that simulate the effect of changing food or nutrient intakes on prevention of the related burden of disease (i.e. the benefits). In risk–benefit approaches, the same principle is used to assess both the preventable burden of disease and the risks related to 'excessive' food or nutrient intake, provided that sufficient evidence is available to simulate the adverse health response. In order to use a risk–benefit approach, a better understanding is needed of the relation between micronutrient intake, status and morbidity/mortality outcomes. For some micronutrients the link between intake, status and morbidity/mortality is sufficiently established to predict the health benefits of overcoming deficiency (Horton 2006). However, given the current dearth of data in many countries on micronutrient intake, status and related disease incidence, program modelling to predict the benefits, and even more so the risks, remains a challenge. It is nevertheless encouraging that in many countries the collection of nutrition data is increasing, which will further help program planners and policy‐makers to design cost‐effective and safe micronutrient interventions.
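To illustrate the kind of calculation such simulation tools perform, the sketch below (hypothetical reference values and program parameters, not the model of Engle‐Stone et al.) shifts a simulated usual‐intake distribution under combined fortification and targeted supplementation and reports the share of the population below the EAR and above the UL.

```python
# Minimal sketch (hypothetical values): simulating how combined fortification
# and targeted supplementation shift the usual-intake distribution relative
# to the EAR and UL.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

ear, ul = 500.0, 3000.0                                         # hypothetical cut-points (ug/day)
baseline = rng.lognormal(mean=np.log(450), sigma=0.5, size=n)   # simulated usual intake

# Fortification: assume 70% of the population consumes the fortified vehicle,
# adding a fixed daily increment to their usual intake.
eats_fortified = rng.random(n) < 0.70
intake = baseline + eats_fortified * 250.0

# Targeted supplementation: assume 30% of the population (the target group)
# is reached and receives a daily-equivalent dose.
supplemented = rng.random(n) < 0.30
intake = intake + supplemented * 400.0

for label, x in [("baseline", baseline), ("fortification + supplementation", intake)]:
    print(f"{label}: {np.mean(x < ear):.1%} below EAR, {np.mean(x > ul):.1%} above UL")
```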
Inter‐individual variations and micronutrient interactions
Minimum and maximum intake values for micronutrients are relevant for population groups, but do not necessarily reflect adequate or safe intake values for all individuals in that group. The requirement and safety limit for a micronutrient vary not only by age, gender and life stage, but also by health condition, medication use and genetic profile, making it even more challenging to develop a safe and effective program tailored to all 'at risk' individuals. For instance, chronic disease conditions or genetic polymorphisms may lead to higher micronutrient requirements. Micronutrients do not necessarily provide benefits in all deficiency conditions; complex interactions may exist between the delivered micronutrients, micronutrient status, infections and medication use. Complicating factors include (1) interactions between micronutrients, for instance iron and zinc reducing each other's absorption (Lönnerdal 2004), (2) micronutrient–drug interactions, with micronutrients reducing the absorption or efficacy of a drug or vice versa (Scaglione & Panzavolta 2014; Hathcock 1985), and (3) antagonistic interactions between micronutrients and infections (e.g. iron and malaria). The benefits and risks of some micronutrients therefore depend on the setting in which they are provided. For instance, the effect of certain micronutrients may depend not only on whether the target population is deficient, but also on whether malaria control strategies are implemented in malaria‐endemic regions. This is exemplified by the Pemba study, in which children living on malaria‐endemic Pemba island who received iron and folic acid supplements were more likely to die or to need hospital treatment for an adverse event (Sazawal et al. 2006). However, in a sub‐study of the trial in which children were given malaria‐reduction measures, children who were iron deficient and anaemic at baseline showed a significant reduction in adverse events, including malaria episodes, suggesting that supplementation with iron and folic acid may confer benefits especially to those who are deficient.
Altogether, these issues pose a challenge to program planners in designing micronutrient intervention programs that meet the requirements of the majority of the population at risk while still being safe. Tools are needed that assist in designing micronutrient intervention programs that are effective, in terms of reaching the target groups at risk of developing micronutrient deficiencies; efficient, in terms of reaching the target group without unnecessarily exposing non‐target groups; and safe, in terms of not overexposing segments of the population, especially vulnerable target groups, whose intake or status is already adequate. There is also a need to understand how specific micronutrients, notably folic acid and iron, may interact with medication and infection, and how they exert beneficial or antagonistic effects under these conditions. This raises the question of how to achieve the maximum possible public health benefits of a micronutrient intervention strategy with the least risks.
Key messages.
Monitoring of data (dietary intake, program coverage, biomarkers and morbidity) among vulnerable population groups is essential to optimize effectiveness, safety, and efficiency of micronutrient programs.
Modelling tools are needed that predict the effectiveness, safety, and efficiency of micronutrient programs by integrating above data.
Research is needed to identify biomarkers predictive of micronutrient exposure, status, and the potential health consequences of inadequate and especially excessive micronutrient exposure. This may help set the appropriate no‐observed‐adverse‐effect level or lowest‐observed‐adverse‐effect level, currently often based on scant data, particularly for young children.
More insight is needed into the antagonistic effects resulting from complex interactions between micronutrients, drugs, and infections.
Objective
During the Global Summit on Food Fortification on 9–11 September 2015 in Arusha, a special session co‐organized by Sight and Life and UNICEF was held, entitled 'Effective and Safe Micronutrient Interventions: Weighing the Risks against the Benefits'. The aim of the session was to identify existing tools, and remaining needs, to assist policy makers in designing effective micronutrient programs with the highest public health benefits and least risks. The presentations in the session addressed this challenge on the basis of a number of case studies related to folic acid, iodine and vitamin A.
Session summary
An introduction to the session was given by Dr. Maaike Bruins (DSM Biotechnology Center, Delft, the Netherlands), entitled 'Assessing the risks and benefits of micronutrient interventions'. In order to understand the risks of inadequate and excessive intake of a micronutrient, information is required on the intake distribution of the micronutrient in the population. This makes it possible to assess the proportion of a population life‐stage group with intakes below their requirement and above their UL. Software such as PC‐SIDE and IMAPP is available for the estimation and modelling of food and micronutrient intake distributions, which can assist in selecting optimal food vehicles and fortification levels (Iowa State University 2015). Other tools make it possible to estimate the public disease burden preventable by a micronutrient intervention program, expressed for instance as preventable disability‐adjusted life years (DALYs). The DALY is a quantitative measure of overall disease burden and can serve as an important guide to decision‐making. It is expressed as the number of years lost because of early death or morbidity (e.g. infections, anaemia, motor and cognitive impairments, or blindness); the metric combines mortality with the incidence, duration and disability of the morbidity related to micronutrient deficiency. For instance, preventing DALYs by overcoming maternal folate deficiency may have a large public health impact; the preventable lifelong and severe disabilities resulting from neural tube defects are substantial, even if only a few neural tube defects per 1000 births could be prevented. One source of estimates of the cost‐effectiveness of micronutrient interventions is the CHOosing Interventions that are Cost Effective (WHO‐CHOICE) database developed by the World Health Organization (WHO) (World Health Organization (WHO) 2016). The estimates are based on known intervention costs and effectiveness (in terms of micronutrient status), and on links between micronutrient status and morbidity/mortality outcomes. The WHO has also developed tools that can assist program planners in simulating the public health impact of an increase in micronutrient intake in terms of DALYs gained (World Health Organization (WHO) 2015). This requires (1) dietary intake data in different population groups and (2) evidence for a dose–response relationship between micronutrient intake and morbidity because of deficiency. With the introduction of a micronutrient intervention, the benefits of increasing micronutrient intakes do not always clearly outweigh the risks. For instance, when implementing mandatory fortification, the benefits of preventing frequent, severe or life‐long disabilities should be balanced against the possible risks of excess intake in other population groups (Hoekstra et al. 2008). Risk–benefit assessment software provides the option to quantify the public health risks and benefits and to balance them, using for instance DALYs as a common metric (Ellis & Aspurger 2000).
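As a simple illustration of how the DALY metric combines mortality and disability (all parameter values below are hypothetical and chosen only to show the arithmetic, not estimates from the report), the sketch estimates the DALYs averted by a reduction in neural tube defect incidence.

```python
# Minimal sketch (hypothetical numbers): DALYs as the sum of years of life
# lost (YLL) and years lived with disability (YLD), here for neural tube
# defects (NTDs) preventable by improved maternal folate status.
births = 1_000_000                 # annual births in a hypothetical country
ntd_rate_baseline = 1.5 / 1000     # assumed NTDs per birth before intervention
ntd_rate_after = 0.8 / 1000        # assumed NTDs per birth after intervention

case_fatality = 0.40               # assumed share of NTD cases dying early
years_lost_per_death = 60.0        # assumed remaining life expectancy at death
disability_weight = 0.30           # assumed disability weight for surviving cases
years_with_disability = 50.0       # assumed duration of disability

def dalys(ntd_rate):
    cases = births * ntd_rate
    yll = cases * case_fatality * years_lost_per_death
    yld = cases * (1 - case_fatality) * disability_weight * years_with_disability
    return yll + yld

prevented = dalys(ntd_rate_baseline) - dalys(ntd_rate_after)
print(f"DALYs averted per year: {prevented:,.0f}")
```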
Even though simulating changes in DALYs resulting from micronutrient intervention programs can provide useful information to policy makers when setting investment priorities, the required dietary intake data are often lacking, the dose–response relationship between excessive intake and adverse effects is not well known for many micronutrients, and the methods require in‐depth expertise. Next to these sophisticated modelling tools, more user‐friendly tools are needed that integrate information on program coverage, dietary intake distribution, biomarkers and morbidity to design micronutrient programs that are effective and safe in terms of public health impact.
Increasing micronutrient intake may confer benefits on those who have inadequate intakes, but micronutrient interventions may not confer benefits in all settings. For instance, in some studies iron supplementation has been associated with an increased risk of malaria and death in children living in malaria‐endemic regions, but not when regular malaria surveillance and treatment services are provided (World Health Organization (WHO) 2015). Another example of a micronutrient that may not necessarily confer benefits under all circumstances is folic acid, as discussed by Dr. Roland Kupka from UNICEF (New York). Folic acid supplementation may be beneficial to the folate‐deficient child, but may confer risks when high‐dose supplements are implemented in malaria‐endemic settings alongside antimalarial drugs (Kupka 2015). The WHO calls for micronutrient powder (MNP) programs to be implemented alongside malaria control strategies in malaria‐endemic regions. Current MNP formulations generally provide folic acid and other micronutrients at the Recommended Nutrient Intake level. However, the benefits and risks of providing supplemental folic acid are largely unknown. The limited data available suggest that folate deficiency may not be a major public health problem among children living in sub‐Saharan Africa; as a result, supplemental folic acid may not confer health benefits there. Furthermore, folic acid provided at supraphysiological, and possibly even physiological, levels may favour the growth of the parasite Plasmodium falciparum (responsible for 85% of malaria cases), inhibit clearance of the parasite during sulphadoxine‐pyrimethamine (SP) treatment of malaria and increase subsequent recrudescence (Nzila et al. 2014). Limiting prophylactic SP use or promoting the use of insecticide‐treated bed nets may render the use of folic acid in MNP programs safer, but programmatic barriers to these approaches may remain. The use of 5‐methyltetrahydrofolic acid concomitantly with SP may be a promising alternative, as it does not affect SP efficacy, although its stability in food and cost‐in‐use remain to be confirmed (Scaglione & Panzavolta 2014). The presentation concluded that more work is needed to characterize the prevalence of folate deficiency among young children worldwide and to optimize the benefit–risk ratio of MNP programs in sub‐Saharan Africa.
Both iodine deficiency and iodine excess have adverse health effects. Prof. Michael Zimmermann from ETH Zürich (Switzerland) discussed the risks and benefits of salt iodization programs. In general, the relatively small risks of iodine excess are far outweighed by the substantial risks of iodine deficiency (Zimmermann 2008). Iodine deficiency has multiple adverse effects on growth and development because of inadequate thyroid hormone production, which are termed the iodine deficiency disorders (IDD). IDD remain one of the most common causes of preventable mental impairment worldwide. In nearly all iodine‐deficient countries, the best strategy to control IDD is salt iodization, one of the most cost‐effective ways of contributing to economic and social development (Zimmermann 2008). In areas of severe iodine deficiency, iodine repletion in pregnant women improves pregnancy outcomes, eliminates endemic cretinism and improves IQ in their children. Iodine repletion in newborns reduces infant mortality and may improve cognitive development and growth. Even in areas of mild‐to‐moderate iodine deficiency, repletion in school‐aged children improves cognition and fine motor skills (Zimmermann 2007). On the other hand, the introduction of iodized salt to regions with chronic IDD may transiently increase the incidence of thyroid disorders, such as iodine‐induced hyperthyroidism and autoimmune hypothyroidism, and programs should therefore include monitoring for both iodine deficiency and excess. More data on the epidemiology of thyroid disorders caused by differences in iodine intake are needed, and this should be a focus of future research.
The adverse effects of vitamin A deficiency include childhood blindness and an increased risk of mortality from measles and diarrhoea (Stevens et al. 2015). Vitamin A supplementation and fortification programs can be a cheap and effective way to reduce morbidity and mortality because of vitamin A deficiency (Horton 2006), and these programs have significantly improved millions of lives among pre‐school children in many parts of the world (Stevens et al. 2015). Long‐term excessive vitamin A intake, on the other hand, can lead to adverse effects such as hepatotoxicity. Generally, the substantial risks of vitamin A deficiency can be expected to far outweigh the relatively small risks of vitamin A excess (Allen & Haskell 2002). However, the lack of coordination of multiple vitamin A interventions and the many overlapping schemes in some areas have increased concern about the risks of these programs, as some children could theoretically receive well above their recommended daily dose of the vitamin. The uncertainty around the level of excess for vitamin A results in a small margin of safe intake for young children between the recommended intake levels and the maximum safe UL 'thresholds' (Allen & Haskell 2002). Dr. Georg Lietz from Newcastle University (Newcastle upon Tyne, UK) discussed how best to assess the window of benefit for vitamin A intake. In his presentation, Dr. Lietz highlighted the need for reliable, robust and affordable biomarkers of inadequate and excess status, to enable policy and program makers to adjust the level and coverage of their interventions. The appropriate no‐observed‐adverse‐effect level or lowest‐observed‐adverse‐effect level for children up to 5 years of age urgently needs to be determined and the ULs revised, as most ULs for children are extrapolated from ULs for adults. Dr. Lietz also discussed the benefits and problems associated with the most recently developed methods for monitoring both fortification and supplementation programs, including the application of stable isotope dilution techniques and the potential of lab‐on‐chip technologies for future monitoring. He ended his presentation by highlighting the need to coordinate different methods of risk assessment to make the monitoring process more efficient, reliable and cost‐effective.
Dr. Reina Engle‐Stone from the University of California (Davis, CA, USA) presented a dietary modelling approach to assess the risk of inadequate and excessive intakes under different program scenarios, and its applications to program planning. Her presentation provided an overview of the methodology for using dietary data to examine the risk of inadequate and excessive micronutrient intakes against dietary reference values, given information on current dietary patterns and the reach of program delivery platforms. Examples from Cameroon illustrated the use of dietary modelling to set fortification levels by predicting the effects of each new or modified program on dietary intake (Engle‐Stone et al. 2015; Brown et al. 2015; Kagin et al. 2015; Vosti et al. 2015). This dietary modelling tool can be used to predict the effect of fortification of different food vehicles, and of overlapping micronutrient programs, on the percentage of the population with intakes below the EAR or above the UL (Brown et al. 2015; Vosti et al. 2015). Finally, the role of complementary modelling tools was discussed, including (1) the Lives Saved Tool (LiST), which predicts the mortality reduction from some micronutrient interventions delivered to mothers and children (Johns Hopkins School of Public Health 2015), (2) models assessing liver vitamin A concentration, which can assist in interpreting the risk of dietary intakes above the UL in young children, and (3) program coverage surveys, which reveal absent or redundant program coverage and thus help identify subpopulations (e.g. regions) at risk of inadequate or excessive intake (Aaron 2014; Spohrer 2015); a sketch of how such coverage data might be used follows below.
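The sketch below (hypothetical survey data, not the FACT tool or the Cameroon analyses) illustrates how coverage estimates for two overlapping vitamin A programs could be combined to flag regions with absent or redundant coverage.

```python
# Minimal sketch (hypothetical data): combining coverage estimates from two
# overlapping vitamin A programs to flag regions with absent or redundant
# coverage, in the spirit of the coverage surveys described above.
coverage = {
    # region: (share reached by fortified oil, share reached by high-dose capsules)
    "North":  (0.15, 0.85),
    "Centre": (0.80, 0.75),
    "South":  (0.70, 0.20),
    "East":   (0.10, 0.15),
}

for region, (oil, capsules) in coverage.items():
    # Assuming independent reach within a region, rough overlap estimates:
    both = oil * capsules
    neither = (1 - oil) * (1 - capsules)
    flags = []
    if neither > 0.5:
        flags.append("large uncovered group (risk of inadequate intake)")
    if both > 0.5:
        flags.append("large doubly covered group (check intakes against the UL)")
    note = "; ".join(flags) if flags else "coverage pattern unremarkable"
    print(f"{region}: both={both:.0%}, neither={neither:.0%} -> {note}")
```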
Population dietary intake data can be used to model the effects of different food vehicles and micronutrient levels in order to optimize coverage and minimize the risks of inadequate and excessive intakes relative to dietary cut‐off points. Monitoring intake data will enable policy makers to adjust the food vehicle, micronutrient level and coverage of intervention programs. However, the success of these programs ultimately has to be confirmed using biomarkers indicative of micronutrient status that are affordable, sensitive and specific, and that can be applied in population settings (Combs et al. 2013; Raiten & Combs 2015). Some currently applied methods are inadequate because they lack the precision and accuracy to detect nutrient concentrations in at‐risk population groups, are insensitive to changes in status within a certain range (e.g. serum retinol), or are altered during inflammation (zinc, retinol, ferritin). Thus, improved biomarkers are urgently needed to enable policy makers to understand the benefits and risks of increasing micronutrient intakes in vulnerable population groups. Tools are available that assess or predict the effects of micronutrient programs in terms of dietary intakes against dietary cut‐off points. However, there is still a need for standardized, user‐friendly tools that predict the public health impact of micronutrient interventions by integrating benefits and possible risks. More research is warranted to address the concerns regarding the safety of iron and folic acid supplementation among young children in malaria‐endemic areas, and their interaction with antimalarials and infections. Research is also needed to establish the minimum and maximum intake range for vitamin A outside which intakes might cause concern. In general, the small risks of micronutrient excess are far outweighed by the substantial benefits of overcoming deficiency. However, there is an urgent need to monitor overlapping micronutrient supplementation and fortification programs, and to evaluate the benefits and risks (if any) of increasing population micronutrient intakes for each setting (population group, region, choice of food vehicle, micronutrient level in the food, micronutrient status of the population group, concomitant foods and medications, etc.).
Source of funding
None.
Conflicts of interest
Maaike Bruins is employed at DSM, a manufacturer of vitamin and mineral premixes. Klaus Kraemer is the director of Sight and Life foundation, a nutrition think tank primarily funded by DSM.
Contributor statement
The authors declare that they have all participated in drafting of the manuscript, and that they have approved the final version.
Bruins, M. J. , Kupka, R. , Zimmermann, M. B. , Lietz, G. , Engle‐Stone, R. , and Kraemer, K. (2016) Maximizing the benefits and minimizing the risks of intervention programs to address micronutrient malnutrition: symposium report. Maternal & Child Nutrition, 12: 940–948. doi: 10.1111/mcn.12334.
References
- von Grebmer K., Saltzman A., Birol E., Wiesmann D., Prasai N., Yin S. et al. (2014) 2014 Global Hunger Index: The Challenge of Hidden Hunger. Welthungerhilfe, International Food Policy Research Institute, and Concern Worldwide: Bonn, Washington, D.C., and Dublin.
- Muthayya S., Rah J.H., Sugimoto J.D., Roos F.F., Kraemer K. & Black R.E. (2013) The global hidden hunger indices and maps: an advocacy tool for action. PLoS One 8 (6), e67860.
- Katona P. & Katona‐Apte J. (2008) The interaction between nutrition and infection. Clinical Infectious Diseases 46 (10), 1582–8.
- Black R.E., Victora C.G., Walker S.P., Bhutta Z.A., Christian P., de Onis M. et al. (2013) Maternal and child undernutrition and overweight in low‐income and middle‐income countries. Lancet 382 (9890), 427–51.
- Copenhagen Consensus Panel (2012) Copenhagen Consensus Statement 2012. Available from: http://www.copenhagenconsensus.com/copenhagen-consensus-iii/outcome (Accessed 29 April 2016).
- Horton S. (2006) The economics of food fortification. Journal of Nutrition 136 (4), 1068–71.
- Horton S., Mannar V. & Wesley A. (2008) Best Practice Paper. Food Fortification with Iron and Iodine. Copenhagen Consensus Center: Frederiksberg, Denmark.
- Edejer T.T., Aikins M., Black R., Wolfson L., Hutubessy R. & Evans D.B. (2005) Cost effectiveness analysis of strategies for child health in developing countries. BMJ 331 (7526), 1177.
- Yi Y., Lindemann M., Colligs A. & Snowball C. (2011) Economic burden of neural tube defects and impact of prevention with folic acid: a literature review. European Journal of Pediatrics 170 (11), 1391–400.
- European Food Safety Authority (EFSA) (2010) Guidance on human health risk–benefit assessment of foods. EFSA Journal 8 (7), 1673.
- Food and Nutrition Board & Institute of Medicine (IOM) (2000) Using dietary reference intakes for nutrient assessment of individuals. In: Dietary Reference Intakes: Applications in Dietary Assessment, p. 51. National Academies Press: Washington, DC, USA.
- Renwick A.G., Flynn A., Fletcher R.J., Muller D.J., Tuijtelaars S. & Verhagen H. (2004) Risk–benefit analysis of micronutrients. Food and Chemical Toxicology 42 (12), 1903–22.
- European Commission (2006) Discussion Paper on the Setting of Maximum and Minimum Amounts for Vitamins and Minerals in Foodstuffs.
- Combs G.F. Jr., Trumbo P.R., McKinley M.C., Milner J., Studenski S., Kimura T. et al. (2013) Biomarkers in nutrition: new frontiers in research and application. Annals of the New York Academy of Sciences 1278, 1–10.
- Raiten D.J. & Combs G.F. Jr. (2015) Biomarkers and bio‐indicators: providing clarity in the face of complexity. Sight and Life 29 (1), 39–44.
- Engle‐Stone R., Nankap M., Ndjebayi A.O., Vosti S.A. & Brown K.H. (2015) Estimating the effective coverage of programs to control vitamin A deficiency and its consequences among women and young children in Cameroon. Food and Nutrition Bulletin 36 (Suppl 3), S149–71.
- Lönnerdal B. (2004) Interactions between micronutrients: synergies and antagonisms. In: Micronutrient Deficiencies during the Weaning Period and the First Years of Life. 54th Nestlé Nutrition Workshop, Pediatric Program, São Paulo, October 2003 (ed. Pettifor J.M.), pp. 67–81. Karger AG: Basel.
- Scaglione F. & Panzavolta G. (2014) Folate, folic acid and 5‐methyltetrahydrofolate are not the same thing. Xenobiotica 44 (5), 480–8.
- Hathcock J.N. (1985) Metabolic mechanisms of drug–nutrient interactions. Federation Proceedings 44 (1 Pt 1), 124–9.
- Sazawal S., Black R.E., Ramsan M., Chwaya H.M., Stoltzfus R.J., Dutta A. et al. (2006) Effects of routine prophylactic supplementation with iron and folic acid on admission to hospital and mortality in preschool children in a high malaria transmission setting: community‐based, randomised, placebo‐controlled trial. Lancet 367 (9505), 133–43.
- Iowa State University (2015) Software for Intake Distribution Estimation: PC‐SIDE, IMAPP. Available from: http://www.side.stat.iastate.edu/ (Accessed 5 December 2015).
- World Health Organization (WHO) (2015) Health statistics and information systems: national tools. Available from: http://www.who.int/healthinfo/global_burden_disease/tools_national/en/ (Accessed 5 December 2015).
- World Health Organization (WHO) (2016) Cost effectiveness and strategic planning (WHO‐CHOICE). Available from: http://www.who.int/choice/cost-effectiveness/methods/en/ (Accessed 5 December 2016).
- Hoekstra J., Verkaik‐Kloosterman J., Rompelberg C., van Kranen H., Zeilmaker M., Verhagen H. et al. (2008) Integrated risk–benefit analyses: method development with folic acid as example. Food and Chemical Toxicology 46 (3), 893–909.
- Ellis M. & Aspurger M. (2000) Chapter 20. Feed intake in growing finishing pigs. In: Swine Nutrition (eds Lewis A.J. & Southern L.L.), pp. 447–467. CRC Press: Florida, USA.
- World Health Organization (WHO) (2015) Intermittent iron supplementation in preschool and school‐age children in malaria‐endemic areas. In: e‐Library of Evidence for Nutrition Actions (eLENA). Available from: http://www.who.int/elena/titles/iron_infants_malaria/en/ (Accessed 29 April 2016).
- Kupka R. (2015) The role of folate in malaria—implications for home fortification programmes among children aged 6–59 months. Maternal & Child Nutrition. DOI: 10.1111/mcn.12102.
- Nzila A., Okombo J. & Molloy A.M. (2014) Impact of folate supplementation on the efficacy of sulfadoxine/pyrimethamine in preventing malaria in pregnancy: the potential of 5‐methyl‐tetrahydrofolate. Journal of Antimicrobial Chemotherapy 69 (2), 323–30.
- Zimmermann M.B. (2008) Iodine requirements and the risks and benefits of correcting iodine deficiency in populations. Journal of Trace Elements in Medicine and Biology 22 (2), 81–92.
- Zimmermann M.B. (2007) The adverse effects of mild‐to‐moderate iodine deficiency during pregnancy and childhood: a review. Thyroid 17 (9), 829–35.
- Stevens G.A., Bennett J.E., Hennocq Q., Lu Y., De‐Regil L.M., Rogers L. et al. (2015) Trends and mortality effects of vitamin A deficiency in children in 138 low‐income and middle‐income countries between 1991 and 2013: a pooled analysis of population‐based surveys. Lancet Global Health 3 (9), e528–36.
- Allen L.H. & Haskell M. (2002) Estimating the potential for vitamin A toxicity in women and young children. Journal of Nutrition 132 (9 Suppl), 2907S–2919S.
- Brown K.H., Engle‐Stone R., Kagin J., Rettig E. & Vosti S.A. (2015) Use of optimization modeling for selecting national micronutrient intervention strategies: an example based on potential programs for control of vitamin A deficiency in Cameroon. Food and Nutrition Bulletin 36 (Suppl 3), S141–8.
- Kagin J., Vosti S.A., Engle‐Stone R., Rettig E., Brown K.H., Nankap M. et al. (2015) Measuring the costs of vitamin A interventions: institutional, spatial, and temporal issues in the context of Cameroon. Food and Nutrition Bulletin 36 (Suppl 3), S172–92.
- Vosti S.A., Kagin J., Engle‐Stone R. & Brown K.H. (2015) An economic optimization model for improving the efficiency of vitamin A interventions: an application to young children in Cameroon. Food and Nutrition Bulletin 36 (Suppl 3), S193–207.
- Johns Hopkins School of Public Health (2015) LiST: Lives Saved Tool. Available from: http://livessavedtool.org/ (Accessed 5 December 2015).
- Aaron G.J. (2014) Assessing coverage of large‐scale and targeted food fortification programs: development of a fortification assessment coverage tool (FACT). In: The Micronutrient Forum, June 2014, Addis Abeba, Ethiopia.
- Spohrer R. (2015) Fortification Assessment Coverage Toolkit (FACT). QA/QC Training Workshop, Harare, Zimbabwe. Available from: http://www.ffinetwork.org/about/calendar/2015/documents/QAQC_FACT.pdf.