Abstract
Accurate measurement of energy intake (EI) is important for estimation of energy balance and, correspondingly, body weight dynamics. Traditional measurements of EI rely on self-report, which may be inaccurate and underestimate EI. The limitations of traditional methodologies such as the 24-hour dietary recall, dietary record, and food frequency questionnaire motivate the development of technology-driven methods that rely on wearable sensors and imaging devices to achieve an objective and accurate assessment of EI. The aim of this research was to systematically review and examine peer-reviewed papers that cover the estimation of EI in humans, with a focus on emerging technology-driven methodologies. Five major electronic databases were searched for articles published from January 2005 to August 2017: Pubmed, Science Direct, IEEE Xplore, ACM library, and Google Scholar. Twenty-six eligible studies were retrieved that met the inclusion criteria. The review identified that while the current methods of estimating EI show promise, accurate estimation of EI in free-living individuals presents many challenges and opportunities. The most accurate result identified for EI (kcal) estimation had an average accuracy of 94%. However, collectively, the results were obtained from a limited number of food items (i.e., 19), small sample sizes (i.e., 45 meal images), and primarily controlled conditions. Therefore, new methods that accurately estimate EI over long time periods in free-living conditions are needed.
Keywords: Energy intake, energy balance, obesity, wearable sensors, dietary assessment, clinical study methodology
I. INTRODUCTION
Objective assessment of energy intake (EI) is an important component in understanding body weight dynamics associated with underweight, overweight, and obesity in humans [1]. Underweight results from energy expenditure (EE) exceeding EI over a prolonged period of time. Being underweight increases health risks such as malnutrition, bone fractures, heart irregularities, inability to fight infection, and premature death [2]. Over 815 million people in developing countries are affected by underweight [3]. According to the National Health and Nutrition Examination Survey (NHANES) 2011-2012, approximately 1.7% of US adults are underweight [4].
Conversely, overweight and obesity are associated with EI that exceeds EE. These health conditions present a growing global epidemic with substantial individual and public health consequences. The World Health Organization in 2016 estimated that more than 1.9 billion adults (nearly 39% of the world’s population) were overweight and over 650 million of those were obese [5]. Obesity and overweight are major contributors to chronic diseases such as type 2 diabetes, asthma, cardiovascular diseases, cancers, and musculoskeletal disorders, and caused 3.4 million deaths in 2016 [6]. Obese individuals spend 30% more per capita on medical costs than healthy individuals [7]. Apart from direct costs, obesity also results in indirect costs due to productivity losses, disability, and mortality [8]. In the United States, such productivity losses cost an estimated $30.15 billion per year, with obesity-related absence from work alone contributing $3.38 billion to $6.38 billion [9]. Underweight, overweight, and obesity are health conditions highly resistant to active medical treatment [10].
Based on energy balance, the association between EI and EE is important for understanding underweight and obesity [11]. With the advent of micro-electronic accelerometers, there has been rapid progress in methods for assessing free-living EE [12]-[14]. However, in comparison, the tools for measuring EI are less developed. Accurate measurement of EI and characterization of related eating behaviors are challenges in nutrition research.
Successful detection of food intake is the first step necessary before EI can be measured in the real world. If no food intake is detected, no EI can be assessed. Once food intake is detected, the meal/eating episode can be characterized, and EI-related measures such as amount consumed (mass), portion size, volume, and the number and types of food items in a meal can be estimated.
A range of methods, such as 24-hour dietary recalls, diet diaries, weighed food records (WFRs), and short assessment screening questionnaires [15], have historically been used for EI measurement [16], [17]. Most of these methods rely on participants’ own recollection and declaration (self-report) and thus suffer from underreporting, misreporting, and non-reporting of EI [18].
In the recent past, mathematical models of human metabolism and EI, including models for cumulative intake curves (CIC), have been investigated as possible tools for EI assessment [19]-[22]. While mathematically derived estimation models are yet to be tested for use in real-time settings and clinical conditions, they may hold promise as alternate methods to estimate EI. However, these methods cannot measure diet quality or eating behavior so their use in clinical interventions or research settings is limited.
The doubly labeled water (DLW) method [23] is regarded as the gold standard for measurement of free-living EE and, therefore, can be used to estimate EI under energy balance conditions. The measurement is usually conducted over 7 to 14 days. The premise is that, if a person is weight stable (i.e. in energy balance), EI is equivalent to EE, which is consistent with the First Law of Thermodynamics. For individuals in energy balance, DLW has been shown to estimate EI (kcal) within 2% to 8% in inpatient studies and 8% to 15% in field studies [24]. Because of its high accuracy, the DLW method is often used as a benchmark to measure total EI in free-living conditions [24]-[26] and to validate other methods of estimating EI [24]. However, this method is accurate only if the person is in energy balance. Moreover, because EE and EI fluctuate from day to day, this method gives only a broad overview of total EI over the entire measurement period and cannot be used to assess patterns in eating behavior or specific nutrient intake. In addition, this method requires biological samples to be collected at specific times, expensive measurement instruments, and specialized knowledge of their operation [27], [28], and is therefore used only by the most well-funded laboratories and clinics.
Technological innovations to improve the accuracy of EI assessment include the use of the internet and smartphones, facilitating the collection of detailed nutritional information with lower cost and participant burden [29], [30]. To minimize the problems associated with self-reported EI, various methods have been proposed ranging from image-assisted reporting to wearable on-body sensors [31]-[35]. The development of technology-driven, quantitative methods of EI estimation has received significant attention as an active area of discovery.
Although numerous reviews of traditional EI assessment methodologies and sensor- and vision-based methods for food intake detection and recognition are available [36], [31], [37], no systematic review of cutting-edge technologies specifically dedicated to measurement of EI exists. The current review attempts to categorize the technological approaches used in EI measurement and consider the advantages and weaknesses of each.
II. A BRIEF OVERVIEW OF CONCEPTS RELATED TO ENERGY INTAKE
A. ENERGY BALANCE
Energy balance is the difference between EI and energy expenditure. The first law of thermodynamics is the law of conservation of energy, which states that energy can be transferred from one form to another, but it cannot be created or destroyed. In the context of human physiology, this is expressed as:
ES = EI − EE (1)
where ES = rate of energy stored in kilocalories (or kJ) per day, EI = rate of energy intake in kilocalories per day, and EE = rate of energy expended in kilocalories per day [11]. The energy balance equation describes what occurs to ES when EI and EE are in balance or imbalance over a given period of time (Figure 1) [11]. In balance, EI equals EE. An energy imbalance implies a change in the energy content of the body [1], which consequently changes EE due to the change in body weight.
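As a short numerical illustration of Eq. (1), the sketch below computes the daily energy storage rate and an approximate weight change. The 7700 kcal/kg conversion factor is a common approximation introduced here only for illustration; it is not part of Eq. (1):

```python
# Energy balance sketch of Eq. (1): ES = EI - EE, all in kcal/day.
# A positive ES implies energy stored (weight gain over time); the
# 7700 kcal/kg figure is an assumed approximation, not from Eq. (1).

def energy_stored(ei_kcal, ee_kcal):
    """Daily energy storage rate ES in kcal/day."""
    return ei_kcal - ee_kcal

def approx_weight_change_kg(es_kcal_per_day, days, kcal_per_kg=7700):
    """Rough weight change implied by a sustained energy imbalance."""
    return es_kcal_per_day * days / kcal_per_kg

es = energy_stored(2700, 2500)                    # 200 kcal/day surplus
print(es)                                         # 200
print(round(approx_weight_change_kg(es, 77), 1))  # 2.0 kg over 77 days
```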
FIGURE 1.

Representation of the energy balance paradigm.
B. ENERGY INTAKE, CALORIE AND ENERGY DENSITY
Humans need energy in the body for basic metabolic processes, physical activity, physiological functions, generation of new tissues, and maintenance of good health. EI represents the energy content (expressed as kilocalories/kcals or kJ) derived from all foods and beverages by oxidation [38]. The energy is released after ingestion of foods and converted into different forms including thermal and mechanical energy.
A kilocalorie (kcal, or capitalized Calorie) is defined as the amount of heat needed to raise the temperature of 1 kg of water by 1 °C and represents a metric unit of energy measurement [39].
Energy density is defined as the amount of energy per unit weight of food and is usually expressed in kcal/g or kJ/g [40]. The main sources of energy are the three macronutrients: carbohydrates, protein, and fat. The average values are 4 kcal per gram for carbohydrate and protein, and 9 kcal per gram for fat [38].
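The macronutrient values above yield a simple worked example; the portion composition below is hypothetical:

```python
# Energy content from macronutrient composition using the average factors
# cited above: 4 kcal/g for carbohydrate and protein, 9 kcal/g for fat.

def energy_kcal(carb_g, protein_g, fat_g):
    """Total energy of a portion from its macronutrient masses."""
    return 4 * carb_g + 4 * protein_g + 9 * fat_g

def energy_density(carb_g, protein_g, fat_g, total_mass_g):
    """Energy density in kcal/g for a portion of known total mass."""
    return energy_kcal(carb_g, protein_g, fat_g) / total_mass_g

# Hypothetical 100 g portion: 20 g carbohydrate, 5 g protein, 10 g fat.
print(energy_kcal(20, 5, 10))          # 190
print(energy_density(20, 5, 10, 100))  # 1.9
```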
III. METHODS
This was a systematic review; the study protocol was developed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [41] and agreed to by all authors.
A. SEARCH STRATEGY AND DATA SOURCES
Given the research topic of identifying available technology-driven methods for EI estimation, the most frequently used keywords related to EI were first identified. These keywords were then searched in five databases (Pubmed, Science Direct, IEEE Xplore, ACM library, and Google Scholar) for the period from January 2005 to August 2017. The final list of keywords used to conduct the full search that produced the primary selection of papers is given in Table 1. A total of 432 articles were identified for complete screening. The articles were checked and screened based on title and abstract. The primary selected articles were further screened using the inclusion and exclusion criteria discussed in the following subsections. Cross-references were screened for additional eligible articles. The final selection of articles was decided after reading the full articles.
TABLE 1.
Key words used to obtain candidate articles.
| Keywords | Pubmed | Google Scholar | IEEE Xplore | Science Direct | ACM Library | Total publications identified | Full-text, peer-reviewed journal articles and conference proceedings | Primary selection |
|---|---|---|---|---|---|---|---|---|
| Energy intake estimation | 656 | 119 | 1044 | 755 | 863 | 3437 | 410 | 227 |
| Nutrient intake estimation | 164 | 99 | 184 | 265 | 78 | 790 | 85 | 62 |
| Food amount consumed estimation | 144 | 88 | 95 | 45 | 98 | 470 | 65 | 48 |
| Energy intake model | 233 | 201 | 185 | 65 | 217 | 901 | 105 | 95 |
| Total | 1197 | 507 | 1508 | 1130 | 1256 | 5598 | 665 | 432 |
B. INCLUSION AND EXCLUSION CRITERIA
All primary selected articles were checked for eligibility and acceptability by adhering to the following inclusion criteria: 1) study participants were 18 years or older; 2) study participants had a Body Mass Index (BMI) over 18 kg/m2; 3) studies aimed at estimating/predicting EI in humans (i.e. in kcal or kJ per day/meal/food item); 4) studies aimed at estimating EI-related measures such as amount consumed, energy density, food recognition; and 5) studies proposing approaches using technology for EI estimation. The aspects of technology for EI estimation encompass methods that rely on image analysis and/or analysis of physiological signals obtained from wearable sensors. Full texts of the articles that met the inclusion criteria were obtained for review.
In the field of dietary assessment, most research considers two major aspects: 1) Food intake detection that deals with identification of all eating episodes, measurement of duration and microstructure parameters of each eating episode etc., and 2) Characterization of food intake which deals with estimation of food items, mass, volume, energy, and nutrients consumed during each eating episode. In this systematic review, studies which dealt purely with food intake detection were excluded. Additionally, studies that were solely concerned with the validation of an existing method of EI estimation were excluded. Finally, all duplicate papers were excluded from the analysis.
C. DATA EXTRACTION
The full text of each selected article was thoroughly reviewed to extract relevant data using a standardized data extraction procedure. The data extraction included information on the following domains: study design, study characteristics, method/algorithm used, data analyses, and significant findings.
IV. RESULTS
A. STUDY SELECTION
Table 1 summarizes the article search, and Figure 2 presents the PRISMA flow diagram of the selection process. We identified 5598 articles: 1197 from Pubmed, 507 from Google Scholar, 1508 from IEEE Xplore, 1130 from Science Direct, and 1256 from the ACM library. Of the 5598 papers, 1791 were duplicates and were removed. After title and abstract review of the primary selected articles, 406 articles were excluded, leaving 26 eligible studies. The last search for the study was conducted on 12 August 2017. The study analyzed and tabulated the important data from the eligible studies: lead author’s name, publication year, gender and sample size, mean age of subjects, study design, and the EI estimation approach of each method.
FIGURE 2.

Flow chart outlining the study selection process.
B. METHODOLOGIES TO ESTIMATE EI AND ASSOCIATED PARAMETERS
Based on the review of the selected articles, the technology-driven methodologies for estimating EI and related metrics (i.e. amount consumed, food type recognition, portion size estimation) were divided into two main groups: image-assisted methods and wearable sensor-based methods.
1). IMAGE-ASSISTED METHODS
In the past decades, due to advancement in technology, the availability of image capturing devices has increased significantly. These personal devices with imaging capability are available in the form of mobile phones, smartphones, tablets, and wearable devices.
By taking advantage of this ease of capture, images of food and beverage items consumed can be used to estimate EI. Typically, image-based methods acquire images of food items before, during, and after consumption using either a mobile device or a wearable camera. When an individual must perform actions to activate image capture via a mobile phone or digital camera, the image capturing method is called ‘active’. Alternatively, when images are acquired automatically by wearable cameras that capture continuously or at pre-determined intervals, the image capturing method is called ‘passive’.
To estimate EI, the analysis of acquired images can be performed in two ways: manual annotation of the images and automatic image recognition and processing. During manual annotation, food images are analyzed by an expert nutritionist to identify individual foods, their portion size, and nutritional content. One potential advantage of this approach compared with traditional self-report methods is a reduction in participant burden. The studies of [42]-[44] validated manual annotation for estimating EI. In these studies, the participants captured photos before and after meals, and provided short descriptions of ingredients used if mixed item dishes such as stews were consumed. Then the images were retrieved and analyzed by the nutritionist using nutritional analysis software interfaced with a nutrient database. A review of manual review/visual inspection methods is presented in [32].
Automated image recognition and processing relies on computer algorithms to segment food images, recognize foods, estimate portion size/volume and compute nutritional value. Some of the reported methods requested the user to mark foods in the image before processing, which can be categorized as a semi-automatic approach. After recognition, based on the type of foods and portion size/volume from imagery, the amount of EI is estimated using nutritional analysis software interfaced with a nutrient database. For all of these approaches, to assist the nutritionist or computer, fiducial markers (known dimensional and color references) are typically placed in the food image.
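The role of a fiducial marker can be sketched as follows: a reference of known physical size yields a pixels-per-centimeter scale, which converts a segmented food region's pixel area into a real-world area. All numbers are illustrative, and this is a simplified single-view model rather than the method of any specific reviewed study:

```python
# Fiducial-marker scaling sketch: a reference object of known physical
# size in the image gives a pixels-per-cm ratio, which converts a food
# region's pixel area to real-world area. All values are illustrative.

def pixels_per_cm(marker_pixel_width, marker_cm_width):
    """Image scale derived from the fiducial marker."""
    return marker_pixel_width / marker_cm_width

def region_area_cm2(region_pixel_area, marker_pixel_width, marker_cm_width):
    """Convert a segmented region's pixel area to cm^2."""
    scale = pixels_per_cm(marker_pixel_width, marker_cm_width)
    return region_pixel_area / (scale ** 2)  # area scales quadratically

# A 5 cm marker spans 100 px (20 px/cm); an 8000 px region is 20 cm^2.
print(region_area_cm2(8000, 100, 5))  # 20.0
```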
In this review, 17 papers were identified that adopted image-assisted approaches (i.e. manual review, semi-automatic, and automatic image analysis). Table 2 provides an overview of the included studies. Table 3 summarizes the methodologies, key details, and reported accuracy of the reviewed methods.
TABLE 2.
Overview of included studies using image assisted approaches.
| Study | System design | Data acquisition | Study setting | # of persons | # of days | Food types included in study | Total number of images included | Food recognition | Volume/portion size estimation | Mass estimation |
|---|---|---|---|---|---|---|---|---|---|---|
| Martin (2009) [45] | Smartphone/PAD | Pre and post image from food plate | Free-living and Lab | 50 | 3 | Full meal | NA | NA | NA | NA |
| Wu (2009) [46] | Web cam | Video | Free-living and Lab | NA* | NA | Solid, liquid | 101 | Yes | No | No |
| Zhu (2010) [29] | Smartphone-Server | Pre and post image of meal | Lab | NA | NA | Solid, liquid | 3000 | Yes | Yes | Yes |
| Wazumi (2011) [47] | Smartphone | Single pre-meal image | NA | NA | NA | Full meal | 350 | Yes | No | No |
| Kong (2012) [48] | Smartphone/video | 3 images before the meal, 3 images after the meal | NA | NA | NA | Solid | NA | Yes | Yes | No |
| Rahman (2012) [33] | Smartphone | Pairs of stereo images | NA | NA | NA | Solid fruit | 6 | No | Yes | No |
| Pouladzadeh (2014) [49] | Smartphone | 1 image before and after eating | NA | NA | NA | Solid, *mixed and non-mixed | 3000 | Yes | Yes | Yes |
| Myers (2015) [50] | Smartphone | Single pre-meal image | NA | NA | NA | Solid | 120,000 | Yes | Yes | No |
| McAllister (2015) [51] | Smartphone | Single pre-meal image | NA | NA | NA | Solid | 10 | No | Yes | No |
| Zhang (2015) [52] | Smartphone | Single pre-meal image | Free living and lab | NA | NA | Solid, *mixed | 2000 | Yes | Yes | No |
| Fang (2015) [53] | Smartphone | Single pre-meal image | NA | NA | NA | Meal | 45 | No | Yes | No |
| Sun (2015) [54] | Wearable camera | One image in every two seconds. | NA | NA | NA | Meal | NA | Yes | Yes | No |
| Pouladzadeh (2016) [55] | Smartphone | 1 image before and after eating | NA | NA | NA | Solid | 10,000 | Yes | Yes | No |
| Liao (2016) [56] | Depth camera | 3 images of food tray: empty tray, before eating and after eating | NA | NA | NA | Solid | NA | No | Yes | Yes |
| Hippocrate (2016) [57] | Smartphone / cutlery | Single image from the top with the cutlery in the picture | Lab | 15 | NA | Meal | 119 | No | Yes | Yes |
| McClung (2017) [58] | Video camera | Pre and post image from food plate | Free-living | 131 | 5 | Solid, liquid, *mixed, non-mixed | 3276 | No | No | No |
| Hassannejad (2017) [59] | Smartphone video | Six frames from video | NA | NA | NA | Meal | NA | Yes | Yes | No |
Note: NA: Not available;
mixed food: multiple food items mixed together; non-mixed: food items separated on the plate
TABLE 3.
Systematic review of image assisted method towards EI estimation.
| Study | Data analysis method | No. of food items | Image analysis** | Compared against*** | Performance/accuracy | Real-time analysis, platform |
|---|---|---|---|---|---|---|
| Martin (2009) [45] | Captures images of food selection, leftovers and a reference card; nutritionist analyses images to estimate EI | NA | Manual | WFR | Free-living EI: −152 ± 694 kcal/day | Phone app |
| Wu (2009) [46] | Recognizes foods in videos of eating recorded by a web camera, extracting SIFT features and utilizing pre-trained food model, predicts energy contents from standard fast food database | 101 | Automatic | DB | Not reported | NA |
| Zhu (2010) [29] | Segments images using hierarchical segmentation technique, classifies regions into food categories using SVM classifier, recognizes food items, estimates portion size through volume estimation of the food from which the energy contents determined | 19 | Automatic | NA | Food recognition: 95.8% | Phone app |
| Wazumi (2011) [47] | Segments images by hough transformation, extracts rotation invariant SPIN features, records food info in a food log system | 10 | Manual | NA | Food recognition: 78% | NA |
| Kong (2012) [48] | SIFT/statistical feature based food classification, then estimation of energy content of food items through 3D model reconstruction based volume estimation | NA | Automatic | NA | Food recognition: 92% | Phone app |
| Rahman (2012) [33] | Captures stereo pairs of images, performs feature matching between stereo images, creates a 3D model of food items to estimate volume of the food | 6 | Automatic | VWD† | Volume (ml) error: 7.7% | NA |
| Pouladzadeh (2014) [49] | Segments images, estimates food portion from SVM classifier, measures volume and calculates nutritional facts from tables | 15 | Semi-automatic | DB | Food recognition: 92.21% (single food), 85% (non-mixed foods), 35-65% (mixed foods); Energy content (kcal) accuracy: 86% | NA |
| Myers (2015) [50] | Detects meal from single image, predicts foods, volume and size of foods using CNN, estimates energy content from USDA dataset | 2517 | Automatic | NA | NA | Real-time, phone app |
| McAllister (2015) [51] | Segments images, asks the participants to indicate the food area, estimates area covered by foods from regression model, computes energy content from selected region | 1 | Semi-automatic | DB | Energy content (kcal) of food item portions : 89.12% | NA |
| Zhang (2015) [52] | Segments images using hierarchical segmentation technique, classifies regions into food categories using SVM classifier, estimates portion size of the food to determine caloric contents | 15 | Automatic | NA | Food recognition: 85% | NA |
| Fang (2015) [53] | Estimates portion size automatically using geometric contextual information from the scene. | 19 | Automatic | WFR | Energy (kcal) error : 6% | NA |
| Sun (2015) [54] | Detects regular shaped utensils, segments food items based on color and texture measures, estimates volume of each food item using food-specific shape model, computes energy contents and nutrients using FNDDS database | 10 | Automatic | VSD | Volume (cm³) estimation error: 30% | NA |
| Pouladzadeh (2016) [55] | Segments images using graph-cut segmentation technique, classifies and recognizes food items using deep learning neural networks, measures portion size and calculates nutritional facts from tables. | 30 | Semi-automatic | DB † | Food recognition : 99% (single food) Energy content (kcal) estimation : NA | Real time, Phone app |
| Liao (2016) [56] | Acquires depth images, Filters noisy depth images, estimates volume of the foods, estimates mass of food intake using specific gravity function | 3 | Automatic | WFR | Food intake mass (g) error : 7.5% | NA |
| Hippocrate (2016) [57] | Estimates the diameter and the height of the food container and derives the food volume, given the food type, estimates the mass of the food in the image | 15 | Automatic | WFR | Mass (g) estimation error : 6.78% | NA |
| McClung (2017) [58] | Captures images using camera, estimates energy content by visual estimations from two estimator | NA | Manual | NA | NA | NA |
| Hassannejad (2017) [59] | Segments images with an interactive, user-dependent semi-automatic segmentation method, creates a 3D model of food items using a point-cloud-based image modeling algorithm to estimate the volume of the food | 10 | Semi-automatic | NA | Volume (mm³) estimation accuracy: 92% | NA |
Note: Manual: Analyzed by trained individual; Semi-Automatic: Foods in the images are marked by the user before processing; Automatic: Fully automated analysis. NA, not available.
WFR: weighed food record; DB: Nutrition database/ Nutritional Fact labels; NA, not applicable.
VWD: measure volume using water displacement; VSD: measure volume using seed displacement
The study of [45] presented an approach based on manual review named the “Remote Food Photography Method”. Before the study, participants received training on how to use a smartphone to capture images of their foods before consumption and of any leftovers not consumed. The images were collected in free-living conditions and sent over a cellular network for review. Expert nutritionists analyzed the food images to obtain EI. Around the same time, the study of [46] presented a method for recognizing foods from video recordings of eating. A model was trained on restaurant food images, and food recognition on video frames was then carried out by an image matching algorithm. After food recognition, the method estimated EI.
The authors of [29] described a novel mobile food record application that provided an account of daily food and nutrient intake. After segmenting and recognizing food items in captured images, portion size was estimated through 3-D reconstruction. The estimation of EI was then obtained from authors’ customized food database. Instead of direct food recognition, the authors of [47] proposed a system that recognizes the dish contents from the image and computes energy and nutrients using a food-log system.
In [48] the authors proposed a camera-phone-based automatic food intake monitoring system (called DietCam). They adopted a Bayesian probabilistic approach to classify food items. The authors in [33] proposed a system in which the volume of food items was estimated by stereoscopic imaging, without manual fitting of 3D models to the food items. Another study [49] proposed a method to estimate EI from a food image by computing the volume of food portions using image segmentation followed by SVM classification. Once the food types and food volumes were obtained, nutrient composition tables were used to estimate the energy content of food items.
The study of [50] proposed a system called ‘Im2Calories’ that recognizes the contents of a meal and predicts energy in restaurant meals. The system was validated using food images from restaurants where food menus were available. The system consisted of multiple stages: 1) meal detection using the GoogLeNet convolutional neural network (CNN) model; 2) recognition of the restaurant at which the meal was obtained using Google’s Places API; 3) food item recognition using the GoogLeNet CNN model; 4) segmentation of the food image using the “DeepLab” system; 5) volume estimation using the same CNN architecture for depth and voxel representation of the food; and 6) EI estimation from the U.S. Department of Agriculture (USDA) database.
The study of [51] proposed a semi-automatic technique to predict energy content using regression modeling. A reference block was placed in the food image to estimate food area; based on the food portion area, the application trained a linear regression model to predict energy content. In [52], the authors proposed a system (named Snap-n-eat) that can recognize food items on a plate and automatically estimate energy content. The system estimated the portion size of the food by counting the pixels of the food portion and used it to estimate the energy content of foods from a custom database. Work reported in [53] demonstrated an approach to estimate food portion size from a single-view image. The method utilized geometric contextual information to automatically estimate the volume from the image. The study of [54] proposed the eButton, which automatically takes pictures of the food on the table (one picture every two seconds). Of all the selected papers, only this study proposed passive capture of images using a wearable camera. From the image, the volume of food items was estimated in three steps: 1) detection of utensils (e.g., circular plates or bowls); 2) segmentation of food items based on color, texture, and a complexity measure; and 3) estimation of volume from food-specific shape models. The food name and volume information were then used to estimate energy and nutrient content from the Food and Nutrient Database for Dietary Studies (FNDDS, a public domain database developed by the USDA).
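The area-based regression idea in [51] can be illustrated with a minimal least-squares sketch; the sample data points and the assumption of a single food type are purely illustrative, not taken from the reviewed study:

```python
# Least-squares sketch in the spirit of [51]: fit kcal = a * area + b
# from a few labeled portions of one food type, then predict energy for
# a new portion. The sample data points are entirely illustrative.

def fit_line(xs, ys):
    """Ordinary least-squares fit returning (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

areas = [50, 100, 150, 200]   # food portion area in cm^2 (illustrative)
kcals = [100, 200, 300, 400]  # measured energy content (illustrative)
a, b = fit_line(areas, kcals)
print(round(a * 120 + b))     # predicted kcal for a 120 cm^2 portion: 240
```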
The same research group as in [49] proposed a combination of graph cut segmentation and deep learning neural networks to classify and recognize food items in [55]. The authors further introduced a distance-based food portion estimation in which the distance between the food image and the smartphone device was calculated using the accelerometer and magnetic field sensors. To measure the distance, the authors used the reported height of the user. Next, the energy content of food items was estimated incorporating the food portion information.
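The distance-based scaling idea can be sketched with a pinhole-camera model; the tilt-based distance derivation, the focal length, and all numeric values below are simplifying assumptions for illustration, not the exact procedure of [55]:

```python
import math

# Pinhole-camera sketch of distance-based size estimation, loosely in
# the spirit of [55]: given camera-to-plate distance and a focal length
# in pixels, an object's pixel extent maps to a physical extent. The
# tilt-based distance model and all numbers are illustrative assumptions.

def distance_from_tilt(camera_height_m, tilt_deg):
    """Line-of-sight distance, assuming the camera is camera_height_m
    above the plate and tilted tilt_deg away from vertical."""
    return camera_height_m / math.cos(math.radians(tilt_deg))

def real_width_cm(pixel_width, distance_m, focal_px):
    """Physical width implied by a pixel extent at a known distance."""
    return pixel_width * (distance_m * 100) / focal_px

d = distance_from_tilt(0.40, 0.0)             # held 40 cm directly above
print(round(real_width_cm(500, d, 1000), 1))  # 500 px at f=1000 px: 20.0 cm
```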
In recent years, volume estimation techniques have further improved with the introduction of depth cameras. In [56], volume estimation of food intake was performed using a short-range depth camera. The volume was estimated from the width and height of pixels in relation to the camera distance. The estimated volume was then converted to mass using the specific gravity function of each food item, and the estimated mass was used to calculate the energy content of food items.
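The depth-camera pipeline (per-pixel heights to volume, then to mass via specific gravity) can be sketched as follows; this is a simplification of the approach in [56], and the pixel footprint and density values are illustrative assumptions:

```python
# Depth-image volume sketch: sum per-pixel food heights over the
# segmented region, convert to volume, then to mass via an assumed
# specific gravity (density), as in the depth-camera approach above.
# The pixel footprint and density values are illustrative.

def food_volume_cm3(height_map_mm, pixel_area_mm2):
    """height_map_mm: per-pixel food height above the tray (0 = no food)."""
    volume_mm3 = sum(h * pixel_area_mm2 for row in height_map_mm for h in row)
    return volume_mm3 / 1000.0  # mm^3 -> cm^3

def food_mass_g(volume_cm3, specific_gravity):
    """Mass from volume and an assumed density in g/cm^3."""
    return volume_cm3 * specific_gravity

heights = [[10, 10], [10, 10]]        # four pixels, 10 mm of food each
vol = food_volume_cm3(heights, 25.0)  # each pixel covers 25 mm^2
print(vol)                            # 1.0
print(food_mass_g(vol, 1.05))         # 1.05
```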
In [57] the authors proposed a method that measures the mass of foods using eating tools (cutlery) such as a spoon, a fork, or chopsticks. The user was required to manually enter the food type to get the mass estimation. The authors in [58] proposed the use of digital food photography (DFP) in the quantification of EI in a field setting. The study installed three DFP stations to capture pre- and post-tray images. Later, multiple nutritionists estimated EI visually and compared the estimation with the weighed record and standardized servings.
Most recently, the authors in [59] presented a novel approach to estimate food volume using image-based modeling. Six frames from a short video of meal consumption were selected for analysis. Next, the user manually marked initial segments of food items in one of the frames, and the final segmentation was completed using a graph-cut algorithm. A fiducial marker was used as a size/ground reference alongside a 3D food model to estimate the volume of food items.
Despite improvements in computer vision systems and sophisticated image analysis algorithms, an accurate and practical method has yet to be developed. Due to the diversity of foods, it is challenging to recognize every food item present in an image. The variability of individual consumption adds further complexity, making it difficult to offer fully automatic and accurate EI estimation.
2). WEARABLE DEVICES - SENSORS
Recent advances in wearable sensor technologies have shown great potential for the estimation of EI. These technologies can supply rich information about food intake through microstructure parameters such as eating episode duration, rate of ingestion, chewing frequency, chewing rate, bite size, and others [60]. These parameters can potentially help researchers better understand eating behaviors that may be associated with adiposity. The application of wearable sensors further reduces the burden of self-reporting EI, alleviating some of the problems of misreporting, and offers automatic detection of food intake events with minimal active participation from the individual. Several investigators have attempted to use wearable sensor devices to estimate EI in both controlled laboratory settings and free-living environments. In this systematic review, nine studies that estimated EI or EI-related measures were thoroughly reviewed. Table 4 gives an overview of the included studies, and Table 5 summarizes each of the wearable sensor methods for EI estimation.
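The simplest sensor-derived EI model in Table 4, multiplying a detected bite count by a mean energy-per-bite estimate (cf. Salley (2016) [35]), can be sketched in a few lines; the per-bite value below is illustrative, not a figure from the study:

```python
# Bite-count sketch: estimate EI for an eating episode as the product of
# the sensor-detected bite count and a mean energy-per-bite estimate
# (cf. Salley (2016) [35]). The per-bite value here is illustrative.

def ei_from_bites(bite_count, kcal_per_bite):
    """Episode-level EI estimate in kcal from a bite-count model."""
    return bite_count * kcal_per_bite

print(ei_from_bites(75, 9.0))  # 675.0 kcal for the eating episode
```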
TABLE 4.
Overview of included studies using wearable sensor based approaches.
| Study | Sample size | Age, mean±SD (range) | BMI, mean±SD (range) | Notes about subjects | Study setting | Estimated value | Primary outcome measures |
|---|---|---|---|---|---|---|---|
| Sazonov (2009) [61] | 20 (M: 11, F: 9) | NA | 29.0±6.4 kg/m² | Healthy normal | Lab | Amount consumed (g) | Detecting periods of food intake and predicting mass of ingested foods |
| Amft (2009) [62] | 8 (M: 6, F: 2) | 20–35 years | NA | Healthy normal | Lab | Bite weight (g) | Bite weight prediction using food type and chewing recognition |
| Liu (2012) [63] | 6 | NA | NA | Healthy normal | Lab | Proportion of food consumed (%) | Estimates the proportion of food consumed using inter-image colour histogram distances |
| Fontana (2015) [34] | 30 | 29±12 (19–58 years) | 27.9±5.5 kg/m² | Normal, overweight, obese | Lab | EI (kcal) | Models based on counts of chews and swallows to estimate EI |
| Alshurafa (2015) [64] | 20 | 20–40 years | NA | Healthy normal | Lab | Food types | Distinguishes between food types, such as liquid and solid, hot and cold drinks, and hard and soft foods |
| Salley (2016) [35] | 280 (M: 132, F: 148) | 29.7 years | 25.36±5.18 kg/m² | Normal, overweight, obese | Lab + free-living | EI per bite (kcal) | Estimating EI in a free-living environment using bite count and mean EI per bite |
| Mirtchouk (2016) [65] | 6 (30 meals total) | NA | NA | NA | Lab + free-living | Amount consumed (g) | Food mass and food type estimation |
| Hezarjaribi (2017) [66] | 30 | 18–30 years | NA | Healthy normal | Lab + free-living | Energy content (kcal) | EI estimation based on voice data |
| Thong (2017) [67] | NA | NA | NA | Healthy normal | Lab | Energy (kJ) and carbohydrate (g) | Quantitative prediction of food nutrient contents, such as energy and carbohydrate |
Notes: BW, body weight; NA, not available
TABLE 5.
Systematic review of wearable sensor based method towards EI estimation.
| Study | Sensor used / sensor location | Food types / no. of foods included in study | Regression method used for analysis | Estimated equation | Group model (Gr) / individual model (Ind) | Significant variables | Mass estimation | Energy density estimation | Compared against* | Accuracy / performance |
|---|---|---|---|---|---|---|---|---|---|---|
| Sazonov (2009) [61] | Miniature microphone, piezoelectric strain sensor | Meal, solid and liquids / NA | Linear | MS = MPSwS × NSW + MPChew × NCHEW; ML = MPSwL × NSW, where MS is the predicted mass of solid food (g), MPSwS the subject's average mass per swallow of solid food (g), NSW the total number of swallows for solid or liquid food intake, MPChew the average mass per chew (g), NCHEW the total number of chews, ML the predicted mass of liquid food (g), and MPSwL the subject's average mass per swallow of liquid food (g) | Gr | Average mass per swallow, average mass per chew, number of swallows, number of chews | Y | N | WFR | Average accuracy of mass (g) model for solid food intake: 91.79%; for liquid food intake: 83.76% |
| Amft (2009) [62] | Ear-pad chewing sound sensor | Solid / 3 | Multiple linear | Wi = Σ(j=1…NV) aj × vj, where Wi is the predicted bite weight (g), aj are food-specific coefficients found by least-squares fit, vj are microstructure variables, and NV is the total number of variables | Ind | Number of chewing events, chewing duration | N | N | WFR | Food classification accuracy: 94%; mean weight (g) prediction error: 19.4% (lowest) to 31% (largest) |
| Liu (2012) [63] | Miniature camera and microphone | Meal | NA | NA | NA | Sound features: energy entropy, short-time energy, spectral roll-off, spectral centroid, spectral flux, spectral average of sub-bands, zero-crossing rate, peak gaps between energy peaks | N | N | NA | Not specified |
| Fontana (2015) [34] | Throat microphone, piezoelectric strain sensor | Meal, solid and liquids / 45 | Linear | MT = MS + ML; MS = ws × (MPSwS × NSwS) + wc × (MPChew × cf) × Nchew; ML = MPSwL × NSwL; EI = Σ(i=1…N) mTi × CDi, where MT is the total mass ingested (g), MS the mass of solid food ingested (g), ML the mass of liquid food ingested (g), ws and wc the weight parameters for mass prediction using numbers of swallows and chews, MPSwS the subject's average mass per swallow of solid food (g), MPChew the subject's average mass per chew (g), NSwS the total number of swallows for solid food intake, Nchew the total number of chews, cf a correction factor, MPSwL the subject's average mass per swallow of liquid (g), NSwL the total number of swallows for liquid intake, mTi the consumed mass of distinct food type i (g), CDi the caloric density of food type i (kcal/g), and N the total number of distinct food types consumed in the meal | Ind | Counts of chews and swallows | Y | N | WFR | Best accuracy: EI (kcal) model based on chew counts; reporting error (%): 30.42 ± 23.08 |
| Alshurafa (2015) [64] | Piezoelectric sensor | Liquid, solid, hot and cold drinks, hard and soft foods | NA | NA | NA | NA | N | N | NA | Food type classification F-measure: 90% |
| Salley (2016) [35] | Hand gesture sensor | Meal; solid, mixed and non-mixed / 1844 | Linear | Estimated kilocalories per bite = −0.128 × age + 6.167 × sex (females = 0) + 0.034 × height + 0.035 × weight − 12.012 × WHR + 22.294, where age is in years, height in inches, weight in lb, and WHR is the waist-to-hip ratio | Gr | Age, sex, height, weight, waist-to-hip ratio | N | N | WFR | Mean estimation error: −71.21 ± 562.14 kcal |
| Mirtchouk (2016) [65] | Motion (head, both wrists) and acoustic sensors (customized earbud) | Meal; solid, mixed and non-mixed / 1489 food and 285 drink intakes | Random forest (40 trees) | NA | Gr | NA | Y | N | WFR | Food class classification accuracy: 82.7%; amount consumed (g) estimation error: 35.4% |
| Hezarjaribi (2017) [66] | Audio recording in mobile app | Not specified | NA | NA | NA | Frequency-domain features (energy, fundamental frequency) | N | N | DB | EI (kcal) accuracy: 92.2% |
| Thong (2017) [67] | Near-infrared (NIR) scanner | Liquid | Support vector and partial least squares | E[Y∣X] = f(X, β), where X is an n × m matrix (n scans, m absorption values), Y the observed values of energy (kJ) and carbohydrate (g) in the food samples, f a linear function, and β the coefficients estimated by least squares | NA | NA | N | N | DB | Energy prediction error: less than 2 kJ; carbohydrate prediction error: about 0.12 g |
Notes: WFR, weighed food record; DB, nutrition database/nutrition fact labels.
The study of [61] demonstrated a method that can accurately detect food intake, differentiate between liquids and solids, and estimate the amount consumed (ingested mass). To predict ingested food mass, the investigators proposed separate mass models for solid and liquid intake. The mass model for solid foods was developed by linear regression against the measured weight of food, using the total numbers of chews and swallows during the ingestion period. The liquid mass model used only swallows as a predictor, since liquids do not involve chewing.
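The two per-subject mass models described above reduce to linear combinations of chew and swallow counts. The sketch below uses illustrative function names and example coefficients; the subject-specific average mass per swallow and per chew would come from a calibration meal, not from the values shown.

```python
def predict_solid_mass(mass_per_swallow_g, n_swallows, mass_per_chew_g, n_chews):
    """Predicted mass of ingested solid food (g): linear in swallow and chew counts."""
    return mass_per_swallow_g * n_swallows + mass_per_chew_g * n_chews

def predict_liquid_mass(mass_per_swallow_g, n_swallows):
    """Liquids involve no chewing, so only swallow counts predict mass."""
    return mass_per_swallow_g * n_swallows

# Illustrative values: 5 g/swallow and 0.5 g/chew for solids, 12 g/swallow for liquids
solid_g = predict_solid_mass(5.0, 10, 0.5, 40)   # 70.0 g
liquid_g = predict_liquid_mass(12.0, 5)          # 60.0 g
```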
The authors in [62] proposed an ear-pad chewing sound sensor to identify chewing sequences during food intake and utilized microstructure variables extracted from chewing events to predict bite weight. To develop the model, the method classified foods, detected chewing events, and extracted microstructure variables. Using both the microstructure variables from chewing events and the identified foods, a multiple linear regression model for bite weight was developed against weighed food records. The investigators proposed that bite-weight prediction may potentially assist in estimating EI but did not actually provide such data.
In [63], the authors proposed a wearable sensor system consisting of a microphone and camera. Audio features were extracted in real-time to detect the chewing activity. The camera captured a video sequence for image analysis. Both time domain and frequency domain features were extracted from audio signals and applied to an extreme learning machine (ELM) classifier to classify chewing activity. The system further recognized food content in the images based on color histogram features. The proportion of histogram distances was used to estimate the consumed amount.
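One plausible way to operationalize "proportion of histogram distances" for estimating the amount consumed is to compare the current frame's color histogram distance from the initial frame against the distance of the fully-finished frame. This is an illustrative reading of the approach, not the authors' exact formulation.

```python
import numpy as np

def histogram_distance(h1, h2):
    """L1 distance between two normalized color histograms."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return float(np.abs(h1 - h2).sum())

def proportion_consumed(hist_start, hist_now, hist_end):
    """Current distance from the initial frame, relative to the distance of the
    fully-finished frame, interpreted as the fraction of food consumed."""
    total = histogram_distance(hist_start, hist_end)
    return histogram_distance(hist_start, hist_now) / total if total else 0.0
```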
The Automatic Ingestion Monitor (AIM) combined piezoelectric strain gauge sensors for monitoring jaw motion, a hand-to-mouth motion sensor, and a body motion sensor with a smartphone application [68]. The sensor system has been validated in several conditions and subject groups (both adults and infants) [69]-[72]. Recent publications suggest that the sensor system mounted on eyeglasses can accurately detect chewing and be used for microstructure characterization [72], [73]. The study of [34] presented an EI estimation model based on counts of chews and swallows. In laboratory settings, participants consumed their meals while wearing the sensor system, and their eating behavior was recorded through a video monitoring system. Features extracted from the sensors and from video annotation were used to build EI models. EI was then obtained by multiplying mass intake by the energy density of each food item (obtained from food composition tables). The accuracy of EI estimation was reported for each eating occasion.
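The final step described above, multiplying the ingested mass of each food by its energy density from food composition tables, can be sketched in a few lines. The meal values below are made up for illustration.

```python
def estimate_meal_energy(consumed_items):
    """EI (kcal) = sum of ingested mass (g) x energy density (kcal/g) per food item."""
    return sum(mass_g * kcal_per_g for mass_g, kcal_per_g in consumed_items)

# Hypothetical meal: 150 g of a 1.2 kcal/g entree and 250 g of a 0.4 kcal/g beverage
meal = [(150.0, 1.2), (250.0, 0.4)]
meal_ei = estimate_meal_energy(meal)  # 280.0 kcal
```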
The study of [64] demonstrated a necklace-like wearable system that can potentially be used to monitor food and nutrient intake. The system included an embedded piezoelectric sensor that can detect skin motion in the lower trachea during chewing. The investigators proposed an algorithm based on time-frequency decomposition (spectrogram analysis) of the piezoelectric sensor signals to accurately distinguish between food types, such as liquid and solid, hot and cold drinks, and hard and soft foods.
The study of [74] presented the use of wearable sensors to detect hand-to-mouth gestures and bite events. The wrist-worn "bite counter" device used a gyroscope to detect hand-to-mouth motion related to eating. Using the bite counter, the authors in [35] proposed a multiple regression model predicting an individual's mean kilocalories per bite from anthropometric variables regressed against measured kilocalorie intake. The accuracy of EI estimation was reported for each eating occasion.
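Using the published group equation (reproduced in Table 5), per-meal EI follows from the bite count and the predicted mean kilocalories per bite. This sketch transcribes that equation; the function names are illustrative.

```python
def kcal_per_bite(age_years, is_female, height_in, weight_lb, whr):
    """Group model from [35]; sex is coded 0 for females, 1 for males."""
    sex = 0 if is_female else 1
    return (-0.128 * age_years + 6.167 * sex + 0.034 * height_in
            + 0.035 * weight_lb - 12.012 * whr + 22.294)

def meal_ei_from_bites(bite_count, per_bite_kcal):
    """Meal EI: bite count (from the wrist sensor) times mean kcal per bite."""
    return bite_count * per_bite_kcal
```

For example, a 30-year-old woman of 65 in and 150 lb with a waist-to-hip ratio of 0.8 would have a predicted mean of about 16.3 kcal per bite.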
The study of [65] proposed a multi-modal sensor system comprising in-ear audio and head and wrist motion sensors to classify food type and estimate the amount consumed. Features extracted from the audio and motion sensors were used to capture eating activities and to detect chewing and the duration of intake. To recognize food type, the method trained a random forest classifier with 40 trees using all sensor variables, and a random forest regression with 40 trees was used to create the food weight estimation model. Finally, the estimated amount consumed was used to estimate energy content from food databases. The accuracy of ingested mass was reported for each food item.
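The modeling setup, a 40-tree random forest classifier for food type plus a 40-tree random forest regressor for consumed mass, can be sketched with scikit-learn. The features and targets below are synthetic stand-ins for the audio/motion features; this is illustrative only, not the authors' feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)
# Synthetic stand-ins for per-intake audio/motion features (e.g., chew rate, duration)
X = rng.normal(size=(200, 6))
food_type = (X[:, 0] > 0).astype(int)               # toy food-type labels
mass_g = 50 + 20 * X[:, 1] + rng.normal(size=200)   # toy consumed-mass targets

clf = RandomForestClassifier(n_estimators=40, random_state=0).fit(X, food_type)
reg = RandomForestRegressor(n_estimators=40, random_state=0).fit(X, mass_g)

pred_type = clf.predict(X[:5])   # predicted food classes
pred_mass = reg.predict(X[:5])   # predicted consumed mass (g)
```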
The study of [66] utilized voice to estimate EI and introduced a system called 'Speech2Health'. The system combined speech processing, natural language processing (NLP), and text mining techniques. Voice data were first converted into text. Using standard NLP and pattern mapping algorithms, the text was matched to nutrition-specific data. Finally, a matching algorithm was proposed to find the food name in the database and compute the EI values. The accuracy of EI was reported for each food item.
Recently, the authors of [67] proposed a method of estimating the energy and nutrients in food using a near-infrared (NIR) scanner. The scanner provides a reflected spectrum pattern from scans of liquid beverage samples. Using the NIR spectral features, support vector regression and partial least squares regression models were developed to predict the energy and carbohydrate contents of food.
The use of wearable systems for estimating EI is increasing due to their potential for capturing detailed characteristics of all ingestion events without imposing a reporting burden on the participant. Many of these systems are accurate in short-term studies in laboratory settings. However, further evaluation in large, community-dwelling populations is needed before their full utility and accuracy can be determined.
V. DISCUSSION
The major goal of this review was to systematically explore the technology-driven methodologies that estimate EI or EI-related measures in humans. The systematic review revealed that various technologies were developed for EI estimation. Detailed review of selected methods revealed that each method has pros and cons and potentially needs further improvement.
Technological advancement and the availability of smartphones have made it possible for the general population to easily capture images of eating episodes. Image-assisted methods allow portion size/volume estimation either by manual review by a trained nutritionist or by automatic image recognition algorithms. Food type recognition and estimation of the amount consumed can also be performed automatically using image processing algorithms [29], [33], [46]-[48], [52], [54]-[56]. By utilizing the objective information in images, several image-assisted methods have shown great promise for EI estimation. Despite these advantages, image-assisted methods are not free from issues that affect performance. The energy content of a food may be difficult to estimate from an image: the type of food may be hard to identify, especially in mixed dishes; foods of different energy content may have similar appearance (e.g., diet vs. regular beverages); and foods may be consumed from containers that obscure the food. A technological issue affecting most image-assisted approaches [29], [33], [46], [48], [52], [55] is the requirement that a fiducial marker be placed in the image. The user has to carry the marker and remember to place it in the image capture area at every meal, which might not be feasible. Another key challenge for the implementation of image-based methods is the collection and complex analysis of data. Several methods use computationally complex algorithms to recognize foods and require processing resources that are not generally available in clinical, rural, or developing-world settings. To measure EI, it is very important both to recognize the food items actually consumed during the meal and to accurately estimate the leftover portions. Apart from that, uneaten food items should be identified, as people may simply be looking at food rather than eating it (e.g., in a restaurant setting, or eating in a group and looking at other people's food).
Development of wearable sensors for EI estimation has gained substantial attention in recent years. Wearable sensors are capable of capturing information about eating microstructure, which may potentially improve the accuracy of EI estimation. Another advantage of wearable sensors is a lower user burden compared with self-report methods, since they reduce the need to self-report every eating episode. However, the performance of wearable sensors depends on the user's compliance with wearing the device and on the social acceptability of the sensor.
Overall, our findings suggest several important aspects for consideration: how the studies reported measures of accuracy, the conditions under which performance was evaluated, the statistical analyses involved, and the use of nutrition expertise in the engineering literature.
We observed that the included studies estimate either EI or EI-related measures such as amount consumed/mass, portion size, volume, cumulative intake, and food recognition. The measures of accuracy for EI estimation in [34], [35], [45], [49], [51], [53], [66], [67] are presented in terms of daily EI, meal/eating-episode EI, and food-item EI. On the other hand, the methods proposed in [29], [33], [47], [48], [52], [54]-[57], [59], [61], [62], [64], [65] reported the accuracy of EI-related measures such as amount consumed, volume, energy density, and food recognition. Lastly, the methods presented in [46], [50], [58], [63] did not specify accuracy. The accuracy of any method is largely dependent on the study population, the size of the dataset, the set of foods present in the dataset, the study environment (laboratory or free-living), and the standard of comparison against which accuracy is evaluated. The best accuracy for EI estimation was reported in [53], with a reporting error of only 6% for food-item EI. However, that study included only 19 types of food items in a controlled setting. Under free-living conditions one can expect a large variety of foods, so the method needs to be validated under such conditions. Another study [51] obtained 89% accuracy in estimating the energy content of just one food item. The average accuracy of the method in [66] in energy content estimation for 30 participants was 92% under lab conditions; however, the food types and number of food items used in the study were not specified. For food classification, the best accuracy was claimed in [55], where the authors obtained 99% accuracy for 30 food items. For volume estimation, the study [59] obtained the highest accuracy of 92% over 10 food items.
Therefore, it is imperative to test the robustness of methods using real-life conditions where the system will encounter a variety of food items presented in different settings and using different containers/serve-ware.
Most of the studies reviewed were restricted to laboratory settings. However, ensuring the robustness of an EI estimation method demands consistent performance in the free-living environment. The laboratory setting often lacks realism due to unnatural eating behavior [75] or potential behavior change, a limited set of food items, mixed meals, eating in groups, etc. Depending on the study setting, an individual may eat quickly or slowly, skip meals, and/or pause between different parts of the meal to talk. Therefore, laboratory experiments might not reflect real life in the free-living environment, and many methods achieve a high degree of accuracy only because of the controlled laboratory setting. Indeed, if a study is to be conducted in the laboratory, the inexpensive gold standard of weighed food records should be utilized.
In this review, the methods presented in [35], [45], [46], [52], [65], [66] were evaluated in both laboratory and free-living conditions. In [45], the free-living EI error was −152 ± 694 kcal/day compared to the weighed food record, whereas in [35] the same method yielded an error of −71.21 ± 562.14 kcal/meal. The study [53] claimed the best EI estimation accuracy of 94%; however, the study setting was not specified. In summary, realistic assessment needs to be done in the free-living environment.
The use of statistical analysis is also important when reporting accuracy. To quantify the degree of error, statistical analysis needs to be applied to assess how well the estimate agrees with the relevant reference/gold standard. In study [45], the validity of the method was tested by comparing weighed EI to estimated EI using Bland-Altman analysis. Similarly, the study [34] analyzed Bland-Altman plots when evaluating the accuracy of the method with respect to the weighed food record. A paired-samples t-test was used in study [35] to compare mean EI estimates between the proposed method and participants' self-report. However, the remaining studies lack statistical analysis or contain only cursory analyses. In the future, authors should substantiate their results using properly designed statistical analyses.
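For reference, the Bland-Altman quantities used in such validations, the mean bias and the 95% limits of agreement between estimated and reference EI, reduce to a few lines. The kcal values below are made up for illustration.

```python
import numpy as np

def bland_altman(reference, estimate):
    """Mean bias and 95% limits of agreement between paired measurements."""
    diff = np.asarray(estimate, dtype=float) - np.asarray(reference, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)            # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

weighed_kcal = [650, 820, 500, 910, 730]    # weighed food record (reference)
estimated_kcal = [600, 870, 540, 880, 760]  # sensor-based estimates
bias, lo, hi = bland_altman(weighed_kcal, estimated_kcal)
```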
A common observation is that very few studies list a nutritionist/dietitian as a co-author. Quite frequently this leads to the use of incorrect terminology, a lack of proper evaluation methods, and overstated claims. Of the 26 reviewed studies, only 8 (30%) [29], [33], [34], [53], [54], [58], [63], [70] listed at least one nutritionist as a co-author. One recommendation based on this review is that authors should seek collaboration within the nutrition community and strive to achieve sound engineering and nutrition science in publications.
While some of the methods show great promise under certain circumstances, objective, accurate, and cost-efficient methods for estimating EI are yet to be developed. Such methods must provide 1) good accuracy in free-living conditions with large sample sizes in multiple studies and populations; 2) accurate recognition of a variety of food items and analysis of the ingested mass, nutrient, and energy contents of consumed foods; 3) characterization of food intake with minimal effort from the user; and 4) appropriate and rigorous statistical analysis.
VI. CONCLUSION
This systematic review explored the technology-driven EI estimation methodologies proposed between January 2005 and August 2017. It is expected that awareness of the pros and cons of existing methodologies will enable researchers to develop more accurate EI estimation methods. Currently, a range of methodologies estimate EI and related measures such as food type recognition, amount consumed, and portion size. Each method has its advantages and disadvantages. Due to the challenges and limitations of current methods, new methods are greatly needed for accurate EI estimation. In addition to developing new methods, improvement of existing energy estimation methods is also needed. Future work should focus on combining different approaches to accurately quantify EI.
Acknowledgments
This work was supported by the National Institute of Diabetes and Digestive and Kidney Diseases of the National Institutes of Health under Award R01DK100796. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Biography

ABUL DOULAH received the bachelor's degree in electrical engineering from the Bangladesh University of Professionals, Dhaka, Bangladesh, in 2011, the master's degree from the Bangladesh University of Engineering and Technology in 2013, and the Ph.D. degree in electrical engineering from the University of Alabama, Tuscaloosa, AL, USA, in 2018. He was a Lecturer with the Department of Electrical Electronics and Computer Engineering, Military Institute of Science and Technology. His research interests include the development of wearable systems, signal analysis of sensors, and machine learning algorithms for preventive, diagnostic, and assistive health technology with a special focus on dietary intake monitoring.

MEGAN A. MCCRORY received the bachelor's and master's degrees in exercise science and the Ph.D. degree in nutrition from the University of California at Davis, in 1986, 1994, and 1997, respectively. She completed postdoctoral training in the Energy Metabolism Laboratory at the Jean Mayer USDA Human Nutrition Research Center on Aging, Tufts University, in 1999. She is currently a Research Associate Professor of nutrition with the Department of Health Sciences, Boston University. She studies eating patterns, such as snacking, eating frequency, meal skipping and timing, dietary variety, taste preferences, and eating food prepared away from home; their interactions with dietary composition factors such as macronutrients and fiber; and their effects on energy balance and appetite. Her research also focuses on dietary assessment methodology and improving the measurement of dietary intake. Her research has been supported by the National Institutes of Health, the USDA Economic Research Service, and the Bill and Melinda Gates Foundation. She was named an International Life Sciences Institute (ILSI) Future Leader in nutrition in 2003, and she currently serves on the Editorial Boards of the scientific journals Advances in Nutrition, Frontiers in Nutrition, and Nutrients.

JANINE A. HIGGINS received the Ph.D. degree in biochemistry from the University of Sydney, with a thesis focusing on the effect of carbohydrate subtype, in particular resistant starch, on insulin sensitivity in rats. She held Postdoctoral Fellowships at the University of Wollongong and the University of Colorado Denver. She has been the Colorado Clinical and Translational Sciences Institute (CCTSI) Nutrition Research Director for the past 15 years and CCTSI Director of Operations for four years. She has more than 18 years of experience in nutrition research in rodent models, children, and adults. Her primary research interests include diet and metabolism: investigating optimal treatment of Type 2 diabetes in children and adolescents (NIH multi-center TODAY study), matching diet to gut microbiome to improve metabolism and inflammation in HIV positive individuals, obesity treatment in adults, and ways to ameliorate weight regain following weight loss in rodent models. She has extensive experience in mentoring Postdoctoral and Clinical Endocrinology and Gastroenterology Fellows and is a member of the selection committee and preceptor for the Children’s Hospital Colorado Academy of Nutrition and Dietetics (AND) Dietetic Internship Program.

EDWARD SAZONOV (M'02-SM'11) received the Diploma degree in systems engineering from the Khabarovsk State University of Technology, Russia, in 1993, and the Ph.D. degree in computer engineering from West Virginia University, Morgantown, WV, USA, in 2002. He is currently a Professor with the Department of Electrical and Computer Engineering, The University of Alabama, Tuscaloosa, AL, USA, and the Head of the Computer Laboratory of Ambient and Wearable Systems. His research interests include wireless, ambient, and wearable devices, and methods of biomedical signal processing and pattern recognition. Devices developed in his laboratory include a wearable sensor for objective detection and characterization of food intake, a highly accurate physical activity and gait monitor integrated into a shoe insole, and a wearable sensor system for monitoring of cigarette smoking. His research has been supported by the National Science Foundation, the National Institutes of Health, the National Academies of Science, and state agencies, private industry, and foundations.
REFERENCES
- [1].Hall KD, Heymsfield SB, Kemnitz JW, Klein S, Schoeller DA, and Speakman JR, “Energy balance and its components: Implications for body weight regulation,” Amer. J. Clin. Nutrition, vol. 95, no. 4, pp. 989–994, April 2012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [2].Schoeller DA and Thomas D, “Energy balance and body composition,” in World Review of Nutrition and Dietetics, vol. 111, Bier DM, Mann J, Alpers DH, Vorster HHE, and Gibney MJ, Eds. Basel, Switzerland: Karger Publishers, 2014, pp. 13–18. [DOI] [PubMed] [Google Scholar]
- [3].Müller O and Krawinkel M, “Malnutrition and health in developing countries,” CMAJ, vol. 173, no. 3, pp. 279–286, August 2005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [4].(2017). Products - Health E Stats - Prevalence of Underweight Among Adults 2011-2012. [Online]. Available: https://www.cdc.gov/nchs/data/hestat/underweight_adult_11_12/underweight_adult_11_12.htm
- [5].(2017). WHO ∣ Obesity and overweight WHO. [Online]. Available: http://www.who.int/mediacentre/factsheets/fs311/en/
- [6].Ng M et al. , “Global, regional, and national prevalence of overweight and obesity in children and adults during 1980–2013: A systematic analysis for the global burden of disease study 2013,” Lancet, vol. 384, no. 9945, pp. 766–781, August 2014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [7].Finkelstein EA, Trogdon JG, Cohen JW, and Dietz W, “Annualmedical spending attributable to obesity: Payer-and service-specific estimates,” Health Affairs, vol. 28, no. 5, pp. w822–w831, September 2009. [DOI] [PubMed] [Google Scholar]
- [8].Lehnert T, Sonntag D, Konnopka A, Riedel-Heller S, and Konig H-H, “Economic costs of overweight and obesity,” Best Pract. Res. Clin. Endocrinol. Metabolism, vol. 27, no. 2, pp. 105–115, April 2013. [DOI] [PubMed] [Google Scholar]
- [9].Hales CM, Fryar CD, Carroll MD, Freedman DS, and Ogden CL, “Trends in obesity and severe obesity prevalence in US youth and adults by sex and age, 2007–2008 to 2015–2016,” Jama, vol. 319, no. 16, pp. 1723–1725, 2018. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [10].Sánchez-Carracedo D, Neumark-Sztainer D, and López-Guimera G, “Integrated prevention of obesity and eating disorders: Barriers, developments and opportunities,” Public Health Nutrition, vol. 15, no. 12, pp. 2295–2309, December 2012. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [11].Gilmore LA, Ravussin E, Bray GA, Han H, and Redman LM, “An objective estimate of energy intake during weight gain using the intake-balance method,” Amer. J. Clin. Nutrition, vol. 100, no. 3, pp. 806–812, July 2014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [12].Ainslie P, Reilly T, and Westerterp K, “Estimating human energy expenditure: A review of techniques with particular reference to doubly labelled water,” Sports Med. Auckl. NZ, vol. 33, no. 9, pp. 683–698, February 2003. [DOI] [PubMed] [Google Scholar]
- [13].Ndahimana D and Kim E-K, “Measurement methods for physical activity and energy expenditure: A review,” Clin. Nutrition Res, vol. 6, no. 2, pp. 68–80, April 2017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [14].Dannecker KL, Sazonova NA, Melanson EL, Sazonov ES, and Browning RC, “A comparison of energy expenditure estimation of several physical activity monitors,” Med. Sci. Sports Exerc, vol. 45, no. 11, pp. 2105–2112, November 2013. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [15].(2018). Multifactor Screener in the 2000 National Health Interview Survey Cancer Control Supplement: Overview. [Online]. Available: https://epi.grants.cancer.gov/nhis/multifactor/
- [16].Goris AHC and Westerterp KR, “Underreporting of habitual food intake is explained by undereating in highly motivated lean women,” J. Nutrition, vol. 129, no. 4, pp. 878–882, April 1999. [DOI] [PubMed] [Google Scholar]
- [17].Goris AH, Westerterp-Plantenga MS, and Westerterp KR, “Undereating and underrecording of habitual food intake in obese men: Selective underreporting of fat intake,” Am. J. Clin. Nutrition, vol. 71, no. 1, pp. 130–134, January 2000. [DOI] [PubMed] [Google Scholar]
- [18].Johansson G, Wikman Å, Ahrén AM, Hallmans G, and Johansson I, “Underreporting of energy intake in repeated 24-hour recalls related to gender, age, weight status, day of interview, educational level, reported food intake, smoking habits and area of living,” Public Health Nutr, vol. 4, no. 4, pp. 919–927, August 2001. [DOI] [PubMed] [Google Scholar]
- [19].Hall KD and Chow CC, “Estimating changes in free-living energy intake and its confidence interval,” Amer. J. Clin. Nutrition, vol. 94, no. 1, pp. 66–74, 2011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [20].Thomas DM, Martin CK, Heymsfield S, Redman LM, Schoeller DA, and Levine JA, “A simple model predicting individual weight change in humans,” J. Biol. Dyn, vol. 5, no. 6, pp. 579–599, November 2011. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [21].Kissileff HR, Thornton J, and Becker E, “A quadratic equation ade-quately describes the cumulative food intake curve in man,” Appetite, vol. 3, no. 3, pp. 255–272, September 1982. [DOI] [PubMed] [Google Scholar]
- [22].Thomas DM et al. , “A new universal dynamic model to describe eating rate and cumulative intake curves,” Am. J. Clin. Nutr, vol. 105, no. 2, pp. 323–331, January 2017. [DOI] [PMC free article] [PubMed] [Google Scholar]
- [23].Coward WA, “The doubly-labelled-water (2H218O) method: Principles and practice,” Proc. Nutrition Soc, vol. 47, no. 3, pp. 209–218, September 1988. [DOI] [PubMed] [Google Scholar]
- [24] Schoeller DA and Hnilicka JM, "Reliability of the doubly labeled water method for the measurement of total daily energy expenditure in free-living subjects," J. Nutrition, vol. 126, no. 1, pp. 348S–354S, January 1996.
- [25] Black AE and Cole TJ, "Within- and between-subject variation in energy expenditure measured by the doubly-labelled water technique: Implications for validating reported dietary energy intake," Eur. J. Clin. Nutrition, vol. 54, no. 5, pp. 386–394, May 2000.
- [26] Livingstone MBE and Black AE, "Markers of the validity of reported energy intake," J. Nutrition, vol. 133, no. 3, pp. 895S–920S, March 2003.
- [27] Corella D and Ordovás JM, "Biomarkers: Background, classification and guidelines for applications in nutritional epidemiology," Nutr. Hosp, vol. 31, no. 3, pp. 177–188, February 2015.
- [28] Westerterp KR, "Doubly labelled water assessment of energy expenditure: Principle, practice, and promise," Eur. J. Appl. Physiol, vol. 117, no. 7, pp. 1277–1285, July 2017.
- [29] Zhu F et al., "The use of mobile devices in aiding dietary assessment and evaluation," IEEE J. Sel. Topics Signal Process, vol. 4, no. 4, pp. 756–766, August 2010.
- [30] Higgins JA et al., "Validation of photographic food records in children: Are pictures really worth a thousand words?" Eur. J. Clin. Nutr, vol. 63, no. 8, pp. 1025–1033, March 2009.
- [31] Sazonov E, Farooq M, and Melanson E, "Assessment of ingestion by chewing and swallowing sensors," in Advances in the Assessment of Dietary Intake, 2015. [Online]. Available: https://www.taylorfrancis.com/
- [32] Martin CK, Nicklas T, Gunturk B, Correa JB, Allen HR, and Champagne C, "Measuring food intake with digital photography," J. Hum. Nutrition Dietetics, vol. 27, no. 1, pp. 72–81, January 2014.
- [33] Rahman MH et al., "Food volume estimation in a mobile phone based dietary assessment system," in Proc. 8th Int. Conf. Signal Image Technol. Internet Based Syst., November 2012, pp. 988–995.
- [34] Fontana JM et al., "Energy intake estimation from counts of chews and swallows," Appetite, vol. 85, pp. 14–21, February 2015.
- [35] Salley JN, Hoover AW, Wilson ML, and Muth ER, "Comparison between human and bite-based methods of estimating caloric intake," J. Acad. Nutr. Diet, vol. 116, no. 10, pp. 1568–1577, October 2016.
- [36] Thompson F and Subar AF, "Dietary assessment methodology," in Nutrition in the Prevention and Treatment of Disease, 3rd ed. San Diego, CA, USA: Academic, 2013, pp. 5–46.
- [37] Archundia Herrera MC and Chan CB, "Narrative review of new methods for assessing food and energy intake," Nutrients, vol. 10, no. 8, p. 1064, August 2018.
- [38] Shook RP, Hand GA, and Blair SN, "Top 10 research questions related to energy balance," Res. Quart. Exerc. Sport, vol. 85, no. 1, pp. 49–58, January 2014.
- [39] Hargrove JL, "History of the calorie in nutrition," J. Nutrition, vol. 136, no. 12, pp. 2957–2961, December 2006.
- [40] Grunwald GK, Seagle HM, Peters JC, and Hill JO, "Quantifying and separating the effects of macronutrient composition and non-macronutrients on energy density," Brit. J. Nutrition, vol. 86, no. 2, pp. 265–276, August 2001.
- [41] (2017). Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. [Online]. Available: http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.1000097
- [42] Kikunaga S, Tin T, Ishibashi G, Wang D-H, and Kira S, "The application of a handheld personal digital assistant with camera and mobile phone card (Wellnavi) to the general population in a dietary survey," J. Nutrition Sci. Vitaminol. (Tokyo), vol. 53, no. 2, pp. 109–116, April 2007.
- [43] Wang D-H, Kogashiwa M, and Kira S, "Development of a new instrument for evaluating individuals' dietary intakes," J. Am. Diet. Assoc, vol. 106, no. 10, pp. 1588–1593, October 2006.
- [44] Wang D-H, Kogashiwa M, Ohta S, and Kira S, "Validity and reliability of a dietary assessment method: The application of a digital camera with a mobile phone card attachment," J. Nutritional Sci. Vitaminol, vol. 48, no. 6, pp. 498–504, December 2002.
- [45] Martin CK, Han H, Coulon SM, Allen HR, Champagne CM, and Anton SD, "A novel method to remotely measure food intake of free-living individuals in real time: The remote food photography method," Brit. J. Nutrition, vol. 101, no. 3, pp. 446–456, July 2009.
- [46] Wu W and Yang J, "Fast food recognition from videos of eating for calorie estimation," in Proc. IEEE Int. Conf. Multimedia Expo, June 2009, pp. 1210–1213.
- [47] Wazumi M, Han XH, Ai D, and Chen YW, "Auto-recognition of food images using SPIN feature for Food-Log system," in Proc. 6th Int. Conf. Comput. Sci. Converg. Inf. Technol. (ICCIT), November 2011, pp. 874–877.
- [48] Kong F and Tan J, "DietCam: Automatic dietary assessment with mobile camera phones," Pervas. Mob. Comput, vol. 8, no. 1, pp. 147–163, February 2012.
- [49] Pouladzadeh P, Shirmohammadi S, and Al-Maghrabi R, "Measuring calorie and nutrition from food image," IEEE Trans. Instrum. Meas, vol. 63, no. 8, pp. 1947–1956, August 2014.
- [50] Myers A et al., "Im2Calories: Towards an automated mobile vision food diary," in Proc. IEEE Int. Conf. Comput. Vis. (ICCV), December 2015, pp. 1233–1241.
- [51] McAllister P, Zheng H, Bond R, and Moorhead A, "Semi-automated system for predicting calories in photographs of meals," in Proc. IEEE Int. Conf. Eng., Technol. Innov./Int. Technol. Manage. Conf. (ICE/ITMC), June 2015, pp. 1–6.
- [52] Zhang W, Yu Q, Siddiquie B, Divakaran A, and Sawhney H, "'Snap-n-Eat' food recognition and nutrition estimation on a smartphone," J. Diabetes Sci. Technol, vol. 9, no. 3, pp. 525–533, April 2015.
- [53] Fang S, Liu C, Zhu F, Delp EJ, and Boushey CJ, "Single-view food portion estimation based on geometric models," in Proc. IEEE Int. Symp. Multimedia (ISM), December 2015, pp. 385–390.
- [54] Sun M et al., "eButton: A wearable computer for health monitoring and personal assistance," in Proc. Des. Autom. Conf., June 2014, pp. 1–6.
- [55] Pouladzadeh P, Kuhad P, Peddi SVB, Yassine A, and Shirmohammadi S, "Food calorie measurement using deep learning neural network," in Proc. IEEE Int. Instrum. Meas. Technol. Conf., May 2016, pp. 1–6.
- [56] Liao HC, Lim ZY, and Lin HW, "Food intake estimation method using short-range depth camera," in Proc. IEEE Int. Conf. Signal Image Process. (ICSIP), August 2016, pp. 198–204.
- [57] Akpro Hippocrate EA, Suwa H, Arakawa Y, and Yasumoto K, "Food weight estimation using smartphone and cutlery," in Proc. 1st Workshop IoT-Enabled Healthcare Wellness Technol. Syst., New York, NY, USA, June 2016, pp. 9–14.
- [58] McClung HL et al., "Digital food photography technology improves efficiency and feasibility of dietary intake assessments in large populations eating ad libitum in collective dining facilities," Appetite, vol. 116, pp. 389–394, September 2017.
- [59] Hassannejad H, Matrella G, Ciampolini P, Munari ID, Mordonini M, and Cagnoni S, "A new approach to image-based estimation of food volume," Algorithms, vol. 10, no. 2, p. 66, June 2017.
- [60] Doulah A et al., "Meal microstructure characterization from sensor-based food intake detection," Front. Nutr, vol. 4, p. 31, January 2017.
- [61] Sazonov E et al., "Toward objective monitoring of ingestive behavior in free-living population," Obesity, vol. 17, no. 10, pp. 1971–1975, October 2009.
- [62] Amft O, Kusserow M, and Troster G, "Bite weight prediction from acoustic recognition of chewing," IEEE Trans. Biomed. Eng, vol. 56, no. 6, pp. 1663–1672, June 2009.
- [63] Liu J et al., "An intelligent food-intake monitoring system using wearable sensors," in Proc. 9th Int. Conf. Wearable Implant. Body Sensor Netw. (BSN), June 2012, pp. 154–160.
- [64] Alshurafa N et al., "Recognition of nutrition intake using time-frequency decomposition in a wearable necklace using a piezoelectric sensor," IEEE Sensors J, vol. 15, no. 7, pp. 3909–3916, February 2015.
- [65] Mirtchouk M, Merck C, and Kleinberg S, "Automated estimation of food type and amount consumed from body-worn audio and motion sensors," in Proc. ACM Int. Joint Conf. Pervas. Ubiquitous Comput., New York, NY, USA, February 2016, pp. 451–462.
- [66] Hezarjaribi N, Mazrouee S, and Ghasemzadeh H, "Speech2Health: A mobile framework for monitoring dietary composition from spoken data," IEEE J. Biomed. Health Inform, vol. 22, no. 1, pp. 252–264, January 2017.
- [67] Thong YJ, Nguyen T, Zhang Q, Karunanithi M, and Yu L, "Predicting food nutrition facts using pocket-size near-infrared sensor," in Proc. 39th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. (EMBC), May 2017, pp. 742–745.
- [68] Fontana JM, Farooq M, and Sazonov E, "Automatic ingestion monitor: A novel wearable device for monitoring of ingestive behavior," IEEE Trans. Biomed. Eng, vol. 61, no. 6, pp. 1772–1779, June 2014.
- [69] Sazonov ES and Fontana JM, "A sensor system for automatic detection of food intake through non-invasive monitoring of chewing," IEEE Sensors J, vol. 12, no. 5, pp. 1340–1348, May 2012.
- [70] Sazonov E et al., "Non-invasive monitoring of chewing and swallowing for objective quantification of ingestive behavior," Physiol. Meas, vol. 29, no. 5, pp. 525–541, May 2008.
- [71] Farooq M and Sazonov E, "Automatic measurement of chew count and chewing rate during food intake," Electronics, vol. 5, no. 4, p. 62, September 2016.
- [72] Farooq M and Sazonov E, "Segmentation and characterization of chewing bouts by monitoring temporalis muscle using smart glasses with piezoelectric sensor," IEEE J. Biomed. Health Inform, vol. 21, no. 6, pp. 1495–1503, November 2016.
- [73] Farooq M, McCrory MA, and Sazonov E, "Reduction of energy intake using just-in-time feedback from a wearable sensor system," Obesity, vol. 25, no. 4, pp. 676–681, February 2017.
- [74] Dong Y, Scisco J, Wilson M, Muth E, and Hoover A, "Detecting periods of eating during free-living by tracking wrist motion," IEEE J. Biomed. Health Informat, vol. 18, no. 4, pp. 1253–1260, July 2014.
- [75] Doulah A, Yang X, Parton J, Higgins J, McCrory M, and Sazonov E, "The importance of field experiments in testing of sensors for dietary assessment and eating behavior monitoring," in Proc. 40th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc. (EMBC), Honolulu, HI, USA, 2018, pp. 1–10.
