Author manuscript; available in PMC 2012 Apr 1.
Published in final edited form as: Eur J Clin Nutr. 2011 May 18;65(10):1156–1162. doi: 10.1038/ejcn.2011.75

Feasibility Testing of an Automated Image-Capture Method to Aid Dietary Recall

Lenore Arab 1, Deborah Estrin 2, Donnie H Kim 2, Jeff Burke 2, Jeff Goldman 2
PMCID: PMC3172367  NIHMSID: NIHMS288506  PMID: 21587282

Abstract

Background/Objectives

The accuracy of dietary recalls might be enhanced by providing participants with photo images of foods they consumed during the test period.

Subjects/Methods

We examined the feasibility of a system (Image-DietDay) in which a user-initiated, camera-equipped mobile phone is programmed to automatically capture and transmit images to a secure website for use in conjunction with computer-assisted, multi-pass, 24-hour dietary recalls. Fourteen participants tested the system during 2007, wearing the device during eating periods on each of three independent days. Image-processing filters successfully eliminated underexposed, overexposed, and blurry images. Participants accessed the captured images using the ImageViewer software while completing the 24-hour dietary recall on the following day.

Results

None of the participants reported difficulty using the ImageViewer. Images were deemed “helpful” or “sort of helpful” by 93% of participants. A majority (79%) of users reported having no technical problems, but 71% rated the burden of wearing the device as somewhat to very difficult, owing to issues such as limited battery life, self-consciousness about wearing the device in public, and concerns about the camera’s field of view.

Conclusion

Overall, these findings suggest that automated imaging is a promising technology to facilitate dietary recall. The challenge of managing the thousands of images generated can be met. Smaller devices with a broader field of view may help overcome users' self-consciousness about using or wearing the device.

Keywords: dietary assessment, automated imaging, 24-hour recalls, food frequency questionnaire, camera phones, web-based assessment, wireless technology

INTRODUCTION

Quantitative, valid, and inexpensive approaches to dietary assessment of large cohorts are needed to fulfill the promise of nutrigenomic research. Recent findings suggest that 24-hour dietary recalls are likely to remain the most feasible method for gathering dietary intake information (Schatzkin et al., 2003; Schatzkin & Kipnis, 2004). Advantages of the 24-hour dietary recall over other measures, such as food frequency questionnaires, include its reliance on short-term, rather than long-term, memory and its relatively low respondent burden. Additionally, the 24-hour dietary recall is only moderately vulnerable to a telescoping effect, that is, the forward or backward displacement of an event in time (Janssen et al., 2006), is less likely to cause participants to modify their diets due to enhanced self-awareness (Wang et al., 2002; Kikunaga et al., 2007), and does not require tailoring to specific populations or redesign and recalibration as the food supply changes. However, a single 24-hour dietary recall is unlikely to be representative of an individual's habitual diet, and the number of days of assessment necessary to obtain a stable estimate is food- and nutrient-dependent (Beaton et al., 1979). Further, recalls may be hindered by flawed short-term memory, resulting in unreported items (omissions), falsely reported items (intrusions) (Smith, 1993; Baxter et al., 2006), and inaccurate reporting of amounts consumed (Neuhouser et al., 2008).

Web-based technology for acquiring multiple 24-hour dietary recalls has been demonstrated to be a feasible approach toward improving dietary assessment (Arab et al., 2010), even though a threshold effect is apparent (Schatzkin et al., 2003). Further improvement may be afforded by coupling memory-enhancing technology with web-based recalls. The use of camera-equipped mobile phones for capturing food images as an alternative to paper-and-pencil dietary records has been tested by several groups, most using self-initiated image capture (Wang et al., 2006; Kikunaga et al., 2007; Six et al., 2010; Weiss et al., 2010) and others using automated image capture with phone calls to trigger or remind participants to initiate recording (Sun et al., 2010). Attempts to use camera phones to replace participant-based reporting face several difficulties, including 1) uncertainty regarding third-party or automated identification of foods in the images, 2) flawed decision-making regarding whether imaged foods were actually consumed by the participant (e.g., prepared for others, discarded), and 3) reduced reporting of consumption of socially undesirable items due to biased imaging.

The current study examines the use of a novel, cost-effective image-assisted recall method that combines automatic image capture (to reduce participant reactivity) with a web-based 24-hour dietary recall. The goal was to test the feasibility of this approach (pairing dietary images with the recall) to enhance dietary assessment. Mobile phone cameras equipped for automatic imaging were worn by participants during eating periods, and images were uploaded to a secure website for use by participants during the reporting period the next day. Because the images are used directly by the participant, this approach differs in both concept and substance from non-automated, camera phone-based dietary assessment methods that require decoding of images by automation or a third party (Wang et al., 2006; Kikunaga et al., 2007; Swanson, 2008; Boushey et al., 2009; Six et al., 2010; Weiss et al., 2010).

METHODS

System Components

System components are diagrammed in Figure 1. Nokia N80 mobile phones equipped with three-megapixel cameras, automatic flash, and a 4.7 mm lens (35 mm equivalent) were used for image capture. The mobile phones were attached to a lanyard and worn around the neck of the participant, as seen in Figure 2. Application-specific power management techniques (Kansal & Srivastava, 2003; Raghunathan et al., 2005; Raghunathan et al., 2006) were used to balance power savings and required performance (Kansal et al., 2006). Software developed by engineers at the UCLA Center for Embedded Networked Sensing (CENS), Los Angeles, CA, was employed to automatically record, time stamp, and transmit (via cellular service) encrypted data from sensors integrated in the phone (e.g., camera and microphone) to the secure SensorBase Server repository. Images were captured every 10 seconds, allowing near-complete documentation of foods and beverages consumed throughout the target 24-hour period with minimal user intervention. Industry-standard encrypted transmission was used for web access to the repository and for transmission of images between mobile phones and data storage servers. Images were filtered to remove blurry, poorly exposed, and otherwise unclear images (Bradski, 2000; Kovesi, 2010) and filed chronologically in clusters, based on “key” images, to facilitate use by participants (Figure 3). The total number of images presented to the participant was limited to fewer than 100. A customized web-based ImageViewer was developed to allow participants private access to their own images and the ability to delete objectionable images when desired (Figure 4). Images flagged as private (i.e., not shared) were immediately and permanently deleted from the repository. Images marked as shared were made available to investigators via a separate private image browser (Reddy et al., 2007). To further protect the privacy of participants, the image repository did not store any other personally identifiable information with the images.
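To make the filtering step concrete, the sketch below shows one way blurry and poorly exposed frames could be rejected using OpenCV (Bradski, 2000), on which the system's image processing drew; the thresholds and helper function are illustrative assumptions rather than the project's actual code.

```python
# Minimal sketch of the kind of quality filtering described above; thresholds
# and function names are illustrative assumptions, not Image-DietDay's code.
import cv2

BLUR_THRESHOLD = 100.0   # assumed: variance of Laplacian below this => blurry
DARK_THRESHOLD = 40      # assumed: mean brightness below this => underexposed
BRIGHT_THRESHOLD = 215   # assumed: mean brightness above this => overexposed

def is_usable(image_path: str) -> bool:
    """Return True if the image passes simple blur and exposure checks."""
    image = cv2.imread(image_path)
    if image is None:
        return False  # unreadable or corrupted upload
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Blur check: a sharp image has high variance in its Laplacian response.
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    if sharpness < BLUR_THRESHOLD:
        return False

    # Exposure check: reject frames that are mostly dark or mostly washed out.
    mean_brightness = gray.mean()
    return DARK_THRESHOLD <= mean_brightness <= BRIGHT_THRESHOLD
```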

Figure 1. Overview of System Components, Processes and Data Flow.

Figure 1

The flow of image information in Image-DietDay comprises the following steps: 1) capture; 2) upload to the SensorBase Server; 3) storage in the data repository; 4) processing; 5) sequential clustering; 6) display on the Centrax web server; 7) deletion of user-identified private images, sharing of user-identified public images, and use of images by volunteers to complete 24-hour recalls; 8) compliance tracking by SensorBase, including frequency of uploading. An automated reminder system (9) prompts users to complete required actions.

Figure 2. Image of the Mobile Phone.

Figure 2

An image of a participant wearing the camera-embedded mobile phone used in the study.

Figure 3. Schematic of Image Processing in Image DietDay.

Figure 3

A) Schematic of image filtering and clustering, with removal of unusable or private images. B) Key images representing each cluster; C) Expanded views of all related images within a cluster.
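The clustering illustrated in Figure 3 groups consecutive images into eating episodes and promotes one frame per cluster as the “key” image. A minimal sketch of one plausible approach, using gaps between capture timestamps as cluster boundaries, is shown below; the gap threshold and the choice of the middle frame as the key image are assumptions for illustration only.

```python
# Illustrative time-gap clustering of timestamped captures into episodes;
# the 2-minute gap and key-image rule are assumed, not the study's method.
from datetime import datetime, timedelta
from typing import List

GAP = timedelta(seconds=120)  # assumed: a 2-minute pause starts a new cluster

def cluster_images(timestamps: List[datetime]) -> List[List[datetime]]:
    """Split capture times into chronological clusters separated by gaps."""
    clusters: List[List[datetime]] = []
    for ts in sorted(timestamps):
        if clusters and ts - clusters[-1][-1] <= GAP:
            clusters[-1].append(ts)
        else:
            clusters.append([ts])
    return clusters

def key_image(cluster: List[datetime]) -> datetime:
    """Pick a representative ('key') frame; here simply the middle one."""
    return cluster[len(cluster) // 2]
```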

Figure 4. Example of Presentation of Food Images for Subject Review.

Figure 4

A web-based ImageViewer was used by participants to access images after filtering; participants could then delete images deemed private from the public record.

Automated imaging carries inherent risks of invasion of privacy, and sensitivity to the risk of identifying study participants is needed in this area of research. Participants can be identified if they pass a reflective surface when the camera takes an image, and a hand entering the camera's field of view also presents an identification risk. This type of risk requires that images be stored in datasets that are independent of, and not cross-linked to, those containing personal data. For this we developed levels of protection that involve participation and approval of the informed participant at each level. Participants were asked whether they wished to share any, some, or all images with study investigators. Any image marked as protected (i.e., not shared) was permanently deleted from the data storage server when the participant logged out of Image-DietDay. Image counts prior to deletion were maintained, along with the timestamps and sequence positions of the deleted images, to assist in understanding deletion patterns and intent; this was intended to address concerns regarding sanitization of food consumption images. “Shared” images were to be accessible, via password-protected interfaces, to the investigators and their designees, who sign confidentiality agreements. Images that were never viewed by the participant (such as when the system determined them to be unclear and therefore unhelpful to display) were deleted. To further protect the privacy of participants, the image repository did not store personally identifiable information. This study received IRB approval for a three-step process: (1) automated image capture from the camera phone; (2) automatic, encrypted upload of images to a secure repository and removal from the device; and (3) a private web portal for each participant that allowed them to review their images and permanently remove any that they did not wish to share with investigators. An independent issue is that of images taken of casual bystanders. The general rule in the United States is that anyone may take photographs of whatever they want when in a public place; however, even on public property, photography can be prohibited if somebody has a “reasonable expectation of privacy”.
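As a rough sketch of the deletion bookkeeping described above, the snippet below removes the image content permanently while retaining only the metadata needed to study deletion patterns (timestamp and sequence position); the field names and in-memory store are hypothetical and purely illustrative.

```python
# Illustrative sketch: discard image bytes but keep deletion metadata, as the
# protection scheme above describes. Field names and storage are assumptions.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DeletionRecord:
    participant_id: str      # pseudonymous ID; no personal identifiers stored
    captured_at: datetime    # original capture timestamp
    sequence_position: int   # position of the image within its eating episode

def delete_private_image(image_store: dict, image_id: str) -> DeletionRecord:
    """Permanently remove the image, returning only pattern-level metadata."""
    meta = image_store.pop(image_id)  # drops pixels/thumbnails from the store
    return DeletionRecord(
        participant_id=meta["participant_id"],
        captured_at=meta["captured_at"],
        sequence_position=meta["sequence_position"],
    )
```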

Pilot Feasibility Testing

Two independent pilot studies of the approach were conducted in August 2007: the first to test technical feasibility and the second to test subjective acceptability.

Technical Feasibility Study (n =10)

Ten young adult CENS employees participated in a technical feasibility exercise designed to test the ability of the system to handle uploaded images collected by simultaneous users. Participants wore and operated the phones at all times, including outside the home, capturing six images per minute throughout the day; each person wore the phone for two to three days.

Acceptability and Human Feasibility Study

Participants in the acceptability and feasibility study were a subset of the 261 generally healthy non-Hispanic Caucasian and African American adult men and women living in greater Los Angeles, California, who had enrolled in the UCLA Energetics Study, a biomarker-based validation study designed to test the feasibility and validity of a multi-media, computer-based nutrition questionnaire, DietDay (www.24hrrecall.com) (Arab et al., 2010). To test the feasibility of the automated image-capture system, “Image-DietDay,” all participants scheduled to enter the Energetics Study in August and September of 2007 were invited to participate in a pilot study using the camera phone system to support 24-hour dietary recall reporting. Fourteen of the 16 eligible volunteers agreed to participate.

Participants in the pilot study wore phones on a lanyard around the neck, with the camera facing outward, for approximately one week (the period between their first and last visits, which ranged from 6 to 10 days). The camera was turned on prior to all eating occasions over a 24-hour period. A self-administered exit questionnaire containing structured and open-ended questions was administered at the end of the testing period. Participants were aware that their role was to provide objective feedback on the practicality of this new technology. During the consent process, technical aspects of the system were explained and participants were informed that they would have the ability to delete images they did not want to become part of the study record. Participants also were instructed in using the images during dietary recall reporting with DietDay. The overall and pilot studies were approved by the UCLA Medical IRB and the UCLA General Clinical Research Center.

Energy Validation using Doubly Labeled Water

Total energy expenditure (TEE) was measured in these participants by the doubly labeled water method (Subar et al., 2003), using the isotope measurement methods described by Schoeller (Schoeller, 1988) and Cole and Coward (Cole & Coward, 1992). Doubly labeled water was administered in a volume of one-fourth to one-half cup at a dose of approximately 2 g of 10 atom percent 18O-labeled water and 0.12 g of 99.9 atom percent deuterium-labeled water per kilogram of measured body weight; subjects also consumed a 50-mL water rinse from the doubly labeled water bottle. Spot urine samples for determination of isotope enrichment were collected prior to dosing, at 1, 2, 3, and 3–4 hours after dosing, and twice on the 15th day after dosing. Deuterium and 18O levels in urine samples were quantified by mass spectrometry, and the values were used to calculate total energy expenditure according to the plateau method (Schoeller, 1992). All isotopic analyses were conducted at the University of Wisconsin (Madison, WI).
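As a worked example of the weight-based dosing above, the short sketch below computes the doses for a hypothetical 70 kg participant; the body weight is an assumed example value, and only the per-kilogram amounts come from the protocol.

```python
# Dose arithmetic for a hypothetical 70 kg participant; per-kilogram doses are
# from the protocol text, the body weight is an assumed example value.
body_weight_kg = 70.0

o18_dose_g = 2.0 * body_weight_kg          # ~2 g of 10 atom % 18O water per kg -> 140 g
deuterium_dose_g = 0.12 * body_weight_kg   # 0.12 g of 99.9 atom % 2H water per kg -> 8.4 g

print(f"18O-labeled water dose:       {o18_dose_g:.0f} g")
print(f"Deuterium-labeled water dose: {deuterium_dose_g:.1f} g")
```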

RESULTS AND DISCUSSION

In the acceptability and feasibility study, 14 of 16 randomly recruited volunteers consented to the Image-DietDay add-on study and received phones. The hardware platform and data transfer were found to be robust: no technical failures occurred during any of the uploading or processing periods, and image processing occurred in seconds. Battery power was an issue, as one battery was not always adequate for an entire day. Although most of the volunteers were technologically naïve, 79% reported having no technical problems with the device (Table 2). Across all users, 110 distinct eating episodes and a total of 11,090 images were uploaded. An average of 101 images was captured per eating episode (range, 1 to 775), reflecting the diversity of shorter and longer eating episodes across participants. When asked about the burden of wearing a camera phone throughout the day, 29% said it was easy, 50% said it was somewhat difficult, 14% said it was difficult, and 7% said it was very difficult.

Table 2.

Responses from adults in the Acceptability and Human Feasibility Study After Wearing a Camera Phone to Capture Food Eating Occasions (n=14)

Response to Burden of Wearing Phone              %
 Very Easy                                       0%
 Easy                                            29%
 Somewhat Difficult                              50%
 Difficult                                       14%
 Very Difficult                                  7%

Response to Experience of Using System           No      Sort Of   Yes
 Technical Problems                              79%     0%        21%
 Images Helpful                                  7%      36%       57%
 Images Clear Enough                             14%     57%       29%
 Comfortable with ImageViewer*                   0%      21%       79%
*

The ImageViewer refers to the software employed by the subjects to view the images from their cameras while conducting their recalls.

General characteristics of the participants can be found in Table 1. The racial distribution of participants was fairly even (43% African-American vs. 57% Caucasian), but the gender distribution was uneven (79% female vs. 21% male). The mean age and body mass index (BMI) of participants were 35 years and 27 kg/m2, respectively. Reported energy intakes, aided by the mobile camera phone images, closely matched total energy expenditure measured by doubly labeled water (medians of 2359 kcal reported vs. 2377 kcal measured, a difference of less than 1%). These nearly identical values illustrate the potential of automated image-capture systems to supplement reporting of energy intake.

Table 1.

Descriptive Characteristics of Automated Image System Test Participants in the Acceptability and Human Feasibility Study.

Sample Characteristics (n = 14) Mean (SD) or Percentage Median (95% CI)
Race
 African-American 43%
 Caucasian 57%
Gender
 Females 79%
 Males 21%
Age, years 35 (12) 31 (24, 46)
BMI 27 (7) 25 (22, 29)
Energy Intake, 6-day DietDay average, kcal 2711 (1225) 2359 (1937, 3034)
TEE, kcal 2519 (609) 2377 (2059, 2657)

BMI = body mass index, TEE = total energy expenditure measured using doubly labeled water

In the current study, participants were offered the chance to provide open-ended comments anonymously at their exit interviews. The major themes noted in the open comment fields related to the need to recharge the battery and to self-consciousness about wearing the device in public, which led in some cases to changes in behavior such as eating out less often or eating more rapidly. Concerns that the camera was missing foods not consumed at a dining table (e.g., snacks eaten on the run or food eaten on the couch) were also expressed. Several participants noted that the device was cumbersome to wear and that they would have preferred a smaller, less conspicuous camera. Future generations of phones that are lighter and smaller, with a field of view wide enough to capture food under a variety of subject postures, would enhance the system. An inexpensive camera phone “fish-eye” lens, which has been shown elsewhere to significantly increase the range of capture, might be a feasible alternative (Committee on Networked Systems of Embedded Computers, 2010).

In the technical feasibility study, all of the phones captured and transferred images as expected. Both the secure image repository and the image processing algorithms handled the full load of incoming images without failure, and all images were transferred completely. Image filtering was rapid, adding only a few seconds to the processing time. The mechanisms for identifying eating episodes and deleting images functioned, but warranted additional design refinement to enhance user-friendliness and to maximize use of the images; in particular, more sophisticated software is needed to manage the number of images presented and to provide access to supporting images that are not initially displayed. Some phones required battery replacement during the trial, indicating that more power than the batteries provided would be needed if this imaging frequency were to be maintained or increased. Analyses of imaging frequency suggest that if the goal is to capture the total diet, more frequent imaging is necessary (Arab & Winter, 2010).

Image-DietDay differs from several recently reported image-capture approaches (Wang et al., 2006; Kikunaga et al., 2007; Swanson, 2008; Boushey et al., 2009; Six et al., 2010; Weiss et al., 2010) in that participants use the autonomously collected images to help recall the foods they ate during the target period. The image-assisted recall method tested in the current study should minimize confusion regarding identification of food items: users did not perceive their own images as burdensome to view, and nearly all (93%) indicated that the images were beneficial (helpful, 57%; sort of helpful, 36%), while only 7% indicated that the images were not helpful. Additionally, the images captured using Image-DietDay may be distinct from those captured when there is intent to focus imaging on foods consumed (Wang et al., 2006; Kikunaga et al., 2007; Swanson, 2008; Boushey et al., 2009; Six et al., 2010). Japanese investigators found that the self-consciousness of food intake caused by the process of intentional imaging may dramatically impact normal eating behavior (Wang et al., 2006) and that enlisting people to actively image their consumption results in lower reported intakes (Kikunaga et al., 2007). The passive, automated image capture used in the current study may reduce the likelihood that participants will alter their normal eating patterns in comparison to intentional imaging, a method in which participants set up each picture and are relied upon to capture their usual diet without changing food selections, eliminating unfavorable foods, or forgetting or deciding not to image certain eating situations.

Automated imaging offers the potential for improved documentation of all foods and beverages consumed, provided the application can be designed to reduce self-awareness of the times of observation. The availability of time-stamped images can support participants' memories, help prevent telescoping of the time a food was consumed and the reporting of phantom foods (intrusions) that were not actually consumed during the target period, and serve as a reminder to report forgotten foods (omissions) (Baxter et al., 2006). The cost-effectiveness of mobile phones complements the accessibility of the Internet as an interview and content-review tool, as both are widely available for data collection (Kovesi, 2010). This approach would transform off-the-shelf mobile phones and web-based services into a system for monitoring individuals' dietary intakes. The use of individuals' cell phones also allows the option of prompted action (e.g., turning on the camera, repositioning the camera, locating connectivity for data upload), either to trigger use or when data transfer issues are detected; depending on user preference, these prompts could be automated calls or text messages. The system can alert the investigator to subject use and compliance based on real-time statistics of uploaded data (e.g., the number, timing, and minimal quality threshold of images) and can provide immediate feedback to the subject, through an automated phone call or text message, if no images are recorded at expected consumption times or if image quality is poor (blurry or dark).
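A minimal sketch of the kind of compliance check described above follows; the expected meal windows, quality threshold, and function names are illustrative assumptions, not part of the deployed system.

```python
# Illustrative compliance check: flag expected eating windows with no usable
# uploads. Window times and the quality threshold are assumed example values.
from datetime import time
from typing import Dict, List, Tuple

EXPECTED_WINDOWS: List[Tuple[time, time]] = [
    (time(7, 0), time(9, 0)),      # breakfast window (assumed)
    (time(11, 30), time(13, 30)),  # lunch window (assumed)
    (time(18, 0), time(20, 30)),   # dinner window (assumed)
]
MIN_QUALITY = 0.5  # assumed 0-1 quality score produced by the image filters

def windows_missing_images(uploads: List[Dict]) -> List[Tuple[time, time]]:
    """Return expected eating windows that have no acceptable uploads.

    Each upload is a dict with a 'captured_at' datetime.datetime and a
    'quality' score, as might be tracked by the upload statistics.
    """
    missing = []
    for start, end in EXPECTED_WINDOWS:
        covered = any(
            start <= u["captured_at"].time() <= end and u["quality"] >= MIN_QUALITY
            for u in uploads
        )
        if not covered:
            missing.append((start, end))
    return missing  # a reminder call or text could be triggered for these
```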

CONCLUSIONS

These pilot studies demonstrate the feasibility of the image-assisted recall method for dietary recall in a select volunteer population. While the applicability of the findings to a randomly selected population is uncertain, the study nevertheless demonstrates that the technical requirements of adequate power, placement, technical stability, and download and image processing time, as well as concerns regarding privacy, can be met. Nearly all participants found the images helpful in reporting their dietary intakes from the previous day. This approach differs from other pilot studies using image capture in that the images are captured automatically after users initiate the camera, and the users themselves later use the images to trigger memory of the foods that were consumed, eliminating the guesswork that would be required of third-party reviewers. Strategies to reduce self-awareness, such as a run-in period of camera use, maintaining random assignment of the 24-hour dietary recall, and miniaturizing and camouflaging the camera, are important areas of development. A validation study will be required to determine how camera wear affects eating behavior, whether the addition of the images improves the objective validity of assessment, and to allow closer assessment of the impact of imaging on the omission and intrusion rates normally observed in recall-based dietary assessment methods.

Acknowledgments

This study was supported in part by National Institutes of Health Grant 5R01CA105048-04 and UCLA, General Clinical Research Centers Program Grant M01-RR000865. We acknowledge the outstanding contributions by the staff of the UCLA General Clinical Research Centers, in particular Joe Kim, Kellie Kutcher, and Ashley Winter, during the study conduct phase. We thank Mary Catherine Cambou, Martha Sensel, PhD, and Jasmine Yaxun Chen for manuscript preparation.

Footnotes

CONFLICT OF INTEREST

The authors declare no conflict of interest.

References

  1. Arab L, Wesseling-Perry K, Jardack P, Henry J, Winter A. Eight self-administered 24-hour dietary recalls using the Internet are feasible in African Americans and Whites: the energetics study. J Am Diet Assoc. 2010;110:857–864. doi: 10.1016/j.jada.2010.03.024.
  2. Arab L, Winter A. Automated camera-phone experience with the frequency of imaging necessary to capture diet. J Am Diet Assoc. 2010;110:1238–1241. doi: 10.1016/j.jada.2010.05.010.
  3. Baxter SD, Smith AF, Nichols MD, Guinn CH, Hardin JW. Children’s dietary reporting accuracy over multiple 24-hour recalls varies by body mass index category. Nutr Res. 2006;26:241–248. doi: 10.1016/j.nutres.2006.05.005.
  4. Beaton GH, Milner J, Corey P, McGuire V, Cousins M, Stewart E, de Ramos M, Hewitt D, Grambsch PV, Kassim N, Little JA. Sources of variance in 24-hour dietary recall data: implications for nutrition study design and interpretation. Am J Clin Nutr. 1979;32:2546–2559. doi: 10.1093/ajcn/32.12.2546.
  5. Boushey CJ, Kerr DA, Wright J, Lutes KD, Ebert DS, Delp EJ. Use of technology in children’s dietary assessment. Eur J Clin Nutr. 2009;63(Suppl 1):S50–57. doi: 10.1038/ejcn.2008.65.
  6. Bradski G. The OpenCV Library. Dr Dobb’s Journal of Software Tools. 2000;25:120–125.
  7. Cole TJ, Coward WA. Precision and accuracy of doubly labeled water energy expenditure by multipoint and two-point methods. Am J Physiol. 1992;263:E965–E973. doi: 10.1152/ajpendo.1992.263.5.E965.
  8. Committee on Networked Systems of Embedded Computers, National Research Council. Embedded, Everywhere: A Research Agenda for Networked Systems of Embedded Computers. Washington, DC: National Academy Press; 2010.
  9. Janssen SM, Chessa AG, Murre JM. Memory for time: how people date events. Mem Cognit. 2006;34:138–147. doi: 10.3758/bf03193393.
  10. Kansal A, Kaiser W, Pottie G, Srivastava MB. Virtual high resolution for sensor networks. Proceedings of the 4th International Conference on Embedded Networked Sensor Systems; 2006. pp. 43–56.
  11. Kansal A, Srivastava MB. An environmental energy harvesting framework for sensor networks. Proceedings of the 2003 International Symposium on Low Power Electronics and Design; 2003. pp. 481–486.
  12. Kikunaga S, Tin T, Ishibashi G, Wang DH, Kira S. The application of a handheld personal digital assistant with camera and mobile phone card (Wellnavi) to the general population in a dietary survey. J Nutr Sci Vitaminol (Tokyo). 2007;53:109–116. doi: 10.3177/jnsv.53.109.
  13. Kovesi P. MATLAB and Octave functions for computer vision and image processing. School of Computer Science and Software Engineering, The University of Western Australia; 2010.
  14. Neuhouser ML, Tinker L, Shaw PA, Schoeller D, Bingham SA, Horn LV, Beresford SA, Caan B, Thomson C, Satterfield S, Kuller L, Heiss G, Smit E, Sarto G, Ockene J, Stefanick ML, Assaf A, Runswick S, Prentice RL. Use of recovery biomarkers to calibrate nutrient consumption self-reports in the Women’s Health Initiative. Am J Epidemiol. 2008;167:1247–1259. doi: 10.1093/aje/kwn026.
  15. Raghunathan V, Ganeriwal S, Srivastava MB. Emerging techniques for long-lived wireless sensor networks. IEEE Communications Magazine. 2006;44:108–114.
  16. Raghunathan V, Pereira C, Srivastava MB, Gupta RK. Energy-aware wireless systems with adaptive power-fidelity tradeoffs. IEEE Transactions on Very Large Scale Integration (VLSI) Systems. 2005;13.
  17. Reddy S, Parker A, Hyman J, Burke J, Estrin D, Hansen M. Image browsing, processing, and clustering for participatory sensing: lessons from a DietSense prototype. Proceedings of the Workshop on Embedded Networked Sensors; 2007.
  18. Schatzkin A, Kipnis V. Could exposure assessment problems give us wrong answers to nutrition and cancer questions? J Natl Cancer Inst. 2004;96:1564–1565. doi: 10.1093/jnci/djh329.
  19. Schatzkin A, Kipnis V, Carroll RJ, Midthune D, Subar AF, Bingham S, Schoeller DA, Troiano RP, Freedman LS. A comparison of a food frequency questionnaire with a 24-hour recall for use in an epidemiological cohort study: results from the biomarker-based Observing Protein and Energy Nutrition (OPEN) study. Int J Epidemiol. 2003;32:1054–1062. doi: 10.1093/ije/dyg264.
  20. Schoeller DA. Measurement of energy expenditure in free-living humans by using doubly labeled water. J Nutr. 1988;118:1278–1289. doi: 10.1093/jn/118.11.1278.
  21. Schoeller DA. Isotope dilution methods. In: Björntorp P, Brodoff BN, editors. Obesity. Philadelphia, PA: Lippincott Co; 1992. pp. 80–88.
  22. Six BL, Schap TE, Zhu FM, Mariappan A, Bosch M, Delp EJ, Ebert DS, Kerr DA, Boushey CJ. Evidence-based development of a mobile telephone food record. J Am Diet Assoc. 2010;110:74–79. doi: 10.1016/j.jada.2009.10.010.
  23. Smith AF. Cognitive psychological issues of relevance to the validity of dietary reports. Eur J Clin Nutr. 1993;47(Suppl 2):S6–18.
  24. Subar AF, Kipnis V, Troiano RP, Midthune D, Schoeller DA, Bingham S, Sharbaugh CO, Trabulsi J, Runswick S, Ballard-Barbash R, Sunshine J, Schatzkin A. Using intake biomarkers to evaluate the extent of dietary misreporting in a large sample of adults: the OPEN study. Am J Epidemiol. 2003;158:1–13. doi: 10.1093/aje/kwg092.
  25. Sun M, Fernstrom JD, Jia W, Hackworth SA, Yao N, Li Y, Li C, Fernstrom MH, Sclabassi RJ. A wearable electronic system for objective dietary assessment. J Am Diet Assoc. 2010;110:45–47. doi: 10.1016/j.jada.2009.10.013.
  26. Swanson M. Digital photography as a tool to measure school cafeteria consumption. J Sch Health. 2008;78:432–437. doi: 10.1111/j.1746-1561.2008.00326.x.
  27. Wang DH, Kogashiwa M, Kira S. Development of a new instrument for evaluating individuals’ dietary intakes. J Am Diet Assoc. 2006;106:1588–1593. doi: 10.1016/j.jada.2006.07.004.
  28. Wang DH, Kogashiwa M, Ohta S, Kira S. Validity and reliability of a dietary assessment method: the application of a digital camera with a mobile phone card attachment. J Nutr Sci Vitaminol (Tokyo). 2002;48:498–504. doi: 10.3177/jnsv.48.498.
  29. Weiss R, Stumbo PJ, Divakaran A. Automatic food documentation and volume computation using digital imaging and electronic transmission. J Am Diet Assoc. 2010;110:42–44. doi: 10.1016/j.jada.2009.10.011.
