Nicotine & Tobacco Research. 2019 Nov 6;22(10):1883–1890. doi: 10.1093/ntr/ntz208

Wearable Egocentric Camera as a Monitoring Tool of Free-Living Cigarette Smoking: A Feasibility Study

Masudul H Imtiaz 1, Delwar Hossain 1, Volkan Y Senyurek 1, Prajakta Belsare 1, Stephen Tiffany 2, Edward Sazonov 1
PMCID: PMC7542642  PMID: 31693162

Abstract

Introduction

Wearable sensors may be used for the assessment of behavioral manifestations of cigarette smoking under natural conditions. This paper introduces a new camera-based sensor system to monitor smoking behavior. The goals of this study were (1) identification of the best position of sensor placement on the body and (2) feasibility evaluation of the sensor as a free-living smoking-monitoring tool.

Methods

A sensor system was developed with a 5MP camera that captured images every second, continuously, for up to 26 hours. Five on-body locations were tested to select the sensor placement. A feasibility study was then performed on 10 smokers to monitor full-day smoking under free-living conditions. Captured images were manually annotated to obtain behavioral metrics of smoking, including smoking frequency, smoking environment, and puffs per cigarette. The smoking environment and puff counts captured by the camera were compared with self-reported smoking.

Results

A camera located on the eyeglass temple produced the maximum number of smoking images and the minimum number of blurry or overexposed images (53.9%, 4.19%, and 0.93% of the total captured, respectively). Under free-living conditions, 286,245 images were captured, with a mean (±standard deviation) duration of sensor wear of 647 (±74) minutes/participant. Image annotation identified consumption of 5 (±2.3) cigarettes/participant overall, 3.1 (±1.1) indoors and 1.9 (±0.9) outdoors, and 9.02 (±2.5) puffs/cigarette. Statistical tests found significant differences between manual annotation and self-report for both smoking environment and puff counts.

Conclusions

A wearable camera-based sensor may facilitate objective monitoring of cigarette smoking, categorization of smoking environments, and identification of behavioral metrics of smoking in free-living conditions.

Implications

The proposed camera-based sensor system can be employed to examine cigarette smoking under free-living conditions. Smokers may accept this unobtrusive sensor for extended wear, as the sensor would not restrict the natural pattern of smoking or daily activities, nor would it require any active participation from a person except wearing it. Critical metrics of smoking behavior, such as the smoking environment and puff counts obtained from this sensor, may generate important information for smoking interventions.

Introduction

The World Health Organization (WHO) estimates seven million global deaths per year due to tobacco use.1 Research shows strong evidence of health consequences caused by cigarette smoking.2–4 In 2017, 14% of the adult population of the United States (34.3 million people) were reported to be active cigarette smokers.5 Though many people who smoke cigarettes attempt to quit, most attempts end in failure.6–8 Although multiple treatments are available to help people quit smoking, the overall success rates of smoking cessation interventions are low. Accurate information about daily smoking is important for evaluating the effectiveness of smoking intervention methods and for generating a basic understanding of smoking behavior.9 Self-report methods have been widely used in clinical interventions to estimate daily cigarette consumption, but self-reports are prone to biases and may be inaccurate over time.10,11 Electronic diaries have also been used by smokers to record each cigarette immediately after smoking, but regular smokers may not record many of the cigarettes they consume.12 Biomarker-based approaches may not provide accurate estimates of nicotine exposure for light smokers.13 Puff topography devices14 can provide information about smoking frequency and puffing behavior in real time; however, these devices require the user to remember and comply with instructions every time a cigarette is smoked, and, as with diary methods, people may choose not to use the device for every cigarette.

More recently, modern technologies have been applied to the assessment of cigarette consumption. Body-worn sensors that monitor breathing, ECG, inertial activity, and hand-mouth proximity have been evaluated in an attempt to capture smoking-related behaviors.15–19 Smoking events have been recognized through changes in heart rate recorded by ECG sensors19,20 and through analyses of breathing patterns recorded by respiratory inductive plethysmographs21–24 and acoustic sensors.25 Accelerometers, gyroscopes, and compasses have been employed to characterize smoking gestures from the movement of the wrist and arm.26,27 RF proximity sensors have been used to capture hand-to-mouth gestures for the recognition of smoking.28 However, these wearables may not be suitable in their current form or size for lengthy data capture under free-living conditions.29 Hence, there is a need for alternative sensor methodologies to provide new insights into assessing smoking under free-living conditions.

Cigarette smoking is a visible event that involves the hand and lips of the smoker. Recently, image/video analysis has been introduced for the detection of smoking,30–35 building on technological advances in computer vision. From video sequences, the reported methods generated smoking statistics by analyzing visual cues related to the smoker’s hand movements and face. However, these methods require either the smoker to be in the field of view of the camera, the installation of video cameras in all smoking locations, or a portable video camera (eg, on a smartphone) pointed at the mouth during smoking. These restrictive methods do not allow for longer periods of observation under free-living conditions. In other domains, such as free-living monitoring of eating,36,37 food preparation,38 fall detection,39 and behavior analysis,40,41 target events have been successfully captured through body-worn egocentric cameras (with a gaze- or body-aligned view) such as Microsoft’s SenseCam,42 the ear-worn Micro-Camera,43 eButton,44 EgoTracker,45 and so forth. These wearable cameras have been employed to assess dietary intake,43 measure ingredients and nutrients during food preparation,38 and detect and prevent fall events.39

The current study evaluated the potential of wearable camera systems for capturing cigarette smoking under free-living conditions. The goal of the feasibility study was to develop and test a wearable camera-based sensor system that would reliably capture smoking behavior from a convenient body location and provide new insights on smoking-related measures such as smoking environment and social interactions during smoking. The research included (1) identification of the best position of sensor placement to ensure maximum capture of images related to cigarette smoking and (2) feasibility evaluation of the applicability of this wearable system as a “free-living smoking-monitoring” tool by testing participants smoking in their natural environments.

Methods

Wearable Sensor System

The proposed sensor system was designed with two major goals: (1) to capture high-quality, high-resolution digital images and (2) to be a wearable, lightweight, and miniature module that would not interfere with natural smoking behavior or daily activities.

The system featured a wide-angle (120°) 5-megapixel camera that captured digital images of 2592 × 1944 resolution (5MP) every second, continuously, for up to 26 hours. This short capture interval of 1 second was selected because a smoking event generally lasts 8–10 minutes, with 3–5 seconds per puff.29 The captured images were saved, with capture timestamps, on a 32-GB micro SD card (a capacity of ~36 hours, or 1.5 days).

The camera interfacing circuitry and image storage unit were installed on a small 6.5 × 1.9 × 1.5 cm PCB and enclosed in a 3D-printed plastic enclosure. The total weight of the system, including the enclosure, was 11 g. A cylindrical Li-ion battery (2200 mAh capacity, 46 g weight, 69 mm length, 18 mm diameter) powered the system and was placed in a separate plastic enclosure to prevent mechanical damage during wear.

Sensor Placement

Wearable cameras allow users to capture videos or images of objects of interest and their surroundings. They can be mounted on the frame of eyeglasses; clipped to a shirt pocket or collar; or attached to a headband, wristband, or waist belt.46 Images captured by a wearable camera may suffer from target loss (the desired object missing from the scene), occlusion by body parts, motion blur, or overexposure (white-looking or washed-out images) when facing the sun or other light sources.46,47 A laboratory test was performed on the proposed system to identify the on-body sensor position least susceptible to these factors. In this test, a volunteer smoker smoked five cigarettes while wearing identical sensors simultaneously at five body locations (shown in Figure 1). As the volunteer smoked with his right hand as the dominant hand, three sensors were placed on the right side of the body and one on the left: (1) a sensor attached to the right temple of the eyeglasses, (2) a sensor on the right shoulder, (3) a sensor in the middle of the chest, (4) a sensor on the inner side of the right wrist, and (5) a sensor on the left shoulder. With the sensors applied, the volunteer smoked three cigarettes indoors and two outdoors under bright sun. The recorded image set was reviewed to identify the cigarette-lighting images and smoking images from each sensor. These two categories were defined to be mutually exclusive: a “cigarette-lighting image” was a captured image that showed a cigarette being lit by a cigarette lighter, and a “smoking image” was a captured image that showed a cigarette held in the hand or lips for puffing.

Figure 1. Camera placement test: cameras were placed on the eyeglass temple (location 1), the shoulders (locations 2 and 5), the chest (location 3), and the inner side of the wrist (location 4). At location 1, the camera was gaze-aligned and attached with double-sided acrylic tape; at locations 2–5, the camera faced the mouth and was attached with double-sided body and fabric tapes.

Next, the blurry and over-exposed images were identified among the captured images. The following statistics were then computed for each sensor (a computational sketch follows this list):

  • Percentage of smoking images = (n_s / n) × 100

  • Percentage of blurry images = (n_m / n) × 100

  • Percentage of over-exposed images = (n_o / n) × 100

where n_s is the total number of smoking images captured over the five smoking events, n_m the total number of motion-blurred images, n_o the total number of over-exposed images, and n the total number of images captured over the five smoking events, from the lighting of a cigarette to the end of its last puff.
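As a concrete illustration, these statistics reduce to simple ratio computations. The sketch below (function and variable names are ours, not from the study) reproduces the Position 1 values reported in Table 1:

```python
def placement_metrics(n_s: int, n_m: int, n_o: int, n: int) -> dict:
    """Per-sensor image statistics over the five smoking events.

    n_s: smoking images, n_m: motion-blurred images,
    n_o: over-exposed images, n: all captured images.
    """
    return {
        "smoking_pct": 100.0 * n_s / n,
        "blurry_pct": 100.0 * n_m / n,
        "overexposed_pct": 100.0 * n_o / n,
    }

# Position 1 (eyeglass temple), using the counts reported in Table 1:
print(placement_metrics(n_s=528, n_m=41, n_o=9, n=978))
# -> approximately 54.0%, 4.2%, and 0.9% (cf. Table 1)
```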

Based on the results of this placement test (provided in Table 1), location 1 (the eyeglass temple) was selected for placement of the proposed sensor system (Figure 2), as this eye-level location produced the greatest number of smoking images and the fewest blurry or overexposed images among all tested body locations.

Table 1. Results of the Sensor Placement Test Involving Five Cigarette-Smoking Events

Metric                                                 Position 1   Position 2   Position 3   Position 4   Position 5
Total smoking images                                       528          357          289          246          312
Smoking images, % of total captured (978 images)          53.9%        36.5%        29.5%        25.1%        31.9%
Total blurry images                                         41           78           58           96           87
Blurry images, % of total captured                         4.19%        7.97%        5.93%        9.81%        8.89%
Total overexposed or underexposed images                     9           11           18           17           13
Overexposed/underexposed images, % of total captured       0.93%        1.12%        1.84%        1.3%         1.32%

Positions 1–5 indicate eye level (eyeglass temple), the shoulder on the dominant-hand side, the middle of the chest, the wrist, and the opposite shoulder, respectively.

Figure 2. Wearable system attached to the eyeglass temple with an external battery.

Free-Living Study

A human study, approved by the Institutional Review Board (IRB) at the University of Alabama, was performed to validate the effectiveness of the proposed wearable system under free-living conditions. Ethical guidance specific to wearable-camera research48,49 recommends that researchers strive to protect the rights, privacy, dignity, and well-being of those they study; that research be (as far as possible) based on voluntary informed consent; that personal information be treated confidentially and participants anonymized unless they choose to be identified; and that participants be informed of the extent to which anonymity and confidentiality can be ensured in publication and dissemination, and of the potential reuse of data. Consistent with this guidance, autonomy, confidentiality, and anonymity were considered while developing the protocols for the smoking study.

Smokers were recruited through written announcements on the University of Alabama campus, fliers, emails, and word of mouth. The inclusion criteria were (1) age 19–70 years, (2) current cigarette smoker with at least 1 year of smoking history, (3) willingness to follow study procedures, and (4) not currently pregnant. No interested smoker was denied participation. Informed consent was obtained from each participant after explicitly describing the sensor system and the image storage and review process. Participants also had the choice of releasing captured images for use in research publications. The recruited participants included nine males and one female with an average (±standard deviation) age of 31.90 ± 4.11 years (range: 24–40 years) and a smoking history of 7.03 ± 3.66 years (range: 1–12 years). Self-reported cigarette consumption was 8.69 ± 4.71 cigarettes per day (range: 2–20): two participants reported smoking 1–3 cigarettes/day, three reported 4–7, two reported 8–12, and three reported 13–20. Participants visited the laboratory in the morning (7–8 am, before their first-morning cigarette) for measurement of expired-air carbon monoxide (BreathCO, Vitalograph); the mean CO level was 10 ± 3.21 ppm (range: 5–16 ppm). Six participants were scheduled to perform the study on weekdays and four on weekends.

During the lab visit, the sensor system was attached to the eyeglasses of the five participants who wore their own; the remaining five participants were provided noncorrective eyeglasses. After the sensor was applied and the battery module was inserted into a study-provided armband, participants left the lab and wore the system until they went to sleep at night. No restrictions were imposed on their smoking or other activities. During the study, participants smoked their own cigarettes with their own lighters.

Participants self-reported their smoking using a commercial cellphone application (aTimeLogger), customized to register the start and end of each smoking event. Participants were asked to count the puffs of each cigarette and report the count on the cellphone immediately after smoking. Participants also recorded their smoking environment: inside a building, alone or with others; outdoors, alone or with others; or inside a car.

Consistent with the ethical guidance for wearable-camera research,48,49 participants were asked during the study to take off the sensor system wherever privacy could be expected (eg, restrooms); in situations that would make wearing uncomfortable, such as sports and water activities; and when requested by a person in their surroundings. The sensor system was also to be removed while sleeping. To reduce participant burden during this free-living study, participants were not asked to report (in the cellphone application) the periods of sensor removal; however, they were advised not to smoke while the sensor was removed.

Following data collection, participants could review the images on the sensor storage (SD card) and remove any of them before research personnel reviewed them. Participants were not asked to report the number of deleted images or any details of the events (except cigarette smoking) from which images were deleted. All participants declared that they did not remove any images of smoking events. Approved images were moved to a password-protected Windows 10 computer and backed up to secure cloud storage. Cellphone annotations were also collected and verified by the participants.

After completing the study, participants filled out a “Sensor Burden Assessment” questionnaire and commented on whether they would agree to wear the system in a further multi-day experiment. Participants received $20 remuneration for participation.

Data Analysis

The following details were extracted from participants’ cellphone annotations: (1) the total number of cigarettes consumed during the study, (2) the number consumed indoors, (3) the number consumed outdoors, (4) the number consumed alone, (5) the number consumed with other people, (6) the number consumed inside a car, (7) the total number of puffs across smoking events, and (8) the mean (±standard deviation) number of puffs per smoking event.

Next, two independent research associates manually annotated the collected images to label the smoking and cigarette-lighting images. As these two categories were mutually exclusive, they were labeled “1” and “2,” respectively; all remaining images were labeled “0.” The smoking environment was annotated with the same categories as the self-report: inside a building, alone or with others; outdoors, alone or with others; or inside a car. Cohen’s kappa coefficient was computed as the statistical measure of inter-rater agreement (a computational sketch follows). Images on which the reviewers disagreed were identified; in the presence of a third independent reviewer, the two reviewers discussed the disagreements and re-annotated those images to maintain consistency in the image annotation.
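Cohen’s kappa can be computed directly from the two reviewers’ per-image label vectors; a minimal sketch using scikit-learn, with toy placeholder labels rather than study data:

```python
from sklearn.metrics import cohen_kappa_score

# Per-image labels from each reviewer:
# 0 = other, 1 = smoking, 2 = cigarette lighting.
reviewer_1 = [0, 0, 1, 1, 2, 0, 1, 0, 0, 2]
reviewer_2 = [0, 0, 1, 0, 2, 0, 1, 0, 0, 2]

kappa = cohen_kappa_score(reviewer_1, reviewer_2)
print(f"Cohen's kappa: {kappa:.2f}")
```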

Adherence to sensor wear could be measured from an accelerometer (based on periods of inactivity, the approach commonly used in actigraphy) or from the camera images (manual or automatic inspection of the captured scene). For the current study, adherence was estimated by manual inspection of the images: runs of consecutive images with no change in the captured scene were marked, and the duration of sensor removal was estimated from them. In the future, this process could be automated by examining image file sizes, applying computer vision methods, or inspecting accelerometer signals together with the images; a minimal sketch of one image-based approach follows.
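This sketch assumes that a removed, stationary sensor yields long runs of near-identical frames; the downsampling size and threshold are arbitrary choices, not values from the study:

```python
import numpy as np
from PIL import Image

def frames_unchanged(path_a: str, path_b: str, thresh: float = 2.0) -> bool:
    """Treat two consecutive frames as 'unchanged' if the mean absolute
    pixel difference of their downsampled grayscale versions is below thresh."""
    a = np.asarray(Image.open(path_a).convert("L").resize((64, 48)), dtype=float)
    b = np.asarray(Image.open(path_b).convert("L").resize((64, 48)), dtype=float)
    return float(np.abs(a - b).mean()) < thresh

# At 1 frame/s, a run of unchanged frames longer than, say, 600 images
# (10 minutes) could be flagged as a candidate sensor-removal period.
```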

From the image annotation, cigarette-smoking events were next categorized by body posture or activity. Static postures included sitting and standing; activities included walking or mixed activities (eg, sitting followed by walking or vice versa). Next, the numbers of smoking images captured under good and low lighting conditions were computed for each smoking event. An image with "good lighting" was defined as a captured image in which both the cigarette and the smoking environment were clearly distinguishable; an image with "low lighting" was one in which the cigarette or the smoking environment was not distinguishable because of the dark environment.

The categorical data on the smoking environment obtained from images and from self-reports were compared with Pearson’s chi-squared test. Puff counts were obtained from sequences of images showing a cigarette (held in the hand) moved toward the mouth and returned after smoke inhalation. The puff counts obtained from images were compared with the self-reported counts using a paired-sample t test with a 95% confidence interval (sketches of both tests follow).
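Both tests are standard; a minimal SciPy sketch with placeholder numbers (not the study’s data):

```python
from scipy.stats import chi2_contingency, ttest_rel

# Pearson's chi-squared test on a contingency table of smoking-environment
# categories (rows: self-report vs. image annotation; columns: e.g.,
# indoors, outdoors, car). Placeholder counts:
table = [[28, 15, 7],
         [31, 12, 7]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")

# Paired-sample t test on per-cigarette puff counts (placeholder values):
self_reported = [8, 9, 7, 10, 8, 11, 6, 9]
image_counted = [9, 9, 8, 11, 8, 12, 7, 9]
t, p = ttest_rel(self_reported, image_counted)
print(f"t({len(self_reported) - 1}) = {t:.2f}, p = {p:.3f}")
```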

Results

During the five smoking events of the sensor placement test, each sensor captured a total of 978 images, from the start of cigarette lighting to the end of the cigarette. The results of the placement test are summarized in Table 1. The sensor placed on the eyeglass temple produced the maximum number of smoking images (528), which was 53.9% of the total capture. The cigarette was not evident in the remaining 450 (46.1%) images, which were considered nonsmoking images. The eyeglass-temple sensor also produced the minimum numbers of blurry (41) and overexposed (9) images, which were 4.19% and 0.93% of the total captured images, respectively. The sensor at the next-best location (the shoulder on the dominant-hand side) produced 357 smoking images (36.5% of total capture), 78 blurry images (7.97%), and 11 overexposed images (1.12%).

An instance of cigarette smoking captured by all cameras is provided in the Supplementary Figure.

A summary of the recordings during the free-living portion of the study is provided in Table 2. Participants smoked an average of 5 ± 2.3 cigarettes per day. Participants’ responses to the “Sensor Burden Assessment” questionnaire are provided in the Supplementary Table.

Table 2. Overview of the Human Study

Parameter                                                                 Self-report (cellphone)   Manual image observation
Total consumed cigarettes                                                          50                        50
Average (±SD) consumed cigarettes per participant                               5 ± 2.3                   5 ± 2.3
Total cigarettes consumed indoors                                                  31                        31
Total cigarettes consumed outdoors                                                 19                        19
Total smoking hours                                                               8.7 h                     8.7 h
Cigarettes smoked alone                                                            37                        37
Cigarettes smoked in a group                                                       13                        13
Cigarettes smoked in a car                                                          7                         7
Total number of puffs during smoking                                              413                       428
Average (±SD) puffs per cigarette                                               8.7 ± 2.3                 9.02 ± 2.5
Range of puffs per cigarette                                                      4–17                      4–17
Total captured images during the study                                            N/A                     286,245
Total captured smoking images                                                     N/A                       816
Average smoking images per participant                                            N/A                       81.6
Average smoking images per cigarette                                              N/A                        17
Total captured cigarette-lighting images                                          N/A                       119
Average cigarette-lighting images per cigarette                                   N/A                         3
Cigarettes smoked while sitting                                                   N/A                        23
Cigarettes smoked while standing or walking                                       N/A                         5
Cigarettes smoked in mixed activities (partly sitting, partly standing)           N/A                        22
Total smoking images in good lighting conditions                                  N/A                       698
Total smoking images in poor lighting conditions                                  N/A                       118

The image annotation process took ~54 hours for reviewer one and ~63 hours for reviewer two. Among the 286,245 images, the reviewers initially disagreed on 18 images annotated as smoking versus nonsmoking (kappa coefficient 0.99). Among the 816 smoking images, there was disagreement on 47 images annotated as smoking indoors versus outdoors (kappa coefficient 0.87) and 35 images annotated as smoking in a group versus alone (kappa coefficient 0.91). The reviewers later re-annotated these images to maintain consistency in the image annotation.

During the full-day study, participants wore the sensor system for an average of 647 ± 74 minutes and removed it for an average of 115 ± 48 minutes. The longest continuous period of sensor removal was ~3.5 hours, while a participant was taking a nap (as the participant reported at the end of the study). All participants reported that they did not smoke during periods of sensor removal.

Both reviewers identified a total of 50 smoking events (31 indoors and 19 outdoors), identical to the self-reports. The average cigarette consumption estimated from the image annotation was 5 ± 2.3 cigarettes per day.

The chi-squared test indicated a significant difference between the smoking environment obtained from images and from self-report, χ²(2, N = 50) = 58.12, p < 0.001.

The t test showed a significant difference, t(49) = −2.54, p = 0.014, between the number of puffs identified through self-report (mean 8.7 ± 2.3 puffs/cigarette) and through manual annotation (mean 9.02 ± 2.5 puffs/cigarette).

Discussion

This paper reports the development and feasibility of a novel camera-based sensor system for monitoring cigarette smoking under free-living conditions. The sensor system evaluated in this initial study was lightweight, miniature, and capable of capturing and recording high-resolution images every second. Moreover, the system battery supported full-day image collection (~26 hours) on a single charge. With the 32-GB micro SD memory, the system can store up to ~36 hours of images (~1.5 days), assuming an average image size of 250 KB. Micro SD cards are commercially available with capacities up to 2 TB, potentially providing up to 90 days of storage (the arithmetic is sketched below).
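The storage figures quoted above follow from simple arithmetic; a sketch assuming one image per second and the 250-KB average image size from this study:

```python
AVG_IMAGE_KB = 250       # average image size observed in this study
SECONDS_PER_HOUR = 3600

def storage_hours(card_gb: float) -> float:
    """Hours of continuous 1 Hz capture that fit on a card of card_gb gigabytes."""
    images = card_gb * 1e6 / AVG_IMAGE_KB   # GB -> KB, then number of images
    return images / SECONDS_PER_HOUR        # one image per second

print(f"{storage_hours(32):.1f} h")            # ~35.6 h, the quoted ~36 hours
print(f"{storage_hours(2000) / 24:.0f} days")  # ~93 days, consistent with the quoted ~90 days
```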

The camera of the sensor system featured a wide-angle lens to capture details of smoking events. Placement of the system at eye level performed better than the other locations in terms of minimal target loss, motion blur, and over-exposure. The main reason for the low number of smoking images from other locations was that the cigarette was occluded by the wrist or fingers of the hand holding it. Also, sensors facing the mouth only captured cigarette images when the cigarette was moved toward the mouth for puffing, whereas placement at eye level captured both the cigarette at the mouth and the cigarette held in the hand between puffs. The ultimate goal of this research is to develop computer vision methods that automatically detect smoking events from the captured images: the more often the cigarette is present in the scene, the more likely smoking events are to be detected, so images captured from eye level may be more valuable than images captured from other body parts. The high number of blurry images from other locations was likely due to movement of the hand or torso where the sensors were placed; movement of the clothing to which sensors were attached may also have contributed to blurring.

During the sensor placement test, an average of 106 images was captured during a single cigarette. Even though only 54% of these images (captured from eye level, the best location) contained the cigarette in the scene, the cigarette was reliably detected because it appeared frequently across images. Likewise, in the free-living study no smoking events were lost, even though not all images were "smoking" images. Only some images contain the cigarette because the cigarette leaves the camera's view when it is not near the mouth or when the person is not looking at his or her hands. These "nonsmoking" images (not containing a cigarette) are as important as "smoking" images (containing a cigarette) for understanding the smoking environment, the smoking gesture, and the activities performed during the smoking event.

In the placement test, attachment to the eyeglass temple was the only eye-level placement examined. Other eye-level options (such as a camera attached to a headband or cap) were not examined, given the prevalence of eyeglasses in everyday life (an estimated 64% of people in the United States and 48.34% in Europe use eyeglasses daily). Consequently, attachment of the proposed system to the eyeglass temple appears more feasible than other eye-level options.

Examination of the performance of the camera system involved cigarette smokers of different ages, smoking histories, and smoking frequencies, as smoking patterns may vary with the type or level of smoking. During the study, participants smoked cigarettes in different body postures, in different environments, and under varied lighting conditions. Of the 50 captured smoking events, 23 occurred while sitting, 5 while standing or walking, and 22 during mixed activities; 31 cigarettes were smoked indoors and 19 outdoors. The system captured detailed information from these smoking events, thereby achieving one of the primary goals of this study.

During the study, an average of 17 smoking images and 3 cigarette-lighting images were captured per cigarette, and an average of 81.6 smoking images per person, which may enable analysis of an individual's overall smoking behavior. The statistical analyses imply that the smoking environment may be established more reliably through review of captured images than through self-report. These images may also complement other sensor systems to obtain a complete overview of individual smoking events.

The proposed camera-based system has an advantage over other sensor systems in that it detects smoking directly, from images of the lit cigarette. Other sensors (eg, hand-gesture, proximity, or breathing sensors) detect smoking indirectly, using behavioral manifestations such as hand gestures and smoking-specific breathing patterns as proxy measures. In the manual inspection of the images captured in the free-living (and pilot) study, all smoking events were correctly detected, with a precision and recall of 1, unlike off-the-shelf sensors: a recently reported smartwatch-based system for detecting cigarette smoking50 achieved a precision of 0.86 and a recall of 0.71, and a combination of proximity and breathing sensors22 achieved a precision of 0.90 and a recall of 0.90 for the same task. The camera-based approach may even be used to capture ground-truth smoking data in free-living conditions for validating other passive measurement systems.

Smoking is a contextually embedded activity, but current methods for tracking smoking manifestations give only limited access to the environmental variables that might control smoking behavior. Captured images provide information about the smoking environment (such as indoor/outdoor or solo/social smoking) that is not available from any other method. Unlike other sensor systems, the camera-based system can identify social interactions and locations that may promote smoking. This information about activity and context during, or immediately prior to, smoking could play an important role in developing smoking intervention methods. The timestamps embedded in the images provide additional information on time of day, duration, and frequency of smoking.

During the study, 37 cigarettes were smoked alone and 13 were smoked socially. The visual content of the smoking images could be mined to infer smokers' social associations while smoking. As an alternative to GPS-based tracking (suggested in ref. 16), significant cues about smoking locations might be extracted from the captured images. This information could help smoking cessation or intervention methods identify the social interactions or locations that promote smoking.

During the study, smokers self-reported an average of 8.7 ± 2.3 puffs per cigarette. The number of puffs, puff duration, and inter-puff interval are important variables for understanding the dynamics of cigarette smoking. The statistical comparison of manual puff counts and self-reports shows that self-report may under-represent puff counts; manual analysis of smoking images would likely provide a more accurate account than self-report data.

The current study was primarily concerned with the feasibility of the wearable camera for the detection of smoking in naturalistic settings. The initial results support further evaluation of this approach in an expanded multi-day study involving smokers of varying ages and demographic profiles. A major limitation of the current feasibility study is the small number of participants; this small and uneven sample may not represent the broader population of smokers. Only 1 of the 10 participants was female, and the sensor system requires more testing with female participants with longer hairstyles to determine whether captured images would be obscured by longer hair.

Participants’ responses in the “Sensor Burden Assessment” review showed that most smokers were comfortable wearing the device and were interested in performing a multi-day experiment. They reported that the wearable system restricted neither the way they moved nor the way they smoked, and that they did not feel awkward wearing it. On a scale of 0 to 10, participants provided ratings of 9.68 ± 0.24 on the Emotion questions (details provided in the Supplementary Table), 8.10 ± 0.94 on Perceived Change, 8.91 ± 0.53 on the Anxiety questions, and 8.87 ± 0.19 on the Movement question. Previously, the acceptance of PACT 2.0 (a multisensory system for smoking monitoring) was evaluated with a general acceptability questionnaire, in which subjects rated their comfort during 24 hours of free-living wear at 8.3 ± 0.31 out of 10. The acceptance of the proposed wearable camera-based system is comparable to that of PACT 2.0. The acceptability of other sensor systems employed in smoking research has not been evaluated by any sensor-burden or acceptability questionnaire.

The proposed sensor system may help smokers objectively learn about their smoking habits and identify environmental factors that encourage cigarette smoking. The only requirement from the user's perspective is wearing the eyeglass camera, which might be challenging for those who do not normally wear glasses. The free-living smoking study included five participants who did not use eyeglasses and were provided lab eyeglasses to hold the sensor system. In the "Sensor Burden Assessment" review, they neither reported discomfort in using eyeglasses for a day nor differed much in their ratings of the sensor system: on the Emotion questions, participants with their own eyeglasses provided ratings of 9.9 versus 9.46 for participants with lab-provided eyeglasses; on Perceived Change, 8.32 versus 7.88; on the Anxiety questions, 9.03 versus 8.8; and on the Movement question, 8.93 versus 8.11.

This study evaluated the acceptance of the sensor system by the study participants, but no feedback was obtained from the people in contact with them. The sensor system also needs to be tested across multiple days to evaluate whether its reception might change as a consequence of nonparticipants' reactions, concerns over privacy, or alterations in smoking habits. It would also be useful to directly assess the privacy concerns of people in contact with participants wearing the sensor system.

A limitation of the current study was that the identification of smoking events required manual inspection of the captured images. The next step of this research is the development of computer vision methods for automatic detection of smoking events, generation of related statistics, and performance comparison with concurrent systems. Advanced methodologies such as deep learning could be useful in this regard (one illustrative sketch follows). However, 118 of the 816 smoking images were captured in low-light conditions, and automatic identification of a cigarette held in the hand in those images may be challenging for a computer algorithm. These challenges will be addressed in future research.
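As one illustration of that direction (a sketch under our own assumptions, not the authors' method), a pretrained CNN could be fine-tuned as a binary smoking/nonsmoking frame classifier; the architecture, input size, and hyperparameters below are placeholders:

```python
import torch
import torch.nn as nn
from torchvision import models

# Fine-tune a pretrained ResNet-18 to label egocentric frames as
# nonsmoking (0) or smoking (1).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """images: (N, 3, 224, 224) normalized batch; labels: (N,) in {0, 1}."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Low-light frames, such as the 118 reported here, would likely dominate the error cases, so exposure normalization or data augmentation would be worth testing.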

In summary, this research introduced a miniature egocentric wearable camera system for the detection of cigarette smoking. Initial testing identified eye level as the best sensor placement. Evaluation of the system under free-living conditions indicated that it could capture detailed smoking images under a range of circumstances, and manual observation of the captured images suggested that their analysis might provide important behavioral information about daily smoking.

Funding

Research reported in this publication was supported by the National Institute on Drug Abuse of the National Institutes of Health under award number R01DA035828. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.

Declaration of Interests

None declared.

Supplementary Material

ntz208_suppl_Supplementary_Figure
ntz208_suppl_Supplementary_Table

References

  • 1. Centers for Disease Control and Prevention. Fast Facts. Smoking and Tobacco Use.
  • 2. American Cancer Society. Harmful Chemicals in Tobacco Products.
  • 3. National Cancer Institute. Harms of Cigarette Smoking and Health Benefits of Quitting.
  • 4. Centers for Disease Control and Prevention, Office on Smoking and Health. Smoking and Tobacco Use: Fact Sheet; Tobacco-Related Mortality.
  • 5. Centers for Disease Control and Prevention, Office on Smoking and Health. Smoking and Tobacco Use: Fact Sheet; Fast Facts.
  • 6. Gallup Inc. Most U.S. Smokers Want to Quit, Have Tried Multiple Times. Gallup.com.
  • 7. Gilpin EA, Pierce JP, Farkas AJ. Duration of smoking abstinence and success in quitting. J Natl Cancer Inst. 1997;89(8):572–576.
  • 8. Khati I, Menvielle G, Chollet A, Younès N, Metadieu B, Melchior M. What distinguishes successful from unsuccessful tobacco smoking cessation? Data from a study of young adults (TEMPO). Prev Med Rep. 2015;2:679–685.
  • 9. Tobacco Use and Dependence Guideline Panel. Treating Tobacco Use and Dependence: 2008 Update. US Department of Health and Human Services; 2008.
  • 10. Patrick DL, Cheadle A, Thompson DC, Diehr P, Koepsell T, Kinne S. The validity of self-reported smoking: A review and meta-analysis. Am J Public Health. 1994;84(7):1086–1093.
  • 11. Hatziandreu EJ, Pierce JP, Fiore MC, Grise V, Novotny TE, Davis RM. The reliability of self-reported cigarette consumption in the United States. Am J Public Health. 1989;79(8):1020–1023.
  • 12. Klasnja P, Pratt W. Healthcare in the pocket: Mapping the space of mobile-phone health interventions. J Biomed Inform. 2012;45(1):184–198.
  • 13. Benowitz NL, Jacob P III, Ahijevych K, et al. Biochemical verification of tobacco use and cessation. Nicotine Tob Res. 2002;4(2):149–159.
  • 14. Shihadeh A, Antonios C, Azar S. A portable, low-resistance puff topography instrument for pulsating, high-flow smoking devices. Behav Res Methods. 2005;37(1):186–191.
  • 15. Sazonov E, Metcalfe K, Lopez-Meyer P, Tiffany S. RF hand gesture sensor for monitoring of cigarette smoking. In: 2011 Fifth International Conference on Sensing Technology. IEEE; 2011:426–430. doi:10.1109/ICSensT.2011.6137014
  • 16. Imtiaz M, Ramos-Garcia R, Senyurek V, Tiffany S, Sazonov E. Development of a multisensory wearable system for monitoring cigarette smoking behavior in free-living conditions. Electronics. 2017;6(4):104.
  • 17. Nongpoh B, Ray R, Dutta S, Banerjee A. AutoSense: A framework for automated sensitivity analysis of program data. IEEE Trans Softw Eng. 2017;43(12):1110–1124.
  • 18. Echebarria ITU, Imtiaz SA, Peng M, Rodriguez-Villegas E. Monitoring smoking behaviour using a wearable acoustic sensor. Conf Proc IEEE Eng Med Biol Soc. 2017;2017:4459–4462.
  • 19. Imtiaz MH, Senyurek VY, Belsare P, Tiffany S, Sazonov E. Objective detection of cigarette smoking from physiological sensor signals. In: 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2019:3563–3566. doi:10.1109/EMBC.2019.8856831
  • 20. Wattal S, Spear SK, Imtiaz MH, Sazonov E. A polypyrrole-coated textile electrode and connector for wearable ECG monitoring. In: 2018 IEEE 15th International Conference on Wearable and Implantable Body Sensor Networks (BSN). 2018:54–57. doi:10.1109/BSN.2018.8329657
  • 21. Lopez-Meyer P, Sazonov E. Automatic breathing segmentation from wearable respiration sensors. In: 2011 Fifth International Conference on Sensing Technology. 2011:156–160. doi:10.1109/ICSensT.2011.6136953
  • 22. Senyurek VY, Imtiaz MH, Belsare P, Tiffany S, Sazonov E. A comparison of SVM and CNN-LSTM based approach for detecting smoke inhalations from respiratory signal. In: 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2019:3262–3265. doi:10.1109/EMBC.2019.8856395
  • 23. Ramos-Garcia RI, Imtiaz MH, Sazonov E, Tiffany ST. Evaluation of RIP sensor calibration stability for daily estimation of lung volume. IEEE; 2017:1–5. doi:10.1109/ICSensT.2017.8304419
  • 24. Ali AA, Hossain SM, Hovsepian K, Rahman MM, Plarre K, Kumar S. mPuff: Automated detection of cigarette smoking puffs from respiration measurements. IEEE; 2012:269–280. doi:10.1109/IPSN.2012.6920942
  • 25. Cui J, Wang L, Gu T, Tao X, Lu J. An audio-based hierarchical smoking behavior detection system based on a smart neckband platform. In: Proceedings of the 13th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (MOBIQUITOUS 2016). New York, NY: ACM; 2016:190–199. doi:10.1145/2994374.2994384
  • 26. Senyurek V, Imtiaz M, Belsare P, Tiffany S, Sazonov E. Cigarette smoking detection with an inertial sensor and a smart lighter. Sensors. 2019;19(3):570.
  • 27. Senyurek VY, Imtiaz MH, Belsare P, Tiffany S, Sazonov E. Smoking detection based on regularity analysis of hand to mouth gestures. Biomed Signal Process Control. 2019;51:106–112.
  • 28. Sazonov E, Metcalfe K, Lopez-Meyer P, Tiffany S. RF hand gesture sensor for monitoring of cigarette smoking. In: 2011 Fifth International Conference on Sensing Technology. 2011:426–430. doi:10.1109/ICSensT.2011.6137014
  • 29. Imtiaz MH, Ramos R, Senyurek VY, Belsare P, Tiffany S, Sazonov E. Wearable sensors for monitoring of cigarette smoking in free living: A systematic review. Sensors. 2019;19(12).
  • 30. Gubbi J, Marusic S, Palaniswami M. Smoke detection in video using wavelets and support vector machines. Fire Saf J. 2009;44(8):1110–1115.
  • 31. Wu P, Hsieh J, Cheng J, Cheng S, Tseng S. Human smoking event detection using visual interaction clues. In: 2010 20th International Conference on Pattern Recognition. 2010:4344–4347. doi:10.1109/ICPR.2010.1056
  • 32. Pavlovic VI, Sharma R, Huang TS. Visual interpretation of hand gestures for human-computer interaction: A review. IEEE Trans Pattern Anal Mach Intell. 1997;19(7):677–695.
  • 33. Brinkman MC, Kim H, Chuang JC, et al. Comparison of true and smoothed puff profile replication on smoking behavior and mainstream smoke emissions. Chem Res Toxicol. 2015;28(2):182–190.
  • 34. Zheng X, Wang J, Shangguan L, Zhou Z, Liu Y. Design and implementation of a CSI-based ubiquitous smoking detection system. IEEE/ACM Trans Netw. 2017;25(6):3781–3793.
  • 35. Zheng X, Wang J, Shangguan L, Zhou Z, Liu Y. Smokey: Ubiquitous smoking detection with commercial WiFi infrastructures. In: IEEE INFOCOM 2016 - The 35th Annual IEEE International Conference on Computer Communications. San Francisco, CA: IEEE; 2016:1–9. doi:10.1109/INFOCOM.2016.7524399
  • 36. Gemming L, Utter J, Ni Mhurchu C. Image-assisted dietary assessment: A systematic review of the evidence. J Acad Nutr Diet. 2015;115(1):64–77.
  • 37. Doulah A, Sazonov E. Clustering of food intake images into food and non-food categories. In: Rojas I, Ortuño F, eds. Bioinformatics and Biomedical Engineering. Vol 10208. Cham: Springer International Publishing; 2017:454–463. doi:10.1007/978-3-319-56148-6_40
  • 38. Raber M, Patterson M, Jia W, Sun M, Baranowski T. Utility of eButton images for identifying food preparation behaviors and meal-related tasks in adolescents. Nutr J. 2018;17(1):32. https://www.ncbi.nlm.nih.gov/pubmed/29477143. Accessed March 15, 2019.
  • 39. Ozcan K, Velipasalar S, Varshney PK. Autonomous fall detection with wearable cameras by using relative entropy distance measure. IEEE Trans Hum-Mach Syst. 2016:1–9. doi:10.1109/THMS.2016.2620904
  • 40. Doherty AR, Hodges SE, King AC, et al. Wearable cameras in health: The state of the art and future possibilities. Am J Prev Med. 2013;44(3):320–323.
  • 41. Doherty AR, Kelly P, Kerr J, et al. Use of wearable cameras to assess population physical activity behaviours: An observational study. The Lancet. 2012;380:S35. doi:10.1016/S0140-6736(13)60391-8
  • 42. Hodges S, Berry E, Wood K. SenseCam: A wearable camera that stimulates and rehabilitates autobiographical memory. Memory. 2011;19(7):685–696.
  • 43. Pettitt C, Liu J, Kwasnicki RM, Yang GZ, Preston T, Frost G. A pilot study to determine whether using a lightweight, wearable micro-camera improves dietary assessment accuracy and offers information on macronutrients and eating rate. Br J Nutr. 2016;115(1):160–167.
  • 44. Jia W, Li Y, Qu R, et al. Automatic food detection in egocentric images using artificial intelligence technology. Public Health Nutr. 2018:1–12. doi:10.1017/S1368980018000538
  • 45. Nigam J, Rameshan RM. EgoTracker: Pedestrian tracking with re-identification in egocentric videos. In: 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW). Honolulu, HI: IEEE; 2017:980–987. doi:10.1109/CVPRW.2017.134
  • 46. Mayol-Cuevas WW, Tordoff BJ, Murray DW. On the choice and placement of wearable vision sensors. IEEE Trans Syst Man Cybern A Syst Hum. 2009;39(2):414–425. doi:10.1109/TSMCA.2008.2010848
  • 47. Dai S, Wu Y. Motion from blur. In: 2008 IEEE Conference on Computer Vision and Pattern Recognition. Anchorage, AK: IEEE; 2008:1–8. doi:10.1109/CVPR.2008.4587582
  • 48. Kelly P, Marshall SJ, Badland H, et al. An ethical framework for automated, wearable cameras in health behavior research. Am J Prev Med. 2013;44(3):314–319.
  • 49. Shipp V, Skatova A, Blum J, Brown M. The ethics of wearable cameras in the wild. In: 2014 IEEE International Symposium on Ethics in Science, Technology and Engineering. Chicago, IL: IEEE; 2014:1–5. doi:10.1109/ETHICS.2014.6893382
  • 50. Skinner AL, Stone CJ, Doughty H, Munafò MR. StopWatch: The preliminary evaluation of a smartwatch-based system for passive detection of cigarette smoking. Nicotine Tob Res. 2019;21(2):257–261.
