Abstract
Air pollution exposure characterization has been shaped by many constraints, including technologies that provide insufficient coverage across space and/or time to characterize individual- or community-level exposures with adequate accuracy and precision. However, there is now capacity for continuous monitoring of many air pollutants using comparatively inexpensive, real-time sensors. Crucial questions remain regarding whether these sensors perform adequately for various potential end uses and whether performance varies over time or across ambient conditions. Scrutiny of sensor performance via lab- and field-testing and calibration across their lifetime is necessary for interpretation of data, and has important implications for end users, including cost effectiveness and ease of use. We developed a comparatively lower-cost, portable, in-home air sampling platform and a guiding development and maintenance workflow that achieved our goal of characterizing some key indoor pollutants with high sensitivity and reasonable accuracy. Here we describe the process of selecting, validating, calibrating, and maintaining our platform – the Environmental Multi-pollutant Monitoring Assembly (EMMA) – over the course of our study to date. We highlight necessary resources and consider implications for communities or researchers interested in developing such platforms, focusing on PM2.5, NO, and NO2 sensors. Our findings emphasize that lower-cost sensors should be deployed with caution, given financial and resource costs that greatly exceed sensor costs, but that selected community objectives could be supported at lesser cost, and community-based participatory research strategies could be used for more wide-ranging goals.
Keywords: lower-cost sensor technology, indoor air pollution, sensor calibration, citizen science, air pollution monitoring
Graphical abstract
1.0. Introduction
1.1. Use of lower-cost sensors
Measurement of air pollutant concentrations at refined spatiotemporal scales can serve many purposes, including regulatory compliance, exposure assessment, epidemiology, and community use. Characterization of environmental exposures has been shaped by logistical, technological, and financial constraints, which have typically resulted in passive or integrated monitoring campaigns that do not allow for finer-scale pollution measurements that would facilitate source attribution or identify short-term events relevant to health (Yu et al., 2008). Recent advances in the availability of lower-cost portable devices measuring noise, thermal conditions, and air pollutants have provided new opportunities to transform the landscape of exposure assessment (MacDonell, 2014). There is now capacity for continuous monitoring of many air pollutants, including nitrogen dioxide (NO2) and fine particulate matter with an aerodynamic diameter of ≤2.5 microns (PM2.5), using comparatively inexpensive, real-time sensors. While these sensors have been utilized for some time in occupational settings, only recently have they met criteria for accuracy/precision and become cost-effective enough to sample at many locations simultaneously. Lower-cost sensor technology allows for abundant, fast, and widely-distributed real-time measurements that are amenable to the “citizen science” movement. The U.S. EPA has characterized “low-cost” devices as those costing less than USD $2,500, a threshold often taken to reflect the capital investment limit for citizens (Morawska et al., 2018). These devices come with challenges involving performance under realistic environmental sampling conditions, which can limit accuracy and interpretability of data for stakeholders (Cross et al., 2017). Also, methods to ensure data accuracy and interpretability often require technical resources (e.g., calibration equipment, computer software) inaccessible to individuals or communities.
The process of selecting sensors and performance attributes depends on study goals, in terms of priority measures as well as accuracy of measurements. Identifying air pollutant concentration exceedances of regulatory standards (e.g., U.S. National Ambient Air Quality Standards) or classifying concentrations as “high” or “low” may be sufficient, but in other cases, accurate continuous measurements at low concentrations may be necessary and warrant specific criteria. Additional study goals may include: i) information and education, ii) supplementary network monitoring, and iii) personal exposure monitoring (Williams et al., 2014b) (Table S1). For researchers and communities, the priority should be to maximize data quality for a predefined end use while considering cost, maintenance, and other constraints.
1.2. Study goals
As part of the Center for Research on Environmental and Social Stressors in Housing across the Life Course (CRESSH), our study team developed the Environmental Multi-pollutant Monitoring Assembly – EMMA – for an in-home study investigating air pollution exposure disparities across various housing types in two environmental justice communities of Chelsea and Dorchester, MA, USA. Measurements are being used to identify key determinants of indoor pollution exposures (e.g., activity patterns, source usage, ambient outdoor concentrations) within small spatial scales (e.g., apartments) to ultimately inform targets for mitigation. Thus, our sensor selection process necessitated fine-scale continuous measurements with high sensitivity and reasonable accuracy, at a sufficiently low cost to allow for numerous homes to be simultaneously monitored. EMMA was developed to collect these measurements and support the aforementioned study goals. Here we describe the process of selecting, validating, calibrating, and maintaining our sensor platform over the course of our study to date and provide guidance for others who seek to develop and deploy such platforms.
2.0. Materials and Methods
2.1.1. Package requirements
Design considerations for EMMA included: i) a small physical footprint that allows for inconspicuous placement in a main living space; ii) a low operating noise profile; iii) limited electricity consumption; iv) an aesthetically pleasing design; v) security for internal components; and vi) ease of data upload, retrieval, and adjustments. We developed the workflow diagram depicted in the graphical abstract in order to select sensors to meet our study goals, taking into consideration our objective to determine exposure drivers as well as the scientific research community’s movement away from integrated towards real-time air pollution measurements. Our sensor selection involved other key considerations: i) review of current literature up to June 2016; ii) guidance from colleagues; iii) study goals; iv) performance parameters; v) budgetary constraints; vi) size; and vii) power requirements.
The final sensors chosen as part of EMMA are as follows, with details of pollutant measured, temporal resolution, and units (Figure 1):
Carbon monoxide (CO): Alphasense COB4 (1-minute, ppm)
Carbon dioxide (CO2): Netatmo weather station (5-minute, ppm)
Nitric oxide (NO): Alphasense NOB4 (1-minute, ppb)
Nitrogen dioxide (NO2): Alphasense NO2B43F (1-minute, ppb)
Particulate matter <2.5μm (PM2.5): Alphasense OPC-N2 particle monitor (1.4-second, μg/m3), Harvard miniPEM (integrated, μg/m3)
Temperature: Netatmo weather station (5-minute, °C) and Onset Air/Water/Soil Temperature sensor (1-minute,°C)
Relative humidity (RH): Netatmo weather station (5-minute, %)
Noise: Netatmo weather station (5-minute, decibels [dB]).
Figure 1.
Internal (left) and external (right) components of EMMA. Note: the MicroPEM™ was removed following validation and the MicroPEM™ hole was capped.
Details surrounding pollutants of interest and chosen sensors are discussed in the following sections.
2.1.2. Pollutants of interest
Given our interest in the indoor environment, health implications of air pollution exposures, and real-time sensor availability, our primary sensors of interest measure PM2.5, NO, NO2, CO, and CO2. Extensive literature supports the relationship of PM2.5 exposure with adverse health effects (Shah et al., 2013, Cohen et al., 2017, Dominici, 2006, Ito et al., 2011). Notable indoor sources of PM2.5 include smoking, cooking, and candle burning (Evans et al., 2008, Lanki et al., 2007, Logue et al., 2014, Russo et al., 2014, Spilak et al., 2014). NO, NO2, and CO emissions have been associated with gas stove usage and other indoor combustion sources and adverse health outcomes (Kornartit et al., 2010, Faustini et al., 2014, Logue et al., 2014, Townsend and Maynard, 2002, Raub et al., 2000, Brauer et al., 1990, Franklin et al., 2006, Goldstein et al., 1986, Dennekamp et al., 2001, Bernstein et al., 2008). CO2 is a valuable indicator of ventilation and has been directly associated with cognitive performance in poorly ventilated indoor spaces (Allen et al., 2016). We were also interested in characterizing noise, temperature, and RH to understand thermal comfort and predictors of indoor pollutant concentrations. This manuscript focuses predominantly on results from PM2.5, NO, and NO2 sensors due to the more resource-intensive processes encountered in their selection, maintenance, and testing, though information for CO and methods for CO2 sensors can be found in the Supplemental Information (SI) document (Tables S2&S3).
2.2. Real-time PM sensor selection process
For PM2.5 sensor consideration, size and cost were contributing factors, along with sensor performance at the relatively low concentrations expected in our study domain. Boston ambient PM2.5 concentrations in the year prior ranged from <1 to approximately 23μg/m3 (MassDEP, 2015), with indoor concentrations >100μg/m3 anticipated in the presence of secondhand smoke (He et al., 2004, Tunno et al., 2015, Dacunto et al., 2013). The Dylos DC1100 particle counter, an optical device priced at approximately USD$300, was excluded due to size – at 0.19m(H)x0.13m(W)x0.09m(D), it was too large to encase in a small- to medium-sized sampling box. In addition, the location of its particle inlet on the bottom rear did not align with our sampling box configuration, and the Dylos lacks an internal algorithm for converting particle counts to PM2.5 concentration, which we did not have the resources (i.e., a calibrated reference instrument) to generate. The Thermo pDR-1500 was priced too high at USD$5,000+ and has been found to exhibit systematic and proportional biases when compared with colocated personal modular impactors (Wang et al., 2016). The Speck, an optical particle sensor from Carnegie Mellon University priced at approximately USD$150, contains an inlet on the bottom that prevented easy enclosure in a sampling box, and it correlated poorly at the 5-minute level with the Grimm Model EDM180 PM2.5 reference monitor (R2 values ranging from <0.01 to <0.1) (Williams et al., 2015). Additional sensors were ruled out as candidates for incorporation into EMMA, often because of cost or physical characteristics (see Table S4).
We selected the OPC-N2, as it has been shown to correlate well with GRIMM and TSI PM reference instruments (Alphasense, 2017, Sousan et al., 2016, Mukherjee et al., 2017). The South Coast Air Quality Management District (SCAQMD) completed raw data comparisons with two federal reference methods (FRM): OPC-N2 (n=3) versus Grimm (R2>0.66) (5-minute average) and OPC-N2 (n=3) versus BAM (R2>0.76) (24-hour average) (SCAQMD, 2015a). Table S5 shows selected specifications. The manufacturer provides an algorithm to convert particle counts to PM2.5 concentration, which we could normalize against a gold-standard filter-collected PM2.5 measurement. Dimensions of the OPC-N2 at 0.08m(L)x0.06m(W)x0.06m(H) make it suitable for placement in a small box, which we determined should be no larger than 0.25m(L)x0.27m(W)x0.11m(H) so that it could likely be incorporated into residences’ main living spaces.
2.2.1. Alphasense OPC-N2 particle monitor evaluation
Although the OPC-N2 met our basic criteria, lab- and/or field-based testing, calibration, or normalization is necessary prior to field use; normalization in particular helps reduce concern about bias in the real-time data. To normalize OPC-N2 data, EMMA included a pump-and-filter system consisting of a Harvard mini personal environmental monitor (miniPEM) and a Schwarzer rotary vane pump with skew wound motor. The miniPEM consists of a greased inlet-impactor plate to remove coarse particles and minimize particle bounce, and is fitted with a 37mm Teflon filter (Pall Life Sciences), supportive drain disc, and cellulose support pad to collect PM2.5 (Demokritou et al., 2002). The set flow rate was 0.8 liters per minute (LPM), recorded using a microelectromechanical mass flow sensor (Omron Corporation model D6FP0010A1, Kyoto, Japan) connected via tubing between the air intake and filter. MiniPEMs have been shown to agree with SKC personal environmental monitors (Suh and Zanobetti, 2010), so we paired each OPC-N2 with a miniPEM and used the following equation to correct real-time data, similar to others (Chong et al., 2015):
Corrected OPC-N2 PM2.5(t) = Raw OPC-N2 PM2.5(t) × (miniPEM filter PM2.5 / mean raw OPC-N2 PM2.5) [Equation 1]
Pre- and post-field sampling, filters used in miniPEMs were equilibrated in a temperature- and RH-controlled weighing environment (20–23°C, 30–40%RH) for 48 hours, then weighed using a microbalance (Mettler Toledo model MT5). For every weighing session, 100mg (0.0001kg) and 200mg (0.0002kg) standard weights were used per calibration procedures, and a test blank was weighed at the beginning and end of each session for quality assurance and quality control (QA/QC). PM2.5 concentrations were calculated using filter mass gain, sampling time, and volume of air sampled.
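The filter normalization above can be sketched as follows, assuming the correction is a single multiplicative ratio that anchors the OPC-N2 mean to the gravimetric filter concentration; all function and variable names are illustrative, not from the study's code.

```python
# Sketch of the miniPEM filter normalization (Equation 1).
# Assumptions: the correction is one multiplicative ratio; names illustrative.

def filter_concentration_ugm3(mass_gain_mg, flow_lpm, minutes):
    """Gravimetric PM2.5 concentration from filter mass gain.

    mass_gain_mg : post- minus pre-weighing filter mass (mg)
    flow_lpm     : sampled flow rate (liters per minute), e.g. 0.8 LPM
    minutes      : sampling duration in minutes
    """
    volume_m3 = flow_lpm * minutes / 1000.0   # liters -> cubic meters
    return (mass_gain_mg * 1000.0) / volume_m3  # mg -> micrograms per m3

def normalize_opc(raw_opc_ugm3, filter_ugm3):
    """Scale raw real-time OPC-N2 readings so their mean matches the
    colocated gravimetric filter concentration."""
    mean_raw = sum(raw_opc_ugm3) / len(raw_opc_ugm3)
    factor = filter_ugm3 / mean_raw
    return [x * factor for x in raw_opc_ugm3]

# Example: one week of sampling at 0.8 LPM with a hypothetical 0.1 mg mass gain
week_minutes = 7 * 24 * 60
c_filter = filter_concentration_ugm3(0.1, 0.8, week_minutes)
corrected = normalize_opc([5.0, 10.0, 15.0], c_filter)
```

After normalization, the mean of the corrected real-time series equals the filter-based concentration by construction.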
2.2.2. RTI MicroPEM™ evaluation
As no FRM monitors were available to compare with OPC-N2s in the field, we utilized the RTI MicroPEM™ (specifications shown in SI). Preliminary results from SCAQMD’s raw data comparisons of the MicroPEM™ with two FRMs were promising: MicroPEM™ (n=3) versus Grimm showed R2>0.81 (5-minute average) at concentrations ranging from 1–100μg/m3; MicroPEM™ (n=2) versus BAM (24-hour average) revealed R2>0.90 (SCAQMD, 2015b). Due to research indicating unstable MicroPEM™ measurements at concentrations <3μg/m3 and known limitations described by RTI, OPC-N2 values matched with MicroPEM™ measurements <3μg/m3 were removed from field data comparisons (n=5,662 of 99,884 total 1-minute data points removed – 5.6% of data – prior to 5-minute averaging) (Sloan et al., 2016).
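The screening step above amounts to dropping matched 1-minute points below the MicroPEM™ stability floor before averaging; a minimal sketch with illustrative names:

```python
# Sketch of the <3 ug/m3 screening step: drop OPC-N2/MicroPEM matched pairs
# where the MicroPEM reading is below its stability floor, prior to
# 5-minute averaging. Names are illustrative, not from the study's code.

def screen_pairs(pairs, floor=3.0):
    """pairs: list of (opc_value, micropem_value) 1-minute matches.
    Returns only the pairs where the MicroPEM value is at or above the
    stability floor (ug/m3)."""
    return [(opc, mp) for opc, mp in pairs if mp >= floor]

kept = screen_pairs([(4.2, 3.5), (2.0, 1.1), (9.8, 8.7)])
```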
2.2.3. PM lab testing
Two PM sensor testing campaigns were conducted in open-air, ambient lab conditions of approximately 23.9°C(75°F) and 30%RH, well within the operating range for all devices. With no calibrated reference instruments available for comparison, the TSI SidePak™ AM510 Personal Aerosol Monitor was utilized to compare trends in PM concentration, as it has been widely used for determination of both indoor and outdoor PM concentrations (Klepeis et al., 2007, Hyland et al., 2008, Richmond-Bryant et al., 2009, Park et al., 2009, Laumbach et al., 2009, Jiang, 2011). In March 2016, the SidePak™ and a test OPC-N2 were colocated for one week. SidePak™ data were collected at a 1-minute interval at 1.7LPM and OPC-N2 data at a 1.4-second interval at 1.2LPM; 1-hour averages were compared. Notably, the SidePak™ was calibrated against a gravimetric reference using the respirable fraction of standard ISO 12103–1 (Arizona Test Dust); its optical measurements are therefore tied to this specific aerosol and assume the same density, size distribution, and refractive index as the test dust (Jiang, 2011). The only known external PM source, a freshly-popped bag of popcorn, was introduced to the colocated sensors at intervals to observe responses, as such acute events may be encountered in indoor environments. Hereafter, this test is referred to as the “PM source test”. The 1-week average for the SidePak™ was used to generate a correction factor for the OPC-N2:
OPC-N2 correction factor = (1-week mean SidePak™ PM2.5) / (1-week mean OPC-N2 PM2.5) [Equation 2]
One-hour averaged data were then compared to examine root-mean-square error (RMSE) and percent error between the SidePak™ and OPC-N2.
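The comparison metrics used here (correction factor, RMSE, mean percent error) can be sketched as follows; function names are illustrative, not taken from the study's code.

```python
import math

# Sketch of the SidePak vs. OPC-N2 comparison metrics. Names illustrative.

def correction_factor(ref_avg, sensor_avg):
    """Equation 2: 1-week reference mean over 1-week sensor mean."""
    return ref_avg / sensor_avg

def rmse(ref, sensor):
    """Root-mean-square error between paired hourly averages."""
    return math.sqrt(sum((r - s) ** 2 for r, s in zip(ref, sensor)) / len(ref))

def mean_percent_error(ref, sensor):
    """Mean absolute percent error relative to the reference series."""
    return 100.0 * sum(abs(s - r) / r for r, s in zip(ref, sensor)) / len(ref)

# Using the 1-week lab averages reported in the text
cf = correction_factor(12.45, 7.36)
corrected = [x * cf for x in [6.0, 8.0, 7.0]]  # hypothetical raw hourly values
```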
A test OPC-N2 was colocated with a MicroPEM™ later in March 2016 for the purpose of comparing trends in PM concentration of raw data (i.e., no correction factor applied) in the same ambient lab conditions of approximately 23.9°C(75°F) and 30%RH. Sample flow rate for the OPC-N2 was 1.2LPM and 0.5LPM for MicroPEM™. The PM source test was employed here.
After receiving the OPC-N2s needed for field data collection (n=16), all were colocated in the lab and their responses to the PM source test observed. No calibrated reference methods were available for comparison during lab testing, and no chamber testing with different particle types or compositions was completed due to limited resources. See Table S6 for a complete summary of tested PM sensors.
2.2.4. PM field testing
Multiple 1-week sessions of colocated sampling (n=9) were completed in indoor environments of different types of residential buildings. As part of EMMA, raw OPC-N2 data were collected every 1.4 seconds at 1.2LPM, while raw MicroPEM™ data were collected every 10 seconds at 0.5LPM. Both were paired with respective filters for PM2.5 mass collection. Temperature and RH data were also collected (temperature at 1-minute and 5-minute intervals, RH at 5-minute intervals). Each matched dataset was compared if it included 4+ days of data (approximately n=1,200 data points), meeting our sampling criteria (i.e., an acceptable sample defined as 50% of a 1-week sample). MicroPEM™ data normalized with its 25mm Teflon filter by RTI collaborators were compared to OPC-N2 data normalized with its 37mm miniPEM Teflon filter. Though data were collected on cooking, smoking, and cleaning behaviors, source-specific effects on PM2.5 are not discussed here.
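Preparing matched datasets as described – averaging timestamps into 5-minute bins and enforcing the 4+ day criterion – can be sketched as follows; function names and the epoch-based binning are illustrative assumptions.

```python
from collections import defaultdict

# Sketch of field-comparison preparation: 5-minute averaging plus the
# 4+ day (~1,200 five-minute points) acceptance criterion. Names illustrative.

def five_minute_averages(samples):
    """samples: list of (epoch_seconds, value) at 1-minute or finer cadence.
    Returns {bin_start_epoch: mean value} over 5-minute bins."""
    bins = defaultdict(list)
    for t, v in samples:
        bins[t - (t % 300)].append(v)  # 300 s = 5 minutes
    return {b: sum(vs) / len(vs) for b, vs in sorted(bins.items())}

def meets_criteria(n_points, min_points=1200):
    """Acceptable sample: >= ~4 days of 5-minute data (50% of one week)."""
    return n_points >= min_points
```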
2.3. Gas sensor selection
Our sampling domain includes residences with hypothesized indoor concentrations of NO and NO2 ranging from 5–200ppb, accounting for contribution of outdoor air as well as spikes greater than 100ppb in cases where gas stoves are used in under-ventilated conditions (EPA, 2017b). For the year prior in our domain (2015), outdoor NO2 ranged from <5–44ppb (MassDEP, 2015). We hypothesized CO concentrations ranging from 0.1–2.5ppm, though homes without adequate ventilation can reach 0.5–5ppm and those with poorly-functioning natural gas stoves can reach levels ≤30ppm (EPA, 2017a). Due to health effects associated with exposure to these pollutants across a range of concentrations and the desire for source characterization for all home types, it is imperative to quantify low concentrations (Kampa and Castanas, 2008).
Gas sensor selection was based upon consideration of the nature and sources (or lack thereof) of these pollutants in indoor environments. Findings from the U.S. EPA’s 2014 NO2 chamber experiments were considered, which determined that the CitiSense air quality monitoring device, which included an Alphasense A-series NO2 sensor, showed linearity over concentrations of 0–200+ppb NO2 compared to the Thermo Model 42C NO/NO2/NOx Analyzer FRM (R2>0.97) (Williams, 2014). Alphasense COB4, NOB4, and NO2B43F sensors have been incorporated into the U-Pod embedded-systems platform, with findings suggesting long-term stability suitable for ambient monitoring applications (Masson et al., 2015). Further, an update to the NO2B43F revealed minimal cross-sensitivity to NO or ozone (O3), a significant improvement in humidity transient performance, and better long-term sensitivity stability (Alphasense, 2016b). Metal oxide sensors, another low-cost sensor type, were ruled out because of instability and lower selectivity (Mijling et al., 2017). We therefore selected Alphasense COB4, NOB4, and NO2B43F sensors (specifications shown in Tables S2,S7-S8).
2.3.1. Alphasense electrochemical (EC) gas sensor set-up
Our final EMMA design contains COB4, NOB4, and NO2B43F sensors arranged in a horizontal line connected via steel pipe nipples with an inlet to the environment (Figure 1), drawing a low flow of air (0.3LPM) across sensor surfaces to increase sensitivity as compared to passive sampling (Masson et al., 2015). Each sensor was mounted to a manufacturer-provided individual sensor board (ISB) to guarantee a low-noise environment and optimal sensor resolution at low ppb concentrations (Mijling et al., 2017). Studies have found agreement between COB4, NOB4, and NO2B43F sensors and reference measurements, either in-lab or in-field with calibration and corrections applied (Mijling et al., 2017, Castell et al., 2017, Cross et al., 2017, Sun et al., 2016). In general, EC sensors have been shown to exhibit temperature and RH dependence at extreme values (e.g., temperature <10°C and >30°C; RH <15% and >85%) (Mijling et al., 2017). Some data identify a relationship between temperature and RH interference for sensors, but guidance on adjustments is not always available (Williams et al., 2014a, Williams, 2014). We performed in-lab temperature testing on select COB4, NOB4, and NO2B43F sensors to determine the need for adjustment at certain temperatures. For a summary of EC gas sensors tested, see Table S9.
2.3.2. EC gas sensor calibration process
In general, EC sensors require frequent lab calibrations to ensure accurate measurements and optimal data interpretation (Cross et al., 2017). Our calibration set-up is detailed in the SI. Individual sensor calibration accounts for intersensor variability, common within the lower-cost sensor market. In general, Alphasense EC sensors exhibit ±15% intersensor variability, which is accounted for via individual adjustment using calibration equations – an essential step in ensuring accurate data (Mijling et al., 2017). Intersensor variability was examined following the first calibration in June 2016; the average percent difference between COB4 sensors (n=16) was 5.28% (SD=4.02%), between NOB4 sensors (n=16) was 7.17% (SD=4.90%), and between NO2B43F sensors (n=16) was 8.59% (SD=6.30%). Tables S10-S12 provide more detail. A connection limit of four sensors per calibration (n=2 groups of two sensors) was determined in order to ensure proper air flow across sensor surfaces. One-second data were collected via HOBO data loggers (Onset Computer Corporation, Bourne, MA) to compare response time to manufacturer-reported data (Tables S2,S7&S8), where t90 is defined as the seconds required for the sensor to reach 90% of measured concentration (Alphasense, 2015a, Alphasense, 2016a, Alphasense, 2015b). While we did not assess individual sensor response time, we used manufacturer-reported response time to design the temporal aspect of our calibration protocol (i.e., to ensure sensors reach manufacturer-reported response time before taking the next step). We allowed three minutes between adjustments and confirmed response time was reached by reviewing the 3-minute plateau during each calibration step. In general, sensitivities of EC gas sensors drift at 0.5–2.0% per month, so frequent calibration is necessary given the sensor aging process (AAN105–03, 2009). The calibration process for each group of sensors, from set-up to generation of calibration equations, spans approximately two hours.
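An intersensor-variability figure of this kind can be computed with a simple pairwise comparison; a minimal sketch, in which the function name and pairing scheme are illustrative since the study's exact formula is not given:

```python
from itertools import combinations

# Sketch of an intersensor-variability check: mean absolute percent
# difference between every pair of sensors' responses to the same test gas.
# Illustrative only; the study's exact computation is not specified.

def mean_pairwise_percent_diff(responses):
    """responses: one calibration-response value per sensor (same gas,
    same concentration). Percent difference uses the pair mean as the base."""
    diffs = [abs(a - b) / ((a + b) / 2) * 100.0
             for a, b in combinations(responses, 2)]
    return sum(diffs) / len(diffs)
```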
Generated in ambient lab conditions of 23.9°C(75°F) and 30%RH, but with 0%RH air introduced via the tubing, our resulting calibration equation converting raw output (Volts) into ppb concentration [C] is:
C [ppb] = ((WE − WE0) − (AE − AE0)) / S [Equation 3]
where WE = working electrode output (V), AE = auxiliary electrode output (V), the subscript 0 denotes each electrode’s zero-air baseline, and S = sensor sensitivity (V/ppb) determined during calibration.
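Applying a calibration equation of this kind in software is straightforward; the sketch below assumes the standard Alphasense two-electrode form (baseline-corrected WE minus baseline-corrected AE, divided by a sensitivity determined during calibration). The function name, arguments, and exact functional form are assumptions, not taken verbatim from the study.

```python
# Sketch of converting raw electrode voltages to a ppb concentration,
# assuming the standard Alphasense two-electrode form. All names and the
# exact form are assumptions, not the study's verbatim equation.

def volts_to_ppb(we_v, ae_v, we_zero_v, ae_zero_v, sensitivity_v_per_ppb):
    """Concentration from working-electrode (WE) and auxiliary-electrode
    (AE) outputs, with each electrode's zero-air baseline subtracted."""
    return ((we_v - we_zero_v) - (ae_v - ae_zero_v)) / sensitivity_v_per_ppb

# Hypothetical readings: WE rises 20 mV above baseline, AE unchanged,
# sensitivity 0.5 mV/ppb
c = volts_to_ppb(0.30, 0.22, 0.28, 0.22, 0.0005)
```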
Calibration temperature was close to the temperature for which manufacturer-reported sensitivity temperature dependence is absent (approximately 20°C) (Alphasense, 2015a, Alphasense, 2016a, Alphasense, 2015b). This, in addition to the outcome of lab-based temperature testing, resulted in non-application of temperature adjustments to calibration data.
2.3.3. EC gas sensor field testing
In July 2016, two sets each of COB4, NOB4, and NO2B43F sensors (sensor sets “G” and “J”) were field tested in a manner resembling planned field data collection. Field testing was consistent with planned data collection in participants’ homes, in that the EC gas sensors: i) were housed in EMMA (n=2 assemblies, each with one COB4, NOB4, and NO2B43F sensor); ii) operated with the same pump system; iii) were placed side-by-side on a table in the main living space; and iv) were programmed to sample at a 1-minute interval. This residence was a studio apartment with two non-smoking occupants, a natural gas stove, and no central air conditioning. No calibrated reference instruments were available for colocation, so colocated data from the EC gas sensors were compared with each other.
2.3.4. EC gas sensor heat testing
Due to known effects of extreme temperatures on EC gas sensor performance, heat tests were performed on a select group of COB4, NOB4, and NO2B43F sensors (n=6 each, n=18 total) to account for changes in WE and AE. This was conducted following a short period of field testing to determine the degree of extreme temperatures to which sensors could be exposed, allowing development of a temperature correction for various thermal environments. We experimented with the following COB4, NOB4, and NO2B43F sensors:
sensors regularly used from start of field data collection but not exposed to extreme temperatures (n=2),
sensors not used in field data collection or subjected to extreme temperatures but showed linear calibrations over time (n=2), and
sensors continuously used in extreme temperatures (n=2).
A heat testing chamber was constructed using a National Electrical Manufacturer Association (NEMA) electrical box which included a power strip, eight sets of two sensors connected via steel pipe nipples (n=16 total) fitted with a temperature probe, and tubing to connect sensors to calibration gas (4ppm NO, 5ppm NO2, 5ppm CO) and zero air. Areas of air intake on the box were sealed. A 100-Watt incandescent bulb was placed inside the NEMA box to gradually increase temperature to 48.9°C(120°F) – the hottest temperature to which any sensor was exposed during field testing. Gas flow was regulated using TSI flow meters (one for calibration gas, one for zero air); zero air flow was constant at approximately 2LPM. A known concentration of gas (80ppb NO, 80ppb NO2, 1.6ppm CO) was released into the testing chamber for approximately one minute at the following temperatures: 23.9°C(75°F), 26.7°C(80°F), 32.2°C(90°F), 37.8°C(100°F), 43.3°C(110°F), and 48.9°C(120°F). COB4, NOB4, and NO2B43F sensors were tested separately; no co-pollutant mixtures were introduced. At 48.9°C(120°F), the bulb was removed and decreasing temperature monitored. When the temperature decreased to 37.8°C(100°F), 32.2°C(90°F), 26.7°C(80°F), and 23.9°C(75°F), gas-specific sensors were exposed to approximately 40ppb of NO and NO2 and 0.8ppm CO via tubing and response measured. The line of decreasing temperature was used for data adjustment, due to possible additional signal contribution or interference from the bulb. One-second data were averaged to the minute to match sampling frequency of field data. Following each calibration session, sensors were passively colocated in lab conditions for 7+ days to check wiring and voltage readings.
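A temperature adjustment derived from the cooling-curve responses could be applied by interpolating between the tested temperatures; a minimal sketch, in which the curve values are placeholders rather than measured results from the study:

```python
# Sketch of a temperature adjustment built from cooling-curve responses:
# linear interpolation of the sensor's response ratio between tested
# temperatures. The curve values below are placeholders, NOT measured data.

def interpolate_correction(temp_c, curve):
    """curve: list of (temperature_C, response_ratio) sorted by temperature,
    from chamber tests on the decreasing-temperature line. Returns the
    ratio to divide field readings by at temp_c (clamped at the ends)."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, r0), (t1, r1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return r0 + frac * (r1 - r0)

# Placeholder response ratios at some of the tested chamber temperatures
curve = [(23.9, 1.00), (26.7, 1.02), (32.2, 1.06), (37.8, 1.12)]
adjusted = 45.0 / interpolate_correction(30.0, curve)  # hypothetical reading
```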
2.3.5. EC gas sensor cold testing
Cold temperature testing was conducted on the same gas sensors used in the aforementioned heat testing in order to account for changes in the WE and AE, and to determine a correction for cold temperatures. Sensors were connected in a manner consistent with heat testing, fitted with a temperature probe, and HOBO data loggers were programmed to log at a 1-second interval. All sensors were placed in an open box in a freezer (VWR Scientific Model U2016FA14) to sample passively, as resources did not allow for a zero air system or calibration gas in such conditions. Sensors were chilled to −23.3°C(−10°F) to replicate the coldest temperature exposure during field testing. One-second data were averaged to the 1-minute level to match the sampling frequency of field data.
2.3.6. EC gas sensor interference
Addressing interference due to various extrinsic sources (e.g., RH, transient radiofrequency signals, co-pollutants) emerged as an additional step in data processing. The built-in AE for B-series EC gas sensors provides a degree of real-time correction for extrinsic interferences at the WE exposed to the ambient environment, but research has shown that the AE lags that of the WE under certain conditions (Cross et al., 2017), resulting in potential need for additional adjustment.
We identified two kinds of interference using a rule-based method. First, as the AE is theoretically blind to target gas species, significant changes in the AE should be accompanied by corresponding changes in the WE. Therefore, when the AE changed significantly (change >10ppb) without corresponding changes in the WE, those data points were characterized as having interference. Second, most negative concentrations were caused by divergence of the AE and WE (i.e., AE increases while WE decreases) as temperature increased (Cross et al., 2017). Remaining negative data points were marked as interferences, attributed to environmental sources other than temperature that are more challenging to account for (e.g., radiofrequency, co-pollutant mixtures). No humidity adjustment was applied, as we relied on research showing the AE useful in accounting for RH by having response characteristics similar to those of the WE at levels ≤75% (Masson et al., 2015). However, Cross et al. (2017) showed improvement in correlation with reference measurements, from raw sensor output (V) to calibrated sensor data, by incorporating environmental variables including RH to model the relationship between sensor output and reference monitor concentration. In their study, R2 values for NOB4, NO2B43F, and COB4 sensors improved from 0.21, 0.18, and 0.78 to 0.94, 0.94, and 0.96, respectively, using a high dimensional model representation (Cross et al., 2017). Results from our field data collection revealed a moderate mean RH of 51.53% (SD=13.62%), but NO and NO2 measurements at higher RHs may have been optimized by applying an RH adjustment, as it is an extrinsic variable for which we did not account.
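The two rules above can be expressed as a simple screening pass; a minimal sketch in which the thresholds restate the text and the function name and signal units are illustrative:

```python
# Sketch of the rule-based interference screen: flag points where the AE
# jumps (> 10 ppb-equivalent) without a matching WE change, and flag any
# remaining negative concentrations. Names and units are illustrative.

def flag_interference(we_ppb, ae_ppb, conc_ppb, threshold=10.0):
    """Return one boolean flag per 1-minute data point.

    we_ppb, ae_ppb : electrode signals expressed in ppb-equivalent units
    conc_ppb       : computed concentrations after AE correction
    """
    flags = [False] * len(conc_ppb)
    for i in range(1, len(conc_ppb)):
        d_ae = abs(ae_ppb[i] - ae_ppb[i - 1])
        d_we = abs(we_ppb[i] - we_ppb[i - 1])
        # Rule 1: AE changes significantly without a corresponding WE change
        if d_ae > threshold and d_we <= threshold:
            flags[i] = True
    for i, c in enumerate(conc_ppb):
        # Rule 2: remaining negative concentrations are marked as interference
        if c < 0:
            flags[i] = True
    return flags
```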
3.0. Results
3.1. PM lab testing
Using one lab-colocated SidePak™ and OPC-N2, one-week averages were calculated for each sensor: 12.45μg/m3 for the SidePak™ and 7.36μg/m3 for the OPC-N2. While showing moderate correlation (R2=0.47), mean percent error between the SidePak™ and OPC-N2, with correction factor applied, was 27.02% (SD=27.91%) with RMSE=2.94μg/m3 (Figure S1). While the SidePak™ is not a reference instrument, we accepted these values for the purposes of our study goals, which include informing study participants both of their 24-hour filter-normalized PM2.5 concentrations and of exposure reduction strategies based on trends in the data.
Likewise, the lab-colocated PM sensors selected for incorporation into EMMA – the OPC-N2 and MicroPEM™ – showed an acceptable RMSE value for raw data (i.e., data without filter normalization; RMSE=0.52μg/m3) (Figure S2). Filter-normalizing field data would improve agreement between these measures, as filter measurements serve as the gold standard. Comparisons of peak event normalization were not investigated here.
Following examination of the test OPC-N2, lab colocation of the OPC-N2s intended for indoor sampling as part of EMMA (n=16) was performed to ensure sensors tracked with one another at ambient concentrations as well as in response to PM events (Figure S3). Peaks in response to the PM source test were examined separately, exhibiting variation in response across PM events (Figure S4). This experiment identified one OPC-N2 as inadequate based on an inconsistent response pattern to source introduction (data not shown); inadequacy was defined as an OPC-N2 failing to track with the others at ambient concentrations and during PM source introduction. OPC-N2s demonstrating inconsistent response behavior compared to like colocated sensors were removed from incorporation into EMMA, representing only 6% of our initial OPC-N2 inventory.
3.2. PM field testing
In-home field colocation of MicroPEM™s and miniPEM-corrected OPC-N2s was completed to confirm agreement. OPC-N2 and MicroPEM™ field data were compared using 5-minute averages in order to match that of subsequent data analysis. OPC-N2 and MicroPEM™ field testing showed high correlations (Figure S5, Table S13), with median R2=0.82 and median RMSE=3.52μg/m3, consistent with other testing (SCAQMD, 2015b, SCAQMD, 2015a). OPC-N2 accuracy was 68%, calculated using the integrated miniPEM concentration and the real-time OPC-N2 PM2.5 average concentration for the duration of each field colocation. MicroPEM™ accuracy was 67%, calculated using the integrated miniPEM concentration and the real-time MicroPEM™ PM2.5 average concentration for the duration of each field colocation.
To identify a source of potential variability in sensor measurements, integrated filter PM2.5 mass concentrations from each miniPEM and MicroPEM™ were first compared for each colocated sample, yielding R2=0.93 and RMSE=2.29μg/m3. Then, normalization of real-time data was completed using each filter separately – the miniPEM 37mm Teflon filter and the MicroPEM™ 25mm Teflon filter – to normalize data from one sensor, the MicroPEM™. Data for comparison included colocated sensors meeting the sampling criterion of 4+ days of data. Raw negative MicroPEM™ measurements were removed to avoid further data manipulation, and raw MicroPEM™ data were averaged to the 5-minute level for consistency across all comparisons made in this manuscript and ongoing data analysis. R2 and RMSE values for these device comparisons (n=9) indicated good agreement between normalization methods (R2=0.90, RMSE=5.26μg/m3). Therefore, the filter normalization method was not considered a source of variability between measurements, despite the difference in effective filter area. An RH correction factor was applied to MicroPEM™ data via a software program (RTI, 2012) but not to OPC-N2 data, potentially decreasing agreement at higher RHs. A general limitation exists when comparing optical methods to gravimetric filter measurements, as optical sensors may be affected by changes in particle mass, size distribution, or composition that alter light scattering, and by RH>90% (Crilley et al., 2017). It is important to note that despite normalization of real-time data collected via optical methods with colocated filter measurements, real-time PM2.5 concentrations remain somewhat relative to actual PM2.5 concentrations, as each optical sensor may have encountered different particle characteristics and composition as well as factors affecting light-scattering ability (e.g., RH>90%).
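Filter normalization of this kind reduces to rescaling the real-time optical series so its time average matches the colocated gravimetric filter result. A minimal sketch, with hypothetical example values (the function name and data are illustrative, not from the study):

```python
import statistics

def filter_normalize(realtime_pm25, filter_conc):
    """Scale real-time optical PM2.5 readings (ug/m3) so their mean
    matches the colocated gravimetric filter concentration (ug/m3)."""
    optical_mean = statistics.mean(realtime_pm25)
    if optical_mean <= 0:
        raise ValueError("non-positive optical mean; cannot normalize")
    factor = filter_conc / optical_mean
    return [x * factor for x in realtime_pm25]

# Hypothetical 5-minute averages (truncated) plus a filter result:
series = [8.0, 12.0, 10.0, 30.0, 6.0]
normalized = filter_normalize(series, filter_conc=9.9)
# The normalized series now time-averages to the gravimetric value,
# while peaks and troughs keep their relative shape.
```

Note that the rescaling preserves relative temporal structure (e.g., cooking peaks), which is why peak identification is robust to this correction even when absolute accuracy is limited.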
Further, studies have found the integrated filter sampling method to involve losses of evaporative species and thus potentially underestimate particle mass concentration – relevant for n=7 of our colocations (Jiang, 2011).
3.3. EC gas sensor field testing
Data for Alphasense COB4, NOB4 & NO2B43F sensors “G” and “J” colocated for one week were collected at 1-minute intervals and averaged to 5 minutes. R2 and RMSE for the COB4, NOB4 & NO2B43F sensors indicate good agreement (see Figures S6–S8). A more detailed comparison of field colocated data is presented in Table 1: COB4 sensors “G” and “J” showed an RMSE of 0.01792ppm; NOB4 sensors “G” and “J”, 3.221ppb; and NO2B43F sensors “G” and “J”, 3.204ppb. Calculated errors for these sensors, though only measured between two sensors each and lacking a reference instrument field comparison, were deemed acceptable for study purposes. Agreement between sensor measurements showcases their ability to identify peak events, which may help inform behavior change or mitigation strategies planned as a follow-up to our data collection efforts.
Table 1.
EC gas sensor field colocation results shown as comparisons between each sensor as raw V (columns 2&3) then as concentration (ppm/ppb) (5-minute average) calculated using calibration equations and interference removal.
| COB4 sensor ID | R2 (V) | RMSE (V) | Mean (SD) (ppm) | Range (ppm) | R2 (ppm) | RMSE (ppm) |
|---|---|---|---|---|---|---|
| Sensor “G” | 0.99 | 0.01992 | 0.5763 (0.4402) | 0–1.920 | 0.99 | 0.01792 |
| Sensor “J” | | | 0.5931 (0.4165) | 0–1.866 | | |

| NOB4 sensor ID | R2 (V) | RMSE (V) | Mean (SD) (ppb) | Range (ppb) | R2 (ppb) | RMSE (ppb) |
|---|---|---|---|---|---|---|
| Sensor “G” | 0.99 | 0.001890 | 12.67 (19.16) | 0–163.2 | 0.97 | 3.221 |
| Sensor “J” | | | 24.96 (26.21) | 0–217.7 | | |

| NO2B43F sensor ID | R2 (V) | RMSE (V) | Mean (SD) (ppb) | Range (ppb) | R2 (ppb) | RMSE (ppb) |
|---|---|---|---|---|---|---|
| Sensor “G” | 0.98 | 0.0008800 | 19.88 (11.85) | 0–103.4 | 0.93 | 3.204 |
| Sensor “J” | | | 18.58 (12.33) | 0–102.3 | | |
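The paired-sensor statistics in Table 1 (R2 and RMSE) can be computed directly from two time-aligned colocated series. A minimal sketch, assuming equal-length 5-minute averages (the example readings are hypothetical):

```python
import math

def r2_rmse(x, y):
    """Pearson r-squared and root-mean-square error between two
    equal-length, time-aligned colocated sensor series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    r2 = sxy * sxy / (sxx * syy)  # assumes neither series is constant
    rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / n)
    return r2, rmse

# Hypothetical 5-minute NO2 averages (ppb) from two colocated sensors:
sensor_g = [18.0, 22.5, 19.0, 40.2, 21.1]
sensor_j = [17.2, 23.0, 20.1, 38.9, 20.5]
r2, rmse = r2_rmse(sensor_g, sensor_j)
```

Computing both statistics on raw voltage and on calibrated concentrations, as in Table 1, separates electronics agreement from calibration-equation agreement.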
3.4. EC gas sensor heat and cold testing: COB4, NOB4 & NO2B43F
NOB4s and NO2B43Fs were sensitive to temperature changes, but in opposite directions: NO2B43F voltage readings decreased while NOB4 voltage readings increased at higher temperatures. For COB4s, no temperature adjustment was needed, as determined from lab-based testing – when temperatures increased from −18°C (0°F) to 21°C (70°F), the measured CO signal changed by only 0.100V (or 0.26ppm); this lesser sensitivity to temperature change is consistent with previously reported results for COB4 sensors (Cross et al., 2017). Figure S9 shows the magnitude of heat stress on an NO2B43F sensor compared to its response following introduction of both 80ppb and 40ppb NO2. NOB4 sensors tracked with temperature change (Figure S10 shows NOB4 sensor “P”). Temperature effects were seen on the raw signal (i.e., WE minus AE). Readings at introduction of 80ppb and 40ppb NO2 are presented alongside temperature to highlight the extent of temperature effects. NO2B43F response to introduced NO2 produced consistent voltage readings, indicating appropriate sensor response, but without temperature correction the response is lost in the effect of extreme temperatures.
Intercepts from both temperature tests were modeled to quantify responses of raw signal to the corresponding temperature for each sensor type. Results showed that bias caused by heat stress [>48.9°C (>120°F)] can exceed 100ppb, while bias caused by cold stress was smaller (<25ppb). Both linear and quadratic regressions were applied to examine the relationship. We applied a step function in both NOB4 and NO2B43F temperature corrections, since measured voltage differs under heat versus cold: heat corrections were applied to sensors exposed to >23.8°C (75.0°F), and cold corrections to sensors exposed to <15.6°C (60.0°F). No significant (>10ppb) temperature-induced bias was observed for measurements between 15.6°C (60.0°F) and 23.8°C (75.0°F), considered the ambient conditions for calibrations. To quantify temperature-induced bias, both quadratic and linear models were trained using lab-generated temperature testing data. For heat testing data, the quadratic model (R2>0.99, Table S14) outperformed the linear model, demonstrating quadratic growth of heat-induced bias with increasing temperature and highlighting the importance of measuring sensor surface temperature. For cold testing data, the linear model had better goodness-of-fit (R2=0.88±0.060, Table S15).
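The step-function correction described above can be sketched as a piecewise intercept adjustment: quadratic in temperature above 23.8°C, linear below 15.6°C, and none in between. The coefficients below are illustrative placeholders, not the study's fitted values:

```python
def temperature_corrected(ppb, temp_c,
                          heat=(0.12, 0.018),  # placeholder a1 (ppb/C), a2 (ppb/C^2)
                          cold_slope=1.5):     # placeholder slope (ppb/C)
    """Piecewise (step-function) temperature-bias correction as
    described in the text. All coefficients and the sign of the cold
    bias are illustrative assumptions, not the study's fitted values."""
    if temp_c > 23.8:
        dt = temp_c - 23.8
        bias = heat[0] * dt + heat[1] * dt * dt  # quadratic heat bias
    elif temp_c < 15.6:
        dt = 15.6 - temp_c
        bias = -cold_slope * dt                  # linear cold bias
    else:
        bias = 0.0                               # calibration conditions
    return ppb - bias
```

Only the intercept is adjusted here, mirroring the finding that slope (sensitivity) was consistent across temperatures.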
Figure 2 shows the estimated curve of applied temperature corrections for NOB4 and NO2B43F sensors “P” and “Q” exposed to extreme temperatures. Taking manufacturer temperature corrections into consideration, lab-based controlled experiments indicated that intercepts for calibration-adjusted data for NOB4 and NO2B43F sensors exposed to temperatures ≥34.1°C(93.4°F) and 36.5°C(97.7°F) required adjustment. Cold testing indicated that NOB4 and NO2B43F measurements at temperatures ≤−3.3°C(26.0°F) and ≤1.0°C(33.8°F), respectively, required adjustment due to significant bias (difference>10ppb). No adjustment for slope was needed, as sensor sensitivity at different temperatures appeared consistent. For field data adjustments, temperature probe data were used when available and in their absence, Netatmo temperature data were used. Across 18 months of indoor data collection, no COB4, NOB4, or NO2B43F sensors were exposed to extreme temperatures that necessitated a correction.
Figure 2.
Estimated curve of applied temperature corrections for NOB4 and NO2B43F sensors “P” and “Q”, exposed to extreme temperatures. Red lines indicate 10ppb deviation (i.e., bias). Cold temperature adjustments begin around 15.6°C(60°F) and heat adjustments around 30.0°C(86°F).
3.5. NOB4 & NO2B43F sensor calibration and degradation
As mentioned, EC gas sensors generally degrade over time, requiring frequent calibration to adequately interpret data. To date, we have completed four calibrations, occurring every five months on average. Using NO2B43F sensor “L” as an example, sensor degradation appeared in two forms: (1) larger calibration-generated slopes over time, representing a decrease in sensitivity, and (2) an increased response signal (i.e., comparatively higher voltage readings) over time for exposure to the same gas concentrations, seen as a larger signal in later versus initial calibrations (Figure S11). This presents problems for interpretation of NO2 concentrations if not accounted for via calibration, as actual concentrations can increasingly exceed those inferred from an outdated calibration: 0.005V corresponded to 25ppb NO2 in June 2016 but 50ppb NO2 in October 2017.
NO2B43F sensor “L” sensitivity decreased approximately 42% over 18 months: for this sensor, approximately 0.03V reflected an NO2 concentration of 130ppb in June 2016 but 260ppb in October 2017. NOB4 sensor “L” showed a similar sensitivity decrease of 49%, but conversely, there is potential under-prediction of actual versus sensor-measured concentrations if not accounted for via frequent calibrations: 0.015V reflected 35ppb NO in June 2016 but 15ppb in October 2017 (Figure S12). Both are misrepresentations of actual concentrations, demonstrating the need for frequent calibration. In addition, noise around baseline measurements may increase over time (Figures S13&S14). When sensor strength degrades to the extent shown in Figure S11 for the NO2B43F (October 2017 calibration), we replace the sensor.
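The drift problem amounts to the same raw signal mapping to different concentrations under different calibrations. A sketch using sensitivities back-calculated from the NO2B43F numbers quoted above (0.005V read as 25ppb in June 2016 but 50ppb in October 2017); the linear form and variable names are assumptions for illustration:

```python
def volts_to_ppb(signal_v, sensitivity_v_per_ppb):
    """Convert corrected sensor signal (WE minus AE, in volts) to ppb
    using a calibration-derived sensitivity (assumed linear response)."""
    return signal_v / sensitivity_v_per_ppb

# Sensitivities implied by the NO2B43F "L" example in the text:
s_jun2016 = 0.005 / 25   # V per ppb at the June 2016 calibration
s_oct2017 = 0.005 / 50   # halved sensitivity by October 2017

volts_to_ppb(0.005, s_jun2016)  # about 25 ppb under the old calibration
volts_to_ppb(0.005, s_oct2017)  # about 50 ppb under the new calibration
```

Applying a stale June 2016 sensitivity to October 2017 data would thus report roughly half the true NO2 concentration, which is why pre- and post-sampling calibrations bracket each deployment.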
4.0. Conclusion & Discussion
We assembled a comparatively lower-cost (USD$<3,000 excluding consumables such as filters), portable, in-home air pollution sensor platform that allowed us to characterize key indoor pollutants with high sensitivity (i.e., across concentrations ranging from the limit of detection to high levels during acute events), reasonable accuracy (e.g., approximately 68% for weekly PM2.5), and at sufficiently reduced cost to allow widespread deployment. The product of sensor selection, lab testing, calibration, and assembly has been successfully collecting real-time measurements for 18+ months. Table S16 reviews the EMMA components and calibration materials discussed.
Compared to previous indoor platforms, EMMA presented a relatively quiet, nonintrusive model for in-home sampling that did not distract participants from daily activities and was easily incorporated into main living spaces (some participants reported noticing pump noise only when the room was completely quiet). A padlock securing EMMA contents prevented vandalism or interruption of data collection during each sampling week in participants’ homes. Importantly, EMMA consumes only 0.35 kilowatt-hours (kWh) per week, so the cost of powering it from an electrical outlet for one week did not exceed USD$1 – calculated using the Massachusetts price per kilowatt-hour and plug output voltage – which may relieve concern when sampling in low-income communities.
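As a rough arithmetic check on the powering-cost claim, using the stated 0.35 kWh/week consumption (the electricity rate below is an assumed Massachusetts residential price, not a figure reported by the study):

```python
# Weekly cost of powering EMMA from a wall outlet.
weekly_kwh = 0.35                  # consumption stated in the text
assumed_rate_usd_per_kwh = 0.21    # assumed MA residential rate (hypothetical)
weekly_cost = weekly_kwh * assumed_rate_usd_per_kwh
# comes to only a few cents per week, comfortably under the USD$1 bound
```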
Development of high quality and affordable real-time sensors provides new opportunities to conduct research and inform individuals or communities on daily health risks. However, our experience with the development, deployment, and maintenance of EMMA components indicates that important steps should be taken before employing these technologies. Below, we discuss recommendations for lower-cost sensor selection and deployment, while considering the implications for citizen science projects or other efforts engaging communities in data collection or interpretation.
4.1. Recommendations for lower-cost sensor selection and deployment
Understanding the upfront cost of sensors, their timeline limitations, and their use patterns is necessary for sensor selection and experimental design, particularly when guidance on such issues is minimal or nonexistent (Williams et al., 2014b).
4.1.1. Real cost of lower-cost sensors
While upfront costs of sensors and associated parts were relatively low, the costs of maintaining sensor efficiency and general QA/QC across our study were much larger. Costs of calibration and standardization materials were more than five times higher than EMMA components, not including the appreciable personnel effort required for ongoing maintenance and evaluation. Frequent calibrations, one example of necessary maintenance and QA/QC, add to study costs, and investigators may not budget sufficiently for them. As observed from calibrations, EC gas sensor performance degraded over time, and degradation was accelerated for sensors exposed to extreme temperatures. After 18 months, manufacturer-suggested average degradation for COB4, NOB4, and NO2B43F sensors ranged from nine to 36%, comparable to our measured average sensitivity losses for COB4, NOB4, and NO2B43F sensors not exposed to extreme temperature conditions: 21, 36, and 29%, respectively. To achieve optimum data quality, we adjusted data by applying pre- and post-sampling calibration adjustments and used temperature corrections where necessary. However, in addition to pre- and post-sampling calibrations, field colocation of sensors with an FRM to produce temporally-matching data may be ideal for the most accurate data interpretation (Williams et al., 2014b). Such maintenance also requires technical knowledge, personnel, and/or resources, making calculation of the actual cost of lower-cost sensors critical before committing to their use.
4.1.2. Timeline limitations
Continuous maintenance, scrutiny, and calibration throughout any field campaign is necessary but may not align with project timelines. For example, Alphasense recommends calibrating all EC gas sensors every three to four months, but our study design required ongoing data collection throughout warm and cool seasons. Because insufficient resources prevented assembly of additional sampling boxes, we allowed over six months between calibration sessions, which is not recommended. Adhering to trimonthly calibration is good practice for data interpretation; exceeding this interval may create interpretation challenges unless intensive post-collection review and statistical methods accounting for sensor degradation are undertaken. Product updates may also pose challenges to sampling consistency. In our experience, given the growing market for lower-cost devices, new versions are developed and sold at a rate outpacing project timelines or sampling periods. While we were testing NO2B42F sensors, an improved version (NO2B43F) replaced the older version, which was no longer available for purchase. Though this was an improvement – the NO2B43F exhibited less cross-sensitivity to NO and O3 and better humidity transient and zero current performance – it would not have allowed for consistency in sensor outputs across our study period. Further, data collection timelines should account for potential delays when purchasing or replacing sensors (e.g., lack of availability from increased demand, need for independent testing, initial calibration).
4.1.3. Design improvements
Feedback from our field team – scientists, students, and community workers trained by members of our study team – and from study participants regarding the EMMA design was encouraging, with two suggested improvements: i) a built-in power supply to avoid concerns about available outlets and electricity usage in homes, and ii) a method for identifying power interruptions in real-time, such as built-in wireless internet or hotspot capability. We did not take the latter approach due to budgetary constraints. Moreover, depending on data security criteria, cloud-based wireless networks may not be an option, as many are not compliant with security standards (Morawska et al., 2018). Connecting sensors to a wireless network (e.g., via an Arduino with built-in wireless) may nevertheless prove important, considering NOB4 and NO2B43F stabilization times following brief or extended power losses (Alphasense, 2013). For EMMA in its current state, power loss is identified and appropriately accounted for post-collection.
4.2. Implications for Community Use
While EMMA was designed to address our research questions, there is substantial interest in using lower-cost sensors to directly inform community concerns through citizen science or other efforts. Broadly, communities may be interested in answering a range of questions, which may or may not require data with accuracy and sensitivity similar to those achieved through EMMA. For example, communities may be less concerned with absolute concentrations than with relative comparisons across sites; this necessitates calibration to address any between-sensor variation in sensitivity but does not require all elements of our protocol. Others may focus on time trends at particular locations, where sensor degradation over time is paramount. Temperature adjustment may be critical for outdoor sampling, where sensors experience more extreme thermal conditions, but may not be relevant indoors, where sensors encounter comparable temperatures across interior living spaces; our field testing confirmed this for the temperature and RH to which sensors were exposed (Table S17). The key is for the desired end use to be clearly expressed and for all users to understand any limitations of the output. Community groups are often interested in comparing real-time sensor measurements with air quality standards or data from regulatory monitors; though not observed here, the frequent lack of agreement between real-time sensors and FRMs documented by others is a clear indication that citizen scientists should be mindful of the uncertainty of collected data (Williams et al., 2014a), especially in the absence of sustained calibration efforts.
This type of real-time monitoring may ultimately be most amenable to projects in which academic or scientific institutions collaborate with communities in data collection and interpretation. Our lab testing and calibration protocols enable adequate data interpretation and analysis for community consumption across a range of end uses; however, communities independently undertaking these steps may experience barriers such as cost and limited technical expertise and/or equipment. Our calibration system for all sensors is estimated at USD$17,100+, and though not attainable by all, cost-effective adaptations of a calibration system may be possible if a platform such as EMMA is modified to meet sampling goals involving fewer pollutants whose sensors require calibration (e.g., only NO and NO2 as opposed to NO, NO2, CO, and CO2).
Ultimately, the ideal model for utilizing real-time sensors may be community-based participatory research (CBPR), where communities actively engage in collaboration and participation in every stage of the research effort (O’Fallon and Dearry, 2002). This approach would foster shared learning and dissemination of results in useful language while increasing the quantity and quality of data collection and usage and strengthening academic-community relationships. Guidance on sensor performance goals for citizen science applications has been suggested elsewhere (Table S1) (Williams et al., 2014b). Recent research cited the need for further investigation into the social science implications of introducing lower-cost sensors to communities without proper guidance, suggesting that communities may face legal, administrative, or cost barriers to translating data into action and may believe they have little power or control over reducing their pollution exposures upon receiving results showing higher pollution levels (Hubbell et al., 2018). Academic partners may provide communities with technical knowledge and resources (e.g., lab equipment, personnel), while communities provide local environmental and contextual knowledge important to data collection and interpretation. These concerns regarding the introduction of lower-cost sensors for community use stress the overall importance of collaborative practice between scientific institutions and communities to ensure data quality.
Supplementary Material
Highlights.
We present a framework for maintaining an in-home air pollution sensor platform.
Achieving quality low-cost sensor data requires resources inaccessible to communities.
Costs of sensor maintenance are low upfront but increase with study duration.
Collaboration between scientific and non-scientific groups can produce optimal data.
Acknowledgements
This work is part of the Center for Research on Environmental and Social Stressors in Housing across the Life Course (CRESSH), funded by the National Institute on Minority Health and Health Disparities (Award No. P50MD010428), the U.S. Environmental Protection Agency (Award No. RD-836156), and the National Institute of Environmental Health Sciences (T32ES007069). Authors would like to acknowledge current and past members of the CRESSH Home-based Observation and Monitoring Exposure (HOME) Study field team for field data collection and everyday support. In addition, we would like to give recognition to Research Triangle Institute International (RTI International) and Jonathan Thornburg, PhD and Director of Exposure and Aerosol Technology, for technical support with RTI MicroPEM™ devices and associated data. Authors would also like to acknowledge Bo Huang, PhD from The Chinese University of Hong Kong Department of Geography and Resource Management for providing funding for Yulun Zhou’s work as a visiting PhD student. The content of this manuscript is solely the responsibility of the grantee and does not represent official views of either funding entity. Further, no parties involved endorse the purchase of any commercial products or services discussed in this manuscript.
Footnotes
Conflict of Interest
The authors declare that they have no competing interests.
Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.
References
- AAN105–03 2009. Alphasense Application Note AAN 105–03.
- ALLEN JG, MACNAUGHTON P, SATISH U, SANTANAM S, VALLARINO J & SPENGLER JD 2016. Associations of Cognitive Function Scores with Carbon Dioxide, Ventilation, and Volatile Organic Compound Exposures in Office Workers: A Controlled Exposure Study of Green and Conventional Office Environments. Environ Health Perspect, 124, 805–12. [DOI] [PMC free article] [PubMed] [Google Scholar]
- ALPHASENSE. 2013. FAQs [Online]. Available: http://www.alphasense.com/index.php/air/faqs/ [Accessed 2016].
- ALPHASENSE. 2015a. CO-B4 Carbon Monoxide Sensor [Online]. Available: http://www.alphasense.com/WEB1213/wp-content/uploads/2015/04/COB41.pdf [Accessed 2016].
- ALPHASENSE. 2015b. NO-B4 Nitric Oxide Sensor [Online]. Available: http://www.alphasense.com/WEB1213/wp-content/uploads/2015/03/NO-B4.pdf [Accessed 2016].
- ALPHASENSE. 2016a. NO2-B43F Nitrogen Dioxide Sensor [Online]. Available: http://www.alphasense.com/WEB1213/wp-content/uploads/2017/07/NO2B43F.pdf [Accessed 2016].
- ALPHASENSE. Technical Information Release 2016b [Google Scholar]
- ALPHASENSE. 2017. OPC-N2 Particle Monitor [Online]. Available: http://www.alphasense.com/WEB1213/wp-content/uploads/2018/02/OPC-N2-1.pdf [Accessed 2017].
- BERNSTEIN JA, ALEXIS N, BACCHUS H, BERNSTEIN IL, FRITZ P, HORNER E, LI N, MASON S, NEL A & OULLETTE J 2008. The health effects of nonindustrial indoor air pollution. Journal of Allergy and Clinical Immunology, 121, 585–591. [DOI] [PubMed] [Google Scholar]
- BRAUER M, RYAN PB, SUH HH, KOUTRAKIS P, SPENGLER JD, LESLIE NP & BILLICK IH 1990. Measurements of nitrous acid inside two research houses. Environmental Science & Technology, 24, 1521–1527. [Google Scholar]
- CASTELL N, DAUGE FR, SCHNEIDER P, VOGT M, LERNER U, FISHBAIN B, BRODAY D & BARTONOVA A 2017. Can commercial low-cost sensor platforms contribute to air quality monitoring and exposure estimates? Environ Int, 99, 293–302. [DOI] [PubMed] [Google Scholar]
- CHONG U, SWANSON JJ & BOIES AM 2015. Air quality evaluation of London Paddington train station. Environmental Research Letters, 10, 094012. [Google Scholar]
- COHEN AJ, BRAUER M, BURNETT R, ANDERSON HR, FROSTAD J, ESTEP K, BALAKRISHNAN K, BRUNEKREEF B, DANDONA L & DANDONA R 2017. Estimates and 25-year trends of the global burden of disease attributable to ambient air pollution: an analysis of data from the Global Burden of Diseases Study 2015. The Lancet, 389, 1907–1918. [DOI] [PMC free article] [PubMed] [Google Scholar]
- CRILLEY LR, SHAW MD, POUND RJ, KRAMER LJ, PRICE R, YOUNG S, LEWIS A & POPE F 2017. Evaluation of a low-cost optical particle counter (Alphasense OPC-N2) for ambient air monitoring. Atmospheric Measurement Techniques Discussions. [Google Scholar]
- CROSS ES, WILLIAMS LR, LEWIS DK, MAGOON GR, ONASCH TB, KAMINSKY ML, WORSNOP DR & JAYNE JT 2017. Use of electrochemical sensors for measurement of air pollution: correcting interference response and validating measurements. Atmospheric Measurement Techniques, 10, 3575–3588. [Google Scholar]
- DACUNTO PJ, CHENG K-C, ACEVEDO-BOLTON V, KLEPEIS NE, REPACE JL, OTT WR & HILDEMANN LM 2013. Identifying and quantifying secondhand smoke in multiunit homes with tobacco smoke odor complaints. Atmospheric environment, 71, 399–407. [Google Scholar]
- DEMOKRITOU P, KAVOURAS IG, FERGUSON ST & KOUTRAKIS P 2002. Development of a High Volume Cascade Impactor for Toxicological and Chemical Characterization Studies. Aerosol Science and Technology, 36, 925–933. [Google Scholar]
- DENNEKAMP M, HOWARTH S, DICK C, CHERRIE J, DONALDSON K & SEATON A 2001. Ultrafine particles and nitrogen oxides generated by gas and electric cooking. Occupational and Environmental Medicine, 58, 511–516. [DOI] [PMC free article] [PubMed] [Google Scholar]
- DOMINICI F, PENG RD, BELL ML, PHAM L, MCDERMOTT A, ZEGER SL & SAMET JM 2006. Fine Particulate Air Pollution and Hospital Admission for Cardiovascular and Respiratory Diseases. Journal of the American Medical Association, 295, 8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- EPA U 2017a. Carbon Monoxide’s Impact on Indoor Air Quality [Online]. [Accessed 2017].
- EPA U 2017b. Nitrogen Dioxide’s Impact on Indoor Air Quality [Online]. [Accessed 2017].
- EVANS G, PEERS A & SABALIAUSKAS K 2008. Particle dose estimation from frying in residential settings. Indoor Air, 18, 499–510. [DOI] [PubMed] [Google Scholar]
- FAUSTINI A, RAPP R & FORASTIERE F 2014. Nitrogen dioxide and mortality: review and meta-analysis of long-term studies. European Respiratory Journal, 44, 744–753. [DOI] [PubMed] [Google Scholar]
- FRANKLIN P, RUNNION T, FARRAR D & DINGLE P 2006. Comparison of peak and average nitrogen dioxide concentrations inside homes. Atmospheric Environment, 40, 7449–7454. [Google Scholar]
- GOLDSTEIN I, HARTEL D, ANDREWS L & WEINSTEIN A 1986. Indoor air pollution exposures of low-income inner-city residents. Environment International, 12, 211–219. [Google Scholar]
- HE C, MORAWSKA L, HITCHINS J & GILBERT D 2004. Contribution from indoor sources to particle number and mass concentrations in residential houses. Atmospheric environment, 38, 3405–3415. [Google Scholar]
- HUBBELL BJ, KAUFMAN A, RIVERS L, SCHULTE K, HAGLER G, CLOUGHERTY J, CASCIO W & COSTA D 2018. Understanding social and behavioral drivers and impacts of air quality sensor use. Science of The Total Environment, 621, 886–894. [DOI] [PMC free article] [PubMed] [Google Scholar]
- HYLAND A, TRAVERS MJ, DRESLER C, HIGBEE C & CUMMINGS KM 2008. A 32-country comparison of tobacco smoke derived particle levels in indoor public places. Tobacco Control, tc. 2007020479. [DOI] [PMC free article] [PubMed] [Google Scholar]
- ITO K, MATHES R, ROSS Z, NADAS A, THURSTON G & MATTE T 2011. Fine particulate matter constituents associated with cardiovascular hospitalizations and mortality in New York City. Environ Health Perspect, 119, 467–73. [DOI] [PMC free article] [PubMed] [Google Scholar]
- JIANG R, ACEVEDO-BOLTON V, CHENG K, KLEPEIS N, OTT W, & HILDEMANN L 2011. Determination of response of real-time SidePak AM510 monitor to secondhand smoke, other common indoor aerosols, and outdoor aerosol. Journal of Environmental Monitoring, 13, 1695–1702. [DOI] [PubMed] [Google Scholar]
- KAMPA M & CASTANAS E 2008. Human health effects of air pollution. Environmental pollution, 151, 362–367. [DOI] [PubMed] [Google Scholar]
- KLEPEIS NE, OTT WR & SWITZER P 2007. Real-time measurement of outdoor tobacco smoke particles. Journal of the Air & Waste Management Association, 57, 522–534. [DOI] [PubMed] [Google Scholar]
- KORNARTIT C, SOKHI R, BURTON M & RAVINDRA K 2010. Activity pattern and personal exposure to nitrogen dioxide in indoor and outdoor microenvironments. Environment international, 36, 36–45. [DOI] [PubMed] [Google Scholar]
- LANKI T, AHOKAS A, ALM S, JANSSEN NA, HOEK G, DE HARTOG JJ, BRUNEKREEF B & PEKKANEN J 2007. Determinants of personal and indoor PM 2.5 and absorbance among elderly subjects with coronary heart disease. Journal of Exposure Science and Environmental Epidemiology, 17, 124. [DOI] [PubMed] [Google Scholar]
- LAUMBACH R, TONG J, ZHANG L, OHMAN-STRICKLAND P, STERN A, FIEDLER N, KIPEN H, KELLY-MCNEIL K, LIOY P & ZHANG J 2009. Quantification of 1-aminopyrene in human urine after a controlled exposure to diesel exhaust. Journal of Environmental Monitoring, 11, 153–159. [DOI] [PMC free article] [PubMed] [Google Scholar]
- LOGUE JM, KLEPEIS NE, LOBSCHEID AB & SINGER BC 2014. Pollutant exposures from natural gas cooking burners: a simulation-based assessment for Southern California. Environmental health perspectives, 122, 43. [DOI] [PMC free article] [PubMed] [Google Scholar]
- MACDONELL M, RM, WYKER D, FINSTER M, CHANG Y, RAYMOND T, TEMPLE B & SCOFIELD M 2014. Research and development highlights: mobile sensors and applications for air pollutants. US Environmental Protection Agency. [Google Scholar]
- MASSDEP. 2015. MassDEP MassAir [Online]. Available: http://eeaonline.eea.state.ma.us/DEP/MassAir/pages/ChartByPollutant.aspx [Accessed May 7 2018].
- MASSON N, PIEDRAHITA R & HANNIGAN M 2015. Quantification Method for Electrolytic Sensors in Long-Term Monitoring of Ambient Air Quality. Sensors (Basel), 15, 27283–302. [DOI] [PMC free article] [PubMed] [Google Scholar]
- MIJLING B, JIANG Q, DE JONGE D & BOCCONI S 2017. Practical field calibration of electrochemical NO2 sensors for urban air quality applications. Atmospheric Measurement Techniques Discussions, 1–25. [Google Scholar]
- MORAWSKA L, THAI PK, LIU X, ASUMADU-SAKYI A, AYOKO G, BARTONOVA A, BEDINI A, CHAI F, CHRISTENSEN B & DUNBABIN M 2018. Applications of low-cost sensing technologies for air quality monitoring and exposure assessment: How far have they gone? Environment international, 116, 286–299. [DOI] [PMC free article] [PubMed] [Google Scholar]
- MUKHERJEE A, STANTON LG, GRAHAM AR & ROBERTS PT 2017. Assessing the Utility of Low-Cost Particulate Matter Sensors over a 12-Week Period in the Cuyama Valley of California. Sensors, 17, 1805. [DOI] [PMC free article] [PubMed] [Google Scholar]
- O’FALLON LR & DEARRY A 2002. Community-based participatory research as a tool to advance environmental health sciences. Environmental health perspectives, 110, 155. [DOI] [PMC free article] [PubMed] [Google Scholar]
- PARK J-M, ROCK JC, WANG L, SEO Y-C, BHATNAGAR A & KIM S 2009. Performance evaluation of six different aerosol samplers in a particulate matter generation chamber. Atmospheric Environment, 43, 280–289. [Google Scholar]
- RAUB JA, MATHIEU-NOLF M, HAMPSON NB & THOM SR 2000. Carbon monoxide poisoning—a public health perspective. Toxicology, 145, 1–14. [DOI] [PubMed] [Google Scholar]
- RICHMOND-BRYANT J, HAHN I, FORTUNE CR, RODES CE, PORTZER JW, LEE S, WIENER RW, SMITH LA, WHEELER M & SEAGRAVES J 2009. The Brooklyn traffic real-time ambient pollutant penetration and environmental dispersion (B-TRAPPED) field study methodology. Journal of Environmental Monitoring, 11, 2122–2135. [DOI] [PubMed] [Google Scholar]
- RTI 2012. RTI MicroPEM v3.2 Build II Procurement Instructions and Functional/Performance Design Specifications.
- RUSSO ET, HULSE TE, ADAMKIEWICZ G, LEVY DE, BETHUNE L, KANE J, REID M & SHAH SN 2014. Comparison of indoor air quality in smoke-permitted and smoke-free multiunit housing: findings from the Boston Housing Authority. Nicotine & Tobacco Research, 17, 316–322.
- SCAQMD 2015a. Field Evaluation AlphaSense OPC-N2 Sensor [Online]. South Coast AQMD Air Quality Sensor Performance Evaluation Center. Available: http://www.aqmd.gov/aq-spec/evaluations/field [Accessed 2015].
- SCAQMD 2015b. Field Evaluation RTI MicroPEM PM2.5 Sensor [Online]. South Coast AQMD Air Quality Sensor Performance Evaluation Center. Available: http://www.aqmd.gov/aq-spec/evaluations/field [Accessed 2016].
- SHAH AS, LANGRISH JP, NAIR H, MCALLISTER DA, HUNTER AL, DONALDSON K, NEWBY DE & MILLS NL 2013. Global association of air pollution and heart failure: a systematic review and meta-analysis. The Lancet, 382, 1039–1048.
- SLOAN CD, PHILIPP TJ, BRADSHAW RK, CHRONISTER S, BARBER WB & JOHNSTON JD 2016. Applications of GPS-tracked personal and fixed-location PM2.5 continuous exposure monitoring. Journal of the Air & Waste Management Association, 66, 53–65.
- SOUSAN S, KOEHLER K, HALLETT L & PETERS TM 2016. Evaluation of the Alphasense Optical Particle Counter (OPC-N2) and the Grimm Portable Aerosol Spectrometer (PAS-1.108). Aerosol Science and Technology, 50, 1352–1365.
- SPILAK MP, FREDERIKSEN M, KOLARIK B & GUNNARSEN L 2014. Exposure to ultrafine particles in relation to indoor events and dwelling characteristics. Building and Environment, 74, 65–74.
- SUH HH & ZANOBETTI A 2010. Exposure error masks the relationship between traffic-related air pollution and heart rate variability. Journal of Occupational and Environmental Medicine, 52, 685–92.
- SUN L, WONG KC, WEI P, YE S, HUANG H, YANG F, WESTERDAHL D, LOUIE PK, LUK CW & NING Z 2016. Development and application of a next generation air sensor network for the Hong Kong marathon 2015 air quality monitoring. Sensors, 16, 211.
- TOWNSEND C & MAYNARD R 2002. Effects on health of prolonged exposure to low concentrations of carbon monoxide. Occupational and Environmental Medicine, 59, 708–711.
- TUNNO BJ, SHIELDS KN, CAMBAL L, TRIPATHY S, HOLGUIN F, LIOY P & CLOUGHERTY JE 2015. Indoor air sampling for fine particulate matter and black carbon in industrial communities in Pittsburgh. Science of the Total Environment, 536, 108–115.
- WANG Z, CALDERÓN L, PATTON AP, SORENSEN ALLACCI M, SENICK J, WENER R, ANDREWS CJ & MAINELIS G 2016. Comparison of real-time instruments and gravimetric method when measuring particulate matter in a residential building. Journal of the Air & Waste Management Association, 66, 1109–1120.
- WILLIAMS R, KAUFMAN A, HANLEY T, RICE J & GARVEY S 2014a. Evaluation of field-deployed low cost PM sensors. US Environmental Protection Agency.
- WILLIAMS R, KAUFMAN A, HANLEY T, RICE J & GARVEY S 2015. Evaluation of elm and speck sensors. US Environmental Protection Agency: Washington, DC, USA.
- WILLIAMS R, KILARU V, SNYDER E, KAUFMAN A, DYE T, RUTTER A, RUSSELL A & HAFNER H 2014b. Air sensor guidebook. National Exposure Research Laboratory, Office of Research and Development, US Environmental Protection Agency, Research Triangle Park, NC, USA.
- WILLIAMS R, LONG R, BEAVER M, KAUFMAN A, ZEIGER F, HEIMBINDER M, ACHARYA BR, GRINWALD BA, KUPCHO KA & ROBINSON SE 2014. Sensor Evaluation Report. US Environmental Protection Agency.
- YU CH, MORANDI MT & WEISEL CP 2008. Passive dosimeters for nitrogen dioxide in personal/indoor air sampling: a review. Journal of Exposure Science and Environmental Epidemiology, 18, 441.