Abstract
This paper details an imagery dataset of interior and exterior ambiances used to assess and represent photobiological outcomes of the built environment in northern territories. The images were obtained with a Raspberry Pi Camera Module (RPiCM) mounted in a custom holder that fixes the camera in place and allows it to be rotated in 30° increments, producing 12 high dynamic range (HDR) images that are then combined into a panoramic image. The HDR images enable the calculation of photobiological effects concerning photopic light intensity for vision, and of the spectral dominance regarding vision and circadian stimulation. The dataset comprises 13 captured environments (7 interior and 6 exterior settings), each divided into 4 subfolders containing the photographic data: the sequence of low dynamic range (LDR) images, the tone-mapped images obtained from the HDR computation, the photopic luminance and false color analyses, and 360° panoramic images (tone-mapped HDR, false color luminance, and spectral dominance). Each space is also supplemented with photometric data, presented as a .csv file of lux and EML values obtained with a radiometer. This dataset is valuable for architects, designers, and neuroscientists seeking to identify opportunities for enhancing human-centric lighting in existing architecture and landscapes, as well as to propose solutions that promote vision and circadian stimulation in northern territories. Parts of this dataset were used in a previous study [3]. The dataset is published and shared through a Mendeley repository [2].
Keywords: Daylight, Interior architecture, Landscape, High dynamic range image, Northern territories
Specifications Table
| Subject | Architecture |
| Specific subject area | The datasets provide information on photobiological impact of existing buildings in Quebec City and Nunavik. |
| Data format | Raw, analyzed. |
| Type of data | Table, Image, Chart, Graph |
| Data collection | Bracketed low dynamic range (LDR) images were captured in existing buildings and at outdoor points in Inukjuak, Quebec, Canada, using a Camera Module v2 (RPiCM) mounted on a Raspberry Pi microcomputer. This camera has an f-stop of 2.0, a focal length of 3.04 mm, and an 8-megapixel Sony IMX219 7.4 mm sensor producing images with a resolution of 3280 × 2464 pixels and a horizontal 62.2° and vertical 48.8° field of view (FoV). LDR images were captured from dark to bright exposure values (e.g., from -2 EV to +2 EV) at ISO 100 with a fixed D65 white balance. A Python script [1] was used to remotely trigger the capture and produce the LDR images, and hdrgen [10] was used to compute the HDR images and false color maps. The photometric information was captured using an ILT5000 research radiometer from International Light Technologies. Its sensor has a horizontal and vertical 180° field of view and is equipped with photopic and melanopic optical filters to capture lux and EML values. |
| Data source location | Institution: School of Architecture, Laval University, Quebec City, QC, Canada. Latitude and longitude: 46°48′ N, 71°12′ W |
| Data accessibility | Repository name: Imagery dataset of interior and exterior northern architectural spaces for photobiological lighting analyses. [2] Data identification number: doi: 10.17632/gpd7r5tngg.1 Direct URL to data: https://data.mendeley.com/datasets/gpd7r5tngg/2 |
| Related research article | The datasets are partially used in the following paper: [3] Representing the photobiological dimension of light in northern architecture. Indoor and Built Environment, 2023. doi: 10.1177/1420326X231178188 |
1. Value of the Data
- This dataset is composed of images that allow the analysis of the photobiological potential of subarctic latitudes such as the northern territory of Quebec. These images can be used to improve the decision-making process for new architectural buildings or renovations, considering human photobiological needs for vision and circadian potential.
- This dataset is open to architects, designers, neuroscientists, and researchers studying photoperiodic conditions and their relationship with the built environment of subarctic latitudes.
- This dataset can be used to create immersive experiences, through numerical visualization, for renovation projects of architectural spaces and for new buildings in natural spaces in isolated territories.
2. Background
Architectural daylighting design strategies play a pivotal role in enhancing the quality of indoor spaces by increasing their connection with the outdoors and, in the case of Nunavik, the land. These strategies directly influence human photobiological responses to light, specifically vision and circadian rhythms [4,5]. Nevertheless, these strategies remain a challenge in Nunavik due to its subarctic climate, photoperiod, and solar geometry. This paper describes the calculation process and the methods used to represent daylight in Nunavik's existing architecture. It focuses on spatial representations of quantitative information as an appropriate tool for assessing light ambience, supporting a better dialogue between architects and stakeholders in the building industry by communicating the spatial qualities of different architectural environments. Light is represented for the photopic (daytime vision) and melanopic (biological clock) portions of the electromagnetic spectrum, measured respectively in lux and equivalent melanopic lux (EML) [6]. The combination of photographic and photometric data aims at identifying melanopic- and photopic-dominant zones on panoramas in relation to the architectural composition and surface reflectance, as well as identifying the architectural components that appear to have the greatest impact on light intensity.
3. Data Description
This database is composed of 13 environments that provide information on the lighting conditions for vision and the biological clock, and on their spectral dominance potential for photobiological effects of light. The database presents the information needed to reproduce panoramic representations of exterior and interior spaces in Inukjuak, Quebec, Canada. It is divided into two main groups, exterior and interior, which in turn comprise 6 subfolders for exterior and 7 subfolders for interior spaces, as illustrated in Fig. 1.
Fig. 1.
Summary of the interior and exterior spaces that are part of the dataset.
The subfolders are named according to the date on which the images were captured and the name of the space under study. The acronyms used to name the folders are listed in Table 1. Each space folder contains the 1 LDR, 2 HDR, and 3 FC folders that support the results presented in 4 PA, as well as a .csv file corresponding to the photopic and melanopic illuminance in a 360° plane, measured using an ILT5000 radiometer [7], as presented in Fig. 2. Folder 1 LDR contains all the LDR images captured to create the HDR images. Folder 2 HDR contains all tone-mapped images from the HDR images generated from folder 1 LDR. Folder 3 FC contains all the false color images in photopic luminance values (cd/m2) retrieved from the HDR images. It is worth mentioning that the value range differs between indoor and outdoor spaces: 3,000 cd/m2 for interior spaces and 50,000 cd/m2 for exterior spaces. The content of folder 4 PA is illustrated in Fig. 3; it comprises the panoramic images formed from 2 HDR and 3 FC, specifically the tone-mapped image obtained from HDR, the false color image in photopic luminance, and the spectral dominance, either photopic (red zones), melanopic (blue zones), or equal dominance (black or white, depending on the case). The spectral dominance of the images is displayed in two representations: panoramic and polar (Fig. 3).
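The folder layout described above can be traversed with a short Python sketch such as the following. The subfolder names (1 LDR, 2 HDR, 3 FC, 4 PA) come from the text; the function name and the .csv handling are illustrative assumptions about the per-space files, not code shipped with the dataset.

```python
import csv
from pathlib import Path

def load_space(space_dir):
    """Collect the contents of one space folder of the dataset.

    Assumes the four subfolders named in the text ("1 LDR", "2 HDR",
    "3 FC", "4 PA") plus one photometric .csv file per space.
    """
    space = Path(space_dir)
    # One sorted file list per subfolder.
    data = {name: sorted((space / name).glob("*"))
            for name in ("1 LDR", "2 HDR", "3 FC", "4 PA")}
    # The per-space .csv holds the photopic (lux) and melanopic (EML)
    # sweep values captured with the ILT5000 radiometer.
    csv_files = sorted(space.glob("*.csv"))
    if csv_files:
        with open(csv_files[0], newline="") as f:
            data["photometry"] = list(csv.reader(f))
    return data
```

A caller would then iterate over the 13 space folders and pass each one to `load_space` before further analysis.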
Table 1.
Acronyms used to name the main files.
| Acronym | Description |
|---|---|
| LDR | Low dynamic range |
| EML | Equivalent Melanopic Lux |
| EPL | Equivalent Photopic Lux |
| FC | False color |
| HDR | High dynamic range |
| PA | Panoramic |
| SD | Spectral Dominance |
Fig. 2.
Folders composition of each architectural space.
Fig. 3.
360° representations of each evaluated ambience: HDR panoramic image, false color photopic panoramic map, SD panoramic map, and SD polar map.
4. Experimental Design, Materials and Methods
Image data acquisition and processing: the experimental setup for image acquisition and the FoV of the camera are illustrated in Fig. 4. A Raspberry Pi Camera Module V2 (RPiCM V2) was used to obtain the images between 5 and 8 March 2019, during a research visit to the Northern Village of Inukjuak (58.5° N, 78.0° W), in the Nunavik region of the province of Quebec, Canada. The Raspberry Pi is an easy-to-handle microcomputer that permits remote control through a VNC viewer. Compared to professional camera devices, the Raspberry Pi allows the connection of the RPiCM v2, characterized by its high resolution and the simplicity of setting camera properties such as ISO and shutter speed. The RPiCM was mounted on a Raspberry Pi microcomputer [8], which was controlled remotely to avoid handling and motion. The camera uses a Sony IMX219 sensor, which is easily interfaced with the microcomputer and can be calibrated to ensure the scientific validity of the captured photographic data. A specially designed camera holder enabled positioning the camera at 30° increments for the capture of 360° panoramas: given the RPiCM's horizontal field of view of 62.2°, a minimum of 12 images is required to provide sufficient image overlap.
Fig. 4.
a. Photographic tool capture setup including the Raspberry Pi Microcomputer, the RPiCM v2, and holder installation.
Images are processed on the Raspberry Pi using an open-source script developed by McNeil [9], which includes hdrgen for HDRi generation [10]. The script was installed on the Raspberry Pi and executed for each ambience capture. It was configured to obtain seven exposures, referenced to the middle image [11]. Camera settings (ISO and shutter speed) were configured for each physical context (exterior, interior) to cover the highly variable scene brightness: shutter speeds range from 0.05 to 50 ms for exterior captures, 0.3 to 200 ms for interior captures, and 5 to 1500 ms for night captures.
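The seven-exposure bracket around the middle reference image can be sketched as below. The function name and the one-stop spacing are illustrative assumptions, not the exact configuration of McNeil's script [9].

```python
def bracket_shutter_speeds(base_us, n_exposures=7, step_ev=1.0):
    """Shutter speeds (microseconds) centred on the middle exposure.

    Each step changes the exposure time by `step_ev` stops, i.e. a
    factor of two per stop, giving a dark-to-bright sequence.
    """
    half = n_exposures // 2
    return [round(base_us * 2 ** (step_ev * (i - half)))
            for i in range(n_exposures)]

# Example: a 5 ms middle exposure for an interior scene gives a
# 0.625-40 ms bracket, within the 0.3-200 ms interior range above.
print(bracket_shutter_speeds(5000))
# → [625, 1250, 2500, 5000, 10000, 20000, 40000]
```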
Capturing all seven exposures takes about 10 seconds for each orientation, and image processing requires an additional 40 to 60 seconds. As capturing an entire 360° panorama could take up to 12 min, long enough for daylight conditions to change, LDR images were first captured for all twelve orientations and then batch-computed as HDRi. The script used to generate the tone-mapped images from the HDRi is configured to optimise space on the Raspberry Pi, which prevents the .hdr and .pic images from being saved. The HDRi can nevertheless be stored by modifying the script available at https://github.com/andyrew/piHDR.
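The batch computation step can be sketched as assembling one hdrgen call per orientation. This is a hedged sketch: the `-r` (response curve) and `-o` (output) flags reflect common hdrgen usage [10] but should be verified against a local installation, and all file and function names are illustrative.

```python
from pathlib import Path

def hdrgen_command(ldr_dir, response="camera.rsp", out="scene.hdr"):
    """Build the hdrgen call for one orientation's LDR sequence.

    Assumption: hdrgen accepts the bracketed LDR files followed by
    '-r' (camera response curve) and '-o' (output HDR image);
    confirm the flags by running `hdrgen` without arguments.
    """
    ldr_files = sorted(str(p) for p in Path(ldr_dir).glob("*.jpg"))
    return ["hdrgen", *ldr_files, "-r", response, "-o", out]

# On the Pi, each of the twelve orientations would then be processed
# with, e.g.: subprocess.run(hdrgen_command("capture_01"), check=True)
```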
The camera response function was calibrated before the captures using a ColorChecker© chart [12], known for its expansive chromatic representation. Although hdrgen [10] already provides accurate luminance levels, this research followed the step-by-step process indicated by Pierson et al. [13] to obtain an appropriate camera response curve for luminance maps. This function was computed, as illustrated in Fig. 5, from a set of low dynamic range (LDR) images meticulously captured under consistent daylight conditions, with exposure durations spanning from 0.8 to 800 milliseconds. Establishing this response function ensures optimal white balance and color fidelity in the resulting high dynamic range images (HDRi); it was subsequently applied during image acquisition, using seven LDR images to enhance accuracy.
Fig. 5.
Camera response curve created using hdrgen [10].
Photometric data acquisition: as mentioned, the photographic data is complemented with photometric information collected using an ILT5000 research radiometer [7]. The sensor possesses a field of view comparable to that of human vision and can be equipped with two optical filters corresponding to photopic (lux) and melanopic (EML) light. The sensor head is mounted on a tripod at a height of 1.5 m to correspond to a standing observer, as illustrated in Fig. 6, and vertical illuminance (lux, EML) is acquired through a horizontal panoramic sweep covering all orientations of a 360° rotation, performed over approximately 60 seconds at a rate of one value per second. The capture is repeated with each optical filter to measure vertical illuminance at a specific time, following the method developed in previous experiments by Lalande et al. [14].
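As a minimal sketch of how one sweep's readings could be summarised, assuming roughly 60 values per filter and taking the M/P ratio as mean EML over mean lux (function and key names are illustrative, not part of the dataset's tooling):

```python
def sweep_summary(lux_readings, eml_readings):
    """Mean vertical illuminance per filter for one 360° sweep,
    plus the melanopic/photopic (M/P) ratio of the two means."""
    mean_lux = sum(lux_readings) / len(lux_readings)
    mean_eml = sum(eml_readings) / len(eml_readings)
    return {"lux": mean_lux, "EML": mean_eml, "M/P": mean_eml / mean_lux}

# Example: three seconds of a sweep, with both filters.
print(sweep_summary([100.0, 110.0, 90.0], [80.0, 88.0, 72.0]))
# → {'lux': 100.0, 'EML': 80.0, 'M/P': 0.8}
```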
Fig. 6.
Illuminance and EML capture using the ILT 5000 Radiometer. a) Angle of capture for the vertical plane, and b) angle of capture for the horizontal plane.
Panorama stitching: the fusion of images to form panoramic and polar representations was performed with Autopano Giga 4 [15]. The software places the imported images side by side and, for each pair of images, automatically detects visually similar points (control points). Control points can then be modified, adjusted, added, and deleted manually as required, and each control point displays a reliability value (accuracy/imprecision) between 1.00 (perfect) and 100.00+ (poor). Between 80 and 120 good-quality control points were used between each pair of images to ensure proper stitching.
Spectral dominance calculation: the spectral dominance of an image refers to the ratio between the melanopic and photopic illuminance levels, known as the M/P ratio. A higher ratio indicates that melanopic levels predominate over photopic levels; conversely, a lower ratio indicates that photopic levels predominate. To perform this calculation, a relative calculation script was created based on the research of Jung and Inanici [16] and presented in the following equations:
Equation 1: Relative photopic luminance calculation.
Equation 2: Relative melanopic luminance calculation.
This script is freely available in conjunction with the dataset and was developed in the Grasshopper [17] plugin, which runs within Rhinoceros [18]. To reproduce the spectral dominance maps, the .JPG images were imported into the script (Fig. 7), which then displays false color maps in photopic luminance (cd/m2), melanopic luminance (cd/m2), and spectral dominance (M/P ratio) values, as illustrated in Fig. 7.
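The per-pixel idea behind these maps can be sketched in Python as below. This is not the authors' Grasshopper script: the photopic weights are the standard Rec. 709 luminance coefficients, while the melanopic weights shown are placeholders; the calibrated coefficients should be taken from Jung and Inanici [16].

```python
def spectral_dominance(r, g, b, mel_coeffs=(0.0, 0.4, 0.6)):
    """Relative M/P ratio for one RGB pixel (channels in 0..1).

    Photopic weights: Rec. 709 luminance coefficients. The melanopic
    weights are PLACEHOLDERS, not the calibrated values of [16].
    """
    photopic = 0.2126 * r + 0.7152 * g + 0.0722 * b
    mr, mg, mb = mel_coeffs
    melanopic = mr * r + mg * g + mb * b
    if photopic == 0.0:
        return None  # no luminance signal to compare against
    return melanopic / photopic  # >1: melanopic-dominant (blue zones)

# A blue pixel is melanopic-dominant, a red pixel photopic-dominant:
assert spectral_dominance(0.0, 0.0, 1.0) > 1.0
assert spectral_dominance(1.0, 0.0, 0.0) < 1.0
```

Mapping this ratio over every pixel of a panorama, then thresholding around 1, reproduces the red (photopic-dominant) and blue (melanopic-dominant) zoning described above.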
Fig. 7.
Overview of Grasshopper script to generate photopic luminance, melanopic luminance and SD dominance false color maps.
Limitations
Not applicable.
Ethics Statement
The authors confirm that the current work does not involve human subjects, animal experiments, or any data collected from social media platforms.
CRediT authorship contribution statement
Philippe Lalande: Conceptualization, Methodology, Software, Data curation, Formal analysis, Validation, Visualization, Investigation, Writing – review & editing. Marc Hébert: Supervision, Funding acquisition, Writing – review & editing. Jean-François Lalonde: Supervision, Funding acquisition, Writing – review & editing. Carolina Espinoza-Sanhueza: Writing – original draft. Claude MH Demers: Supervision, Funding acquisition, Writing – review & editing.
Acknowledgement
This research was supported by the Sentinel North program of Université Laval, made possible, in part, thanks to funding from the Canada First Research Excellence Fund.
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Data Availability
References
- 1. Python Software Foundation. Welcome to Python.org [WWW Document]. URL https://www.python.org/ (accessed 8.24.22).
- 2. P. Lalande, M. Hébert, A. Potvin, J.F. Lalonde, C.M. Demers. Imagery dataset of interior and exterior northern architectural spaces for photobiological lighting analyses. Mendeley Data, 2024. doi: 10.17632/gpd7r5tngg.3.
- 3. Lalande P., Hébert M., Potvin A., Lalonde J.F., Watchman M., Demers C.M. Representing the photobiological dimension of light in northern architecture. Indoor Built Environ. 2023. doi: 10.1177/1420326X231178188.
- 4. Browning W.D., Ryan C.O. Nature Inside: A Biophilic Design Guide. 1st ed. RIBA Publishing; 2020.
- 5. Kellert S.R., Calabrese E.F. The Practice of Biophilic Design [WWW Document]. 2015. URL www.biophilic-design.com (accessed 11.4.22).
- 6. Lucas R.J., Peirson S.N., Berson D.M., Brown T.M., Cooper H.M., Czeisler C.A., Figueiro M.G., Gamlin P.D., Lockley S.W., O'Hagan J.B., Price L.L.A., Provencio I., Skene D.J., Brainard G.C. Measuring and using light in the melanopsin age. Trends Neurosci. 2014;37:1–9. doi: 10.1016/j.tins.2013.10.004.
- 7. ILT Technologies. ILT5000 Research/Lab Radiometer [WWW Document]. 2020. URL https://www.intl-lighttech.com/products/ilt5000-researchlab-radiometer (accessed 3.7.21).
- 8. Raspberry Pi Foundation. Camera – About Camera Modules [WWW Document]. Raspberry Pi Documentation. URL https://www.raspberrypi.com/documentation/accessories/camera.html (accessed 3.20.24).
- 9. McNeil A. piHDR [WWW Document]. URL https://github.com/andyrew/piHDR.
- 10. Ward G. Anyhere Software [WWW Document]. URL http://www.anyhere.com/.
- 11. Jacobs A. High dynamic range imaging and its application in building research. Adv. Build. Energy Res. 2007;1:177–202. doi: 10.1080/17512549.2007.9687274.
- 12. Pantone. ColorChecker® Classic [WWW Document]. X-Rite. URL https://www.xrite.com/categories/calibration-profiling/colorchecker-classic (accessed 4.11.24).
- 13. Pierson C., Cauwerts C., Bodart M., Wienold J. Tutorial: luminance maps for daylighting studies from high dynamic range photography. LEUKOS. 2020:1–30. doi: 10.1080/15502724.2019.1684319.
- 14. Lalande P., Demers C.M.H., Lalonde J.F., Potvin A., Hébert M. Spatial representations of melanopic light in architecture. Archit. Sci. Rev. 2020;64. doi: 10.1080/00038628.2020.1785383.
- 15. A. Frich. Autopano Giga 4. 2018.
- 16. Jung B., Inanici M. Measuring circadian lighting through high dynamic range photography. Light. Res. Technol. 2019;51:742–763. doi: 10.1177/1477153518792597.
- 17. Davidson S. Grasshopper [WWW Document]. URL https://www.grasshopper3d.com/ (accessed 4.11.24).
- 18. Rhino – Rhinoceros 3D [WWW Document]. URL https://www.rhino3d.com/ (accessed 4.11.24).