Scientific Reports. 2024 Nov 2;14:26429. doi: 10.1038/s41598-024-77612-2

Affine transform representation for reducing calibration cost on absorption-based LWIR depth sensing

Takahiro Kushida, Ryutaro Nakamura, Hiroaki Matsuda, Wenhao Chen, Kenichiro Tanaka
PMCID: PMC11531503  PMID: 39488627

Abstract

Multispectral long-wave infrared (LWIR) ranging is a technique that estimates the distance to an object based on the wavelength-dependent absorption of LWIR light through the air. Prior works require time-consuming measurements for calibration and solve non-linear inverse problems, which sometimes fall into local minima. In this paper, we propose a linear representation that connects the measurements and the scene parameters via an affine matrix. In this representation, the distance and the temperature of the object can be obtained as a closed-form solution, and the calibration cost can be reduced to as few as three observations. In real-world experiments, we demonstrate that our method effectively reduces the calibration cost while maintaining the precision of the depth estimation.

Subject terms: Electrical and electronic engineering, Computer science

Introduction

Depth sensing technology is an important research topic in computer vision, which has a wide variety of applications including robotics, augmented reality, autonomous vehicles, and 3D reconstruction. Various depth sensing techniques including stereo vision, depth from focus/defocus, time-of-flight, structured light, and photometric stereo have been developed to meet the diverse needs of these applications1.

While these methods are effective, they have different pros and cons, and some challenges remain. One challenging scenario is measuring scene depth without any controlled light source, where the scene is texture-less or in a dark environment. Recently, a few passive depth sensing methods that work in such scenarios have been proposed2,3. These works utilize the atmospheric absorption of ambient long-wave infrared (LWIR) light to estimate the depth, temperature, and emissivity from three-band observations2 or hyperspectral observations3.

Although these works have demonstrated the feasibility of absorption-based LWIR depth sensing, they require a costly calibration process. These methods assume known absorption coefficients and a known camera response function, and estimating these parameters requires many observations made beforehand by placing a blackbody furnace at different depths and temperatures, which is time-consuming. Moreover, these parameters must be calibrated precisely to avoid local minima in the solution of the non-linear inverse problem.

In this paper, we propose an affine transform representation by introducing Wien's approximation. We show that an affine matrix linearly connects the observations and the scene parameters, i.e., depth and temperature. In this representation, the solution can be obtained analytically, and the calibration process turns into obtaining an affine matrix instead of estimating precise physical coefficients. Analogous to affine matrix estimation in traditional image processing, the calibration cost can be reduced to as few as three observations. We also extend this calibration process to include the response function of the multispectral LWIR camera, so that the data required for calibration amount to four observations. In the experiments, we demonstrate that our proposed method achieves the same level of accuracy as the previous methods despite far fewer measurements.

Scope: The purpose of this work is to reduce the number of measurements required for calibration. While the precision of our depth measurement is competitive with the prior work, improved precision is not the claim of this paper.

Related work

We briefly review the related work from two perspectives: passive depth sensing and spectral analysis in LWIR.

Passive depth sensing

Passive depth sensing is a fundamental challenge that has been studied for a long time1. Various methods have been proposed to estimate depth from images and sensors in challenging conditions, such as low-light or dark environments.

Traditional stereo camera setups and multi-view stereo are widely used for depth estimation4,5. These methods use two or more images of a scene from different viewpoints to estimate depth through geometric and photometric consistency across the images. However, these methods face limitations in dark scenes where the visibility of features and texture is greatly reduced6. The lack of well-defined image features makes it difficult for these techniques to estimate depth accurately.

Another conventional approach for depth estimation is depth from focus/defocus, which analyzes the amount of blur in images7,8. In dark scenes, where visibility and illumination are limited, capturing images with a sufficient amount of blur variation becomes challenging.

To address the challenges posed by dark scenes, the use of thermal (LWIR) cameras for depth sensing has been explored6,9–11. Thermal cameras capture the thermal radiation emitted by objects, which is independent of visible light conditions12. They can work effectively in darkness, offering a potential solution to the limitations of traditional stereo and focus-based methods. However, these methods still rely on object textures, and they also face challenges in long-range depth estimation due to the lack of detailed disparity information.

Recently, absorption-based depth sensing using an LWIR camera, which works in dark scenes and is capable of long-range measurements, has been proposed2,3,13–15. These approaches exploit the atmospheric absorption of LWIR light from the environment and estimate the depth, temperature, and emissivity by non-linear optimization using either three-wavelength images2 or hyperspectral images3,13–15. These methods rely on the wavelength-dependent atmospheric absorption characterized by an extinction coefficient, and the calibration for estimating this coefficient requires many observations. Our proposed method improves absorption-based depth sensing using multi-spectral LWIR images by reducing the cost of calibration and estimating the depth without non-linear optimization. It provides a closed-form solution from multi-spectral thermal images to the scene depth. While the other methods2,3,13–15 use either numerical optimization or a neural network to estimate the scene depth, our method only computes a matrix multiplication, which substantially reduces computational complexity.

Spectral analysis in LWIR

While common LWIR cameras integrate over the whole LWIR range (8–14 μm), spectral analysis is useful for various applications. Gases have unique emission and absorption spectra in the LWIR region depending on their species, and spectral observations are used to detect and identify gases16–21. In the field of remote sensing, LWIR spectral features are widely used for the analysis of soil22,23 and minerals23–25, and for the estimation of vegetation23,26. For a comprehensive overview of spectral analysis in LWIR, we refer the reader to the survey of LWIR hyperspectral imaging27. We apply multispectral LWIR observations to a new application that jointly estimates the scene depth and its temperature.

Passive depth sensing with multi-spectral LWIR measurements

We briefly review the prior work2 and discuss the remaining issues. According to Planck's law, the spectral radiance $M_e$ of a black body at wavelength $\lambda$ and absolute temperature $T$ is given by

$$M_e(\lambda; T) = \frac{2\pi h c^2}{\lambda^5} \frac{1}{e^{hc/\lambda k T} - 1}, \quad (1)$$

where h is the Planck constant, k is the Boltzmann constant, and c is the speed of light. Real objects are not ideal black bodies and the spectral radiance E from objects is represented as

$$E(\lambda; \epsilon, T) = \epsilon\, M_e(\lambda; T), \quad (2)$$

where $\epsilon$ is the emissivity, i.e., the ratio of the object's radiance to that of a black body, which depends on the material and surface structure.

The air absorbs the light emitted from objects while it travels from the object to the camera. The Lambert–Beer law gives the amount of attenuation over the traveling distance $d$28:

$$i_{\text{out}}(\lambda) = \exp(-\sigma(\lambda) d)\, i_{\text{in}}(\lambda), \quad (3)$$

where $i_{\text{out}}$ and $i_{\text{in}}$ denote the intensity after and before attenuation, respectively, and $\sigma$ denotes the extinction coefficient of the air. Using Eqs. (1)–(3), we can express the observation model as:

$$I(\lambda) = R_v(\lambda) \exp(-\sigma(\lambda) d)\, \epsilon\, M_e(\lambda; T), \quad (4)$$

where $R_v$ is the camera's sensitivity.

Equation (4) has three unknown parameters: $d$, $T$, and $\epsilon$. Previous methods2,3 show that these unknowns can be estimated from more than three observations at different wavelengths. However, these methods have two common problems. First, they require a large number of measurements for calibration. The extinction coefficient $\sigma$ and the camera sensitivity $R_v$ must be calibrated beforehand, which requires many measurements because both parameters are wavelength-dependent. Second, solving the inverse problem of this equation requires non-linear optimization, so estimating $d$ and $T$ is computationally expensive and prone to local minima, i.e., the solution is sensitive to the coefficients and the observations. Stable calibration of the coefficients therefore tends to require even more measurements.
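To make the structure of Eq. (4) concrete, the following is a minimal Python sketch of the forward model; the sensitivity r_v and the extinction coefficient sigma are placeholder inputs that, in the prior methods, must be calibrated beforehand.

```python
import numpy as np

H = 6.62607015e-34  # Planck constant [J s]
C = 2.99792458e8    # speed of light [m/s]
K = 1.380649e-23    # Boltzmann constant [J/K]

def planck_radiance(lam, temp):
    """Spectral radiance of a black body, Eq. (1)."""
    return (2 * np.pi * H * C**2 / lam**5) / (np.exp(H * C / (lam * K * temp)) - 1)

def observation(lam, d, temp, eps, r_v, sigma):
    """Observation model of Eq. (4): sensitivity x air attenuation x emission.

    lam [m], d [m], temp [K]; eps is the emissivity; r_v and sigma are the
    wavelength-dependent calibration parameters (placeholders here).
    """
    return r_v * np.exp(-sigma * d) * eps * planck_radiance(lam, temp)
```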

Proposed method

In this paper, we propose a linear approximate representation that reduces the number of observations required for calibration to as few as three measurements and enables an analytical solution for $d$ and $T$ by solving a linear system of equations.

Linear approximation by Wien’s distribution law

Black body radiation can be approximated by Wien's distribution law when $hc/\lambda k T \gg 1$, i.e., in the short-wavelength region of the thermal radiation spectrum28. According to Wien's approximation, black body radiation can be represented by

$$M_e(\lambda; T) = \frac{2\pi h c^2}{\lambda^5} \exp\left(-\frac{hc}{\lambda k T}\right). \quad (5)$$

By taking the ratio of two observations $I(\lambda_i)$, $I(\lambda_j)$ at two different wavelengths and using Eqs. (4) and (5), we get

$$\frac{I(\lambda_i)}{I(\lambda_j)} = \frac{R_v(\lambda_i)\,\lambda_j^5}{R_v(\lambda_j)\,\lambda_i^5} \exp\left(-(\sigma(\lambda_j)-\sigma(\lambda_i))\,d\right) \exp\left(-\frac{hc(\lambda_i-\lambda_j)}{k}\,\frac{1}{T}\right). \quad (6)$$

Taking the logarithm of both sides, we get

$$I_{i,j} = \log\frac{I(\lambda_i)}{I(\lambda_j)} = C_1^{i,j} + C_2^{i,j}\,d + C_3^{i,j}\,\tilde{T}, \quad (7)$$

where $\tilde{T} = 1/T$ is the inverse temperature and the constants are given by

$$C_1^{i,j} = \log\frac{R_v(\lambda_i)\,\lambda_j^5}{R_v(\lambda_j)\,\lambda_i^5}, \quad (8)$$
$$C_2^{i,j} = -(\sigma(\lambda_j) - \sigma(\lambda_i)), \quad (9)$$
$$C_3^{i,j} = -\frac{hc(\lambda_i - \lambda_j)}{k}. \quad (10)$$

Thus, the logarithm of the ratio of measurements $I_{i,j}$ varies linearly with the distance $d$ and the inverse temperature $\tilde{T}$. Using measurements at three different wavelengths $\lambda_i$, $\lambda_j$, and $\lambda_l$, two measurement pairs are obtained as follows:

$$I_{i,j} = C_1^{i,j} + C_2^{i,j}\,d + C_3^{i,j}\,\tilde{T}, \quad (11)$$
$$I_{l,j} = C_1^{l,j} + C_2^{l,j}\,d + C_3^{l,j}\,\tilde{T}. \quad (12)$$

Denoting the matrix as

$$A = \begin{pmatrix} C_2^{i,j} & C_3^{i,j} & C_1^{i,j} \\ C_2^{l,j} & C_3^{l,j} & C_1^{l,j} \\ 0 & 0 & 1 \end{pmatrix}, \quad (13)$$

we obtain

$$\begin{pmatrix} I_{i,j} \\ I_{l,j} \\ 1 \end{pmatrix} = A \begin{pmatrix} d \\ \tilde{T} \\ 1 \end{pmatrix}. \quad (14)$$

As the matrix $A$ has the form of an affine transformation1, its elements can be estimated from at least three observations of the object at different distances and temperatures, analogous to traditional image processing. Once the matrix is obtained, the distance $d$ and the inverse temperature $\tilde{T}$ can be recovered by multiplying by the inverse of $A$, as shown in Fig. 1.
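For concreteness, the following is a minimal NumPy sketch of this procedure, under our own (hypothetical) naming: the six free entries of $A$ are fitted in the least-squares sense from $N \geq 3$ calibration observations with known distance and temperature, and an observation pair is then inverted in closed form.

```python
import numpy as np

def fit_affine(obs_pairs, scene_params):
    """Estimate the affine matrix A of Eq. (13) from N >= 3 correspondences.

    obs_pairs    : (N, 2) array of log-ratio pairs (I_ij, I_lj)
    scene_params : (N, 2) array of known (d [m], 1/T [1/K]) per calibration shot
    """
    n = len(obs_pairs)
    x = np.hstack([scene_params, np.ones((n, 1))])     # rows: [d, 1/T, 1]
    m, *_ = np.linalg.lstsq(x, obs_pairs, rcond=None)  # (3, 2) solution
    return np.vstack([m.T, [0.0, 0.0, 1.0]])           # append fixed row [0, 0, 1]

def invert_affine(a, i_ij, i_lj):
    """Closed-form recovery of (d, T) from one observation pair via Eq. (14)."""
    d, t_inv, _ = np.linalg.inv(a) @ np.array([i_ij, i_lj, 1.0])
    return d, 1.0 / t_inv
```

In our experiments, scene_params would hold the four calibration points A–D described below, e.g. (1.0, 1/333.15) for the observation at 1 m and 60 °C.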

Fig. 1

The relationship between the pairs of observations $(I_{i,j}, I_{l,j})$ and the pair of distance and temperature $(d, T)$ can be expressed by an affine transformation. We estimate the distance and temperature $(d, T)$ from the observations $(I_{i,j}, I_{l,j})$ using the affine transformation. The affine transformation matrix can be obtained from at least three corresponding points. In this study, we perform end-to-end calibration of the entire observation system using four corresponding points.

Extension to photometric calibration of observation system

In our implementation, we add one extra measurement to include the calibration of the response function of the multispectral LWIR camera. In multispectral LWIR systems, including the prior work2 and ours, several bandpass filters are placed in front of the camera to obtain multi-spectral LWIR images. Unfortunately, this type of setup suffers from the narcissus effect, which is caused by the camera's self-emission reflected off the filter. To mitigate this effect, an external shutter is often used, as shown in Fig. 2. The observed images with the shutter open, $I_{\text{open}}(\lambda)$, and closed, $I_{\text{close}}(\lambda)$, are expressed as

$$I_{\text{open}}(\lambda) = I_{\text{scene}}(\lambda) + I_{\text{ref}}(\lambda), \quad (15)$$
$$I_{\text{close}}(\lambda) = I_{\text{ref}}(\lambda) + I_{\text{shutter}}(\lambda), \quad (16)$$

respectively, where $I_{\text{scene}}(\lambda)$ is the intensity of the scene, $I_{\text{ref}}(\lambda)$ is the intensity reflected off the filter, and $I_{\text{shutter}}(\lambda)$ is the intensity of the thermal radiation from the shutter. By subtracting Eq. (16) from Eq. (15) and solving for $I_{\text{scene}}(\lambda)$, we get

$$I_{\text{scene}}(\lambda) = I_{\text{open}}(\lambda) - I_{\text{close}}(\lambda) + I_{\text{shutter}}(\lambda) \quad (17)$$
$$= I_{\text{diff}}(\lambda) + I_{\text{shutter}}(\lambda), \quad (18)$$

where we replace $I_{\text{open}}(\lambda) - I_{\text{close}}(\lambda)$ with $I_{\text{diff}}(\lambda)$ for simplicity. The previous work estimates the intensity of the thermal radiation from the shutter, $I_{\text{shutter}}(\lambda)$, by directly measuring the temperature of the shutter and applying Planck's law2. This approach, however, is not very accurate, because the ideal model does not match the real-world behavior of the shutter. In our method, we estimate $I_{\text{shutter}}(\lambda)$ from the measurements of the calibration target, together with the affine matrix $A$.

Fig. 2

Experimental setup. (a) We build a multi-spectral LWIR measurement system consisting of an LWIR camera, a filter wheel with three band-pass filters, and an external shutter. (b) We capture a black body furnace as the target object. The black body furnace and the camera are placed in a straight line, and images are captured at each temperature and distance.

The observations of the calibration target at the combinations of two temperatures $T_1, T_2$ and two distances $d_1, d_2$ are expressed by Eq. (4) as

$$I_{\text{scene}}(\lambda; d_1, T_1) = R_v(\lambda) \exp(-\sigma(\lambda) d_1)\, \epsilon\, M_e(\lambda; T_1), \quad (19)$$
$$I_{\text{scene}}(\lambda; d_1, T_2) = R_v(\lambda) \exp(-\sigma(\lambda) d_1)\, \epsilon\, M_e(\lambda; T_2), \quad (20)$$
$$I_{\text{scene}}(\lambda; d_2, T_1) = R_v(\lambda) \exp(-\sigma(\lambda) d_2)\, \epsilon\, M_e(\lambda; T_1), \quad (21)$$
$$I_{\text{scene}}(\lambda; d_2, T_2) = R_v(\lambda) \exp(-\sigma(\lambda) d_2)\, \epsilon\, M_e(\lambda; T_2). \quad (22)$$

By taking the ratio of two observations at different distances but the same temperature, we get

$$\frac{I_{\text{scene}}(\lambda; d_1, T_1)}{I_{\text{scene}}(\lambda; d_2, T_1)} = \frac{I_{\text{scene}}(\lambda; d_1, T_2)}{I_{\text{scene}}(\lambda; d_2, T_2)} = \frac{\exp(-\sigma(\lambda) d_1)}{\exp(-\sigma(\lambda) d_2)}. \quad (23)$$

Substituting Eq. 18 into Eq. 23 and solving for Ishutter(λ) yields

$$I_{\text{shutter}}(\lambda) = \frac{I_{\text{diff}}(\lambda; d_1, T_1)\, I_{\text{diff}}(\lambda; d_2, T_2) - I_{\text{diff}}(\lambda; d_2, T_1)\, I_{\text{diff}}(\lambda; d_1, T_2)}{I_{\text{diff}}(\lambda; d_2, T_1) + I_{\text{diff}}(\lambda; d_1, T_2) - I_{\text{diff}}(\lambda; d_1, T_1) - I_{\text{diff}}(\lambda; d_2, T_2)}. \quad (24)$$

Equation (24) shows that the intensity from the shutter $I_{\text{shutter}}(\lambda)$ can be obtained from four observations at different temperatures and distances. Finally, all calibration parameters are expressed in the measurement domain instead of as estimated physical coefficients of the system.
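As a sketch, Eq. (24) is a ratio of cross terms of the four shutter-subtracted observations and can be evaluated per wavelength (or per pixel) directly; the helper name below is our own:

```python
import numpy as np

def shutter_intensity(i11, i22, i21, i12):
    """Eq. (24): shutter radiation from four shutter-subtracted observations.

    i_xy = I_diff(lambda; d_x, T_y), the open-minus-closed measurement of the
    calibration target at distance d_x and temperature T_y.
    """
    num = i11 * i22 - i21 * i12
    den = i21 + i12 - i11 - i22
    return num / den
```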

Experiments

Validity of Wien’s approximation

Our proposed method uses Wien's approximation to transform the non-linear observation model into a linear model. Here we discuss the validity of Wien's approximation for black body radiation. The temperature and wavelength ranges of the target objects in this study are 270–370 K and 8–13 μm, respectively. Figure 3 shows the spectral radiance of Planck's law and Wien's law at 300 K. The relative error between the two models is 0.89%, indicating that Wien's approximation is sufficiently accurate over the target range of our study.
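This comparison is easy to reproduce numerically; the snippet below evaluates both laws over 8–13 μm at 300 K. The exact error figure depends on how the deviation is aggregated over the band, so the mean relative error computed here is indicative and need not match the 0.89% reported above exactly.

```python
import numpy as np

H, C, K = 6.62607015e-34, 2.99792458e8, 1.380649e-23  # SI constants

lam = np.linspace(8e-6, 13e-6, 500)  # target band: 8-13 um [m]
T = 300.0                            # temperature [K]

planck = (2 * np.pi * H * C**2 / lam**5) / (np.exp(H * C / (lam * K * T)) - 1)
wien = (2 * np.pi * H * C**2 / lam**5) * np.exp(-H * C / (lam * K * T))

rel_err = np.abs(planck - wien) / planck  # = exp(-hc / (lam k T))
print(f"mean: {rel_err.mean():.2%}, max: {rel_err.max():.2%}")
```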

Fig. 3

Comparison of the spectral radiance of Planck's law and Wien's law at 300 K. Wien's law shows a similar spectral radiance in our target spectral range (8–13 μm).

Real-world experiments

Multi-spectral LWIR observation system

We built an observation system similar to that of the previous work2, as shown in Fig. 2a. Only Nagase et al.'s system2 is reproducible, as it uses off-the-shelf components, while the other systems3,13–15 are not publicly accessible; hence we compare only against Nagase et al.2. The system consists of a thermal camera (FLIR Boson 640, 40 mm, NETD = 40 mK), a filter wheel with three narrow bandpass filters, and an external shutter. The wavelengths of the filters are $\lambda_i$ = 8 μm (CWL 8.248 μm, FWHM 452 nm), $\lambda_j$ = 9 μm (CWL 9.127 μm, FWHM 545 nm), and $\lambda_l$ = 10 μm (CWL 10.400 μm, FWHM 737 nm), respectively. The shutter is sprayed with black body paint and cooled using a Peltier device. The shutter is used to remove the narcissus effect and to reduce imaging noise.

Data collection

We capture a black body furnace while changing its temperature and distance, as shown in Fig. 2b. The temperature is set to four values: 60 °C, 70 °C, 80 °C, and 90 °C. The distance ranges from 1 to 15 m at 1 m intervals. We use the data at 60 °C and 90 °C for calibration and at 70 °C and 80 °C for evaluation. Images at the three wavelengths are captured by switching the filters for each distance and temperature.

Calibration

We use the following four pairs of distance and temperature to estimate the matrix $A$ in our proposed method: (1 m, 60 °C), (10 m, 60 °C), (1 m, 90 °C), and (10 m, 90 °C). For the previous work, the camera sensitivity $R_v$ and the extinction coefficient $\sigma$ are estimated during calibration; for these estimations, we use all distance measurements at 60 °C and 90 °C, 30 observations in total.

Results

First, we visualize all observation pairs $(I_{i,j}, I_{l,j})$ in Fig. 4. The points from 1 to 15 m at each temperature are connected sequentially in order of decreasing distance. Points A to D are the four points used to estimate the affine matrix. The magenta dotted lines represent the projections of points where the depth $d$ ranges from 1 to 15 m and the temperature $T$ takes the values 60 °C, 70 °C, 80 °C, and 90 °C. Each point is regularly aligned and matches the magenta grid. These results show that our proposed method can estimate the depth and temperature using an affine transformation matrix estimated from only four observations.

Fig. 4

A visualization of all observation pairs $(I_{i,j}, I_{l,j})$. Each solid line connects observations at the same temperature, with color indicating the temperature. We use the four points A–D to estimate the affine matrix. The magenta dotted lines represent the projected points of depth (from 1 to 15 m in steps of 1 m) and temperature (from 60 to 90 °C in steps of 10 °C) using the estimated matrix.

Next, we compare the distances and temperatures estimated by our method with those of the previous method2, as shown in Fig. 5. The green and orange lines are the results estimated from the observations at 70 °C and 80 °C, respectively, using the affine matrix obtained from the four points A–D in Fig. 4. We use the root mean square error (RMSE) between the ground truth and the estimates for quantitative evaluation. Although our method uses significantly less calibration data (4 vs. 30 observations), both distances and temperatures are estimated accurately, with RMSE at the same level.

Fig. 5

Comparison of the estimated depth and temperature. Our proposed method achieves a similar accuracy to the previous work2 while the amount of calibration data is reduced.

Demonstration

For a demonstration of our method, we apply the algorithm to the entire image. Figure 6 shows qualitative results on real-world scenes, including a dark environment. The target objects are a pot, a clothes iron, an oven, a griddle, and a motorbike, as shown in the upper row of Fig. 6. The distances to the objects are roughly measured with a scale. The calibration is carried out using the black body furnace. We use the same filters as in the experiments above, and the depth is estimated pixel by pixel using the same algorithm, as sketched below. The background of each image is masked out in the same way as in the previous work2.
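Since the per-pixel estimate is a single matrix multiplication, the whole-image computation vectorizes naturally; a sketch under our own naming, assuming the two log-ratio images have already been formed from shutter-corrected measurements (Eq. (18)):

```python
import numpy as np

def depth_temperature_maps(a, i_ij_img, i_lj_img):
    """Pixel-wise closed-form inversion of Eq. (14).

    i_ij_img, i_lj_img : (H, W) log-ratio images for the two filter pairs
    Returns per-pixel depth [m] and temperature [K].
    """
    a_inv = np.linalg.inv(a)
    obs = np.stack([i_ij_img, i_lj_img, np.ones_like(i_ij_img)], axis=-1)
    scene = obs @ a_inv.T            # per pixel: a_inv @ [I_ij, I_lj, 1]
    return scene[..., 0], 1.0 / scene[..., 1]
```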

Fig. 6

Demonstration on real scenes. The upper row shows the scene images and the targets. Target distances are roughly measured using a scale, and the room light is turned off during the measurements. The bottom row shows the estimated results. The background is manually masked out. The depth of the target can be measured in dark scenes.

The estimated scene depth is shown in the bottom row of Fig. 6. The depths of the target objects are correctly estimated. Especially in the motorbike scene, our method works well in a dark environment, where other passive approaches, including stereo, depth from focus/defocus, and shape from shading, are not applicable.

Conclusion

This paper presents a simplified approach for absorption-based multi-wavelength LWIR ranging that avoids time-consuming calibration and non-linear optimization. By applying Wien's approximation to Planck's law of thermal radiation, the observation model is linearized, and the relationship between the measured intensities and the scene parameters, i.e., temperature and distance, is expressed by an affine transformation. The inverse problem can be solved analytically by estimating the affine matrix from only four measurements. Experimental results in a real-world environment show that the method can estimate both temperature and distance without loss of accuracy compared to existing methods.

In future work, we are interested in exploring the calibration method using fewer images or improving the accuracy using learning-based approaches.

Acknowledgements

A part of this work was supported by JSPS Kakenhi Grant #JP-22K18420.

Author contributions

T.K., R.N., H.M., and W.C. conducted the experiments. T.K., R.N., and K.T. conducted the coding and evaluations. All authors reviewed the manuscript.

Data availability

The datasets analyzed during the current study are available from the corresponding author upon reasonable request.

Declarations

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

1. Szeliski, R. Computer Vision: Algorithms and Applications. Texts in Computer Science (Springer International Publishing, 2022).
2. Nagase, Y., Kushida, T., Tanaka, K., Funatomi, T. & Mukaigawa, Y. Shape from thermal radiation: Passive ranging using multi-spectral LWIR measurements. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition 12651–12661 (2022). 10.1109/CVPR52688.2022.01233.
3. Gallastegi, U. D., Rueda-Chacon, H., Stevens, M. J. & Goyal, V. K. Absorption-based, passive range imaging from hyperspectral thermal measurements. arXiv preprint arXiv:2308.05818 (2023).
4. Brown, M., Burschka, D. & Hager, G. Advances in computational stereo. IEEE Trans. Pattern Anal. Mach. Intell. 25, 993–1008. 10.1109/TPAMI.2003.1217603 (2003).
5. Poggi, M., Tosi, F., Batsos, K., Mordohai, P. & Mattoccia, S. On the synergies between machine learning and binocular stereo for depth estimation from images: A survey. IEEE Trans. Pattern Anal. Mach. Intell. 10.1109/TPAMI.2021.3070917 (2021).
6. Lu, Y. & Lu, G. An alternative of LiDAR in nighttime: Unsupervised depth estimation based on single thermal image. In 2021 IEEE Winter Conference on Applications of Computer Vision (WACV) 3832–3842 (2021). 10.1109/WACV48630.2021.00388.
7. Pentland, A. P. A new sense for depth of field. IEEE Trans. Pattern Anal. Mach. Intell. 10.1109/TPAMI.1987.4767940 (1987).
8. Carvalho, M., Le Saux, B., Trouvé-Peloux, P., Almansa, A. & Champagnat, F. Deep depth from defocus: How can defocus blur improve 3D estimation using dense neural networks? In Computer Vision: ECCV 2018 Workshops, vol. 11129, 307–323 (2019). 10.1007/978-3-030-11009-3_18.
9. Kim, N., Choi, Y., Hwang, S. & Kweon, I. S. Multispectral transfer network: Unsupervised depth estimation for all-day vision. Proc. AAAI Conf. Artif. Intell. 10.1609/aaai.v32i1.12297 (2018).
10. Yun, S. et al. STheReO: Stereo thermal dataset for research in odometry and mapping. In 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 3857–3864 (2022). 10.1109/IROS47612.2022.9981857.
11. Shin, U., Park, J. & Kweon, I. S. Deep depth estimation from thermal image. In 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 1043–1053 (2023). 10.1109/CVPR52729.2023.0010.
12. Gade, R. & Moeslund, T. B. Thermal cameras and applications: A survey. Mach. Vis. Appl. 25, 245–262. 10.1007/s00138-013-0570-5 (2014).
13. Gallastegi, U. D., Rueda-Chacon, H., Stevens, M. J. & Goyal, V. K. Absorption-based ranging from ambient thermal radiation without known emissivities. In 2022 Conference on Lasers and Electro-Optics (CLEO) 1–2 (2022). 10.1364/CLEO_SI.2022.STh5J.3.
14. Gallastegi, U. D., Rueda-Chacón, H., Stevens, M. J. & Goyal, V. K. Absorption-based hyperspectral thermal ranging: Performance analyses, optimization, and simulations. Opt. Express 32, 151. 10.1364/OE.507927 (2024).
15. Bao, F. et al. Heat-assisted detection and ranging. Nature 619, 743–748. 10.1038/s41586-023-06174-6 (2023).
16. Hagen, N. Survey of autonomous gas leak detection and quantification with snapshot infrared spectral imaging. J. Opt. 22, 103001. 10.1088/2040-8986/abb1cf (2020).
17. Tremblay, P. et al. Standoff gas identification and quantification from turbulent stack plumes with an imaging Fourier-transform spectrometer. In SPIE Defense, Security, and Sensing (eds Vo-Dinh, T., Lieberman, R. A. & Gauglitz, G.) 76730H (2010). 10.1117/12.850127.
18. Sandsten, J., Weibring, P., Edner, H. & Svanberg, S. Real-time gas-correlation imaging employing thermal background radiation. Opt. Express 6, 92. 10.1364/OE.6.000092 (2000).
19. Naranjo, E., Baliga, S. & Bernascolle, P. IR gas imaging in an industrial setting. In SPIE Defense, Security, and Sensing (eds Dinwiddie, R. B. & Safai, M.) 76610K (Orlando, Florida, 2010). 10.1117/12.850137.
20. McRae, T. G. & Kulp, T. J. Backscatter absorption gas imaging: A new technique for gas visualization. Appl. Opt. 32, 4037. 10.1364/AO.32.004037 (1993).
21. Ohel, E. et al. IEEE 24th Convention of Electrical & Electronics Engineers in Israel 285–289 (2006). 10.1109/EEEI.2006.321072.
22. Eisele, A. et al. Advantages using the thermal infrared (TIR) to detect and quantify semi-arid soil properties. Remote Sens. Environ. 163, 296–311. 10.1016/j.rse.2015.04.001 (2015).
23. Cao, L. et al. LWIR hyperspectral image classification based on a temperature-emissivity residual network and conditional random field model. Int. J. Remote Sens. 43, 3744–3768. 10.1080/01431161.2022.2105667 (2022).
24. Van Der Meer, F. D. et al. Multi- and hyperspectral geologic remote sensing: A review. Int. J. Appl. Earth Obs. Geoinform. 14, 112–128. 10.1016/j.jag.2011.08.002 (2012).
25. Vaughan, R., Calvin, W. M. & Taranik, J. V. SEBASS hyperspectral thermal infrared data: Surface emissivity measurement and mineral mapping. Remote Sens. Environ. 85, 48–63. 10.1016/S0034-4257(02)00186-4 (2003).
26. Neinavaz, E., Schlerf, M., Darvishzadeh, R., Gerhards, M. & Skidmore, A. K. Thermal infrared remote sensing of vegetation: Current status and perspectives. Int. J. Appl. Earth Obs. Geoinform. 102, 102415. 10.1016/j.jag.2021.102415 (2021).
27. Manolakis, D. et al. Longwave infrared hyperspectral imaging: Principles, progress, and challenges. IEEE Geosci. Remote Sens. Mag. 7, 72–100. 10.1109/MGRS.2018.2889610 (2019).
28. Vollmer, M. & Möllmann, K.-P. Infrared Thermal Imaging: Fundamentals, Research and Applications 2nd edn. (Wiley-VCH Verlag GmbH & Co. KGaA, 2017).
