MethodsX. 2022 Apr 26;9:101712. doi: 10.1016/j.mex.2022.101712

An approach for monitoring temperature on fruit surface by means of thermal point cloud

Nikos Tsoulias a, Sven Jörissen a,b, Andreas Nüchter b
PMCID: PMC9096681  PMID: 35572811

Abstract

Heat and excessive solar radiation can produce abiotic stress during apple maturation, compromising fruit quality. Therefore, monitoring the temperature on the fruit surface (FST) over the growing period can help identify thresholds above which physiological disorders such as sunburn may occur in apple.

Current approaches neglect the spatial variation of FST and have reduced repeatability, resulting in unreliable predictions. In this study, LiDAR laser scanning and thermal imaging were employed to detect the temperature on the fruit surface by means of a 3D point cloud. A process for calibrating the two sensors based on an active board target and producing a 3D thermal point cloud is suggested. After calibration, the sensor system was utilised to scan the fruit trees, and temperature values were assigned to the corresponding 3D point cloud based on the extrinsic calibration, while a fruit detection algorithm was applied to segment the FST of each apple.

  • The approach allows the calibration of a LiDAR laser scanner with a thermal camera in order to produce a 3D thermal point cloud.

  • The method can be applied to apple trees for segmenting FST in 3D, and can be utilised to predict several physiological disorders, including sunburn, on the fruit surface.

Keywords: Point cloud, Thermal point cloud, Fruit temperature, Sunburn, Food quality, Precision Horticulture

Graphical abstract



Specifications table

Subject Area: Agricultural and Biological Sciences
More specific subject area: Horticultural Technology; 3D point cloud analysis; Machine vision
Method name: An approach for monitoring fruit temperature in 3D
Name and reference of original method: The computational methods are inspired by the literature and primarily:
  • Borrmann, D., Nüchter, A., Dakulović, M., Maurović, I., Petrović, I., Osmanković, D., & Velagić, J. (2014). A mobile robot based system for fully automated thermal 3D mapping. Advanced Engineering Informatics, 28(4), 425–440. 10.1016/J.AEI.2014.06.002

  • Tsoulias, Nikos, Dimitrios S. Paraforos, George Xanthopoulos, and Manuela Zude-Sasse. "Apple shape detection based on geometric and radiometric features using a LiDAR laser scanner." Remote Sensing 12, no. 15 (2020): 2481. 10.3390/rs12152481

Resource availability: Python code (www.python.org) was written and a Python script file is provided.

Introduction

Excess solar radiation, elevated temperatures and low relative humidity are the main causes of abiotic stress on the fruit skin in orchards. In apples (Malus × domestica Borkh.), such field conditions suppress anthocyanin accumulation, resulting in low pigmentation on the fruit surface [42]. Elevated temperatures on the apple surface postpone starch degradation and consequently its conversion into sugars, especially at high altitudes [32]. Similar conditions can enhance the respiration rate or reduce net photosynthesis, which in turn influences fruit growth rate at the cell division stage [6,26], producing apples of smaller size [31]. After fruit set, high temperatures (35 °C) have also been associated with decreased firmness levels at harvest [40].

Several physiological disorders, including sunburn, can occur on the exposed surface of apples, compromising fruit quality and storability and increasing food waste. Longer periods of solar radiation and high temperatures are increasingly likely to occur over the growing season of apples due to climate change, increasing yield losses. Recent reports mentioned annual yield losses of up to 10% in the US and New Zealand [22,33], 40% in Australia [20], and 10 to 50% in South Africa [29] in apple orchards. Fruit skin temperature (FST) can be utilised as a reliable indicator to identify types of sunburn symptoms in apples [23]. Schrader et al. [24] found that when FST reaches around 52 °C for longer than 10 min, the epidermal cells exposed directly to the sun die, destroying the photosynthetic mechanism of the fruit. The combination of high ultraviolet radiation and FST (46-49 °C) results in browning sunburn, discolouring the exposed peel due to chlorophyll degradation and producing different levels of bronzing in the flesh [21]. Furthermore, shaded fruit skin suddenly exposed to moderate temperatures (< 31 °C) may suffer photooxidative stress [9]. However, the incidence and severity of the damage depend on a complex interplay of these factors together with the biochemical, physiological and morphological condition of the apple, all of which are a function of the phenological stage, cultivar and adaptation to meteorological conditions. Monitoring FST over the season, together with structural characteristics of apple trees, provides decisive knowledge for management within the orchard.

Contact methods for measuring FST include pushing the sensory bulb of a thermometer under the peel of the apple or inserting thermocouples in the fruit surface. However, these techniques wound the fruit surface, decreasing the repeatability of the measurement [18], and neglect the spatial variation of temperature on the fruit surface, since the measurement takes place at a single location. Non-destructive techniques for measuring FST have advanced substantially with the implementation of numerous two-dimensional machine vision systems in agriculture, using colour or spectral information combined with thermal imaging. However, fruit localisation and segmentation for FST monitoring based only on thermal information may be biased or fail due to similar temperatures of fruit and background [11]. Chandel et al. [8] coupled an RGB camera with a thermal module to model FST, revealing a coefficient of determination (R2) of up to 0.90 against FST derived from a micro-climate sensor. A similar system used colour information to obtain apple size and infrared images to estimate FST in real time [25]. However, fruit segmentation based on captured images is susceptible to light variations and may be biased by similar colour or shading conditions of fruit, leaves and woody parts [17,37].

Three-dimensional (3D) vision systems have received attention in horticulture, allowing the limitations of 2D imaging methods to be overcome [14]. The shape of apple trees can be described using high-resolution 3D point cloud data, acquired either directly from cameras such as RGB-depth [10] or using photogrammetric techniques with spectral [5,16] and RGB cameras [12,15]. However, also in 3D analysis, the variation of light and shading conditions within the canopy reduces the quality of point clouds acquired from camera systems [30]. 3D point clouds and radiometric data can instead be generated with light detection and ranging (LiDAR) sensors, which operate on the time-of-flight (ToF) principle: an active laser diode emits a laser beam, and the reflected photons describe the object surface, overcoming the effects of varying light conditions. The sensor is typically mounted on a terrestrial platform and operates parallel to the tree rows, scanning the canopy surface from both sides and analysing each tree even in large areas such as landscapes and forests. Several studies in fruit production used LiDAR data to monitor tree geometry, such as canopy volume, over the growing period [7,41]. By means of such data, the spatio-temporal development of leaf area and woody parts was monitored in trees within two seasons, utilising geometric and radiometric features [39], and fruit detection approaches were proposed for segmenting shape and size in apples [13]. Despite the potential of 3D data in plant phenotyping, information on tree or fruit temperature is not present by default, although methods have been proposed and investigated in the fields of architecture [3,19] and robotics [1,4]. Recently, a terrestrial LiDAR laser scanner was coupled with a thermal camera for reconstructing 3D thermal point clouds of avocados [34].
Individual trees were scanned from the west and east sides to acquire leaf temperature, revealing a ± 5 °C mean bias error compared with manual readings from both sides, due to the asynchronicity of LiDAR data and camera pixels. However, the scope of the conducted studies has been limited to small-scale proofs of concept. No study has yet examined the suitability of thermal-LiDAR 3D sensing for estimating FST in field conditions, an essential step for modelling FST and improving sunburn management strategies.

The present study aimed to (i) develop a robust method for merging 3D LiDAR data with thermal images, (ii) evaluate the data fusion under laboratory and field conditions using a metal tree target, and (iii) segment apple fruit surface temperature from 3D thermal tree point clouds.

Materials and methods

Site description

The experiment was conducted at the Leibniz Institute of Agricultural Engineering and Bioeconomy (ATB) experimental station located in Marquardt, Germany (latitude 52.466274° N, longitude 12.57291° E). The orchard lies on an 8% slope with southeast orientation. The orchard is planted with trees of Malus × domestica Borkh. ‘Gala’ and ‘JonaPrince’, and pollinator trees ‘Red Idared’, each on M9 rootstock with 0.95 m distance between trees, trained as slender spindle, the form that constitutes the majority of apple trees in worldwide production, with an average tree height of 2.5 m. Trees are supported by horizontally parallel wires.

Instrumentation

A rigid aluminium frame carrying the sensors (Fig. 1a) was mounted on a rigid linear tooth-belt conveyor system (Module 115/42, IEF Werner, Germany) of 0.8 m length, employing a servo positioning controller (LV-servoTEC S2, IEF Werner, Germany) (Fig. 1b), to perform intrinsic and extrinsic calibration using an active pattern with clearly defined heat sources (m = 30). The linear conveyor moved at 20 mm s−1 (± 0.05 mm accuracy) forward speed. A mobile 2D LiDAR sensor (LMS-511, Sick AG, Waldkirch, Germany) was mounted vertically on the metal frame 0.7 m above ground level. The LiDAR sensor was configured with a 0.1667° angular resolution, 25 Hz scanning frequency, a scanning angle of 180° and a wavelength of 905 nm. Additionally, a thermal camera (A655sc, FLIR Systems Inc., MA, USA) was placed 0.2 m above the laser scanner. The camera has a spatial resolution of 640 × 480 pixels at 50 Hz, a spectral range from 7.5 to 14 μm, an operational temperature range from −40 °C to 150 °C and a thermal resolution < 0.05 °C. A lens (T198065, FLIR Systems Inc., MA, USA) with a focal length of 6.5 mm (80° diagonal) was attached to the camera. The calibration was carried out in a room with no windows and with controlled ventilation and temperature (15 °C). The LiDAR and the thermal camera were connected via Ethernet to a laptop with software developed in LabVIEW (version NXG 5.1, National Instruments, Texas, USA) for data acquisition. The positioning controller of the linear conveyor was connected to the same computer via an RS-232 serial port using the S2 Commander software (version 4.1.4201.1.1, IEF Werner, Germany) for configuration and operation.

Fig. 1.


Representation of (a) the sensor-frame system; mounted on (b) linear tooth-belt and (c) circular conveyor system.

After calibration, the phenotype sensing system was mounted on a circular conveyor platform established in the experimental apple orchard (TechGarden), employing an electric motor operating at 50 Hz (DRN71, SEW Eurodrive, Germany) and a stainless-steel chain with mechanical suspensions for varying plant sensors (Fig. 1c). A real-time kinematic global navigation satellite system (RTK-GNSS; AgGPS 542, Trimble, Sunnyvale, CA, USA) was used to geo-reference each individual scanning profile of the 3D point cloud, while an inertial measurement unit (IMU; MTi-G-710, XSENS, Enschede, Netherlands) arranged on the sensor frame acquired orientation information. The root mean square error (RMSE) of orientation was 0.25° for roll (φ), pitch (θ) and yaw (ψ). The horizontal and vertical accuracy of the RTK-GNSS is ± 25 mm + 2 ppm and ± 37 mm + 2 ppm, respectively. The IMU was placed 0.3 m beside the LiDAR sensor, while the receiver antenna of the RTK-GNSS was mounted 0.6 m above the laser scanner (Fig. 1). This phenotyping platform was established in a single row of the experimental apple orchard in 2020 and enables the automated monitoring of 109 trees in one row of 84 m length.

Data pre-processing

The accurate projection of thermal information on the 3D point cloud requires calibration of the sensor frame system. More specifically, the process consists of two parts: (i) the intrinsic calibration of the thermal camera for determining camera matrix and distortion parameters, and (ii) the extrinsic calibration between camera and LiDAR coordinate system to define rotation and translation. The calibration tool chain is written in Python 3.8 (Python Software Foundation) and uses the OpenCV library (Bradski & Kaehler, 2008) for image processing and Open3D [36] for point cloud processing. The following sections describe the methodology behind the calibration tool chain.

Intrinsic calibration

The intrinsic calibration of a thermal camera is typically estimated by detecting geometric features from patterns in the image, such as corners of chessboard patterns of known dimensions. However, chessboard patterns are not suitable for calibrating thermal cameras, since even when passively heated, the temperature difference between black and white areas is not sufficient for distinct corners to be identified. Therefore, as suggested in [2], an actively heated lightbulb pattern was constructed (Fig. 2a,b). The pattern consisted of a wooden board (500 × 600 mm) containing 30 (m) 12 V lightbulbs of 4 mm diameter, arranged in a 5 × 6 grid with 100 mm spacing.

Fig. 2.


Lightbulb blob calibration pattern, (a) front view with dimensions and (b) back view with handles.

The lightbulbs appear as blobs in thermal images; therefore, OpenCV's SimpleBlobDetector (Bradski & Kaehler, 2008) algorithm was used to obtain their respective coordinates. The detector is based on regions of connected points, determined by colour thresholding, grouping, and the size of the detected blobs. More specifically, the source image was converted to several binary images by applying a series of thresholds, starting with minThreshold and ending with maxThreshold in thresholdStep increments. The segmented white pixels were grouped and their overall shape and size estimated, while the grouped pixels across all thresholded images were combined to calculate the centre and radius of each individual blob. Moreover, the admissible size of the blobs (minArea, maxArea) was configured. Three sets of parameters were empirically determined and are further denoted as sensitivity (s).
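The multi-threshold grouping described above can be sketched without OpenCV. The following numpy/scipy fragment is illustrative only (the actual pipeline used OpenCV's SimpleBlobDetector, and this sketch omits the merging of centres across thresholds): it thresholds an 8-bit image at several levels, labels connected components, and keeps components within the configured area range.

```python
import numpy as np
from scipy import ndimage

def blob_centers(img8, min_t=20, max_t=220, step=15, min_area=1, max_area=60):
    """Multi-threshold blob detection in the spirit of SimpleBlobDetector.

    One binary image per threshold; connected white regions whose pixel
    count lies in [min_area, max_area] contribute a centre (x, y).
    """
    centers = []
    for t in range(min_t, max_t, step):
        binary = img8 > t                    # binary image for this threshold
        labels, n = ndimage.label(binary)    # group connected pixels
        for lbl in range(1, n + 1):
            ys, xs = np.nonzero(labels == lbl)
            if min_area <= len(xs) <= max_area:
                centers.append((xs.mean(), ys.mean()))
    return centers
```

The default parameters mirror sensitivity 1 of Table 1; a real detector would additionally merge centres found at different thresholds into one blob per lightbulb.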

The intrinsic parameters of the thermal camera, namely the focal length in the x (fx) and y (fy) directions, the camera centre (cx, cy), as well as the radial (k1, k2, k3) and tangential (p1, p2) distortion coefficients, were determined using [35]. The relationship between a 3D real-world point O = (X, Y, Z)T and its 2D projection o = (x, y)T on the sensor plane is defined by

\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} f_x X/Z + c_x \\ f_y Y/Z + c_y \end{pmatrix} \qquad (1)

Expressing (Eq. (1)) with homogeneous coordinates yields

Z \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} \qquad (2)

with

A = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix} \qquad (3)

as the camera matrix. The distorted image points (xd,yd)T are calculated as

\begin{pmatrix} x_d \\ y_d \end{pmatrix} = \begin{pmatrix} x(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + 2 p_1 x y + p_2(r^2 + 2x^2) \\ y(1 + k_1 r^2 + k_2 r^4 + k_3 r^6) + p_1(r^2 + 2y^2) + 2 p_2 x y \end{pmatrix} \qquad (4)

with r^2 = x^2 + y^2.
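The projection and distortion model of Eqs. (1)-(4) can be written compactly; below is a minimal numpy sketch (the function name and argument layout are our own). Note that, as in the OpenCV convention the calibration is based on, the distortion is applied to the normalised coordinates before the camera matrix maps them to pixels.

```python
import numpy as np

def project_point(P, A, dist=(0.0, 0.0, 0.0, 0.0, 0.0)):
    """Project a camera-frame 3D point to pixel coordinates (Eqs. 1-4).

    A is the camera matrix of Eq. 3; dist = (k1, k2, k3, p1, p2).
    """
    k1, k2, k3, p1, p2 = dist
    X, Y, Z = P
    x, y = X / Z, Y / Z                      # normalised pinhole coordinates
    r2 = x * x + y * y                       # r^2 = x^2 + y^2
    radial = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)   # Eq. 4
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    fx, fy = A[0, 0], A[1, 1]
    cx, cy = A[0, 2], A[1, 2]
    return np.array([fx * xd + cx, fy * yd + cy])
```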

For a calibration pattern of known dimensions with a set of m lightbulb features, n images were captured and the reprojection error of the detected features was minimised as

\sum_{i=1}^{n} \sum_{j=1}^{m} \left\| o_{ij} - \hat{o}(A, D, R_i, t_i, O_j) \right\|^2 \qquad (5)

where oij are the coordinates of feature j in image i and ô(A, D, Ri, ti, Oj) is the projection of the corresponding 3D point Oj with the distortion coefficients D, the camera matrix A, and the rotation matrix Ri and translation vector ti relating the camera and world coordinate systems.

The process of detecting the blobs in an image is visualised in Fig. 3. The image was initially scaled to its temperature range (T), consisting of the minimum and maximum temperature found, and converted to 8 bit to make the blobs more visible. The aforementioned blob detector was configured with a sensitivity s and applied to the image. If the total number of blobs was found, the blob detection was deemed successful; otherwise, the SimpleBlobDetector was reconfigured and applied again. If three different sensitivities did not yield a successful result, the image was rescaled, decreasing the temperature range by 5 degrees, and the blob detection was repeated. If, after 4 different temperature ranges, each analysed with the 3 sensitivity settings, the required number of blobs was still not found, the image was deemed a failure.
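The retry schedule of Fig. 3 is easy to express in code. The sketch below is a hypothetical reading of that flow chart: the detect_blobs argument stands in for the configured SimpleBlobDetector, and the 5-degree narrowing is applied to the lower bound of the range, as suggested by Fig. 4.

```python
import numpy as np

def scale_to_8bit(temps, t_min, t_max):
    """Clip the thermal image to [t_min, t_max] and rescale to 8 bit."""
    clipped = np.clip(temps, t_min, t_max)
    return ((clipped - t_min) / (t_max - t_min) * 255.0).astype(np.uint8)

def detect_pattern(temps, detect_blobs, m=30, t_max=60.0):
    """Retry schedule of Fig. 3: 4 temperature ranges x 3 sensitivities."""
    t_min = float(np.min(temps))
    for _ in range(4):                        # 4 temperature ranges
        img8 = scale_to_8bit(temps, t_min, t_max)
        for s in (1, 2, 3):                   # 3 sensitivity settings
            blobs = detect_blobs(img8, s)
            if len(blobs) == m:               # all lightbulbs found
                return blobs
        t_min += 5.0                          # narrow the range by 5 degrees
    return None                               # image deemed a failure
```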

Fig. 3.

Figure 3

Flow chart of blob detection process for one image. T stands for different temperature ranges, s is the sensitivity of the blob detector and n is the number of detected blobs.

This process compensates for the rather wide opening angle of the lens, which requires a large variety of pattern positions to cover the whole frame, as well as recording the pattern at an angle and at varying distances from the focus point. For example, blobs appear larger when the pattern is perpendicular to the image plane, while smaller blobs can indicate a high incidence angle. The sensitivity parameters utilised in this work, for this specific thermal camera and calibration pattern, are listed in Table 1. The different temperature ranges and the effects of the different sensitivity settings are visualised in Fig. 4 and Fig. 5, respectively.

Table 1.

Sensitivity parameters used for SimpleBlobDetector.

Sensitivity  minThreshold  maxThreshold  thresholdStep  minArea  maxArea
1            20            220           15             1        60
2            30            220           15             3        80
3            50            220           20             5        100

Fig. 4.


The effect of different temperature ranges of the 8-bit conversion of the image. (a) T = [20°C, 60°C], (b) T = [25°C, 60°C], (c) T = [30°C, 60°C], (d) T = [35°C, 60°C].

Fig. 5.


The effect of the three different sensitivity settings on the same image scaled to the temperature range T = [20°C, 60°C]. (a) sensitivity s = 1, (b) sensitivity s= 2, (c) sensitivity s = 3.

Extrinsic calibration

To obtain a 3D point cloud and a corresponding thermal image, the 2D LiDAR sensor and thermal camera were mounted on the linear conveyor as shown in Fig. 1b. The conveyor moves the sensor setup with a constant velocity perpendicular to the LiDAR scanning plane, thus yielding a 3D point cloud. Similar to (Eq. (2)), the relation of a point in LiDAR coordinates L = (XL, YL, ZL)T and the corresponding image point o = (x, y)T is defined by

\begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = A \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix} \begin{pmatrix} X_L \\ Y_L \\ Z_L \\ 1 \end{pmatrix} \qquad (6)

with A being the camera matrix from (Eq. (3)), R the rotation and t the translation between the coordinate systems of the thermal camera and the LiDAR laser scanner. The latter variables are known as the extrinsic parameters and are determined from the relation of 2D image features (o) to their corresponding 3D LiDAR coordinates (L), referred to as feature pairs. For such a set of point cloud and thermal image with m distinct feature pairs (oj, Lj), j = 1…m, the rotation R and translation t between camera and LiDAR coordinate systems are obtained by minimising

\sum_{j=1}^{m} \left\| o_j - \hat{o}(A, D, R, t, L_j) \right\|^2 \qquad (7)

with oj as the pixel coordinates of blob j, Lj as the corresponding 3D point in LiDAR coordinates, A and D as the camera matrix and distortion coefficients described in Eqs. (3) and (4), and ô(A, D, R, t, Lj) as the projection of Lj onto the image plane.
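Leaving distortion aside, Eq. (6) amounts to a rigid transform followed by the camera matrix and a perspective division; a minimal numpy sketch (function and argument names are our own):

```python
import numpy as np

def lidar_to_pixel(L, A, R, t):
    """Project a 3D LiDAR point into the thermal image (Eq. 6, distortion-free).

    R (3x3) and t (3,) are the extrinsic parameters; A is the camera matrix.
    """
    Pc = R @ np.asarray(L) + t        # LiDAR frame -> camera frame
    uvw = A @ Pc                      # apply the camera matrix
    return uvw[:2] / uvw[2]           # perspective division -> (x, y)
```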

To extract the 3D LiDAR coordinates (L) of the features from the point cloud, a processing pipeline was applied (Fig. 6). More specifically, the point cloud was cropped with an axis-aligned bounding box to remove the ceiling, floor and walls (Fig. 6a). This ensured that the biggest remaining plane in the point cloud was the calibration pattern itself. Open3D's plane detection, using the random sample consensus (RANSAC) algorithm [36], was then applied to detect the prominent plane in the point cloud (Fig. 6b, white points), and all remaining points were removed (Fig. 6b, red points). For the point cloud of the prominent plane, the bounding box was calculated and its dimensions were checked to ensure that they matched the size of the actual calibration pattern, and all filtered points were removed.
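The RANSAC plane step can be illustrated without Open3D. The following numpy sketch is a simplified stand-in for Open3D's plane segmentation (not the code used in the study): it repeatedly fits a plane through three sampled points and keeps the plane with the most inliers.

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.01, iters=200, rng=None):
    """Fit the dominant plane with RANSAC; return (normal, d, inlier mask).

    The plane is n.x + d = 0; inliers lie within dist_thresh of it.
    """
    rng = np.random.default_rng(rng)
    best_mask, best_n, best_d = None, None, None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                       # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n @ p0
        mask = np.abs(points @ n + d) < dist_thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_n, best_d = mask, n, d
    return best_n, best_d, best_mask
```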

Fig. 6.


Processing of point cloud: (a) box crop to remove walls, floor and ceiling, (b) plane detection, (c) downsampled data with local coordinate system.

A principal component analysis of the voxel-downsampled data (Fig. 6c) was performed to calculate the eigenvectors and corresponding eigenvalues. The eigenvectors and the centroid of the point cloud provided an initial estimate of the orientation and position of the calibration pattern (Tinitial) and of its local coordinate system relative to the laser scanner (Fig. 6c). To optimise this estimate, a point cloud of the calibration pattern was synthetically sampled to serve as a perfect model and was roughly aligned with the subsampled point cloud using Tinitial. Registering the model to the data using the iterative closest point (ICP) algorithm yields a correction transformation Tcorr. The concatenation of Tinitial and Tcorr from the ICP algorithm provides the final transformation Tfinal of the pattern relative to the scanner (Eq. (8))

T_{\mathrm{final}} = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix} = T_{\mathrm{initial}} \cdot T_{\mathrm{corr}} \qquad (8)

Once Tfinal was determined, the positions of the lightbulbs in LiDAR coordinates (L) were derived from the known geometry of the calibration pattern (cf. Fig. 2) (Fig. 7). Since the calibration pattern is fixed relative to both sensors, Eq. (7) then gives the desired rotation and translation between the sensors.
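The PCA-based initial estimate can be sketched as follows; this is a hedged numpy illustration of Tinitial only (the pipeline's ICP refinement is not reproduced here):

```python
import numpy as np

def initial_pose(points):
    """Initial pattern pose T_initial from centroid and principal axes.

    Rows of `points` are 3D points; the eigenvectors of the covariance
    give the orientation, the centroid gives the translation.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    cov = centered.T @ centered / len(points)
    _, eigvecs = np.linalg.eigh(cov)           # eigenvalues ascending
    R = eigvecs[:, ::-1].copy()                # principal axis first
    if np.linalg.det(R) < 0:                   # enforce a right-handed frame
        R[:, -1] *= -1
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = centroid
    return T
```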

Fig. 7.


Lightbulb position determination.

For the extrinsic calibration, a dataset of 21 scans with different calibration pattern positions within the scanning area was recorded. Of those 21 scans, 6 could not be processed properly, each having a section of the calibration pattern cut off due to poor placement within the scanning area. For each valid scan, the extrinsic parameters were determined according to the aforementioned pipeline (Fig. 3). Each set of parameters was then used to calculate the RMSE of the reprojection of all valid datasets according to

\mathrm{rmse} = \sqrt{ \frac{ \sum_{i=1}^{N} \sum_{j=1}^{M} \left\| m_{i,j} - \hat{m}_{i,j} \right\|^2 }{ N \cdot M } } \qquad (9)

with N being the number of valid scans, M the number of points per scan, m_{i,j} the jth point of scan i projected onto the image plane, and m̂_{i,j} the jth detected point of the calibration pattern in scan i.
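Eq. (9) translates directly into a few lines of numpy; the array shapes are an assumption of this sketch:

```python
import numpy as np

def reprojection_rmse(projected, detected):
    """Eq. 9: RMSE over N scans x M points between reprojected LiDAR
    features and detected pattern features, arrays of shape (N, M, 2)."""
    sq_norms = ((np.asarray(projected) - np.asarray(detected)) ** 2).sum(axis=-1)
    return float(np.sqrt(sq_norms.mean()))   # mean over N*M squared norms
```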

Data Fusion

Once all intrinsic and extrinsic parameters were determined, temperature values from the thermal image were assigned to the corresponding 3D points. All points in the point cloud were projected into the image plane (Fig. 7). If a projected point lay within the image plane, the corresponding temperature value was assigned; if the point fell outside the plane, a value outside the temperature range of the dataset, in this case −10 °C, was assigned. This ensured that points outside the field of view of the thermal camera were not omitted.
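The assignment rule can be sketched as a nearest-pixel lookup with a sentinel value; this is a simplified numpy illustration, with the function name and rounding scheme our own:

```python
import numpy as np

def assign_temperatures(pixels, thermal_img, sentinel=-10.0):
    """Assign each projected point the temperature at its pixel, or a
    sentinel outside the dataset's range when it falls off the image."""
    h, w = thermal_img.shape
    temps = np.full(len(pixels), sentinel)
    for i, (x, y) in enumerate(np.asarray(pixels).round().astype(int)):
        if 0 <= x < w and 0 <= y < h:
            temps[i] = thermal_img[y, x]   # row = y, column = x
    return temps
```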

Segmenting apple temperature

After calibration, the phenotype system was mounted on the circular conveyor in order to scan the fruit trees from both sides. The temperature values assigned to the corresponding 3D point cloud were based on the extrinsic calibration. Following Tsoulias et al. [27], rigid translations and rotations were applied to each point of the 3D point cloud, while the alignment of paired tree sides was carried out using ICP. A bivariate point density histogram was used to estimate the stem position of each tree (n = 20), while a cylindrical boundary was projected around the estimated stem positions to segment each individual tree.

To define the position and shape of apples, the geometric feature of curvature (C) was calculated for each point of each 3D tree point cloud using the k-nearest neighbours algorithm [28]. For this purpose, the local neighbourhood of points Pi = [xi, yi, zi] was analysed in 3D coordinates. The total number (N) of Pi within each tree's cloud was used to estimate the mean of all nearest neighbours

\tilde{P} = \frac{1}{N} \sum_{i=1}^{N} P_i \qquad (10)

Each Pi was mean-centred with P̃ per nearest-neighbour set, and the covariance matrix was computed and decomposed by singular value decomposition, producing the eigenvalues (λ1, λ2, λ3), which were ranked according to the decreasing percentage of variance explained in the data. The eigenvalues were scaled between 0 and 100, allowing the comparison of different clusters: the closer a value is to 100, the higher the likelihood that the local shape of the points is curved. A probability density function was used to define thresholds for curvature (Cth) and for the LiDAR's backscattered reflectance (Rth), defining the range of apple points in terms of curvature and backscattered reflectance (CA and RA). The points that fulfilled the criteria Cth ≤ CA and Rth ≤ RA were segmented and categorised as apples. This allowed the temperature values on the surface of apples to be derived by means of LiDAR (FSTLiDAR). The temperature on the fruit surface was also measured manually (FSTManual) (n = 285) with an infrared thermometer (Microscanner D501, Exergen, Watertown, USA) and compared with the corresponding averaged FSTLiDAR. The detected apples were categorised as west or east based on their position on the tree side.
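A hedged numpy sketch of the per-point curvature feature follows (brute-force neighbour search for clarity; the surface-variation ratio λmin/Σλ and the 0-100 scaling follow the description above, though the exact formulation in [28] may differ):

```python
import numpy as np

def curvature_per_point(points, k=10):
    """Surface-variation curvature per point from the eigenvalues of the
    local k-NN covariance, scaled to 0..100 over the cloud."""
    pts = np.asarray(points)
    c = np.zeros(len(pts))
    for i, p in enumerate(pts):
        d = np.linalg.norm(pts - p, axis=1)
        nbrs = pts[np.argsort(d)[:k]]            # k nearest neighbours (incl. p)
        centred = nbrs - nbrs.mean(axis=0)       # mean-centre with the local P~
        lam = np.linalg.eigvalsh(centred.T @ centred / k)  # ascending
        total = lam.sum()
        c[i] = lam[0] / total if total > 0 else 0.0
    return 100.0 * c / c.max() if c.max() > 0 else c
```

Flat regions (leaves, board-like surfaces) score near 0, while locally curved clusters such as fruit surfaces score higher, which is what the Cth threshold then separates.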

Evaluation

A metal tree frame with dimensions 2 m × 0.30 m × 0.05 m was constructed to assess the measuring uncertainty of the phenotypic system in terms of temperature (Fig. 8). Five bars, 0.30 m apart, were placed horizontally on each side of a metal trunk. Sphere targets of 60 mm (n = 3) and 80 mm (n = 12) diameter were used to assess the temperature derived by the phenotypic platform. The spheres were coated with white barium sulphate (BaSO4, CAS Number: 7727-43-7, Merck, Germany) and blackened urethane (S black, Avian Technologies, New London, NH, USA) for acquiring the minimum (SW) and maximum (SB) TLiDAR on the sphere surface. The phenotypic system was utilised to scan the metal frame indoors and outdoors in the orchard, while an infrared thermometer was used to manually acquire the temperature on the sphere surface (TManual).

Fig. 8.


Representation of the metal tree frame of known distances with sphere targets.

Descriptive statistics were applied to all datasets, capturing the minimum (min), maximum (max), mean and standard deviation (SD). A regression analysis was performed to quantify linear relationships between the manual measurements and the temperature detected by means of LiDAR, reporting the RMSE and mean bias error (MBE). Descriptive statistics were carried out in Matlab (v.R2018b, Mathworks Inc., Natick, MA, USA).

Results

Intrinsic calibration

Sample images for the intrinsic calibration process are shown in Fig. 9. The raw thermal image (left) is scaled to a temperature range (middle) and the blobs are detected and sorted (right) according to the visible colour scheme from red to violet. Of our dataset of 140 images, 10 were discarded, each having at least one blob missing due to poor alignment of the pattern while recording the images. The root mean squared error (RMSE) of the reprojection was 0.33.

Fig. 9.


(a): Raw Thermal Image, (b): Scaled to Temperature Range, (c): Detected and Sorted Blobs.

Extrinsic calibration

The calculated RMSE per scan is shown in Fig. 10. Calculating the element-wise mean of all extrinsic parameters yielded an RMSE of 1.82 (grey line); thus, it was decided to use the parameter set with the lowest RMSE (Nr. 6, 1.25).

Fig. 10.


Root Mean Square Error (RMSE) of Extrinsic Calibration per Scan. The RMSE yielded by averaging all parameters elementwise is shown as a line in grey, the Minimum RMSE is shown as a line in yellow.

Data fusion

An example of a point cloud of the calibration pattern containing LiDAR and temperature information is depicted in Fig. 11. Temperature values were scaled from 15 to 55 °C. The centres of the blobs showed a mean TLiDAR of 46.3 °C with an SD of 2.93 °C, while a less pronounced mean TLiDAR of 21.41 °C with an SD of 3.03 °C was found in the remaining points of the calibration pattern.

Fig. 11.


Point cloud of the calibration pattern.

Evaluation

The values of TLiDAR on the white (SW) and black (SB) surfaces were above 19.54 °C and below 19.84 °C, respectively (Table 2). The temperature difference between the spheres was marginal, since no passive heat was applied and the ambient temperature of the room remained at 19 °C. TLiDAR was related to TManual, revealing an adjusted coefficient of determination (R2adj) of 0.95 with an RMSE of 0.02 °C for SB, and 0.94 with 0.01 °C for SW. Generally, higher measuring uncertainty was noticed on the spheres when the metal construction was placed in field conditions (Fig. 12b), particularly for the black-coated spheres. The minimum and maximum TLiDAR differed by 1.5 °C on the surface of SB.

Table 2.

Results of LiDAR detected temperature (TLiDAR) of spheres on the metal tree indoors and outdoors (n = 15), regarding maximum (max) [°C], minimum (min) [°C], standard deviation (SD) [°C], mean bias error (MBE) [°C], root mean square error (RMSE) [°C], and adjusted coefficient of determination (R2adj).

              min [°C]  max [°C]  mean [°C]  SD    MBE    RMSE  R2adj
Indoors   SB  19.72     19.84     19.69      0.08  0.02   0.02  0.95
          SW  19.54     19.82     19.84      0.07  0.01   0.01  0.94
Outdoors  SB  20.81     22.31     22.20      0.67  -0.03  0.11  0.97
          SW  19.18     19.63     19.39      0.18  0.01   0.04  0.95

Fig. 12.


The resulting thermal point cloud of the (a) metal tree indoors and (b) outdoors.

The methodology was applied in the orchard to a total number of 285 apples, 130 days after full bloom. The temperature varied within the 3D point cloud of the trees (Fig. 13a). Tree organs above 2 m revealed reduced TLiDAR, not exceeding 18.2 °C. Moreover, the TLiDAR of stem points showed a mean value of 20.6 °C with a 0.65 °C standard deviation. The fruit detection algorithm detected the shape of 272 apples, with an 89.7% F1 score. The FSTLiDAR ranged between 16 and 22 °C (Fig. 13b).

Fig. 13.


Representation of (a) 3D thermal point cloud and (b) segmented temperature on fruit surface (FSTLiDAR) [°C] in sampled trees measured at DAFB 120.

The FSTManual was related to the FSTLiDAR of apples on the west and east sides of the trees, resulting in an R2adj of 0.91 and 0.99 with an RMSE of 0.25 and 0.01, respectively. The fruit located on the east side of the trees developed a higher average FSTLiDAR (18.8 ± 0.75 °C), while a less pronounced value (18.3 ± 0.61 °C) was observed on the west side (Fig. 14). In parallel, apples from both sides covered a similar range in terms of height.

Fig. 14.


Scatter plot and marginal box-Whisker plot of the segmented temperature on fruit surface (FSTLiDAR) and of fruit height, categorised based on the west (W) and east side (E) of the tree. The standard deviation is represented by lower and upper edges of the box, the dash in each box indicates the average.

Overall, the monitoring system demonstrated great potential for monitoring FSTLiDAR in apples. Moreover, the enhanced field of view of the LiDAR laser scanner can determine FST derived from the entire 3D tree profile, allowing fruit temperature to be modelled and decision making in the orchard to be improved. The frequent acquisition of FSTLiDAR can be utilised as a control measure to detect damaged fruit on the tree, increasing fruit storability and reducing food waste. The acquired FST information could be utilised in the future for predicting various abiotic stresses (e.g. sunburn) and understanding their effect on soluble solids content in relation to the position of the fruit in the tree canopy. The described methodology, with specific customisation based on sensor availability, could be utilised for heat-stress monitoring in other perennial specialty crops.

Conclusions

The developed methodology was able to register the thermal images on the LiDAR 3D point cloud with a lowest RMSE of 1.2 pixels. The metal board construction allowed the extrinsic calibration to be evaluated, yielding at best an RMSE of 0.02 °C with an R2adj of 0.95 in the lab, and an RMSE of 0.11 °C with an R2adj of 0.97 under field conditions. The system also provided meaningful information about the FSTLiDAR of apples, which correlated strongly with the FSTManual (R2adj = 0.99) on the east side of the tree. Apples on the east side of the tree showed higher FSTLiDAR values than those on the west side. In summary, the phenotyping system was able to detect the temperature on the apple surface, a result that can be utilised in the monitoring and prevention of fruit sunburn.

Declaration of Competing Interest

The authors declare no conflict of interest.

Acknowledgements

The project SHEET is part of the ERA-NET Cofund ICT-AGRI-FOOD, with funding provided by national sources [BLE] and co-funding by the European Union’s Horizon 2020 research and innovation program, Grant Agreement number 862665. This publication was supported by the Open Access Publication Fund of the University of Würzburg.

Footnotes

Supplementary material associated with this article can be found, in the online version, at doi:10.1016/j.mex.2022.101712.

Contributor Information

Nikos Tsoulias, Email: ntsoulias@atb-potsdam.de.

Andreas Nüchter, Email: andreas.nuechter@uni-wuerzburg.de.

Appendix. Supplementary materials

mmc1.zip (19.7KB, zip)

