Sensors (Basel, Switzerland). 2024 Aug 21;24(16):5409. doi: 10.3390/s24165409

A Comprehensive Review of LiDAR Applications in Crop Management for Precision Agriculture

Sheikh Muhammad Farhan 1, Jianjun Yin 1,*, Zhijian Chen 1, Muhammad Sohail Memon 1
Editor: Asim Biswas
PMCID: PMC11360157  PMID: 39205103

Abstract

Precision agriculture has revolutionized crop management and agricultural production, with LiDAR technology attracting significant interest among various technological advancements. This extensive review examines the various applications of LiDAR in precision agriculture, with a particular emphasis on its role in crop cultivation and harvesting. The introduction provides an overview of precision agriculture, highlighting the need for effective agricultural management and the growing significance of LiDAR technology. The prospective advantages of LiDAR for increasing productivity, optimizing resource utilization, managing crop diseases and pesticides, and reducing environmental impact are discussed. The introduction comprehensively covers LiDAR technology in precision agriculture, detailing airborne, terrestrial, and mobile systems along with their specialized applications in the field. The paper then reviews the various uses of LiDAR in crop cultivation, including crop growth and yield estimation, disease detection, weed control, and plant health evaluation. The use of LiDAR for soil analysis and management, including soil mapping and categorization and the measurement of moisture content and nutrient levels, is also reviewed. Additionally, the article examines how LiDAR is used for harvesting crops, including its use in autonomous harvesting systems, post-harvest quality evaluation, and the prediction of crop maturity and yield. Future perspectives, emergent trends, and innovative developments in LiDAR technology for precision agriculture are discussed, along with the critical challenges and research gaps that must be filled. The review concludes by emphasizing potential solutions and future directions for maximizing LiDAR’s potential in precision agriculture. This in-depth review of the uses of LiDAR provides helpful insights for academics, practitioners, and stakeholders interested in using this technology for effective and environmentally friendly crop management, which will eventually contribute to the development of precision agricultural methods.

Keywords: precision agriculture, LiDAR technology, crop management, disease detection, yield estimation, autonomous harvesting systems

1. Introduction

Precision agriculture is a developing area that uses modern technologies and data-driven methods to maximize crop yield [1,2]. Traditional agricultural practices have drawbacks for productivity, resource use, and crop management [3]. To address these challenges, the integration of LiDAR (light detection and ranging) technology in precision agriculture has emerged as a promising solution. Using laser pulses, LiDAR can produce precise 3D models of the surrounding area for use in remote sensing applications. Since its creation for mapping purposes, LiDAR has found extensive use in several sectors, including forestry, urban design, and, most recently, agriculture [4,5,6]. It is an important tool for crop management since it can record precise, high-resolution data. The introduction of LiDAR technology in precision agriculture has changed crop management techniques, allowing farmers to make educated choices based on accurate and real-time data [7]. LiDAR technology facilitates more targeted interventions and optimizes the use of resources such as water, fertilizers, and pesticides by providing precise data on crop, soil, and field conditions. Agricultural robots equipped with LiDAR enhance various functions, including crop monitoring, disease detection, weed management, yield estimation, mapping, autonomous navigation, and harvesting operations [8,9,10,11]. Several studies suggest that combining data from aerial and terrestrial LiDAR systems covering the crop, canopy, and ground may be a simple, effective, and inexpensive way to enhance site-specific monitoring in agriculture [12,13,14]. There is substantial potential for the use of LiDAR in precision agriculture for crop management. However, this field still requires a thorough examination. The present state of LiDAR applications can be better understood by examining case studies, reviewing the existing literature, and critically evaluating recent developments.

Previous literature reviews [15,16,17] on LiDAR in precision agriculture have provided valuable insights but have not focused on crop management, particularly crop cultivation and harvesting. Debnath et al. [15] explored LiDAR applications within a limited scope, and Rivera et al. [16] focused exclusively on studies within a 5-year timeframe (2017–2022). This comprehensive review examines the application of LiDAR in crop management, emphasizing its role in improving cultivation and harvesting, building on earlier foundational work, and incorporating the latest advancements. This broader perspective allows us to provide a thorough and up-to-date overview of the possible advantages and difficulties connected with the application of LiDAR technology in crop management methods by concentrating on the unique function of LiDAR technology in these important areas of precision agriculture.

The review paper follows a well-organized structure to explore the critical evaluation of applications of LiDAR for cultivating and harvesting crops in precision agriculture. The paper begins with an introduction that establishes the background and context of precision agriculture, emphasizing the importance of efficient crop management. It then introduces LiDAR technology as an emerging tool with revolutionary potential in this field. The research objective of the paper is to provide a comprehensive evaluation of LiDAR applications. The subsequent sections are divided into key themes. The first section provides an overview of LiDAR technology in precision agriculture, discussing its definition, principles, and different types of LiDAR systems. A summary of key study results on airborne LiDAR systems (ALSs), terrestrial LiDAR systems (TLSs), and mobile LiDAR systems (MLSs) is presented in tables for better understanding and clarity. Key differences between and uses of each LiDAR system are concisely summarized in these tables, which serve as crucial references. The following sections delve into the applications of LiDAR in crop cultivation and harvesting separately, examining its role in crop monitoring, weed detection, plant health assessment, soil analysis, yield estimation, and autonomous harvesting systems. A critical evaluation section follows, exploring the advantages and limitations of LiDAR technology. Various LiDAR systems and their uses in crop management are summarized in a detailed table at the end of the paper, which provides a brief overview of the cutting-edge technologies being used by scientists all around the globe to improve crop management, together with a comparison of the different LiDAR system types (ALS, TLS, and MLS) and their limitations. The paper concludes with future perspectives and challenges, highlighting emerging trends, research gaps, and potential solutions. By following this structured approach, the review paper aims to give a thorough and insightful analysis of the role of LiDAR in revolutionizing crop management in precision agriculture.

1.1. Overview of Precision Agriculture

Precision agriculture improves field-level management techniques that support sustainable food production systems. Furthermore, to produce food sustainably, agricultural operations must be more closely matched to the potential of soil fertility, crop requirements, and environmental circumstances [18]. Precision agriculture aims to maximize agricultural earnings using several essential tactics. It first emphasizes effective resource management through systems that apply fertilizers, agrochemicals, and water at varying rates. Ensuring these inputs are dispersed precisely where required helps reduce waste and increase efficacy [19].

Furthermore, precision farming tries to reduce agricultural output losses during harvest. Farmers may decide the best time to harvest their crops and ensure the highest yield and quality using cutting-edge technology and real-time data analysis [20,21]. This strategy strives to reduce the negative environmental impacts of farming. Precision agriculture minimizes fertilizer loss into water bodies and greenhouse gas emissions using focused irrigation and precise input application, which protects the environment [22,23]. Finally, precision agriculture aims to minimize the total environmental impact of agricultural inputs. This includes promoting techniques such as carbon sequestration and soil organic matter improvement, which support sustainable agricultural systems and boost long-term soil health [24,25]. In conclusion, precision agriculture refers to a variety of techniques that enhance resource management, reduce yield losses, lower environmental hazards, and optimize the overall effect of agricultural inputs, eventually resulting in higher agricultural profitability and sustainability.

1.2. Importance of Efficient Crop Management and the Emerging Role of LiDAR Technology

Effective crop management is crucial in the context of sustainable agriculture and the need to meet the expanding requirements of a rapidly growing global population [26]. Since LiDAR technologies have the potential to revolutionize precision farming, their development and application in this context have received a lot of attention. The study conducted by Xu et al. [27] centered on the advancement and assessment of a UAV–LiDAR system designed to facilitate precision agriculture and plant phenotyping. The highest canopy height was estimated with a reported error of 0.1 m. The results provide empirical support for the effectiveness of the LiDAR system, suggesting its potential use in fields including precision crop management and plant breeding. Since LiDAR can capture precise three-dimensional data on crop health, terrain, crop breeding, and vegetation structure, farmers may use this information to manage their fields better regarding irrigation, fertilization, insect control, and more [28]. Zhang et al. [29] reported that LiDAR data combined with sophisticated analytics and machine learning algorithms further enhanced capability, enabling farmers to recognize crop-stress zones, adopt better planting practices, and precisely target treatments. LiDAR technology enhances environmental sustainability, reduces resource waste, and boosts agricultural productivity. It aids in creating high-resolution elevation models and vegetation maps, supporting watershed management, land use planning, and climate adaptation while optimizing resource use and advancing sustainable food production amidst growing global populations. Effective crop management is of the highest significance in modern agriculture for fulfilling the ever-increasing needs for food production while simultaneously reducing resource use and the negative impact on the environment [30,31]. The implementation of precision agricultural methods, such as LiDAR applications, is essential for reaching these goals.

1.2.1. Enhancing Productivity

Using effective crop management techniques, farmers may maximize production potential by carefully matching inputs such as water, fertilizer, and pesticides to the demands of various crops. By delivering thorough and reliable data regarding crop health, growth patterns, and fertilizer needs, LiDAR technology significantly increases agricultural yield. El-Naggar et al. [32] used terrestrial LiDAR for the estimation of crop growth and water use. When compared to manually observed canopy height, the TLS findings showed a noteworthy and statistically significant correlation with minor biases and errors. The reported values for barley, pea, and bean were 5.85 (RMSE = 0.95), 3.01 (RMSE = 0.93), and 1.82 (RMSE = 1.82), respectively. Additionally, the TLS approach showed promise, with an RMSE of 37.56 and an R2 value of 0.70 for predicting bean biomass. LiDAR allows farmers to assess crop attributes such as height, density, and canopy structure precisely, using laser beams to produce high-resolution 3D reconstructions of the terrain. According to a study by Eyre et al. [33], geographically weighted regression (GWR) models that utilize topographic variables derived from LiDAR data are highly efficient at identifying field-scale variations in crop yield across various varieties. The coefficient of determination values for maize, wheat, soybeans, and the overall average of all crops were R2 = 0.80, 0.73, 0.71, and 0.75, respectively, based on the mean of the local relationships. LiDAR data identifies nutrient deficits and insect infestations, enabling targeted actions. Canopy models provide insights into light, airflow, and shade patterns, optimizing planting densities, irrigation, and pruning.

LiDAR data enhances irrigation design and water management, provides accurate yield estimates, and integrates with remote sensing and analytics for harvest planning and supply chain management. Farmers may embrace precision agriculture, maximize resource allocation, reduce environmental impact, and contribute to efficient and sustainable food production using LiDAR technology with other instruments [34].

1.2.2. Resource Optimization

Precision agriculture has been transformed by LiDAR technology, which holds great promise for increasing agricultural output while addressing issues of resource efficiency, water shortages, and environmental sustainability. LiDAR supports precise irrigation procedures that reduce water use while assuring optimum plant development by detecting soil moisture levels and crop water needs [35]. LiDAR is also essential for the targeted administration of pesticides and fertilizers since it helps avoid over-application and possible environmental damage [36]. Farmers can detect regions of varying crop density and health thanks to its capability to build comprehensive 3D models of crop canopies, allowing the creation of prescription maps for exact input applications. This technique, known as variable rate application, minimizes the effect on the environment, uses fewer chemicals, and makes the best use of available resources.
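As a simple illustration of how LiDAR-derived canopy information can feed a variable-rate prescription map, the minimal Python sketch below scales an application rate with a hypothetical grid of canopy-density values. The grid, threshold behavior, and rate limits are invented for illustration only and are not taken from the cited studies.

```python
# Illustrative sketch only: turning a hypothetical grid of LiDAR-derived canopy
# density (0-1 per cell) into a variable-rate prescription map. Rates and the
# density grid are invented for illustration, not values from the cited studies.
import numpy as np

def prescription_map(canopy_density: np.ndarray,
                     min_rate: float = 80.0,   # e.g., L/ha on sparse canopy (assumed)
                     max_rate: float = 200.0   # e.g., L/ha on dense canopy (assumed)
                     ) -> np.ndarray:
    """Scale the application rate linearly with canopy density in each grid cell."""
    density = np.clip(canopy_density, 0.0, 1.0)
    return min_rate + density * (max_rate - min_rate)

density_grid = np.array([[0.1, 0.5],
                         [0.8, 1.0]])
print(prescription_map(density_grid))
# [[ 92. 140.]
#  [176. 200.]]
```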

Additionally, LiDAR data may be combined with data analytics and machine learning to create predictive models, allowing data-driven decision-making for picking the best crop types, modifying planting densities, and implementing timely interventions [37]. The advantages of the technology also extend to the management of orchards and vineyards, enabling the precise assessments of tree and vine structures for improved pruning techniques, canopy management, and yield estimates. Farmers may achieve sustainable and effective crop production via precision farming techniques by using the power of LiDAR and combining it with other technologies, eventually resulting in a more resource-efficient and ecologically conscientious agricultural sector [38,39].

1.2.3. Disease and Pest Management

The early identification of plant stress indicators and disease signs using LiDAR data helps farmers intervene strategically to slow disease development. Farmers may use LiDAR to administer targeted treatments, eliminating the need for broad-spectrum pesticides and encouraging sustainable agricultural methods [40]. Incorporating LiDAR data with remote sensing, machine learning, and data analytics enables the creation of predictive models for disease and pest outbreaks, thereby facilitating proactive decision-making [41]. By giving data on plant water needs and nutrient distribution, LiDAR also helps improve irrigation and fertilization operations. Additionally, the technology offers agricultural yield assessment and forecasts, supporting farmers in projecting production potential, optimizing resource allocation, and enhancing harvest procedures. Improved production, decreased environmental effects, and the development of sustainable agriculture are all possible because of LiDAR’s capacity to deliver precise crop information, identify plant health concerns, and optimize resource management [42,43].

1.2.4. Sustainability and Environmental Impact

Achieving sustainable agricultural practices is a global priority. Efficient crop management practices supported by LiDAR applications contribute to reducing the environmental footprint of agriculture. By optimizing inputs and minimizing the wastage of resources such as water and fertilizers, LiDAR aids in the reduction of greenhouse gas emissions, nutrient runoff, and soil degradation. This leads to more environmentally friendly and sustainable agricultural systems [44,45].

In conclusion, efficient crop management is crucial for meeting the increasing demand for food production while minimizing resource utilization and environmental impact.

2. LiDAR Technology in Precision Agriculture: An Overview

2.1. LiDAR in Precision Agriculture

LiDAR technology operates based on the principles of laser ranging and time-of-flight (TOF) measurement. It involves emitting laser pulses towards the target area, which bounce back when they encounter objects or surfaces, allowing for precise distance calculations [46,47]. LiDAR systems measure the time it takes for each laser pulse to return and, to create accurate 3D maps of the surroundings, combine these distances with the angles and positions of the laser pulses. LiDAR scanners generate laser beams in a pattern that scans a large region while quickly capturing several points in each location. These points precisely depict the shape outlines and spatial features of the items in the scanned area, together forming point clouds. Additionally, it is possible to measure the strength of the returned laser pulses, providing extra information on the reflectance or surface characteristics of the objects [48,49,50]. Figure 1 provides a detailed explanation of the LiDAR system’s operating concept as described by Bates et al. [51].

Figure 1. Use of LiDAR in precision agriculture [51].

The distance a single photon has traveled to and from an object is calculated using Equation (1):

d = c t/2, (1)

where “d” is the distance from an object; “c” is the speed of light, whose value is constant; and “t” is the time-of-flight, which can be measured by the difference between the start time of the emitted pulse and the time the reflected pulse hits the sensor.

Phase shift measurement (PMS), which uses a continuous light source, modulates light power at a fixed frequency. As a result, we may say that the modulated light has a sinusoidal profile. Equation (2) may be used to calculate the distance between the source and the object in terms of the angle between the waves’ peaks:

d = c∆θ/(2πf) (2)

where “d” is the distance, “c” is the speed of light, “∆θ” is the phase difference, and “f” is the frequency of the modulated power. TOF or PMS distance measurements are converted into elevation data. These elevation data appropriately represent numerous ground objects in the scanned region. Researchers and professionals can effectively visualize and analyze the terrain, the topography, and the structural characteristics of ground objects by utilizing the precise elevation data derived from TOF or PMS measurements [52].
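To make Equations (1) and (2) concrete, the short Python sketch below implements both range formulas exactly as written above and converts one range measurement and its scan angles into a Cartesian point of a point cloud. The pulse timing and angles in the example are illustrative values, not data from the cited studies.

```python
# Minimal sketch: range recovery from time-of-flight (Equation (1)) and from a
# phase shift (Equation (2), as printed in the text), plus conversion of one range
# and its scan angles into a 3D point. Example inputs are illustrative only.
import math

C = 299_792_458.0  # speed of light, m/s

def tof_distance(t_flight_s: float) -> float:
    """Equation (1): d = c*t/2 for a pulse that travels to the target and back."""
    return C * t_flight_s / 2.0

def phase_distance(delta_theta_rad: float, mod_freq_hz: float) -> float:
    """Equation (2) as given in the text: d = c*dtheta / (2*pi*f)."""
    return C * delta_theta_rad / (2.0 * math.pi * mod_freq_hz)

def to_xyz(d: float, azimuth_rad: float, elevation_rad: float) -> tuple:
    """Convert a range and the scan angles into a Cartesian point of the point cloud."""
    x = d * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = d * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = d * math.sin(elevation_rad)
    return x, y, z

# Example: a pulse returning after 200 ns corresponds to a target ~30 m away.
print(round(tof_distance(200e-9), 2))                               # ~29.98 m
print(to_xyz(tof_distance(200e-9), math.radians(45), math.radians(10)))
```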

LiDAR has gained recognition in precision agriculture as a powerful tool for crop monitoring, management, and harvesting due to its ability to capture highly accurate and detailed spatial information. By providing precise measurements of crop height, canopy structure, and vegetation density [53,54], LiDAR enables farmers and agronomists to assess crop health, monitor growth patterns, and detect variations or stress factors [55,56]. Furthermore, LiDAR assists in soil analysis, allowing for precise soil mapping, the identification of soil properties, and the assessment of soil erosion risks. This spatial information empowers farmers to make data-driven decisions for optimizing irrigation, fertilization, pest control, and overall crop management.

2.2. Types of LiDAR Systems and Their Use in Precision Agriculture

A detailed overview of several LiDAR systems and their specialized applications in precision agriculture is explained by Jin et al. [57], as presented in Figure 2. Researchers categorized LiDAR systems based on the range of LiDAR sensors, distinguishing between proximal sensing and remote sensing. Proximal sensing platforms, such as terrestrial LiDAR systems (TLSs) and mobile LiDAR systems (MLSs), and remote sensing platforms, such as airborne LiDAR systems (ALSs), play distinct roles in precision agriculture, each contributing unique capacities to the field. By analyzing these various LiDAR technologies in-depth, this review provides valuable insights into how each system can be utilized for crop management.

Figure 2. Different types of LiDAR systems used in precision agriculture [57].

2.2.1. Airborne LiDAR Systems (ALSs)

Airborne LiDAR systems involve mounting the LiDAR sensor on aerial platforms, most commonly unmanned aerial vehicles (UAVs) such as drones but also manned aircraft, allowing them to efficiently collect data at low altitudes over agricultural fields, as shown in Figure 3 and Figure 4. These systems emit laser pulses and measure the time it takes for the pulses to return after hitting the target objects [58].

Figure 3. Airborne LiDAR system [58].

By analyzing the returned signals, the ALS can create highly accurate 3D models of the terrain and vegetation [59]. They provide valuable information for precision agriculture, such as crop health assessment, vegetation mapping, and canopy structure analysis [60]. The ALS provides high-resolution data, offers a cost-effective solution for precision agriculture, and has been successfully used for crop monitoring, terrain modeling, and flood risk assessment in various agricultural settings [61]. Rakesh et al. [62] reported that ALSs can support targeted analysis and decision-making for crop management, such as crop yield estimation, disease detection, and irrigation management. Fareed et al. [63] stated that ALSs can capture detailed information about crop conditions, plant health, and pest infestations at the field level. After a comprehensive review, Aslan et al. [64] concluded that the ALS is particularly useful for field mapping, yield estimation, and resource optimization. Dowling et al. [65] and Qin et al. [66] employed UAV–LiDAR for mapping and navigation and demonstrated effective obstacle avoidance capabilities in their studies. Turner et al. [67] used ALS observations to map soil surface roughness (SR) in agriculture. The findings demonstrated that soil profile surface heights estimated by the ALS were more precise and accurate than those estimated by ground measurements. The effects of farming activities on surface roughness were tracked using LiDAR data, which showed promising results for mapping SR in agriculture. Ladefoged et al. [68] found geographical and temporal trends in the evolution of local agricultural systems using a high-resolution ALS. Researchers emphasized that integrating LiDAR data with productivity models improved the comprehension of agricultural growth in the Hawaiian area. Zhang et al. [69] reported that accurate estimation of grassland vegetation parameters such as maximum, minimum, and mean canopy height, aboveground biomass (AGB), and fractional vegetative coverage at a high spatial resolution can be achieved with a UAV-mounted ALS, as presented in Figure 4.

Figure 4. UAV-mounted ALS for canopy height estimation [69].

Liu and Bo [70] used a UAV–LiDAR system with canopy height model (CHM) data to identify different crop species in a complicated, fragmented agricultural landscape with a diverse crop planting structure. The accuracy of crop species classification was significantly improved by including geometric and textural cues in the object-based classification technique. The proposed object-based classification framework enabled the research to categorize crop species with an overall accuracy of 90.33%. Table 1 illustrates the diverse applications of ALSs and important findings from the literature in the realm of crop management. The classification, implementation, and limitations of ALSs are presented in Table 2.
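As a hedged illustration of the CHM concept (not the processing chain used in [70]), the following Python sketch grids a synthetic point cloud, approximates the terrain with the lowest return in each cell and the canopy surface with the highest return, and takes their difference as the canopy height. Real workflows would use proper ground filtering and interpolation; the cell size and toy data here are assumptions.

```python
# Minimal canopy height model (CHM) sketch from a LiDAR point cloud. The lowest
# return per cell stands in for the terrain and the highest return for the canopy
# surface; both are crude approximations used only to illustrate the idea.
import numpy as np

def canopy_height_model(points: np.ndarray, cell_size: float = 0.5) -> np.ndarray:
    """points: (N, 3) array of x, y, z coordinates. Returns a 2D grid of canopy heights."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    col = ((x - x.min()) / cell_size).astype(int)
    row = ((y - y.min()) / cell_size).astype(int)
    dsm = np.full((row.max() + 1, col.max() + 1), -np.inf)  # highest return per cell
    dtm = np.full_like(dsm, np.inf)                         # lowest return per cell
    for r, c, height in zip(row, col, z):
        dsm[r, c] = max(dsm[r, c], height)
        dtm[r, c] = min(dtm[r, c], height)
    chm = dsm - dtm
    chm[~np.isfinite(chm)] = 0.0                            # cells with no returns
    return chm

rng = np.random.default_rng(0)
pts = rng.uniform([0, 0, 0], [5, 5, 2], size=(1000, 3))     # toy 5 m x 5 m plot
print(canopy_height_model(pts, cell_size=1.0).shape)         # (5, 5) grid of heights
```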

Table 1.

Uses of ALSs in the realm of crop management.

Crop Crop Management Practices Important Findings Ref.
Sorghum Growth monitoring UAV–LiDAR and RGB photogrammetry accurately assessed canopy height (CH) and LAI in sorghum. LiDAR outperformed photogrammetry, achieving superior accuracy with an R2 of 0.975 and a RMSE of 5.94% for CH. [71]
Pea, Chickpea Crop height estimation Highly significant correlations (p < 0.0001) were found between LiDAR-estimated and manually measured plant heights, with correlation coefficients (r) of 0.74 for chickpeas and 0.91 for peas. [72]
Atlas, Red Carina, Pasto Height, biomass, and grain estimation The LiDAR-based hyperspectral physiological reflectance index (PRI) demonstrated superior performance, distinguishing between salt-affected and treated plants. The overall dataset achieved an R2 of 0.46, with specific subgroups reaching an R2 of 0.64. [73]
Potato Monitoring agricultural biomass and plant growth The LiDAR scanner demonstrated a strong correlation with field-measured parameters, with a high R2 of 0.89 and a low RMSE of 0.028 m for PH, and an R2 of 0.81 with an RMSE of 31.65% for AGB. [74]
Sugarcane Prediction of biomass and leaf N2 content Crop growth parameters were monitored, including height, density, and vegetation indices. Predictive models were evaluated based on multispectral predictors alone, LiDAR predictors alone, and a fusion of both, then compared against an NDVI benchmark. The multispectral model showed slightly superior performance (R2 = 0.57) compared to the LiDAR model (R2 = 0.52), with both models surpassing the NDVI benchmark (R2 = 0.34). [75]
Sugarcane Estimation of yield and aboveground fresh weight Six regression algorithms (MLR, SMR, GLM, GBM, KRLS, and RFR) were employed to construct the sugarcane aboveground fresh weight (AFW) model using LiDAR data. Results showed that RFR outperformed other models’ prediction accuracy (R2 = 0.96, RMSE = 1.27 kg m−2). The final sugarcane AFW distribution maps demonstrated strong agreement with detected values (R2 = 0.97, RMSE = 1.33 kg m−2). [76]
Sorghum Biomass estimation and prediction Two biomass prediction methods were explored: one utilized LSM to predict biomass from LiDAR point cloud data directly, and the other employed the APSIM crop simulation model. The LiDAR approach yielded R2 values of 0.48 and 0.55 for SVR and MLP, respectively, while the APSIM model achieved R2 of 0.31 and 0.67 for SVR and MLP, respectively. [77]
Avocado, Macadamia, Mango Orchard and tree crown assessment The study found that crown structure measurements, particularly those based on the top of the crown, exhibited strong consistency between ALS and TLS data. Crown area measurements showed the highest correlation (R2 = 0.997) between the two data sources. The linear model’s RMSE for maximum crown height derived from ALS and TLS data was 0.29 m, with an R2 of 0.99. [78]
Wheat Monitoring of CH, biomass, and N2 uptake The 95th percentile of normalized LiDAR points showed a strong correlation (R2 = 0.88) with manually measured crop heights and (R2 = 0.92) with crop heights obtained from a UAV system using optical imaging, out of the 57 UAV–LiDAR metrics that were analyzed. [79]
Sorghum Biomass prediction Geometric features extracted from LiDAR data yielded dependable and precise biomass predictions. The 750–1100 nm spectral range proved to be the most informative for biomass prediction, with R2 values for end-of-season biomass ranging from 0.64 to 0.89. [80]
Table 2.

Classification, implementation, and limitations of airborne LiDAR systems.

Component Description Implementation Limitations
LiDAR Sensors Emits laser pulses and measures the time it takes for them to return Mounted on aircraft (e.g., UAVs and planes) Limited by battery life and flight duration (especially for UAVs)
Global Positioning System (GPS) Provides precise location information Integrated with LiDAR sensor Signal interference can affect accuracy, especially in dense vegetation
Inertial Measurement Unit (IMU) Measures the rate of acceleration and changes in rotational attributes Works with GPS to provide accurate positioning data Sensor drift over time can affect data accuracy
Data Storage Onboard storage system for capturing LiDAR data High-capacity storage systems onboard Storage capacity may limit the amount of data collected during a single flight.
UAV (Unmanned Aerial Vehicle) Aircraft without a human pilot onboard are used for carrying LiDAR sensors Suitable for small to medium-scale areas Limited flight time and payload capacity; subject to weather conditions
Manned Aircraft Airplanes or helicopters piloted by humans are used for carrying LiDAR sensors. Suitable for large-scale mapping Higher operational costs and regulatory restrictions

2.2.2. Terrestrial LiDAR Systems (TLSs)

TLSs are ground-based setups that utilize stationary tripod stands to scan and capture detailed crop information at different locations. These systems emit laser pulses in various directions, capturing multiple measurements from different angles. By combining these measurements, TLSs generate precise 2D and 3D point cloud data of the surrounding environment, as shown in Figure 5.

Figure 5. TLS for crop height estimation [81].

Many researchers reported that TLSs are especially effective for detailed mapping of crop structures, including plant height, canopy density, and individual plant or tree measurements. In contrast to UAV-based LiDAR, which typically covers larger areas from above, a TLS provides high-resolution, ground-level data that offers greater precision in assessing specific crop attributes and intricate details within a given area. It is also reported that TLSs are superior to UAV–LiDAR for accurate crop health assessment, growth pattern assessment, and biomass estimation at the field level. Researchers have utilized TLSs for crop phenotyping, growth analysis, and precision irrigation management [82].

Hosoi and Omasa [83] estimated vertical plant area density profiles of the wheat canopy during several development phases, including tillering, stem elongation, blooming, and ripening, using a portable TLS. By setting up regression models between LiDAR-measured plant area density and organ dry weight, the researchers were able to determine the total dry weight and carbon stocks of above-ground wheat organs. Researchers concluded that these results could help better understand how carbon is stored in agricultural systems and improve crop management techniques and carbon sequestration methods. Martínez et al. [84] used a TLS to measure the leaf area index (LAI) of grapevine and reported that the LiDAR sensor can also produce maps if the maximum distance between scan points does not exceed 15 m. However, to avoid LAI overestimation, increasing the horizontal resolution of LiDAR scanning is essential. Hobart et al. [85] used drones and photogrammetry to measure tree wall heights in apple orchards and compared them with TLS data as a reference, as described in Figure 6. Researchers reported that although there was a good match between the drones’ point clouds and the ground-based LiDAR data, the drones’ point clouds had trouble capturing the fine apple tree shoots, which resulted in lower estimates of tree wall heights. The method can be used to manage orchards precisely, but it must be modified to account for the wider tree gaps and the decreased vertical extent of tree walls.

Figure 6. View of apple tree rows as an RGB image, with point clouds from drones (orange) and LiDAR (blue) superimposed with tree wall height curves [85].

Using a TLS, Hofle [86] described a unique method for mapping individual maize plants. The developed method, involving point cloud segmentation and filtering, detected plants accurately and consistently, as shown in Figure 7. Radiometric correction minimized the amplitude variation of homogeneous regions, improving the separability of maize plants and the surrounding soil when applied to LiDAR point clouds taken by the TLS, which provided benefits including lessened obstruction effects and more uniform point density.

Figure 7. (a) Maize plants (arranged in 8 rows) with additional vegetation, and (b) a 3D model of the segmented point cloud [86].

Xu et al. [87] developed a unique precision farming technique combining TLS and camera data to identify corn seedlings in fields precisely. The technique required removing distance effects from TLS intensity data, registering the point cloud and camera data for proper color representation, and separating corn plants from the soil with a random forest algorithm employing geometric and radiometric parameters. In a case study using a commercial TLS sensor with an integrated camera, combining both feature types improved on the performance of either feature type alone, yielding a high accuracy of 98.8% in distinguishing maize seedlings from the soil. The results of Koenig et al. [88] highlight the significant effectiveness of geometric and radiometric LiDAR point cloud characteristics in accurately classifying point cloud data to identify post-harvest growth. Key findings from the literature are included in Table 3, which shows TLS uses in crop management.
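The classification step described above can be illustrated with a hedged Python sketch (not the authors' code): per-point geometric (height above ground) and radiometric (range-corrected intensity) features are fed to a random forest that labels each point as plant or soil. The synthetic data, the 10 m reference range, and the inverse-square range correction are assumptions made only for this example.

```python
# Hedged sketch of plant/soil point classification with geometric and radiometric
# features and a random forest. Data are synthetic; the intensity correction simply
# rescales raw intensity to an assumed 10 m reference range.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 2000

# Synthetic per-point attributes: height above ground (m), range (m), raw intensity.
is_plant = rng.random(n) < 0.5
height = np.where(is_plant, rng.normal(0.15, 0.05, n), rng.normal(0.0, 0.01, n))
rng_m = rng.uniform(2.0, 20.0, n)
raw_int = np.where(is_plant, rng.normal(0.6, 0.1, n), rng.normal(0.3, 0.1, n))
raw_int = raw_int / (rng_m / 10.0) ** 2          # intensity falls off with range

# Radiometric correction: rescale intensity back to the 10 m reference range.
corr_int = raw_int * (rng_m / 10.0) ** 2

X = np.column_stack([height, corr_int])          # geometric + radiometric features
y = is_plant.astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"plant/soil accuracy: {clf.score(X_te, y_te):.3f}")
```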

Table 3.

Use of TLSs in the realm of crop management.

Crop Crop Management Practices Important Findings Ref.
Barley Plant-height measurement A comparison was made between TLS and UAV-based imaging for CSM-derived plant height in crops using analyses based on polygon grids. The results revealed a high correlation between TLS and UAV-derived plant height (R2 = 0.91), with a 4.81% higher coefficient of variance observed in TLS compared to UAV data. [89]
Wheat High-throughput phenotyping of plant height The comparison of PH derived from LiDAR and structure from motion revealed high consistency, with a strong correlation (R2 ≈ 0.98) and minimal RMSE values (RMSE = 8.4 cm). LiDAR and structure from motion exhibited high repeatability (H2) in plant height. [90]
Rice Biomass estimation for individual organs and aboveground biomass (AGW) Three regression approaches, SMLR, RF, and LME modeling, were assessed for biomass estimation using an extensive TLS. The LME model exhibited the most significant improvement in panicle biomass, showing a 0.74 increase in R2 (LME: R2 = 0.90, SMLR: R2 = 0.16) and a 1.15 t/ha decrease in RMSE (LME: RMSE = 0.79 t/ha, SMLR: RMSE = 2.94 t/ha). In comparison to SMLR and RF, LME modeling provided similar AGB estimation accuracies for pre-heading stages but notably higher accuracies for post-heading stages (LME: R2 = 0.63, RMSE = 2.27 t/ha; SMLR: R2 = 0.42, RMSE = 2.42 t/ha; RF: R2 = 0.57, RMSE = 2.80 t/ha). [91]
Wheat Estimation of plant height, ground cover, and AGW Canopy height strongly correlated with LiDAR (R2 = 0.99, RMSE = 0.017 m). In contrast to NDVI, LiDAR remained unaffected by saturation at high ground cover and exhibited a strong association (R2 = 0.92, slope = 1.02) at ground cover above 0.8. AGW estimation employed 3D voxel index (3DVI) and 3D profile index (3DPI), with the strongest associations with biomass observed for 3DPI (R2 = 0.93) and 3DVI (R2 = 0.92). [92]
Vineyard Canopy characterization (crop height, width, volume, and leaf area) Ultrasonic and LiDAR sensors were compared to manual canopy measurements. Strong correlations were found between crop volume values from ultrasonic sensors and leaf area index (R2 = 0.51) and canopy volumes measured by ultrasonic and LiDAR sensors (R2 = 0.52). LiDAR accurately predicted canopy volume. [93]
Grapevine Canopy geometry characterization Significant correlations emerged between LIDAR impacts and LAI during each growth stage. The estimated values of tree row volume (R2 = 0.99) and leaf wall area (R2 = 0.95) exhibited statistical significance relative to the vine’s growth stage. [94]
Apple, Pear, Vineyards Measurement of non-destructive vegetative volume and surface area of tree row Strong correlations were found between manual and LIDAR-based measurements of vegetative volume in tree-row plantations, especially with the tree area index (TAI) parameter. A significant correlation (R2 = 0.814) was also observed between LIDAR volume and foliar area. [95]
Maize Detection and discrimination of plants and weeds A high correlation (R2 = 0.75) existed between LiDAR-measured height and actual plant height. Achieving up to 95% accuracy, the LiDAR-based sensor effectively discriminated vegetation from the ground. Weed discrimination success was demonstrated through canonical discriminant analysis (CDA) with a rate of 72.2%. [96]
Vineyard, Apple Monitoring of pesticide clouds The LiDAR system recorded the mid-range spray drift with a 2.4 m distance resolution and a 100 ms temporal resolution at the highest pulse repetition frequency. [97]
Vineyard Drift detection in vineyard spraying At an airflow rate of 34,959 m3/h, correlation coefficients for conventional nozzles ranged from 0.87 to 0.91, while for air injection nozzles, the range was from 0.88 to 0.40. At 27,507 m3/h, conventional nozzles showed coefficients between 0.85 and 0.94, whereas air injection nozzles ranged from 0.07 to 0.88. Finally, at 6423 m3/h, conventional nozzles had coefficients from 0.93 to 0.98. [98]

2.2.3. Mobile LiDAR Systems (MLSs)

Mobile LiDAR systems are portable devices that allow users to collect data by mounting the scanners on manned or unmanned ground vehicles, such as tractors or autonomous or remotely controlled robot vehicles designed to operate on land. They can even be mounted on a backpack or carried by a person walking. These systems are typically lightweight and easy to operate [99]. With an MLS, researchers and farmers can capture localized information and perform detailed measurements in smaller areas or specific field sections. They provide the ability to assess crop height variations, monitor canopy structure, and identify micro-environmental factors that influence crop growth. The MLS is useful for on-the-spot assessments and can complement data from other LiDAR platforms. Researchers have utilized MLSs for crop height estimation, plant architecture analysis, leaf area index estimation, and precision nutrient management [100,101].

Zhou et al. [102] reported that MLSs performed better than ALSs when scanning tree growth: MLS point cloud data provide greater precision and completeness and can accurately extract forestry characteristics and record individual tree branch structures. Underwood et al. [103] employed an MLS to efficiently map flower and fruit distributions and forecast individual tree production in almond orchards, as depicted in Figure 8. The technology employed a robotic ground vehicle equipped with LiDAR and video sensors to generate a 3D map of the orchard and identify the plants throughout the year. A significant linear relationship existed between canopy volume measured by LiDAR and yield. The created maps were used to classify almonds correctly. Researchers concluded that these findings advance precision agriculture by providing valuable insights for optimizing almond orchard management and increasing the accuracy of yield prediction.

Figure 8. Mapping an almond orchard using an MLS [103].

Arno et al. [104] predicted LAI by transversely surveying grapevines along the rows using MLS sensors placed on a tractor. Vine height, cross-sectional area, canopy volume, and tree area index (TAI) were calculated. The TAI significantly correlated with LAI values, demonstrating the viability of using LiDAR sensors to characterize grapevine leaves. In a wheat improvement project, Deery et al. [105] assessed the repeatability of nondestructive measurements of AGB and crop growth rate (CGR) using an MLS. The research compared two LiDAR-based techniques for calculating AGB against destructively sampled reference data. Different water supply levels and wheat genotypes were the subjects of several investigations. According to the findings, there are strong correlations between MLS-derived biomass indices and AGB, suggesting that MLSs may serve as a reliable surrogate for destructive AGB measurement. The study’s overall findings demonstrate MLSs’ potential as a trustworthy tool for evaluating AGB and CGR in wheat breeding and research.

In their work, Ruhan et al. [106] estimated the AGB of individual trees using a backpack LiDAR system and optimized quantitative structural models (AdQSM). They were able to accurately determine the diameter at breast height (DBH) of each tree by using the point cloud data obtained from the backpack LiDAR system. The investigation confirmed that the use of backpack LiDAR in a non-destructive manner could produce accurate estimates of the AGB of individual trees. A unique phenotyping approach has revolutionized field-based plant research, as Zhu et al. [107] demonstrated. It combined a backpack LiDAR device with the GUI program Crop Quant-3D, as shown in Figure 9. The method made it possible to collect millions of 3D points to precisely measure crop height and complicated features such as changes in canopy structure. With good correlations to traditional manual measures, the integrated approach effectively separated genotype and treatment effects on important morphological features. The system showed potential in effectively utilizing current genetic data for improved plant phenomics research, yet there is room for improvement in accuracy and cost.

Figure 9. Data acquisition using a backpack LiDAR system: (a) aerial view of the experimental site, (b) backpack LiDAR system, (c) function and data acquisition, (d) 3D reconstruction of the experimental field, and (e) AGB estimation through point cloud data [107].

Guo et al. [108] developed and tested Crop 3D, a high-throughput crop phenotyping technology. They used LiDAR technology with other advanced sensors to gather phenotypic data from several sources during the crop growth cycle. The researchers discussed the platform’s design, functionality, testing outcomes, and prospective applications in crop phenotyping. They concluded from their research that systems merging LiDAR and conventional remote sensing methods could be the future of high-throughput crop phenotyping. An overview of the various ways in which the MLS optimizes agricultural practices is presented in Table 4, which provides a concise summary of MLS applications in crop management and highlights significant findings from the relevant literature.

Table 4.

MLS application in crop management.

Crop Crop Management Practices Important Findings Ref.
Apple Estimation of tree canopy density The 2D algorithm identified peak points at the canopy center, correlating with apple variety, tree age, and tall spindle form. Notably, the middle section exhibited higher canopy points than the top and bottom. The 3D algorithm excelled in evaluating tree canopy point density, surpassing the 2D version. Ensuring precise alignment during scanning is vital to avoid experimental errors. [109]
Wheat Crop biomass estimation A proximal active reflectance sensor offering spectral indices and crop height estimates was compared to the LiDAR system. The correlation between LiDAR-derived crop height and crop biomass was 0.79, revealing substantial variability in biomass across the field. This suggested the potential of LiDAR technology for large-scale operations and site-specific management. [110]
Apple Leaf area detection A test platform was constructed to measure orchard tree canopy leaf areas manually. Polynomial regression, BPNN, and PLSR algorithms were employed to analyze the relationship between canopy point clouds and leaf areas. The BP neural network (86.1% test, 73.6% verification accuracy) and PLSR (78.46% test, 60.3% verification accuracy) outperformed the Fourier function in polynomial regression (59.73% accuracy). [111]
Vineyard Estimation of canopy size parameters (thickness, height, and volume) and LAI UAVs, MLSs, and mobile apps (MA) effectively estimated canopy size variations. Strong correlations (R2 > 0.7) were observed, with the highest at R2 = 0.78 (UAV vs. MLS) for canopy volumes. Height data showed robust correlations (R2 = 0.86, MA vs. MLS), while thickness data had weaker correlations (R2 = 0.48, UAV vs. MLS). LAI demonstrated moderate but consistent correlations with canopy volumes, ranging from R2 = 0.69 (LAI vs. UAV) to R2 = 0.74 (LAI vs. MLS). [112]
Sorghum Detection and measurement of estimation of individual sorghum panicles Panicles were identified with 89.3% overall accuracy, encompassing a 10.7% omission and 14.3% commission rate. Estimated panicle dimensions demonstrated a strong correlation with LiDAR-derived measurements (panicle length: r = 0.88, RMSE = 3.10 cm; panicle width: r = 0.79, RMSE = 1.67 cm; plant height: r = 1.00, RMSE = 0.80 cm). Comparison with harvested panicle data revealed moderate-to-high correlations (panicle length: r = 0.79, RMSE = 2.48 cm; panicle width: r = 0.63, RMSE = 1.49 cm; plant height: r = 0.86, RMSE = 11.4 cm). [113]
Cabbage, Leek, Potato, Wheat Canopy estimation Soil and plant segregation was accomplished by calculating weighted sums, eliminating the need for additional sensor data. This dynamic method extracted vegetation from point clouds in strips with varying coverage and sizes. The resulting vegetation clouds were validated against drone imagery, confirming a precise match with all green areas. [114]
Ryegrass Estimation of fresh weight and dry matter yield R2 between FWY and seasons (winter, spring, summer, and autumn) were 0.81, 0.92, 0.94, and 0.90, respectively. Similarly, the R2 values between DMY and the seasons were 0.87, 0.73, 0.87, and 0.79, respectively. These results suggest that LIDAR estimation of DMY is accurate within seasons for paired-row breeding plots. However, it is sensitive to significant changes in dry matter content (%) among seasons, requiring seasonal algorithms for correction. [115]
Miscanthus giganteus Measurement of crop height The sensor assessed stem densities in static mode, yielding an average error of 5.08% (max 8%, min 1.8%). It also measured crop height in a 5 × 10 m field, showing a 4.2% error compared to manual measurements. The sensor traversed a field edge in dynamic mode, generating a three-dimensional crop structure. An ordinary least-squares surface-fitting algorithm produced top and ground surfaces, resulting in an average crop height. Dynamic measurements showed a 3.8% average error (max 6.5%, min 1.5%). [116]

3. Applications of LiDAR in Crop Cultivation

3.1. Crop Monitoring and Management

Precision agriculture relies on effective crop monitoring and management strategies, and LiDAR technology offers valuable tools for these tasks [117]. By utilizing LiDAR data, farmers and agronomists can gain insights into plant health, disease detection, weed management, and crop growth monitoring [118,119,120].

Figure 10 presents an overview of the applications of LiDAR in agriculture for crop monitoring and management practices. Using RGB data alone prevents the correct extraction of plant structural information; LiDAR, an active sensing technique, overcomes this problem by sending laser pulses through tree canopies and picking up vegetation in the understory. LiDAR delivers accurate, three-dimensional (3D) plant data by measuring the return pulse time. LiDAR sensors have been used in several successful projects to estimate different vegetation parameters, including vegetation height [121], coverage [122], leaf area index [123], above-ground biomass [124], and crown size and volume [125]. For example, Brede et al. [126] used the TreeQSM approach to estimate tree volume in their comparison of UAV laser scanning with terrestrial laser scanning. A machine learning-based classification method was carried out by Moorthy et al. [127] to discriminate between woody and leafy components in 3D point clouds. In most instances, the solution outperformed previous methods, eliminating the need for additional post-processing procedures. Based on their research, Montzka et al. [128] concluded that crop height, gap percentage, and intensity are three LiDAR metrics that can predict the AGB, with slightly improved accuracy when testing plants in wet environments. In recent years, LiDAR technology has enabled precise and detailed measurements of crop characteristics [129,130,131,132], such as crop height [133], canopy density, plant growth rate [134], plant species diversity [135], and aboveground fresh and dry biomass [136,137,138]. This information allows farmers to monitor the health and development of crops more accurately. By analyzing LiDAR data, they can identify variations in plant height, which may indicate uneven nutrient distribution or pest infestations [139]. Detailed information regarding these applications is described in Figure 10.

Figure 10. An overview of applications of LiDAR in agriculture: (A) canopy and individual height measurement, (B) growth prediction and measurement, (C) field management practices, (D) breeding, and (E) crop management practices [57].
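To make the metrics mentioned above (crop height, gap percentage, and return intensity) concrete, the following minimal Python sketch computes them from a height-normalized point cloud as candidate predictors for a biomass regression. The synthetic points and the 0.10 m crop threshold are assumptions, not values taken from the cited studies.

```python
# Minimal sketch, assuming a height-normalized point cloud: per-plot crop height,
# gap fraction, and mean canopy intensity, as candidate AGB predictors.
import numpy as np

def canopy_metrics(z: np.ndarray, intensity: np.ndarray,
                   crop_threshold: float = 0.10) -> dict:
    """z: heights above ground (m); intensity: per-return intensity values."""
    above = z > crop_threshold
    return {
        "height_p95": float(np.percentile(z, 95)),     # crop height proxy
        "gap_fraction": float(1.0 - above.mean()),     # share of ground returns
        "mean_intensity": float(intensity[above].mean()) if above.any() else 0.0,
    }

rng = np.random.default_rng(1)
z = np.concatenate([rng.normal(0.0, 0.02, 300),        # ground returns
                    rng.normal(0.8, 0.15, 700)])       # canopy returns
inten = rng.uniform(0.2, 0.9, z.size)
print(canopy_metrics(z, inten))
```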

3.1.1. Plant Health Assessment, Maintenance, and Disease Detection

Traditional methods for plant health assessment, maintenance, and disease detection are often unreliable and costly. In contrast, LiDAR technology offers detailed information on vegetation structure, crop growth, leaf area, and canopy density, which are key indicators of plant health, providing a more accurate and cost-effective solution. By analyzing LiDAR-derived metrics such as plant height, canopy volume, or leaf area index, researchers and farmers can assess the overall condition of crops [140]. Deviations from expected values may indicate stress, nutrient deficiencies, or disease presence. The early detection of diseases is crucial for implementing timely interventions, such as targeted pesticide application or adjusting irrigation and fertilization practices [20].

LiDAR technology has been extensively studied for estimation-oriented agricultural applications, including maintenance tasks. It has proven effective across different soil types and temperature conditions, including in orchards, providing valuable insights for crop management and optimization. For example, Escola et al. [141] substituted conventional approaches for counting the trees in an olive grove with an MLS-based method. This approach was also used by Sandonis-Pozo et al. [142] to estimate the canopy characteristics of almond trees and locate the canopy areas requiring maintenance. Anken et al. [143] used a more accurate method for the precise measurement of canopy area in a project named ‘3D Mosaic’, which aimed to better regulate water and fertilizer use in orchards. Two techniques were used in a plum orchard in Potsdam, Germany: LiDAR (a plane-mounted laser) and near-infrared (NIR) imaging with conventional cameras. Both techniques generated precise canopy size measurements and demonstrated a good correlation with real leaf area. The camera system provided a more affordable alternative, while the LiDAR system has the benefit of easy data processing. Moreover, traditional methods of measuring the height of wheat are labor-intensive and prone to inaccuracy, yet this assessment is significant because height reveals the yield and weather resilience of this crop. Dhami et al. [144] presented methods for assessing wheat crop heights using a 3D LiDAR sensor mounted on a UAV. They created a mechanism for retrieving plant heights from 3D LiDAR data in plot-based phenotyping contexts. The researchers also created a toolchain for modeling phenotyping farms and determined plant height in a wheat field with an RMSE of 6.1 cm. Notably, their method accurately predicted plant heights in a field with 112 plots, making it the first time 3D LiDAR data were gathered over a wheat field by an aerial robot. A multi-sensor phenotyping system was developed by Yuan et al. [145] for estimating wheat canopy heights using ground phenotyping technologies, a UAV, and manual measurements. Ultrasonic sensors and LiDAR technologies were used in the system, as shown in Figure 11. The accuracy of wheat height estimates was improved by pre-processing the LiDAR data to counteract the slanting effect. Both LiDAR and UAV measurements were used to determine canopy height against manual references; LiDAR produced the most accurate findings, with an R2 of 0.97 and an RMSE of 0.05 m, while the UAV achieved an R2 of 0.91 and an RMSE of 0.09 m.

Figure 11. Phenotyping system and scanning areas of LiDAR and ultrasonic sensors [145].
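As a hedged illustration of plot-based height retrieval in the spirit of the studies above (not their actual pipelines), the following Python sketch groups LiDAR returns by plot, estimates ground and canopy elevations from low and high height percentiles, and compares the resulting heights with hypothetical manual measurements. All numbers are synthetic.

```python
# Illustrative sketch of plot-based plant height retrieval: per-plot canopy height
# from raw elevations, then an RMSE against assumed manual tape measurements.
import numpy as np

def plot_heights(plot_id: np.ndarray, z: np.ndarray,
                 ground_pct: float = 2.0, canopy_pct: float = 98.0) -> dict:
    """Estimate one canopy height per plot from unnormalized elevations."""
    heights = {}
    for pid in np.unique(plot_id):
        zz = z[plot_id == pid]
        heights[int(pid)] = float(np.percentile(zz, canopy_pct) -
                                  np.percentile(zz, ground_pct))
    return heights

rng = np.random.default_rng(7)
pid = np.repeat([1, 2], 500)
z = np.concatenate([np.r_[rng.normal(100.0, 0.02, 150), rng.normal(100.7, 0.05, 350)],
                    np.r_[rng.normal(100.0, 0.02, 150), rng.normal(100.9, 0.05, 350)]])
est = plot_heights(pid, z)
manual = {1: 0.72, 2: 0.93}                      # hypothetical tape measurements (m)
rmse = np.sqrt(np.mean([(est[k] - manual[k]) ** 2 for k in manual]))
print(est, f"RMSE vs manual: {rmse:.3f} m")
```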

Meanwhile, maize heights were examined by Ziliani et al. [146] and Gao et al. [147]. With this metric, farmers can obtain an overall assessment of the plant’s health condition. In related research, Zhou et al. [148] employed an ALS to track maize development and examine how the climate affected this plant over the lodging season. Estimating sugarcane yields requires careful crop observation during the entire development cycle. Sofonia et al. [149] monitored sugarcane yield under different nitrogen fertilizer treatments (0, 70, 110, 150, and 190 kg N·ha−1) in Australia using an ALS. This study compared two systems, the CSIRO Hovermap LiDAR and the Micasense RedEdge multispectral camera, to determine the relationships between plant height, biomass, and yield, as shown in Figure 12. The results indicated that both technologies offer accurate crop height measurements and timely problem detection. However, UAV–LiDAR exhibited greater consistency and stronger correlations when analyzing the biophysical properties of sugarcane compared to optical remotely sensed data.

Figure 12. (A) LiDAR and (B) photogrammetry time-series point cloud data colored by survey (dark blue to red) [149].

Ghamkhar et al. [150] suggested employing an MLS to carry out this kind of monitoring effectively and were the first to obtain findings that could be put into practice and compared with the conventional method.

3.1.2. Weed Detection and Fertilizer Application

Effective weed detection, control, and precise fertilizer application are vital for optimizing crop productivity. LiDAR technology enhances these practices by offering detailed insights into plant height and canopy structure. This enables accurate weed detection and targeted fertilizer application, as LiDAR can differentiate between crops and weeds based on variations in vegetation height and structure. This enables the development of precise and targeted weed control strategies, reducing the reliance on broad-spectrum herbicides and minimizing the potential negative environmental impact [151]. Removing weeds from fields before they consume crop nutrients is essential, so crops must be constantly monitored for the presence of wild plants, which should be removed before they can spread disease. Pretto et al. [152] provided an alternative method for identifying wild plants by combining the use of a TLS and an ALS, detecting the weeds and managing their removal, as shown in Figure 13. Computer vision technologies such as LiDAR were used to navigate robotic vehicles during agricultural exploration, and weed detection was accomplished using the AgriColMap method. This vehicle can estimate the density of the crop and use that information to guide itself around the field.

Figure 13. Conceptual identification and removal of wild plants using an ALS and a TLS: (a) weed detection via the ALS, (b) signal transfer to the TLS, (c) navigation to the exact location, and (d) removal of weeds via the TLS [152].

Liu et al. [153] used UAV–LiDAR to estimate spatial differences in the plant height of cotton plants. By comparing manual plant height measurements with LiDAR measurements, the research found a maximum relative error of 12.73% and an error value of 3.48 cm. The coefficient of variation analysis showed height variations in the crop row direction ranged from 0.54 to 1.04. In the direction perpendicular to the crop row, the coefficient of variation ranged from 0.06 to 1.27. The study provided insights into deriving geometric data from field crops and delivered useful information for variable equipment operations in cotton fields. Nguyen et al. [154] developed DairyBioBot, an unmanned ground vehicle (UGV) equipped with a ground-based LiDAR sensor and a real-time kinematic (RTK) positioning system. In broad perennial ryegrass field experiments, this novel approach enabled accurate and effective monitoring of plant volume as a proxy for biomass. R2 values of 0.71 at the row level and 0.73 at the plot level showed a robust relationship between LiDAR-derived plant volume and biomass on a fresh mass (FM) basis. Robotic equipment created by Cruz-Ulloa et al. [155] made it possible to automate the fertilization process and treat each plant individually. Researchers identified cabbages as the target crop and used LiDAR technology to automate fertilizer management. Researchers reported that point clouds provide remarkable potential for exact localization, which in turn makes it possible to follow and monitor the development of specific crops over time. Machine learning methods such as Euclidean clustering [156], support vector machines, k-nearest neighbors, and k-means have often been employed for applications requiring fruit or tree measurements and phenotyping [157]. However, voxelization was used by Itakura and Hosoi [158] to recognize the weeds present in the crop.
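As a deliberately simplified, hedged illustration (it is not the method of [152], [155], or [158]), the Python sketch below flags candidate weed points by first separating vegetation from the ground by height and then marking vegetation that lies between assumed crop rows. The row geometry and thresholds are invented for the example.

```python
# Simplified weed-flagging sketch: vegetation points found between assumed crop
# rows are treated as weed candidates. Row spacing and thresholds are illustrative.
import numpy as np

def flag_weed_points(points: np.ndarray, row_spacing: float = 0.75,
                     row_half_width: float = 0.15, min_veg_height: float = 0.05):
    """points: (N, 3) x, y, z with z as height above ground; rows are assumed to run
    along y at x = 0, row_spacing, 2*row_spacing, ... Returns a boolean weed mask."""
    x, z = points[:, 0], points[:, 2]
    vegetation = z > min_veg_height                    # not a ground return
    dist_to_row = np.abs((x + row_spacing / 2) % row_spacing - row_spacing / 2)
    off_row = dist_to_row > row_half_width             # lies between crop rows
    return vegetation & off_row

rng = np.random.default_rng(3)
pts = np.column_stack([rng.uniform(0, 3, 1000),        # x across several rows
                       rng.uniform(0, 5, 1000),        # y along the rows
                       rng.uniform(0, 0.6, 1000)])     # heights above ground
print(int(flag_weed_points(pts).sum()), "candidate weed points")
```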

3.1.3. Crop Growth Monitoring

Monitoring crop growth and estimating potential yield are critical aspects of precision agriculture. LiDAR technology provides accurate and real-time data on crop height, canopy cover, and biomass. By regularly scanning fields, farmers can track the progress of crop growth and identify areas with potential issues. LiDAR data can help optimize irrigation scheduling, nutrient management, and harvest planning [159]. Many researchers recommend that farmers use LiDAR-collected data to optimize crop production and project yield, thereby gaining insight into crop condition. Changes in crop biomass indicate to farmers that the plant has entered the reproductive phase; this shift must be detected so that the farmer can estimate the quantity of fertilizer required in advance. Li et al. [160] found that, compared with the traditional manual approach, an ALS was considerably more effective and time-saving for calculating LAI and plant height in maize fields, even in difficult-to-reach locations and adverse geographic conditions. Their study emphasizes the potential of ALS technology for improving data-gathering procedures for LAI evaluation and optimizing agricultural operations, as shown in Figure 14. With R2 values of 0.89, 0.86, and 0.78, the analysis showed a significant correlation between the canopy height data from the 3D point clouds and manual measurements.

Figure 14. Measurement of LAI using UAV-LiDAR systems [160].
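A canopy height model of the kind used in such ALS studies is commonly obtained by differencing two LiDAR-derived rasters; the following minimal sketch assumes the digital surface model (DSM) and digital terrain model (DTM) have already been gridded, and its values are illustrative only.

```python
import numpy as np

def canopy_height_model(dsm, dtm):
    """Canopy height model from gridded LiDAR surfaces: CHM = DSM - DTM,
    i.e., canopy-top elevation minus ground elevation, clipped at zero."""
    return np.clip(dsm - dtm, 0.0, None)

# Toy usage with a 2 x 2 grid of cells (elevations in metres).
dtm = np.array([[101.2, 101.3],
                [101.1, 101.2]])       # ground surface from last returns
dsm = np.array([[103.0, 103.4],
                [101.1, 102.6]])       # canopy surface from first returns
chm = canopy_height_model(dsm, dtm)
print(chm, f"mean canopy height: {chm.mean():.2f} m")
```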

3.1.4. Autonomous Navigation System

For autonomous vehicles, efficient field navigation is crucial since it enables the accurate completion of various tasks. Modern autonomous vehicles use a variety of sensors to gather important information about their surroundings and current robotic state [161]. Cameras and dual-antenna real-time kinematic (RTK) systems provide a rich data stream, but their effectiveness is very sensitive to changes in lighting and environment [162]. As a result, LiDAR sensors provide improved navigational dependability in an agricultural environment, making them preferable to visual cameras [163].

Wenju et al. [164] developed an orchard-harvesting robot with autonomous navigation. The technique addressed the problem of frequent navigational breaks by enabling continuous operation while turning, switching navigation modes based on GNSS distances. The experimental results were promising, with minimal data loss during communication and during repeated pauses in orchard rows. The study emphasized the potential advantages of LiDAR technology for increasing navigational accuracy and orchard harvesting efficiency. To address the requirement for effective plant protection, Wang et al. [165] described the development of an autonomous navigation system for orchards. The system used a three-module strategy: perception, decision, and control. Millimeter-wave radar and 3D LiDAR were used to identify obstacles and perceive the surroundings. Based on LiDAR data, orchard navigation lines were extracted using a four-step methodology. The active disturbance rejection control (ADRC) method successfully improved the noise immunity of the plant protection machinery's control system, a notable achievement in agricultural technology because it ensures that the machines can operate effectively without interruption caused by external interference.

Liu et al. [166] combined automated navigation (AN) with precision variable-rate spraying (PVS) using a single 3D LiDAR sensor to increase pesticide application safety in orchards while reducing environmental effects. Fruit trees were detected and the region of interest (ROI) was determined using the LiDAR sensor. The random sample consensus (RANSAC) technique was applied in 2D processing inside the ROI to compute the center-of-mass coordinates of fruit trees and calculate the robot's vertical distance from the fruit tree row (FTR) center line. An encoder and inertial measurement unit (IMU), which correct data from the fruit tree canopy, were then used to guide the robot along the FTR center line. The findings demonstrated outstanding accuracy, drastically reducing pesticide application, air drift, and ground loss compared with conventional spraying, with reductions of 32.46%, 44.34%, and 58.14%, respectively. This integrated strategy using 3D LiDAR, an encoder, and an IMU reduced pesticide volume, ground loss, and air drift while improving environmental control. In recent research, Bertoglio et al. [167] developed a navigation system that utilized LiDAR and wheel encoder sensors for precise navigation in row-structured agricultural environments such as vineyards. The technique exploited the straight, regular geometry of the plant rows. In simulated and real-world testing, the system showed mean displacement errors of 0.049 m and 0.372 m, respectively, for in-row navigation.

Additionally, the researchers devised an end-row point detection method to facilitate end-row navigation in vineyards, a feature frequently neglected in comparable works. The outcomes demonstrate the efficacy and reliability of their methodology. Many algorithms, such as graph-based optimization for simultaneous localization and mapping (SLAM) [168], RANSAC [169], and H∞ control [170], have also been employed for navigation-based applications. Moreover, precision agriculture has led to autonomous agricultural equipment, which decreases human labor, simplifies processes, and boosts production. However, precise obstacle detection and avoidance technologies are needed to keep these machines safe [171]. Traditional LiDAR-based obstacle identification approaches for farming areas are laborious and time-consuming owing to manually created features. To address this issue, Qin et al. [172] applied deep learning on agricultural equipment fitted with LiDAR, a camera, and GNSS/INS. The researchers used focal sparse convolution to train a 3D obstacle detector called Focal Voxel R-CNN and extracted useful features from sparse point cloud data. With a mean average precision (mAP) of 91.43% and a detection speed of 28.57 frames per second (FPS), the proposed model greatly outperformed Voxel R-CNN in terms of detection performance, providing autonomous agricultural machinery with a more dependable approach to obstacle identification. Kong et al. [173] used 2D LiDAR combined with the minimum cost maximum flow (MCMF) and density-based spatial clustering of applications with noise (DBSCAN) algorithms to identify obstacles in front of tractors. The study introduced a mechanism to classify security alert levels based on obstacle states and a way to distinguish between static and dynamic obstacles. Vehicles were tested against static and dynamic obstacles at different speeds and directions, and the results revealed a remarkable average warning accuracy of 89.5%. The system's reliable obstacle prediction ensures agricultural vehicle safety, advancing agricultural mechanization. Jiang and Ahmed [174] employed LiDAR to guide an autonomous spraying robot for orchard operations.
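As a rough sketch of the RANSAC-based row-line extraction step described for Liu et al. [166] above, the following example fits a line through hypothetical fruit-tree centre coordinates and returns the robot's lateral offset from the row centre line; it is a simplified stand-in, not the published pipeline.

```python
import numpy as np
from sklearn.linear_model import RANSACRegressor

def row_offset(tree_xy):
    """Fit the fruit tree row centre line y = a*x + b with RANSAC (robust to
    spurious detections) and return the robot's lateral distance from that
    line, assuming the robot sits at the origin with x along the row."""
    x, y = tree_xy[:, [0]], tree_xy[:, 1]
    model = RANSACRegressor(residual_threshold=0.3).fit(x, y)
    a = float(model.estimator_.coef_[0])
    b = float(model.estimator_.intercept_)
    return abs(b) / np.hypot(1.0, a)        # point-to-line distance from (0, 0)

# Toy usage: tree centres roughly 1.5 m to the robot's left, plus one outlier.
trees = np.array([[1.0, 1.45], [2.0, 1.55], [3.0, 1.48], [4.0, 1.52], [2.5, 4.0]])
print(f"lateral offset from row centre line: {row_offset(trees):.2f} m")
```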

3.1.5. Above-Ground Biomass (AGB) and Yield Estimation

Researchers have also used LiDAR to accurately estimate above-ground biomass [175,176,177,178,179] and yield [180,181]. The ability to accurately estimate canopy structure, height, and biomass distribution is useful for evaluating carbon sequestration and fine-tuning agricultural management strategies. Rapid, non-destructive data collection using LiDAR is essential for maintaining effective and sustainable agricultural methods. In southern Spain, Rodriguez et al. [182] measured carob biomass with an ALS and evaluated the crop's carbon storage with allometric methods; this classification was essential for estimating soil qualities and assessing plant biomass. Sun et al. [183] used an advanced MLS to track cotton plant development across seasons and years. This novel technique shed light on the dynamics of plant growth under various climatic conditions and demonstrated how data-driven approaches have the potential to transform agricultural management practices. The successful implementation of the MLS provided thorough monitoring capabilities and has implications for improving crop yield and quality while reducing resource consumption and environmental impact, highlighting its significance in advancing productive and sustainable agricultural practices. It was also stated that yield estimates could be improved by incorporating LiDAR data.

A semi-autonomous vehicle designed for vineyard monitoring was created by Vidoni et al. [184] in response to the need for frequent estimates. Two LiDAR scanners were installed on the vehicle to thoroughly scan the vineyards, allowing the analysis of plant volume and morphology. The researchers created a complex algorithm based on measurements of branch thickness and the normalized difference vegetation index (NDVI), which was crucial for obtaining data on canopy properties.
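The NDVI referred to above is a simple band ratio; the following one-line implementation with illustrative reflectance values shows how it separates dense canopy from sparse cover.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index: NDVI = (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + eps)

# Toy usage: a vigorous canopy reflects strongly in the NIR band, bare soil does not.
print(ndvi([0.45, 0.20], [0.08, 0.15]))   # roughly 0.70 (dense canopy) and 0.14 (sparse)
```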

3.2. Soil Analysis and Management

LiDAR technology significantly benefits soil analysis and management practices in precision agriculture. By capturing detailed topographic information, LiDAR assists in mapping, classifying, and monitoring essential soil parameters [185].

3.2.1. Soil Mapping and Classification

Combining LiDAR data with precise georeferencing techniques allows for the creation of high-resolution soil maps. By analyzing variations in surface elevation and slope from LiDAR, researchers and farmers can identify and map different soil types and their spatial distribution within a field. This detailed information supports accurate soil classification and enables targeted soil management practices, such as variable-rate fertilization and site-specific irrigation. As commercial airborne and ground-based LiDAR technology becomes more widely available, it shows tremendous potential for quantifying the spatial distribution of surface roughness (SR) across large areas reasonably and effectively [186]. By measuring the return times of laser pulses transmitted from an airborne platform toward the ground, LiDAR creates a 3D point cloud. Airborne laser scanners (ALSs) can map the geographical distribution of SR using high-density, accurate elevation data that can be gathered quickly over wide regions at a suitably dense sample rate [187]. Foldager et al. [188] highlighted the benefits of LiDAR technology for the efficient and precise analysis of soil surfaces in agricultural practice. The researchers measured and analyzed a furrow's cross-sectional size and shape following a trailing shoe sweep using a manual pinboard and a LiDAR sensor; the pinboard and LiDAR measurements demonstrated significant differences of up to 41%. Additionally, LiDAR analysis of the effect of irrigation on furrow areas revealed variable increases in cross-sectional area under different irrigation levels. Hollaus et al. [189] took a dual approach to quantifying roughness, using two measures: "surface roughness" (SR) and "terrain roughness" (TR). For the SR computation, full-waveform airborne laser scanning (FWF-ALS) terrain points within a certain height range from the terrain surface were measured. They demonstrated that FWF-ALS laser point attributes help characterize surface properties and introduced a new method, vertical roughness mapping (VRM), for identifying roughness across different vertical layers in forested regions, with accurate results when validated against in situ reference data.
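Surface roughness metrics of this kind can be sketched as the per-cell dispersion of ground-return elevations; the following simplified example (with a hypothetical cell size and synthetic data, and not the FWF-ALS procedure of [189]) illustrates the idea.

```python
import numpy as np

def surface_roughness(ground_points, cell=1.0):
    """Per-cell surface roughness (SR) as the standard deviation of
    ground-return elevations about the cell mean, on a regular grid."""
    ij = np.floor(ground_points[:, :2] / cell).astype(int)
    keys, inverse = np.unique(ij, axis=0, return_inverse=True)
    inverse = inverse.ravel()                                  # robust across NumPy versions
    sr = np.array([ground_points[inverse == k, 2].std() for k in range(len(keys))])
    return keys, sr

# Toy usage: a smooth field with one rougher (tilled) strip at x < 5 m.
n = 20000
pts = np.column_stack([np.random.rand(n) * 20, np.random.rand(n) * 20,
                       np.random.normal(100.0, 0.01, n)])
rough = pts[:, 0] < 5
pts[rough, 2] += np.random.normal(0.0, 0.05, rough.sum())
_, sr = surface_roughness(pts)
print(f"roughness range across cells: {sr.min():.3f}-{sr.max():.3f} m")
```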

3.2.2. Soil Moisture Content and Nutrient Analysis

LiDAR data can indirectly provide insights into soil moisture content by analyzing vegetation reflectance patterns and canopy structure. Changes in vegetation characteristics detected by LiDAR, such as reduced canopy height or increased gaps, may indicate areas with higher moisture stress. LiDAR data, when combined with spectral analysis techniques, can also help estimate soil nutrient levels. By examining the relationship between LiDAR-derived metrics and nutrient content, farmers can develop precise fertilization plans designed to meet the demands of varying soil types throughout a field.

Phosphorus levels are another crucial aspect of soil nutrient management, as excess phosphorus poses a threat to water quality. Cassidy et al. [190] employed an ALS to examine this trait in crops and mitigate the potential for excessive phosphorus in soil. The research developed a runoff routing model using LiDAR elevation data and soil hydraulic conductivity values to identify hydrologically sensitive areas. In agricultural fields with above-optimal soil phosphorus levels, the locations where surface runoff paths concentrate are those where the danger to water quality is greatest.

To establish the appropriate resolution for defining observed soil moisture patterns, Southee et al. [191] investigated three LiDAR-derived terrain surfaces produced at different spatial resolutions. They used the topographic wetness index (TWI), percent elevation index (PEI), and canopy height model (CHM) to analyze soil moisture at spatial resolutions of 2 m, 5 m, 10 m, and 20 m, and depression-removal algorithms were also applied. For depths of 0–15 cm and 0–40 cm, respectively, the coefficients of determination between soil moisture and TWI were R2 = 0.346 and R2 = 0.292. According to the findings, predicting soil moisture patterns at shallow depths (0 to 15 cm) may be more successful when using high-spatial-resolution variables (between 2 m and 5 m), whereas deeper depths (0 to 40 cm) could be better suited to coarser resolutions (10 m, 20 m).
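The topographic wetness index used above follows the standard formulation TWI = ln(a/tan β); the sketch below assumes the specific catchment area and slope have already been derived from a LiDAR DTM, with illustrative input values.

```python
import numpy as np

def topographic_wetness_index(spec_catchment_area, slope_deg):
    """TWI = ln(a / tan(beta)), where a is the specific catchment area
    (upslope area per unit contour width, m) and beta is the local slope,
    both of which are typically derived from a LiDAR DTM."""
    a = np.asarray(spec_catchment_area, float)
    tan_b = np.tan(np.radians(np.asarray(slope_deg, float)))
    return np.log(a / np.maximum(tan_b, 1e-3))     # guard against flat cells

# Toy usage: a flat valley-bottom cell versus a steep hillslope cell.
print(topographic_wetness_index([500.0, 20.0], [0.5, 15.0]))  # wetter cell scores higher
```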

To better plan water and irrigation management, Demelezi et al. [192] demonstrated how the spatial visualization of data can reveal the advantages of using high-accuracy remote sensing LiDAR data to derive soil hydrological characteristics and to prevent potential field waterlogging. The use of remotely sensed data such as LiDAR also provided a significant benefit in data collection and site information presentation; as a result, at the plot-farm level, mapping and digitalizing landscape LiDAR information was beneficial, accurate, and simple to comprehend. Comparatively, Kemppinen et al. [193] used LiDAR data and field experiments to examine the role of soil and land surface variables in landscape-scale soil moisture change. This investigation modeled soil moisture and its temporal change using generalized additive models (GAM), boosted regression trees (BRT), and RFR. Figure 15 presents the forecasting abilities of the four soil moisture modeling techniques; the horizontal and vertical segments show the ranges of each modeling technique. Results showed that the average predictive performance was an R2 value of 0.47 and an RMSE of 9.34 VWC% (volumetric water content, %), whereas the average model fit was an R2 value of 0.60 and an RMSE of 8.04 VWC%. The temporal variation models achieved a predictive performance of R2 = 0.01 and RMSE = 15.29 CV%, and a model fit of R2 = 0.25 and RMSE = 13.11 CV%.

Figure 15. Comparison of four soil moisture modeling methods [193].

4. LiDAR Applications in Crop Harvesting

Real-time optimization of crop production processes, including fertilization, crop protection, and harvesting, becomes possible when the crop and field characteristics described above are taken into account. However, research and development on the vehicle-based use of LiDAR technology in agriculture for collecting and exploiting such characteristics is still at an early stage. This section explores how LiDAR technology is utilized in various aspects of crop harvesting to improve efficiency and productivity. We discuss three key areas of application: crop maturity and yield prediction, autonomous harvesting systems, and post-harvest quality assessment.

Saeys et al. [194] investigated how to assess crop density in front of combine harvesters using two LiDAR sensors and data-processing techniques. Figure 16 illustrates different crop densities and the combine harvester fitted with the LiDAR system. Using a linear model based on the local standard deviation of the penetration depth, accurate estimates of crop density could be generated from the scans of both sensors. R2 values for this linear model varied from 0.81 to 0.96 in experiments with various driving speeds and machine vibration levels. The suggested new approach for predicting crop volume showed excellent results, and the researchers concluded that LiDAR technology could improve automated feed-rate control systems in combine harvesters.

Figure 16. Different crop densities and the combine harvester with the LiDAR system: (a) 100 ears per m2; (b) 200 ears per m2; (c) 300 ears per m2; (d) 400 ears per m2 [194].
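A linear calibration of the kind reported by Saeys et al. [194] can be sketched as follows; the penetration-depth and density values are hypothetical and serve only to illustrate fitting and applying such a model.

```python
import numpy as np

# Hypothetical calibration data (not from [194]): moving-window standard
# deviation of LiDAR penetration depth (m) in plots of known density (ears m^-2).
pen_depth_std = np.array([0.06, 0.09, 0.13, 0.18, 0.22, 0.27])
density = np.array([400.0, 330.0, 270.0, 210.0, 150.0, 100.0])

# Fit the linear calibration model: density = a * std + b.
a, b = np.polyfit(pen_depth_std, density, deg=1)

def estimate_density(std_window):
    """Predict crop density ahead of the header from the local standard
    deviation of the measured laser penetration depth."""
    return a * std_window + b

print(f"predicted density at std = 0.15 m: {estimate_density(0.15):.0f} ears/m^2")
```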

Selbeck et al. [195] investigated a vehicle-based scanning LiDAR sensor and its measuring properties in maize stands. Laser beam form, laser layer, laser echoes, field of view, and data density variation were all identified as potential sources of measurement error, and solutions were proposed to eliminate them. The 3D model built from the scanner data allowed the computation of volume and biomass parameters for machine control and gave adequate surface information about maize stands; the researchers concluded that the weighed biomass correlates with the height and volume calculated by the model. Deremetz et al. [196] suggested an ultra-wide-band-based technique to guarantee relative localization without requiring a direct line of sight to the object. To replicate a person's route without direct communication or an absolute localization system, the approach locally approximates the trajectory of the human leader as a circle. Prins and Niekerk [197] used LiDAR, Sentinel-2, aerial imagery, and machine learning to distinguish five crop varieties in an intensively planted region. Ten machine-learning techniques and various combinations of the three datasets were tested, and the classification results were interpreted by comparing the aggregate accuracies, kappa, standard deviation, and F-score. LiDAR data were effective for distinguishing between the various crop types, with XGBoost providing the highest overall accuracy of 87.8%. Furthermore, the crop-type maps made using Sentinel-2 data and those created using LiDAR data largely agreed.

To determine the optimal time to harvest sugarcane, Canata et al. [198] investigated LiDAR technology, specifically a laser sensor mounted on an agricultural platform. The measurement system integrated a laser sensor, GNSS receiver, and inertial device with a computer on the platform of an agricultural tractor, and data collection was carried out around ten days before harvest. The approach generated point clouds with a density of roughly 2000 points m−2 and extracted sugarcane plant height measurements. The research found that platform vibration considerably impacted the dataset in one experimental area owing to high-amplitude spectral power. The proposed measuring system identified sugarcane plants in the pre-harvest phase without signal saturation issues.

4.1. Crop Maturity and Yield Prediction

LiDAR technology offers non-destructive and accurate methods for crop maturity and yield prediction. By capturing detailed 3D data of crop canopies, LiDAR enables precise assessment of key parameters such as plant height, leaf area, and canopy density. These metrics are critical for determining crop maturity and optimizing harvest timing, thereby maximizing crop quality and yield potential. LiDAR-based yield estimation models have shown promising results in predicting crop yields with high accuracy, contributing to better planning and resource allocation. To forecast wheat production and grain protein content (GPC), Sun et al. [199] used LiDAR technology with deep learning algorithms and time-series proximal sensing data; incorporating LiDAR data through data fusion methods significantly increased prediction accuracy. Apple fruit detection and yield prediction were accomplished by Gene-Mola et al. [200] using a multi-beam LiDAR sensor with forced airflow from an air-assisted sprayer. The fruit identification algorithm was developed using reflectance thresholding (RT) and a support vector machine (SVM). The experiment was designed to boost fruit detection and decrease fruit occlusions by shifting the tree foliage and using a multi-view sensor. Figure 17 illustrates how forced airflow affects fruit identification.

Figure 17. Multi-beam LiDAR system for the detection of fruits [200].

When comparing the number of apples detected by LiDAR with the forced airflow system to the actual number of apples per tree, the RMSE was 19.0% and the R2 values were 0.58 and 0.54 when scanning was performed from the east and west sides, respectively; when both sides of the trees were evaluated, the RMSE was 5.7% and the R2 was 0.87. Canopy height has been widely adopted as a surrogate for biomass [201]. Tilly et al. [202] added a field spectrometer to their platform and derived bivariate biomass regression models by fusing 3D data and spectral TLS data. The researchers used linear and exponential biomass regression models (BRMs) to evaluate the accuracy of plant height and vegetation indices (VIs) as fresh and dry biomass estimators, and their results demonstrate the promising potential of remotely sensed plant characteristics for estimating barley biomass. Li et al. [203] investigated the potential of airborne LiDAR data for measuring the AGB and belowground biomass (BGB) of maize, identifying canopy height and LAI as essential factors that directly affect biomass components. LiDAR-derived maize height and LAI data were used to illustrate the spatial distribution of biomass components, and field-based estimation methods were presented. The findings showed that LiDAR-estimated biomass was equivalent to field-measured biomass, highlighting the enormous potential of airborne LiDAR for determining canopy height, LAI, and the various biomass components of maize at the peak of the growing season.
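Linear and exponential biomass regression models of the type used by Tilly et al. [202] can be illustrated with a short sketch; the height and biomass values below are hypothetical and are not the published data.

```python
import numpy as np

# Hypothetical plot-level data (not from [202]): mean TLS-derived plant
# height (m) and measured fresh biomass (kg m^-2).
height = np.array([0.25, 0.40, 0.55, 0.70, 0.85, 1.00])
biomass = np.array([0.8, 1.6, 2.7, 4.1, 5.9, 8.2])

# Linear BRM: biomass = a*h + b; exponential BRM: biomass = exp(b) * exp(a*h),
# the latter fitted as a straight line in log space.
a_lin, b_lin = np.polyfit(height, biomass, 1)
a_exp, b_exp = np.polyfit(height, np.log(biomass), 1)

def predict_biomass(h):
    return a_lin * h + b_lin, np.exp(b_exp + a_exp * h)

lin, expo = predict_biomass(0.60)
print(f"linear BRM: {lin:.2f} kg/m^2, exponential BRM: {expo:.2f} kg/m^2")
```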

4.2. Autonomous Harvesting Systems

LiDAR technology is essential for the advancement of autonomous harvesting systems. It provides real-time, precise data on crop location, size, and orientation, enabling autonomous vehicles and robotic systems to navigate and maneuver with high accuracy in agricultural fields. By employing both 2D and 3D LiDAR, these systems can effectively detect and avoid obstacles, including plants, structures, and other potential hazards, minimizing the risk of collisions. This capability enhances the safety and efficiency of harvesting operations, allowing for continuous, high-quality crop collection while reducing the need for manual intervention. Furthermore, the integration of LiDAR data with other sensors and advanced algorithms improves decision-making processes, optimizing the overall performance and productivity of autonomous harvesting systems [204,205]. Additionally, LiDAR-based perception systems enhance the precision and dexterity of harvesting robots, enabling them to harvest ripe crops selectively while minimizing damage to the surrounding vegetation. Geer et al. [206] developed a LiDAR-based autonomous robotic system for harvesting strawberries, using a novel software architecture that integrated GMapping version 1.3 with ROS and a SLAM algorithm. Laboratory and field experiments were conducted to assess the accuracy of the autonomous harvesting system. The researchers reported that the robotic system could identify the correct rows to travel between while maintaining a constant distance from the fruit trees, and the perception system was designed to function effectively without fruit damage or loss of position. Vrochidou et al. [207] developed a LiDAR-based autonomous harvester for grape harvesting, consisting mainly of three units: (1) an aerial unit, (2) a remote-control unit, and (3) the ARG ground unit, as shown in Figure 18. Three machine vision algorithms (grape cluster detection, ripeness assessment, and grape stem identification) were executed sequentially to complete the harvesting process. The LiDAR employed two algorithms for localization: (1) an ICP algorithm that maps the robot pose using 3D data from 16 laser beams to ensure precise position monitoring, and (2) a wall-following algorithm that uses one LiDAR laser beam to maintain a fixed operational distance. The researchers also confirmed that the versatility built into the system's architecture enables it to be seamlessly adapted to various crops and orchard settings.

Figure 18. Conceptual architecture of the system integrating the three units [207].
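A wall-following behaviour such as the single-beam approach described above can be reduced, in its simplest form, to a proportional controller on the measured lateral range; the gain, target distance, and limit below are hypothetical, and this is not the controller reported in [207].

```python
def wall_following_steer(measured_range, target_range=1.5, gain=0.8, limit=0.5):
    """Proportional steering command (rad) that holds a fixed lateral distance
    to the vine row using a single sideways-looking LiDAR range reading:
    positive output steers towards the row when the robot has drifted away."""
    error = measured_range - target_range
    return max(-limit, min(limit, gain * error))

# Toy usage: the robot has drifted 0.3 m too far from the row.
print(f"steering command: {wall_following_steer(1.8):+.2f} rad")
```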

Xiong et al. [208] designed a fully integrated LiDAR-based strawberry harvesting system for picking strawberries in clusters. The data from the 2D LiDAR were processed using the GMapping SLAM technique to generate a map. The system was tested on a strawberry farm, demonstrating its capacity to effectively harvest strawberries that were partly covered by leaves. Depending on the parameters, the first-attempt success rates ranged from 50.0% to 97.1%, while on the second try, these rates rose to 75.0–100.0%.

4.3. Post-Harvest Quality Assessment

After harvesting, maintaining the quality of crops is crucial for maximizing their market value. LiDAR technology offers valuable insights for post-harvest quality assessment. By analyzing the optical properties of harvested crops, such as color and surface characteristics, LiDAR can help detect defects, damage, or signs of spoilage. This information enables the timely sorting and grading of crops, ensuring that only high-quality products reach the market. Furthermore, LiDAR-based 3D scanning can facilitate precise crop volume and shape measurements, providing essential data for efficient packaging and storage. Many researchers have used LiDAR technology to quantify crop height and wheat straw production for post-harvest quality evaluation [209,210]. When used together with yield monitor data, such measurements can be very helpful for calculating and mapping indicators of crop stress for the post-harvest evaluation of growing-season conditions.
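A simple way to obtain the volume and shape measurements mentioned above is to compute the convex hull of the scanned points; the following sketch uses synthetic data and is only a first-order approximation suitable for roughly convex produce.

```python
import numpy as np
from scipy.spatial import ConvexHull

def crop_volume_and_area(points):
    """Approximate the volume (m^3) and surface area (m^2) of a scanned
    product from its 3D point cloud using the convex hull."""
    hull = ConvexHull(points)
    return hull.volume, hull.area

# Toy usage: points sampled on a sphere of radius 0.05 m
# (true volume ~524 cm^3, true surface area ~314 cm^2).
rng = np.random.default_rng(0)
xyz = rng.normal(size=(2000, 3))
xyz = 0.05 * xyz / np.linalg.norm(xyz, axis=1, keepdims=True)
vol, area = crop_volume_and_area(xyz)
print(f"volume: {vol * 1e6:.0f} cm^3, surface area: {area * 1e4:.0f} cm^2")
```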

Long et al. [211] used a hybrid strategy, combining a multi-sensor system with LiDAR sensor technology to present accurate, site-specific measures of grain production, grain protein content, and straw yield. The system included a mass-flow yield monitor, an optical NIR spectrometer, a LiDAR sensor, a global positioning system receiver, and a laptop computer, as shown in Figure 19. This combined method maintained the same level of spatial resolution as grain yield. By analyzing connections between grain yield and protein maps produced by the yield monitor and protein sensor, researchers could detect areas in fields where nitrogen or water stress was limiting grain output. The results of this study highlight the potential of stress-related indicator maps in enhancing the ability to evaluate grain yield maps.

Figure 19. Multi-sensor system for post-harvest assessment of environmental stress in wheat [208].

Mulley et al. [212] investigated the potential for determining the health of date palms using readily available thermal, hyperspectral, LiDAR, and visual RGB images. Despite difficulties in preprocessing and evaluating the quality of the hyperspectral and thermal data, the approaches used in this investigation provide interesting directions for future research. These results highlight how remote sensing data can support precision agricultural techniques and improve plantation management. Notably, the combined use of high-resolution thermal and hyperspectral images may provide information on the health of specific trees, and combining these indicators may provide a complete tree health assessment technique. Table 5 summarizes the different LiDAR systems and LiDAR sensors used by different researchers for efficient crop management.

Table 5.

Summary of the use of LiDAR systems for efficient crop management.

LiDAR Types LiDAR Sensor Software System Algorithm Model Crop Feature Ref.
TLS RIEGL VZ-1000 RiSCAN Pro 8.0 ICPA
IDW
MCSM, DEM Paddy rice Crop height [213]
TLS ILRIS TerraScan 16.0
Surfer 8
ICP QTM Shrubs Biomass detection [214]
TLS RIEGL VZ-1000 RiSCAN Pro 8.0 ICP CSM, DTM, CHM Wheat Canopy height detection [215]
ALS VUX-1UAV TerraScan 16.0
R Studio
Random forest CHM, DEM, DSM Maize, Soybean Crop heights estimation [216]
ALS Leica ALS70 TerraScan 16.0
ENVI 5.6
TIN
FLASH
DTM, BEM Maize Biomass estimation [217]
ALS DJI Matrice M300 CloudCompare 2.11.3
DJI Terra 3.0
Python 3.7
LiDAR 360
TransUNet
U-Net, FCN
PCTA
CXHA
CMM, DSM,
DTM
Broccoli Canopy and head estimation [218]
TLS SICK
LMS200
CloudCompare 2.11.3
MATLAB R2013a
R Studio 0.96
- DTM Sugarcane Monitoring production [219]
TLS RigelScan Elite PCL 1.9.1 RANSAC
PCPA
CGA
- Legume Phenotyping [220]
ALS RIEGL RiCOPTER CloudCompare 2.10.2
RiPROCESS 4.0,
RiPRECISION 4.0
3DPI DTM, DSM, CHM Potato, Sugar beet, Wheat Crop height and biomass estimation [221]
TLS LMS 511,
SICK AG
LabVIEW 2.0
MATLAB 2016b
PCPA
K-means clustering
LOWESS
3DLSM Sorghum, Maize Morphological traits detection [222]
TLS FARO Focus S70 SCENE 2022
PCL 1.12.0
DBSCAN
RNN
KDTree
MSM
MOSM
Maize Segmentation and stratification [223]
TLS LMS-Z420i RiSCAN Pro SDA CSM, DTM, DSM, Barley, Sugar beet Crop heights estimation [224]
ALS RIEGL VUX-1UAV LiDAR360 - CHM, DEM, DTM, DSM Maize Lodged crop height estimation [225]
TLS Kaarta Stencil 2 CloudCompare 2.12.4
MATLAB 2022a
CXHA
ASA
- Olive Tree volume estimation [226]
TLS RIEGL LMS-Q680i SideLook 1.1.01
R Studio 1.0
Mean height DTM, CMM Corn Hail defoliation assessment [227]
TLS FX6 LIDAR PCL 1.0
MATLAB R2011a
RANSAC CRM Maize Plant detection and mapping [228]
TLS LiDAR Lite V3 R Studio
PCL
Arduino IDE 1.8.13
- IWM Palm Bunch ripeness prediction [229]
TLS FARO Focus 3D X120 FARO SCENE 5.4.4
LiDAR360
- DTM, Maize Monitoring phenotypes [230]
TLS Velodyne VLP-16 VeloView 3.5
MATLAB
RANSAC LBM - Plowing furrow recognition [231]
TLS LiDAR-Lite v3
LMS400
CloudCompare
Python 2.7
MATLAB 2017a
PCSA
PCAA
3D point cloud Canola Leaf morphology extraction [232]
TLS TDS-130L
3D
Point Cloud
MeshLab
RANSAC
ICPA
- Tomato Leaf morphology analysis [233]
TLS

ALS
FARO Focus 350
RIEGL VUX-1UAV
CloudCompare 2.11.3
RiProcess 2.8
FARO SCENE
POSPAC UAV 5.0
- LADE Maize Ear height estimation [234]
ALS Leica Aibot X6V2 CloudCompare 2.9.1
MATLAB R2019a
ASA
PCAA
DTM,
DEM
Apple Crown Parameters detection [235]
ALS Leica ALS50-II FUSION/LDV 3.42 Tree detection algorithm DTM, CHM Olive Pruning biomass estimation [236]
ALS LM7800 CloudCompare, MATLAB,
FUSION/LDV
ASA,
k-means algorithms
DTM, DSM Walnut Crown parameters detection [237]
MLS LMS511 pro CloudCompare 2.10.2
MATLAB R2021a
ICPA
CHA
- Strawberry Plant vegetative growth [238]
TLS 3D Velodyne, 2D Sick LMS PCL 1.11.1
CloudCompare 2.10.2
Pix4D 4.6.0
ICPA
RANSAC
KM, DEM Citrus Canopy height [239]
TLS LMS-511 CloudCompare 2.11.0
MATLAB R2022a
ICPA 3D Point Cloud
LRM, KNN
Apple Tree canopy and fruit quality [240]
TLS SICK LMS111 LabVIEW 2020
MATLAB R2020a
ICPA CSM, CHM, DSM Apple Canopy leaf area [241]
TLS Sick
LMS-111
MATLAB R2022a ASA - Peach Tree canopy estimation [242]
TLS Sick LMS200 PCL 1.8.1
MATLAB R2018a
LRM - Apple, Pear, Vineyard Canopy leaf area [243]
TLS OMD10M-R2000-B23-V1V1D Python 3.7
LabVIEW 2019
SORA
GNBA
URL,
PLSRM
Banana Fruit chlorophyll estimation [244]
ALS Velodyne Puck LITE PCL
CloudCompare 2.11.3
MATLAB R2021a
ICPA
SLAM
CHM, LAI Peach Plant parameter estimation [245]
TLS Sick
LMS 200
CloudCompare 2.8.1
R Studio 1.1.x
ASA
CXHA
- Orange Canopy volume estimation [246]
TLS Sick AG LMS111 SOPAS ET
MATLAB R2016a
PCA - Plants Plant monitoring [247]
TLS Optech ILRIS 36D Polyworks v10 - LLSVM Grapevine Biomass estimation [248]
MLS, TLS Backpack DG50
FARO Focuss 350
CloudCompare 2.6.1
MATLAB 2016a
LiDAR 360 v4.1
TreeQSM QSM Apple Tree branch information [249]
MLS VLP-16 LiDAR CloudCompare 2.9.1
MATLAB 2016a
CXHA, CSF
ASA
VBS
ASBS
DTM Grapefruit Canopy parameter estimation [250]
TLS FARO Focus 3D CloudCompare
MATLAB
TreeQSM
FARO SCENE v2022.1.0
- QSM Peach 3D crown architecture estimation [251]
MLS Velodyne VLP-16 VeloView
MATLAB 2020a
RANSAC
ASA
LRM
CFPD
Apple Canopy density estimation [252]
MLS Stereo vision ROS Skeletonization, CNN TM Cherry Pruning [253]
TLS FARO
Focus S70
PCL 1.11.1
Visual Studio 2022
FV-2200
DTA
LA-DT
LRM
LAEM
Rapeseed Leaf estimation [254]

From the summary provided in Table 5 above, it is observed that, owing to their easy access to crops, MLSs are useful for a variety of tasks, including crop monitoring, object detection and classification, tree volume estimation, and navigation. The TLS, on the other hand, has a distinct advantage in tasks such as pruning and is particularly suitable for digitization-related tasks such as accurately capturing tree structure and foliage. For both the MLS and TLS, the most frequently used LiDAR sensors are the LMS511, LMS111, Focus 350, and VLP-16 (also known as the Puck). For tasks such as tree counting, determining irrigation areas, assisting navigation, and monitoring orchards, ALSs excel because of their unique viewpoint and are excellent at capturing the spatial arrangement of objects. For processing point clouds, CloudCompare, LiDAR360, and the Point Cloud Library are the three most widely used tools, and the most prevalent sensor type is the VLP-16. ArcGIS is a frequently used option for viewing and analyzing rasterized LiDAR point cloud data. By utilizing point cloud processing techniques, one can estimate metrics, classify and cluster data for artificial vision and monitoring purposes, and voxelize digital representations of foliage, seeds, plants, and tree structures, enabling an integrated approach to tasks utilizing LiDAR technology. Table 6 provides a comprehensive comparison of the different types of LiDAR systems (ALS, TLS, and MLS) and their respective advantages, applications, and limitations for crop management in precision agriculture.

Table 6.

Comparison of airborne LiDAR systems (ALSs), terrestrial LiDAR systems (TLSs), and mobile LiDAR systems (MLSs) for crop management in precision agriculture.

Aspect Airborne LiDAR System (ALS) Terrestrial LiDAR System (TLS) Mobile LiDAR System (MLS)
Platform Aircraft (e.g., UAVs, planes, helicopters) Stationary tripods or ground vehicles Moving vehicles (e.g., tractors, ATVs)
Coverage Area Large-scale, extensive coverage Small to medium-scale, site-specific Medium-to-large-scale, linear coverage (e.g., along field rows)
Data Collection Rapid, wide-area scanning High-detail, stationary, or limited-area scanning Continuous, high-density scanning along paths of movement
Altitude/Range High altitude, large range Ground level, limited range Variable altitude, depending on vehicle height and field conditions
Resolution Moderate to high, depending on altitude and sensor Very high due to proximity to crops High, suitable for detailed mapping of crop rows and field features
Applications Large field surveys, topographic mapping, biomass estimation Detailed plant structure analysis, canopy height measurement Field condition monitoring, crop health assessment, yield estimation
Advantages Extensive coverage, rapid data collection over large areas High precision and detail, minimal atmospheric interference Mobility, ability to cover large areas quickly and repeatedly
Limitations Weather dependent, regulatory restrictions, cost Limited coverage area, requires multiple setups for large fields Limited by vehicle access, potential for obstructions in the field
Crop management practices Field topography, crop biomass, and health mapping Detailed crop structure analysis, disease detection Monitoring crop growth, assessing field variability, yield prediction

5. Conclusions

In conclusion, this review paper explored the critical evaluation of LiDAR applications in precision agriculture for crop management. LiDAR technology has shown great potential in revolutionizing crop management practices in precision agriculture. Its applications in crop yield forecasting, biomass mapping, crop height measurement, and autonomous navigation for agricultural robots demonstrate its ability to provide accurate and timely data for decision-making. As LiDAR technology advances, it is expected to play an increasingly important role in optimizing cultivation and harvesting practices, ultimately contributing to more sustainable and efficient agricultural systems. LiDAR technology emerged as a promising tool with its unique capabilities and advantages in agriculture. This section explores the future perspectives and challenges associated with LiDAR applications in precision agriculture and the potential advancements and innovations that will likely shape the field.

5.1. Emerging Trends and Innovations in LiDAR Technology

This subsection focuses on the emerging trends and innovations in LiDAR technology that are relevant to precision agriculture. New developments in LiDAR technology for precision agriculture encompass a range of advancements. These include downsizing and integrating LiDAR sensors onto drones and equipment, resulting in enhanced compactness and durability. Moreover, higher resolution and multi-wavelength capabilities significantly improve object recognition. Sensor fusion has enhanced environmental modeling accuracy, while integration with autonomous vehicles and advanced driver-assistance systems has opened new avenues for real-time crop monitoring in precision agriculture. Combining LiDAR with data from other sensors allows for comprehensive crop health and environmental assessments. LiDAR, enhanced by machine learning and AI, facilitates automated crop management, variable-rate application, and 3D modeling. Additionally, autonomous vehicles and robots with LiDAR are transforming processes such as planting and harvesting, and cloud-based data processing and accessibility make precision agriculture more accessible and cooperative. These emerging trends highlight the evolving nature of LiDAR technology and its potential to revolutionize precision agriculture.

5.2. Key Challenges and Research Gaps

In this subsection, we address the key challenges and research gaps that exist in the implementation of LiDAR technology for precision agriculture. We discuss obstacles such as data processing and interpretation, sensor calibration and accuracy, and cost-effectiveness. Additionally, we explore the need for further research in data fusion techniques, standardized protocols, and long-term impact assessments.

Despite the promising applications of LiDAR in precision agriculture, several challenges remain. The main challenge with using LiDAR technology in agricultural applications is the initial capital cost, which is often significant. To address this, there is a need for cost-effective LiDAR sensors designed specifically for research applications that offer a balance between affordability and performance. Researchers must also devise methods to fully leverage the capabilities of these more economical sensors to facilitate broader adoption in the agricultural sector. In addition, the operational and processing difficulties associated with LiDAR data and the limited canopy penetration capacity for low-stature vegetation such as maize also present a research gap in LiDAR application in agriculture.

5.3. Potential Solutions and Future Directions

To overcome existing challenges, further research and development are needed to improve LiDAR technology and its integration with other remote sensing techniques. Furthermore, the development of automated, on-the-go data processing capabilities and specialized commercial LiDAR systems for field operation may aid the use of LiDAR in precision agriculture. One of the key areas of research is to improve data processing efficiency, which is crucial for handling large quantities of data collected by LiDAR sensors. Researchers are now studying expert algorithms and computational approaches to streamline data processing and analysis to facilitate the rapid and accurate estimation of biomass components. Recent research also covers sensor calibration techniques, which are essential to ensure the accuracy and reliability of LiDAR data. Calibration methods are being improved to decrease data collection errors and increase biomass estimation accuracy.

Cost saving is another important consideration when using LiDAR technology for crop parameter estimates. Researchers are investigating novel approaches to reduce costs and improve the affordability of LiDAR systems for farmers and agricultural practitioners. Reducing costs while increasing data quality involves analyzing alternate sensor designs, using drone-based LiDAR platforms, and optimizing data gathering techniques.

We stress the value of multidisciplinary interactions in addition to technological developments. The issues faced by LiDAR-based crop management techniques might be addressed comprehensively by bringing together professionals from multiple disciplines, such as remote sensing, agronomy, and data science. Researchers may design integrated solutions that use the full capabilities of LiDAR technology in agriculture by combining their skills, knowledge, and resources. LiDAR technology can potentially improve the performance of other precision agricultural tools. LiDAR data may be used with data from drones, remote sensing satellites, and ground-based sensors to provide a thorough and precise picture of crop management practices. This connection enables rapid and well-informed decision-making.

Acknowledgments

The authors would like to extend their sincere gratitude to Wang Pan from the School of Automotive and Traffic Engineering, Jiangsu University, for his invaluable assistance in the writing and reviewing of this manuscript.

Nomenclature

ASBS Alpha Shape by Slices Algorithm
ASA Alpha Shape Algorithm
APSIM Agricultural Production Simulator
BEM Biomass Estimation Model
BPNN Back Propagation Neural Network
CHM Crop Height Model
CRM Crop Row Model
CSM Canopy Surface Model
CMM Canopy Metric Model
CSF Cloth Simulation Filter Algorithm
CFPD Canopy Foliage Prediction Model
CMA Canopy Metric Algorithm
CHA Concave Hull Algorithm
CGA Computational Geometry Algorithm
CXHA Convex Hull Algorithm
CVM Cross Validation Model
DTM Digital Terrain Model
DEM Digital Elevation Model
DSM Digital Surface Model
DTA Delaunay Triangulation Algorithm
DBSCAN Density-Based Spatial Clustering of Applications with Noise
FLASH Fast Line-of-Sight Atmospheric Analysis of Spectral Hypercubes
GBM Generalized Boosted Model
GMM Gaussian Mixture Model
GLM Generalized Linear Model
GNBA Gaussian Naïve Bayes Algorithm
IWM Iterative Waterfall Model
ICPA Iterative Closest Point Algorithm
IDW Inverse Distance Weighting Algorithm
KM Kinematic Model
KNN k-Nearest Neighbour Algorithm
LAEM Leaf Area Estimation Model
LAI Leaf Area Index
LADE Leaf Area Density Estimation Model
LA-DT Leaf Area-Delaunay Triangulation
LBM Laser Beam Model
QTM Quantitative Terrain Model
QSM Quantitative Structure Model
RFR Random Forest Regression Model
LRM Linear Regression Model
LLSVM Lateral Linear Stereoscopic Vision Model
LME Linear Mixed Effects Model
MCSM Multitemporal Crop Surface Model
MSM Multi-Spectral Model
MLRM Multiple Linear Regression Model
MSM Maize Segmentation Model
MOSM Maize Organ Stratification Model
NDVI Normalized Difference Vegetation Index
OQSM Optimized Quantitative Structural Model
PLSRM Partial Least Square Regression Model
PCL Point Cloud Library
PCA Poly Cylinder Algorithm
PCSA Point Cloud Segmentation Algorithm
PCTA Point Cloud Transformer Algorithm
PCPA Point Cloud Processing Algorithm
PCAA Principal Component Analysis Algorithm
PH Plant Height
RANSAC Random Sample Consensus Algorithm
ROS Robot Operating System
SLAM Simultaneous Localization and Mapping Algorithm
SVIA Spectral Vegetation Indices Algorithm
SLMR Stepwise Multiple Linear Regression
SVMM Support Vector Machine Model
SDA Standard Deviation Algorithm
SORA Statistical Outlier Removal Algorithm
TIN Triangulation Network Filtering Algorithm
TM Tree Model
ULR Univariate Linear Regression
VBA Voxel-Based Algorithm
3DPI 3D Photo Inpainting
3DPC 3D Point Cloud
3DCM 3D Crop Model
3DLSM 3D Leaf Surface Model
3DSM 3D Surface Model

Author Contributions

Conceptualization, S.M.F. and J.Y.; methodology, S.M.F. and J.Y.; writing—original draft preparation, S.M.F.; writing—review and editing, Z.C., M.S.M. and S.M.F. All authors have read and agreed to the published version of the manuscript.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All of the data generated or analyzed during this study are included in this published article.

Conflicts of Interest

The authors declare no conflicts of interest.

Funding Statement

This research was funded by the National Natural Science Foundation of China (Grant No. 52350410469) and the National Natural Science Foundation of China (Grant No. 52375248).

Footnotes

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

References

  • 1.Bongiovanni R., Lowenberg-DeBoer J. Precision Agriculture and sustainability. Precis. Agric. 2004;5:359–387. doi: 10.1023/B:PRAG.0000040806.39604.aa. [DOI] [Google Scholar]
  • 2.Jin Y., Liu J., Xu Z., Yuan S., Li P., Wang J. Development status and trend of agricultural robot technology. Int. J. Agric. Biol. Eng. 2021;14:1–19. doi: 10.25165/j.ijabe.20211404.6821. [DOI] [Google Scholar]
  • 3.Tahilyani S., Saxena S., Karras D.A., Gupta S.K. Deployment of autonomous vehicles in agricultural and using Voronoi partitioning; Proceedings of the 2022 International Conference on Knowledge Engineering and Communication Systems (ICKES); Chickballapur, India. 28–29 December 2022. [Google Scholar]
  • 4.Sparks A. Master’s Thesis. Michigan Technological University; Houghton, MI, USA: 2016. Use of LiDAR in the Design of Grassed Waterways: Case Study in Agricultural Management in Oklahoma. [Google Scholar]
  • 5.Di Stefano F., Chiappini S., Gorreja A., Balestra M., Pierdicca R. Mobile 3D scan LiDAR: A literature review. Geomatics, Natural Hazards and Risk. Geomatics. 2021;12:2387–2429. [Google Scholar]
  • 6.Guo Q., Su Y., Hu T., Guan H., Jin S., Zhang J., Zhao X., Xu K., Wei D., Kelly M., et al. Lidar boosts 3D ecological observations and modelings: A review and perspective. IEEE Geosci. Remote Sens. Mag. 2020;9:232–257. doi: 10.1109/MGRS.2020.3032713. [DOI] [Google Scholar]
  • 7.Qiu Q., Li X. LiDAR point-cloud odometer based mobile robot routine tracking in orchards; Proceedings of the 2022 12th International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER); Changbai Mountain, China. 27–31 July 2022. [Google Scholar]
  • 8.Nehme H., Aubry C., Solatges T., Savatier X. Lidar-based structure tracking for agricultural robots: Application to autonomous navigation in vineyards. J. Intell. Robot. Syst. 2021;103:1–16. doi: 10.1007/s10846-021-01519-7. [DOI] [Google Scholar]
  • 9.Reger M., Stumpenhausen J., Bernhardt H. Evaluation of LiDAR for the free navigation in agriculture. AgriEngineering. 2022;4:489–506. doi: 10.3390/agriengineering4020033. [DOI] [Google Scholar]
  • 10.Radočaj D., Plaščak I., Jurišić M. Global navigation satellite systems as state-of-the-art solutions in precision agriculture: A review of studies indexed in the web of science. Agriculture. 2023;13:1417. doi: 10.3390/agriculture13071417. [DOI] [Google Scholar]
  • 11.Ding H., Zhang B., Zhou J., Yan Y., Tian G., Gu B. Recent developments and applications of simultaneous localization and mapping in agriculture. J. Field Robot. 2022;39:956–983. doi: 10.1002/rob.22077. [DOI] [Google Scholar]
  • 12.Ahmad U., Nasirahmadi A., Hensel O., Marino S. Technology and Data Fusion Methods to Enhance Site-Specific Crop Monitoring. Agronomy. 2022;12:555. doi: 10.3390/agronomy12030555. [DOI] [Google Scholar]
  • 13.Lin Y.-C., Habib A. Quality control and crop characterization framework for multi-temporal UAV LiDAR data over mechanized agricultural fields. Remote Sens. Environ. 2021;256:112299. doi: 10.1016/j.rse.2021.112299. [DOI] [Google Scholar]
  • 14.Alexopoulos A., Koutras K., Ali S.B., Puccio S., Carella A., Ottaviano R., Kalogeras A. Complementary use of ground-based proximal sensing and airborne/spaceborne remote sensing techniques in precision agriculture: A systematic review. Agronomy. 2023;13:1942. doi: 10.3390/agronomy13071942. [DOI] [Google Scholar]
  • 15.Debnath S., Paul M., Debnath T. Applications of LiDAR in Agriculture and Future Research Directions. J. Imaging. 2023;9:57. doi: 10.3390/jimaging9030057. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Rivera G., Porras R., Florencia R., Sánchez-Solís J.P. LiDAR applications in precision agriculture for cultivating crops: A review of recent advances. Comput. Electron. Agric. 2023;207:107737. doi: 10.1016/j.compag.2023.107737. [DOI] [Google Scholar]
  • 17.Li X., Liu C., Wang Z., Xie X., Li D., Xu L. Airborne LiDAR: State-of-the-art of system design, technology and application. Meas. Sci. Technol. 2020;32:032002. doi: 10.1088/1361-6501/abc867. [DOI] [Google Scholar]
  • 18.Dasika S.S. Biosystems and Agricultural Engineering. University of Kentucky; Lexington, KY, USA: 2018. Assessing the Spatial Accuracy and Precision of LiDAR for Remote Sensing in Agriculture. [Google Scholar]
  • 19.Ji X., Wei X., Wang A., Cui B., Song Q. A novel composite adaptive terminal sliding mode controller for farm vehicles lateral path tracking control. Nonlinear Dyn. 2022;110:2415–2428. doi: 10.1007/s11071-022-07730-x. [DOI] [Google Scholar]
  • 20.Sishodia R.P., Ray R.L., Singh S.K. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020;12:3136. doi: 10.3390/rs12193136. [DOI] [Google Scholar]
  • 21.Bhakta I., Phadikar S., Majumder K. State-of-the-art technologies in precision agriculture: A systematic review. J. Sci. Food Agric. 2019;99:4878–4888. doi: 10.1002/jsfa.9693. [DOI] [PubMed] [Google Scholar]
  • 22.Al-Sammarraie M.A.J., Ali A., Hussein N.M. New irrigation techniques for precisions agriculture: A review. Plant Arch. 2021;21:1734–1740. doi: 10.51470/PLANTARCHIVES.2021.v21.S1.275. [DOI] [Google Scholar]
  • 23.Achilles B., Maria P., Panagiotis D., Aglaia L.-T., Barouchas P., George S., George K., Shaohua W., Sotirios G. Internet of things (IoT) and agricultural unmanned aerial vehicles (UAVs) in smart farming: A comprehensive review. Internet Things. 2022;18:100187. [Google Scholar]
  • 24.Monteiro A., Santos S., Gonçalves P.H. Precision Agriculture for Crop and Livestock Farming—Brief Review. Animals. 2021;11:2345. doi: 10.3390/ani11082345. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Abobatta W.F. Precision Agriculture Technologies for Food Security and Sustainability. IGI Global Publishing; Hershey, PA, USA: 2021. Precision agriculture: A new tool for development; pp. 23–45. [DOI] [Google Scholar]
  • 26.Ali A., Hussain T., Tantashutikun N., Hussain N., Cocetta G. Application of Smart Techniques, Internet of Things and Data Mining for Resource Use Efficient and Sustainable Crop Production. Agriculture. 2023;13:397. doi: 10.3390/agriculture13020397. [DOI] [Google Scholar]
  • 27.Xu R., Li C., Bernardes S. Development and Testing of a UAV-Based Multi-Sensor System for Plant Phenotyping and Precision Agriculture. Remote Sens. 2021;13:3517. doi: 10.3390/rs13173517. [DOI] [Google Scholar]
  • 28.Furbank R.T., Jimenez-Berni J.A., George-Jaeggli B., Potgieter A.B., Deery D.M. Field crop phenomics: Enabling breeding for radiation use efficiency and biomass in cereal crops. New Phytol. 2019;223:1714–1727. doi: 10.1111/nph.15817. [DOI] [PubMed] [Google Scholar]
  • 29.Zhang F., Hassanzadeh A., Kikkert J., Pethybridge S., van Aardt J. Toward a structural description of row crops using UAS-based LiDAR point clouds; Proceedings of the IGARSS 2020—2020 IEEE International Geoscience and Remote Sensing Symposium; Virtual Symposium. 26 September–2 October 2020; pp. 465–468. [Google Scholar]
  • 30.Sheikh A., Abbas A. Barriers in efficient crop management in rice-wheat cropping system of Punjab. Pak. J. Agric. Sci. 2007;44:341–349. [Google Scholar]
  • 31.Arif C., Nugroho B.D.A., Maftukha R., Suryandika F., Hapsari U., Nihayah B., Naititi N.P.P.E., Sain R.I.A. IOP Conference Series: Earth and Environmental Science. IOP Publishing; Bristol, UK: 2021. Performance of agro-environmental monitoring for optimum water and crop management: A case study for East Nusa Tenggara, Indonesia. [Google Scholar]
  • 32.El-Naggar A., Jolly B., Hedley C.B., Horne D., Roudier P., Clothier B.E. The use of terrestrial LiDAR to monitor crop growth and account for within-field variability of crop coefficients and water use. Comput. Electron. Agric. 2021;190:106416. doi: 10.1016/j.compag.2021.106416. [DOI] [Google Scholar]
  • 33.Eyre R., Lindsay J., Laamrani A., Berg A. Within-Field Yield Prediction in Cereal Crops Using LiDAR-Derived Topographic Attributes with Geographically Weighted Regression Models. Remote Sens. 2021;13:4152. doi: 10.3390/rs13204152. [DOI] [Google Scholar]
  • 34.Golombek Y. Lidar Applications for Measuring and Quantifying Streetscape and Streetscape Features. University of Colorado at Denver; Denver, CO, USA: 2020. [Google Scholar]
  • 35.Estrada J., Sánchez H., Hernanz L., Checa M.J., Roman D. Enabling the Use of Sentinel-2 and LiDAR Data for Common Agriculture Policy Funds Assignment. ISPRS Int. J. Geo-Inf. 2017;6:255. doi: 10.3390/ijgi6080255. [DOI] [Google Scholar]
  • 36.Xue X., Zhou Q., Chen C., Cai C. Design and test of variable spray model based on leaf wall area in orchards. Trans. Chin. Soc. Agric. Eng. 2020;36:16–22. [Google Scholar]
  • 37.Botta A., Cavallone P., Baglieri L., Colucci G., Tagliavini L., Quaglia G.A. A review of robots, perception, and tasks in precision agriculture. Appl. Mech. 2022;3:830–854. doi: 10.3390/applmech3030049. [DOI] [Google Scholar]
  • 38.Roten R., Fourie J., Owens J., Trethewey J., Ekanayake D., Werner A. Urine patch detection using LiDAR technology to improve nitrogen use efficiency in grazed pastures. Comput. Electron. Agric. 2017;135:128–133. doi: 10.1016/j.compag.2017.02.006. [DOI] [Google Scholar]
  • 39.Feng G., Wang C., Wang A., Gao Y., Zhou Y., Huang S., Luo B. Segmentation of Wheat Lodging Areas from UAV Imagery Using an Ultra-Lightweight Network. Agriculture. 2024;14:244. doi: 10.3390/agriculture14020244. [DOI] [Google Scholar]
  • 40.Lednev V.N., Zavozin V.A., Grishin M.Y., Grigorieva D.V., Sdvizhenskii P.A. BIO Web of Conferences. EDP Sciences; Les Ulis, France: 2023. Drone-Based Fluorescence Lidar for Agricultural Applications. [Google Scholar]
  • 41.Wu D., Phinn S., Johansen K., Robson A., Muir J., Searle C. Estimating changes in leaf area, leaf area density, and vertical leaf area profile for mango, avocado, and macadamia tree crowns using terrestrial laser scanning. Remote Sens. 2018;10:1750. doi: 10.3390/rs10111750. [DOI] [Google Scholar]
  • 42.LeVoir S.J., Farley P.A., Sun T., Xu C. High-Accuracy adaptive low-cost location sensing subsystems for autonomous rover in precision agriculture. IEEE Open J. Ind. Appl. 2020;1:74–94. doi: 10.1109/OJIA.2020.3015253. [DOI] [Google Scholar]
  • 43.Walklate P.J., Richardson G.M., Baker D.E., Richards P.A., Cross J.V. Advances in Laser Remote Sensing for Terrestrial and Oceanographic Applications. Society of Photo Optical; Bellingham, WA, USA: 1997. Short-range Lidar measurement of top fruit tree canopies for pesticide applications research in the United Kingdom. [Google Scholar]
  • 44.Holmén B.A., Eichinger W.E., Flocchini R.G. Application of elastic lidar to PM10 emissions from agricultural nonpoint sources. Environ. Sci. Technol. 1998;32:3068–3076. doi: 10.1021/es980176p. [DOI] [Google Scholar]
  • 45.Ilari A., Piancatelli S., Centorame L., Moumni M., Romanazzi G., Pedretti E.F. Distribution Quality of Agrochemicals for the Revamping of a Sprayer System Based on Lidar Technology and Grapevine Disease Management. Appl. Sci. 2023;13:2222. doi: 10.3390/app13042222. [DOI] [Google Scholar]
  • 46.Shan J., Toth C.K. Topographic Laser Ranging and Scanning: Principles and Processing. CRC Press; Boca Raton, FL, USA: 2018. [Google Scholar]
  • 47.Raj T., Hashim F.H., Huddin A.B., Ibrahim M.F., Hussain A. A survey on LiDAR scanning mechanisms. Electronics. 2020;9:741. doi: 10.3390/electronics9050741. [DOI] [Google Scholar]
  • 48.Ferraz A., Bretar F., Jacquemoud S., Gonçalves G.R. The role of Lidar systems in fuel mapping; Proceedings of the 27th International Conference on Optical Network Design and Modelling; Braunschweig, Germany. 18–20 February 2009. [Google Scholar]
  • 49.Ziębiński A., Cupek R., Kruk M., Drewniak M. Lidar technology in general purpose applications. Stud. Inform. 2016;37:15–32. [Google Scholar]
  • 50.Wang R., Peethambaran J., Chen D. Lidar point clouds to 3-D urban models: A review. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018;11:606–627. doi: 10.1109/JSTARS.2017.2781132. [DOI] [Google Scholar]
  • 51.Bates J.S., Montzka C., Schmidt M., Jonard F. Estimating canopy density parameters time-series for winter wheat using UAS Mounted LiDAR. Remote Sens. 2021;13:710. doi: 10.3390/rs13040710. [DOI] [Google Scholar]
  • 52.Lin Y. LiDAR: An important tool for next-generation phenotyping technology of high potential for plant phenomics. Comput. Electron. Agric. 2015;119:61–73. doi: 10.1016/j.compag.2015.10.011. [DOI] [Google Scholar]
  • 53.Zhang H., Yemoto K. UAS-based remote sensing applications on the Northern Colorado Limited Irrigation Research Farm. Int. J. Precis. Agric. Aviat. 2019;2:20190202.50. doi: 10.33440/j.ijpaa.20190202.50. [DOI] [Google Scholar]
  • 54.García-Cimarras A., Manzanera J.A., Valbuena R. LiDAR Scan Density and Spatial Resolution Effects on Vegetation Fuel Type Mapping. Croat. J. For. Eng. J. Theory Appl. For. Eng. 2023;44:189–201. doi: 10.5552/crojfe.2023.1689. [DOI] [Google Scholar]
  • 55.Nurcholis M., Himawan I.Q., Wijayanti S.I., Darmaristianti A. Tropical Vegetation and Land Cover Mapping Using LiDAR. Planta Trop. 2019;7:8–18. doi: 10.18196/pt.2019.088.8-18. [DOI] [Google Scholar]
  • 56.Zheng C., Abd-Elrahman A., Whitaker V. Remote sensing and machine learning in crop phenotyping and management, with an emphasis on applications in strawberry farming. Remote Sens. 2021;13:531. doi: 10.3390/rs13030531. [DOI] [Google Scholar]
  • 57.Jin S., Sun X., Wu F., Su Y., Li Y., Song S., Xu K., Ma Q., Baret F., Jiang D., et al. Lidar sheds new light on plant phenomics for plant breeding and management: Recent advances and future prospects. ISPRS J. Photogramm. Remote Sens. 2021;171:202–223. doi: 10.1016/j.isprsjprs.2020.11.006. [DOI] [Google Scholar]
  • 58.Büyüksalih İ. Building Zone Regulation Compliance Using LIDAR Data: Real-Life Tests in İstanbul. Int. J. Environ. Geoinform. 2012;3:48–55. doi: 10.30897/ijegeo.304428. [DOI] [Google Scholar]
  • 59.Chen F.-q. Advantages and Application Prospects Study of Airborne LIDAR technology. Beijing Surv. Mapp. 2013;2:12–14. [Google Scholar]
  • 60.Chase A.S., Weishampel J. Using LiDAR and GIS to investigate water and soil management in the agricultural terracing at Caracol, Belize. Adv. Archaeol. Pract. 2016;4:357–370. doi: 10.7183/2326-3768.4.3.357. [DOI] [Google Scholar]
  • 61.Neuville R., Bates J.S., Jonard F. Estimating forest structure from UAV-mounted LiDAR point cloud using machine learning. Remote Sens. 2021;13:352. doi: 10.3390/rs13030352. [DOI] [Google Scholar]
  • 62.Rakesh D., Kumar N.A., Sivaguru M., Keerthivaasan K.V.R., Janaki B.R., Raffik R. Role of UAVs in innovating agriculture with future applications: A review; Proceedings of the 2021 International Conference on Advancements in Electrical, Electronics, Communication, Computing and Automation (ICAECA); Virtual. 8–9 October 2021. [Google Scholar]
  • 63.Fareed N., Flores J.P., Das A.K. Analysis of UAS-LiDAR Ground Points Classification in Agricultural Fields Using Traditional Algorithms and PointCNN. Remote Sens. 2023;15:483. doi: 10.3390/rs15020483. [DOI] [Google Scholar]
  • 64.Aslan M.F., Durdu A., Sabanci K., Ropelewska E., Gültekin S.S. A comprehensive survey of the recent studies with UAV for precision agriculture in open fields and greenhouses. Appl. Sci. 2022;12:1047. doi: 10.3390/app12031047. [DOI] [Google Scholar]
  • 65.Dowling L., Poblete T., Hook I., Tang H., Tan Y., Glenn W., Unnithan R.R. Accurate indoor mapping using an autonomous unmanned aerial vehicle (UAV). arXiv. 2018. arXiv:1808.01940. [Google Scholar]
  • 66.Qin H., Meng Z., Meng W., Chen X., Sun H., Lin F., Ang M.H. Autonomous exploration and mapping system using heterogeneous UAVs and UGVs in GPS-denied environments. IEEE Trans. Veh. Technol. 2019;68:1339–1350. doi: 10.1109/TVT.2018.2890416. [DOI] [Google Scholar]
  • 67.Turner R., Panciera R., Tanase M.A., Lowell K., Hacker J.M., Walker J.P. Estimation of soil surface roughness of agricultural soils using airborne LiDAR. Remote Sens. Environ. 2014;140:107–117. doi: 10.1016/j.rse.2013.08.030. [DOI] [Google Scholar]
  • 68.Ladefoged T.N., McCoy M.D., Asner G.P., Kirch P.V., Puleston C.O., Chadwick O.A., Vitousek P.M. Agricultural potential and actualized development in Hawai’i: An airborne LiDAR survey of the leeward Kohala field system (Hawai’i Island) J. Archaeol. Sci. 2011;38:3605–3619. doi: 10.1016/j.jas.2011.08.031. [DOI] [Google Scholar]
  • 69.Zhang X., Bao Y., Wang D., Xin X., Ding L., Xu D., Hou L., Shen J. Using UAV Lidar to extract vegetation parameters of inner Mongolian grassland. Remote Sens. 2021;13:656. doi: 10.3390/rs13040656. [DOI] [Google Scholar]
  • 70.Liu X., Bo Y. Object-Based Crop Species Classification Based on the Combination of Airborne Hyperspectral Images and LiDAR Data. Remote Sens. 2015;7:922–950. doi: 10.3390/rs70100922. [DOI] [Google Scholar]
  • 71.Maimaitijiang M., Sagan V., Erkbol H., Adrian J., Newcomb M., LeBauer D., Pauli D., Shakoor N., Mockler T. UAV-Based sorghum growth monitoring: A comparative analysis of LIDAR and photogrammetry. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2020;3:489–496. doi: 10.5194/isprs-annals-V-3-2020-489-2020. [DOI] [Google Scholar]
  • 72.Zhang C., Craine W.A., McGee R.J., Vandemark G.J., Davis J.B., Brown J., Hulbert S.H., Sankaran S. High-throughput phenotyping of canopy height in cool-season crops using sensing techniques. Agron. J. 2021;113:3269–3280. doi: 10.1002/agj2.20632. [DOI] [Google Scholar]
  • 73.Ivushkin K., Bartholomeus H., Bregt A.K., Pulatov A., Franceschini M.H., Kramer H., van Loo E.N., Roman V.J., Finkers R. UAV based soil salinity assessment of cropland. Geoderma. 2019;338:502–512. doi: 10.1016/j.geoderma.2018.09.046. [DOI] [Google Scholar]
  • 74.Trepekli K., Westergaard-Nielsen A., Friborg T. EGU General Assembly Conference Abstracts. European Geosciences Union; Munich, Germany: 2020. Application of drone borne LiDAR technology for monitoring agricultural biomass and plant growth. [Google Scholar]
  • 75.Shendryk Y., Sofonia J., Garrard R., Rist Y., Skocaj D., Thorburn P. Fine-scale prediction of biomass and leaf nitrogen content in sugarcane using UAV LiDAR and multispectral imaging. Int. J. Appl. Earth Obs. Geoinf. 2020;92:102177. doi: 10.1016/j.jag.2020.102177. [DOI] [Google Scholar]
  • 76.Xu J.X., Ma J., Tang Y.N., Wu W.X., Shao J.H., Wu W.B., Wei S.Y., Liu Y.F., Wang Y.C., Guo H.Q. Estimation of sugarcane yield using a machine learning approach based on UAV-lidar data. Remote Sens. 2020;12:2823. doi: 10.3390/rs12172823. [DOI] [Google Scholar]
  • 77.Masjedi A., Zhao J., Thompson A.M., Yang K.W., Flatt J.E., Crawford M.M., Ebert D.S., Tuinstra M.R., Hammer G., Chapman S. Sorghum biomass prediction using UAV-based remote sensing data and crop model simulation; Proceedings of the IGARSS 2018—2018 IEEE International Geoscience and Remote Sensing Symposium; Valencia, Spain. 22–27 July 2018. [Google Scholar]
  • 78.Wu D., Johansen K., Phinn S., Robson A. Suitability of airborne and terrestrial laser scanning for mapping tree crop structural metrics for improved orchard management. Remote Sens. 2020;12:1647. doi: 10.3390/rs12101647. [DOI] [Google Scholar]
  • 79.Hütt C., Bolten A., Hüging H., Bareth G. UAV Lidar metrics for monitoring crop height, biomass and nitrogen uptake: A case study on a winter wheat field trial. PFG–J. Photogramm. Remote Sens. Geoinf. Sci. 2023;91:65–76. doi: 10.1007/s41064-022-00228-6. [DOI] [Google Scholar]
  • 80.Masjedi A., Crawford M.M., Carpenter N.R., Tuinstra M.R. Multi-temporal predictive modelling of sorghum biomass using UAV-based hyperspectral and LiDAR data. Remote Sens. 2020;12:3587. doi: 10.3390/rs12213587. [DOI] [Google Scholar]
  • 81.Terrestrial Lidar Scanning Research. [(accessed on 20 February 2024)]. Available online: https://sites.bu.edu/lidar/3d-reconstruction/
  • 82.Westling F., Bryson M., Underwood J. SimTreeLS: Simulating aerial and terrestrial laser scans of trees. Comput. Electron. Agric. 2021;187:106277. doi: 10.1016/j.compag.2021.106277. [DOI] [Google Scholar]
  • 83.Hosoi F., Omasa K. Estimating vertical plant area density profile and growth parameters of a wheat canopy at different growth stages using three-dimensional portable lidar imaging. ISPRS J. Photogramm. Remote Sens. 2009;64:151–158. doi: 10.1016/j.isprsjprs.2008.09.003. [DOI] [Google Scholar]
  • 84.Martínez-Casasnovas J. Mapping the leaf area index in vineyard using a ground based LiDAR scanner; Proceedings of the 11th International Conference on Precision Agriculture (ICPA 2012); Indianapolis, IN, USA. 15–18 July 2012. [Google Scholar]
  • 85.Hobart M., Pflanz M., Weltzien C., Schirrmann M. Growth height determination of tree walls for precise monitoring in apple fruit production using UAV photogrammetry. Remote Sens. 2020;12:1656. doi: 10.3390/rs12101656. [DOI] [Google Scholar]
  • 86.Höfle B. Radiometric correction of terrestrial LiDAR point cloud data for individual maize plant detection. IEEE Geosci. Remote Sens. Lett. 2013;11:94–98. doi: 10.1109/LGRS.2013.2247022. [DOI] [Google Scholar]
  • 87.Xu L., Xu T., Li X. Corn Seedling Monitoring Using 3-D Point Cloud Data from Terrestrial Laser Scanning and Registered Camera Data. IEEE Geosci. Remote Sens. Lett. 2019;17:137–141. doi: 10.1109/LGRS.2019.2916348. [DOI] [Google Scholar]
  • 88.Koenig K., Höfle B., Hämmerle M., Jarmer T., Siegmann B., Lilienthal H. Comparative classification analysis of post-harvest growth detection from terrestrial LiDAR point clouds in precision agriculture. ISPRS J. Photogramm. Remote Sens. 2015;104:112–125. doi: 10.1016/j.isprsjprs.2015.03.003. [DOI] [Google Scholar]
  • 89.Bareth G., Bendig J., Tilly N., Hoffmeister D., Aasen H., Bolten A. A comparison of UAV-and TLS-derived plant height for crop monitoring: Using polygon grids for the analysis of crop surface models (CSMs) Photogramm. Fernerkund. Geoinf. 2016;2016:85–94. doi: 10.1127/pfg/2016/0289. [DOI] [Google Scholar]
  • 90.Madec S., Baret F., De Solan B., Thomas S., Dutartre D., Jezequel S., Hemmerlé M., Colombeau G., Comar A. High-throughput phenotyping of plant height: Comparing unmanned aerial vehicles and ground LiDAR estimates. Front. Plant Sci. 2017;8:2002. doi: 10.3389/fpls.2017.02002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 91.Li P., Zhang X., Wang W., Zheng H., Yao X., Tian Y., Zhu Y., Cao W., Chen Q., Cheng T. Estimating aboveground and organ biomass of plant canopies across the entire season of rice growth with terrestrial laser scanning. Int. J. Appl. Earth Obs. Geoinf. 2020;91:102132. doi: 10.1016/j.jag.2020.102132. [DOI] [Google Scholar]
  • 92.Jimenez-Berni J.A., Deery D.M., Rozas-Larraondo P., Condon A.G., Rebetzke G.J., James R.A., Bovill D., Furbank R.T., Sirault X.R. High throughput determination of plant height, ground cover, and above-ground biomass in wheat with LiDAR. Front. Plant Sci. 2018;9:237. doi: 10.3389/fpls.2018.00237. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 93.Llorens J., Gil E., Llop J., Escolà A. Ultrasonic and LIDAR sensors for electronic canopy characterization in vineyards: Advances to improve pesticide application methods. Sensors. 2011;11:2177–2194. doi: 10.3390/s110202177. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 94.Rinaldi M., Llorens J., Gil E. Precision Agriculture. Springer; Berlin/Heidelberg, Germany: 2013. Electronic characterization of the phenological stages of grapevine using a LIDAR sensor. [Google Scholar]
  • 95.Polo J.R.R., Sanz R., Llorens J., Arnó J., Escola A., Ribes-Dasi M., Masip J., Camp F., Gràcia F., Solanelles F., et al. A tractor-mounted scanning LIDAR for the non-destructive measurement of vegetative volume and surface area of tree-row plantations: A comparison with conventional destructive measurements. Biosyst. Eng. 2009;102:128–134. doi: 10.1016/j.biosystemseng.2008.10.009. [DOI] [Google Scholar]
  • 96.Andújar D., Rueda-Ayala V., Moreno H., Rosell-Polo J.R., Escolá A., Valero C., Gerhards R., Fernández-Quintanilla C., Dorado J., Griepentrog H.W. Discriminating crop, weeds and soil surface with a terrestrial LIDAR sensor. Sensors. 2013;13:14662–14675. doi: 10.3390/s131114662. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 97.Gregorio E., Rocadenbosch F., Sanz R., Rosell-Polo J.R. Eye-safe lidar system for pesticide spray drift measurement. Sensors. 2015;15:3650–3670. doi: 10.3390/s150203650. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 98.Gil E., Llorens J., Llop J., Fàbregas X., Gallart M. Use of a terrestrial LIDAR sensor for drift detection in vineyard spraying. Sensors. 2013;13:516–534. doi: 10.3390/s130100516. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 99.Su Y., Guo Q., Jin S., Guan H., Sun X., Ma Q., Hu T., Wang R., Li Y. The development and evaluation of a backpack LiDAR system for accurate and efficient forest inventory. IEEE Geosci. Remote Sens. Lett. 2020;18:1660–1664. doi: 10.1109/LGRS.2020.3005166. [DOI] [Google Scholar]
  • 100.Proudman A., Ramezani M., Digumarti S.T., Chebrolu N., Fallon M. Towards real-time forest inventory using handheld LiDAR. Robot. Auton. Syst. 2022;157:104240. doi: 10.1016/j.robot.2022.104240. [DOI] [Google Scholar]
  • 101.Wang Y., Fang H. Estimation of LAI with the LiDAR technology: A review. Remote Sens. 2020;12:3457. doi: 10.3390/rs12203457. [DOI] [Google Scholar]
  • 102.Zhou H., Zhang J., Ge L., Yu X., Wang Y., Zhang C. Research on volume prediction of single tree canopy based on three-dimensional (3D) LiDAR and clustering segmentation. Int. J. Remote Sens. 2021;42:738–755. doi: 10.1080/01431161.2020.1811917. [DOI] [Google Scholar]
  • 103.Underwood J.P., Hung C., Whelan B., Sukkarieh S. Mapping almond orchard canopy volume, flowers, fruit and yield using lidar and vision sensors. Comput. Electron. Agric. 2016;130:83–96. doi: 10.1016/j.compag.2016.09.014. [DOI] [Google Scholar]
  • 104.Arnó Satorra J., Escolà i Agustí A., Vallès Petit J.M., Llorens Calveras J., Sanz Cortiella R., Masip Vilalta J., Palacín Roca J., Rosell Polo J.R. Leaf area index estimation in vineyards using a ground-based LiDAR scanner. Precis. Agric. 2013;14:290–306. doi: 10.1007/s11119-012-9295-0. [DOI] [Google Scholar]
  • 105.Deery D.M., Rebetzke G.J., Jimenez-Berni J.A., Condon A.G., Smith D.J., Bechaz K.M., Bovill W.D. Ground-based LiDAR improves phenotypic repeatability of above-ground biomass and crop growth rate in wheat. Plant Phenom. 2020;2020:8329798. doi: 10.34133/2020/8329798. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 106.Ruhan A., Du W., Ying H., Wei B., Shan Y., Dai H. Estimation of aboveground biomass of individual trees by backpack LiDAR based on parameter-optimized quantitative structural models (AdQSM) Forests. 2023;14:475. doi: 10.3390/f14030475. [DOI] [Google Scholar]
  • 107.Zhu Y., Sun G., Ding G., Zhou J., Wen M., Jin S., Zhao Q., Colmer J., Ding Y., Ober E.S., et al. Large-scale field phenotyping using backpack LiDAR and GUI-based CropQuant-3D to measure structural responses to different nitrogen treatments in wheat. Plant Physiol. 2021;187:716–738. doi: 10.1093/plphys/kiab324. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 108.Guo Q., Wu F., Pang S., Zhao X., Chen L., Liu J., Xue B., Xu G., Li L., Jing H., et al. Crop 3D—A LiDAR based platform for 3D high-throughput crop phenotyping. Sci. China Life Sci. 2018;61:328–339. doi: 10.1007/s11427-017-9056-0. [DOI] [PubMed] [Google Scholar]
  • 109.Sultan M., He L. Measuring tree canopy density using a LiDAR-guided system for precision spraying; Proceedings of the Annual International Virtual Meeting, 1. American Society of Agricultural and Biological Engineers; Virtual Meeting. 13–15 July 2020. [Google Scholar]
  • 110.Colaço A.F., Schaefer M., Bramley R.G. Broadacre mapping of wheat biomass using ground-based LiDAR technology. Remote Sens. 2021;13:3218. doi: 10.3390/rs13163218. [DOI] [Google Scholar]
  • 111.Gu C., Zhao C., Zou W., Yang S., Dou H., Zhai C. Innovative leaf area detection models for orchard tree thick canopy based on LiDAR point cloud data. Agriculture. 2022;12:1241. doi: 10.3390/agriculture12081241. [DOI] [Google Scholar]
  • 112.Pagliai A., Ammoniaci M., Sarri D., Lisci R., Perria R., Vieri M., D’Arcangelo M.E.M., Storchi P., Kartsiotis S.P. Comparison of Aerial and Ground 3D Point Clouds for Canopy Size Assessment in Precision Viticulture. Remote Sens. 2022;14:1145. doi: 10.3390/rs14051145. [DOI] [Google Scholar]
  • 113.Malambo L., Popescu S.C., Horne D.W., Pugh N.A., Rooney W.L. Automated detection and measurement of individual sorghum panicles using density-based clustering of terrestrial lidar data. ISPRS J. Photogramm. Remote Sens. 2019;149:1–13. doi: 10.1016/j.isprsjprs.2018.12.015. [DOI] [Google Scholar]
  • 114.Krus A., Van Apeldoorn D., Valero C., Ramirez J.J. Acquiring plant features with optical sensing devices in an organic strip-cropping system. Agronomy. 2020;10:197. doi: 10.3390/agronomy10020197. [DOI] [Google Scholar]
  • 115.George R., Barrett B., Ghamkhar K. Evaluation of LiDAR scanning for measurement of yield in perennial ryegrass. J. New Zealand Grassl. 2020;81:414. doi: 10.33584/jnzg.2019.81.414. [DOI] [Google Scholar]
  • 116.Zhang L., Grift T.E. A LIDAR-based crop height measurement system for Miscanthus giganteus. Comput. Electron. Agric. 2012;85:70–76. doi: 10.1016/j.compag.2012.04.001. [DOI] [Google Scholar]
  • 117.Tsolakis N., Bechtsis D., Bochtis D. Agros: A robot operating system based emulation tool for agricultural robotics. Agronomy. 2019;9:403. doi: 10.3390/agronomy9070403. [DOI] [Google Scholar]
  • 118.Torres-Sánchez J., López-Granados F., Serrano N., Arquero O., Peña J.M. High-throughput 3-D monitoring of agricultural-tree plantations with unmanned aerial vehicle (UAV) technology. PLoS ONE. 2015;10:e0130479. doi: 10.1371/journal.pone.0130479. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 119.Moreno H., Valero C., Bengochea-Guevara J.M., Ribeiro Á., Garrido-Izard M., Andújar D. On-ground vineyard reconstruction using a LiDAR-based automated system. Sensors. 2020;20:1102. doi: 10.3390/s20041102. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 120.Willers J.L., Wu J., O’Hara C., Jenkins J.N. A categorical, improper probability method for combining NDVI and LiDAR elevation information for potential cotton precision agricultural applications. Comput. Electron. Agric. 2012;82:15–22. doi: 10.1016/j.compag.2011.11.010. [DOI] [Google Scholar]
  • 121.Fagua J.C., Jantz P., Rodriguez-Buritica S., Duncanson L., Goetz S.J. Integrating LiDAR, multispectral and SAR data to estimate and map canopy height in tropical forests. Remote Sens. 2019;11:2697. doi: 10.3390/rs11222697. [DOI] [Google Scholar]
  • 122.Wu X., Shen X., Cao L., Wang G., Cao F. Assessment of individual tree detection and canopy cover estimation using unmanned aerial vehicle based light detection and ranging (UAV-LiDAR) data in planted forests. Remote Sens. 2019;11:908. doi: 10.3390/rs11080908. [DOI] [Google Scholar]
  • 123.Jin S., Su Y., Song S., Xu K., Hu T., Yang Q., Wu F., Xu G., Ma Q., Guan H., et al. Non-destructive estimation of field maize biomass using terrestrial lidar: An evaluation from plot level to individual leaf level. Plant Methods. 2020;16:1–19. doi: 10.1186/s13007-020-00613-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 124.Wiering N.P., Ehlke N.J., Sheaffer C.C. Lidar and RGB image analysis to predict hairy vetch biomass in breeding nurseries. Plant Phenome J. 2019;2:1–8. doi: 10.2135/tppj2019.02.0003. [DOI] [Google Scholar]
  • 125.Duncanson L.I., Dubayah R.O., Cook B.D., Rosette J., Parker G. The importance of spatial detail: Assessing the utility of individual crown information and scaling approaches for lidar-based biomass density estimation. Remote Sens. Environ. 2015;168:102–112. doi: 10.1016/j.rse.2015.06.021. [DOI] [Google Scholar]
  • 126.Brede B., Calders K., Lau A., Raumonen P., Bartholomeus H.M., Herold M., Kooistra L. Non-destructive tree volume estimation through quantitative structure modelling: Comparing UAV laser scanning with terrestrial LIDAR. Remote Sens. Environ. 2019;233:111355. doi: 10.1016/j.rse.2019.111355. [DOI] [Google Scholar]
  • 127.Moorthy S.M.K., Calders K., Vicari M.B., Verbeeck H. Improved supervised learning-based approach for leaf and wood classification from LiDAR point clouds of forests. IEEE Trans. Geosci. Remote Sens. 2019;58:3057–3070. doi: 10.1109/TGRS.2019.2947198. [DOI] [Google Scholar]
  • 128.Montzka C., Donat M., Raj R., Welter P., Bates J.S. Sensitivity of LiDAR Parameters to Aboveground Biomass in Winter Spelt. Drones. 2023;7:121. doi: 10.3390/drones7020121. [DOI] [Google Scholar]
  • 129.Coops N.C., Hilker T., Wulder M.A., St-Onge B., Newnham G., Siggins A., Trofymow J.A. Estimating canopy structure of Douglas-fir forest stands from discrete-return LiDAR. Trees. 2007;21:295–310. doi: 10.1007/s00468-006-0119-6. [DOI] [Google Scholar]
  • 130.Lefsky M.A. Lidar remote sensing for ecosystem studies. BioScience. 2002;52:19–30. doi: 10.1641/0006-3568(2002)052[0019:LRSFES]2.0.CO;2. [DOI] [Google Scholar]
  • 131.Liu S., Baret F., Abichou M., Boudon F., Thomas S., Zhao K., Fournier C., Andrieu B., Irfan K., Hemmerlé M., et al. Estimating wheat green area index from ground-based LiDAR measurement using a 3D canopy structure model. Agric. For. Meteorol. 2017;247:12–20. doi: 10.1016/j.agrformet.2017.07.007. [DOI] [Google Scholar]
  • 132.Omasa K., Hosoi F., Konishi A. 3D lidar imaging for detecting and understanding plant responses and canopy structure. J. Exp. Bot. 2007;58:881–898. doi: 10.1093/jxb/erl142. [DOI] [PubMed] [Google Scholar]
  • 133.Eitel J.U., Magney T.S., Vierling L.A., Greaves H.E., Zheng G. An automated method to quantify crop height and calibrate satellite-derived biomass using hypertemporal lidar. Remote Sens. Environ. 2016;187:414–422. doi: 10.1016/j.rse.2016.10.044. [DOI] [Google Scholar]
  • 134.Hosoi F., Omasa K. Estimation of vertical plant area density profiles in a rice canopy at different growth stages by high-resolution portable scanning lidar with a lightweight mirror. ISPRS J. Photogramm. Remote Sens. 2012;74:11–19. doi: 10.1016/j.isprsjprs.2012.08.001. [DOI] [Google Scholar]
  • 135.Simonson W.D., Allen H.D., Coomes D.A. Use of an airborne lidar system to model plant species composition and diversity of Mediterranean oak forests. Conserv. Biol. 2012;26:840–850. doi: 10.1111/j.1523-1739.2012.01869.x. [DOI] [PubMed] [Google Scholar]
  • 136.Cao L., Coops N.C., Sun Y., Ruan H., Wang G., Dai J., She G. Estimating canopy structure and biomass in bamboo forests using airborne LiDAR data. ISPRS J. Photogramm. Remote Sens. 2019;148:114–129. doi: 10.1016/j.isprsjprs.2018.12.006. [DOI] [Google Scholar]
  • 137.Eitel J.U., Magney T.S., Vierling L.A., Brown T.T., Huggins D.R. LiDAR based biomass and crop nitrogen estimates for rapid, non-destructive assessment of wheat nitrogen status. Field Crops Res. 2014;159:21–32. doi: 10.1016/j.fcr.2014.01.008. [DOI] [Google Scholar]
  • 138.Walter J.D., Edwards J., McDonald G., Kuchel H. Estimating biomass and canopy height with LiDAR for field crop breeding. Front. Plant Sci. 2019;10:1145. doi: 10.3389/fpls.2019.01145. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 139.Wu J., Xue X., Zhang S., Qin W., Chen C., Sun T. Plant 3D reconstruction based on LiDAR and multi-view sequence images. Int. J. Precis. Agric. Aviat. 2018;1:37–43. doi: 10.33440/j.ijpaa.20180101.0007. [DOI] [Google Scholar]
  • 140.Jin S., Su Y., Wu F., Pang S., Gao S., Hu T., Liu J., Guo Q. Stem–leaf segmentation and phenotypic trait extraction of individual maize using terrestrial LiDAR data. IEEE Trans. Geosci. Remote Sens. 2018;57:1336–1346. doi: 10.1109/TGRS.2018.2866056. [DOI] [Google Scholar]
  • 141.Escolà A., Martínez-Casasnovas J.A., Rufat J., Arnó J., Arbonés A., Sebé F., Pascual M., Gregorio E., Rosell-Polo J.R. Mobile terrestrial laser scanner applications in precision fruticulture/horticulture and tools to extract information from canopy point clouds. Precis. Agric. 2017;18:111–132. doi: 10.1007/s11119-016-9474-5. [DOI] [Google Scholar]
  • 142.Sandonís-Pozo L., Llorens J., Escolà A., Arnó J., Pascual M., Martínez-Casasnovas J.A. Satellite multispectral indices to estimate canopy parameters and within-field management zones in super-intensive almond orchards. Precis. Agric. 2022;23:2040–2062. doi: 10.1007/s11119-022-09956-6. [DOI] [Google Scholar]
  • 143.Anken T., Battiato A., Seatovic D., Meiser V., Selbeck J., Pforte F. Canopy-Area Measurement of Plum Trees Using Laser and Near-Infrared Imaging. Agroscope; Bern, Switzerland: 2013. [Google Scholar]
  • 144.Dhami H., Yu K., Xu T., Zhu Q., Dhakal K., Friel J., Li S., Tokekar P. Crop height and plot estimation for phenotyping from unmanned aerial vehicles using 3D LiDAR; Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); Las Vegas, NV, USA. 24–30 October 2020. [Google Scholar]
  • 145.Yuan W., Li J., Bhatta M., Shi Y., Baenziger P.S., Ge Y. Wheat height estimation using LiDAR in comparison to ultrasonic sensor and UAS. Sensors. 2018;18:3731. doi: 10.3390/s18113731. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 146.Ziliani M.G., Parkes S.D., Hoteit I., McCabe M.F. Intra-season crop height variability at commercial farm scales using a fixed-wing UAV. Remote Sens. 2018;10:2007. doi: 10.3390/rs10122007. [DOI] [Google Scholar]
  • 147.Gao M., Yang F., Wei H., Liu X. Individual maize location and height estimation in field from UAV-borne LiDAR and RGB images. Remote Sens. 2022;14:2292. doi: 10.3390/rs14102292. [DOI] [Google Scholar]
  • 148.Zhou Y., Shi Z., Qiu T., Yu S., Zhang Q., Shen D. Experimental study on morphological characteristics of landslide dams in different shaped valleys. Geomorphology. 2022;400:108081. doi: 10.1016/j.geomorph.2021.108081. [DOI] [Google Scholar]
  • 149.Sofonia J., Shendryk Y., Phinn S., Roelfsema C., Kendoul F., Skocaj D. Monitoring sugarcane growth response to varying nitrogen application rates: A comparison of UAV SLAM LiDAR and photogrammetry. Int. J. Appl. Earth Obs. Geoinf. 2019;82:101878. doi: 10.1016/j.jag.2019.05.011. [DOI] [Google Scholar]
  • 150.Ghamkhar K., Irie K., Hagedorn M., Hsiao J., Fourie J., Gebbie S., Hoyos-Villegas V., George R., Stewart A., Inch C., et al. Real-time, non-destructive and in-field foliage yield and growth rate measurement in perennial ryegrass (Lolium perenne L.) Plant Methods. 2019;15:1–12. doi: 10.1186/s13007-019-0456-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 151.Tamás J., Lehoczky É., Fehér J., Fórián T., Nagy A., Bozsik É., Gálya B., Riczu P. Airborne hyperspectral and LiDAR data integration for weed detection; Proceedings of the EGU General Assembly Conference Abstracts; Vienna, Austria. 27 April–2 May 2014. [Google Scholar]
  • 152.Pretto A., Aravecchia S., Burgard W., Chebrolu N., Dornhege C., Falck T., Fleckenstein F., Fontenla A., Imperoli M., Khanna R., et al. Building an aerial–ground robotics system for precision farming: An adaptable solution. IEEE Robot. Autom. Mag. 2020;28:29–49. doi: 10.1109/MRA.2020.3012492. [DOI] [Google Scholar]
  • 153.Liu K., Dong X., Qiu B. Analysis of cotton height spatial variability based on UAV-LiDAR. Int. J. Precis. Agric. Aviat. 2020;3:72–76. doi: 10.33440/j.ijpaa.20200303.79. [DOI] [Google Scholar]
  • 154.Nguyen P., Badenhorst P.E., Shi F., Spangenberg G.C., Smith K.F., Daetwyler H.D. Design of an unmanned ground vehicle and lidar pipeline for the high-throughput phenotyping of biomass in perennial ryegrass. Remote Sens. 2020;13:20. doi: 10.3390/rs13010020. [DOI] [Google Scholar]
  • 155.Cruz Ulloa C., Krus A., Barrientos A., Del Cerro J., Valero C. Robotic fertilisation using localisation systems based on point clouds in strip-cropping fields. Agronomy. 2020;11:11. doi: 10.3390/agronomy11010011. [DOI] [Google Scholar]
  • 156.Reiser D., Vázquez-Arellano M., Paraforos D.S., Garrido-Izard M., Griepentrog H.W. Iterative individual plant clustering in maize with assembled 2D LiDAR data. Comput. Ind. 2018;99:42–52. doi: 10.1016/j.compind.2018.03.023. [DOI] [Google Scholar]
  • 157.Tsoulias N., Paraforos D.S., Xanthopoulos G., Zude-Sasse M. Apple shape detection based on geometric and radiometric features using a LiDAR laser scanner. Remote Sens. 2020;12:2481. doi: 10.3390/rs12152481. [DOI] [Google Scholar]
  • 158.Itakura K., Hosoi F. Automatic individual tree detection and canopy segmentation from three-dimensional point cloud images obtained from ground-based lidar. J. Agric. Meteorol. 2018;74:109–113. doi: 10.2480/agrmet.D-18-00012. [DOI] [Google Scholar]
  • 159.Tulldahl H.M., Bissmarck F., Larsson H., Grönwall C., Tolt G. Accuracy evaluation of 3D lidar data from small UAV; Proceedings of the Electro-Optical Remote Sensing, Photonic Technologies, and Applications IX; Amsterdam, The Netherlands. 22–23 September 2014. [Google Scholar]
  • 160.Li M., Shamshiri R.R., Schirrmann M., Weltzien C., Shafian S., Laursen M.S. UAV oblique imagery with an adaptive micro-terrain model for estimation of leaf area index and height of maize canopy from 3D point clouds. Remote Sens. 2022;14:585. doi: 10.3390/rs14030585. [DOI] [Google Scholar]
  • 161.Ji X., Ding S., Wei X., Cui B. Path tracking of unmanned agricultural tractors based on a novel adaptive second-order sliding mode control. J. Frankl. Inst. 2023;360:5811–5831. doi: 10.1016/j.jfranklin.2023.03.053. [DOI] [Google Scholar]
  • 162.Cui B., Zhang J., Wei X., Cui X., Sun Z., Zhao Y., Liu Y. Improved Information Fusion for Agricultural Machinery Navigation Based on Context-Constrained Kalman Filter and Dual-Antenna RTK. Actuators. 2024;13:160. doi: 10.3390/act13050160. [DOI] [Google Scholar]
  • 163.Mousazadeh H. A technical review on navigation systems of agricultural autonomous off-road vehicles. J. Terramechan. 2013;50:211–232. doi: 10.1016/j.jterra.2013.03.004. [DOI] [Google Scholar]
  • 164.Mao W., Liu H., Hao W., Yang F., Liu Z. Development of a Combined Orchard Harvesting Robot Navigation System. Remote Sens. 2022;14:675. doi: 10.3390/rs14030675. [DOI] [Google Scholar]
  • 165.Wang S., Song J., Qi P., Yuan C., Wu H., Zhang L., Liu W., Liu Y., He X. Design and development of orchard autonomous navigation spray system. Front. Plant Sci. 2022;13:960686. doi: 10.3389/fpls.2022.960686. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 166.Liu L., Liu Y., He X., Liu W. Precision Variable-Rate Spraying Robot by Using Single 3D LIDAR in Orchards. Agronomy. 2022;12:2509. doi: 10.3390/agronomy12102509. [DOI] [Google Scholar]
  • 167.Bertoglio R., Carini V., Arrigoni S., Matteucci M. A Map-Free LiDAR-Based System for Autonomous Navigation in Vineyards; Proceedings of the 2023 IEEE European Conference on Mobile Robots (ECMR); Coimbra, Portugal. 27 September 2023. [Google Scholar]
  • 168.Hu X., Wang M., Qian C., Huang C., Xia Y., Song M. Lidar-based SLAM and autonomous navigation for forestry quadrotors; Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC); Xiamen, China. 10–12 August 2018. [Google Scholar]
  • 169.Malavazi F.B., Guyonneau R., Fasquel J.B., Lagrange S., Mercier F. LiDAR-only based navigation algorithm for an autonomous agricultural robot. Comput. Electron. Agric. 2018;154:71–79. doi: 10.1016/j.compag.2018.08.034. [DOI] [Google Scholar]
  • 170.Velasquez A.E.B., Higuti V.A.H., Guerrero H.B., Gasparino M.V., Magalhaes D.V., Aroca R.V., Becker M. Reactive navigation system based on H∞ control system and LiDAR readings on corn crops. Precis. Agric. 2020;21:349–368. doi: 10.1007/s11119-019-09672-8. [DOI] [Google Scholar]
  • 171.Byun S.-W., Noh D., Lee H.-M. Design of obstacle detection method for autonomous driving in agricultural environments; Proceedings of the 2022 Thirteenth International Conference on Ubiquitous and Future Networks (ICUFN); Barcelona, Spain. 5–8 July 2022. [Google Scholar]
  • 172.Qin J., Sun R., Zhou K., Xu Y., Lin B., Yang L., Chen Z., Wen L., Wu C. Lidar-Based 3D Obstacle Detection Using Focal Voxel R-CNN for Farmland Environment. Agronomy. 2023;13:650. doi: 10.3390/agronomy13030650. [DOI] [Google Scholar]
  • 173.Kong W.Y., Hu G.R., Zhang S., Zhou J.G., Gao Z.N., Chen J. Research on agricultural vehicle safety warning system based on LiDAR. INMATEH Agric. Eng. 2022;68:230–242. doi: 10.35633/inmateh-68-23. [DOI] [Google Scholar]
  • 174.Jiang A., Ahamed T. Navigation of an Autonomous Spraying Robot for Orchard Operations Using LiDAR for Tree Trunk Detection. Sensors. 2023;23:4808. doi: 10.3390/s23104808. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 175.McGlinchy J., Van Aardt J.A., Erasmus B., Asner G.P., Mathieu R., Wessels K., Knapp D., Kennedy-Bowdoin T., Rhody H., Kerekes J.P., et al. Extracting structural vegetation components from small-footprint waveform lidar for biomass estimation in savanna ecosystems. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013;7:480–490. doi: 10.1109/JSTARS.2013.2274761. [DOI] [Google Scholar]
  • 176.Zhu Y., Zhao C., Yang H., Yang G., Han L., Li Z., Feng H., Xu B., Wu J., Lei L. Estimation of maize above-ground biomass based on stem-leaf separation strategy integrated with LiDAR and optical remote sensing data. PeerJ. 2019;7:e7593. doi: 10.7717/peerj.7593. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 177.Luo S., Wang C., Xi X., Nie S., Fan X., Chen H., Yang X., Peng D., Lin Y., Zhou G. Combining hyperspectral imagery and LiDAR pseudo-waveform for predicting crop LAI, canopy height and above-ground biomass. Ecol. Indic. 2019;102:801–812. doi: 10.1016/j.ecolind.2019.03.011. [DOI] [Google Scholar]
  • 178.De Almeida C.T., Galvao L.S., Ometto J.P.H.B., Jacon A.D., de Souza Pereira F.R., Sato L.Y., Lopes A.P., de Alencastro Graça P.M.L., de Jesus Silva C.V., Longo M., et al. Combining LiDAR and hyperspectral data for aboveground biomass modeling in the Brazilian Amazon using different regression algorithms. Remote Sens. Environ. 2019;232:111323. doi: 10.1016/j.rse.2019.111323. [DOI] [Google Scholar]
  • 179.Zhu W., Sun Z., Peng J., Huang Y., Li J., Zhang J., Yang B., Liao X. Estimating maize above-ground biomass using 3D point clouds of multi-source unmanned aerial vehicle data at multi-spatial scales. Remote Sens. 2019;11:2678. doi: 10.3390/rs11222678. [DOI] [Google Scholar]
  • 180.Anderson S.L., Murray S.C., Malambo L., Ratcliff C., Popescu S., Cope D., Chang A., Jung J., Thomasson J.A. Prediction of maize grain yield before maturity using improved temporal height estimates of unmanned aerial systems. Plant Phenome J. 2019;2:1–15. doi: 10.2135/tppj2019.02.0004. [DOI] [Google Scholar]
  • 181.Nguyen C., Sagan V., Bhadra S., Moose S. UAV Multisensory Data Fusion and Multi-Task Deep Learning for High-Throughput Maize Phenotyping. Sensors. 2023;23:1827. doi: 10.3390/s23041827. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 182.Rodriguez Padilla A.M., Quintana M.A., Prado R.M., Aguilar B.J., Shea T.A., Oskin M.E., Garcia L. Near-Field High-Resolution Maps of the Ridgecrest Earthquakes from Aerial Imagery. Seismol. Res. Lett. 2022;93:494–499. doi: 10.1785/0220210234. [DOI] [Google Scholar]
  • 183.Sun S., Li C. In-field high throughput phenotyping and phenotype data analysis for cotton plant growth using LiDAR; Proceedings of the Annual International Meeting, American Society of Agricultural and Biological Engineers; Spokane, WA, USA. 16–19 July 2017. [Google Scholar]
  • 184.Vidoni R., Gallo R., Ristorto G., Carabin G., Mazzetto F., Scalera L., Gasparetto A. ByeLab: An agricultural mobile robot prototype for proximal sensing and precision farming; Proceedings of the ASME International Mechanical Engineering Congress and Exposition; Tampa, FL, USA. 3–9 November 2017. [Google Scholar]
  • 185.Lozic E. Application of Airborne LiDAR Data to the Archaeology of Agrarian Land Use: The Case Study of the Early Medieval Microregion of Bled (Slovenia) Remote Sens. 2021;13:3228. doi: 10.3390/rs13163228. [DOI] [Google Scholar]
  • 186.Alijani Z., Meloche J., McLaren A., Lindsay J., Roy A., Berg A. A comparison of three surface roughness characterization techniques: Photogrammetry, pin profiler, and smartphone-based LiDAR. Int. J. Digit. Earth. 2022;15:2422–2439. doi: 10.1080/17538947.2022.2160842. [DOI] [Google Scholar]
  • 187.Davenport I.J., Holden N., Gurney R.J. Characterizing errors in airborne laser altimetry data to extract soil roughness. IEEE Trans. Geosci. Remote Sens. 2004;42:2130–2141. doi: 10.1109/TGRS.2004.834648. [DOI] [Google Scholar]
  • 188.Foldager F.F., Pedersen J.M., Haubro Skov E., Evgrafova A., Green O. Lidar-based 3d scans of soil surfaces and furrows in two soil types. Sensors. 2019;19:661. doi: 10.3390/s19030661. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 189.Hollaus M., Aubrecht C., Höfle B., Steinnocher K., Wagner W. Roughness mapping on various vertical scales based on full-waveform airborne laser scanning data. Remote Sens. 2011;3:503–523. doi: 10.3390/rs3030503. [DOI] [Google Scholar]
  • 190.Cassidy R., Thomas I.A., Higgins A., Bailey J.S., Jordan P. A carrying capacity framework for soil phosphorus and hydrological sensitivity from farm to catchment scales. Sci. Total Environ. 2019;687:277–286. doi: 10.1016/j.scitotenv.2019.05.453. [DOI] [PubMed] [Google Scholar]
  • 191.Southee F.M., Treitz P.M., Scott N.A. Application of lidar terrain surfaces for soil moisture modeling. Photogramm. Eng. Remote Sens. 2012;78:1241–1251. doi: 10.14358/PERS.78.11.1241. [DOI] [Google Scholar]
  • 192.Demelezi F., Gálya B., Tamás J., Demelezi I., Nagy A. Evaluation of soil water management properties based on LiDAR data and soil analyses at farm level. Nat. Resour. Sustain. Dev. 2019;2:160–173. doi: 10.31924/nrsd.v9i2.033. [DOI] [Google Scholar]
  • 193.Kemppinen J., Niittynen P., Riihimäki H., Luoto M. Modelling soil moisture in a high-latitude landscape using LiDAR and soil data. Earth Surf. Process. Landf. 2018;43:1019–1031. doi: 10.1002/esp.4301. [DOI] [Google Scholar]
  • 194.Saeys W., Lenaerts B., Craessaerts G., De Baerdemaeker J. Estimation of the crop density of small grains using LiDAR sensors. Biosyst. Eng. 2009;102:22–30. doi: 10.1016/j.biosystemseng.2008.10.003. [DOI] [Google Scholar]
  • 195.Selbeck J., Dworak V., Ehlert D. Testing a vehicle-based scanning lidar sensor for crop detection. Can. J. Remote Sens. 2010;36:24–35. doi: 10.5589/m10-022. [DOI] [Google Scholar]
  • 196.Deremetz M., Lenain R., Laneurit J., Debain C., Peynot T. Autonomous Human Tracking using UWB sensors for mobile robots: An Observer-Based approach to follow the human path; Proceedings of the 2020 IEEE Conference on Control Technology and Applications (CCTA); Montreal, QC, Canada. 24–26 August 2020. [Google Scholar]
  • 197.Prins A.J., Van Niekerk A. Crop type mapping using LiDAR, Sentinel-2 and aerial imagery with machine learning algorithms. Geo-Spat. Inf. Sci. 2021;24:215–227. doi: 10.1080/10095020.2020.1782776. [DOI] [Google Scholar]
  • 198.Canata T.F., Molin J.P., Sousa R.V.d. A measurement system based on lidar technology to characterize the canopy of sugarcane plants. Eng. Agríc. 2019;39:240–247. doi: 10.1590/1809-4430-eng.agric.v39n2p240-247/2019. [DOI] [Google Scholar]
  • 199.Sun Z., Li Q., Jin S., Song Y., Xu S., Wang X., Cai J., Zhou Q., Ge Y., Zhang R., et al. Simultaneous prediction of wheat yield and grain protein content using multitask deep learning from time-series proximal sensing. Plant Phenom. 2022;2022:9757948. doi: 10.34133/2022/9757948. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 200.Gené-Mola J., Gregorio E., Cheein F.A., Guevara J., Llorens J., Sanz-Cortiella R., Escolà A., Rosell-Polo J.R. Fruit detection, yield prediction and canopy geometric characterization using LiDAR with forced air flow. Comput. Electron. Agric. 2020;168:105121. doi: 10.1016/j.compag.2019.105121. [DOI] [Google Scholar]
  • 201.Gebbers R., Ehlert D., Adamek R. Rapid mapping of the leaf area index in agricultural crops. Agron. J. 2011;103:1532–1541. doi: 10.2134/agronj2011.0201. [DOI] [Google Scholar]
  • 202.Tilly N., Aasen H., Bareth G. Fusion of plant height and vegetation indices for the estimation of barley biomass. Remote Sens. 2015;7:11449–11480. doi: 10.3390/rs70911449. [DOI] [Google Scholar]
  • 203.Li W., Niu Z., Huang N., Wang C., Gao S., Wu C. Airborne LiDAR technique for estimating biomass components of maize: A case study in Zhangye City, Northwest China. Ecol. Indic. 2015;57:486–496. doi: 10.1016/j.ecolind.2015.04.016. [DOI] [Google Scholar]
  • 204.Iqbal J., Xu R., Sun S., Li C. Simulation of an autonomous mobile robot for LiDAR-based in-field phenotyping and navigation. Robotics. 2020;9:46. doi: 10.3390/robotics9020046. [DOI] [Google Scholar]
  • 205.Shang Y., Wang H., Qin W., Wang Q., Liu H., Yin Y., Song Z., Meng Z. Design and Test of Obstacle Detection and Harvester Pre-Collision System Based on 2D Lidar. Agronomy. 2023;13:388. doi: 10.3390/agronomy13020388. [DOI] [Google Scholar]
  • 206.Geer L., Gu D., Wang F., Mohan V., Dowling R. Novel Software Architecture for an Autonomous Agricultural Robotic Fruit Harvesting System; Proceedings of the 2022 27th International Conference on Automation and Computing (ICAC); Bristol, UK. 1–3 September 2022. [Google Scholar]
  • 207.Vrochidou E., Tziridis K., Nikolaou A., Kalampokas T., Papakostas G.A., Pachidis T.P., Mamalis S., Koundouras S., Kaburlasos V.G. An autonomous grape-harvester robot: Integrated system architecture. Electronics. 2021;10:1056. doi: 10.3390/electronics10091056. [DOI] [Google Scholar]
  • 208.Xiong Y., Ge Y., Grimstad L., From P.J. An autonomous strawberry-harvesting robot: Design, development, integration, and field evaluation. J. Field Robot. 2020;37:202–224. doi: 10.1002/rob.21889. [DOI] [Google Scholar]
  • 209.Long D.S., McCallum J.D. Mapping straw yield using on-combine light detection and ranging (lidar) Int. J. Remote Sens. 2013;34:6121–6134. doi: 10.1080/01431161.2013.793869. [DOI] [Google Scholar]
  • 210.Ehlert D., Adamek R., Horn H.-J. Laser rangefinder-based measuring of crop biomass under field conditions. Precis. Agric. 2009;10:395–408. doi: 10.1007/s11119-009-9114-4. [DOI] [Google Scholar]
  • 211.Long D.S., McCallum J.D. On-combine, multi-sensor data collection for post-harvest assessment of environmental stress in wheat. Precis. Agric. 2015;16:492–504. doi: 10.1007/s11119-015-9391-z. [DOI] [Google Scholar]
  • 212.Mulley M., Kooistra L., Bierens L. High-resolution multisensor remote sensing to support date palm farm management. Agriculture. 2019;9:26. doi: 10.3390/agriculture9020026. [DOI] [Google Scholar]
  • 213.Tilly N., Hoffmeister D., Cao Q., Huang S., Lenz-Wiedemann V., Miao Y., Bareth G. Multitemporal crop surface models: Accurate plant height measurement and biomass estimation with terrestrial laser scanning in paddy rice. J. Appl. Remote Sens. 2014;8:083671. doi: 10.1117/1.JRS.8.083671. [DOI] [Google Scholar]
  • 214.Loudermilk E.L., Hiers J.K., O’Brien J.J., Mitchell R.J., Singhania A., Fernandez J.C., Cropper W.P., Slatton K.C. Ground-based LIDAR: A novel approach to quantify fine-scale fuelbed characteristics. Int. J. Wildland Fire. 2009;18:676–685. doi: 10.1071/WF07138. [DOI] [Google Scholar]
  • 215.Guo T., Fang Y., Cheng T., Tian Y., Zhu Y., Chen Q., Qiu X., Yao X. Detection of wheat height using optimized multi-scan mode of LiDAR during the entire growth stages. Comput. Electron. Agric. 2019;165:104959. doi: 10.1016/j.compag.2019.104959. [DOI] [Google Scholar]
  • 216.Luo S., Liu W., Zhang Y., Wang C., Xi X., Nie S., Ma D., Lin Y., Zhou G. Maize and soybean heights estimation from unmanned aerial vehicle (UAV) LiDAR data. Comput. Electron. Agric. 2021;182:106005. doi: 10.1016/j.compag.2021.106005. [DOI] [Google Scholar]
  • 217.Wang C., Nie S., Xi X., Luo S., Sun X. Estimating the biomass of maize with hyperspectral and LiDAR data. Remote Sens. 2016;9:11. doi: 10.3390/rs9010011. [DOI] [Google Scholar]
  • 218.Zhou C., Ye H., Sun D., Yue J., Yang G., Hu J. An automated, high-performance approach for detecting and characterizing broccoli based on UAV remote-sensing and transformers: A case study from Haining, China. Int. J. Appl. Earth Obs. Geoinf. 2022;114:103055. doi: 10.1016/j.jag.2022.103055. [DOI] [Google Scholar]
  • 219.Canata T.F., Molin J.P., Sousa R.V.D. LiDAR technology for monitoring sugarcane production. Eng. Agric. 2019;39:4040. [Google Scholar]
  • 220.Huang X., Zheng S., Zhu N. High-throughput legume seed phenotyping using a handheld 3D laser scanner. Remote Sens. 2022;14:431. doi: 10.3390/rs14020431. [DOI] [Google Scholar]
  • 221.Harkel J., Bartholomeus H., Kooistra L. Biomass and Crop Height Estimation of Different Crops Using UAV-Based Lidar. Remote Sens. 2020;12:17. doi: 10.3390/rs12010017. [DOI] [Google Scholar]
  • 222.Thapa S., Zhu F., Walia H., Yu H., Ge Y. A Novel LiDAR-Based Instrument for High-Throughput, 3D Measurement of Morphological Traits in Maize and Sorghum. Sensors. 2018;18:1187. doi: 10.3390/s18041187. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 223.Lin C., Hu F., Peng J., Wang J., Zhai R. Segmentation and stratification methods of field maize terrestrial LiDAR point cloud. Agriculture. 2022;12:1450. doi: 10.3390/agriculture12091450. [DOI] [Google Scholar]
  • 224.Hoffmeister D., Waldhoff G., Curdt C., Tilly N., Bendig J., Bareth G. Precision Agriculture ’13. Springer; Berlin/Heidelberg, Germany: 2013. Spatial variability detection of crop height in a single field by terrestrial laser scanning. [Google Scholar]
  • 225.Zhou L., Gu X., Cheng S., Yang G., Shu M., Sun Q. Analysis of plant height changes of lodged maize using UAV-LiDAR data. Agriculture. 2020;10:146. doi: 10.3390/agriculture10050146. [DOI] [Google Scholar]
  • 226.Chiappini S., Giorgi V., Neri D., Galli A., Marcheggiani E., Malinverni E.S., Pierdicca R., Balestra M. Innovation in olive-growing by Proximal sensing LiDAR for tree volume estimation; Proceedings of the 2022 IEEE Workshop on Metrology for Agriculture and Forestry (MetroAgriFor); Perugia, Italy. 3–5 November 2022. [Google Scholar]
  • 227.Vescovo L., Gianelle D., Dalponte M., Miglietta F., Carotenuto F., Torresan C. Hail defoliation assessment in corn (Zea mays L.) using airborne LiDAR. Field Crops Res. 2016;196:426–437. doi: 10.1016/j.fcr.2016.07.024. [DOI] [Google Scholar]
  • 228.Weiss U., Biber P. Plant detection and mapping for agricultural robots using a 3D LIDAR sensor. Robot. Auton. Syst. 2011;59:265–273. doi: 10.1016/j.robot.2011.02.011. [DOI] [Google Scholar]
  • 229.Bin Mat Seri A.D.I., bin Mohd Kassim M.S., Sajak A.A.B. Development of Virescens Fresh Fruit Bunch Ripeness Prediction Using LiDAR for Smart Agriculture; Proceedings of the 2021 IEEE Region 10 Symposium (TENSYMP); Jeju, Republic of Korea. 23–25 August 2021; pp. 1–8. [Google Scholar]
  • 230.Su Y., Wu F., Ao Z., Jin S., Qin F., Liu B., Pang S., Liu L., Guo Q. Evaluating maize phenotype dynamics under drought stress using terrestrial lidar. Plant Methods. 2019;15:11. doi: 10.1186/s13007-019-0396-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 231.Goetz K.T., Soitinaho R., Oksanen T. Ploughing furrow recognition for onland ploughing using a 3D-LiDAR sensor. Comput. Electron. Agric. 2023;210:107941. doi: 10.1016/j.compag.2023.107941. [DOI] [Google Scholar]
  • 232.Panjvani K., Dinh A.V., Wahid K.A. LiDARPheno–A low-cost LiDAR-based 3D scanning system for leaf morphological trait extraction. Front. Plant Sci. 2019;10:147. doi: 10.3389/fpls.2019.00147. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 233.Hosoi F., Nakabayashi K., Omasa K. 3-D modeling of tomato canopies using a high-resolution portable scanning lidar for extracting structural information. Sensors. 2011;11:2166–2174. doi: 10.3390/s110202166. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 234.Wang H., Zhang W., Yang G., Lei L., Han S., Xu W., Chen R., Zhang C., Yang H. Maize Ear Height and Ear–Plant Height Ratio Estimation with LiDAR Data and Vertical Leaf Area Profile. Remote Sens. 2023;15:964. doi: 10.3390/rs15040964. [DOI] [Google Scholar]
  • 235.Hadas E., Jozkow G., Walicka A., Borkowski A. Apple orchard inventory with a LiDAR equipped unmanned aerial system. Int. J. Appl. Earth Obs. Geoinf. 2019;82:101911. doi: 10.1016/j.jag.2019.101911. [DOI] [Google Scholar]
  • 236.Estornell J., Ruiz L.A., Velázquez-Martí B., López-Cortés I., Salazar D., Fernández-Sarría A. Estimation of pruning biomass of olive trees using airborne discrete-return LiDAR data. Biomass Bioenergy. 2015;81:315–321. doi: 10.1016/j.biombioe.2015.07.015. [DOI] [Google Scholar]
  • 237.Estornell J., Hadas E., Martí J., López-Cortés I. Tree extraction and estimation of walnut structure parameters using airborne LiDAR data. Int. J. Appl. Earth Obs. Geoinf. 2021;96:102273. doi: 10.1016/j.jag.2020.102273. [DOI] [Google Scholar]
  • 238.Saha K.K., Tsoulias N., Weltzien C., Zude-Sasse M. Estimation of vegetative growth in strawberry plants using mobile LiDAR laser scanner. Horticulturae. 2022;8:90. doi: 10.3390/horticulturae8020090. [DOI] [Google Scholar]
  • 239.Murcia H.F., Tilaguy S., Ouazaa S. Development of a Low-Cost System for 3D Orchard Mapping Integrating UGV and LiDAR. Plants. 2021;10:2804. doi: 10.3390/plants10122804. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 240.Tsoulias N., Xanthopoulos G., Fountas S., Zude-Sasse M. Effects of soil ECa and LiDAR-derived leaf area on yield and fruit quality in apple production. Biosyst. Eng. 2022;223:182–199. doi: 10.1016/j.biosystemseng.2022.03.007. [DOI] [Google Scholar]
  • 241.Berk P., Stajnko D., Belsak A., Hocevar M. Digital evaluation of leaf area of an individual tree canopy in the apple orchard using the LIDAR measurement system. Comput. Electron. Agric. 2020;169:105158. doi: 10.1016/j.compag.2019.105158. [DOI] [Google Scholar]
  • 242.Wang M., Dou H., Sun H., Zhai C., Zhang Y., Yuan F. Calculation Method of Canopy Dynamic Meshing Division Volumes for Precision Pesticide Application in Orchards Based on LiDAR. Agronomy. 2023;13:1077. doi: 10.3390/agronomy13041077. [DOI] [Google Scholar]
  • 243.Sanz R., Llorens J., Escolà A., Arnó J., Planas S., Román C., Rosell-Polo J.R. LIDAR and non-LIDAR-based canopy parameters to estimate the leaf area in fruit trees and vineyard. Agric. For. Meteorol. 2018;260:229–239. doi: 10.1016/j.agrformet.2018.06.017. [DOI] [Google Scholar]
  • 244.Saha K.K., Zude-Sasse M. Estimation of chlorophyll content in banana during shelf life using LiDAR laser scanner. Postharvest Biol. Technol. 2022;192:112011. doi: 10.1016/j.postharvbio.2022.112011. [DOI] [Google Scholar]
  • 245.Yuan W., Choi D., Bolkas D. GNSS-IMU-assisted colored ICP for UAV-LiDAR point cloud registration of peach trees. Comput. Electron. Agric. 2022;197:106966. doi: 10.1016/j.compag.2022.106966. [DOI] [Google Scholar]
  • 246.Colaço A.F., Trevisan R.G., Molin J.P., Rosell-Polo J.R., Escolà A. Orange tree canopy volume estimation by manual and LiDAR-based methods. Adv. Anim. Biosci. 2017;8:477–480. doi: 10.1017/S2040470017001133. [DOI] [Google Scholar]
  • 247.Bietresato M., Carabin G., Vidoni R., Gasparetto A., Mazzetto F. Evaluation of a LiDAR-based 3D-stereoscopic vision system for crop-monitoring applications. Comput. Electron. Agric. 2016;124:1–13. doi: 10.1016/j.compag.2016.03.017. [DOI] [Google Scholar]
  • 248.Keightley K.E., Bawden G.W. 3D volumetric modeling of grapevine biomass using Tripod LiDAR. Comput. Electron. Agric. 2010;74:305–312. doi: 10.1016/j.compag.2010.09.005. [DOI] [Google Scholar]
  • 249.Zhang C., Yang G., Jiang Y., Xu B., Li X., Zhu Y., Lei L., Chen R., Dong Z., Yang H. Apple tree branch information extraction from terrestrial laser scanning and backpack-lidar. Remote Sens. 2020;12:3592. doi: 10.3390/rs12213592. [DOI] [Google Scholar]
  • 250.Liu X., Wang Y., Kang F., Yue Y., Zheng Y. Canopy parameter estimation of citrus grandis var. Longanyou based on Lidar 3d point clouds. Remote Sens. 2021;13:1859. doi: 10.3390/rs13091859. [DOI] [Google Scholar]
  • 251.Knapp-Wilson J., Bohn Reckziegel R., Thapa Magar S., Bucksch A., Chavez D.J. Three-dimensional phenotyping of peach tree-crown architecture utilizing terrestrial laser scanning. Plant Phenome J. 2023;6:e20073. doi: 10.1002/ppj2.20073. [DOI] [Google Scholar]
  • 252.Mahmud M.S., Zahid A., He L., Choi D., Krawczyk G., Zhu H., Heinemann P. Development of a LiDAR-guided section-based tree canopy density measurement system for precision spray applications. Comput. Electron. Agric. 2021;182:106053. doi: 10.1016/j.compag.2021.106053. [DOI] [Google Scholar]
  • 253.You A., Grimm C., Silwal A., Davidson J.R. Semantics-guided skeletonization of sweet cherry trees for robotic pruning. arXiv. 2021. doi: 10.48550/arXiv.2103.02833. [DOI] [Google Scholar]
  • 254.Hu F., Lin C., Peng J., Wang J., Zhai R. Rapeseed leaf estimation methods at field scale by using terrestrial LiDAR point cloud. Agronomy. 2022;12:2409. doi: 10.3390/agronomy12102409. [DOI] [Google Scholar]


Data Availability Statement

All of the data generated or analyzed during this study are included in this published article.

