Plant Communications. 2022 Jun 2;3(6):100344. doi: 10.1016/j.xplc.2022.100344

Proximal and remote sensing in plant phenomics: 20 years of progress, challenges, and perspectives

Haiyu Tao 1, Shan Xu 1, Yongchao Tian 1, Zhaofeng Li 2, Yan Ge 1, Jiaoping Zhang 4, Yu Wang 1, Guodong Zhou 5, Xiong Deng 6, Ze Zhang 2, Yanfeng Ding 1,3,5, Dong Jiang 1,3,5, Qinghua Guo 7, Shichao Jin 1,3,5,8
PMCID: PMC9700174  PMID: 35655429

Abstract

Plant phenomics (PP) has been recognized as a bottleneck in studying the interactions of genomics and environment on plants, limiting the progress of smart breeding and precise cultivation. High-throughput plant phenotyping is challenging owing to the spatio-temporal dynamics of traits. Proximal and remote sensing (PRS) techniques are increasingly used for plant phenotyping because of their advantages in multi-dimensional data acquisition and analysis. Substantial progress of PRS applications in PP has been observed over the last two decades and is analyzed here from an interdisciplinary perspective based on 2972 publications. This progress covers most aspects of PRS application in PP, including patterns of global spatial distribution and temporal dynamics, specific PRS technologies, phenotypic research fields, working environments, species, and traits. Subsequently, we demonstrate how to link PRS to multi-omics studies, including how to achieve multi-dimensional PRS data acquisition and processing, how to systematically integrate all kinds of phenotypic information and derive phenotypic knowledge with biological significance, and how to link PP to multi-omics association analysis. Finally, we identify three future perspectives for PRS-based PP: (1) strengthening the spatial and temporal consistency of PRS data, (2) exploring novel phenotypic traits, and (3) facilitating multi-omics communication.

Keywords: plant phenomics, remote sensing, phenotyping, phenotypic traits, multi-omics, breeding, precision cultivation


This review analyzes the progress of proximal and remote sensing (PRS) in plant phenomics (PP) over the last two decades from an interdisciplinary perspective. Means of linking PRS to multi-omics studies are demonstrated, and challenges and prospects for PRS in PP applications are highlighted.

Introduction

Improving crop production is a pressing global challenge driven by a growing population, limited resources, and a deteriorating climate (Rosegrant and Cline, 2003). Breeding ideal varieties and realizing precision cultivation are fundamental ways to meet this challenge (Moreira et al., 2020). High-throughput genomics, transcriptomics, and proteomics have matured over the last two decades, enabling large-scale dissection of the genetic basis of important traits (Varshney et al., 2009; Roitsch et al., 2019). However, high-throughput, high-precision, and multi-dimensional phenotypic data acquisition and analysis lag far behind and have become bottlenecks for high-throughput breeding by genetic improvement and for precision cultivation management (Montes et al., 2007; Grosskinsky et al., 2015).

Plant phenomics (PP), an emerging interdisciplinary subject, is widely recognized as an accelerator for breeding and an optimizer for the cultivation of plants (Watt et al., 2020), including agricultural, forest, horticultural, and grassland plants. The history of phenomics can be traced back to 1911, when the concept of the phenotype was proposed to represent “all types of organisms, distinguishable by direct inspection or by finer methods of measuring or description” (Johannsen, 1911). Phenomics is a much broader concept that refers to the acquisition of high-dimensional phenotypic data on an organism-wide scale (Soule, 1967). The concepts of phenotype and phenomics were both proposed by geneticists to decipher the relationship between genes and target traits (e.g., cancer). When plant scientists adopted these concepts, the terms plant phenotype and PP were coined to describe plant growth, performance, and composition (Furbank, 2011). The methods and protocols that take research from plant phenotypes to PP are collectively defined as plant phenotyping (Schurr, 2013).

Traditional phenotyping has been implemented mainly by manual measurement or scoring, which is tedious, time-consuming, and labor-intensive (Dhondt et al., 2013). The development of rapid breeding and precision cultivation has placed new demands on the throughput, accuracy, repeatability, and novelty of phenotyping. First, high-throughput, highly accurate, and high-precision phenotyping is the foundation. Second, there is an increasing demand for non-destructive, timely, and repeatable phenotyping, for example of senescence dynamics (Anderegg et al., 2020). Third, novel, high-dimensional, and invisible phenotypes, such as the leaf-to-panicle ratio (LPR) (Xiao et al., 2021a) and canopy occupation volume (COV) (Liu et al., 2021a), are difficult to measure with traditional methods.

Remote sensing is the acquisition of information without physical contact (Navalgund et al., 2007). It has been widely used in geoscience and engineering and sheds new light on plant phenotyping (Jin et al., 2021b). Since the first aerial photo was taken from a hot-air balloon in 1858, remote sensing has progressed from qualitative to quantitative analysis, benefiting from the development of sensors (e.g., RGB, hyperspectral, thermal, and light detection and ranging [LiDAR]), platforms (e.g., robots, drones, and satellites), and information technologies (e.g., computer vision) (Horning, 2008). With the development of platform techniques and their widespread application to PP, the definition of remote sensing has taken on a more precise distinction in the modern context, i.e., proximal and remote sensing (PRS) (Jin et al., 2021b; Li et al., 2021; Pineda et al., 2021). The use of sensors close to plants is defined as proximal sensing (PS) and includes approaches such as computed tomography (CT) and magnetic resonance imaging (MRI). By contrast, the use of sensors at a distance from plants is defined as remote sensing (RS) and includes airborne and space-borne imaging (Figure 1). The quality of PRS data is characterized by its temporal, spatial, and spectral resolutions, which determine its advantages for quantitative plant phenotyping (Navalgund et al., 2007; Toth and Jóźków, 2016). The noncontact working mechanism makes PRS a suitable tool for non-destructive and repeatable measurements, and its varied temporal, spatial, and spectral resolutions have boosted the acquisition of time-series, multi-scale, and multi-dimensional phenotyping data.

Figure 1.


The spectral, spatial, and temporal dimensions of PRS (sensors and platforms) in PP under the background of genetic diversity and environmental gradients.

The three axes of the cube represent spectral, temporal, and spatial dimensional data from different PRS platforms, including proximal platforms (such as tripods, gantries, and vehicles) and remote platforms (such as drones, airplanes, and satellites). The spectral dimension refers to phenotyping plants with an electromagnetic spectrum that ranges from gamma rays to microwaves. The temporal dimension is the time interval for plant observation, including single, diurnal, seasonal, and inter-annual observations. The spatial dimension includes phenotyping from the cell/tissue level to the global level.

Unprecedented progress of PRS in PP has been witnessed in the last two decades. PRS has provided observations of plants from the cell (Piovesan et al., 2021) to the population (Inostroza et al., 2016), from above ground (Maesano et al., 2020) to underground (Shi et al., 2013), and from indoor (controlled) environments to field conditions on multiple spatial (Pallottino et al., 2019; Jin et al., 2020d; Xie and Yang, 2020) and temporal scales (Din et al., 2017; Chivasa et al., 2019; Weksler et al., 2020). Meanwhile, image analysis methods enable the transformation of multiple spatio-temporal and spectral data into phenotypic knowledge, including plant structural, physiological, and performance-related traits (e.g., biomass and yield) (Feng et al., 2008; Koppe et al., 2010; Dhondt et al., 2013; Din et al., 2017). More importantly, multi-dimensional PRS data mining and interpretation have brought about a renaissance for PP (Yang et al., 2013; Li et al., 2014).

Combining PP and PRS can accelerate breeding and cultivation management (Xiao et al., 2022). Breeders are concerned with the stability of inheritance and the expression of genes in natural environments, which traditionally takes a long time to assess. In recent years, PRS technology has helped breeders rapidly characterize the performance of genotypes in multiple environments, enabling early seedling assessment (Yu et al., 2013), key growth-period monitoring (Song et al., 2021b), and yield prediction (Zhuo et al., 2022), thereby improving and accelerating trait selection. In addition, pleiotropy and multigenic effects contribute substantial uncertainty to breeding (Jin et al., 2020d). Multiple spatial and temporal analyses with PRS technology have increased the interpretability of genomic × environment (G × E) interactions and have established an effective feedback mechanism between breeding and cultivation (Jung et al., 2021).

The increasing interdisciplinary applications of PRS in PP have spawned several important review articles during the last decade. These efforts have mainly concerned the applications of specific PRS sensors (e.g., laser scanners, Jin et al., 2021b; optical imagers, Li et al., 2014), platforms (e.g., unmanned aerial vehicles [UAVs]; Feng et al., 2021), and phenotyping methods (Jiang and Li, 2020), and, from the perspective of plant scientists, have emphasized how to link PP to breeding (Araus and Cairns, 2014; Araus et al., 2018; Yang et al., 2020a; Moreira et al., 2020). However, the PP community lacks a systematic review from the PRS perspective that covers phenotypic observation, data interpretation, and phenomics analysis. This paper therefore aims to make the following contributions: (1) reviewing the overall applications of PRS to PP during the last two decades rather than focusing on only one class of sensor or platform; (2) summarizing the data acquisition, processing, modeling, and analysis techniques that help link PRS to PP and multi-omics analysis; and (3) highlighting interdisciplinary challenges and prospects from a PRS perspective.

Progress of PRS in PP

Overview of PRS

PRS is the noncontact acquisition of information about objects or phenomena. Its history can be traced back to the picture of Paris taken from a hot-air balloon in 1858. The balloon platform was later replaced by more advanced airplanes and satellites, enabling PRS to be used in military reconnaissance (Hudson and Hudson, 1975), land surveying (Sheng-qing, 2002), topographic mapping (Schuler et al., 1998), and so forth. Meanwhile, sensors have developed from simple RGB cameras to various passive and active sensors: passive PRS receives electromagnetic radiation from targets illuminated by sunlight, whereas active PRS receives radiation from targets illuminated by sensor-emitted energy (Toth and Jóźków, 2016). Passive PRS sensors commonly include visible-light cameras (RGB), multi-/hyperspectral imagers (MS/HS), chlorophyll fluorescence imagers, and thermal imagers. Active PRS sensors mainly include LiDAR, laser-induced chlorophyll fluorescence (LIF), synthetic aperture radar (SAR), and CT. Active and passive sensors provide a wealth of data sources for phenotypic observations under different working conditions, and their use is inseparable from the development of multiple platforms.

PRS platforms can be stationary or mobile, indoor or outdoor, proximal or remote (Toth and Jóźków, 2016; Guo et al., 2020b) and include tripods, gantries, vehicles, drones, airplanes, and satellites (Figure 1). In PP, these platforms have developed from RS (e.g., satellites) toward PS (e.g., drones). Satellite platforms usually have the advantage of global accessibility. Although most satellite platforms have relatively low spatial-temporal resolution, some low-orbit satellites offer promising sub-meter resolution (e.g., WorldView-4, GeoEye-1, and GF-2) and/or near-daily revisit times (e.g., AVHRR/MODIS, WorldView-4, and SuperView-1). Airborne platforms (e.g., helicopters) have the advantage of deployment flexibility; they can mount various types of sensors and respond quickly under suitable conditions. The emergence of drones has further improved the flexibility of low-altitude observations and greatly reduced data costs. However, the endurance and load capacity of drones are far inferior to those of ground platforms such as fixed tripods, mobile gantries, and ground vehicles. As the technology matures and costs decrease, portable and affordable personal PS devices (e.g., smartphones and handheld LiDAR) (Balenović et al., 2021) are becoming powerful tools for exploring plant phenotypes (Lane et al., 2010; Fan et al., 2018).

The development and popularity of PRS sensors and platforms have promoted the construction of highly automated plant phenomics facilities (PPFs) for phenotype acquisition and data transmission in indoor and field environments (Pratap et al., 2015). Indoor platforms have the advantage of high stability and enable mutual feedback between phenotypic observation and environmental control. However, most indoor PPFs have limited space and controlled environments, and plant samples are therefore transported to the facility; these are described as “plant-to-sensor” facilities. Well-known plant-to-sensor facilities include CropDesign (BASF, Germany) (Sinclair, 2006), PHENOPSIS (INRAE, France) (Granier et al., 2006), the PlantScreen system (NPEC, the Netherlands) (https://plantphenotyping.com/), and HRPF (HZAU, China) (Yang et al., 2014). By contrast, most outdoor PPFs measure phenotypes under natural environmental conditions in which plants are fixed and sensors are mounted on movable facilities; these are described as “sensor-to-plant” facilities. Representative sensor-to-plant facilities include Field Scanalyzer (Rothamsted Research, UK) (Virlet et al., 2017), Crop 3D (CAS, China) (Guo et al., 2016), and NU-Spidercam (University of Nebraska, USA) (Bai et al., 2019). These indoor and outdoor PPFs combine automatic control systems and multiple sensors, enabling high-throughput and high-precision observation of multi-scale (from cell to population), time-series (single-time, diurnal, seasonal, and inter-annual), and multi-dimensional plant phenotypes (Figure 1). Such PRS-based observations in turn enable versatile applications of PRS in PP.

Overview of PRS applications in PP

To analyze the advances, challenges, and perspectives of PRS in PP, we retrieved publications from the Web of Science database (Clarivate Analytics, USA) between 2000 and 2020 using the following keywords and criteria: (“plant”) AND ("remote sensing" or "RGB" or "visible light" or "digital camera" or "∗spectral" or "lidar" or "light detection and ranging" or "laser scanning" or "thermal" or "Chlorophyll Fluorescence" or "SIF" or "CT" or "Computed Tomography" or "PET" or "Positron Emission Tomography" or "NMR" or "Nuclear Magnetic Resonance" or "MRI" or "Magnetic Resonance Imaging"). A total of 52 492 papers in English appeared after conference papers were excluded. Subsequently, a large number of automatically retrieved articles that did not match the review topic were manually eliminated from the analysis based on their article titles, keywords, and abstracts. Finally, 2972 articles were selected and classified into different categories in terms of global spatial distribution and temporal dynamics patterns, specific PRS technologies, phenotypic research fields, working environments, species, traits, and so forth. Specific categorization criteria and final classification results are provided in a supplementary table that is available at https://github.com/ShichaoJin/PRSinPP.
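To make the screening step concrete, the sketch below shows how retrieved records could be tagged and filtered programmatically. It is a minimal illustration, assuming a hypothetical Web of Science CSV export with "Title", "Abstract", and "Year" columns, and it uses abbreviated keyword lists rather than the full criteria described above.

```python
# Minimal sketch of the manual-screening step, assuming a hypothetical CSV export
# ("wos_export.csv") with columns "Title", "Abstract", and "Year".
# The keyword lists are illustrative, not the exact criteria used in the review.
import pandas as pd

SENSOR_TERMS = {
    "RGB": ["rgb", "visible light", "digital camera"],
    "MS/HS": ["spectral"],
    "LiDAR": ["lidar", "light detection and ranging", "laser scanning"],
    "Thermal": ["thermal"],
}

def tag_sensors(text: str) -> list[str]:
    """Return the sensor categories whose keywords appear in a title/abstract."""
    text = text.lower()
    return [name for name, terms in SENSOR_TERMS.items()
            if any(term in text for term in terms)]

records = pd.read_csv("wos_export.csv")                      # hypothetical export file
records["text"] = records["Title"].fillna("") + " " + records["Abstract"].fillna("")
records["sensors"] = records["text"].apply(tag_sensors)
screened = records[records["sensors"].map(len) > 0]          # drop off-topic hits
print(screened.groupby("Year").size())                       # yearly publication counts
```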

The global spatial distribution and temporal dynamics of PRS-based phenotyping publications show that (1) the total number of PRS phenotyping applications worldwide has increased every year, especially after 2014; (2) the major publishing continents over the past two decades have been the Americas and Europe, whereas Asia developed slowly in the first decade but rapidly in the last 10 years, surpassing Europe and drawing level with the Americas (Figure 2A); and (3) most countries in the Americas but only a few countries in Africa conduct PRS phenotyping research, with China, Germany, and Australia making the major contributions in Asia, Europe, and Oceania, respectively (Figure 2B).

Figure 2.


The global spatial distribution and temporal dynamics of the number of phenotypic applications using PRS during 2000–2020.

(A) Number of yearly publications by continent, (B) total number of publications by country, (C) yearly publications using different phenotypic sensors, and (D) yearly publications using different phenotypic platforms.

To analyze the specific PRS technologies applied in phenotyping, we examined temporal trends in phenotypic sensors and platforms. RGB sensors appeared early and have flourished in the last 5 years with the development of PS (Figure 2C). Multi-/hyperspectral sensors are the most popular, accounting for more than 50% of sensor use in almost all years. Some emerging PS sensors have seen fewer applications but show a growth trend; for example, LiDAR is mainly used to measure structural phenotypes, and CT is usually used to measure fine internal plant structures (Lafond et al., 2015). Furthermore, the simultaneous use of multiple sensors has gradually increased, indicating a growing demand for multiple phenotypes when studying plant growth.

In terms of PRS platforms (Figure 2D), ground-based platforms are the mainstream application choice, followed by space-borne platforms and airborne platforms. Space-borne platforms have grown steadily, whereas airborne platform applications have increased more rapidly in the last 5 years, mainly owing to the popularity of near-ground aircraft such as drones. Phenotyping applications of multi-platform combinations have also emerged in recent years, but their proportion is small.

The above PRS sensors and platforms have boosted multi-spatial and multi-temporal PP studies of different organ types in different working environments (Figure 3). Outdoor phenotyping research far exceeds indoor phenotyping research, and only a few studies span both indoor and outdoor environments (Figure 3C). Phenotyping has also been conducted at multiple spatial scales, including the cell, organ, individual, canopy, landscape, and global scales (Figure 3A). Canopy, organ, and landscape are the top three scales, together accounting for nearly 90% of the surveyed publications. Organ-level phenotyping publications focus mainly on leaves and roots (Figure 3B). Analysis of multi-temporal studies shows that early applications of PRS to PP consisted mainly of one-off (single-time) observations and later shifted toward the growth cycle, including diurnal, seasonal, and inter-annual phenotyping (Figure 3D). Among multi-temporal applications, inter-annual phenotyping was the most popular because it places low demands on throughput, whereas diurnal applications were less common because of the challenges of high-throughput, repeatable, and comparable phenotyping. On the whole, there have been relatively few phenotypic studies across multiple environments, organs, and spatio-temporal scales.

Figure 3.


The number of publications on PRS in PP for multi-spatial and multi-temporal studies of different organ types in different working environments.

(A) Multi-spatial scales, from cell to global; (B) specific organ types; (C) working environments, including indoor, outdoor, and both; (D) multi-temporal scales, including single-time, diurnal, seasonal, and inter-annual phenotyping.

The multiple spatio-temporal applications of PRS in PP can be further analyzed from the perspective of research fields (Figure 4). The growing numbers of published papers show that the main research fields are agriculture, forestry, horticulture, and grasslands (Figure 4A). Because PRS has always been most widely used in agriculture, we next analyzed the specific species studied. The top 10 commonly studied agricultural species are summarized in Figure 4B, which shows that cereal crops, including wheat, maize, and rice, are studied most often. Plant phenotypic traits in different fields can be divided into three categories according to Dhondt et al. (2013): physiological, structural, and performance-related. Physiological traits account for the largest proportion of studies and performance-related traits the smallest (Figure 4C), and studies that combine multiple trait categories have emerged only in the last decade. The specific phenotypic traits in each category and the technological readiness level (TRL) of applications using different sensors are evaluated using the methods of Araus et al. (2018) and Jin et al. (2021b) (Table 1). Although phenotyping itself has grown rapidly, it has also promoted interdisciplinary research on phenomics (P), genomics (G), and the environment (E). In these interdisciplinary studies, the P × E interaction has been studied most commonly, whereas there have been relatively few studies of P × G and P × E × G interactions (Figure 4D).

Figure 4.


PRS applications in PP.

PRS applications in PP in terms of (A) research field, (B) agricultural species, (C) trait class, and (D) interdisciplinary studies.

Table 1.

TRL of different phenotypic traits at different spatial scales using different sensors.

Scale | Class | Trait | PET | CT | RGB | MS | HS | LiDAR | LIF | Thermal | SAR | MRI | References
Global level | structural | LAI | - | - | - | H | M | M | - | - | L | - | Ganguly et al., 2012; Xiao et al., 2017
Global level | structural | canopy height | - | - | - | M | L | H | - | - | H | - | Hudak et al., 2002; Simard et al., 2011; Lucas et al., 2014; Tao et al., 2016
Global level | structural | canopy cover | - | - | - | H | M | H | - | - | L | - | Glenn et al., 2016; Tang et al., 2019
Global level | physiological | fAPAR | - | - | - | H | M | - | - | - | M | - | Li et al., 2017; Zhang et al., 2018c; Dong et al., 2020
Landscape level | structural | canopy height | - | - | - | M | L | H | - | - | H | - | Fieuzal and Baup 2016; Fagua et al., 2019; Zhao and Qin 2020
Landscape level | structural | LAI | - | - | - | H | M | M | - | - | H | - | Inoue and Sakaiya 2013; Inoue et al., 2014; Yadav et al., 2019
Landscape level | physiological | canopy chlorophyll | - | - | - | H | H | - | - | - | - | - | Gevaert et al., 2015
Landscape level | physiological | plant nitrogen concentration | - | - | - | H | H | - | - | - | - | - | Koppe et al., 2010
Landscape level | physiological | absorbed photosynthetically active radiation | - | - | - | H | H | - | - | - | - | - | Zhang et al., 2019d
Landscape level | physiological | stress-related traits | - | - | - | H | H | L | - | H | - | - | Wang et al., 2019a; Merrick et al., 2019; Stroppiana et al., 2019; Shekhar et al., 2020
Canopy level | structural | canopy gap/cover | - | - | H | M | M | H | - | - | L | - | Perry et al., 2012; Li et al., 2019a
Canopy level | structural | canopy height/width | - | - | M | L | L | H | - | - | M | - | Fieuzal and Baup 2016; French et al., 2016; Guo et al., 2019; Jimenez-Berni et al., 2018; Jin et al., 2018; Lopez-Granados et al., 2019; Madec et al., 2017; Maesano et al., 2020; Qiu et al., 2019; Su et al., 2019b; Sun et al., 2017; ten Harkel et al., 2020; Xie et al., 2021b; Zhang and Grift 2012
Canopy level | structural | leaf/green/plant area index | - | - | M | H | H | H | - | - | M | - | Jiao et al., 2011; Beriaux et al. 2013, 2015; Fontanelli et al., 2013; Inoue and Sakaiya 2013; Inoue et al., 2014; Baghdadi et al., 2015; Fieuzal and Baup 2016; Liu et al., 2017b; Su et al., 2019b; Wang et al., 2019d; Mandal et al. 2020a, 2020b, 2020c; Zhang et al., 2020a; Hussain et al., 2020; Qi et al., 2020
Canopy level | structural | crop emergence | - | - | H | L | L | L | - | - | - | - | Sankaran et al., 2015; Li et al., 2019a
Canopy level | structural | plant count (flower, ear number, individual number)/plant density | - | - | H | L | L | M | - | - | L | - | Wu et al., 2011; Jiang et al., 2012; French et al., 2016; Fernandez-Gallego et al. 2018, 2019a, 2020; Lopez-Granados et al., 2019; Madec et al., 2019; Blanquart et al., 2020; Mandal et al., 2020d; Fang et al., 2020; Lu and Cao 2020; Vergara-Diaz et al., 2020
Canopy level | structural | row spacing | - | - | M | - | - | M | - | - | - | - | Qiu et al., 2019
Canopy level | structural | weed infestation | - | - | M | H | M | L | - | - | - | - | López-Granados 2011; Peña et al. 2013, 2015; López-Granados et al., 2016; Pérez-Ortiz et al., 2016; Chen et al., 2018
Canopy level | structural | lodging | - | - | L | L | L | M | - | - | M | - | Yang et al., 2015; Zhao et al., 2017; Zhang et al., 2018a; Hongzhong et al., 2019; Ajadi et al., 2020; Longfei et al., 2020; Shu et al., 2020
Canopy level | structural | canopy architecture | - | - | L | - | - | H | - | - | - | - | Yuan et al., 2019a
Canopy level | physiological | disease/infestation detection | - | - | H | H | H | L | L | L | - | - | Backoulou et al., 2011; Mishra et al., 2011; Rousseau et al., 2013; Zhou et al., 2015; Ortiz-Bustos et al., 2016; Perez-Bueno et al., 2016; Sugiura et al., 2016; Leucker et al., 2017; Pineda et al., 2017; Su et al., 2018; Thomas et al., 2018; Yu et al., 2018; Yuan et al., 2019b; Jiang and Bai 2019; Polder et al., 2019; Sancho-Adamson et al., 2019; Bendel et al., 2020; Chivasa et al., 2020; Husin et al., 2020; Prabhakaran et al., 2020
Canopy level | physiological | abiotic stress detection (e.g., water stress, nitrogen stress, salt stress, ozone stress, cold stress, heat stress, drought stress) | - | - | M | M | H | L | M | M | L | - | Smith et al., 2004; Casadesus et al., 2007; Delalieux et al., 2007; Fumagalli et al., 2009; Kim et al., 2011; Zia et al., 2013; Virlet et al. 2014a, 2014b; Salvatori et al. 2014, 2016; Singh and Sarkar 2014; Jedmowski and Bruggemann 2015; Subramanian et al., 2015; Zaman-Allah 2015; Gameiro et al., 2016; Vescovo et al., 2016; Zhou et al., 2016; Savi et al., 2017; Sytar et al., 2017; Camino et al., 2018; Moghimi et al., 2018; Asaari et al., 2019; Su et al., 2019b; Buchaillot et al., 2019; Yang et al., 2019b; Chen et al., 2019; Wang et al., 2019c; Zhang et al. 2019c, 2019d, 2020b; Jiao et al., 2019; Poudyal et al., 2019; Zibrat et al., 2019; Cotrozzi et al., 2020; Feng et al., 2020; Helm et al., 2020; Horgan et al., 2020; Song et al., 2020; Xian et al., 2020; Secchi et al., 2021
Canopy level | physiological | water content | - | - | - | M | H | L | - | L | - | - | Bruning et al., 2019; Rehman et al., 2020
Canopy level | physiological | nitrogen/phosphorus/potassium/protein content | - | - | - | H | H | - | - | - | - | - | Stroppiana et al., 2009; Thoren and Schmidhalter 2009; Li et al. 2010, 2018a; Eitel et al., 2011; Cao et al., 2013; Xue et al., 2014; Bruning et al., 2019; Stavrakoudis et al., 2019; Sun et al., 2019; Jasim et al., 2020; Jiang and Li 2020; Lee et al., 2020
Canopy level | physiological | chlorophyll content | - | - | L | H | H | - | - | - | - | - | Wu et al., 2008; Tubuxin et al., 2015; Zhu et al., 2020b
Canopy level | physiological | photosynthesis/light use efficiency/transpiration | - | - | - | L | M | L | M | - | - | - | Liu et al. 2013, 2019b; Lu et al., 2018a; Fu et al., 2019a; Du et al., 2019; Keller et al., 2019; Nichol et al., 2019; Shan et al., 2019; Chang et al., 2020; Li et al., 2020
Canopy level | physiological | water/fertilizer use efficiency | - | - | - | H | M | - | - | - | - | - | Schmidt et al., 2011; Cao et al., 2012; Lu et al., 2018b
Canopy level | physiological | canopy temperature | - | - | - | L | L | - | - | H | - | - | Romero-Bravo et al., 2019; Sagan et al., 2019
Canopy level | physiological | fAPAR | - | - | - | H | M | L | - | - | - | - | Guillén-Climent et al., 2013; Du et al., 2017b; Zhou et al., 2017
Individual level | structural | 3D reconstruction | - | L | M | - | - | H | - | - | - | - | Jin et al., 2018
Individual level | structural | plant shape/size | - | L | M | - | - | H | - | - | - | - | Thapa et al., 2018; Malambo et al., 2019
Individual level | structural | organ separation | - | - | H | L | L | M | - | - | - | - | Paulus et al., 2013; Jin et al. 2019, 2020b
Individual level | physiological | nitrogen content | - | - | L | M | H | M | - | - | - | - | Bi et al. 2020a, 2021; Liu et al., 2020a
Organ level | structural | leaf angle/rolling/orientation | - | - | L | - | - | H | - | - | - | - | Li et al., 2018b; Thapa et al., 2018; Su et al., 2020
Organ level | structural | organ detection and quantification | - | H | H | L | L | M | - | - | - | L | Paquit et al., 2011; Blunk et al., 2017; Tracy et al., 2017; Wu et al., 2021a
Organ level | structural | organ size | - | M | - | - | - | M | - | - | - | L | Metzner et al., 2014; Mairhofer et al., 2016; Pflugfelder et al., 2017; Zhang et al., 2018b; Maenhout et al., 2019; Soltaninejad et al., 2020
Organ level | structural | fruit/seed identification | - | M | H | L | L | M | - | - | - | - | Liu et al., 2014; Shrestha et al., 2015; Gutierrez et al., 2018; Wendel et al., 2018
Organ level | physiological | water transfer efficiency | - | M | - | L | L | - | - | - | - | - | Koebernick et al., 2015
Organ level | physiological | leaf nitrogen content | - | - | - | M | H | - | - | - | - | - | Tian et al., 2011; Yendrek et al., 2017; Zheng et al., 2018; Liu et al., 2020a
Organ level | physiological | leaf wilting | - | - | L | L | L | - | - | - | - | - | Fumagalli et al., 2009
Organ level | physiological | abiotic stress (temperature stress) | - | - | - | M | H | - | L | L | - | - | Zhou et al., 2018; Shen et al., 2020a; Ma et al., 2020b
Organ level | physiological | pigment | - | - | L | M | H | - | - | - | - | - | Gutierrez et al., 2019
Organ level | physiological | respiration rate/photoprotection | - | - | - | L | M | - | - | - | - | - | Magney et al., 2014; Coast et al., 2019
Organ level | physiological | root water content | - | - | - | L | L | - | - | - | - | - | Bodner et al., 2017
Organ level | physiological | radiation use efficiency | - | - | - | M | M | - | - | - | - | - | McAusland et al., 2019; Lenk et al., 2020
Tissue/cell level | structural | porosity distribution | - | H | - | - | - | - | - | - | - | - | Nugraha et al., 2019
Tissue/cell level | structural | tissue structure/cell structure/position | - | H | - | - | - | - | - | - | - | L | Yamauchi et al., 2012; Herremans et al., 2015; Du et al., 2017a; Zhang et al., 2021
Tissue/cell level | structural | deformation recognition | - | H | - | - | - | - | - | - | - | - | Wang et al., 2018
Tissue/cell level | physiological | stomatal conductance | - | - | - | - | - | - | - | - | - | - | Rischbeck et al., 2017; Poudyal et al., 2019; Vialet-Chabrand and Lawson 2019
Tissue/cell level | physiological | physiological processes | M | - | - | - | - | - | - | - | - | - | Converse et al. 2013, 2015; Partelová et al., 2017
Multiple levels | performance-related | biomass estimation | - | L | H | H | H | H | - | - | H | - | Bankestad and Wik 2016; Castro et al., 2020; Galan et al., 2020; Ge et al., 2016; Gnyp et al., 2014; Jimenez-Berni et al., 2018; Jin et al., 2020c; Li et al., 2010; Ma et al., 2020a; Maesano et al., 2020; Maimaitijiang et al., 2019; Mandal et al., 2020a; Perry et al., 2012; ten Harkel et al., 2020; Thoren and Schmidhalter 2009; Walter et al., 2019; Wang et al., 2019b; Wang et al., 2019d; Xian et al., 2020; Yang et al., 2019a; Yao et al., 2018
Multiple levels | performance-related | yield prediction | - | - | M | H | M | M | - | - | M | - | Inman et al., 2007; Dente et al., 2008; Motomiya et al., 2012; Rey et al., 2013; Bu et al., 2016; Maki et al., 2017; Zhuo et al. 2018, 2019; Fernandez-Gallego et al., 2019b; Poudyal et al., 2019; Setiyono et al., 2019; Halubok and Yang 2020; Jasim et al., 2020; Lenk et al., 2020; Moghimi et al., 2020; Peng et al., 2020; Smith et al., 2020; Vavlas et al., 2020; Xian et al., 2020
Multiple levels | performance-related | grain quality | - | - | L | H | M | - | - | - | - | - | Ma et al., 2015; Bu et al., 2016; Smith et al., 2020
Multiple levels | performance-related | gross primary production | - | - | - | L | L | L | - | - | - | - | Wang et al. 2016, 2020c; Liu et al. 2017a, 2020b; Wei et al., 2019; Zhang et al., 2020c

Note: the TRL method was described by Araus et al. (2018) and Jin et al. (2021b). “Multiple levels” refers to traits involving more than one of the levels above.

fAPAR, fraction of absorbed photosynthetically active radiation.

Legend: H = high TRL; M = medium TRL; L = low TRL; - = not applicable.

This section has summarized progress in the application of PRS to PP in terms of its technological and practical aspects. Passive sensors are the most frequently used sensor type (Figure 5A), whereas active sensors (e.g., LiDAR) have gradually attracted researchers' interest because they rely less on ambient conditions. PS is becoming the dominant approach for plant phenotyping (Figure 5B) owing to its high spatial, temporal, and spectral resolution. The research targets (organs) are mainly aboveground, but a considerable amount of work has focused on underground phenotypes or on combinations of aboveground and underground phenotypes (Figure 5C). In addition, although phenotypic research on abiotic/biotic stress, interdisciplinary approaches, and deep learning are current interests, they still represent only a small proportion of the published literature (Figures 5D–5F). How to leverage PRS to better address these research interests and contribute to multi-omics studies is a question worth pursuing.

Figure 5.


Current PRS technology applications.

A summary of current PRS technology applications in PP from different perspectives: (A) passive versus active sensing, (B) proximal versus remote sensing, (C) aboveground versus underground traits, (D) stress versus non-stress, (E) phenomics versus multi-omics, and (F) deep learning versus non-deep learning.

How to link PRS to G × P × E studies

The development of PP is becoming increasingly important for promoting multi-omics studies. Genomic and phenomic association analysis methods have been successfully used in crop-improvement breeding, enabling researchers to map chromosome regions that condition complex traits (Carlson et al., 2019), screen drought-resistant germplasm (Wu et al., 2021b), and predict crop yield or quality (Romero-Bravo et al., 2019; Sun et al., 2019). Plant phenotypes are also affected by the environment, showing different types of phenotypic plasticity (Sultan, 2000). Stotz et al. (2021) provided a good interpretation of G × E interactions in plant phenotypes by studying differences in phenotypic plasticity across biogeographic scales; they emphasized the need to consider environmental factors in order to improve the genetic potential of future plants to adapt to climate change.

However, owing to interactions between genes and the environment (Dowell et al., 2010), plant phenotypes are comprehensive and spatio-temporal: the same genotype may give rise to different traits, and similar traits may arise from different genotypes. Analogously, the same material may produce different spectra, and similar spectra may correspond to different materials, a problem that has been studied for decades in PRS. Methods proposed to determine the essential relationship between PRS signals and the intrinsic properties of matter include linear mixing models such as geometric methods (Drumetz et al., 2020), non-negative matrix factorization (Fu et al., 2019b), Bayesian methods (Shuai et al., 2019), and sparse unmixing (Sun et al., 2020), as well as non-linear approaches such as bilinear mixing models (Luo et al., 2019) and multilinear mixing models (Li et al., 2019b). For example, Zhou et al. (2019) used a spectral unmixing method, vertex component analysis (VCA), to identify and visualize pathogen infection from hyperspectral images; the abundance maps calculated by VCA enabled high-throughput screening of plant disease infection at early stages. Yuan et al. (2021) distinguished and amplified the spectral differences between rice and background by integrating the abundance information of the mixed components of rice fields, calculated with a bilinear mixing model (BMM), with spectral indices, which also improved the accuracy of rice yield estimation. These PRS theories and methods may provide a way to analyze the complexity of phenotypes and multi-omics.
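As a concrete illustration of the linear mixing model mentioned above, the following minimal sketch unmixes a single hyperspectral pixel into endmember abundances under a non-negativity constraint; the endmember spectra are random placeholders, whereas real workflows would derive them from VCA or a spectral library.

```python
# Minimal sketch of linear spectral unmixing for one hyperspectral pixel, in the
# spirit of the abundance-map analyses cited above. Endmember spectra here are
# random placeholders, not measured spectra.
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel: np.ndarray, endmembers: np.ndarray) -> np.ndarray:
    """Estimate non-negative abundances for one pixel under a linear mixing model.

    pixel: reflectance spectrum, shape (n_bands,)
    endmembers: columns are endmember spectra, shape (n_bands, n_endmembers)
    """
    abundances, _ = nnls(endmembers, pixel)        # enforce non-negativity
    total = abundances.sum()
    return abundances / total if total > 0 else abundances  # approximate sum-to-one

rng = np.random.default_rng(0)
E = np.abs(rng.normal(size=(50, 3)))               # 50 bands, 3 placeholder endmembers
true_abund = np.array([0.6, 0.3, 0.1])
pixel = E @ true_abund + rng.normal(scale=0.01, size=50)  # synthetic mixed pixel
print(unmix_pixel(pixel, E))                       # should be close to [0.6, 0.3, 0.1]
```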

PRS for plant phenotyping

Data acquisition

PRS enables multi-spatial, multi-temporal, and multispectral data acquisition in a non-invasive and high-throughput manner (Song et al., 2021a; Jangra et al., 2021) (Figure 6). Multispectral data can be collected using various sensors: graph, shape, and spectral information are ideally captured by high-resolution RGB cameras, three-dimensional (3D) sensors (e.g., LiDAR), and hyperspectral imagers, respectively. RGB cameras provide fast access to two-dimensional (2D) plant and canopy morphology information (Poire et al., 2014; Yang and Han, 2020), whereas time-of-flight sensors, such as TOF cameras and LiDAR, can produce finer 3D structural phenotypes of plants (Paulus, 2019; Li et al., 2022). In addition to these morphological and structural traits, physiological traits involved in plant biochemical processes can be obtained using multispectral, hyperspectral (Guo et al., 2017), chlorophyll fluorescence (Jiang and Bai, 2019), and thermal imaging technologies (Rajsnerova and Klem, 2012). These physiological traits can quickly and effectively indicate plant growth and developmental status, enabling early assessment of plant vigor (Candiago et al., 2015), early detection of plant pathogens (Rumpf et al., 2010), estimation of gross primary production (Zhang et al., 2020c), and characterization of changes in stomatal conductance (Vialet-Chabrand and Lawson, 2019). Finally, proximal tomography techniques, such as CT, PET, MRI, and NMR, are recommended for acquiring traits that are not visible to the human eye, such as photoassimilate distribution (Wang et al., 2014), root system structure (Xu et al., 2018), metabolic processes (Phetcharaburanin et al., 2020), plant internal damage (Lyons et al., 2020), and cellular water status (Musse et al., 2017).

Figure 6.

Figure 6

The path of linking PRS to multi-omics by phenotyping and phenomics practices.

In multi-omics analysis, the P2G in black represents the pathway (black arrow) from phenomics to genomics, and the G2P in red represents the pathway (red arrow) from genomics to phenomics.

Multi-spatial data are usually acquired by integrating multiple platforms according to the working environment and required data quality (Ravi et al., 2019). Ground-based platforms are the most common platforms for data acquisition and can be further divided into indoor and outdoor platforms depending on the working environment. Indoor platforms usually operate under controlled conditions and are oriented toward phenotyping at the level of the individual plant (Bi et al., 2021), organ (Sarkar et al., 2021), or cell (Sun et al., 2021), which is particularly advantageous for acquiring comparable phenotypic data. Ground-based outdoor platforms, such as gantries, handheld or backpack instruments, and mobile vehicles, focus on plant- to canopy-scale phenotyping. In addition, airborne and satellite platforms can be used to obtain phenotypes of plant populations from the field to the global scale (higher spatial throughput), facilitating the study of environmental and genetic plasticity in plant expression across different ecological contexts.

Multi-temporal data can be divided into single-time, diurnal, seasonal, and inter-annual frequencies. In the early years, PRS phenotype monitoring was mostly a one-off exploratory exercise because of the limitations of phenotyping platforms and algorithm performance (Weiss et al., 2020). The advent of satellite imagery products has enabled inter-annual observation of phenology traits on a global scale (Setiyono et al., 2018). The rapid development of airplane and drone technology in recent years has greatly increased the temporal resolution of phenotyping (Holman et al., 2016), allowing the acquisition of seasonal and even diurnal phenotype data. Thanks to the development of active sensors (e.g., LiDAR; Guo et al., 2018b) and high-throughput phenotyping facilities (e.g., gantries; Guo et al., 2016), diurnal plant phenotyping (e.g., of circadian rhythms; Chaudhury et al., 2019; Jin et al., 2021a) can be fully achieved, enabling analysis of plant growth at higher temporal resolution.

Data processing

High-throughput and high-precision trait extraction from PRS data is an essential step from sensors to biological knowledge (Tardieu et al., 2017). Data preprocessing steps such as radiometric calibration and geometric alignment are important for ensuring phenotyping accuracy. For example, the raw spectral signal (e.g., a one-dimensional [1D] curve or 2D image) records digital number values, which must be converted to physical quantities such as radiance and reflectance (Zhu et al., 2020a). In addition, spatial distortion can occur during PRS image acquisition. Systematic errors are predictable and are usually calibrated at the sensor end (Berra and Peppa, 2020), whereas random observation errors are usually corrected by geometrically calibrating the PRS image to a known ground coordinate system (such as topographic maps and ground control points) (Han et al., 2020; Liu et al., 2021b). After preprocessing, the data processing pipeline is usually sensor specific. 1D spectral curves, such as hyperspectral curves, usually require dimension reduction (Luo et al., 2020), wavelet transformation (Paul and Chaki, 2021), and spectral index calculation (Fu et al., 2020). 2D image-based phenotyping (e.g., with RGB or multi-/hyperspectral images) usually involves image registration (Tondewad and Dale, 2020), classification (Cheng et al., 2020), segmentation (Hossain and Chen, 2019), and trait extraction (Jiang et al., 2020). 3D data, such as LiDAR or image-reconstructed point clouds, typically undergo a processing pipeline of registration (Cheng et al., 2018), denoising (Hu et al., 2021), sampling (Bergman et al., 2020), filtering (Jin et al., 2020a), normalization (Kwan and Yan, 2020), classification/segmentation (Mao and Hou, 2019), and trait extraction (Jin et al., 2021b). Together, these phenotyping methods enable the extraction of structural, physiological, and performance-related traits.
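The sketch below illustrates two of the preprocessing steps described above: converting digital numbers to reflectance with an empirical-line calibration and computing a spectral index (NDVI). The gain/offset coefficients and pixel values are placeholders, not calibration constants for any specific sensor.

```python
# Minimal sketch of two common preprocessing steps: empirical-line conversion of
# raw digital numbers (DN) to reflectance, and calculation of a spectral index
# (NDVI). All coefficients and arrays are illustrative placeholders.
import numpy as np

def dn_to_reflectance(dn: np.ndarray, gain: float, offset: float) -> np.ndarray:
    """Empirical-line calibration: reflectance = gain * DN + offset."""
    return gain * dn.astype(float) + offset

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red + eps)

dn_red = np.array([[1200, 1350], [1100, 1500]])    # synthetic DN values (red band)
dn_nir = np.array([[3000, 3300], [2800, 3600]])    # synthetic DN values (NIR band)
red = dn_to_reflectance(dn_red, gain=2.0e-4, offset=-0.01)   # placeholder coefficients
nir = dn_to_reflectance(dn_nir, gain=2.0e-4, offset=-0.01)
print(ndvi(nir, red))
```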

Recently, integrated analysis platforms have been developed to process phenotypic data with improved throughput and efficiency. Image Harvest is a high-throughput image analysis framework (Knecht et al., 2016) that significantly reduces phenotyping costs for plant biologists. MISIRoot, an in situ and non-destructive root phenotyping robot, was developed to assess the health of plant roots (Song et al., 2021c). In light of the large volume of phenotypic data, some open-source and cross-platform frameworks have been proposed for flexible and effective phenotyping; PhenoImage, for example, is an open-source image processing platform that gives non-computer professionals simple access to high-throughput and efficient phenotyping (Zhu et al., 2021). These high-throughput phenotyping systems with integrated data acquisition and processing have become ideal options for breeders (Wu et al., 2020) because they provide automated and flexible procedures for high-throughput image processing algorithms (Zhang et al., 2019b), paving the way for PP studies with the help of data modeling.
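As a toy example of the kind of trait-extraction routine such platforms automate, the sketch below estimates plot-level canopy cover from an RGB image using the excess-green index; the threshold and the synthetic image are illustrative assumptions and are not part of any of the tools named above.

```python
# Minimal sketch of a batch-style trait-extraction routine: canopy cover per plot
# image via the excess-green (ExG) index. Threshold and "plot image" are placeholders.
import numpy as np

def canopy_cover(rgb: np.ndarray, threshold: float = 0.05) -> float:
    """Fraction of pixels classified as vegetation by the ExG index (2G - R - B)."""
    r, g, b = [rgb[..., i].astype(float) / 255.0 for i in range(3)]
    exg = 2 * g - r - b
    return float((exg > threshold).mean())

rng = np.random.default_rng(3)
plot_image = rng.integers(0, 256, size=(100, 100, 3), dtype=np.uint8)  # stand-in for a real plot photo
print(f"Estimated canopy cover: {canopy_cover(plot_image):.2f}")
```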

PRS for PP

Data modeling is important for exploring phenotypic knowledge with biological significance from multi-dimensional phenotypes. Basic statistical models generally suffice for simple preliminary analyses. By contrast, machine-learning methods are superior for high-dimensional and non-linear modeling of phenotypic tasks such as yield prediction (Ashapure et al., 2020). However, machine-learning-based methods usually require handcrafted features, and their performance does not improve substantially as data volumes grow (Guo et al., 2020a). Deep learning, a new branch of data-driven machine learning, can handle more complex phenotypic tasks by performing automatic feature extraction from massive datasets, creating a paradigm shift in phenomics analysis (Jiang and Li, 2020; Nabwire et al., 2021). For example, wheat ears can be identified and counted from thousands of RGB images of the wheat canopy (Misra et al., 2020). However, deep-learning-based methods usually have higher requirements for data volume and quality.
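As an illustration of the machine-learning route described above, the sketch below fits a random forest regressor to hypothetical plot-level vegetation-index and canopy-height features for yield prediction; all data are synthetic and the feature set is an assumption for demonstration, not a prescribed protocol.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical plot-level features: NDVI at three growth stages plus canopy height (m)
n_plots = 200
features = rng.uniform([0.3, 0.5, 0.4, 0.4], [0.7, 0.9, 0.8, 1.2], size=(n_plots, 4))
# Synthetic yield (t/ha) loosely driven by the features, with noise
yield_t_ha = 2.0 + 4.0 * features[:, 1] + 1.5 * features[:, 3] + rng.normal(0, 0.3, n_plots)

model = RandomForestRegressor(n_estimators=300, random_state=0)
scores = cross_val_score(model, features, yield_t_ha, cv=5, scoring="r2")
print(f"Cross-validated R2: {scores.mean():.2f}")
```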

In addition to empirical and data-driven methods, whose generalizability and interpretability may be questioned, mechanistic models based on physical and mathematical principles are of great interest because of their explainability and generalizability (Berger et al., 2018a). Radiative transfer models (RTMs) have been studied extensively as a reliable means of characterizing crop canopy differences; one of the most popular physical models is PROSAIL (Su et al., 2019a), which has been used successfully to invert leaf area index (LAI) and canopy chlorophyll content (CCC). A recent study demonstrated the potential of the PROSAIL model for studying crop growth traits based on UAV RS (Wan et al., 2021). Crop growth models (CGMs) such as WOFOST (CWFS&WUR, the Netherlands) (Van Diepen et al., 1989), DSSAT (University of Florida, USA) (Jones et al., 2003), Agricultural Production System Simulator (APSIM) (CSIRO, Australia) (Keating et al., 2003), STICS (INRA, France) (Brisson et al., 2003), and CropGrow (NJAU, China) (Zhu et al., 2020c) have also received widespread attention. By integrating the interactions among crop genetic potential, environmental effects, and cultivation techniques, crop growth models can simulate the growth and development of crops under different conditions, effectively predicting plant responses to stress (Tang et al., 2009), simulating the effects of climate extremes on crop yield (Pohanková et al., 2022), predicting the performance of varieties in target environments (Lamsal et al., 2017), and explaining how G × E interactions affect crop productivity (Messina et al., 2018). Therefore, CGMs may provide decision support for precision agriculture, variety selection, and management optimization (Kherif et al., 2022).
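A common way to use an RTM such as PROSAIL for trait retrieval is lookup-table (LUT) inversion: simulate canopy spectra over a grid of candidate trait values and select the combination whose spectrum best matches the observation. The sketch below illustrates only that inversion logic; toy_forward_model is a hypothetical stand-in, not PROSAIL itself, and all values are assumptions for demonstration.

```python
import numpy as np

def toy_forward_model(lai, cab, wavelengths):
    """Placeholder canopy reflectance model standing in for an RTM such as PROSAIL.
    A real study would call an RTM implementation here."""
    green_peak = np.exp(-0.5 * ((wavelengths - 550) / 40.0) ** 2) * (60.0 / (cab + 20.0))
    nir_plateau = (wavelengths > 700) * (1 - np.exp(-0.5 * lai))
    return 0.05 + 0.1 * green_peak + 0.45 * nir_plateau

wavelengths = np.arange(450, 901, 10.0)

# Build a lookup table over candidate trait values
lai_grid = np.linspace(0.5, 6.0, 56)
cab_grid = np.linspace(10.0, 80.0, 36)
lut_params = [(lai, cab) for lai in lai_grid for cab in cab_grid]
lut_spectra = np.array([toy_forward_model(lai, cab, wavelengths) for lai, cab in lut_params])

# "Observed" canopy spectrum, here simulated with known traits plus noise
obs = toy_forward_model(3.2, 45.0, wavelengths) + np.random.default_rng(1).normal(0, 0.005, wavelengths.size)

# Invert by minimizing the spectral RMSE against the lookup table
rmse = np.sqrt(((lut_spectra - obs) ** 2).mean(axis=1))
best_lai, best_cab = lut_params[int(rmse.argmin())]
print(f"Retrieved LAI ~ {best_lai:.2f}, chlorophyll ~ {best_cab:.1f} ug/cm2")
```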

PP for multi-omics

Accurate and rapid analysis of phenomics data not only expands our understanding of the dynamic developmental processes of plants but also provides a novel channel for interdisciplinary dialogue among multi-omics disciplines, including genomics, epigenomics, transcriptomics, proteomics, metabolomics, and microbiomics. Such dialogue shows significant potential for integrating multi-dimensional regulatory networks from plant genes to phenotypes (Watt et al., 2020).

Multi-omics analysis pipelines can generally be grouped into two types: phenomics to omics (e.g., P2G, from phenomics to genomics) and omics to phenomics (e.g., G2P, from genomics to phenomics) (Figure 6). By combining phenomics with other omics data and performing association analyses across different times and environments, quantitative trait loci (QTLs) can be located and candidate genes or networks discovered (Furbank, 2009). Optimal traits (ideotypes) can then be designed by genome editing.

PRS-based phenomics has been used to accelerate multi-omics studies for identifying new genetic loci, screening high-quality varieties, and accelerating breeding. In terms of rapid genetic loci identification, non-destructive, dynamic, and high-throughput phenotyping provides high-quality phenomics data for genome-wide association studies (GWASs), leading to rapid identification of the genetic architecture of important agronomic traits (Guo et al., 2018c). For example, Wu et al. (2019a) used a high-throughput micro-CT-RGB imaging system to obtain 739 traits from 234 rice accessions at nine time points. A total of 402 significantly associated loci were identified, and two of them were associated with yield and vigor, thus contributing to the selection of high-yielding varieties. Zhang et al. (2017) quantified 106 maize phenotypic traits using an automated high-throughput phenotyping platform and identified 988 QTLs, including three hotspots. They revealed the dynamic genetic structure of maize growth and provided a new strategy for selecting superior maize varieties. In terms of selecting high-quality germplasm, high-throughput phenotyping platforms have proven feasible. Using phenomics information on plant height, leaf shape, color, and flowering time, 18 stable genetic mutants were successfully screened from a library of several thousand tobacco mutants (Wang et al., 2017). Dynamic high-throughput phenotyping data were used for genomic selection to assess optimal wheat varieties under drought and high-temperature environments, demonstrating that the combination of genetic information and phenomics data can help breeders identify and select quality wheat lines more effectively (Crain et al., 2018). In addition, high-throughput phenotyping sensors have provided important data support for accelerating breeding. The target selection cycle for corn oil content can be reduced from 100 generations to 18 generations by combining MRI and near-infrared sensors (Song et al., 1999; Dudley and Lambert, 2004; Song and Chen, 2004). Overall, PP has played a crucial role in indoor germplasm screening and field performance evaluation of crop varieties.
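The GWAS workflows cited above rest on a simple statistical core: testing the association between each marker and an image-derived trait and then correcting for multiple testing. The sketch below illustrates only that core with synthetic genotypes and a single simulated causal marker; real analyses would additionally account for population structure and kinship.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical data: 300 accessions, 1,000 SNPs coded 0/1/2, one image-derived trait
n_acc, n_snp = 300, 1000
genotypes = rng.integers(0, 3, size=(n_acc, n_snp)).astype(float)
causal = 17  # one marker with a true effect, for illustration only
trait = 10.0 + 0.8 * genotypes[:, causal] + rng.normal(0, 1.0, n_acc)

# Marker-by-marker association: simple linear regression of trait on genotype
p_values = np.empty(n_snp)
for j in range(n_snp):
    slope, intercept, r, p, se = stats.linregress(genotypes[:, j], trait)
    p_values[j] = p

# Bonferroni threshold as a crude multiple-testing correction
threshold = 0.05 / n_snp
hits = np.where(p_values < threshold)[0]
print("Significant markers:", hits)
```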

In summary, PRS paves the way for multi-omics studies by providing phenotyping methodology and phenomics knowledge, as shown in Figure 6. By acquiring multi-spatial, multi-temporal, and multispectral data, structural, physiological, and performance-related traits can be extracted with data processing and modeling methods. Empirical, data-driven, and mechanistic methods can integrate multi-dimensional phenotypes and transform data into phenomics knowledge. As a bridge for multi-omics research, PRS-based PP provides unprecedented opportunities, although there are still many challenges.

Challenges and future perspectives

Strengthening the spatial and temporal consistency of PRS data

The plant phenotype involves comprehensive traits (e.g., biomass) that also show spatio-temporal changes with plant growth and development owing to co-regulation by the genome and the environment (Dowell et al., 2010). PRS enables high-throughput, high-precision, and multi-dimensional phenotyping, benefiting from various available and affordable sensors and platforms. Some considerations are recommended to improve phenotypic data quality. First, choose an appropriate spectral/spatial/temporal resolution based on the phenotypic targets. Second, standardize data collection processes to ensure comparability and improve processing efficiency by following standards published by international organizations (Liping, 2003; Kresse, 2010). More importantly, because of the increasing need for repeatable phenotyping, data sharing, and interdisciplinary collaboration, a more serious challenge for PRS-based phenotyping is to maintain the spatio-temporal consistency of multi-source data.

Spatio-temporal consistency is the key to ensuring the comparability of phenotypes, which is especially important for PS applications. Unlike RS, which observes the earth in similar ways using unified data acquisition, transfer, and processing protocols, PS usually has comparability problems between different sensors because of their different working settings and spatio-temporal resolutions (Aasen et al., 2018). Therefore, it is necessary but challenging to improve the spatio-temporal consistency of PS-based phenotyping. For data acquisition, an integrated space-ground network incorporating wireless sensor networks (WSNs) can be operated to coordinate plant trait observations from satellites and ground-based systems (Huang et al., 2018). Another promising approach is to develop novel sensors that can fuse data at the signal level. Some efforts have been made to develop hyperspectral LiDAR and fluorescence LiDAR sensors using techniques like RGB color-based restoration (Wang et al., 2020a), geometric invariability-based calibration (Zhang et al., 2019a), and multispectral waveform decomposition (Song et al., 2019). These have initially achieved the simultaneous acquisition of geometric and radiation information for plant structural and biochemical traits (Du et al., 2016; Bi et al., 2020b).

At the feature and/or decision levels, fusing multi-source data is also beneficial for strengthening spatio-temporal consistency and fully utilizing the complementary advantages of multiple sensors and platforms. Traditional multi-source feature fusion approaches are usually implemented with simple linear regression by assigning weights to different extracted features (Rischbeck et al., 2016; Sobejano-Paz et al., 2020). With the increasing volume, heterogeneity, and non-linear characteristics of PRS data, especially hyperspectral imagery and LiDAR data, advanced machine-learning algorithms (e.g., domain adaptation and transfer learning) have been developed to provide fusion solutions (Ghamisi et al., 2019). Deep learning is a fast-growing field in PRS and has been used for multi-source data fusion. For example, a classification method based on an interleaving perception convolutional neural network (IP-CNN) was developed to fuse spectral and spatial features (Zhang et al., 2022). Similarly, these statistical, machine-learning, and deep-learning methods have also been used to fuse multi-source data at the decision level (Ouhami et al., 2021). For example, the nitrogen nutrition status of multiple organs in almond trees was successfully assessed by integrating spectral reflectances of leaf and root tissues based on weighted partial least squares regression (Paz-Kagan et al., 2020). A micro-phenotypic analysis of micronutrient stress was achieved by combining fluorescence kinetics with cell-related traits (e.g., stomatal conductance) in leaves (Mijovilovich et al., 2020). Although integrating multi-source phenotypic traits can deepen our understanding of plant behavior under multiple conditions, the automatic alignment of heterogeneous PRS data and auxiliary data (e.g., geolocated texts and images) is still challenging.
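As a minimal illustration of feature-level fusion, the sketch below concatenates hypothetical spectral and LiDAR-derived structural features and fits a cross-validated partial least squares regression to a synthetic nitrogen-status target; the feature sets and target are assumptions for demonstration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(7)
n_plots = 150

# Hypothetical per-plot features from two sources
spectral = rng.normal(0.5, 0.1, size=(n_plots, 10))   # e.g., band reflectances / indices
structural = rng.normal(1.0, 0.3, size=(n_plots, 3))  # e.g., LiDAR height, cover, volume
fused = np.hstack([spectral, structural])              # feature-level fusion by concatenation

# Synthetic nitrogen status driven by both sources
nitrogen = 1.5 * spectral[:, 2] + 0.8 * structural[:, 0] + rng.normal(0, 0.1, n_plots)

pls = PLSRegression(n_components=4)
pred = cross_val_predict(pls, fused, nitrogen, cv=5)
r = np.corrcoef(nitrogen, np.ravel(pred))[0, 1]
print(f"Fused-feature PLS, cross-validated r = {r:.2f}")
```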

In addition, data transmission, processing, and storage should include a complete record of metadata, and standardized methods should be adopted to further ensure spatio-temporal data consistency (Jang et al., 2020; Guo et al., 2021). Detailed documentation may include metadata names, acquisition steps (e.g., sensor types and configurations), and experimental conditions (such as experimental design, environmental conditions, time, and geographic location). Metadata also include standard processing methods and unified data formats, which are particularly significant for the integration of multi-source phenomics data and other omics data (Coppens et al., 2017). Metadata will help to build a PP database based on plant ontology. For example, the GnpIS database contains data from indoor and outdoor experiments, from experimental design to data collection, and follows the findable, accessible, interoperable, and reusable (FAIR) principles (Pommier et al., 2019), enabling data to be shared with researchers in different fields over time.
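A minimal example of the kind of metadata record discussed above is sketched below as a Python dictionary serialized to JSON; the field names are illustrative assumptions and would in practice be mapped to a community standard such as MIAPPE rather than invented ad hoc.

```python
import json

# Hypothetical metadata record for one UAV multispectral flight (field names are
# illustrative only and do not follow any particular published schema)
metadata = {
    "campaign_id": "wheat_trial_2021_flight_05",
    "sensor": {"type": "multispectral", "model": "example-5band", "bands_nm": [475, 560, 668, 717, 840]},
    "platform": {"type": "UAV", "altitude_m": 30, "ground_sampling_distance_cm": 2.1},
    "acquisition": {"date": "2021-05-14", "time_utc": "10:32", "sun_elevation_deg": 52.4},
    "site": {"latitude": 32.03, "longitude": 118.85, "experimental_design": "RCBD, 3 replicates"},
    "processing": {"radiometric_calibration": "reflectance panel", "software_version": "x.y.z"},
}
print(json.dumps(metadata, indent=2))
```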

Finally, data sharing is an important step for strengthening interdisciplinary studies and promoting the integration of datasets from multiple sources, creating an unprecedented amount of information that can be reused to generate novel knowledge (Roitsch et al., 2019). For example, an open plant breeding network database, ImageBreed, was designed for image-based phenotyping queries against genotype, phenotype, and experimental design information to train machine-learning models and support breeding decisions (Morales et al., 2020). In particular, Harfouche et al. (2019) considered the sharing of data between individual researchers and breeders to be one of the key challenges for artificial intelligence (AI) breeding in the next decade. They emphasized that data sharing may contribute to deeper insights into data and improve the robustness of breeding programs. Several shared databases have been constructed, such as the GnpIS database (Pommier et al., 2019) described above.

Exploring novel phenotypic traits

Plant phenotypic traits are the “spokespersons” of PP and multi-omics communication, yet many phenotypic traits are still unexplored or unnoticed. PRS may expand the frontier of plant phenotyping and enrich plant science studies.

PRS may help to discover traditionally unobservable phenotypes. The root system is a key organ for keeping plants vigorous and vibrant (Watt et al., 2020), but its phenotyping is easily overlooked owing to limited accessibility and the lack of efficient tools for trait extraction, and underground phenotyping remains challenging. As a commonly used non-invasive geophysical technique, ground-penetrating radar has been successfully used to characterize various root system traits. Liu et al. (2022) proposed an automated framework for processing ground-penetrating radar data that provides new opportunities for determining root water content under field conditions and increasing our understanding of plant root system interactions with the environment (soil and water). Research interest in root phenotyping has increased the application of other 3D visualization techniques such as CT and MRI, and the development of these techniques and new methods has, in turn, increased the potential for understanding complex root systems and their environmental interactions (Topp et al., 2013; Maenhout et al., 2019; Falk et al., 2020). The leaf stoma is another challenging microscopic phenotype that can be observed using advanced PRS technology. Xie et al. (2021a) introduced a high-throughput epidermal cell phenotype analysis pipeline based on confocal microscopy that was combined with QTL techniques to identify the heritability of epidermal traits in field maize, providing a physiological and genetic basis for further studies on stomatal development and conductance.

PRS enables the proposition of novel biologically meaningful traits. Using RGB cameras and deep learning, Yang et al. (2020b) proposed LPR as a novel phenotypic trait indicative of source-sink relationships, revealing unique canopy light interception patterns of ideal-plant-architecture varieties from a solar perspective. Liu et al. (2021a) used a LiDAR camera to develop a new algorithm to segment maize stems and leaves and introduced COV, a new phenotyping trait that characterizes the photosynthetic capacity of plant canopies. The fusion of multi-source PRS data also helps us to understand plant phenotyping traits at a finer scale in multiple dimensions. Shen et al. (2020b) resolved the pattern of biochemical pigmentation with age and species vertical variation in different tree species based on fused hyperspectral and LiDAR data, providing an important potential indicator for quantifying the terrestrial carbon cycle.

PRS boosts time-series phenotyping. Monitoring and tracking plant growth dynamics such as growth duration (Park et al., 2016), flowering rate (Zhang et al., 2019e), filling habit, and senescence dynamics (Han et al., 2018) is a long-term interest of biologists. As early as the eighteenth century, the French astronomer Jean-Jacques d'Ortous de Mairan discovered that leaves of the mimosa plant exhibit a normal daily rhythm independent of changes in daylight, suggesting a regular adaptation of the plant to its environment, now known as the circadian rhythm (McClung, 2006). Plant rhythms are important for the study of plant responses to changing environments (Webb, 2003). However, time-series observations of plant phenotyping lag far behind the study of growth rhythms in plant physiology (e.g., the circadian clock). LiDAR, as an active sensing technology, is less affected by environmental light conditions and has been successfully used to explore the seasonal and circadian rhythms of plant growth at the individual and organ levels (Puttonen et al. 2016, 2019; Herrero-Huerta et al., 2018; Jin et al., 2021a). However, in situ measurements occur only at a specific time and place, and the resulting conclusions may not be universal. Recently launched and forthcoming earth observation satellites (e.g., OCO-3) offer a possible solution. These satellites have diurnal sampling capabilities that can increase the exploration of diurnal patterns of carbon and water uptake by different ecosystems and plants at different life stages (Xiao et al., 2021b). These time-series phenotypes derived from PS and RS may help to find new traits/genes (Das Choudhury et al., 2018) and large-scale phenotypic plasticity (Stotz et al., 2021) in response to environmental change, respectively.

PRS may also be beneficial for multi-scale phenotyping. Combining RS techniques to capture phenotypic variation that connects molecular biology to earth-system science would also be a major research interest (Pallas et al., 2018; Porcar-Castell et al., 2021). Meanwhile, large-scale phenotypic analysis could help us to understand within-species variation in plants and thus reveal whether local provenances have sufficient genetic variation in functional traits to cope with environmental change (Camarretta et al., 2020). Mizuno et al. (2020) described the tolerance of inbred quinoa lines to salt stress under three different landscape conditions and demonstrated the genotype-phenotype relationships for salt tolerance among quinoa lines, providing a useful basis for molecular elucidation and genetic improvement of quinoa. More studies have explored genetic gains analyzed at both landscape and individual scales (Dungey et al., 2018; Tauro et al., 2022). It is foreseeable that combining PRS techniques will be beneficial for unraveling more universal genetic mechanisms across different scales.

However, phenotyping discrepancies among scales should be considered (Wu et al., 2019b) because PRS signals are influenced by different factors at each scale. For example, the leaf-level spectrum is mainly related to leaf thickness, structure, pigment, and water content, whereas the canopy-level spectrum is influenced by canopy structure (e.g., LAI and leaf inclination distribution) (Berger et al., 2018b). In addition, topographic and climatic factors need to be considered at landscape and even higher levels (Zarnetske et al., 2019). Solutions for multi-scale transformation include (1) physical models, such as scaling from leaf to canopy level based on the PROSPECT and SAIL models (Li et al., 2018c); (2) pixel-based methods, such as fractal theory (Wu et al., 2015), for correcting the scaling effect of the LAI estimated from heterogeneous pixels in coarse-resolution images (this scaling effect is illustrated in the sketch below); and (3) object-based methods, such as simulation zone partitions for separating and clustering large regions into smaller zones with similar crop growth traits and environments (Guo et al., 2018a). Despite these existing multi-scale phenotyping and modeling solutions, more efforts are needed to understand the interactions of phenotypes among different scales that may contribute to multi-omics analysis.
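The scaling effect referenced in solution (2) can be demonstrated in a few lines: because trait-retrieval relationships are typically non-linear, retrieving LAI from an averaged coarse pixel differs from averaging LAI retrieved at fine resolution. The NDVI-to-LAI relation and its coefficients below are hypothetical and serve only to illustrate the bias.

```python
import numpy as np

def lai_from_ndvi(ndvi):
    """Illustrative non-linear NDVI-to-LAI relation (coefficients are hypothetical)."""
    return -2.0 * np.log(1.0 - ndvi / 0.95)

rng = np.random.default_rng(3)
# A heterogeneous coarse pixel made of 100 fine-resolution sub-pixels
fine_ndvi = np.clip(rng.normal(0.6, 0.15, 100), 0.05, 0.9)

lai_fine_mean = lai_from_ndvi(fine_ndvi).mean()  # retrieve first, then aggregate
lai_coarse = lai_from_ndvi(fine_ndvi.mean())     # aggregate first, then retrieve

print(f"Mean of fine-scale LAI : {lai_fine_mean:.2f}")
print(f"LAI of averaged pixel  : {lai_coarse:.2f}  (scaling bias from non-linearity)")
```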

Facilitating multi-omics communication

In the coming years, an important challenge in phenomics will be to identify the genetic and environmental determinants of phenotypes. Multi-omics analysis shows promise for resolving the spatio-temporal regulatory networks of important agronomic traits (Yang et al., 2020a).

Discovering genes according to phenotypic variation has enabled tremendous advances, but high-throughput gene discovery still has a long way to go with PRS-derived phenomics (P2G) (Furbank et al., 2019). "Genetic gain" is a fundamental concept in quantitative genetics and breeding and refers to the incremental performance per unit of time achieved through artificial selection (Araus et al., 2018). The integration of different levels of phenotyping and modern breeding techniques, such as marker-assisted selection (MAS), QTLs, and GWAS, can help maximize genetic gain and further shorten breeding cycles (Xiao et al., 2022). At the population level, Sun et al. (2019) investigated the possibility of using hyperspectral traits (i.e., Normalized Difference Spectral Index) for genetic studies. At the organ level, Nehe et al. (2021) explored genetic variation based on a combination of RGB images and KASP (Kompetitive Allele Specific PCR) markers for marker-assisted selection of drought-tolerant wheat varieties. At the cell/tissue level, Zhang et al. (2021) revealed natural genetic variation and dissected the genetic structure of vascular bundles using GWAS of 48 micro-phenotypic traits based on CT scans. Time-series phenotypes for genetic analysis have also increased our understanding of the genetic basis of dynamic plant phenotypes. Campbell et al. (2019) discovered a locus associated with rice shoot growth trajectories using random regression methods based on continuous visible light images, providing a viable solution for revealing persistent and time-specific QTLs. Therefore, a collaboration between high-throughput PP and functional genomics has enhanced our ability to identify new genetic variants (Grzybowski et al., 2021), thereby accelerating precision breeding and cultivation and bridging the research gap between genomics and phenomics (Araus and Kefauver, 2018; Singh et al., 2019).
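The normalized difference spectral indices mentioned above take the general form NDSI(i, j) = (R_i - R_j)/(R_i + R_j) for a pair of wavebands i and j, and a common exploratory step is to scan all band pairs for the index most strongly correlated with a trait before genetic analysis. The sketch below illustrates such a scan with synthetic spectra and trait values; it is a minimal demonstration, not the workflow of any cited study.

```python
import numpy as np

rng = np.random.default_rng(11)
n_plants, n_bands = 120, 50
reflectance = rng.uniform(0.05, 0.6, size=(n_plants, n_bands))  # hypothetical spectra
trait = 2.0 * reflectance[:, 30] - 1.0 * reflectance[:, 10] + rng.normal(0, 0.05, n_plants)

best = (0.0, None)
for i in range(n_bands):
    for j in range(i + 1, n_bands):
        # Normalized difference spectral index for the band pair (i, j)
        ndsi = (reflectance[:, i] - reflectance[:, j]) / (reflectance[:, i] + reflectance[:, j])
        r = abs(np.corrcoef(ndsi, trait)[0, 1])
        if r > best[0]:
            best = (r, (i, j))
print(f"Best band pair {best[1]} with |r| = {best[0]:.2f}")
```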

Predicting phenotypes according to genetic variation is another direction of the genomics and phenomics combination that is important for guiding gene editing to achieve smart breeding (G2P) (Yang et al., 2014; Ma et al., 2018). With the advent of the breeding 4.0 era, in which phenomics, genomics, bioinformatics, and biotechnology are involved in the conventional breeding pipeline, researchers have identified diverse molecular mechanisms of phenotype formation controlled by the expression of deoxyribonucleic acid (DNA) (Wang et al., 2020b). Therefore, correlating molecular phenotypes with phenotypes at the organism-wide scale can further reveal genetic loci associated with plant phenotypes, enabling the establishment of a complete information flow model from DNA to phenotypic traits (Salon et al., 2017). Furthermore, researchers can use the information flow model to explain the causal relationship of genetic variants to phenotypic variation, remove deleterious alleles, and introduce beneficial alleles, significantly accelerating the process of crop improvement (Rodriguez-Leal et al., 2017; Wang et al., 2020b).

Phenotype differences due to various gene expressions in heterogeneous environments have recently been studied (Friedman et al., 2019). However, current studies do not fully consider multiple environmental factors and environmental dynamics. The surrounding environment (e.g., soil, moisture, light) and plant internal environment (e.g., pH) are specific and different for each genotype or variety (Xu, 2016). An integrated understanding of environmental dynamics and accurate environmental factor measurements are also extremely important for breeding resilient varieties because of G × E interactions (Langstroff et al., 2022). CGMs provide a quantitative framework for linking the effects of genes or alleles to traits. The motivation and potential benefits of CGMs as a G2P trait linkage function for applying quantitative genetic mechanisms to predict expected traits were explored in a recent review (Cooper et al., 2020). When applied to practical production, however, the model achieved only small improvements in accuracy, probably because of the difficulty of estimating CGM parameters (Toda et al., 2020). PRS-based high-throughput phenotyping technologies offer opportunities to improve CGM parameterization. Combining PRS and CGMs for phenomics-genomics research is an interesting endeavor (Kasampalis et al., 2018). For example, Yang et al. (2021) integrated RGB images to parameterize APSIM for developing varieties with desired performance in the target environment. Furthermore, developing virtual CGMs that predict crop growth states from G × E data in real time and thus regulate real plant growth is promising (Liu et al., 2019a).

Although multi-source PRS data provide the above-mentioned opportunities for multi-omics analysis, they also introduce new data processing challenges because of massive data accumulation. Therefore, deep-learning methods have gained wide popularity in recent years (Xiao et al., 2022). Image-based deep-learning methods have been well investigated for phenotypic analyses such as wheat head counting (Khaki et al., 2022) and stress detection (Wang et al., 2022). More deep-learning-based phenotypic applications have been reviewed (Singh et al., 2018; Guo et al., 2020a; Arya et al., 2022). Here, we want to highlight the challenges of deep learning for phenomics based on PRS data. One challenge is to construct large-volume, well-labeled, and openly available datasets. There are already ways to promote open data in the current phenotyping community, such as algorithm competitions. Meanwhile, “sharing the right data right” has been proposed because it allows for scientific reproducibility (Tsaftaris and Scharr, 2019). Another challenge is the development of networks for multimodal data. Multi-task learning (MTL) is an important direction that not only facilitates the implementation of multiple tasks but also integrates multi-source inputs. MTL has been proven effective and efficient for phenotyping (Dobrescu et al., 2020). For example, Sun et al. (2022) used MTL to simultaneously predict both yield and grain protein content of wheat from LiDAR and multispectral data. In addition, generative adversarial networks (GANs) are promising for the analysis of very large, multi-source datasets that lack labeled phenotypic data. Yasrab et al. (2021) predicted plant leaf and root growth from multi-temporal data using a GAN. GAN-based methods may also be coupled with growth models to relate digital pairs of plant simulation and plant growth, supporting smart breeding and intelligent management (Drees et al., 2021).
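As a concrete, minimal sketch of the MTL idea (not a reproduction of any cited architecture), the PyTorch example below trains a shared encoder over fused plot-level features with separate regression heads for yield and grain protein content; the network size, synthetic data, and equal task weighting are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class MultiTaskPhenoNet(nn.Module):
    """Minimal multi-task regressor: a shared encoder over fused LiDAR/spectral
    features with separate heads for yield and grain protein content."""
    def __init__(self, n_features: int):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                                     nn.Linear(64, 32), nn.ReLU())
        self.yield_head = nn.Linear(32, 1)
        self.protein_head = nn.Linear(32, 1)

    def forward(self, x):
        shared = self.encoder(x)
        return self.yield_head(shared), self.protein_head(shared)

# Toy training loop on synthetic plot-level data
torch.manual_seed(0)
x = torch.randn(256, 20)                      # fused per-plot features
y_yield = x[:, :5].sum(dim=1, keepdim=True)   # synthetic targets
y_protein = x[:, 5:8].mean(dim=1, keepdim=True)

model = MultiTaskPhenoNet(n_features=20)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    pred_yield, pred_protein = model(x)
    # Equal task weights here; in practice the weighting itself is a design choice
    loss = loss_fn(pred_yield, y_yield) + loss_fn(pred_protein, y_protein)
    loss.backward()
    optimizer.step()
print(f"Final joint training loss: {loss.item():.3f}")
```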

Concluding remarks

PP has become a bottleneck technology for high-throughput breeding and a key valve for increasing yield production. Our era has witnessed tremendous advances in PRS and PP, but there has not previously been a systematic understanding of the history, applications, and trends of PRS in PP. This review focuses on PRS applications in PP from an interdisciplinary perspective over the last two decades and covers the overall history, systematic application progress, and pipeline to link PRS-based phenomics to multi-omics analysis, giving insights into future challenges and perspectives. To the best of our knowledge, the application analysis covers nearly all aspects of PRS application in PP, including the global spatial distribution and pattern of temporal dynamics, specific PRS technologies (sensor and platform types), phenotypic research fields, working environments, species, and traits. As a bridge for multi-omics research, PRS-based PP involves multi-dimensional data acquisition, processing, and modeling, which can be used to accelerate multi-omics studies to identify new genetic loci, screen high-quality varieties, and accelerate breeding. We also highlight some key directions to better promote the in-depth development of PRS in PP, including strengthening the spatial and temporal consistency of PRS data, exploring novel phenotypic traits, and facilitating multi-omics communication.

Funding

This work was supported by the Hainan Yazhou Bay Seed Lab (no. B21HJ1005), the Fundamental Research Funds for the Central Universities (no. KYCYXT2022017), the Open Project of Key Laboratory of Oasis Eco-agriculture, Xinjiang Production and Construction Corps (no. 202101), the Jiangsu Association for Science and Technology Independent Innovation Fund Project (no. CX(21)3107), the High Level Personnel Project of Jiangsu Province (no. JSSCBS20210271), the China Postdoctoral Science Foundation (no. 2021M691490), the Jiangsu Planned Projects for Postdoctoral Research Funds (no. 2021K520C), and the JBGS Project of Seed Industry Revitalization in Jiangsu Province (no. JBGS[2021]007).

Author contributions

S.J. designed the study. H.T., J.Z., and S.J. prepared figures and tables. S.J., H.T., and S.X. wrote the manuscript. Y.W. and H.T. organized the references. S.J., Y.G., Y.T., Z.L., G.Z., X.D., Z.Z., Y.D., D.J., and Q.G. helped to revise the manuscript. S.J. provided funding support. All authors read and approved the final manuscript.

Acknowledgments

We gratefully acknowledge Qing Li and Songyin Zhang at the Nanjing Agricultural University for their help in preparing figures and tables. No conflict of interest is declared.

Published: June 2, 2022

Footnotes

Published by the Plant Communications Shanghai Editorial Office in association with Cell Press, an imprint of Elsevier Inc., on behalf of CSPB and CEMPS, CAS.

References

  1. Aasen H., Honkavaara E., Lucieer A., Zarco-Tejada P.J. Quantitative remote sensing at ultra-high resolution with UAV spectroscopy: a review of sensor technology, measurement procedures, and data correction workflows. Rem. Sens. 2018;10:1091–1131. doi: 10.3390/rs10071091. [DOI] [Google Scholar]
  2. Ajadi O.A., Liao H., Jaacks J., Delos Santos A., Kumpatla S.P., Patel R., Swatantran A. Landscape-scale crop lodging assessment across Iowa and Illinois using synthetic aperture radar (SAR) images. Rem. Sens. 2020;12:3885–3899. doi: 10.3390/rs12233885. [DOI] [Google Scholar]
  3. Anderegg J., Yu K., Aasen H., Walter A., Liebisch F., Hund A. Spectral vegetation indices to track senescence dynamics in diverse wheat germplasm. Front. Plant Sci. 2020;10:1749. doi: 10.3389/fpls.2019.01749. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Araus J.L., Cairns J.E. Field high-throughput phenotyping: the new crop breeding frontier. Trends Plant Sci. 2014;19:52–61. doi: 10.1016/j.tplants.2013.09.008. [DOI] [PubMed] [Google Scholar]
  5. Araus J.L., Kefauver S.C., Zaman-Allah M., Olsen M.S., Cairns J.E. Translating high-throughput phenotyping into genetic gain. Trends Plant Sci. 2018;23:451–466. doi: 10.1016/j.tplants.2018.02.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Araus J.L., Kefauver S.C. Breeding to adapt agriculture to climate change: affordable phenotyping solutions. Curr. Opin. Plant Biol. 2018;45:237–247. doi: 10.1016/j.pbi.2018.05.003. [DOI] [PubMed] [Google Scholar]
  7. Arya S., Sandhu K.S., Singh J., kumar S. Deep learning: as the new frontier in high-throughput plant phenotyping. Euphytica. 2022;218:47. doi: 10.1007/s10681-022-02992-3. [DOI] [Google Scholar]
  8. Asaari M.S.M., Mertens S., Dhondt S., Inzé D., Wuyts N., Scheunders P. Analysis of hyperspectral images for detection of drought stress and recovery in maize plants in a high-throughput phenotyping platform. Comput. Electron. Agric. 2019;162:749–758. doi: 10.1016/j.compag.2019.05.018. [DOI] [Google Scholar]
  9. Ashapure A., Jung J., Chang A., Oh S., Yeom J., Maeda M., Maeda A., Dube N., Landivar J., Hague S., et al. Developing a machine learning based cotton yield estimation framework using multi-temporal UAS data. ISPRS J. Photogrammetry Remote Sens. 2020;169:180–194. doi: 10.1016/j.isprsjprs.2020.09.015. [DOI] [Google Scholar]
  10. Backoulou G.F., Elliott N.C., Giles K., Phoofolo M., Catana V. Development of a method using multispectral imagery and spatial pattern metrics to quantify stress to wheat fields caused by Diuraphis noxia. Comput. Electron. Agric. 2011;75:64–70. doi: 10.1016/j.compag.2010.09.011. [DOI] [Google Scholar]
  11. Baghdadi N.N., El Hajj M., Zribi M., Fayad I. Coupling SAR C-band and optical data for soil moisture and leaf area index retrieval over irrigated grasslands. IEEE J. Sel. Top. Appl. Earth Obs. Rem. Sens. 2016;9:1229–1243. doi: 10.1109/jstars.2015.2464698. [DOI] [Google Scholar]
  12. Bai G., Ge Y.F., Scoby D., Leavitt B., Stoerger V., Kirchgessner N., Irmak S., Graef G., Schnable J., Awada T. NU-Spidercam: a large-scale, cable-driven, integrated sensing and robotic system for advanced phenotyping, remote sensing, and agronomic research. Comput. Electron. Agric. 2019;160:71–81. doi: 10.1016/j.compag.2019.03.009. [DOI] [Google Scholar]
  13. Balenović I., Liang X., Jurjević L., Hyyppä J., Seletković A., Kukko A. Hand-held personal laser scanning: current status and perspectives for forest inventory application. Croat. J. For. Eng. 2021;42:165–183. doi: 10.5552/crojfe.2021.858. [DOI] [Google Scholar]
  14. Bånkestad D., Wik T. Growth tracking of basil by proximal remote sensing of chlorophyll fluorescence in growth chamber and greenhouse environments. Comput. Electron. Agric. 2016;128:77–86. doi: 10.1016/j.compag.2016.08.004. [DOI] [Google Scholar]
  15. Bendel N., Kicherer A., Backhaus A., Köckerling J., Maixner M., Bleser E., Klück H.C., Seiffert U., Voegele R.T., Töpfer R. Detection of grapevine leafroll-associated virus 1 and 3 in white and red grapevine cultivars using hyperspectral imaging. Rem. Sens. 2020;12:1693–1719. doi: 10.3390/rs12101693. [DOI] [Google Scholar]
  16. Berger K., Atzberger C., Danner M., D’Urso G., Mauser W., Vuolo F., Hank T. Evaluation of the PROSAIL model capabilities for future hyperspectral model environments: a review study. Rem. Sens. 2018;10:1–26. doi: 10.3390/rs10010085. [DOI] [Google Scholar]
  17. Berger K., Atzberger C., Danner M., Wocher M., Mauser W., Hank T. Model-based optimization of spectral sampling for the retrieval of crop variables with the PROSAIL model. Rem. Sens. 2018;10:2063. doi: 10.3390/rs10122063. [DOI] [Google Scholar]
  18. Bergman A.W., Lindell D.B., Wetzstein G. 2020 IEEE International Conference on Computational Photography (ICCP) 2020. Deep Adaptive Lidar: End-To-End Optimization of Sampling and Depth Completion at Low Sampling Rates; pp. 1–11. [Google Scholar]
  19. Beriaux E., Lucau-Danila C., Auquiere E., Defourny P. Multiyear independent validation of the water cloud model for retrieving maize leaf area index from SAR time series. Int. J. Rem. Sens. 2013;34:4156–4181. doi: 10.1080/01431161.2013.772676. [DOI] [Google Scholar]
  20. Bériaux E., Waldner F., Collienne F., Bogaert P., Defourny P. Maize leaf area index retrieval from synthetic Quad Pol SAR time series using the water cloud model. Rem. Sens. 2015;7:16204–16225. doi: 10.3390/rs71215818. [DOI] [Google Scholar]
  21. Berra E.F., Peppa M.V. 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS) 2020. Advances and Challenges of UAV SFM MVS Photogrammetry and Remote Sensing: Short Review; pp. 533–538. [Google Scholar]
  22. Bi K., Niu Z., Gao S., Xiao S., Pei J., Zhang C., Huang N. Simultaneous extraction of plant 3-D biochemical and structural parameters using hyperspectral LiDAR. Geosci. Rem. Sens. Lett. IEEE. 2022;19:1–5. doi: 10.1109/lgrs.2020.3025321. [DOI] [Google Scholar]
  23. Bi K., Niu Z., Xiao S., Bai J., Sun G., Wang J., Han Z., Gao S. Non-destructive monitoring of maize nitrogen concentration using a hyperspectral LiDAR: an evaluation from leaf-level to plant-level. Rem. Sens. 2021;13:5025. doi: 10.3390/rs13245025. [DOI] [Google Scholar]
  24. Bi K., Xiao S., Gao S., Zhang C., Huang N., Niu Z. Estimating vertical chlorophyll concentrations in maize in different health states using hyperspectral LiDAR. IEEE Trans. Geosci. Rem. Sens. 2020;58:8125–8133. doi: 10.1109/tgrs.2020.2987436. [DOI] [Google Scholar]
  25. Blanquart J.E., Sirignano E., Lenaerts B., Saeys W. Online crop height and density estimation in grain fields using LiDAR. Biosyst. Eng. 2020;198:1–14. doi: 10.1016/j.biosystemseng.2020.06.014. [DOI] [Google Scholar]
  26. Blunk S., Hafeez Malik A., de Heer M.I., Ekblad T., Fredlund K., Mooney S.J., Sturrock C.J. Quantification of differences in germination behaviour of pelleted and coated sugar beet seeds using x-ray computed tomography (x-ray CT). Biomed. Phys. Eng. Express. 2017;3:044001. doi: 10.1088/2057-1976/aa7c3f. [DOI] [Google Scholar]
  27. Bodner G., Alsalem M., Nakhforoosh A., Arnold T., Leitner D. RGB and spectral root imaging for plant phenotyping and physiological research: experimental Setup and imaging protocols. J. Vis. Exp. 2017:1–21. doi: 10.3791/56251. [DOI] [PMC free article] [PubMed] [Google Scholar]
  28. Brisson N., Gary C., Justes E., Roche R., Mary B., Ripoche D., Zimmer D., Sierra J., Bertuzzi P., Burger P., et al. An overview of the crop model STICS. Eur. J. Agron. 2003;18:309–332. doi: 10.1016/s1161-0301(02)00110-7. [DOI] [Google Scholar]
  29. Bruning B., Liu H.J., Brien C., Berger B., Lewis M., Garnett T. The development of hyperspectral distribution maps to predict the content and distribution of nitrogen and water in wheat (Triticum aestivum) Front. Plant Sci. 2019;10:1380–1396. doi: 10.3389/fpls.2019.01380. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. Bu H.G., Sharma L.K., Denton A., Franzen D.W. Sugar beet yield and quality prediction at multiple harvest dates using active-optical sensors. Agron. J. 2016;108:273–284. doi: 10.2134/agronj2015.0268. [DOI] [Google Scholar]
  31. Buchaillot M.L., Gracia-Romero A., Vergara-Diaz O., Zaman-Allah M.A., Tarekegne A., Cairns J.E., Prasanna B.M., Araus J.L., Kefauver S.C. Evaluating maize genotype performance under low nitrogen conditions using RGB UAV phenotyping techniques. Sensors. 2019;19:1815–1842. doi: 10.3390/s19081815. [DOI] [PMC free article] [PubMed] [Google Scholar]
  32. Camarretta N., Harrison A., Lucieer A.P., Lucieer A.M., Potts B., Davidson N., Hunt M. From drones to phenotype: using UAV-LiDAR to detect species and provenance variation in tree productivity and structure. Rem. Sens. 2020;12:3184–3200. doi: 10.3390/rs12193184. [DOI] [Google Scholar]
  33. Camino C., Zarco-Tejada P.J., Gonzalez-Dugo V. Effects of heterogeneity within tree crowns on airborne-quantified SIF and the CWSI as indicators of water stress in the context of precision agriculture. Rem. Sens. 2018;10:604–622. doi: 10.3390/rs10040604. [DOI] [Google Scholar]
  34. Campbell M., Momen M., Walia H., Morota G. Leveraging breeding values obtained from random regression models for genetic inference of longitudinal traits. Plant Genome. 2019;12:180075. doi: 10.3835/plantgenome2018.10.0075. [DOI] [PubMed] [Google Scholar]
  35. Candiago S., Remondino F., De Giglio M., Dubbini M., Gattelli M. Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Rem. Sens. 2015;7:4026–4047. doi: 10.3390/rs70404026. [DOI] [Google Scholar]
  36. Cao Q., Miao Y.X., Gao X.W., Liu B., Feng G.H., Yue S.C., Ieee . 2012 First International Conference on Agro-Geoinformatics(Agro-Geoinformatics) 2012. Estimating the Nitrogen Nutrition Index of Winter Wheat Using an Active Canopy Sensor in the North China Plain; pp. 178–182. [Google Scholar]
  37. Cao Q., Miao Y.X., Wang H.Y., Huang S.Y., Cheng S.S., Khosla R., Jiang R.F. Non-destructive estimation of rice plant nitrogen status with Crop Circle multispectral active canopy sensor. Field Crop. Res. 2013;154:133–144. doi: 10.1016/j.fcr.2013.08.005. [DOI] [Google Scholar]
  38. Carlson C.H., Gouker F.E., Crowell C.R., Evans L., DiFazio S.P., Smart C.D., Smart L.B. Joint linkage and association mapping of complex traits in shrub willow (Salix purpurea L.) Ann. Bot. 2019;124:701–715. doi: 10.1093/aob/mcz047. [DOI] [PMC free article] [PubMed] [Google Scholar]
  39. Casadesús J., Kaya Y., Bort J., Nachit M.M., Araus J.L., Amor S., Ferrazzano G., Maalouf F., Maccaferri M., Martos V., et al. Using vegetation indices derived from conventional digital cameras as selection criteria for wheat breeding in water-limited environments. Ann. Appl. Biol. 2007;150:227–236. doi: 10.1111/j.1744-7348.2007.00116.x. [DOI] [Google Scholar]
  40. Castro W., Marcato Junior J., Polidoro C., Osco L.P., Gonçalves W., Rodrigues L., Santos M., Jank L., Barrios S., Valle C., et al. Deep learning applied to phenotyping of biomass in forages with UAV-based RGB imagery. Sensors. 2020;20:4802–4820. doi: 10.3390/s20174802. [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Chang C.Y., Zhou R.Q., Kira O., Marri S., Skovira J., Gu L.H., Sun Y. An Unmanned Aerial System (UAS) for concurrent measurements of solar-induced chlorophyll fluorescence and hyperspectral reflectance toward improving crop monitoring. Agric. For. Meteorol. 2020;294:108145. doi: 10.1016/j.agrformet.2020.108145. [DOI] [Google Scholar]
  42. Chaudhury A., Ward C., Talasaz A., Ivanov A.G., Brophy M., Grodzinski B., Hüner N.P.A., Patel R.V., Barron J.L. Machine vision system for 3D plant phenotyping. IEEE ACM Trans. Comput. Biol. Bioinf. 2019;16:2009–2022. doi: 10.1109/tcbb.2018.2824814. [DOI] [PubMed] [Google Scholar]
  43. Chen R., Chu T., Landivar J.A., Yang C., Maeda M.M. Monitoring cotton (Gossypium hirsutum L.) germination using ultrahigh-resolution UAS images. Precis. Agric. 2018;19:161–177. doi: 10.1007/s11119-017-9508-7. [DOI] [Google Scholar]
  44. Chen X.J., Mo X.G., Zhang Y.C., Sun Z.G., Liu Y., Hu S., Liu S.X. Drought detection and assessment with solar-induced chlorophyll fluorescence in summer maize growth period over North China Plain. Ecol. Indicat. 2019;104:347–356. doi: 10.1016/j.ecolind.2019.05.017. [DOI] [Google Scholar]
  45. Cheng G., Xie X., Han J., Guo L., Xia G.-S. Remote sensing image scene classification meets deep learning: challenges, methods, benchmarks, and opportunities. IEEE Journal of Selected Topics in Applied Earth Observations. 2020;13:3735–3756. doi: 10.1109/jstars.2020.3005403. [DOI] [Google Scholar]
  46. Cheng L., Chen S., Liu X., Xu H., Wu Y., Li M., Chen Y. Registration of laser scanning point clouds: a review. Sensors (Basel) 2018;18:1641–1666. doi: 10.3390/s18051641. [DOI] [PMC free article] [PubMed] [Google Scholar]
  47. Chivasa W., Mutanga O., Biradar C. Phenology-based discrimination of maize (Zea mays L.) varieties using multitemporal hyperspectral data. J. Appl. Remote Sens. 2019;13:017504–017525. doi: 10.1117/1.jrs.13.017504. [DOI] [Google Scholar]
  48. Chivasa W., Mutanga O., Biradar C. UAV-based multispectral phenotyping for disease resistance to accelerate crop improvement under changing climate conditions. Rem. Sens. 2020;12:2445–2472. doi: 10.3390/rs12152445. [DOI] [Google Scholar]
  49. Coast O., Shah S., Ivakov A., Gaju O., Wilson P.B., Posch B.C., Bryant C.J., Negrini A.C.A., Evans J.R., Condon A.G., et al. Predicting dark respiration rates of wheat leaves from hyperspectral reflectance. Plant Cell Environ. 2019;42:2133–2150. doi: 10.1111/pce.13544. [DOI] [PubMed] [Google Scholar]
  50. Converse A.K., Ahlers E.O., Bryan T., Williams P.H., Williams P., Barnhart T., Engle J.W., Engle J., Nickles R.J., Nickles R., DeJesus O.T., DeJesus O. Positron emission tomography (PET) of radiotracer uptake and distribution in living plants: methodological aspects. J. Radioanal. Nucl. Chem. 2013;297:241–246. doi: 10.1007/s10967-012-2383-9. [DOI] [Google Scholar]
  51. Converse A.K., Ahlers E.O., Bryan T.W., Hetue J.D., Lake K.A., Ellison P.A., Engle J.W., Barnhart T.E., Nickles R.J., Williams P.H.J.P.M., et al. Mathematical modeling of positron emission tomography (PET) data to assess radiofluoride transport in living plants following petiolar administration. Plant Methods. 2015;11:1–17. doi: 10.1186/s13007-015-0061-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  52. Cooper M., Powell O., Voss-Fels K.P., Messina C.D., Gho C., Podlich D.W., Technow F., Chapman S.C., Beveridge C.A., Ortiz-Barrientos D., et al. Modelling selection response in plant-breeding programs using crop models as mechanistic gene-to-phenotype (CGM-G2P) multi-trait link functions. in silico Plants. 2020;3:1–21. doi: 10.1093/insilicoplants/diaa016. [DOI] [Google Scholar]
  53. Coppens F., Wuyts N., Inzé D., Dhondt S. Unlocking the potential of plant phenotyping data through integration and data-driven approaches. Current Opinion in Systems Biology. 2017;4:58–63. doi: 10.1016/j.coisb.2017.07.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  54. Cotrozzi L., Lorenzini G., Nali C., Pellegrini E., Saponaro V., Hoshika Y., Arab L., Rennenberg H., Paoletti E. Hyperspectral reflectance of light-adapted leaves can predict both dark- and light-adapted Chl fluorescence parameters, and the effects of chronic ozone exposure on date palm (Phoenix dactylifera) Int. J. Mol. Sci. 2020;21:6441–6459. doi: 10.3390/ijms21176441. [DOI] [PMC free article] [PubMed] [Google Scholar]
  55. Crain J., Mondal S., Rutkoski J., Singh R.P., Poland J. Combining high-throughput phenotyping and genomic information to increase prediction and selection accuracy in wheat breeding. Plant Genome. 2018;11:170043–170057. doi: 10.3835/plantgenome2017.05.0043. [DOI] [PubMed] [Google Scholar]
  56. Das Choudhury S., Bashyam S., Qiu Y., Samal A., Awada T. Holistic and component plant phenotyping using temporal image sequence. Plant Methods. 2018;14:35. doi: 10.1186/s13007-018-0303-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  57. Delalieux S., van Aardt J., Keulemans W., Schrevens E., Coppin P. Detection of biotic stress (Venturia inaequalis) in apple trees using hyperspectral data: non-parametric statistical approaches and physiological implications. Eur. J. Agron. 2007;27:130–143. doi: 10.1016/j.eja.2007.02.005. [DOI] [Google Scholar]
  58. Dente L., Satalino G., Mattia F., Rinaldi M. Assimilation of leaf area index derived from ASAR and MERIS data into CERES-Wheat model to map wheat yield. Remote Sensing of Environment. 2008;112:1395–1407. doi: 10.1016/j.rse.2007.05.023. [DOI] [Google Scholar]
  59. Dhondt S., Wuyts N., Inzé D. Cell to whole-plant phenotyping: the best is yet to come. Trends Plant Sci. 2013;18:428–439. doi: 10.1016/j.tplants.2013.04.008. [DOI] [PubMed] [Google Scholar]
  60. Din M., Zheng W., Rashid M., Wang S.Q., Shi Z.H. Evaluating hyperspectral vegetation indices for leaf area index estimation of Oryza sativa L. at diverse phenological stages. Front. Plant Sci. 2017;8:820–837. doi: 10.3389/fpls.2017.00820. [DOI] [PMC free article] [PubMed] [Google Scholar]
  61. Dobrescu A., Giuffrida M.V., Tsaftaris S.A. Doing more with less: a multitask deep learning approach in plant phenotyping. Front. Plant Sci. 2020;11:1–11. doi: 10.3389/fpls.2020.00141. [DOI] [PMC free article] [PubMed] [Google Scholar]
  62. Dong P., Gao L., Zhan W.F., Liu Z.H., Li J.F., Lai J.M., Li H., Huang F., Tamang S.K., Zhao L.M. Global comparison of diverse scaling factors and regression models for downscaling Landsat-8 thermal data. ISPRS J. Photogrammetry Remote Sens. 2020;169:44–56. doi: 10.1016/j.isprsjprs.2020.08.018. [DOI] [Google Scholar]
  63. Dowell R.D., Ryan O., Jansen A., Cheung D., Agarwala S., Danford T., Bernstein D.A., Rolfe P.A., Heisler L.E., Chin B., et al. Genotype to phenotype: a complex problem. Science. 2010;328:469. doi: 10.1126/science.1189015. [DOI] [PMC free article] [PubMed] [Google Scholar]
  64. Drees L., Junker-Frohn L.V., Kierdorf J., Roscher R. Temporal prediction and evaluation of brassica growth in the field using conditional generative adversarial networks. Comput. Electron. Agric. 2021;190:106415. doi: 10.1016/j.compag.2021.106415. [DOI] [Google Scholar]
  65. Drumetz L., Chanussot J., Jutten C., Ma W.K., Iwasaki A. Spectral variability aware Blind hyperspectral image unmixing based on Convex geometry. IEEE Trans. Image Process. 2020;29:4568–4582. doi: 10.1109/tip.2020.2974062. [DOI] [PubMed] [Google Scholar]
  66. Du J.J., Zhang Y., Guo X.Y., Ma L.M., Shao M., Pan X.D., Zhao C.J. Micron-scale phenotyping quantification and three-dimensional microstructure reconstruction of vascular bundles within maize stalks based on micro-CT scanning. Funct. Plant Biol. 2017;44:10–22. doi: 10.1071/fp16117. [DOI] [PubMed] [Google Scholar]
  67. Du L., Gong W., Shi S., Yang J., Sun J., Zhu B., Song S. Estimation of rice leaf nitrogen contents based on hyperspectral LiDAR. Int. J. Appl. Earth Obs. Geoinf. 2016;44:136–143. doi: 10.1016/j.jag.2015.08.008. [DOI] [Google Scholar]
  68. Du S., Liu L., Liu X., Hu J. Response of canopy solar-induced chlorophyll fluorescence to the absorbed photosynthetically active radiation Absorbed by chlorophyll. Rem. Sens. 2017;9:911–930. doi: 10.3390/rs9090911. [DOI] [Google Scholar]
  69. Du S.S., Liu L.Y., Liu X.J., Guo J., Hu J.C., Wang S.Q., Zhang Y.G. SIFSpec: measuring solar-induced chlorophyll fluorescence observations for remote sensing of photosynthesis. Sensors. 2019;19:3009–3030. doi: 10.3390/s19133009. [DOI] [PMC free article] [PubMed] [Google Scholar]
  70. Dudley J., Lambert R. 100 generations of selection for oil and protein in corn. Plant Breed. Rev. 2004;24:79–110. [Google Scholar]
  71. Dungey H.S., Dash J.P., Pont D., Clinton P.W., Watt M.S., Telfer E.J. Phenotyping whole forests will help to track genetic performance. Trends Plant Sci. 2018;23:854–864. doi: 10.1016/j.tplants.2018.08.005. [DOI] [PubMed] [Google Scholar]
  72. Eitel J.U.H., Vierling L.A., Long D.S., Hunt E.R. Early season remote sensing of wheat nitrogen status using a green scanning laser. Agric. For. Meteorol. 2011;151:1338–1345. doi: 10.1016/j.agrformet.2011.05.015. [DOI] [Google Scholar]
  73. Fagua J.C., Jantz P., Rodriguez-Buritica S., Duncanson L., Goetz S.J. Integrating LiDAR, multispectral and SAR data to estimate and map canopy height in tropical forests. Rem. Sens. 2019;11:2697–2717. doi: 10.3390/rs11222697. [DOI] [Google Scholar]
  74. Falk K.G., Jubery T.Z., O'Rourke J.A., Singh A., Sarkar S., Ganapathysubramanian B., Singh A.K. Soybean root system architecture trait study through genotypic, phenotypic, and shape-based clusters. Plant Phenomics. 2020;2020:1925495. doi: 10.34133/2020/1925495. [DOI] [PMC free article] [PubMed] [Google Scholar]
  75. Fan Y., Feng Z., Mannan A., Khan T.U., Shen C., Saeed S. Estimating tree position, diameter at breast height, and tree height in real-time using a mobile phone with RGB-D SLAM. Remote Sening. 2018;10:1845. doi: 10.3390/rs10111845. [DOI] [Google Scholar]
  76. Fang Y., Qiu X.L., Guo T., Wang Y.Q., Cheng T., Zhu Y., Chen Q., Cao W.X., Yao X., Niu Q.S., et al. An automatic method for counting wheat tiller number in the field with terrestrial LiDAR. Plant Methods. 2020;16:1–14. doi: 10.1186/s13007-020-00672-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  77. Feng L., Chen S., Zhang C., Zhang Y., He Y. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping. Comput. Electron. Agric. 2021;182:106033. doi: 10.1016/j.compag.2021.106033. [DOI] [Google Scholar]
  78. Feng W., Yao X., Zhu Y., Tian Y.C., Cao W. Monitoring leaf nitrogen status with hyperspectral reflectance in wheat. Eur. J. Agron. 2008;28:394–404. doi: 10.1016/j.eja.2007.11.005. [DOI] [Google Scholar]
  79. Feng X.P., Zhan Y.H., Wang Q., Yang X.F., Yu C.L., Wang H.Y., Tang Z.Y., Jiang D.A., Peng C., He Y. Hyperspectral imaging combined with machine learning as a tool to obtain high-throughput plant salt-stress phenotyping. Plant J. 2020;101:1448–1461. doi: 10.1111/tpj.14597. [DOI] [PubMed] [Google Scholar]
  80. Fernandez-Gallego J.A., Buchaillot M.L., Gracia-Romero A., Vatter T., Diaz O.V., Gutierrez N.A., Nieto-Taladriz M.T., Kerfal S., Serret M.D., Araus J.L., et al. Cereal crop ear counting in field conditions using zenithal RGB images. JoVE. 2019;144:e58695. doi: 10.3791/58695. [DOI] [PubMed] [Google Scholar]
  81. Fernandez-Gallego J.A., Kefauver S.C., Gutierrez N.A., Nieto-Taladriz M.T., Araus J.L. Wheat ear counting in-field conditions: high throughput and low-cost approach using RGB images. Plant Methods. 2018;14:1–12. doi: 10.1186/s13007-018-0289-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  82. Fernandez-Gallego J.A., Kefauver S.C., Vatter T., Aparicio Gutiérrez N., Nieto-Taladriz M.T., Araus J.L. Low-cost assessment of grain yield in durum wheat using RGB images. Eur. J. Agron. 2019;105:146–156. doi: 10.1016/j.eja.2019.02.007. [DOI] [Google Scholar]
  83. Fernandez-Gallego J.A., Lootens P., Borra-Serrano I., Derycke V., Haesaert G., Roldán-Ruiz I., Araus J.L., Kefauver S.C. Automatic wheat ear counting using machine learning based on RGB UAV imagery. Plant J. 2020;103:1603–1613. doi: 10.1111/tpj.14799. [DOI] [PubMed] [Google Scholar]
  84. Fieuzal R., Baup F. Estimation of leaf area index and crop height of sunflowers using multi-temporal optical and SAR satellite data. Int. J. Rem. Sens. 2016;37:2780–2809. doi: 10.1080/01431161.2016.1176276. [DOI] [Google Scholar]
  85. Fontanelli G., Paloscia S., Zribi M., Chahbi A. Sensitivity analysis of X-band SAR to wheat and barley leaf area index in the Merguellil Basin. Remote Sensing Letters. 2013;4:1107–1116. doi: 10.1080/2150704x.2013.842285. [DOI] [Google Scholar]
  86. French A.N., Gore M.A., Thompson A. In: Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping. Valasek J., Thomasson J.A., editors. International Society for Optics and Photonics; 2016. Cotton phenotyping with LIDAR from a track-mounted platform; p. 98660B. [Google Scholar]
  87. Friedman J., Middleton T.E., Rubin M.J. Environmental heterogeneity generates intrapopulation variation in life-history traits in an annual plant. New Phytol. 2019;224:1171–1183. doi: 10.1111/nph.16099. [DOI] [PubMed] [Google Scholar]
  88. Fu P., Meacham-Hensold K., Guan K.Y., Bernacchi C.J. Hyperspectral leaf reflectance as proxy for photosynthetic Capacities: an ensemble approach based on multiple machine learning algorithms. Front. Plant Sci. 2019;10:730–743. doi: 10.3389/fpls.2019.00730. [DOI] [PMC free article] [PubMed] [Google Scholar]
  89. Fu X., Huang K., Sidiropoulos N.D., Ma W.-K. Nonnegative matrix factorization for signal and data analytics: identifiability, algorithms, and applications. IEEE Signal Process. Mag. 2019;36:59–80. doi: 10.1109/msp.2018.2877582. [DOI] [Google Scholar]
  90. Fu Y., Yang G., Li Z., Li H., Li Z., Xu X., Song X., Zhang Y., Duan D., Zhao C., et al. Progress of hyperspectral data processing and modelling for cereal crop nitrogen monitoring. Comput. Electron. Agric. 2020;172:105321. doi: 10.1016/j.compag.2020.105321. [DOI] [Google Scholar]
  91. Fumagalli E., Baldoni E., Abbruscato P., Piffanelli P., Genga A., Lamanna R., Consonni R. NMR techniques coupled with multivariate statistical analysis: tools to analyse oryza sativa metabolic content under stress conditions. J. Agron. Crop Sci. 2009;195:77–88. doi: 10.1111/j.1439-037x.2008.00344.x. [DOI] [Google Scholar]
  92. Furbank R.T. Plant phenomics: from gene to form and function. Funct. Plant Biol. 2009;36:1006–1015. doi: 10.1071/FPv36n11_FO. [DOI] [PubMed] [Google Scholar]
  93. Furbank R.T., Jimenez-Berni J.A., George-Jaeggli B., Potgieter A.B., Deery D.M. Field crop phenomics: enabling breeding for radiation use efficiency and biomass in cereal crops. New Phytol. 2019;223:1714–1727. doi: 10.1111/nph.15817. [DOI] [PubMed] [Google Scholar]
  94. Furbank R.T., Tester M. Phenomics—technologies to relieve the phenotyping bottleneck. Trends Plant Sci. 2011;16:635–644. doi: 10.1016/j.tplants.2011.09.005. [DOI] [PubMed] [Google Scholar]
  95. Galán R.J., Bernal-Vasquez A.M., Jebsen C., Piepho H.P., Thorwarth P., Steffan P., Gordillo A., Miedaner T. Integration of genotypic, hyperspectral, and phenotypic data to improve biomass yield prediction in hybrid rye. Theoretical and Applied Genetics. 2020;133:3001–3015. doi: 10.1007/s00122-020-03651-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  96. Gameiro C., Utkin A.B., Cartaxana P., Silva J.M.d., Matos A.R. The use of laser induced chlorophyll fluorescence (LIF) as a fast and non-destructive method to investigate water deficit in Arabidopsis. Agric. Water Manag. 2016;164:127–136. doi: 10.1016/j.agwat.2015.09.008. [DOI] [Google Scholar]
  97. Ganguly S., Nemani R.R., Zhang G., Hashimoto H., Milesi C., Michaelis A., Wang W., Votava P., Samanta A., Melton F., et al. Generating global leaf area index from landsat: algorithm formulation and demonstration. Remote Sensing of Environment. 2012;122:185–202. doi: 10.1016/j.rse.2011.10.032. [DOI] [Google Scholar]
  98. Ge Y.F., Pandey P., Bai G. In: Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping. Valasek J., Thomasson J.A., editors. International Society for Optics and Photonics; 2016. Estimating fresh biomass of maize plants from their RGB images in greenhouse phenotyping; pp. 986605–986611. [Google Scholar]
  99. Gevaert C.M., Suomalainen J., Tang J., Kooistra L. Generation of spectral-temporal response surfaces by combining multispectral satellite and hyperspectral UAV imagery for precision agriculture applications. IEEE J. Sel. Top. Appl. Earth Obs. Rem. Sens. 2015;8:3140–3146. doi: 10.1109/jstars.2015.2406339. [DOI] [Google Scholar]
  100. Ghamisi P., Rasti B., Yokoya N., Wang Q., Hofle B., Bruzzone L., Bovolo F., Chi M., Anders K., Gloaguen R., et al. Multisource and multitemporal data fusion in remote sensing: a comprehensive review of the state of the art. IEEE Geoscience and Remote Sensing Magazine. 2019;7:6–39. doi: 10.1109/mgrs.2018.2890023. [DOI] [Google Scholar]
  101. Glenn N.F., Neuenschwander A., Vierling L.A., Spaete L., Li A., Shinneman D.J., Pilliod D.S., Arkle R.S., McIlroy S.K. Landsat 8 and ICESat-2: performance and potential synergies for quantifying dryland ecosystem vegetation cover and biomass. Remote Sensing of Environment. 2016;185:233–242. doi: 10.1016/j.rse.2016.02.039. [DOI] [Google Scholar]
  102. Gnyp M.L., Bareth G., Li F., Lenz-Wiedemann V.I.S., Koppe W.G., Miao Y.X., Hennig S.D., Jia L.L., Laudien R., Chen X.P., et al. Development and implementation of a multiscale biomass model using hyperspectral vegetation indices for winter wheat in the North China Plain. Int. J. Appl. Earth Obs. Geoinf. 2014;33:232–242. doi: 10.1016/j.jag.2014.05.006. [DOI] [Google Scholar]
  103. Granier C., Aguirrezabal L., Chenu K., Cookson S.J., Dauzat M., Hamard P., Thioux J.J., Rolland G., Bouchier-Combaud S., Lebaudy A., et al. PHENOPSIS, an automated platform for reproducible phenotyping of plant responses to soil water deficit in Arabidopsis thaliana permitted the identification of an accession with low sensitivity to soil water deficit. New Phytol. 2006;169:623–635. doi: 10.1111/j.1469-8137.2005.01609.x. [DOI] [PubMed] [Google Scholar]
  104. Großkinsky D.K., Svensgaard J., Christensen S., Roitsch T. Plant phenomics and the need for physiological phenotyping across scales to narrow the genotype-to-phenotype knowledge gap. J. Exp. Bot. 2015;66:5429–5440. doi: 10.1093/jxb/erv345. [DOI] [PubMed] [Google Scholar]
  105. Grzybowski M., Wijewardane N.K., Atefi A., Ge Y., Schnable J.C. Hyperspectral reflectance-based phenotyping for quantitative genetics in crops: progress and challenges. Plant Communications. 2021;2:100209. doi: 10.1016/j.xplc.2021.100209. [DOI] [PMC free article] [PubMed] [Google Scholar]
  106. Guillén-Climent M.L., Zarco-Tejada P.J., Villalobos F.J. Estimating radiation interception in heterogeneous orchards using high spatial resolution airborne imagery. Geosci. Rem. Sens. Lett. IEEE. 2013;11:579–583. doi: 10.1109/lgrs.2013.2284660. [DOI] [Google Scholar]
  107. Guo B.B., Qi S.L., Heng Y.R., Duan J.Z., Zhang H.Y., Wu Y.P., Feng W., Xie Y.X., Zhu Y.J. Remotely assessing leaf N uptake in winter wheat based on canopy hyperspectral red-edge absorption. Eur. J. Agron. 2017;82:113–124. doi: 10.1016/j.eja.2016.10.009. [DOI] [Google Scholar]
  108. Guo C., Zhang L., Zhou X., Zhu Y., Cao W., Qiu X., Cheng T., Tian Y. Integrating remote sensing information with crop model to monitor wheat growth and yield based on simulation zone partitioning. Precis. Agric. 2018;19:55–78. doi: 10.1007/s11119-017-9498-5. [DOI] [Google Scholar]
  109. Guo Q., Jin S., Li M., Yang Q., Xu K., Ju Y., Zhang J., Xuan J., Liu J., Su Y., et al. Application of deep learning in ecological resource research: theories, methods, and challenges. Sci. China Earth Sci. 2020;63:1457–1474. doi: 10.1007/s11430-019-9584-9. [DOI] [Google Scholar]
  110. Guo Q., Su Y., Hu T., Guan H., Jin S., Zhang J., Zhao X., Xu K., Wei D., Kelly M., et al. Lidar boosts 3D ecological observations and modelings: a review and perspective. IEEE Geoscience and Remote Sensing Magazine. 2020:1–26. [Google Scholar]
  111. Guo Q., Wu F., Pang S., Zhao X., Chen L., Liu J., Xue B., Xu G., Li L., Jing H., et al. Crop 3D: a platform based on LiDAR for 3D high-throughput crop phenotyping. Sci. Sin. Vitae. 2016;46:1210–1221. doi: 10.1360/n052016-00009. [DOI] [Google Scholar]
  112. Guo Q.H., Wu F.F., Pang S.X., Zhao X.Q., Chen L.H., Liu J., Xue B.L., Xu G.C., Li L., Jing H.C., et al. Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping. Sci. China Life Sci. 2018;61:328–339. doi: 10.1007/s11427-017-9056-0. [DOI] [PubMed] [Google Scholar]
  113. Guo T., Fang Y., Cheng T., Tian Y.C., Zhu Y., Chen Q., Qiu X.L., Yao X. Detection of wheat height using optimized multi-scan mode of LiDAR during the entire growth stages. Comput. Electron. Agric. 2019;165:104959. doi: 10.1016/j.compag.2019.104959. [DOI] [Google Scholar]
  114. Guo W., Carroll M.E., Singh A., Swetnam T.L., Merchant N., Sarkar S., Singh A.K., Ganapathysubramanian B. UAS-based plant phenotyping for research and breeding applications. Plant Phenomics. 2021;2021:1–21. doi: 10.34133/2021/9840192. [DOI] [PMC free article] [PubMed] [Google Scholar]
  115. Guo Z., Yang W., Chang Y., Ma X., Tu H., Xiong F., Jiang N., Feng H., Huang C., Yang P., et al. Genome-wide association studies of image traits reveal genetic architecture of drought resistance in rice. Mol. Plant. 2018;11:789–805. doi: 10.1016/j.molp.2018.03.018. [DOI] [PubMed] [Google Scholar]
  116. Gutierrez S., Fernandez-Novales J., Diago M.P., Tardaguila J. On-the-go hyperspectral imaging under field conditions and machine learning for the classification of grapevine varieties. Front. Plant Sci. 2018;9:1–11. doi: 10.3389/fpls.2018.01102. [DOI] [PMC free article] [PubMed] [Google Scholar]
  117. Gutiérrez S., Tardaguila J., Fernández-Novales J., Diago M.P. On-the-go hyperspectral imaging for the in-field estimation of grape berry soluble solids and anthocyanin concentration. Aust. J. Grape Wine Res. 2019;25:127–133. doi: 10.1111/ajgw.12376. [DOI] [Google Scholar]
  118. Halubok M., Yang Z.L. Estimating crop and grass productivity over the United States using satellite solar-induced chlorophyll fluorescence, precipitation and soil moisture data. Rem. Sens. 2020;12:3434–3459. doi: 10.3390/rs12203434. [DOI] [Google Scholar]
  119. Han L., Yang G., Yang H., Xu B., Li Z., Yang X. Clustering field-based maize phenotyping of plant-height growth and canopy spectral dynamics using a UAV remote-sensing approach. Front. Plant Sci. 2018;9:1638–1653. doi: 10.3389/fpls.2018.01638. [DOI] [PMC free article] [PubMed] [Google Scholar]
  120. Han X., Thomasson J.A., Wang T., Swaminathan V. Autonomous mobile ground control point improves accuracy of agricultural remote sensing through collaboration with UAV. Inventions. 2020;5:12–30. [Google Scholar]
  121. Harfouche A.L., Jacobson D.A., Kainer D., Romero J.C., Harfouche A.H., Scarascia Mugnozza G., Moshelion M., Tuskan G.A., Keurentjes J.J.B., Altman A. Accelerating climate resilient plant breeding by applying next-generation artificial intelligence. Trends Biotechnol. 2019;37:1217–1235. doi: 10.1016/j.tibtech.2019.05.007. [DOI] [PubMed] [Google Scholar]
  122. Helm L.T., Shi H.Y., Lerdau M.T., Yang X. Solar-induced chlorophyll fluorescence and short-term photosynthetic response to drought. Ecol. Appl. 2020;30:e02101. doi: 10.1002/eap.2101. [DOI] [PubMed] [Google Scholar]
  123. Herremans E., Verboven P., Verlinden B.E., Cantre D., Abera M., Wevers M., Nicolaï B.M. Automatic analysis of the 3-D microstructure of fruit parenchyma tissue using X-ray micro-CT explains differences in aeration. BMC Plant Biol. 2015;15:1–14. doi: 10.1186/s12870-015-0650-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  124. Herrero-Huerta M., Lindenbergh R., Gard W. Leaf Movements of indoor plants monitored by terrestrial LiDAR. Front. Plant Sci. 2018;9:189–197. doi: 10.3389/fpls.2018.00189. [DOI] [PMC free article] [PubMed] [Google Scholar]
  125. Holman F., Riche A., Michalski A., Castle M., Wooster M., Hawkesford M. High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing. Rem. Sens. 2016;8:1031–1054. doi: 10.3390/rs8121031. [DOI] [Google Scholar]
  126. Li H., Han Y., Chen J. Capability of multidate RADARSAT-2 data to identify sugarcane lodging. J. Appl. Remote Sens. 2019;13:1–17. doi: 10.1117/1.jrs.13.044514. [DOI] [Google Scholar]
  127. Horgan F.G., Jauregui A., Cruz A.P., Martinez E.C., Bernal C.C. Changes in reflectance of rice seedlings during planthopper feeding as detected by digital camera: potential applications for high-throughput phenotyping. PLoS One. 2020;15:e0238173. doi: 10.1371/journal.pone.0238173. [DOI] [PMC free article] [PubMed] [Google Scholar]
  128. Horning N. In: Encyclopedia of Ecology. Jørgensen S.E., Fath B.D., editors. Academic Press; 2008. Remote sensing; pp. 2986–2994. [Google Scholar]
  129. Hossain M.D., Chen D. Segmentation for Object-Based Image Analysis (OBIA): a review of algorithms and challenges from remote sensing perspective. ISPRS J. Photogrammetry Remote Sens. 2019;150:115–134. doi: 10.1016/j.isprsjprs.2019.02.009. [DOI] [Google Scholar]
  130. Hu M., Mao J., Li J., Wang Q., Zhang Y. A novel lidar signal denoising method based on convolutional autoencoding deep learning neural network. Atmosphere. 2021;12:1403–1421. doi: 10.3390/atmos12111403. [DOI] [Google Scholar]
  131. Huang Y., Chen Z.-x., Yu T., Huang X.-z., Gu X.-f. Agricultural remote sensing big data: management and applications. J. Integr. Agric. 2018;17:1915–1931. doi: 10.1016/s2095-3119(17)61859-8. [DOI] [Google Scholar]
  132. Hudak A.T., Lefsky M.A., Cohen W.B., Berterretche M. Integration of lidar and Landsat ETM+ data for estimating and mapping forest canopy height. Rem. Sens. Environ. 2002;82:397–416. doi: 10.1016/s0034-4257(02)00056-1. [DOI] [Google Scholar]
  133. Hudson R., Hudson J.W. The military applications of remote sensing by infrared. Proc. IEEE. 1975;63:104–128. doi: 10.1109/proc.1975.9711. [DOI] [Google Scholar]
  134. Husin N.A., Khairunniza-Bejo S., Abdullah A.F., Kassim M.S.M., Ahmad D., Azmi A.N.N. Application of ground-based LiDAR for analysing oil palm canopy properties on the occurrence of basal stem Rot (BSR) disease. Sci. Rep. 2020;10:6464–6480. doi: 10.1038/s41598-020-62275-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  135. Hussain S., Gao K.X., Din M., Gao Y.K., Shi Z.H., Wang S.Q. Assessment of UAV-Onboard Multispectral Sensor for non-destructive site-specific rapeseed crop phenotype variable at different phenological stages and resolutions. Rem. Sens. 2020;12:397–416. doi: 10.3390/rs12030397. [DOI] [Google Scholar]
  136. Inman D., Khosla R., Reich R.M., Westfall D.G. Active remote sensing and grain yield in irrigated maize. Precis. Agric. 2007;8:241–252. doi: 10.1007/s11119-007-9043-z. [DOI] [Google Scholar]
  137. Inostroza L., Acuña H., Munoz P., Vásquez C., Ibáñez J., Tapia G., Pino M.T., Aguilera H. Using aerial images and canopy spectral reflectance for high-throughput phenotyping of white clover. Crop Sci. 2016;56:2629–2637. doi: 10.2135/cropsci2016.03.0156. [DOI] [Google Scholar]
  138. Inoue Y., Sakaiya E. Relationship between X-band backscattering coefficients from high-resolution satellite SAR and biophysical variables in paddy rice. Rem. Sens. Lett. 2013;4:288–295. doi: 10.1080/2150704x.2012.725482. [DOI] [Google Scholar]
  139. Inoue Y., Sakaiya E., Wang C.Z. Capability of C-band backscattering coefficients from high-resolution satellite SAR sensors to assess biophysical variables in paddy rice. Rem. Sens. Environ. 2014;140:257–266. doi: 10.1016/j.rse.2013.09.001. [DOI] [Google Scholar]
  140. Jang G., Kim J., Yu J.-K., Kim H.-J., Kim Y., Kim D.-W., Kim K.-H., Lee C.W., Chung Y.S. Review: cost-effective unmanned aerial vehicle (UAV) platform for field plant breeding application. Rem. Sens. 2020;12:998–1008. doi: 10.3390/rs12060998. [DOI] [Google Scholar]
  141. Jangra S., Chaudhary V., Yadav R.C., Yadav N.R. High-throughput phenotyping: a platform to accelerate crop improvement. Phenomics. 2021;1:31–53. doi: 10.1007/s43657-020-00007-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  142. Jasim A., Zaeen A., Sharma L.K., Bali S.K., Wang C.Z., Buzza A., Alyokhin A. Predicting phosphorus and potato yield using active and passive sensors. Agriculture. 2020;10:564–588. doi: 10.3390/agriculture10110564. [DOI] [Google Scholar]
  143. Jedmowski C., Bruggemann W. Imaging of fast chlorophyll fluorescence induction curve (OJIP) parameters, applied in a screening study with wild barley (Hordeum spontaneum) genotypes under heat stress. J. Photochem. Photobiol. B Biol. 2015;151:153–160. doi: 10.1016/j.jphotobiol.2015.07.020. [DOI] [PubMed] [Google Scholar]
  144. Jiang N., Yang W.N., Duan L.F., Xu X.C., Huang C.L., Liu Q. Acceleration of CT reconstruction for wheat tiller inspection based on adaptive minimum enclosing rectangle. Comput. Electron. Agric. 2012;85:123–133. doi: 10.1016/j.compag.2012.04.004. [DOI] [Google Scholar]
  145. Jiang X., Bai Z.F. 2019. Remote Sensing Detection of Wheat Stripe Rust by Synergized Solar-Induced Chlorophyll Fluorescence and Differential Spectral Index. [Google Scholar]
  146. Jiang Y., Li C. Convolutional neural networks for image-based high-throughput plant phenotyping: a review. Plant Phenomics. 2020;2020:4152816. doi: 10.34133/2020/4152816. [DOI] [PMC free article] [PubMed] [Google Scholar]
  147. Jiang Y., Wang T., Chang H. An overview of hyperspectral image feature extraction. Electron. Opt. Control. 2020;27:73–77. [Google Scholar]
  148. Jiao W.Z., Chang Q., Wang L.X. The sensitivity of satellite solar-induced chlorophyll fluorescence to meteorological drought. Earth's Future. 2019;7:558–573. doi: 10.1029/2018ef001087. [DOI] [Google Scholar]
  149. Jiao X.F., McNairn H., Shang J.L., Pattey E., Liu J.G., Champagne C. The sensitivity of RADARSAT-2 polarimetric SAR data to corn and soybean leaf area index. Can. J. Rem. Sens. 2011;37:69–81. doi: 10.5589/m11-023. [DOI] [Google Scholar]
  150. Jimenez-Berni J.A., Deery D.M., Rozas-Larraondo P., Condon A.T.G., Rebetzke G.J., James R.A., Bovill W.D., Furbank R.T., Sirault X.R.R. High throughput determination of plant height, ground cover, and above-ground biomass in wheat with LiDAR. Front. Plant Sci. 2018;9:237–245. doi: 10.3389/fpls.2018.00237. [DOI] [PMC free article] [PubMed] [Google Scholar]
  151. Jin S., Su Y., Zhang Y., Song S., Li Q., Liu Z., Ma Q., Ge Y., Liu L., Ding Y., et al. Exploring seasonal and circadian rhythms in structural traits of field maize from LiDAR time series. Plant Phenomics. 2021;2021:1–15. doi: 10.34133/2021/9895241. [DOI] [PMC free article] [PubMed] [Google Scholar]
  152. Jin S., Su Y., Zhao X., Hu T., Guo Q. A point-based fully convolutional neural network for airborne lidar ground point filtering in forested environments. IEEE J. Sel. Top. Appl. Earth Obs. Rem. Sens. 2020;13:3958–3974. doi: 10.1109/jstars.2020.3008477. [DOI] [Google Scholar]
  153. Jin S., Sun X., Wu F., Su Y., Li Y., Song S., Xu K., Ma Q., Baret F., Jiang D., et al. Lidar sheds new light on plant phenomics for plant breeding and management: recent advances and future prospects. ISPRS J. Photogrammetry Remote Sens. 2021;171:202–223. doi: 10.1016/j.isprsjprs.2020.11.006. [DOI] [Google Scholar]
  154. Jin S.C., Su Y.J., Gao S., Wu F.F., Hu T.Y., Liu J., Li W.K., Wang D.C., Chen S.J., Jiang Y.X., et al. Deep learning: individual maize segmentation from terrestrial lidar data using faster R-CNN and regional growth algorithms. Front. Plant Sci. 2018;9:866–876. doi: 10.3389/fpls.2018.00866. [DOI] [PMC free article] [PubMed] [Google Scholar]
  155. Jin S.C., Su Y.J., Gao S., Wu F.F., Ma Q., Xu K.X., Hu T.Y., Liu J., Pang S.X., Guan H.C., et al. Separating the structural components of maize for field phenotyping using terrestrial LiDAR data and deep convolutional neural networks. IEEE Trans. Geosci. Rem. Sens. 2020;58:2644–2658. doi: 10.1109/tgrs.2019.2953092. [DOI] [Google Scholar]
  156. Jin S.C., Su Y.J., Song S.L., Xu K.X., Hu T.Y., Yang Q.L., Wu F.F., Xu G.C., Ma Q., Guan H.C., et al. Non-destructive estimation of field maize biomass using terrestrial lidar: an evaluation from plot level to individual leaf level. Plant Methods. 2020;16:1–19. doi: 10.1186/s13007-020-00613-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  157. Jin S.C., Su Y.J., Wu F.F., Pang S.X., Gao S., Hu T.Y., Liu J., Guo Q.H. Stem-leaf segmentation and phenotypic trait extraction of individual maize using terrestrial LiDAR data. IEEE Trans. Geosci. Rem. Sens. 2019;57:1336–1346. doi: 10.1109/tgrs.2018.2866056. [DOI] [Google Scholar]
  158. Jin X., Zarco-Tejada P.J., Schmidhalter U., Reynolds M.P., Hawkesford M.J., Varshney R.K., Yang T., Nie C., Li Z., Ming B. High-throughput estimation of crop traits: a review of ground and aerial phenotyping platforms. IEEE Geoscience and Remote Sensing Magazine. 2020;9:200–231. doi: 10.1109/mgrs.2020.2998816. [DOI] [Google Scholar]
  159. Johannsen W. The genotype conception of heredity. Am. Nat. 1911;45:129–159. doi: 10.1086/279202. [DOI] [PMC free article] [PubMed] [Google Scholar]
  160. Jones J.W., Hoogenboom G., Porter C.H., Boote K.J., Batchelor W.D., Hunt L., Wilkens P.W., Singh U., Gijsman A.J., Ritchie J.T., et al. The DSSAT cropping system model. Eur. J. Agron. 2003;18:235–265. doi: 10.1016/s1161-0301(02)00107-7. [DOI] [Google Scholar]
  161. Jung J., Maeda M., Chang A., Bhandari M., Ashapure A., Landivar-Bowles J. The potential of remote sensing and artificial intelligence as tools to improve the resilience of agriculture production systems. Curr. Opin. Biotechnol. 2021;70:15–22. doi: 10.1016/j.copbio.2020.09.003. [DOI] [PubMed] [Google Scholar]
  162. Kasampalis D.A., Alexandridis T.K., Deva C., Challinor A., Moshou D., Zalidis G. Contribution of remote sensing on crop models: a review. J. Imaging. 2018;4:52–71. doi: 10.3390/jimaging4040052. [DOI] [Google Scholar]
  163. Keating B.A., Carberry P.S., Hammer G.L., Probert M.E., Robertson M.J., Holzworth D., Huth N.I., Hargreaves J.N., Meinke H., Hochman Z., et al. An overview of APSIM, a model designed for farming systems simulation. Eur. J. Agron. 2003;18:267–288. doi: 10.1016/s1161-0301(02)00108-9. [DOI] [Google Scholar]
  164. Keller B., Vass I., Matsubara S., Paul K., Jedmowski C., Pieruschka R., Nedbal L., Rascher U., Muller O. Maximum fluorescence and electron transport kinetics determined by light-induced fluorescence transients (LIFT) for photosynthesis phenotyping. Photosynth. Res. 2019;140:221–233. doi: 10.1007/s11120-018-0594-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  165. Khaki S., Safaei N., Pham H., Wang L. Wheatnet: a lightweight convolutional neural network for high-throughput image-based wheat head detection and counting. Neurocomputing. 2022;489:78–89. doi: 10.1016/j.neucom.2022.03.017. [DOI] [Google Scholar]
  166. Kherif O., Seghouani M., Justes E., Plaza-Bonilla D., Bouhenache A., Zemmouri B., Dokukin P., Latati M. The first calibration and evaluation of the STICS soil-crop model on chickpea-based intercropping system under Mediterranean conditions. Eur. J. Agron. 2022;133:126449. doi: 10.1016/j.eja.2021.126449. [DOI] [Google Scholar]
  167. Kim Y., Glenn D.M., Park J., Ngugi H.K., Lehman B.L. Hyperspectral image analysis for water stress detection of apple trees. Comput. Electron. Agric. 2011;77:155–160. doi: 10.1016/j.compag.2011.04.008. [DOI] [Google Scholar]
  168. Knecht A.C., Campbell M.T., Caprez A., Swanson D.R., Walia H. Image Harvest: an open-source platform for high-throughput plant image processing and analysis. J. Exp. Bot. 2016;67:3587–3599. doi: 10.1093/jxb/erw176. [DOI] [PMC free article] [PubMed] [Google Scholar]
  169. Koebernick N., Huber K., Kerkhofs E., Vanderborght J., Javaux M., Vereecken H., Vetterlein D. Unraveling the hydrodynamics of split root water uptake experiments using CT scanned root architectures and three dimensional flow simulations. Front. Plant Sci. 2015;6:370–389. doi: 10.3389/fpls.2015.00370. [DOI] [PMC free article] [PubMed] [Google Scholar]
  170. Koppe W., Li F., Gnyp M.L., Miao Y.X., Jia L.L., Chen X.P., Zhang F.S., Bareth G. Evaluating multispectral and hyperspectral satellite remote sensing data for estimating winter wheat growth parameters at regional scale in the North China plain. Photogramm. Fernerkund. GeoInf. 2010;3:167–178. doi: 10.1127/1432-8364/2010/0047. [DOI] [Google Scholar]
  171. Kresse W. International Archives of ISPRS; 2010. Status of ISO Standards for Photogrammetry and Remote Sensing. http://www.isprs.org/proceedings/XXXVIII/Eurocow2010/euroCOW2010_files/papers/24.pdf [Google Scholar]
  172. Kwan M.H., Yan W.Y. Estimation of optimal parameter for range normalization of multispectral airborne lidar intensity data. ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. 2020;5:1–6. doi: 10.5194/isprs-annals-v-3-2020-221-2020. [DOI] [Google Scholar]
  173. Lafond J.A., Han L.W., Dutilleul P. Concepts and analyses in the CT scanning of root systems and leaf canopies: a timely summary. Front. Plant Sci. 2015;6:1111–1118. doi: 10.3389/fpls.2015.01111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  174. Lamsal A., Welch S.M., Jones J.W., Boote K.J., Asebedo A., Crain J., Wang X., Boyer W., Giri A., Frink E., et al. Efficient crop model parameter estimation and site characterization using large breeding trial data sets. Agric. Syst. 2017;157:170–184. doi: 10.1016/j.agsy.2017.07.016. [DOI] [Google Scholar]
  175. Lane N.D., Miluzzo E., Lu H., Peebles D., Choudhury T., Campbell A.T. A survey of mobile phone sensing. IEEE Commun. Mag. 2010;48:140–150. doi: 10.1109/mcom.2010.5560598. [DOI] [Google Scholar]
  176. Langstroff A., Heuermann M.C., Stahl A., Junker A. Opportunities and limits of controlled-environment plant phenotyping for climate response traits. Theoretical and Applied Genetics. 2022;135:1–16. doi: 10.1007/s00122-021-03892-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  177. Lee H., Wang J.F., Leblon B. Using linear regression, random forests, and support vector machine with unmanned aerial vehicle multispectral images to predict canopy nitrogen weight in corn. Rem. Sens. 2020;12:2071–2091. doi: 10.3390/rs12132071. [DOI] [Google Scholar]
  178. Lenk S., Dieleman J.A., Lefebvre V., Heuvelink E., Magán J., Palloix A., Van Eeuwijk F.A., Barócsi A. Phenotyping with fast fluorescence sensors approximates yield component measurements in pepper (Capsicum annuum L.). Photosynthetica. 2020;58:622–637. doi: 10.32615/ps.2020.016. [DOI] [Google Scholar]
  179. Leucker M., Wahabzada M., Kersting K., Peter M., Beyer W., Steiner U., Mahlein A.K., Oerke E.C. Hyperspectral imaging reveals the effect of sugar beet quantitative trait loci on Cercospora leaf spot resistance. Funct. Plant Biol. 2017;44:1–9. doi: 10.1071/fp16121. [DOI] [PubMed] [Google Scholar]
  180. Li B., Xu X.M., Han J.W., Zhang L., Bian C.S., Jin L.P., Liu J.G. The estimation of crop emergence in potatoes by UAV RGB imagery. Plant Methods. 2019;15:1–13. doi: 10.1186/s13007-019-0399-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  181. Li D., Quan C., Song Z., Li X., Yu G., Li C., Muhammad A. High-throughput plant phenotyping platform (HT3P) as a novel tool for estimating agronomic traits from the lab to the field. Front. Bioeng. Biotechnol. 2021;8:1–24. doi: 10.3389/fbioe.2020.623705. [DOI] [PMC free article] [PubMed] [Google Scholar]
  182. Li D., Shi G., Li J., Chen Y., Zhang S., Xiang S., Jin S. PlantNet: a dual-function point cloud segmentation network for multiple plant species. ISPRS J. Photogrammetry Remote Sens. 2022;184:243–263. doi: 10.1016/j.isprsjprs.2022.01.007. [DOI] [Google Scholar]
  183. Li F., Miao Y.X., Chen X.P., Zhang H.L., Jia L.L., Bareth G. Estimating winter wheat biomass and nitrogen status using an active crop sensor. Intelligent Automation and Soft Computing. 2010;16:1221–1230. [Google Scholar]
  184. Li L., Zhang Q., Huang D.F. A review of imaging techniques for plant phenotyping. Sensors. 2014;14:20078–20111. doi: 10.3390/s141120078. [DOI] [PMC free article] [PubMed] [Google Scholar]
  185. Li M., Zhu F., Guo A.J., Chen J. A graph regularized multilinear mixing model for nonlinear hyperspectral unmixing. Rem. Sens. 2019;11:2188–2212. doi: 10.3390/rs11192188. [DOI] [Google Scholar]
  186. Li S.Y., Liu X.J., Tian Y.C., Zhu Y., Cao Q. 2018 IEEE 7th International Conference on Agro-Geoinformatics. 2018. Comparison RGB digital camera with active canopy sensor based on UAV for rice nitrogen status monitoring; pp. 186–191. [Google Scholar]
  187. Li Y., Guan K.Y., Gentine P., Konings A.G., Meinzer F.C., Kimball J.S., Xu X.T., Anderegg W.R.L., McDowell N.G., Martinez-Vilalta J., et al. Estimating global ecosystem isohydry/anisohydry using active and passive microwave satellite data. J. Geophys. Res.: Biogeosciences. 2017;122:3306–3321. doi: 10.1002/2017jg003958. [DOI] [Google Scholar]
  188. Li Y., Su Y., Hu T., Xu G., Guo Q. Retrieving 2-D leaf angle distributions for deciduous trees from terrestrial laser scanner data. IEEE Trans. Geosci. Rem. Sens. 2018;56:4945–4955. doi: 10.1109/tgrs.2018.2843382. [DOI] [Google Scholar]
  189. Li Z., Jin X., Yang G., Drummond J., Yang H., Clark B., Li Z., Zhao C. Remote sensing of leaf and canopy nitrogen status in winter wheat (Triticum aestivum L.) based on N-PROSAIL model. Rem. Sens. 2018;10:1463–1473. doi: 10.3390/rs10091463. [DOI] [Google Scholar]
  190. Li Z.H., Zhang Q., Li J., Yang X., Wu Y.F., Zhang Z.Y., Wang S.H., Wang H.Z., Zhang Y.G. Solar-induced chlorophyll fluorescence and its link to canopy photosynthesis in maize from continuous ground measurements. Remote Sensing of Environment. 2020;236:111420. [Google Scholar]
  191. Di L. The development of remote-sensing related standards at FGDC, OGC, and ISO TC 211. In: IGARSS 2003. 2003 IEEE International Geoscience and Remote Sensing Symposium Proceedings (IEEE Cat. No.03CH37477), Vol. 641; 2003. pp. 643–647. [Google Scholar]
  192. Liu C.H., Liu W., Lu X.Z., Chen W., Yang J.B., Zheng L. Nondestructive determination of transgenic Bacillus thuringiensis rice seeds (Oryza sativa L.) using multispectral imaging and chemometric methods. Food Chemistry. 2014;153:87–93. doi: 10.1016/j.foodchem.2013.11.166. [DOI] [PubMed] [Google Scholar]
  193. Liu F., Song Q., Zhao J., Mao L., Bu H., Hu Y., Zhu X.G. Canopy occupation volume as an indicator of canopy photosynthetic capacity. New Phytologist. 2021;232:941–956. doi: 10.1111/nph.17611. [DOI] [PubMed] [Google Scholar]
  194. Liu H.J., Bruning B., Garnett T., Berger B. The performances of hyperspectral sensors for proximal sensing of nitrogen levels in wheat. Sensors. 2020;20:4550–4571. doi: 10.3390/s20164550. [DOI] [PMC free article] [PubMed] [Google Scholar]
  195. Liu J., Xu W., Guo B., Zhou G., Zhu H. Accurate mapping method for UAV photogrammetry without ground control points in the map projection frame. IEEE Trans. Geosci. Rem. Sens. 2021;59:9673–9681. doi: 10.1109/tgrs.2021.3052466. [DOI] [Google Scholar]
  196. Liu L.Y., Liu X.J., Hu J.C., Guan L.L. Assessing the wavelength-dependent ability of solar-induced chlorophyll fluorescence to estimate the GPP of winter wheat at the canopy level. Int. J. Rem. Sens. 2017;38:4396–4417. doi: 10.1080/01431161.2017.1320449. [DOI] [Google Scholar]
  197. Liu L.Y., Zhang Y.J., Jiao Q.J., Peng D.L. Assessing photosynthetic light-use efficiency using a solar-induced chlorophyll fluorescence and photochemical reflectance index. Int. J. Rem. Sens. 2013;34:4264–4280. doi: 10.1080/01431161.2013.775533. [DOI] [Google Scholar]
  198. Liu S., Martre P., Buis S., Abichou M., Andrieu B., Baret F. Estimation of plant and canopy architectural traits using the digital plant phenotyping platform. Plant Physiology. 2019;181:881–890. doi: 10.1104/pp.19.00554. [DOI] [PMC free article] [PubMed] [Google Scholar]
  199. Liu S.Y., Baret F., Abichou M., Boudon F., Thomas S., Zhao K.G., Fournier C., Andrieu B., Irfan K., Hemmerlé M., et al. Estimating wheat green area index from ground-based LiDAR measurement using a 3D canopy structure model. Agric. For. Meteorol. 2017;247:12–20. doi: 10.1016/j.agrformet.2017.07.007. [DOI] [Google Scholar]
  200. Liu X., Guo L., Cui X., Butnor J.R., Boyer E.W., Yang D., Chen J., Fan B. An automatic processing framework for in situ determination of Ecohydrological root water content by ground-penetrating radar. IEEE Trans. Geosci. Rem. Sens. 2022;60:1–15. doi: 10.1109/tgrs.2021.3065066. [DOI] [Google Scholar]
  201. Liu X.J., Liu L.Y., Hu J.C., Guo J., Du S.S. Improving the potential of red SIF for estimating GPP by downscaling from the canopy level to the photosystem level. Agric. For. Meteorol. 2020;281:107846. doi: 10.1016/j.agrformet.2019.107846. [DOI] [Google Scholar]
  202. Liu Z.Q., Lu X.L., An S.Q., Heskel M., Yang H.L., Tang J.W. Advantage of multi-band solar-induced chlorophyll fluorescence to derive canopy photosynthesis in a temperate forest. Agric. For. Meteorol. 2019;279:107691. doi: 10.1016/j.agrformet.2019.107691. [DOI] [Google Scholar]
  203. Zhou L., Cheng S., Sun Q., Gu X., Yang G., Shu M., Feng H. Remote sensing of regional-scale maize lodging using multitemporal GF-1 images. J. Appl. Remote Sens. 2020;14:1–17. doi: 10.1117/1.jrs.14.014514. [DOI] [Google Scholar]
  204. López-Granados F. Weed detection for site-specific weed management: mapping and real-time approaches. Weed Res. 2011;51:1–11. doi: 10.1111/j.1365-3180.2010.00829.x. [DOI] [Google Scholar]
  205. López-Granados F., Torres-Sánchez J., Jiménez-Brenes F.M., Arquero O., Lovera M., de Castro A.I. An efficient RGB-UAV-based platform for field almond tree phenotyping: 3-D architecture and flowering traits. Plant Methods. 2019;15:1–16. doi: 10.1186/s13007-019-0547-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  206. López-Granados F., Torres-Sánchez J., Serrano-Pérez A., de Castro A.I., Mesas-Carrascosa F.J., Peña J.M. Early season weed mapping in sunflower using UAV technology: variability of herbicide treatment maps against weed thresholds. Precis. Agric. 2016;17:183–199. doi: 10.1007/s11119-015-9415-8. [DOI] [Google Scholar]
  207. Lu H., Cao Z.G. TasselNetV2+: a fast implementation for high-throughput plant counting from high-resolution RGB imagery. Front. Plant Sci. 2020;11:1929–1943. doi: 10.3389/fpls.2020.541960. [DOI] [PMC free article] [PubMed] [Google Scholar]
  208. Lu X.L., Liu Z.Q., An S.Q., Miralles D.G., Maes W.H., Liu Y.L., Tang J.W. Potential of solar-induced chlorophyll fluorescence to estimate transpiration in a temperate forest. Agric. For. Meteorol. 2018;252:75–87. doi: 10.1016/j.agrformet.2018.01.017. [DOI] [Google Scholar]
  209. Lu X.L., Liu Z.Q., Zhou Y.Y., Liu Y.L., Tang J.W. Performance of solar-induced chlorophyll fluorescence in estimating water-use efficiency in a temperate forest. Rem. Sens. 2018;10:796–815. doi: 10.3390/rs10050796. [DOI] [Google Scholar]
  210. Lucas R., Rebelo L.-M., Fatoyinbo L., Rosenqvist A., Itoh T., Shimada M., Simard M., Souza-Filho P.W., Thomas N., Trettin C., et al. Contribution of L-band SAR to systematic global mangrove monitoring. Mar. Freshw. Res. 2014;65:589–603. doi: 10.1071/mf13177. [DOI] [Google Scholar]
  211. Luo F., Zhang L., Du B., Zhang L. Dimensionality reduction with enhanced hybrid-graph discriminant learning for hyperspectral image classification. IEEE Trans. Geosci. Rem. Sens. 2020;58:5336–5353. doi: 10.1109/tgrs.2020.2963848. [DOI] [Google Scholar]
  212. Luo W., Gao L., Zhang R., Marinoni A., Zhang B. Bilinear normal mixing model for spectral unmixing. IET Image Process. 2019;13:344–354. doi: 10.1049/iet-ipr.2018.5458. [DOI] [Google Scholar]
  213. Lyons C.L., Tshibalanganda M., Plessis A.D. Using CT-scanning technology to quantify damage of the stem-boring beetle, Aphanasium australe, a biocontrol agent of Hakea sericea in South Africa. Biocontrol Sci. Technol. 2020;30:33–41. doi: 10.1080/09583157.2019.1682518. [DOI] [Google Scholar]
  214. Ma D.D., Maki H., Neeno S., Zhang L.B., Wang L.J., Jin J. Application of non-linear partial least squares analysis on prediction of biomass of maize plants using hyperspectral images. Biosyst. Eng. 2020;200:40–54. doi: 10.1016/j.biosystemseng.2020.09.002. [DOI] [Google Scholar]
  215. Ma D.D., Wang L.J., Zhang L.B., Song Z.H., Rehman T.U., Jin J. Stress distribution analysis on hyperspectral corn leaf images for improved phenotyping quality. Sensors. 2020;20:3659–3672. doi: 10.3390/s20133659. [DOI] [PMC free article] [PubMed] [Google Scholar]
  216. Ma F., Wang J., Liu C.H., Lu X.Z., Chen W., Chen C.G., Yang J.B., Zheng L. Discrimination of kernel quality characteristics for sunflower seeds based on multispectral imaging approach. Food Anal. Methods. 2015;8:1629–1636. doi: 10.1007/s12161-014-0038-x. [DOI] [Google Scholar]
  217. Ma W., Qiu Z., Song J., Li J., Cheng Q., Zhai J., Ma C. A deep convolutional neural network approach for predicting phenotypes from genotypes. Planta. 2018;248:1307–1318. doi: 10.1007/s00425-018-2976-9. [DOI] [PubMed] [Google Scholar]
  218. Madec S., Baret F., de Solan B., Thomas S., Dutartre D., Jezequel S., Hemmerlé M., Colombeau G., Comar A. High-throughput phenotyping of plant height: comparing unmanned aerial vehicles and ground LiDAR estimates. Front. Plant Sci. 2017;8:2002–2016. doi: 10.3389/fpls.2017.02002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  219. Madec S., Jin X.L., Lu H., De Solan B., Liu S.Y., Duyme F., Heritier E., Baret F. Ear density estimation from high resolution RGB imagery using deep learning technique. Agric. For. Meteorol. 2019;264:225–234. doi: 10.1016/j.agrformet.2018.10.013. [DOI] [Google Scholar]
  220. Maenhout P., Sleutel S., Xu H., Van Hoorebeke L., Cnudde V., De Neve S. Semi-automated segmentation and visualization of complex undisturbed root systems with X-ray mu CT. Soil Tillage Res. 2019;192:59–65. doi: 10.1016/j.still.2019.04.025. [DOI] [Google Scholar]
  221. Maesano M., Khoury S., Nakhle F., Firrincieli A., Gay A., Tauro F., Harfouche A. UAV-based LiDAR for high-throughput determination of plant height and above-ground biomass of the bioenergy grass Arundo donax. Rem. Sens. 2020;12:3464–3484. doi: 10.3390/rs12203464. [DOI] [Google Scholar]
  222. Magney T.S., Eusden S.A., Eitel J.U.H., Logan B.A., Jiang J., Vierling L.A. Assessing leaf photoprotective mechanisms using terrestrial LiDAR: towards mapping canopy photosynthetic performance in three dimensions. New Phytol. 2014;201:344–356. doi: 10.1111/nph.12453. [DOI] [PubMed] [Google Scholar]
  223. Maimaitijiang M., Sagan V., Sidike P., Maimaitiyiming M., Hartling S., Peterson K.T., Maw M.J.W., Shakoor N., Mockler T., Fritschi F.B. Vegetation index weighted canopy volume model (CVMVI) for soybean biomass estimation from unmanned aerial system-based RGB imagery. ISPRS J. Photogrammetry Remote Sens. 2019;151:27–41. doi: 10.1016/j.isprsjprs.2019.03.003. [DOI] [Google Scholar]
  224. Mairhofer S., Johnson J., Sturrock C.J., Bennett M.J., Mooney S.J., Pridmore T.P. Visual tracking for the recovery of multiple interacting plant root systems from X-ray μCT images. Mach. Vis. Appl. 2016;27:721–734. doi: 10.1007/s00138-015-0733-7. [DOI] [Google Scholar]
  225. Maki M., Sekiguchi K., Homma K., Hirooka Y., Oki K. Estimation of rice yield by SIMRIW-RS, a model that integrates remote sensing data into a crop growth model. J. Agric. Meteorol. 2017;73:2–8. doi: 10.2480/agrmet.d-14-00023. [DOI] [Google Scholar]
  226. Malambo L., Popescu S.C., Horne D.W., Pugh N.A., Rooney W.L. Automated detection and measurement of individual sorghum panicles using density-based clustering of terrestrial lidar data. ISPRS J. Photogrammetry Remote Sens. 2019;149:1–13. doi: 10.1016/j.isprsjprs.2018.12.015. [DOI] [Google Scholar]
  227. Mandal D., Kumar V., Lopez-Sanchez J.M., Bhattacharya A., McNairn H., Rao Y.S. Crop biophysical parameter retrieval from Sentinel-1 SAR data with a multi-target inversion of Water Cloud Model. Int. J. Rem. Sens. 2020;41:5503–5524. doi: 10.1080/01431161.2020.1734261. [DOI] [Google Scholar]
  228. Mandal D., Kumar V., Ratha D., Dey S., Bhattacharya A., Lopez-Sanchez J.M., McNairn H., Rao Y.S. Dual polarimetric radar vegetation index for crop growth monitoring using sentinel-1 SAR data. Remote Sensing of Environment. 2020;247:111954. doi: 10.1016/j.rse.2020.111954. [DOI] [Google Scholar]
  229. Mandal D., Kumar V., Ratha D., Lopez-Sanchez J.M., Bhattacharya A., McNairn H., Rao Y.S., Ramana K.V. Assessment of rice growth conditions in a semi-arid region of India using the Generalized Radar Vegetation Index derived from RADARSAT-2 polarimetric SAR data. Remote Sensing of Environment. 2020;237:111561. doi: 10.1016/j.rse.2019.111561. [DOI] [Google Scholar]
  230. Mandal D., Ratha D., Bhattacharya A., Kumar V., McNairn H., Rao Y.S., Frery A.C. A radar vegetation index for crop monitoring using compact polarimetric SAR data. IEEE Trans. Geosci. Rem. Sens. 2020;58:6321–6335. doi: 10.1109/tgrs.2020.2976661. [DOI] [Google Scholar]
  231. Mao X., Hou J. Object-based forest gaps classification using airborne LiDAR data. J. For. Res. 2019;30:617–627. doi: 10.1007/s11676-018-0652-3. [DOI] [Google Scholar]
  232. McAusland L., Atkinson J.A., Lawson T., Murchie E.H. High throughput procedure utilising chlorophyll fluorescence imaging to phenotype dynamic photosynthesis and photoprotection in leaves under controlled gaseous conditions. Plant Methods. 2019;15:1–15. doi: 10.1186/s13007-019-0485-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  233. McClung C.R. Plant circadian rhythms. Plant Cell. 2006;18:792–803. doi: 10.1105/tpc.106.040980. [DOI] [PMC free article] [PubMed] [Google Scholar]
  234. Merrick T., Pau S., Jorge M., Silva T.S.F., Bennartz R. Spatiotemporal patterns and phenology of tropical vegetation solar-induced chlorophyll fluorescence across Brazilian biomes using satellite observations. Rem. Sens. 2019;11:1746–1772. doi: 10.3390/rs11151746. [DOI] [Google Scholar]
  235. Messina C.D., Technow F., Tang T., Totir R., Gho C., Cooper M. Leveraging biological insight and environmental variation to improve phenotypic prediction: integrating crop growth models (CGM) with whole genome prediction (WGP) Eur. J. Agron. 2018;100:151–162. doi: 10.1016/j.eja.2018.01.007. [DOI] [Google Scholar]
  236. Metzner R., van Dusschoten D., Buhler J., Schurr U., Jahnke S. Belowground plant development measured with magnetic resonance imaging (MRI): exploiting the potential for non-invasive trait quantification using sugar beet as a proxy. Front. Plant Sci. 2014;5:469–480. doi: 10.3389/fpls.2014.00469. [DOI] [PMC free article] [PubMed] [Google Scholar]
  237. Mijovilovich A., Morina F., Bokhari S.N., Wolff T., Küpper H. Analysis of trace metal distribution in plants with lab-based microscopic X-ray fluorescence imaging. Plant Methods. 2020;16:82. doi: 10.1186/s13007-020-00621-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  238. Mishra A., Karimi D., Ehsani R., Albrigo L.G. Evaluation of an active optical sensor for detection of Huanglongbing (HLB) disease. Biosyst. Eng. 2011;110:302–309. doi: 10.1016/j.biosystemseng.2011.09.003. [DOI] [Google Scholar]
  239. Misra T., Arora A., Marwaha S., Chinnusamy V., Rao A.R., Jain R., Sahoo R.N., Ray M., Kumar S., Raju D., et al. SpikeSegNet-a deep learning approach utilizing encoder-decoder network with hourglass for spike segmentation and counting in wheat plant from visual imaging. Plant Methods. 2020;16:40–60. doi: 10.1186/s13007-020-00582-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  240. Mizuno N., Toyoshima M., Fujita M., Fukuda S., Kobayashi Y., Ueno M., Tanaka K., Tanaka T., Nishihara E., Mizukoshi H., et al. The genotype-dependent phenotypic landscape of quinoa in salt tolerance and key growth traits. DNA Res. 2020;27:dsaa022. doi: 10.1093/dnares/dsaa022. [DOI] [PMC free article] [PubMed] [Google Scholar]
  241. Moghimi A., Yang C., Anderson J.A. Aerial hyperspectral imagery and deep neural networks for high-throughput yield phenotyping in wheat. Comput. Electron. Agric. 2020;172:105299. doi: 10.1016/j.compag.2020.105299. [DOI] [Google Scholar]
  242. Moghimi A., Yang C., Miller M.E., Kianian S.F., Marchetto P.M. A novel approach to assess salt stress tolerance in wheat using hyperspectral imaging. Front. Plant Sci. 2018;9:1182–1199. doi: 10.3389/fpls.2018.01182. [DOI] [PMC free article] [PubMed] [Google Scholar]
  243. Montes J.M., Melchinger A.E., Reif J.C. Novel throughput phenotyping platforms in plant genetic studies. Trends Plant Sci. 2007;12:433–436. doi: 10.1016/j.tplants.2007.08.006. [DOI] [PubMed] [Google Scholar]
  244. Morales N., Kaczmar N.S., Santantonio N., Gore M.A., Mueller L.A., Robbins K.R. ImageBreed: open-access plant breeding web–database for image-based phenotyping. The Plant Phenome Journal. 2020;3:e20004. doi: 10.1002/ppj2.20004. [DOI] [Google Scholar]
  245. Moreira F.F., Oliveira H.R., Volenec J.J., Rainey K.M., Brito L.F. Integrating high-throughput phenotyping and statistical genomic methods to genetically improve longitudinal traits in crops. Front. Plant Sci. 2020;11:681–699. doi: 10.3389/fpls.2020.00681. [DOI] [PMC free article] [PubMed] [Google Scholar]
  246. Motomiya A.V.d.A., Molin J.P., Motomiya W.R., Biscaro G.A. Nutritional diagnosis with the use of active optical sensor in cotton. Rev. Bras. Eng. Agrícola Ambient. 2012;16:1159–1165. doi: 10.1590/s1415-43662012001100003. [DOI] [Google Scholar]
  247. Musse M., Leport L., Cambert M., Debrandt W., Sorin C., Bouchereau A., Mariette F. A mobile NMR lab for leaf phenotyping in the field. Plant Methods. 2017;13:53–64. doi: 10.1186/s13007-017-0203-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  248. Nabwire S., Suh H.K., Kim M.S., Baek I., Cho B.K. Review: application of artificial intelligence in phenomics. Sensors. 2021;21:4363–4382. doi: 10.3390/s21134363. [DOI] [PMC free article] [PubMed] [Google Scholar]
  249. Navalgund R.R., Jayaraman V., Roy P.S. Remote sensing applications: an overview. Curr. Sci. 2007;93:1747–1766. [Google Scholar]
  250. Nehe A., Foulkes M., Ozturk I., Rasheed A., York L., Kefauver S., Ozdemir F., Morgounov A. Root and canopy traits and adaptability genes explain drought tolerance responses in winter wheat. PLoS One. 2021;16:e0242472. doi: 10.1371/journal.pone.0242472. [DOI] [PMC free article] [PubMed] [Google Scholar]
  251. Nichol C.J., Drolet G., Porcar-Castell A., Wade T., Sabater N., Middleton E.M., MacLellan C., Levula J., Mammarella I., Vesala T., et al. Diurnal and seasonal solar induced chlorophyll fluorescence and photosynthesis in a boreal scots pine canopy. Rem. Sens. 2019;11:273–295. doi: 10.3390/rs11030273. [DOI] [Google Scholar]
  252. Nugraha B., Verboven P., Janssen S., Wang Z., Nicolai B.M. Non-destructive porosity mapping of fruit and vegetables using X-ray CT. Postharvest Biol. Technol. 2019;150:80–88. doi: 10.1016/j.postharvbio.2018.12.016. [DOI] [Google Scholar]
  253. Ortiz-Bustos C.M., Perez-Bueno M.L., Baron M., Molinero-Ruiz L. Fluorescence imaging in the red and far-red region during growth of sunflower plantlets. Diagnosis of the early infection by the parasite Orobanche cumana. Front. Plant Sci. 2016;7:884–894. doi: 10.3389/fpls.2016.00884. [DOI] [PMC free article] [PubMed] [Google Scholar]
  254. Ouhami M., Hafiane A., Es-Saady Y., El Hajji M., Canals R. Computer vision, IoT and data fusion for crop disease detection using machine learning: a survey and ongoing research. Rem. Sens. 2021;13:1–24. doi: 10.3390/rs13132486. [DOI] [Google Scholar]
  255. Pallas B., Delalande M., Coupel-Ledru A., Boudon F., Ngao J., Martinez S., Bluy S., Costes E., Regnard J.-L. 30th International Horticultural Congress IHC2018. 2018. Multi-scale high-throughput phenotyping of an apple tree core collection under water stress condition; pp. 52–67. [Google Scholar]
  256. Pallottino F., Antonucci F., Costa C., Bisaglia C., Figorilli S., Menesatti P. Optoelectronic proximal sensing vehicle-mounted technologies in precision agriculture: a review. Comput. Electron. Agric. 2019;162:859–873. doi: 10.1016/j.compag.2019.05.034. [DOI] [Google Scholar]
  257. Paquit V.C., Gleason S.S., Kalluri U.C. In: Image Processing: Machine Vision Applications IV. Fofi D., Bingham P.R., editors. International Society for Optics and Photonics; 2011. Monitoring plant growth using high resolution micro-CT images; pp. 78770W–78778W. [Google Scholar]
  258. Park T., Ganguly S., Tømmervik H., Euskirchen E.S., Høgda K.A., Karlsen S.R., Brovkin V., Nemani R.R., Myneni R.B. Changes in growing season duration and productivity of northern vegetation inferred from long-term remote sensing data. Environ. Res. Lett. 2016;11:084001. doi: 10.1088/1748-9326/11/8/084001. [DOI] [Google Scholar]
  259. Partelová D., Kuglerová K., Konotop Y., Horník M., Gubišová M., Gubiš J., Kováč P., et al. Imaging of photoassimilates transport in plant tissues by positron emission tomography. Nova Biotechnologica et Chimica. 2017;16:32–41. doi: 10.1515/nbec-2017-0005. [DOI] [Google Scholar]
  260. Paul A., Chaki N. Dimensionality reduction using band correlation and variance measure from discrete wavelet transformed hyperspectral imagery. Annals of Data Science. 2021;8:261–274. doi: 10.1007/s40745-019-00210-x. [DOI] [Google Scholar]
  261. Paulus S. Measuring crops in 3D: using geometry for plant phenotyping. Plant Methods. 2019;15:103. doi: 10.1186/s13007-019-0490-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  262. Paulus S., Dupuis J., Mahlein A.K., Kuhlmann H. Surface feature based classification of plant organs from 3D laserscanned point clouds for plant phenotyping. BMC Bioinf. 2013;14:238–250. doi: 10.1186/1471-2105-14-238. [DOI] [PMC free article] [PubMed] [Google Scholar]
  263. Paz-Kagan T., Schmilovitch Z., Yermiyahu U., Rapaport T., Sperling O. Assessing the nitrogen status of almond trees by visible-to-shortwave infrared reflectance spectroscopy of carbohydrates. Comput. Electron. Agric. 2020;178:105755. doi: 10.1016/j.compag.2020.105755. [DOI] [Google Scholar]
  264. Peña J.M., Torres-Sánchez J., de Castro A.I., Kelly M., López-Granados F. Weed mapping in early-season maize fields using object-based analysis of unmanned aerial vehicle (UAV) images. PLoS One. 2013;8:e77151. doi: 10.1371/journal.pone.0077151. [DOI] [PMC free article] [PubMed] [Google Scholar]
  265. Peña J., Torres-Sánchez J., Serrano-Pérez A., De Castro A.I., López-Granados F. Quantifying efficacy and limits of unmanned aerial vehicle (UAV) technology for weed seedling detection as affected by sensor resolution. Sensors. 2015;15:5609–5626. doi: 10.3390/s150305609. [DOI] [PMC free article] [PubMed] [Google Scholar]
  266. Peng B., Guan K.Y., Zhou W., Jiang C.Y., Frankenberg C., Sun Y., He L.Y., Köhler P. Assessing the benefit of satellite-based Solar-Induced Chlorophyll Fluorescence in crop yield prediction. Int. J. Appl. Earth Obs. Geoinf. 2020;90:102126. doi: 10.1016/j.jag.2020.102126. [DOI] [Google Scholar]
  267. Perez-Bueno M.L., Pineda M., Cabeza F.M., Baron M. Multicolor fluorescence imaging as a candidate for disease detection in plant phenotyping. Front. Plant Sci. 2016;7:1790–1801. doi: 10.3389/fpls.2016.01790. [DOI] [PMC free article] [PubMed] [Google Scholar]
  268. Pérez-Ortiz M., Peña J.M., Gutiérrez P.A., Torres-Sánchez J., Hervás-Martínez C., López-Granados F. Selecting patterns and features for between- and within- crop-row weed mapping using UAV-imagery. Expert Syst. Appl. 2016;47:85–94. doi: 10.1016/j.eswa.2015.10.043. [DOI] [Google Scholar]
  269. Perry E.M., Fitzgerald G.J., Poole N., Craig S., Whitlock A. In: ISPRS International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Shortis M., Shimoda H., Cho K., editors. 2012. NDVI from active optical sensors as a measure of canopy cover and biomass; pp. 317–319. [Google Scholar]
  270. Pflugfelder D., Metzner R., van Dusschoten D., Reichel R., Jahnke S., Koller R. Non-invasive imaging of plant roots in different soils using magnetic resonance imaging (MRI). Plant Methods. 2017;13:1–9. doi: 10.1186/s13007-017-0252-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  271. Phetcharaburanin J., Deewai S., Kulthawatsiri T., Moolpia K., Suksawat M., Promraksa B., Klanrit P., Namwat N., Loilome W., Poopasit K., et al. 1H NMR metabolic phenotyping of Dipterocarpus alatus as a novel tool for age and growth determination. PLoS One. 2020;15:1–12. doi: 10.1371/journal.pone.0243432. [DOI] [PMC free article] [PubMed] [Google Scholar]
  272. Pineda M., Barón M., Pérez-Bueno M.-L. Thermal imaging for plant stress detection and phenotyping. Rem. Sens. 2021;13:68–89. doi: 10.3390/rs13010068. [DOI] [Google Scholar]
  273. Pineda M., Pérez-Bueno M.L., Paredes V., Barón M. Use of multicolour fluorescence imaging for diagnosis of bacterial and fungal infection on zucchini by implementing machine learning. Funct. Plant Biol. 2017;44:563. doi: 10.1071/fp16164. [DOI] [PubMed] [Google Scholar]
  274. Piovesan A., Vancauwenberghe V., Van de Looverbosch T., Verboven P., Nicolaï B. X-ray computed tomography for 3D plant imaging. Trends Plant Sci. 2021;26:1171–1185. doi: 10.1016/j.tplants.2021.07.010. [DOI] [PubMed] [Google Scholar]
  275. Pohanková E., Hlavinka P., Kersebaum K.-C., Rodríguez A., Jan B., Bednařík M., Dubrovský M., Gobin A., Hoogenboom G., Moriondo M., et al. Expected effects of climate change on the production and water use of crop rotation management reproduced by crop model ensemble for Czech Republic sites. Eur. J. Agron. 2022;134:126446. doi: 10.1016/j.eja.2021.126446. [DOI] [Google Scholar]
  276. Poiré R., Chochois V., Sirault X.R.R., Vogel J.P., Watt M., Furbank R.T. Digital imaging approaches for phenotyping whole plant nitrogen and phosphorus response in Brachypodium distachyon. J. Integr. Plant Biol. 2014;56:781–796. doi: 10.1111/jipb.12198. [DOI] [PubMed] [Google Scholar]
  277. Polder G., Blok P.M., de Villiers H.A.C., van der Wolf J.M., Kamp J. Potato virus Y detection in seed potatoes using deep learning on hyperspectral images. Front. Plant Sci. 2019;10:209–212. doi: 10.3389/fpls.2019.00209. [DOI] [PMC free article] [PubMed] [Google Scholar]
  278. Pommier C., Michotey C., Cornut G., Roumet P., Duchêne E., Flores R., Lebreton A., Alaux M., Durand S., Kimmel E., et al. Applying FAIR principles to plant phenotypic data management in GnpIS. Plant Phenomics. 2019;2019:1–15. doi: 10.34133/2019/1671403. [DOI] [PMC free article] [PubMed] [Google Scholar]
  279. Porcar-Castell A., Malenovský Z., Magney T., Van Wittenberghe S., Fernández-Marín B., Maignan F., Zhang Y., Maseyk K., Atherton J., Albert L.P., et al. Chlorophyll a fluorescence illuminates a path connecting plant molecular biology to Earth-system science. Nature Plants. 2021;7:998–1009. doi: 10.1038/s41477-021-00980-4. [DOI] [PubMed] [Google Scholar]
  280. Poudyal D., Rosenqvist E., Ottosen C.O. Phenotyping from lab to field - tomato lines screened for heat stress using Fv/Fm maintain high fruit yield during thermal stress in the field. Funct. Plant Biol. 2019;46:44–55. doi: 10.1071/fp17317. [DOI] [PubMed] [Google Scholar]
  281. Prabhakaran N., Kumar A., Sheoran N., Singh V.K., Nallathambi P. Tracking and assessment of Puccinia graminis f. sp. tritici colonization on rice phyllosphere by integrated fluorescence imaging and qPCR for nonhost resistance phenotyping. J. Plant Dis. Prot. 2020;128:457–469. doi: 10.1007/s41348-020-00405-y. [DOI] [Google Scholar]
  282. Pratap A., Tomar R., Kumar J., Pandey V.R., Mehandi S., Katiyar P.K. In: Phenomics in Crop Plants: Trends, Options and Limitations. Kumar J., Pratap A., Kumar S., editors. Springer India; 2015. High-throughput plant phenotyping platforms; pp. 285–296. [Google Scholar]
  283. Puttonen E., Briese C., Mandlburger G., Wieser M., Pfennigbauer M., Zlinszky A., Pfeifer N. Quantification of overnight movement of Birch (Betula pendula) branches and foliage with short interval terrestrial laser scanning. Front. Plant Sci. 2016;7:222–235. doi: 10.3389/fpls.2016.00222. [DOI] [PMC free article] [PubMed] [Google Scholar]
  284. Puttonen E., Lehtomäki M., Litkey P., Näsi R., Feng Z., Liang X., Wittke S., Pandžić M., Hakala T., Karjalainen M., et al. A clustering framework for monitoring circadian rhythm in structural dynamics in plants from terrestrial laser scanning time series. Front. Plant Sci. 2019;10:486–500. doi: 10.3389/fpls.2019.00486. [DOI] [PMC free article] [PubMed] [Google Scholar]
  285. Qi H.X., Zhu B.Y., Wu Z.Y., Liang Y., Li J.W., Wang L.D., Chen T.T., Lan Y.B., Zhang L. Estimation of peanut leaf area index from unmanned aerial vehicle multispectral images. Sensors. 2020;20:6732–6747. doi: 10.3390/s20236732. [DOI] [PMC free article] [PubMed] [Google Scholar]
  286. Qiu Q., Sun N., Bai H., Wang N., Fan Z.Q., Wang Y.J., Meng Z.J., Li B., Cong Y. Field-based high-throughput phenotyping for maize plant using 3D LiDAR point cloud generated with a “Phenomobile”. Front. Plant Sci. 2019;10:554–569. doi: 10.3389/fpls.2019.00554. [DOI] [PMC free article] [PubMed] [Google Scholar]
  287. Rajsnerova P., Klem K. 2012. Development of Phenotyping Methods for Drought Tolerance in Arabidopsis thaliana (L.) Heynh. Based on Spectral Reflectance and Thermal Imaging. [Google Scholar]
  288. Ravi R., Hasheminasab S.M., Zhou T., Masjedi A., Quijano K., Flatt J.E., Crawford M., Habib A. UAV-based multi-sensor multi-platform integration for high throughput phenotyping. Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV. 2019;11008:1–13. [Google Scholar]
  289. Rehman T.U., Ma D.D., Wang L.J., Zhang L.B., Jin J. Predictive spectral analysis using an end-to-end deep model from hyperspectral images for high-throughput plant phenotyping. Comput. Electron. Agric. 2020;177:105713. doi: 10.1016/j.compag.2020.105713. [DOI] [Google Scholar]
  290. Rey C., Martin M.P., Lobo A., Luna I., Diago M.P., Millan B., Tardaguila J. Wageningen Academic Publishers; 2013. Multispectral Imagery Acquired from a UAV to Assess the Spatial Variability of a Tempranillo Vineyard. [Google Scholar]
  291. Rischbeck P., Cardellach P., Mistele B., Schmidhalter U. Thermal phenotyping of stomatal sensitivity in spring barley. J. Agron. Crop Sci. 2017;203:483–493. doi: 10.1111/jac.12223. [DOI] [Google Scholar]
  292. Rischbeck P., Elsayed S., Mistele B., Barmeier G., Heil K., Schmidhalter U. Data fusion of spectral, thermal and canopy height parameters for improved yield prediction of drought stressed spring barley. Eur. J. Agron. 2016;78:44–59. doi: 10.1016/j.eja.2016.04.013. [DOI] [Google Scholar]
  293. Rodriguez-Leal D., Lemmon Z.H., Man J., Bartlett M.E., Lippman Z.B. Engineering quantitative trait variation for crop improvement by genome editing. Cell. 2017;171:470–480. doi: 10.1016/j.cell.2017.08.030. [DOI] [PubMed] [Google Scholar]
  294. Roitsch T., Cabrera-Bosquet L., Fournier A., Ghamkhar K., Jiménez-Berni J., Pinto F., Ober E.S. Review: new sensors and data-driven approaches—a path to next generation phenomics. Plant Sci. 2019;282:2–10. doi: 10.1016/j.plantsci.2019.01.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  295. Romero-Bravo S., Méndez-Espinoza A.M., Garriga M., Estrada F., Escobar A., González-Martinez L., Poblete-Echeverría C., Sepulveda D., Matus I., Castillo D., et al. Thermal imaging reliability for estimating grain yield and carbon isotope discrimination in wheat genotypes: importance of the environmental conditions. Sensors. 2019;19:2676–2684. doi: 10.3390/s19122676. [DOI] [PMC free article] [PubMed] [Google Scholar]
  296. Rosegrant M.W., Cline S.A. Global food security: challenges and policies. Science. 2003;302:1917–1919. doi: 10.1126/science.1092958. [DOI] [PubMed] [Google Scholar]
  297. Rousseau C., Belin E., Bove E., Rousseau D., Fabre F., Berruyer R., Guillaumès J., Manceau C., Jacques M.A., Boureau T. High throughput quantitative phenotyping of plant resistance using chlorophyll fluorescence image analysis. Plant Methods. 2013;9:17–30. doi: 10.1186/1746-4811-9-17. [DOI] [PMC free article] [PubMed] [Google Scholar]
  298. Rumpf T., Mahlein A.K., Steiner U., Oerke E.C., Dehne H.W., Plumer L. Early detection and classification of plant diseases with Support Vector Machines based on hyperspectral reflectance. Comput. Electron. Agric. 2010;74:91–99. doi: 10.1016/j.compag.2010.06.009. [DOI] [Google Scholar]
  299. Sagan V., Maimaitijiang M., Sidike P., Eblimit K., Peterson K.T., Hartling S., Esposito F., Khanal K., Newcomb M., Pauli D., et al. UAV-based high resolution thermal imaging for vegetation monitoring, and plant phenotyping using ICI 8640 P, FLIR Vue Pro R 640, and thermoMap cameras. Rem. Sens. 2019;11:330–359. doi: 10.3390/rs11030330. [DOI] [Google Scholar]
  300. Salon C., Avice J.-C., Colombié S., Dieuaide-Noubhani M., Gallardo K., Jeudy C., Ourry A., Prudent M., Voisin A.-S., Rolin D. Fluxomics links cellular functional analyses to whole-plant phenotyping. J. Exp. Bot. 2017;68:2083–2098. doi: 10.1093/jxb/erx126. [DOI] [PubMed] [Google Scholar]
  301. Salvatori E., Fusaro L., Gottardini E., Pollastrini M., Goltsev V., Strasser R.J., Bussotti F. Plant stress analysis: application of prompt, delayed chlorophyll fluorescence and 820 nm modulated reflectance. Insights from independent experiments. Plant Physiol. Biochem. 2014;85:105–113. doi: 10.1016/j.plaphy.2014.11.002. [DOI] [PubMed] [Google Scholar]
  302. Salvatori E., Fusaro L., Manes F. Chlorophyll fluorescence for phenotyping drought-stressed trees in a mixed deciduous forest. Ann. Bot. (Rome) 2016;6:39–49. [Google Scholar]
  303. Sancho-Adamson M., Trillas M., Bort J., Fernandez-Gallego J., Romanyà J. Use of RGB vegetation indexes in assessing early effects of verticillium wilt of olive in asymptomatic plants in high and low fertility scenarios. Rem. Sens. 2019;11:607–624. doi: 10.3390/rs11060607. [DOI] [Google Scholar]
  304. Sankaran S., Khot L.R., Carter A.H. Field-based crop phenotyping: multispectral aerial imaging for evaluation of winter wheat emergence and spring stand. Comput. Electron. Agric. 2015;118:372–379. doi: 10.1016/j.compag.2015.09.001. [DOI] [Google Scholar]
  305. Sarkar S., Ramsey A.F., Cazenave A.B., Balota M. Corrigendum: peanut leaf wilting estimation from RGB color indices and logistic models. Front. Plant Sci. 2021;12:821325. doi: 10.3389/fpls.2021.821325. [DOI] [PMC free article] [PubMed] [Google Scholar]
  306. Savi T., Miotto A., Petruzzellis F., Losso A., Pacilè S., Tromba G., Mayr S., Nardini A. Drought-induced embolism in stems of sunflower: a comparison of in vivo micro-CT observations and destructive hydraulic measurements. Plant Physiol. Biochem. 2017;120:24–29. doi: 10.1016/j.plaphy.2017.09.017. [DOI] [PubMed] [Google Scholar]
  307. Schmidt J., Beegle D., Zhu Q., Sripada R. Improving in-season nitrogen recommendations for maize using an active sensor. Field Crop. Res. 2011;120:94–101. doi: 10.1016/j.fcr.2010.09.005. [DOI] [Google Scholar]
  308. Schuler D.L., Ainsworth T., Lee J., De Grandi G. Topographic mapping using polarimetric SAR data. Int. J. Rem. Sens. 1998;19:141–160. doi: 10.1080/014311698216477. [DOI] [Google Scholar]
  309. Fiorani F., Schurr U. Future scenarios for plant phenotyping. Annu. Rev. Plant Biol. 2013;64:267–291. doi: 10.1146/annurev-arplant-050312-120137. [DOI] [PubMed] [Google Scholar]
  310. Secchi F., Pagliarani C., Cavalletto S., Petruzzellis F., Tonel G., Savi T., Tromba G., Obertino M.M., Lovisolo C., Nardini A., et al. Chemical inhibition of xylem cellular activity impedes the removal of drought-induced embolisms in poplar stems - new insights from micro-CT analysis. New Phytol. 2021;229:820–830. doi: 10.1111/nph.16912. [DOI] [PubMed] [Google Scholar]
  311. Setiyono T.D., Quicho E.D., Gatti L., Campos-Taberner M., Busetto L., Collivignarelli F., García-Haro F.J., Boschetti M., Khan N.I., et al. Spatial rice yield estimation based on MODIS and Sentinel-1 SAR data and ORYZA crop growth model. Rem. Sens. 2018;10:293–313. [Google Scholar]
  312. Setiyono T.D., Quicho E.D., Holecz F.H., Khan N.I., Romuga G., Maunahan A., Garcia C., Rala A., Raviz J., Collivignarelli F., et al. Rice yield estimation using synthetic aperture radar (SAR) and the ORYZA crop growth model: development and application of the system in South and South-east Asian countries. Int. J. Rem. Sens. 2019;40:8093–8124. doi: 10.1080/01431161.2018.1547457. [DOI] [Google Scholar]
  313. Shan N., Ju W.M., Migliavacca M., Martini D., Guanter L., Chen J.M., Goulas Y., Zhang Y.G. Modeling canopy conductance and transpiration from solar-induced chlorophyll fluorescence. Agric. For. Meteorol. 2019;268:189–201. doi: 10.1016/j.agrformet.2019.01.031. [DOI] [Google Scholar]
  314. Shekhar A., Chen J., Paetzold J.C., Dietrich F., Zhao X.X., Bhattacharjee S., Ruisinger V., Wofsy S.C. Anthropogenic CO2 emissions assessment of Nile Delta using XCO2 and SIF data from OCO-2 satellite. Environ. Res. Lett. 2020;15:095010. doi: 10.1088/1748-9326/ab9cfe. [DOI] [Google Scholar]
  315. Shen T.T., Zhang C., Liu F., Wang W., Lu Y., Chen R.Q., He Y. High-throughput screening of free proline content in rice leaf under cadmium stress using hyperspectral imaging with chemometrics. Sensors. 2020;20:3229–3243. doi: 10.3390/s20113229. [DOI] [PMC free article] [PubMed] [Google Scholar]
  316. Shen X., Cao L., Coops N.C., Fan H., Wu X., Liu H., Wang G., Cao F. Quantifying vertical profiles of biochemical traits for forest plantation species using advanced remote sensing approaches. Remote Sensing of Environment. 2020;250:112041. doi: 10.1016/j.rse.2020.112041. [DOI] [Google Scholar]
  317. Sheng-qing X. The application status and development trend of remote sensing technology in national land and resources. Remote Sensing for Land & Resources. 2002;1:1–5. [Google Scholar]
  318. Shi L., Shi T., Broadley M.R., White P.J., Long Y., Meng J., Xu F., Hammond J.P. High-throughput root phenotyping screens identify genetic loci associated with root architectural traits in Brassica napus under contrasting phosphate availabilities. Ann. Bot. 2013;112:381–389. doi: 10.1093/aob/mcs245. [DOI] [PMC free article] [PubMed] [Google Scholar]
  319. Shrestha S., Deleuran L.C., Olesen M.H., Gislum R. Use of multispectral imaging in varietal identification of tomato. Sensors. 2015;15:4496–4512. doi: 10.3390/s150204496. [DOI] [PMC free article] [PubMed] [Google Scholar]
  320. Shu M., Zhou L., Gu X., Ma Y., Sun Q., Yang G., Zhou C. Monitoring of maize lodging using multi-temporal Sentinel-1 SAR data. Adv. Space Res. 2020;65:470–480. doi: 10.1016/j.asr.2019.09.034. [DOI] [Google Scholar]
  321. Shuai F., Jinming W., Fengyun C.J.L. Hyperspectral image unmixing based on constrained nonnegative matrix factorization. Laser & Optoelectronics Progress. 2019;56:161001. doi: 10.3788/lop56.161001. [DOI] [Google Scholar]
  322. Simard M., Pinto N., Fisher J.B., Baccini A. Mapping forest canopy height globally with spaceborne LiDAR. J. Geophys. Res.: Biogeosciences. 2011;116:1–12. doi: 10.1029/2011jg001708. [DOI] [Google Scholar]
  323. Sinclair N. BASF buys CropDesign. Chem. Mark. Rep. 2006;269:1–14. [Google Scholar]
  324. Singh A.K., Ganapathysubramanian B., Sarkar S., Singh A. Deep learning for plant stress phenotyping: trends and future perspectives. Trends Plant Sci. 2018;23:883–898. doi: 10.1016/j.tplants.2018.07.004. [DOI] [PubMed] [Google Scholar]
  325. Singh D., Wang X., Kumar U., Gao L., Noor M., Imtiaz M., Singh R.P., Poland J. High-throughput phenotyping enabled genetic dissection of crop lodging in wheat. Front. Plant Sci. 2019;10:394–405. doi: 10.3389/fpls.2019.00394. [DOI] [PMC free article] [PubMed] [Google Scholar]
  326. Singh D.P., Sarkar R.K. Distinction and characterisation of salinity tolerant and sensitive rice cultivars as probed by the chlorophyll fluorescence characteristics and growth parameters. Funct. Plant Biol. 2014;41:727–736. doi: 10.1071/fp13229. [DOI] [PubMed] [Google Scholar]
  327. Smith C., Karunaratne S., Badenhorst P., Cogan N., Spangenberg G., Smith K. Machine learning algorithms to predict forage nutritive value of in situ Perennial Ryegrass plants using hyperspectral canopy reflectance data. Rem. Sens. 2020;12:928–943. doi: 10.3390/rs12060928. [DOI] [Google Scholar]
  328. Smith K.L., Steven M.D., Colls J.J. Use of hyperspectral derivative ratios in the red-edge region to identify plant stress responses to gas leaks. Remote Sensing of Environment. 2004;92:207–217. doi: 10.1016/j.rse.2004.06.002. [DOI] [Google Scholar]
  329. Sobejano-Paz V., Mikkelsen T.N., Baum A., Mo X., Liu S., Köppl C.J., Johnson M.S., Gulyas L., García M. Hyperspectral and thermal sensing of stomatal conductance, transpiration, and photosynthesis for soybean and maize under drought. Rem. Sens. 2020;12:3182–3215. doi: 10.3390/rs12193182. [DOI] [Google Scholar]
  330. Soltaninejad M., Sturrock C.J., Griffiths M., Pridmore T.P., Pound M.P. Three dimensional root CT segmentation using multi-resolution encoder-decoder networks. IEEE Trans. Image Process. 2020;29:6667–6679. doi: 10.1109/tip.2020.2992893. [DOI] [PubMed] [Google Scholar]
  331. Song P., Wang J., Guo X., Yang W., Zhao C. High-throughput phenotyping: Breaking through the bottleneck in future crop breeding. The Crop Journal. 2021;9:633–645. doi: 10.1016/j.cj.2021.03.015. [DOI] [Google Scholar]
  332. Song S., Wang B., Gong W., Chen Z., Lin X., Sun J., Shi S. A new waveform decomposition method for multispectral LiDAR. ISPRS J. Photogrammetry Remote Sens. 2019;149:40–49. doi: 10.1016/j.isprsjprs.2019.01.014. [DOI] [Google Scholar]
  333. Song T., Chen S. Long-term selection for oil concentration in five maize populations [Zea mays L.; China]. Maydica. 2004;49:9–14. [Google Scholar]
  334. Song T., Kong F., Li C., Song G. Eleven cycles of single kernel phenotypic recurrent selection for percent oil in Zhongzong 2 maize synthetic [Zea mays L.-China]. Journal of Genetics and Breeding. 1999;53:31–35. [Google Scholar]
  335. Song X., Xu D., Huang C., Zhang K., Huang S., Guo D., Zhang S., Yue K., Guo T., Wang S., et al. Monitoring of nitrogen accumulation in wheat plants based on hyperspectral data. Remote Sens. Appl.: Society and Environment. 2021;23:100598. doi: 10.1016/j.rsase.2021.100598. [DOI] [Google Scholar]
  336. Song Y., Wang J., Wang L.X. Satellite solar-induced chlorophyll fluorescence reveals heat stress impacts on wheat yield in India. Rem. Sens. 2020;12:3277–3292. doi: 10.3390/rs12203277. [DOI] [Google Scholar]
  337. Song Z., Qiu W., Jin J. MISIRoot: a robotic, minimally invasive, in situ imaging system for plant root phenotyping. Transactions of the ASABE. 2021;64:1647–1658. doi: 10.13031/trans.14306. [DOI] [Google Scholar]
  338. Soule M. Phenetics of natural populations I. Phenetic relationships of insular populations of the side-blotched lizard. Evolution. 1967;21:584. doi: 10.2307/2406618. [DOI] [PubMed] [Google Scholar]
  339. Stavrakoudis D., Katsantonis D., Kadoglidou K., Kalaitzidis A., Gitas I.Z. Estimating rice agronomic traits using drone-collected multispectral imagery. Rem. Sens. 2019;11:545–570. doi: 10.3390/rs11050545. [DOI] [Google Scholar]
  340. Stotz G.C., Salgado-Luarte C., Escobedo V.M., Valladares F., Gianoli E. Global trends in phenotypic plasticity of plants. Ecol. Lett. 2021;24:2267–2281. doi: 10.1111/ele.13827. [DOI] [PubMed] [Google Scholar]
  341. Stroppiana D., Boschetti M., Azar R., Barbieri M., Collivignarelli F., Gatti L., Fontanelli G., Busetto L., Holecz F. In-season early mapping of rice area and flooding dynamics from optical and SAR satellite data. European Journal of Remote Sensing. 2019;52:206–220. doi: 10.1080/22797254.2019.1581583. [DOI] [Google Scholar]
  342. Stroppiana D., Boschetti M., Brivio P.A., Bocchi S. Plant nitrogen concentration in paddy rice from field canopy hyperspectral radiometry. Field Crop. Res. 2009;111:119–129. doi: 10.1016/j.fcr.2008.11.004. [DOI] [Google Scholar]
  343. Su J.Y., Liu C.J., Coombes M., Hu X.P., Wang C.H., Xu X.M., Li Q.D., Guo L., Chen W.H. Wheat yellow rust monitoring by learning from multispectral UAV aerial imagery. Comput. Electron. Agric. 2018;155:157–166. doi: 10.1016/j.compag.2018.10.017. [DOI] [Google Scholar]
  344. Su W., Zhang M., Bian D., Liu Z., Huang J., Wang W., Wu J., Guo H. Phenotyping of corn plants using unmanned aerial vehicle (UAV) images. Rem. Sens. 2019;11:1–19. doi: 10.3390/rs11172021. [DOI] [Google Scholar]
  345. Su Y., Hu T., Wang Y., Li Y., Dai J., Liu H., Jin S., Ma Q., Wu J., Liu L., et al. Large-scale geographical variations and climatic controls on crown architecture traits. J. Geophys. Res.: Biogeosciences. 2020;125:e2019JG005306. doi: 10.1029/2019jg005306. [DOI] [Google Scholar]
  346. Su Y.J., Wu F.F., Ao Z.R., Jin S.C., Qin F., Liu B.X., Pang S.X., Liu L.L., Guo Q.H. Evaluating maize phenotype dynamics under drought stress using terrestrial LiDAR. Plant Methods. 2019;15:1–16. doi: 10.1186/s13007-019-0396-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  347. Subramanian S., Han L., Dutilleul P., Smith D.L. Computed tomography scanning can monitor the effects of soil medium on root system development: an example of salt stress in corn. Front. Plant Sci. 2015;6:1–13. doi: 10.3389/fpls.2015.00256. [DOI] [PMC free article] [PubMed] [Google Scholar]
  348. Sugiura R., Tsuda S., Tamiya S., Itoh A., Nishiwaki K., Murakami N., Shibuya Y., Hirafuji M., Nuske S. Field phenotyping system for the assessment of potato late blight resistance using RGB imagery from an unmanned aerial vehicle. Biosyst. Eng. 2016;148:1–10. doi: 10.1016/j.biosystemseng.2016.04.010. [DOI] [Google Scholar]
  349. Sultan S.E. Phenotypic plasticity for plant development, function and life history. Trends Plant Sci. 2000;5:537–542. doi: 10.1016/s1360-1385(00)01797-0. [DOI] [PubMed] [Google Scholar]
  350. Sun D.W., Cen H.Y., Weng H.Y., Wan L., Abdalla A., El-Manawy A.I., Zhu Y.M., Zhao N., Fu H.W., Tang J., et al. Using hyperspectral analysis as a potential high throughput phenotyping tool in GWAS for protein content of rice quality. Plant Methods. 2019;15:1–16. doi: 10.1186/s13007-019-0432-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  351. Sun L., Wu F., Zhan T., Liu W., Wang J., Jeon B. Weighted nonlocal low-rank tensor decomposition method for sparse unmixing of hyperspectral images. IEEE J. Sel. Top. Appl. Earth Obs. Rem. Sens. 2020;13:1174–1188. doi: 10.1109/jstars.2020.2980576. [DOI] [Google Scholar]
  352. Sun S.P., Li C.Y., Paterson A.H. In-field high-throughput phenotyping of cotton plant height using LiDAR. Rem. Sens. 2017;9:377–398. doi: 10.3390/rs9040377. [DOI] [PMC free article] [PubMed] [Google Scholar]
  353. Sun Z., Li Q., Jin S., Song Y., Xu S., Wang X., Cai J., Zhou Q., Ge Y., Zhang R., et al. Simultaneous prediction of wheat yield and grain protein content using multitask deep learning from time-series proximal sensing. Plant Phenomics. 2022;2022:1–13. doi: 10.34133/2022/9757948. [DOI] [PMC free article] [PubMed] [Google Scholar]
  354. Sun Z., Song Y., Li Q., Cai J., Wang X., Zhou Q., Huang M., Jiang D. An integrated method for tracking and monitoring stomata dynamics from microscope videos. Plant Phenomics. 2021;2021:9835961. doi: 10.34133/2021/9835961. [DOI] [PMC free article] [PubMed] [Google Scholar]
  355. Sytar O., Brestic M., Zivcak M., Olsovska K., Kovar M., Shao H.B., He X.L. Applying hyperspectral imaging to explore natural plant diversity towards improving salt stress tolerance. Sci. Total Environ. 2017;578:90–99. doi: 10.1016/j.scitotenv.2016.08.014. [DOI] [PubMed] [Google Scholar]
  356. Tang H., Armston J., Hancock S., Marselis S., Goetz S., Dubayah R. Characterizing global forest canopy cover distribution using spaceborne lidar. Remote Sensing of Environment. 2019;231:111262. doi: 10.1016/j.rse.2019.111262. [DOI] [Google Scholar]
  357. Tang L., Zhu Y., Hannaway D., Meng Y., Liu L., Chen L., Cao W. RiceGrow: a rice growth and productivity model. NJAS - Wageningen J. Life Sci. 2009;57:83–92. doi: 10.1016/j.njas.2009.12.003. [DOI] [Google Scholar]
  358. Tao S., Guo Q., Li C., Wang Z., Fang J. Global patterns and determinants of forest canopy height. Ecology. 2016;97:3265–3270. doi: 10.1002/ecy.1580. [DOI] [PubMed] [Google Scholar]
  359. Tardieu F., Cabrera-Bosquet L., Pridmore T., Bennett M. Plant phenomics, from sensors to knowledge. Curr. Biol. 2017;27:R770–R783. doi: 10.1016/j.cub.2017.05.055. [DOI] [PubMed] [Google Scholar]
  360. Tauro F., Maltese A., Giannini R., Harfouche A. Latent heat flux variability and response to drought stress of black poplar: a multi-platform multi-sensor remote and proximal sensing approach to relieve the data scarcity bottleneck. Remote Sensing of Environment. 2022;268:112771. doi: 10.1016/j.rse.2021.112771. [DOI] [Google Scholar]
  361. ten Harkel J., Bartholomeus H., Kooistra L. Biomass and crop height estimation of different crops using UAV-based LiDAR. Rem. Sens. 2020;12:1–17. doi: 10.3390/rs12010017. [DOI] [Google Scholar]
  362. Thapa S., Zhu F.Y., Walia H., Yu H.F., Ge Y.F. A novel LiDAR-based instrument for high-throughput, 3D measurement of morphological traits in maize and sorghum. Sensors. 2018;18:1187–1201. doi: 10.3390/s18041187. [DOI] [PMC free article] [PubMed] [Google Scholar]
  363. Thomas S., Behmann J., Steier A., Kraska T., Muller O., Rascher U., Mahlein A.K. Quantitative assessment of disease severity and rating of barley cultivars based on hyperspectral imaging in a non-invasive, automated phenotyping platform. Plant Methods. 2018;14:45–57. doi: 10.1186/s13007-018-0313-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  364. Thoren D., Schmidhalter U. Nitrogen status and biomass determination of oilseed rape by laser-induced chlorophyll fluorescence. Eur. J. Agron. 2009;30:238–242. doi: 10.1016/j.eja.2008.12.001. [DOI] [Google Scholar]
  365. Tian Y.C., Yao X., Yang J., Cao W.X., Hannaway D.B., Zhu Y. Assessing newly developed and published vegetation indices for estimating rice leaf nitrogen concentration with ground- and space-based hyperspectral reflectance. Field Crop. Res. 2011;120:299–310. doi: 10.1016/j.fcr.2010.11.002. [DOI] [Google Scholar]
  366. Toda Y., Wakatsuki H., Aoike T., Kajiya-Kanegae H., Yamasaki M., Yoshioka T., Ebana K., Hayashi T., Nakagawa H., Hasegawa T., et al. Predicting biomass of rice with intermediate traits: modeling method combining crop growth models and genomic prediction models. PLoS One. 2020;15:e0233951. doi: 10.1371/journal.pone.0233951. [DOI] [PMC free article] [PubMed] [Google Scholar]
  367. Tondewad M.P.S., Dale M.M.P. Remote sensing image registration methodology: review and discussion. Procedia Comput. Sci. 2020;171:2390–2399. doi: 10.1016/j.procs.2020.04.259. [DOI] [Google Scholar]
  368. Topp C.N., Iyer-Pascuzzi A.S., Anderson J.T., Lee C.R., Zurek P.R., Symonova O., Zheng Y., Bucksch A., Mileyko Y., Galkovskyi T., et al. 3D phenotyping and quantitative trait locus mapping identify core regions of the rice genome controlling root architecture. Proc. Natl. Acad. Sci. USA. 2013;110:E1695–E1704. doi: 10.1073/pnas.1304354110. [DOI] [PMC free article] [PubMed] [Google Scholar]
  369. Toth C., Jóźków G. Remote sensing platforms and sensors: a survey. ISPRS J. Photogrammetry Remote Sens. 2016;115:22–36. doi: 10.1016/j.isprsjprs.2015.10.004. [DOI] [Google Scholar]
  370. Tracy S.R., Gómez J.F., Sturrock C.J., Wilson Z.A., Ferguson A.C. Non-destructive determination of floral staging in cereals using X-ray micro computed tomography (μCT). Plant Methods. 2017;13:1–12. doi: 10.1186/s13007-017-0162-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  371. Tsaftaris S.A., Scharr H. Sharing the right data right: a symbiosis with machine learning. Trends Plant Sci. 2019;24:99–102. doi: 10.1016/j.tplants.2018.10.016. [DOI] [PubMed] [Google Scholar]
  372. Tubuxin B., Rahimzadeh-Bajgiran P., Ginnan Y., Hosoi F., Omasa K. Estimating chlorophyll content and photochemical yield of photosystem II (ΦPSII) using solar-induced chlorophyll fluorescence measurements at different growing stages of attached leaves. J. Exp. Bot. 2015;66:5595–5603. doi: 10.1093/jxb/erv272. [DOI] [PMC free article] [PubMed] [Google Scholar]
  373. Van Diepen C.A., Wolf J., Van Keulen H., Rappoldt C. WOFOST: a simulation model of crop production. Soil Use Manag. 1989;5:16–24. doi: 10.1111/j.1475-2743.1989.tb00755.x. [DOI] [Google Scholar]
  374. Varshney R.K., Nayak S.N., May G.D., Jackson S.A. Next-generation sequencing technologies and their implications for crop genetics and breeding. Trends Biotechnol. 2009;27:522–530. doi: 10.1016/j.tibtech.2009.05.006. [DOI] [PubMed] [Google Scholar]
  375. Vavlas N.-C., Waine T.W., Meersmans J., Burgess P.J., Fontanelli G., Richter G.M. Deriving wheat crop productivity indicators using sentinel-1 time series. Rem. Sens. 2020;12:2385–2405. doi: 10.3390/rs12152385. [DOI] [Google Scholar]
  376. Vergara-Diaz O., Vatter T., Kefauver S.C., Obata T., Fernie A.R., Araus J.L. Assessing durum wheat ear and leaf metabolomes in the field through hyperspectral data. Plant J. 2020;102:615–630. doi: 10.1111/tpj.14636. [DOI] [PubMed] [Google Scholar]
  377. Vescovo L., Gianelle D., Dalponte M., Miglietta F., Carotenuto F., Torresan C. Hail defoliation assessment in corn (Zea mays L.) using airborne LiDAR. Field Crop. Res. 2016;196:426–437. doi: 10.1016/j.fcr.2016.07.024. [DOI] [Google Scholar]
  378. Vialet-Chabrand S., Lawson T. Dynamic leaf energy balance: deriving stomatal conductance from thermal imaging in a dynamic environment. J. Exp. Bot. 2019;70:2839–2855. doi: 10.1093/jxb/erz068. [DOI] [PMC free article] [PubMed] [Google Scholar]
  379. Virlet N., Gomez-Candon D., Lebourgeois V., Martinez S., Jolivot A., Lauri P.E., Costes E., Labbe S., Regnard J.L. In: XXIX International Horticultural Congress on Horticulture: Sustaining Lives, Livelihoods and Landscapes. Onus N., Currie A., editors. 2014. Contribution of high-resolution remotely sensed thermal-infrared imagery to high-throughput field phenotyping of an apple progeny submitted to water constraints; pp. 243–250. [Google Scholar]
  380. Virlet N., Lebourgeois V., Martinez S., Costes E., Labbe S., Regnard J.L. Stress indicators based on airborne thermal imagery for field phenotyping a heterogeneous tree population for response to water constraints. J. Exp. Bot. 2014;65:5429–5442. doi: 10.1093/jxb/eru309. [DOI] [PMC free article] [PubMed] [Google Scholar]
  381. Virlet N., Sabermanesh K., Sadeghi-Tehran P., Hawkesford M.J. Field Scanalyzer: an automated robotic field phenotyping platform for detailed crop monitoring. Funct. Plant Biol. 2017;44:143–153. doi: 10.1071/fp16163. [DOI] [PubMed] [Google Scholar]
  382. Walter J.D.C., Edwards J., McDonald G., Kuchel H. Estimating biomass and canopy height with LiDAR for field crop breeding. Front. Plant Sci. 2019;10:1145–1161. doi: 10.3389/fpls.2019.01145. [DOI] [PMC free article] [PubMed] [Google Scholar]
  383. Wan L., Zhang J., Dong X., Du X., Zhu J., Sun D., Liu Y., He Y., Cen H. Unmanned aerial vehicle-based field phenotyping of crop biomass using growth traits retrieved from PROSAIL model. Comput. Electron. Agric. 2021;187:106304. doi: 10.1016/j.compag.2021.106304. [DOI] [Google Scholar]
  384. Wang B., Song S., Gong W., Cao X., He D., Chen Z., Lin X., Li F., Sun J. Color restoration for full-waveform multispectral LiDAR data. Rem. Sens. 2020;12:593–616. doi: 10.3390/rs12040593. [DOI] [Google Scholar]
  385. Wang C., Beringer J., Hutley L.B., Cleverly J., Li J., Liu Q.H., Sun Y. Phenology dynamics of dryland ecosystems along the North Australian tropical transect revealed by satellite solar-induced chlorophyll fluorescence. Geophys. Res. Lett. 2019;46:5294–5302. doi: 10.1029/2019gl082716. [DOI] [Google Scholar]
  386. Wang C., Caragea D., Kodadinne Narayana N., Hein N.T., Bheemanahalli R., Somayanda I.M., Jagadish S.V.K. Deep learning based high-throughput phenotyping of chalkiness in rice exposed to high night temperature. Plant Methods. 2022;18:9. doi: 10.1186/s13007-022-00839-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  387. Wang D., Wang S., Chao J., Wu X., Sun Y., Li F., Lv J., Gao X., Liu G., Wang Y. Morphological phenotyping and genetic analyses of a new chemical-mutagenized population of tobacco (Nicotiana tabacum L.). Planta. 2017;246:149–163. doi: 10.1007/s00425-017-2690-z. [DOI] [PubMed] [Google Scholar]
  388. Wang H., Cimen E., Singh N., Buckler E. Deep learning for plant genomics and crop improvement. Curr. Opin. Plant Biol. 2020;54:34–41. doi: 10.1016/j.pbi.2019.12.010. [DOI] [PubMed] [Google Scholar]
  389. Wang J.J., Dai Q.X., Shang J.L., Jin X.L., Sun Q., Zhou G.S., Dai Q.G. Field-scale rice yield estimation using sentinel-1A synthetic aperture radar (SAR) data in coastal saline region of Jiangsu Province, China. Rem. Sens. 2019;11:2274–2283. doi: 10.3390/rs11192274. [DOI] [Google Scholar]
  390. Wang Q., Mathews A.J., Li K., Wen J., Komarov S., O'Sullivan J.A., Tai Y.C. A dedicated high-resolution PET imager for plant sciences. Phys. Med. Biol. 2014;59:5613–5629. doi: 10.1088/0031-9155/59/19/5613. [DOI] [PubMed] [Google Scholar]
  391. Wang S.H., Huang C.P., Zhang L.F., Lin Y., Cen Y., Wu T.X. Monitoring and assessing the 2012 drought in the Great Plains: analyzing satellite-retrieved solar-induced chlorophyll fluorescence, drought indices, and gross primary production. Rem. Sens. 2016;8:61–78. doi: 10.3390/rs8020061. [DOI] [Google Scholar]
  392. Wang W.J., Jiang J.B., Qiao X.J., Yuan D.S. Identification of plants responding to natural gas microleakage stress using solar-induced chlorophyll fluorescence. J. Appl. Remote Sens. 2019;13:034531. doi: 10.1117/1.jrs.13.034531. [DOI] [Google Scholar]
  393. Wang X.P., Chen J.M., Ju W.M. Photochemical reflectance index (PRI) can be used to improve the relationship between gross primary productivity (GPP) and sun-induced chlorophyll fluorescence (SIF). Remote Sensing of Environment. 2020;246:111888. doi: 10.1016/j.rse.2020.111888. [DOI] [Google Scholar]
  394. Wang Y.Y., Zhang K., Tang C.L., Cao Q., Tian Y.C., Zhu Y., Cao W.X., Liu X.J. Estimation of rice growth parameters based on linear mixed-effect model using multispectral images from fixed-wing unmanned aerial vehicles. Rem. Sens. 2019;11:1371–1393. doi: 10.3390/rs11111371. [DOI] [Google Scholar]
  395. Wang Z., Rogge S., Abera M., van Dael M., van Nieuwenhove V., Verboven P., Sijbers J., Nicolai B. In: International Symposium on Sensing Plant Water Status - Methods and Applications in Horticultural Science. Herppich W.B., editor. 2018. Understanding microstructural deformation of apple tissue from 4D micro-CT imaging; pp. 7–13. [Google Scholar]
  396. Watt M., Fiorani F., Usadel B., Rascher U., Muller O., Schurr U. Phenotyping: new windows into the plant for breeders. Annu. Rev. Plant Biol. 2020;71:689–712. doi: 10.1146/annurev-arplant-042916-041124. [DOI] [PubMed] [Google Scholar]
  397. Webb A.A.R. The physiology of circadian rhythms in plants. New Phytol. 2003;160:281–303. doi: 10.1046/j.1469-8137.2003.00895.x. [DOI] [PubMed] [Google Scholar]
  398. Wei J., Tang X.G., Gu Q., Wang M., Ma M.G., Han X.J. Using solar-induced chlorophyll fluorescence observed by OCO-2 to predict autumn crop production in China. Rem. Sens. 2019;11:1715–1729. doi: 10.3390/rs11141715. [DOI] [Google Scholar]
  399. Weiss M., Jacob F., Duveiller G. Remote sensing for agricultural applications: a meta-review. Remote Sensing of Environment. 2020;236:111402. doi: 10.1016/j.rse.2019.111402. [DOI] [Google Scholar]
  400. Weksler S., Rozenstein O., Haish N., Moshelion M., Walach R., Ben-Dor E. A hyperspectral-physiological phenomics system: measuring diurnal transpiration rates and diurnal reflectance. Rem. Sens. 2020;12:1493–1509. doi: 10.3390/rs12091493. [DOI] [Google Scholar]
  401. Wendel A., Underwood J., Walsh K. Maturity estimation of mangoes using hyperspectral imaging from a ground based mobile platform. Comput. Electron. Agric. 2018;155:298–313. doi: 10.1016/j.compag.2018.10.021. [DOI] [Google Scholar]
  402. Wu C.Y., Niu Z., Tang Q., Huang W.J. Estimating chlorophyll content from hyperspectral vegetation indices: modeling and validation. Agric. For. Meteorol. 2008;148:1230–1241. doi: 10.1016/j.agrformet.2008.03.005. [DOI] [Google Scholar]
  403. Wu D., Guo Z.L., Ye J.L., Feng H., Liu J.X., Chen G.X., Zheng J.S., Yan D.M., Yang X.Q., Xiong X., et al. Combining high-throughput micro-CT-RGB phenotyping and genome-wide association study to dissect the genetic architecture of tiller growth in rice. J. Exp. Bot. 2019;70:545–561. doi: 10.1093/jxb/ery373. [DOI] [PMC free article] [PubMed] [Google Scholar]
  404. Wu D., Wu D., Feng H., Duan L., Dai G., Liu X., Wang K., Yang P., Chen G., Gay A.P., et al. A deep learning-integrated micro-CT image analysis pipeline for quantifying rice lodging resistance-related traits. Plant Communications. 2021;2:100165. doi: 10.1016/j.xplc.2021.100165. [DOI] [PMC free article] [PubMed] [Google Scholar]
  405. Wu J.H., Yue S.C., Hou P., Meng Q.F., Cui Z.L., Li F., Chen X.P. Monitoring winter wheat population dynamics using an active crop sensor. Spectrosc. Spectr. Anal. 2011;31:535–538. [PubMed] [Google Scholar]
  406. Wu L., Liu X.N., Zheng X.P., Qin Q.M., Ren H.Z., Sun Y.J. Spatial scaling transformation modeling based on fractal theory for the leaf area index retrieved from remote sensing imagery. J. Appl. Remote Sens. 2015;9:096015. doi: 10.1117/1.jrs.9.096015. [DOI] [Google Scholar]
  407. Wu S., Wen W., Wang Y., Fan J., Wang C., Gou W., Guo X. MVS-pheno: a portable and low-cost phenotyping platform for maize shoots using multiview stereo 3D reconstruction. Plant Phenomics. 2020;2020:1848437. doi: 10.34133/2020/1848437. [DOI] [PMC free article] [PubMed] [Google Scholar]
  408. Wu X., Feng H., Wu D., Yan S., Zhang P., Wang W., Zhang J., Ye J., Dai G., Fan Y., et al. Using high-throughput multiple optical phenotyping to decipher the genetic architecture of maize drought tolerance. Genome Biol. 2021;22:185–211. doi: 10.1186/s13059-021-02377-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  409. Wu X., Xiao Q., Wen J., You D., Hueni A. Advances in quantitative remote sensing product validation: overview and current status. Earth Sci. Rev. 2019;196:102875. doi: 10.1016/j.earscirev.2019.102875. [DOI] [Google Scholar]
  410. Xian C.J., He L., He Z.W., Xue D.J., Li Z. Assessing the response of satellite solar-induced chlorophyll fluorescence and NDVI to impacts of heat waves on winter wheat in the North China plain. Adv. Meteorol. 2020;2020:1–14. doi: 10.1155/2020/8873534. [DOI] [Google Scholar]
  411. Xiao F., Li W., Xiao M., Yang Z., Cheng W., Gao S., Li G., Ding Y., Paul M.J., Liu Z. A novel light interception trait of a hybrid rice ideotype indicative of leaf to panicle ratio. Field Crop. Res. 2021;274:108338. doi: 10.1016/j.fcr.2021.108338. [DOI] [Google Scholar]
  412. Xiao J., Fisher J.B., Hashimoto H., Ichii K., Parazoo N.C. Emerging satellite observations for diurnal cycling of ecosystem processes. Nature Plants. 2021;7:877–887. doi: 10.1038/s41477-021-00952-8. [DOI] [PubMed] [Google Scholar]
  413. Xiao Q., Bai X., Zhang C., He Y. Advanced high-throughput plant phenotyping techniques for genome-wide association studies: a review. J. Adv. Res. 2022;35:215–230. doi: 10.1016/j.jare.2021.05.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  414. Xiao Z., Liang S., Jiang B. Evaluation of four long time-series global leaf area index products. Agric. For. Meteorol. 2017;246:218–230. doi: 10.1016/j.agrformet.2017.06.016. [DOI] [Google Scholar]
  415. Xie C., Yang C. A review on plant high-throughput phenotyping traits using UAV-based sensors. Comput. Electron. Agric. 2020;178:105731. doi: 10.1016/j.compag.2020.105731. [DOI] [Google Scholar]
  416. Xie J., Fernandes S.B., Mayfield-Jones D., Erice G., Choi M., Lipka A.E., Leakey A.D.B. Optical topometry and machine learning to rapidly phenotype stomatal patterning traits for maize QTL mapping. Plant Physiology. 2021;187:1462–1480. doi: 10.1093/plphys/kiab299. [DOI] [PMC free article] [PubMed] [Google Scholar]
  417. Xie Q., Wang J., Lopez-Sanchez J.M., Peng X., Liao C., Shang J., Zhu J., Fu H., Ballester-Berman J.D. Crop height estimation of corn from multi-year RADARSAT-2 polarimetric observables using machine learning. Rem. Sens. 2021;13:392–411. doi: 10.3390/rs13030392. [DOI] [Google Scholar]
  418. Xu Y. Envirotyping for deciphering environmental impacts on crop plants. Theoretical and Applied Genetics. 2016;129:653–673. doi: 10.1007/s00122-016-2691-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  419. Xu Z., Valdes C., Clarke J. Existing and potential statistical and computational approaches for the analysis of 3D CT images of plant roots. Agronomy. 2018;8:71–81. doi: 10.3390/agronomy8050071. [DOI] [Google Scholar]
  420. Xue L.H., Li G.H., Qin X., Yang L.Z., Zhang H.L. Topdressing nitrogen recommendation for early rice with an active sensor in south China. Precis. Agric. 2014;15:95–110. doi: 10.1007/s11119-013-9326-5. [DOI] [Google Scholar]
  421. Yadav V.P., Prasad R., Bala R. Leaf area index estimation of wheat crop using modified water cloud model from the time-series SAR and optical satellite data. Geocarto Int. 2019;36:791–802. doi: 10.1080/10106049.2019.1624984. [DOI] [Google Scholar]
  422. Yamauchi D., Tamaoki D., Hayami M., Uesugi K., Takeuchi A., Suzuki Y., Karahara I., Mineyuki Y. In: International Workshop on X-Ray and Neutron Phase Imaging with Gratings. Momose A., Yashiro W., editors. American Institute of Physics Conference Series; 2012. Extracting tissue and cell outlines of arabidopsis seeds using refraction contrast X-ray CT at the SPring-8 facility; pp. 237–242. [Google Scholar]
  423. Yang H., Chen E., Li Z., Zhao C., Yang G., Pignatti S., Casa R., Zhao L. Wheat lodging monitoring using polarimetric index from RADARSAT-2 data. Int. J. Appl. Earth Obs. Geoinf. 2015;34:157–166. doi: 10.1016/j.jag.2014.08.010. [DOI] [Google Scholar]
  424. Yang H., Yang G.J., Gaulton R., Zhao C.J., Li Z.H., Taylor J., Wicks D., Minchella A., Chen E.X., Yang X.T. In-season biomass estimation of oilseed rape (Brassica napus L.) using fully polarimetric SAR imagery. Precis. Agric. 2019;20:630–648. doi: 10.1007/s11119-018-9587-0. [DOI] [Google Scholar]
  425. Yang K.-W., Chapman S., Carpenter N., Hammer G., McLean G., Zheng B., Chen Y., Delp E., Masjedi A., Crawford M., et al. Integrating crop growth models with remote sensing for predicting biomass yield of sorghum. in silico Plants. 2021;3:1–19. doi: 10.1093/insilicoplants/diab001. [DOI] [Google Scholar]
  426. Yang W., Duan L., Chen G., Xiong L., Liu Q. Plant phenomics and high-throughput phenotyping: accelerating rice functional genomics using multidisciplinary technologies. Curr. Opin. Plant Biol. 2013;16:180–187. doi: 10.1016/j.pbi.2013.03.005. [DOI] [PubMed] [Google Scholar]
  427. Yang W., Feng H., Zhang X., Zhang J., Doonan J.H., Batchelor W.D., Xiong L., Yan J. Crop phenomics and high-throughput phenotyping: past decades, current challenges, and future perspectives. Mol. Plant. 2020;13:187–214. doi: 10.1016/j.molp.2020.01.008. [DOI] [PubMed] [Google Scholar]
  428. Yang W., Guo Z., Huang C., Duan L., Chen G., Jiang N., Fang W., Feng H., Xie W., Lian X., et al. Combining high-throughput phenotyping and genome-wide association studies to reveal natural genetic variation in rice. Nat. Commun. 2014;5:5087–5096. doi: 10.1038/ncomms6087. [DOI] [PMC free article] [PubMed] [Google Scholar]
  429. Yang W., Yang C., Hao Z.Y., Xie C.Q., Li M.Z. Diagnosis of plant cold damage based on hyperspectral imaging and convolutional neural network. IEEE Access. 2019;7:118239. doi: 10.1109/access.2019.2936892. [DOI] [Google Scholar]
  430. Yang Z., Gao S., Xiao F., Li G., Ding Y., Guo Q., Paul M.J., Liu Z. Leaf to panicle ratio (LPR): a new physiological trait indicative of source and sink relation in japonica rice based on deep learning. Plant Methods. 2020;16:117–132. doi: 10.1186/s13007-020-00660-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  431. Yang Z.S., Han Y.X. A low-cost 3D phenotype measurement method of leafy vegetables using video recordings from smartphones. Sensors. 2020;20:6068. doi: 10.3390/s20216068. [DOI] [PMC free article] [PubMed] [Google Scholar]
  432. Yao X., Si H.Y., Cheng T., Jia M., Chen Q., Tian Y.C., Zhu Y., Cao W.X., Chen C.Y., Cai J.Y., et al. Hyperspectral estimation of canopy leaf biomass phenotype per ground area using a continuous wavelet analysis in wheat. Front. Plant Sci. 2018;9:1360. doi: 10.3389/fpls.2018.01360. [DOI] [PMC free article] [PubMed] [Google Scholar]
  433. Yasrab R., Zhang J., Smyth P., Pound M.P. Predicting plant growth from time-series data using deep learning. Rem. Sens. 2021;13:1–17. doi: 10.3390/rs13030331. [DOI] [Google Scholar]
  434. Yendrek C.R., Tomaz T., Montes C.M., Cao Y.Y., Morse A.M., Brown P.J., McIntyre L.M., Leakey A.D.B., Ainsworth E.A. High-throughput phenotyping of maize leaf physiological and biochemical traits using hyperspectral reflectance. Plant Physiology. 2017;173:614–626. doi: 10.1104/pp.16.01447. [DOI] [PMC free article] [PubMed] [Google Scholar]
  435. Yu K., Anderegg J., Mikaberidze A., Karisto P., Mascher F., McDonald B.A., Walter A., Hund A. Hyperspectral canopy sensing of wheat Septoria tritici blotch disease. Front. Plant Sci. 2018;9:1195. doi: 10.3389/fpls.2018.01195. [DOI] [PMC free article] [PubMed] [Google Scholar]
  436. Yu Z., Cao Z., Wu X., Bai X., Qin Y., Zhuo W., Xiao Y., Zhang X., Xue H. Automatic image-based detection technology for two critical growth stages of maize: emergence and three-leaf stage. Agric. For. Meteorol. 2013;174-175:65–84. doi: 10.1016/j.agrformet.2013.02.011. [DOI] [Google Scholar]
  437. Yuan H.B., Bennett R.S., Wang N., Chamberlin K.D. Development of a peanut canopy measurement system using a ground-based lidar sensor. Front. Plant Sci. 2019;10:203–216. doi: 10.3389/fpls.2019.00203. [DOI] [PMC free article] [PubMed] [Google Scholar]
  438. Yuan L., Yan P., Han W.Y., Huang Y.B., Wang B., Zhang J.C., Zhang H.B., Bao Z.Y. Detection of anthracnose in tea plants based on hyperspectral imaging. Comput. Electron. Agric. 2019;167:105039. doi: 10.1016/j.compag.2019.105039. [DOI] [Google Scholar]
  439. Yuan N., Gong Y., Fang S., Liu Y., Duan B., Yang K., Wu X., Zhu R. UAV remote sensing estimation of rice yield based on adaptive spectral Endmembers and bilinear mixing model. Rem. Sens. 2021;13:1–25. doi: 10.3390/rs13112190. [DOI] [Google Scholar]
  440. Zaman-Allah M. Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize. Plant Methods. 2015;11:1–10. doi: 10.1186/s13007-015-0078-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  441. Zarnetske P.L., Read Q.D., Record S., Gaddis K.D., Pau S., Hobi M.L., Malone S.L., Costanza J.M., Dahlin K., Latimer A.M., et al. Towards connecting biodiversity and geodiversity across scales with satellite remote sensing. Global Ecology and Biogeography. 2019;28:548–556. doi: 10.1111/geb.12887. [DOI] [PMC free article] [PubMed] [Google Scholar]
  442. Zhang C., Gao S., Niu Z., Pei J., Bi K., Sun G. Calibration of the pulse signal decay effect of full-waveform hyperspectral LiDAR. Sensors. 2019;19:5263. doi: 10.3390/s19235263. [DOI] [PMC free article] [PubMed] [Google Scholar]
  443. Zhang C., Pumphrey M.O., Zhou J., Zhang Q., Sankaran S. Development of an automated high-throughput phenotyping system for wheat evaluation in a controlled environment. Transactions of the ASABE. 2019;62:61–74. doi: 10.13031/trans.12856. [DOI] [Google Scholar]
  444. Zhang J., Wang C.F., Yang C.H., Xie T.J., Jiang Z., Hu T., Luo Z.B., Zhou G.S., Xie J. Assessing the effect of real spatial resolution of in situ UAV multispectral images on seedling rapeseed growth monitoring. Rem. Sens. 2020;12:1207. doi: 10.3390/rs12071207. [DOI] [Google Scholar]
  445. Zhang J., Xie T.J., Yang C.H., Song H.B., Jiang Z., Zhou G.S., Zhang D.Y., Feng H., Xie J. Segmenting purple rapeseed leaves in the field from UAV RGB imagery using deep learning as an auxiliary means for nitrogen stress detection. Rem. Sens. 2020;12:1403. doi: 10.3390/rs12091403. [DOI] [Google Scholar]
  446. Zhang L., Grift T.E. A LiDAR-based crop height measurement system for Miscanthus giganteus. Comput. Electron. Agric. 2012;85:70–76. doi: 10.1016/j.compag.2012.04.001. [DOI] [Google Scholar]
  447. Zhang L.B., Maki H., Ma D.D., Sánchez-Gallego J.A., Mickelbart M.V., Wang L.J., Rehman T.U., Jin J. Optimized angles of the swing hyperspectral imaging system for single corn plant. Comput. Electron. Agric. 2019;156:349–359. doi: 10.1016/j.compag.2018.11.030. [DOI] [Google Scholar]
  448. Zhang L.F., Qiao N., Huang C.P., Wang S.H. Monitoring drought effects on vegetation productivity using satellite solar-induced chlorophyll fluorescence. Rem. Sens. 2019;11:378–396. doi: 10.3390/rs11040378. [DOI] [Google Scholar]
  449. Zhang M., Li W., Tao R., Li H., Du Q. Information fusion for classification of hyperspectral and LiDAR data using IP-CNN. IEEE Trans. Geosci. Rem. Sens. 2022;60:1–12. doi: 10.1109/tgrs.2021.3093334. [DOI] [Google Scholar]
  450. Zhang X., Huang C., Wu D., Qiao F., Li W., Duan L., Wang K., Xiao Y., Chen G., Liu Q., et al. High-throughput phenotyping and QTL mapping reveals the genetic architecture of maize plant growth. Plant Physiol. 2017;173:1554–1564. doi: 10.1104/pp.16.01516. [DOI] [PMC free article] [PubMed] [Google Scholar]
  451. Zhang Y., Du J., Wang J., Ma L., Lu X., Pan X., Guo X., Zhao C. High-throughput micro-phenotyping measurements applied to assess stalk lodging in maize (Zea mays L.). Biol. Res. 2018;51:40–54. doi: 10.1186/s40659-018-0190-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  452. Zhang Y., Du J.J., Ma L.M., Pan X.D., Wang J.L., Guo X.Y. Three-dimensional segmentation, reconstruction and phenotyping analysis of maize kernel based on micro-CT images. Fresenius Environ. Bull. 2018;27:3965–3969. [Google Scholar]
  453. Zhang Y., Wang J., Du J., Zhao Y., Lu X., Wen W., Gu S., Fan J., Wang C., Wu S., et al. Dissecting the phenotypic components and genetic architecture of maize stem vascular bundles using high-throughput phenotypic analysis. Plant Biotechnology Journal. 2021;19:35–50. doi: 10.1111/pbi.13437. [DOI] [PMC free article] [PubMed] [Google Scholar]
  454. Zhang Y., Xiao X.M., Zhang Y.G., Wolf S., Zhou S., Joiner J., Guanter L., Verma M., Sun Y., Yang X., et al. On the relationship between sub-daily instantaneous and daily total gross primary production: implications for interpreting satellite-based SIF retrievals. Remote Sensing of Environment. 2018;205:276–289. doi: 10.1016/j.rse.2017.12.009. [DOI] [Google Scholar]
  455. Zhang Z., Lou Y., Moses A.O., Li R., Ma L., Li J.J. Hyperspectral remote sensing to quantify the flowering phenology of winter wheat. Spectrosc. Lett. 2019;52:389–397. doi: 10.1080/00387010.2019.1649701. [DOI] [Google Scholar]
  456. Zhang Z.Y., Zhang Y.G., Porcar-Castell A., Joiner J., Guanter L., Yang X., Migliavacca M., Ju W.M., Sun Z.G., Chen S.P., et al. Reduction of structural impacts and distinction of photosynthetic pathways in a global estimation of GPP from space-borne solar-induced chlorophyll fluorescence. Remote Sensing of Environment. 2020;240:111722. doi: 10.1016/j.rse.2020.111722. [DOI] [Google Scholar]
  457. Zhao C.P., Qin C.Z. 10-m-resolution mangrove maps of China derived from multi-source and multi-temporal satellite observations. ISPRS J. Photogrammetry Remote Sens. 2020;169:389–405. doi: 10.1016/j.isprsjprs.2020.10.001. [DOI] [Google Scholar]
  458. Zhao L., Yang J., Li P., Shi L., Zhang L. Characterizing lodging damage in wheat and canola using Radarsat-2 polarimetric SAR data. Remote Sensing Letters. 2017;8:667–675. doi: 10.1080/2150704x.2017.1312028. [DOI] [Google Scholar]
  459. Zheng H.B., Li W., Jiang J.L., Liu Y., Cheng T., Tian Y.C., Zhu Y., Cao W.X., Zhang Y., Yao X. A comparative assessment of different modeling algorithms for estimating leaf nitrogen content in winter wheat using multispectral images from an unmanned aerial vehicle. Rem. Sens. 2018;10:2026–2042. doi: 10.3390/rs10122026. [DOI] [Google Scholar]
  460. Zhou B., Elazab A., Bort J., Vergara O., Serret M.D., Araus J.L. Low-cost assessment of wheat resistance to yellow rust through conventional RGB images. Comput. Electron. Agric. 2015;116:20–29. doi: 10.1016/j.compag.2015.05.017. [DOI] [Google Scholar]
  461. Zhou G., Liu X., Zhao S., Liu M., Wu L. Estimating FAPAR of rice growth period using radiation transfer model coupled with the WOFOST model for analyzing heavy metal stress. Rem. Sens. 2017;9:424–439. doi: 10.3390/rs9050424. [DOI] [Google Scholar]
  462. Zhou J.F., Pavek M.J., Shelton S.C., Holden Z.J., Sankaran S. Aerial multispectral imaging for crop hail damage assessment in potato. Comput. Electron. Agric. 2016;127:406–412. doi: 10.1016/j.compag.2016.06.019. [DOI] [Google Scholar]
  463. Zhou R.-Q., Jin J.-J., Li Q.-M., Su Z.-Z., Yu X.-J., Tang Y., Luo S.-M., He Y., Li X.-L. Early detection of Magnaporthe oryzae-infected barley leaves and lesion visualization based on hyperspectral imaging. Front. Plant Sci. 2019;9:1–13. doi: 10.3389/fpls.2018.01962. [DOI] [PMC free article] [PubMed] [Google Scholar]
  464. Zhou R., Hyldgaard B., Yu X.Q., Rosenqvist E., Ugarte R.M., Yu S.X., Wu Z., Ottosen C.O., Zhao T.M. Phenotyping of faba beans (Vicia faba L.) under cold and heat stresses using chlorophyll fluorescence. Euphytica. 2018;214:68–81. doi: 10.1007/s10681-018-2154-y. [DOI] [Google Scholar]
  465. Zhu F., Saluja M., Dharni J.S., Paul P., Sattler S.E., Staswick P., Walia H., Yu H. PhenoImage: an open-source graphical user interface for plant image analysis. The Plant Phenome Journal. 2021;4:e20015. doi: 10.1002/ppj2.20015. [DOI] [Google Scholar]
  466. Zhu M., Huang D., Hu X.-J., Tong W.-H., Han B.-L., Tian J.-P., Luo H.-B. Application of hyperspectral technology in detection of agricultural products and food: a review. Food Science & Nutrition. 2020;8:5206–5214. doi: 10.1002/fsn3.1852. [DOI] [PMC free article] [PubMed] [Google Scholar]
  467. Zhu W.X., Sun Z.G., Yang T., Li J., Peng J.B., Zhu K.Y., Li S.J., Gong H.R., Lyu Y., Li B.B., et al. Estimating leaf chlorophyll content of crops via optimal unmanned aerial vehicle hyperspectral data at multi-scales. Comput. Electron. Agric. 2020;178:105786. doi: 10.1016/j.compag.2020.105786. [DOI] [Google Scholar]
  468. Zhu Y., Tang L., Liu L., Liu B., Zhang X., Qiu X., Tian Y., Cao W. Research progress on the crop growth model CropGrow. Sci. Agric. Sin. 2020;53:3235–3256. [Google Scholar]
  469. Zhuo W., Fang S., Gao X., Wang L., Wu D., Fu S., Wu Q., Huang J. Crop yield prediction using MODIS LAI, TIGGE weather forecasts and WOFOST model: a case study for winter wheat in Hebei, China during 2009–2013. Int. J. Appl. Earth Obs. Geoinf. 2022;106:102668. doi: 10.1016/j.jag.2021.102668. [DOI] [Google Scholar]
  470. Zhuo W., Huang J.X., Li L., Huang R., Gao X.R., Zhang X.D., Zhu D.H. In: 2018 7th International Conference on Agro-Geoinformatics. IEEE; 2018. Assimilating SAR and optical remote sensing data into WOFOST model for improving winter wheat yield estimation; pp. 547–551. [Google Scholar]
  471. Zhuo W., Huang J.X., Li L., Zhang X.D., Ma H.Y., Gao X.R., Huang H., Xu B.D., Xiao X.M. Assimilating soil moisture retrieved from Sentinel-1 and Sentinel-2 data into WOFOST model to improve winter wheat yield estimation. Rem. Sens. 2019;11:1618–1635. doi: 10.3390/rs11131618. [DOI] [Google Scholar]
  472. Zia S., Romano G., Spreer W., Sanchez C., Cairns J., Araus J.L., Müller J. Infrared thermal imaging as a rapid tool for identifying water-stress tolerant maize genotypes of different phenology. J. Agron. Crop Sci. 2013;199:75–84. doi: 10.1111/j.1439-037x.2012.00537.x. [DOI] [Google Scholar]
  473. Zibrat U., Susic N., Knapic M., Sirca S., Strajnar P., Razinger J., Voncina A., et al. Pipeline for imaging, extraction, pre-processing, and processing of time-series hyperspectral data for discriminating drought stress origin in tomatoes. MethodsX. 2019;6:399–408. doi: 10.1016/j.mex.2019.02.022. [DOI] [PMC free article] [PubMed] [Google Scholar]
