Short abstract
This article is a Commentary on Sun et al. (2022), 236: 1584–1604.
Keywords: aerial phenotyping, drone, dynamic trait, GUI software, static trait
The use of drones has great potential for breeding and plant research, as drones can collect sufficient visual information relatively easily and economically from plants in the field. Nevertheless, the current bottleneck for aerial phenotyping is the lack of analytical solutions that can be readily used to extract meaningful and reliable trait information from the images obtained. An open‐source platform called AirMeasurer, developed by Sun et al. (2022; pp. 1584–1604) and published in this issue of New Phytologist, addresses this problem. Not only has this work conducted multiseason aerial phenotyping using low‐cost drones and customised protocols to generate high‐quality visual representations of plants under different environments, but Sun et al. have also incorporated computer vision algorithms and machine learning techniques into the automatic estimation of key agronomic traits such as seedling number, plant height, canopy coverage, vegetative indices and model‐predicted key growth stages.
‘This study developed a graphical user interface for AirMeasurer to help nonspecialists exploit aerial phenotyping, showed its application in examining rice trials, and will help researchers obtain static and dynamic phenotypic data to develop novel molecular markers for crop improvement.’
Why drones?
The massive data collection capability of plant genome sequencing has created demand for scientists and growers to gather phenotypic trait data across genetic lines at a comparable pace and scale (Shi et al., 2016). As a critical component of crop improvement and breeding, field‐based phenotyping has opened the doors for the development of high‐throughput phenotyping techniques that are necessary to rapidly assess thousands of breeders' plots in field trials (Yang et al., 2017; Furbank et al., 2019). Drones are also known as unmanned aerial vehicles (UAVs) and unmanned aircraft systems (UASs). The first notable deployment of drones took place during World War II, and the first evidence of drones being used in agriculture was crop spraying in the 1980s (Radoglou‐Grammatikis et al., 2020). Drones have proved an efficient and effective technique for field and plot plant phenotyping because of: (1) throughput, as UAVs can collect huge quantities of data ‘on demand’ (Masjedi et al., 2020); (2) fidelity, as they capture trait expression under natural cropping conditions, that is, the real expression of the genotype in the environment in which it grows (Shi et al., 2016); and (3) cost, as platforms can be configured inexpensively (Han et al., 2018). As a consequence, the application of drones to field plant phenotyping has been investigated extensively over the past decade (Fig. 1).
Fig. 1.

A scene of drones (a) used in field plant phenotyping for (b) rice, (c) rapeseed and (d) wheat. Image is courtesy of Wanneng Yang and Jian Zhang, Huazhong Agricultural University, China.
Data collection modalities
Platform configuration is a critical factor that needs comprehensive consideration before in‐field flight tasks, and payload capacity is the main factor limiting wide application. Multisensor integration has become a distinctive characteristic of drone‐based field phenotyping, driven by the requirements of high precision, light weight and small size. Various sensors, including red–green–blue (RGB) cameras, light detection and ranging (LiDAR), multispectral or hyperspectral cameras, and thermal and infrared sensors, are mounted together on aerial drones to collect large amounts of data. Two issues must be addressed before practical application in field phenotyping: (1) how to integrate diverse sensors to produce the phenotypic information breeders need and to handle the subsequent data processing, which is challenging because the data come from different platforms and multimodal data processing techniques are urgently needed; and (2) complex hardware configurations demand larger and more expensive drones, which would cause wind disturbance during low‐altitude imaging. To deal with these challenges, Sun et al. used a low‐cost drone fitted with an RGB camera as the data collection platform. A low‐cost drone causes less wind disturbance, allows the sensor to be mounted easily (Xie & Yang, 2020), and can serve a series of phenotypic trait extractions, including colour, texture, plant height, canopy coverage and heading time. In addition, data collection procedures were performed strictly according to an experimental design framed around the biological questions.
Data processing
After data collection in field aerial phenotyping, the analysis of large numbers of images is laborious and time consuming (Shi et al., 2016; Yang et al., 2020), so a high‐throughput image analysis pipeline is needed. For example, three‐dimensional (3D) point cloud generation yields 3D traits, image mosaicking covers a broader area, and plot segmentation enables phenotypic trait estimation and analysis at the plot level. Substantial data processing and software development work is required to replace this tedious labour and achieve high efficiency.
A 3D point cloud can be generated by combining Structure from Motion (SfM) and Multi‐View Stereo (MVS) algorithms, which has proved a very effective option and can be implemented with various tools, including VisualSFM, Pix4Dmapper and COLMAP. In Sun et al.’s work, the commercial software Pix4Dmapper was adopted because it outperformed the other tools. Based on the generated 3D point cloud, a canopy height model (CHM) was produced by subtracting the digital elevation model (DEM) from the digital surface model (DSM). Image mosaicking was also conducted using Pix4Dmapper. Then, an optimised plot segmentation approach was developed, combining iterative self‐organising data thresholding and Hough transform algorithms. Finally, refined masks were used to generate plot‐level sampling regions. The protocol can be implemented easily even by researchers without the relevant experience or background.
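The height‐model step can be illustrated with a minimal sketch (not Sun et al.’s implementation): assuming the DSM and DEM have already been exported as co‐registered rasters loaded into NumPy arrays (the array names here are illustrative), the CHM is simply their per‐pixel difference.

```python
import numpy as np

def canopy_height_model(dsm: np.ndarray, dem: np.ndarray) -> np.ndarray:
    """Compute a canopy height model (CHM) as DSM minus DEM.

    dsm: digital surface model (ground + canopy elevations, metres)
    dem: digital elevation model (bare-ground elevations, metres)
    Both rasters must be co-registered and have the same shape.
    """
    if dsm.shape != dem.shape:
        raise ValueError("DSM and DEM rasters must be co-registered")
    chm = dsm - dem
    # Clamp small negative values caused by interpolation noise to zero.
    return np.clip(chm, 0.0, None)

# Toy example: a 3x3 tile where the canopy sits 0.8 m above bare ground.
dem = np.full((3, 3), 100.0)
dsm = dem + 0.8
chm = canopy_height_model(dsm, dem)
print(round(float(chm.mean()), 6))  # 0.8
```

In practice the DSM and DEM would be read from the photogrammetry software’s raster exports (e.g. GeoTIFFs) rather than constructed in memory, but the subtraction itself is this simple once the rasters are aligned.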
Static and dynamic trait estimation
Varied phenotypic traits, including morphological, spectral and textural traits, can be extracted from the generated images and 3D point cloud data at the plot scale. In Sun et al.’s work, phenotypic traits measured at a specific growth point were termed static traits, and 14 static growth‐related traits were estimated over time, in an automated fashion, by utilising both spatial and spectral signals. Furthermore, the dynamic development of some phenotypic traits can reflect the relationship between plant growth and the surrounding environment (Han et al., 2018). The dynamic change of phenotypic traits is therefore essential for subsequent genome‐wide association study (GWAS) analysis and quantitative trait loci (QTL) mapping. Dynamic traits, such as the rapid growth phase, the fastest growth rate and the average growth rate, were calculated from the static parameters.
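As a generic illustration of how dynamic traits can be derived from static time‐series measurements (a simplified sketch, not Sun et al.’s exact formulation; the trait definitions below are assumptions for demonstration), finite‐difference growth rates over a plot‐level plant‐height series yield the fastest and average growth rates and a rough rapid‐growth phase:

```python
import numpy as np

def dynamic_traits(days: np.ndarray, height: np.ndarray) -> dict:
    """Derive simple dynamic traits from a static plant-height time series.

    days:   sampling dates as days after sowing
    height: plot-level plant height (cm) at each sampling date
    """
    rates = np.diff(height) / np.diff(days)   # cm per day, per interval
    avg_rate = (height[-1] - height[0]) / (days[-1] - days[0])
    i_max = int(np.argmax(rates))             # interval of fastest growth
    # "Rapid growth phase": intervals whose rate exceeds half the maximum
    # (an illustrative threshold, not a published definition).
    rapid = rates >= 0.5 * rates[i_max]
    return {
        "fastest_growth_rate": float(rates[i_max]),
        "fastest_growth_midpoint": float((days[i_max] + days[i_max + 1]) / 2),
        "average_growth_rate": float(avg_rate),
        "rapid_phase_days": float(np.sum(np.diff(days)[rapid])),
    }

# Illustrative S-shaped height series (cm) over the season.
days = np.array([10, 20, 30, 40, 50, 60, 70])
height = np.array([5.0, 8.0, 18.0, 40.0, 62.0, 72.0, 75.0])
traits = dynamic_traits(days, height)
print(traits["fastest_growth_rate"])  # 2.2
```

In a real pipeline the static trait series would first be smoothed or fitted with a growth curve before rates are taken, so that measurement noise does not dominate the derivatives.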
In particular, Sun et al. established an original approach to characterise traits dynamically so that time‐series measurements could be obtained from the dynamic trait analysis to enable the examination of dynamic or longitudinal phenotypes during plant–environment interactions. This solution for static and dynamic trait extraction can be repeated on other phenotypic traits in rice, and even for other plant species, to produce more abundant and promising phenotypic information, which differs from the conventional way of using only static traits.
Field planting experiments are often carried out at multiple sites, in multiple years and even in multiple batches. Accordingly, multitemporal and multisite static and dynamic traits should be integrated for analysis to better study the response of plants to environmental change. Using the platform, Sun et al. studied hundreds of rice landraces and recombinant inbred lines from 2019 to 2021, at two sites over 2000 km apart in China. They then selected a range of static and dynamic traits for GWAS analysis and QTL mapping, yielding reliable loci that either matched published work or showed previously unknown strong signals, valuable for identifying the effects of individual allelic differences that regulate trait expression.
The expandable GUI software
Although a tremendous amount of research work has been done on aerial phenotyping using drones, the most challenging issue remains the lack of easy‐to‐use tools. Sun et al. developed a graphical user interface (GUI) for AirMeasurer to help nonspecialists exploit the platform, and showed its application in examining rice trials. It is worth mentioning that the function of this tool can be expanded according to specific research use. In future, more open‐source and user‐friendly image analysis pipelines should be encouraged.
Perspective
Overall, the advantages of aerial phenotyping using drones can be summarised as multisensor, multiscale, multiresolution, multitemporal and multisite, and considerable research results have been reported in the past decade. In particular, the following aspects of the development and application of aerial phenotyping are recommended:
For data collection, the integration of multiple sensors should be considered to produce diverse data. With the continuing development of sensors and drones, more lightweight aircraft with greater payloads will hopefully be developed to resolve the overload issues (Zhu et al., 2021).
For data processing, cutting‐edge technology, such as deep learning and computer vision algorithms, can be introduced into the developed tools, allowing huge amounts of phenotypic data to be analysed more efficiently, as Sun et al. have indicated. In particular, multimodal algorithms are needed for multisource data processing, and multiscale phenotypic traits, at the individual plant, plot and field scales, should be combined and further exploited.
For data analysis, more widely usable tools, especially those combining new machine learning algorithms and prediction models, need further research and development. Machine learning or deep learning algorithms can be applied to establish appropriate models for particular growth traits and for more plant species.
In conclusion, aerial phenotyping, with its merits of fast data collection and novel trait extraction that help researchers develop new molecular markers, will promote crop improvement and breeding in the future.
Acknowledgements
This work was supported by grants from the National Natural Science Foundation of China (U21A20205), Key projects of the Natural Science Foundation of Hubei Province (2021CFA059), Fundamental Research Funds for the Central Universities (2021ZKPY006), and HZAU‐AGIS Cooperation Fund (SZYJY2022014).
References
- Furbank RT, Jimenez‐Berni JA, George‐Jaeggli B, Potgieter AB, Deery DM. 2019. Field crop phenomics: enabling breeding for radiation use efficiency and biomass in cereal crops. New Phytologist 223: 1714–1727.
- Han L, Yang GJ, Yang H, Xu B, Li ZH, Yang XD. 2018. Clustering field‐based maize phenotyping of plant‐height growth and canopy spectral dynamics using a UAV remote‐sensing approach. Frontiers in Plant Science 9: 18.
- Masjedi A, Crawford MM, Carpenter NR, Tuinstra MR. 2020. Multi‐temporal predictive modelling of sorghum biomass using UAV‐based hyperspectral and LiDAR data. Remote Sensing 12: 35.
- Radoglou‐Grammatikis P, Sarigiannidis P, Lagkas T, Moscholios I. 2020. A compilation of UAV applications for precision agriculture. Computer Networks 172: 107148.
- Shi YY, Thomasson JA, Murray SC, Pugh NA, Rooney WL, Shafian S, Rajan N, Rouze G, Morgan CLS, Neely HL et al. 2016. Unmanned aerial vehicles for high‐throughput phenotyping and agronomic research. PLoS ONE 11: 26.
- Sun G, Lu H, Zhao Y, Zhou J, Jackson R, Wang Y, Xu L‐x, Wang A, Colmer J, Ober E et al. 2022. AirMeasurer: open‐source software to quantify static and dynamic traits derived from multiseason aerial phenotyping to empower genetic mapping studies in rice. New Phytologist 236: 1584–1604.
- Xie CQ, Yang C. 2020. A review on plant high‐throughput phenotyping traits using UAV‐based sensors. Computers and Electronics in Agriculture 178: 14.
- Yang GJ, Liu JG, Zhao CJ, Li ZH, Huang YB, Yu HY, Xu B, Yang XD, Zhu DM, Zhang XY et al. 2017. Unmanned aerial vehicle remote sensing for field‐based crop phenotyping: current status and perspectives. Frontiers in Plant Science 8: 26.
- Yang WN, Feng H, Zhang XH, Zhang J, Doonan JH, Batchelor WD, Xiong LZ, Yan JB. 2020. Crop phenomics and high‐throughput phenotyping: past decades, current challenges, and future perspectives. Molecular Plant 13: 187–214.
- Zhu WX, Sun ZG, Huang YH, Yang T, Li J, Zhu KY, Zhang JQ, Yang B, Shao CX, Peng JB et al. 2021. Optimization of multi‐source UAV RS agro‐monitoring schemes designed for field‐scale crop phenotyping. Precision Agriculture 22: 1768–1802.
