Table 1.
Category | Step | Details |
---|---|---|
UAS image collection | Set up UAS | DJI Inspire 1 with Sentera Double 4K sensor |
 | Prepare flight path | AgVault mobile app or Pix4D Capture app |
 | Fly UAS for data collection | 70% image overlap, 61 m altitude |
Image orthomosaic using Pix4D | Initial processing | Key point extraction and matching, camera model optimization, geolocation |
 | Point cloud and mesh | Point densification and 3D textured mesh creation, insert ground control points |
 | Digital surface model, orthomosaic, and index | Creation of digital surface model, orthomosaic, reflectance map, and index map |
Image processing | Plant and soil classification | k-means clustering and recode to plant and soil |
 | Green, yellow, brown pixel classification | k-means clustering on masked canopy and recode to green, yellow, brown |
 | Neural network/random forest with ground data | Subset into training and validation sets; ground-based data is the response variable and green, yellow, brown pixel counts are the features |
The flight path is set up in the Pix4D Capture app with 70% image overlap. Individual photos are stitched into an orthomosaic in Pix4D, and k-means clustering is used to mask the plants from the soil background. An additional k-means classification of green, yellow, and brown pixels is performed on the masked plant pixels. In QGIS, plots are defined, and the proportions of green, yellow, and brown pixels are extracted from each plot. Finally, these three features are used to predict ground-based visual score estimates rated on a 1-to-5 scale.
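As an illustration of the two k-means steps in Table 1, the sketch below clusters orthomosaic pixels into plant and soil, then re-clusters the canopy pixels so they can be recoded to green, yellow, and brown. This is a minimal sketch only: the file name, the cluster counts, and the green-to-red rule for identifying the plant cluster are assumptions rather than the authors' exact settings, and scikit-learn with rasterio stands in for whatever software was actually used.

```python
import numpy as np
import rasterio
from sklearn.cluster import KMeans

# Read the RGB orthomosaic (hypothetical file name) as a pixel table.
with rasterio.open("plot_orthomosaic.tif") as src:
    img = src.read([1, 2, 3]).astype(float)      # shape: (3 bands, rows, cols)
pixels = img.reshape(3, -1).T                     # one row per pixel: R, G, B

# Stage 1: cluster all pixels into two groups and recode to plant vs. soil.
# Assumption: the cluster whose center has the higher green-to-red ratio is the canopy.
stage1 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
green_ratio = [c[1] / (c[0] + 1e-6) for c in stage1.cluster_centers_]
plant_mask = stage1.labels_ == int(np.argmax(green_ratio))

# Stage 2: cluster the masked canopy pixels into three groups; the clusters
# would then be recoded to green, yellow, and brown by inspecting their centers.
stage2 = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels[plant_mask])
counts = np.bincount(stage2.labels_, minlength=3)
print("Cluster centers (RGB):", stage2.cluster_centers_)
print("Cluster proportions:", counts / counts.sum())
```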
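The prediction step in the last row of Table 1 could be sketched as follows, assuming the per-plot green, yellow, and brown proportions have already been extracted in QGIS and paired with the ground-based visual scores. The random forest regressor, the train/validation split settings, and the toy feature and score values are illustrative assumptions; the table also lists a neural network as an alternative model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical per-plot features: proportions of green, yellow, brown pixels.
X = np.array([
    [0.82, 0.12, 0.06],
    [0.45, 0.35, 0.20],
    [0.10, 0.30, 0.60],
    [0.70, 0.20, 0.10],
    [0.25, 0.40, 0.35],
    [0.90, 0.07, 0.03],
    [0.55, 0.30, 0.15],
    [0.15, 0.25, 0.60],
])
# Hypothetical ground-based visual scores on the 1-5 scale (response variable).
y = np.array([1, 3, 5, 2, 4, 1, 3, 5], dtype=float)

# Subset into training and validation sets, as listed in Table 1.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_val)
print("Validation MAE:", mean_absolute_error(y_val, pred))
```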