Table 5.
Crop | Perception | Autonomy Level | Results | Cited Work |
---|---|---|---|---|
Maize & wheat | Cameras, spectral imaging systems, laser sensors, 3D time-of-flight cameras | Autonomous | No performance metrics provided | [75] |
Cotton | Stereo RGB & IR thermal camera; temperature, humidity, and light intensity sensors; pyranometer; quantum sensor; LiDAR | Semi-autonomous | RMS error: plant height < 0.5 cm (Vinobot); RGB-to-IR calibration: 2.5 px (Vinoculer); temperature: < 1 °C (Vinoculer) | [78] |
Sorghum | Stereo camera, RGB camera with fisheye lens, penetrometer | Autonomous | Stalk detection: 96% | [73] |
Rice, maize & wheat | RGB camera, chlorophyll fluorescence camera, NDVI camera, thermal infrared camera, hyperspectral camera, 3D laser scanner | Fixed-site, fully automated | Plant height RMS error: 1.88 cm | [76] |
Sugar beet | Mobile robot: webcam, Gigabit Ethernet camera; Bettybot: color camera, hyperspectral camera | Autonomous | No performance metrics provided | [74] |
Sorghum | Stereo imaging system consisting of color cameras | Autonomous, based on a commercial tractor | Image-derived measurements were highly repeatable and correlated strongly with manual measurements | [77] |
Energy sorghum | Stereo camera, time-of-flight depth sensor, IR camera | Semi-autonomous | Average absolute error: stem width 13%, plant height 15% | [79] |
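
The accuracy figures reported in the Results column are based on standard comparison metrics between platform-derived and manual measurements. The sketch below illustrates how an RMS error (as in [76,78]) and an average absolute percentage error (as in [79]) are typically computed; the measurement values are hypothetical and are not data from the cited studies.

```python
import numpy as np

# Hypothetical paired plant-height measurements (cm):
# automated platform output vs. manual ground truth.
# Illustrative values only, not taken from the cited works.
platform_height = np.array([152.1, 148.7, 160.3, 155.0, 149.8])
manual_height = np.array([151.5, 149.2, 159.1, 156.2, 150.4])

# Root-mean-square (RMS) error, the metric reported for plant height.
rmse = np.sqrt(np.mean((platform_height - manual_height) ** 2))

# Mean absolute percentage error, comparable to the "average absolute error"
# reported for stem width and plant height.
mape = np.mean(np.abs(platform_height - manual_height) / manual_height) * 100

print(f"RMS error: {rmse:.2f} cm")
print(f"Mean absolute percentage error: {mape:.1f}%")
```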