2023 Jan 27;107(2):21. doi: 10.1007/s10846-022-01793-z

A Survey of Robotic Harvesting Systems and Enabling Technologies

Leonidas Droukas 1, Zoe Doulgeri 1, Nikolaos L Tsakiridis 1, Dimitra Triantafyllou 2, Ioannis Kleitsiotis 2, Ioannis Mariolis 2, Dimitrios Giakoumis 2, Dimitrios Tzovaras 2, Dimitrios Kateris 3, Dionysis Bochtis 3
PMCID: PMC9881528  PMID: 36721646

Abstract

This paper presents a comprehensive review of ground agricultural robotic systems and applications, with a special focus on harvesting, spanning research results and commercial products as well as their enabling technologies. The majority of the literature concerns the development of crop detection and vision-based field navigation, together with their related challenges. Health monitoring, yield estimation, water status inspection, seed planting and weed removal are frequently encountered tasks. Regarding robotic harvesting, apples, strawberries, tomatoes and sweet peppers are the crops mainly considered in publications, research projects and commercial products. The reported agricultural robotic harvesting solutions typically consist of a mobile platform, a single robotic arm/manipulator and various navigation/vision systems. This paper reviews the reported development of specific functionalities and hardware typically required by an operating agricultural robot harvester, including (a) vision systems, (b) motion planning/navigation methodologies (for the robotic platform and/or arm), (c) Human-Robot Interaction (HRI) strategies with 3D visualization, (d) system operation planning and grasping strategies, and (e) robotic end-effector/gripper design. Clearly, automated agriculture, and specifically autonomous harvesting via robotic systems, remains a wide-open research area offering several challenges where new contributions can be made.

Keywords: Robotic harvesting, Automated agriculture, Agricultural functionalities, State-of-the-art review

Biographies

Leonidas Droukas

was born in 1984. He received the Diploma in electrical and computer engineering from the Department of Electrical and Computer Engineering of the Aristotle University of Thessaloniki, Greece, in 2008, and the Ph.D. degree in electrical and computer engineering from the same department in 2016. He has worked as a researcher on various European and national projects, with his work published in scientific conferences and journals. He is currently working as a Postdoctoral Researcher in the H2020 European research project BACCHUS on mobile robotic platforms for active inspection and harvesting in agricultural areas, in the Automation and Robotics Laboratory, Aristotle University of Thessaloniki (ARL-AUTH). His research interests involve various topics in non-linear control and robotics, including prescribed performance control, force/position robot control, rolling contact motion planning, stable object grasping and manipulation by robotic fingers, as well as safe human-robot interaction and cooperation. He is an associate member of the IEEE.

Zoe Doulgeri

is a Professor of Robotics and Control of Manufacturing Systems and the director of the Automation and Robotics Lab of the Department of Electrical and Computer Engineering of the Aristotle University of Thessaloniki (AUTH). She received the diploma of Electrical Engineering from AUTH, and an M.Sc. (DIC) in Control Systems, an M.Sc. (DIC) in Social and Economic Aspects of Science and Technology in Industry, and a Ph.D. in Mechanical Engineering from Imperial College London, UK. She teaches Control Systems and Robotics. She has authored more than 150 publications in peer-reviewed international journals and conferences. She is an evaluator of EU research and innovation proposals in Horizon Europe and a reviewer for the most important robotics journals and conferences. She served as associate editor of the Journal of Intelligent and Robotic Systems and the IEEE Robotics and Automation Letters. She has participated in many robotics projects financed by the EU and the Greek government. She was the coordinator of the recently completed H2020 “Collaborate” project. She is currently coordinating the H2020 European research project BACCHUS on mobile robotic platforms for active inspection and harvesting in agricultural areas, as well as two national projects. Her current research interests include physical human-robot interaction, robot teaching and learning by demonstration, bimanual mobile robots, object grasping and manipulation with analytical and data-based learning methods, and the control of uncertain robotic systems. She was the representative of Greece in the European Control Association from 2016 to 2021. She is a senior member of the IEEE, the IEEE Robotics and Automation Society and the IEEE Control Systems Society.

Nikolaos L. Tsakiridis

received his diploma and PhD in Electrical Engineering and Computer Science from the Aristotle University of Thessaloniki, Greece. His research focuses on the development and application of explainable artificial intelligence techniques to Earth Observation data, covering all tiers from in situ and laboratory measurements to remote sensing data. He has been involved in multiple EU-funded projects focusing both on improving the accuracy of predictions and on understanding the underlying associations identified by the models. Predominantly, these models are applied to improve food security, with a particular emphasis on soil data, in support of the European Green Deal Strategy, the EU mission “A Soil Deal for Europe”, and the EU Soil Strategy for 2030. His other research interests include big data analysis and systems engineering.

Dimitra Triantafyllou

has been a research assistant at CERTH/ITI since 2012. She received her diploma in Electrical and Computer Engineering from the Aristotle University of Thessaloniki (AUTH), her M.Sc. in Production Systems from the Technical University of Crete (TUC), and her PhD from the Department of Mechanical Engineering and Aeronautics, University of Patras (UPAT). She also served as a teaching fellow at the University of Applied Science of Thessaly and the University of Patras during the period 2009–2012. Her research interests include AI, computer vision and robotics.

Ioannis Kleitsiotis

received his Diploma in Electrical and Computer Engineering from the Aristotle University of Thessaloniki, Greece, in 2017. He is currently pursuing his Ph.D at the Department of Computer Engineering and Informatics, University of Patras, Greece. His research interests include object detection in multi-modal data and synthetic data generation with applications in robotic grasping.

Ioannis Mariolis

received the Diploma degree and the Ph.D. degree in Electrical and Computer Engineering, in 2002 and 2009 respectively, both from the University of Patras (UoP), Greece. During his PhD studies, he worked as a teaching and research assistant at the Wire Communications Laboratory, Dept. of Electrical and Computer Engineering. During 2010 and 2011, he worked as a research assistant at the Medical Physics Laboratory, School of Medicine, UoP, while from March 2010 to March 2011 he also served as an adjunct lecturer at the Department of Informatics and Means of Mass Communication of the Technological Educational Institute of Patras, Greece. Since February 2012, Dr Mariolis has been working at the Informatics and Telematics Institute (I.T.I.) as a postdoctoral research fellow. His main research interests include statistical signal processing, machine vision, medical image analysis and pattern recognition.

Dimitrios Giakoumis

is a Senior Researcher (Grade C’) at the Information Technologies Institute of the Centre for Research and Technology Hellas (CERTH). He received his diploma in Electrical and Computer Engineering (2006), his MSc in Advanced Computing and Networking Systems (2008) and his PhD (2012) from the Aristotle University of Thessaloniki (AUTH). His main research interests include human-robot interaction, robot vision, service robot perception and cognition, affective computing, human motion, activity and behaviour analysis and modelling, safe and social-aware robot navigation, biosignals processing and sensor management, multimodal interfaces, machine learning and pattern recognition.

Dimitrios Tzovaras

is a Senior Researcher (Researcher A’) and President of the Board of the Centre for Research and Technology Hellas (CERTH). He received his diploma in Electrical and Computer Engineering (1992) and his Ph.D. (1997) from the Aristotle University of Thessaloniki (AUTH), and has been working as a Researcher since September 1999. His main research interests include visual analytics, 3D object recognition, search and retrieval, behavioral biometrics, assistive technologies, information and knowledge management, multimodal interfaces, computer graphics and virtual reality.

Dr. Dimitrios Kateris

is an Assistant Researcher (Grade C’) at the Institute for Bio-Economy and Agri-Technology (iBO) of the Centre for Research and Technology Hellas (CERTH). In 2015 he received his PhD in Agricultural Engineering from the Aristotle University of Thessaloniki for his dissertation “Intelligent system for fault diagnosis in agricultural tractor mechanical subsystems”. He also holds an MSc in Agricultural Engineering and Water Resources from the same institution (2009), and two BSc degrees in Hydraulics, Soil Science and Agricultural Engineering, from the Aristotle University of Thessaloniki (2006) and the Technological Educational Institute of Thessaly (1999) respectively. His research interests lie mainly in the areas of Agricultural Robotics, Digital Agriculture, Agricultural Engineering, Decision Support Systems (DSS), Operations Management, Information Systems, Automation Systems in Agriculture, and Machine Learning. His involvement in the above research areas has led to the co-authoring of over 47 papers in refereed journals, 2 books, 6 book chapters, 1 patent, and more than 100 papers in international and national conferences. He has served as a regular reviewer for many international journals and conferences. He has also participated (as researcher, WP leader, partner, and coordinator) in more than 20 European and national projects.

Prof. Dionysis D. Bochtis

(M) is the Director of CERTH/iBO. His previous positions include Chair Professor (Agri-Robotics) at the Dept. of Computer Science, University of Lincoln, UK, and Senior Scientist (Operations Management) at the Dept. of Engineering at Aarhus University, Denmark. He holds a PhD in fleet management in bio-production systems, an MSc in Automation Control, and a B.Sc. in Exact Sciences (Physics). His primary research area is Systems Engineering focused on bio-production and related provision systems, including conventional systems enhanced with ICT and automation technologies as well as fully robotized systems. He has participated (as researcher, WP leader, partner, and coordinator) in 50 (32 EU) funded research projects. He has given 27 keynote speeches on Systems Engineering topics. He is vice-chairman of CIGR (International Commission of Agricultural and Biosystems Engineers) Section V (Systems Management), and was president of CIOSTA (Commission Internationale de l’Organisation Scientifique du Travail en Agriculture, founded in Paris, 1950) from 2011 to 2013.

Author Contributions

Conceptualization - Leonidas Droukas, Zoe Doulgeri; Literature search - Leonidas Droukas, Nikolaos L. Tsakiridis, Dimitra Triantafyllou, Ioannis Kleitsiotis and Dimitrios Kateris; Writing/original draft preparation - Leonidas Droukas, Nikolaos L. Tsakiridis, Dimitra Triantafyllou, Ioannis Kleitsiotis and Dimitrios Kateris; Writing/review and editing/critical revision - Zoe Doulgeri, Ioannis Mariolis, Dimitrios Giakoumis, Dimitrios Tzovaras and Dionysis Bochtis.

Funding

Open access funding provided by HEAL-Link Greece. This research received funding from the European Community’s Framework Programme Horizon 2020 under grant agreement No 871704, project BACCHUS.

Declarations

Conflict of Interests

The authors declare that they have no conflict of interest.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Leonidas Droukas, Email: ldroukas@ece.auth.gr.

Zoe Doulgeri, Email: doulgeri@ece.auth.gr.

Nikolaos L. Tsakiridis, Email: tsakirin@ece.auth.gr.

Dimitra Triantafyllou, Email: dtriant@iti.gr.

Ioannis Kleitsiotis, Email: ioklei@iti.gr.

Ioannis Mariolis, Email: ymariolis@iti.gr.

Dimitrios Giakoumis, Email: dgiakoum@iti.gr.

Dimitrios Tzovaras, Email: Dimitrios.Tzovaras@iti.gr.

Dimitrios Kateris, Email: d.kateris@certh.gr.

Dionysis Bochtis, Email: d.bochtis@certh.gr.

References

  • 1.Eberhardt M, Vollrath D. The effect of agricultural technology on the speed of development. World Dev. 2018;109:483–496. doi: 10.1016/j.worlddev.2016.03.017. [DOI] [Google Scholar]
  • 2.Fathallah FA. Musculoskeletal disorders in labor-intensive agriculture. Appl. Ergon. 2010;41(6):738–743. doi: 10.1016/j.apergo.2010.03.003. [DOI] [PubMed] [Google Scholar]
  • 3.Proto AR, Zimbalatti G. Risk assessment of repetitive movements in the citrus fruit industry. J. Agric. Saf. Health. 2010;16(4):219–228. doi: 10.13031/2013.34834. [DOI] [PubMed] [Google Scholar]
  • 4.Zhang Z, Wang Y, Zhang Z, Li D, Wu Z, Bai R, Meng G. Ergonomic and efficiency analysis of conventional apple harvest process. Int. J. Agric. Biol. Eng. 2019;12(2):210–217. [Google Scholar]
  • 5.Zhang Z, Zhang Z, Wang X, Liu H, Wang Y, Wang W. Models for economic evaluation of multi-purpose apple harvest platform and software development. Int. J. Agric. Biol. Eng. 2019;12(1):74–83. [Google Scholar]
  • 6.Marinoudi V, Sørensen CG, Pearson S, Bochtis D. Robotics and labour in agriculture. a context consideration. Biosyst. Eng. 2019;184:111–121. doi: 10.1016/j.biosystemseng.2019.06.013. [DOI] [Google Scholar]
  • 7.Fountas S, Mylonas N, Malounas I, Rodias E, Santos CH, Pekkeriet E. Agricultural robotics for field operations. Sensors. 2020;20(9):2672. doi: 10.3390/s20092672. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Shamshiri RR, Weltzien C, Hameed IA, Yule IJ, Grift TE, Balasundram SK, Pitonakova L, Ahmad D, Chowdhary G. Research and development in agricultural robotics: a perspective of digital farming. Int. J. Agric. Biol. Eng. 2018;11(4):1–11. [Google Scholar]
  • 9.Bac CW, van Henten EJ, Hemming J, Edan Y. Harvesting robots for high-value crops: State-of-the-art review and challenges ahead. J. Field Robot. 2014;31(6):888–911. doi: 10.1002/rob.21525. [DOI] [Google Scholar]
  • 10.Oliveira, L.F.P., Moreira, A.P., Silva, M.F.: Advances in agriculture robotics: A state-of-the-art review and challenges ahead. Robotics 10(2). 10.3390/robotics10020052 (2021)
  • 11.Slaughter DC, Giles DK, Downey D. Autonomous robotic weed control systems: a review. Comput. Electron. Agric. 2008;61(1):63–78. doi: 10.1016/j.compag.2007.05.008. [DOI] [Google Scholar]
  • 12.Zhang, Q., Karkee, M., Tabb, A.: The use of agricultural robots in orchard management. Burleigh Dodds Series in Agricultural Science, 187–214. 10.19103/as.2019.0056.14 (2019)
  • 13.Bechar, A., Vigneault, C.: Agricultural robots for field operations. Part 2: Operat. Syst. Biosyst. Eng., 110–128. 10.1016/j.biosystemseng.2016.11.004 (2017)
  • 14.Saiz-Rubio, V., Rovira-Más, F.: From smart farming towards agriculture 5.0: a review on crop data management. Agronomy (2)207. 10.3390/agronomy10020207 (2020)
  • 15.Baillie, C.P., Thomasson, J.A., Lobsey, C.R., McCarthy, C.L., Antille, D.L.: A review of the state of the art in agricultural automation part i: Sensing technologies for optimization of machine operation and farm inputs. 10.13031/aim.201801589 (2018)
  • 16.Fue, K.G., Porter, W.M., Barnes, E.M., Rains, G.C.: An extensive review of mobile agricultural robotics for field operations: Focus on cotton harvesting. AgriEngineering (1)150–174. 10.3390/agriengineering2010010 (2020)
  • 17.Defterli, S.G., Shi, Y., Xu, Y., Ehsani, R.: Review of robotic technology for strawberry production. Appl. Eng. Agric. (3)301–318. 10.13031/aea.32.11318 (2016)
  • 18.Aravind, K.R., Raja, P., Pérez-Ruiz, M.: Task-based agricultural mobile robots in arable farming: a review. Spanish Journal of Agricultural Research (1)16. 10.5424/sjar/2017151-9573 (2017)
  • 19.Tsolakis, N., Bechtsis, D., Bochtis, D.: Agros: a robot operating system based emulation tool for agricultural robotics. Agronomy (7)403. 10.3390/agronomy9070403 (2019)
  • 20.del Cerro, J., Cruz Ulloa, C., Barrientos, A., de León Rivas, J.: Unmanned aerial vehicles in agriculture: A survey. Agronomy 11(2). 10.3390/agronomy11020203 (2021)
  • 21.Mogili UR, Deepak BBVL. Review on application of drone systems in precision agriculture. Procedia Comput. Sci. 2018;133:502–509. doi: 10.1016/j.procs.2018.07.063. [DOI] [Google Scholar]
  • 22.Kim J, Kim S, Ju C, Son HI. Unmanned aerial vehicles in agriculture: A review of perspective of platform, control, and applications. IEEE Access. 2019;7:105100–105115. doi: 10.1109/ACCESS.2019.2932119. [DOI] [Google Scholar]
  • 23.Feng L, Chen S, Zhang C, Zhang Y, He Y. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping. Comput. Electron. Agric. 2021;182:106033. doi: 10.1016/j.compag.2021.106033. [DOI] [Google Scholar]
  • 24.Ju C, Kim J, Seol J, Son HI. A review on multirobot systems in agriculture. Comput. Electron. Agric. 2022;202:107336. doi: 10.1016/j.compag.2022.107336. [DOI] [Google Scholar]
  • 25.Ribeiro A, Conesa-Muñoz J. Multi-robot Systems for Precision Agriculture. Berlin: Springer; 2021. pp. 151–175. [Google Scholar]
  • 26.Bhandari, S., Raheja, A., Green, R.L., Do, D.: Towards collaboration between unmanned aerial and ground vehicles for precision agriculture. In: Proc. SPIE, Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II, vol. 10218, p. 1021806. 10.1117/12.2262049 (2017)
  • 27.Pretto A, Aravecchia S, Burgard W, Chebrolu N, Dornhege C, Falck T, Fleckenstein F, Fontenla A, Imperoli M, Khanna R, Liebisch F, Lottes P, Milioto A, Nardi D, Nardi S, Pfeifer J, Popović M, Potena C, Pradalier C, Rothacker-Feder E, Sa I, Schaefer A, Siegwart R, Stachniss C, Walter A, Winterhalter W, Wu X, Nieto J. Building an aerial–ground robotics system for precision farming: an adaptable solution. IEEE Robot Autom Mag. 2021;28(3):29–49. doi: 10.1109/MRA.2020.3012492. [DOI] [Google Scholar]
  • 28.Deusdado, P., Pinto, E., Guedes, M., Marques, F., Rodrigues, P., Lourenço, A., Mendonça, R., Silva, A., Santana, P., Corisco, J., Almeida, M., Portugal, L., Caldeira, R., Barata, J., Flores, L.: An aerial-ground robotic team for systematic soil and biota sampling in estuarine mudflats. In: Robot 2015: Second Iberian robotics conference, pp. 15–26. Springer. 10.1007/978-3-319-27149-1_2 (2016)
  • 29.Edmonds, M., Yigit, T., Yi, J.: Resolution-optimal, energy-constrained mission planning for unmanned aerial/ground crop inspections. In: 2021 IEEE 17Th International conference on automation science and engineering (CASE), pp. 2235–2240. 10.1109/CASE49439.2021.9551394 (2021)
  • 30.Li, P., Lee, S., Hsu, H.-Y.: Review on fruit harvesting method for potential use of automatic fruit harvesting systems. Procedia Eng, 351–366. 10.1016/j.proeng.2011.11.2514 (2011)
  • 31.Coppock, G.E., Hedden, S.L.: Design and development of a tree-shaker harvest system for citrus fruit. Transactions of the ASAE (3)339–342. 10.13031/2013.39404 (1968)
  • 32.Sumner, H.R., Hedden, S.L., Alfred, L.: Harvesting oranges with a full-powered positiong limb shaker. Floriad State Horticultural Society, 117–120 (1975)
  • 33.Torregrosa A, Ortí E, Martín B, Gil J, Ortiz C. Mechanical harvesting of oranges and mandarins in spain. Biosyst. Eng. 2009;104(1):18–24. doi: 10.1016/j.biosystemseng.2009.06.005. [DOI] [Google Scholar]
  • 34.Silwal A, Davidson JR, Karkee M, Mo C, Zhang Q, Lewis K. Design, integration, and field evaluation of a robotic apple harvester. J Field Robot. 2017;34(6):1140–1159. doi: 10.1002/rob.21715. [DOI] [Google Scholar]
  • 35.Baeten J, Donné K, Boedrij S, Beckers W, Claesen E. Autonomous fruit picking machine: a robotic apple harvester. Field and Service Robotics. 2008;42:531–539. doi: 10.1007/978-3-540-75404-6_51. [DOI] [Google Scholar]
  • 36.Bulanon DM, Kataoka T. A fruit detection system and an end effector for robotic harvesting of fuji apples. Agric. Eng. Int.: CIGR J. 2010;12:203–210. [Google Scholar]
  • 37.Onishi Y, Yoshida T, Kurita H, Fukao T, Arihara H, Iwai A. An automated fruit harvesting robot by using deep learning. ROBOMECH J. 2019;6(13):2–9. [Google Scholar]
  • 38.De-An Z, Jidong L, Wei J, Ying Z, Yu C. Design and control of an apple harvesting robot. Biosyst. Eng. 2011;110(2):112–122. doi: 10.1016/j.biosystemseng.2011.07.005. [DOI] [Google Scholar]
  • 39.Davidson, J.R., Hohimer, C.J., Mo, C., Karkee, M.: Dual robot coordination for apple harvesting. 110(2)112–122. 10.13031/aim.201700567 (2017)
  • 40.Muscato G, Prestifilippo M, Abbate N, Rizzuto I. A prototype of an orange picking robot: past history, the new robot and experimental results. Ind. Robot. 2005;32(2):128–138. doi: 10.1108/01439910510582255. [DOI] [Google Scholar]
  • 41.Klaoudatos, D.S., Moulianitis, V.C., Aspragathos, N.A.: Development of an experimental strawberry harvesting robotic system. 2,437–445. 10.5220/0007934004370445 (2019)
  • 42.Xiong Y, Peng C, Grimstad L, From PJ, Isler V. Development and field evaluation of a strawberry harvesting robot with a cable-driven gripper. Comput. Electron. Agric. 2019;157:392–402. doi: 10.1016/j.compag.2019.01.009. [DOI] [Google Scholar]
  • 43.Xiong Y, Ge Y, Grimstad L, From PJ. An autonomous strawberry-harvesting robot: design, development, integration, and field evaluation. J. Field Robot. 2020;37(2):202–224. doi: 10.1002/rob.21889. [DOI] [Google Scholar]
  • 44.Hayashi S, Yamamoto S, Tsubota S, Ochiai Y, Kobayashi K, Kamata J, Kurita M, Kurita M, Inazumi H, Rajendra P. Automation technologies for strawberry harvesting and packing operations in japan. J. Berry Res. 2014;4(1):19–27. doi: 10.3233/JBR-140065. [DOI] [Google Scholar]
  • 45.Hayashi S, Shigematsu K, Yamamoto S, Kobayashi K, Kohno Y, Kamata J, Kurita M. Evaluation of a strawberry-harvesting robot in a field test. Biosyst. Eng. 2010;105(2):160–171. doi: 10.1016/j.biosystemseng.2009.09.011. [DOI] [Google Scholar]
  • 46.Tanigaki K, Fujiura T, Akase A, Imagawa J. Cherry-harvesting robot. Comput. Electron. Agric. 2008;63(1):65–72. doi: 10.1016/j.compag.2008.01.018. [DOI] [Google Scholar]
  • 47.Edan Y, Rogozin D, Flash T, Miles GE. Robotic melon harvesting. IEEE Trans. Robot. Autom. 2000;16(6):831–835. doi: 10.1109/70.897793. [DOI] [Google Scholar]
  • 48.Bechar A, Edan Y. Human-robot collaboration for improved target recognition of agricultural robots. Ind. Robot. 2003;30(5):432–436. doi: 10.1108/01439910310492194. [DOI] [Google Scholar]
  • 49.Umeda M, Kubota S, Iida M. Development of ’stork’, a watermelon-harvesting robot. Artif. Life Robot. 1999;3:143–147. doi: 10.1007/BF02481130. [DOI] [Google Scholar]
  • 50.Zion B, Mann M, Levin D, Shilo A, Rubinstein D, Shmulevich I. Harvest-order planning for a multiarm robotic harvester. Comput. Electron. Agric. 2014;103:75–81. doi: 10.1016/j.compag.2014.02.008. [DOI] [Google Scholar]
  • 51.Ceres R, Pons JL, Jimenez AR, Martin JM, Calderon L. Design and implementation of an aided fruit-harvesting robot(agribot) Ind. Robot. 1998;25(5):337–346. doi: 10.1108/01439919810232440. [DOI] [Google Scholar]
  • 52.Yaguchi, H., Nagahama, K., Hasegawa, T., Inaba, M.: Development of an autonomous tomato harvesting robot with rotational plucking gripper. 3, 652–657. 10.1109/IROS.2016.7759122 (2016)
  • 53.Wang LL, Bo Z, Jinwei F, Xiaoan H, Shu W, Yashuo L, Zhou Q, Chongfeng W. Development of a tomato harvesting robot used in greenhouse. Int. J. Agric. Biol. Eng. 2017;10(4):140–149. [Google Scholar]
  • 54.Feng Q, Zou W, Fan P, Zhang C, Wang X. Design and test of robotic harvesting system for cherry tomato. Int. J. Agric. Biol. Eng. 2018;11(1):96–100. [Google Scholar]
  • 55.Zhao Y, Gong L, Liu C, Huang Y. Dual-arm robot design and testing for harvesting tomato in greenhouse. IFAC-PapersOnLine. 2016;49(16):161–165. doi: 10.1016/j.ifacol.2016.10.030. [DOI] [Google Scholar]
  • 56.Ling X, Zhao Y, Gong L, Liu C, Wang T. Dual-arm cooperation and implementing for robotic harvesting tomato using binocular vision. Robot. Auton. Syst. 2019;114:134–143. doi: 10.1016/j.robot.2019.01.019. [DOI] [Google Scholar]
  • 57.Henten EJV, Hemming J, van Tuijl BAJ, Kornet JG, Meuleman J, Bontsema J, van Os EA. An autonomous robot for harvesting cucumbers in greenhouses. Auton. Robot. 2002;13:241–258. doi: 10.1023/A:1020568125418. [DOI] [Google Scholar]
  • 58.Herck, L.V., Kurtser, P., Wittemans, L., Edan, Y.: Crop design for improved robotic harvesting: a case study of sweet pepper harvesting. Biosyst. Eng. 192, 294–308. 10.1016/j.biosystemseng.2020.01.021 (2020)
  • 59.Bloch V, Degani A, Bechar A. A methodology of orchard architecture design for an optimal harvesting robot. Biosyst. Eng. 2018;166:126–137. doi: 10.1016/j.biosystemseng.2017.11.006. [DOI] [Google Scholar]
  • 60.Hayashi S, Ganno K, Ishii Y, Tanaka I. Robotic harvesting system for eggplants. Japan Agricultural Research Quarterly: JARQ. 2002;36(3):163–168. doi: 10.6090/jarq.36.163. [DOI] [Google Scholar]
  • 61.Foglia MM, Reina G. Agricultural robot for radicchio harvesting. J. Field Robot. 2006;23(6-7):363–377. doi: 10.1002/rob.20131. [DOI] [Google Scholar]
  • 62.Irie, N., Taguchi, N., Horie, T., Ishimatsu, T.: Asparagus harvesting robot coordinated with 3-d vision sensor, 1–6. 10.1109/ICIT.2009.4939556 (2009)
  • 63.Zhang T, Huang Z, You W, Lin J, Tang X, Huang H. An autonomous fruit and vegetable harvester with a low-cost gripper using a 3d sensor. Sensors. 2020;20(1):93. doi: 10.3390/s20010093. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 64.Kitamura, S., Oka, K.: Recognition and cutting system of sweet pepper for picking robot in greenhouse horticulture 1807–1812. 10.1109/icma.2005.1626834 (2005)
  • 65.Lehnert C, English A, McCool C, Tow AW, Perez T. Autonomous sweet pepper harvesting for protected cropping systems. IEEE Robotics and Automation Letters. 2017;2(2):872–879. doi: 10.1109/LRA.2017.2655622. [DOI] [Google Scholar]
  • 66.Arad B, Balendonck J, Barth R, Ben-Shahar O, Edan Y, Hellström T, Hemming J, Kurtser P, Ringdahl O, Tielen T, van Tuijl B. Development of a sweet pepper harvesting robot. Journal of Field Robotics. 2020;37(6):1027–1039. doi: 10.1002/rob.21937. [DOI] [Google Scholar]
  • 67.Quaglia G, Visconte C, Scimmi LS, Melchiorre M, Cavallone P, Pastorelli S. Design of a ugv powered by solar energy for precision agriculture. Robotics. 2020;9(1):13. doi: 10.3390/robotics9010013. [DOI] [Google Scholar]
  • 68.Quaglia G, Cavallone P, Visconte C. Agri-q: Agriculture ugv for monitoring and drone landing. IFToMM Symposium on Mechanism Design for Robotics. 2019;66:413–423. [Google Scholar]
  • 69.Quaglia G, Visconte C, Scimmi LS, Melchiorre M, Cavallone P, Pastorelli S. Design of the positioning mechanism of an unmanned ground vehicle for precision agriculture. IFToMM World Congress on Mechanism and Machine Science. 2019;73:3531–3540. doi: 10.1007/978-3-030-20131-9_348. [DOI] [Google Scholar]
  • 70.Monta, M., Kondo, N., Shibano, Y.: Agricultural robot in grape production system. 3, 2504–2509. 10.1109/ROBOT.1995.525635 (1995)
  • 71.Ogawa Y, Kondo NA, Monta MI, Sakaeshibusawa N. Spraying robot for grape production. Field and Service Robotics. 2006;24:539–548. doi: 10.1007/10991459_52. [DOI] [Google Scholar]
  • 72.Oberti R, Marchi M, Tirelli P, Calcante A, Iriti M, Hočevar M, Baur J, Pfaff J, Schütz C, Ulbrich H. Selective spraying of grapevine’s diseases by a modular agricultural robot. J. Agric. Eng. 2013;44(s2):149–153. [Google Scholar]
  • 73.Bontsema, J., Hemming, J., Pekkeriet, E., Saeys, W., Edan, Y., Shapiro, A., Hocevar, M., Oberti, R., Armada, M., Ulbrich, H., Baur, J., Debilde, B., Best, S., Evain, S., Gauchel, W., Hellström, T., Ringdahl, O.: Crops: Clever robots for crops. Engineering & Technology Reference, 1–11. 10.1049/etr.2015.0015 (2015)
  • 74.Roure, F., Moreno, G., Soler, M., Faconti, D., Serrano, D., Astolfi, P., Bardaro, G., Gabrielli, A., Bascetta, L., Matteucci, M.: Grape: Ground robot for vineyard monitoring and protection. 249–260, 10.1007/978-3-319-70833-1_21 (2017)
  • 75.Reiser D, Sehsah ES, Bumann O, Morhard J, Griepentrog HW. Development of an autonomous electric robot implement for intra-row weeding in vineyards. Agriculture. 2019;9(1):1–12. doi: 10.3390/agriculture9010018. [DOI] [Google Scholar]
  • 76.Santos, F.N.D., Sobreira, H.M., Campos, D.F., Morais, R., Moreira, A.P., Contente, O.: Towards a reliable monitoring robot for mountain vineyards. 37–43, 10.11009/2015.21 (2015)
  • 77.Botterill T, Paulin S, Green R, Williams S, Lin J, Saxton V, Mills S, Chen X, Corbett-Davies S. A robot system for pruning grape vines. J. Field Robot. 2017;34(6):1100–1122. doi: 10.1002/rob.21680. [DOI] [Google Scholar]
  • 78.Tang, Y., Chen, M., Wang, C., Luo, L., Li, J., Lian, G., Zou, X.: Recognition and localization methods for vision-based fruit picking robots: A review. Frontiers in Plant Science 11. 10.3389/fpls.2020.00510(2020) [DOI] [PMC free article] [PubMed]
  • 79.Saleem MH, Potgieter J, Arif KM. Automation in agriculture by machine and deep learning techniques: a review of recent developments. Precision Agric. 2021;22:2053–2091. doi: 10.1007/s11119-021-09806-x. [DOI] [Google Scholar]
  • 80.Vitzrabin E, Edan Y. Adaptive thresholding with fusion using a rgbd sensor for red sweet-pepper detection. Biosys. Eng. 2016;146:45–56. doi: 10.1016/j.biosystemseng.2015.12.002. [DOI] [Google Scholar]
  • 81.Blok PM, Barth R, Van den Berg W. Machine vision for a selective broccoli harvesting robot. IFAC-PapersOnLine. 2016;49(16):66–71. doi: 10.1016/j.ifacol.2016.10.013. [DOI] [Google Scholar]
  • 82.Kurtser P, Ringdahl O, Rotstein N, Berenstein R, Edan Y. In-field grape cluster size assessment for vine yield estimation using a mobile robot and a consumer level rgb-d camera. IEEE Robot. Automation Lett. 2020;5(2):2031–2038. doi: 10.1109/LRA.2020.2970654. [DOI] [Google Scholar]
  • 83.Nuske S, Wilshusen K, Achar S, Yoder L, Narasimhan S, Singh S. Automated visual yield estimation in vineyards. J. Field Robot. 2014;31(5):837–860. doi: 10.1002/rob.21541. [DOI] [Google Scholar]
  • 84.Luo L, Tang Y, Lu Q, Chen X, Zhang P, Zou X. A vision methodology for harvesting robot to detect cutting points on peduncles of double overlapping grape clusters in a vineyard. Comput. Industry. 2018;99:130–139. doi: 10.1016/j.compind.2018.03.017. [DOI] [Google Scholar]
  • 85.Ostovar A, Ringdahl O, Hellstrom T. Adaptive image thresholding of yellow peppers for a harvesting robot. Robotics. 2018;7(1):1–16. doi: 10.3390/robotics7010011. [DOI] [Google Scholar]
  • 86.Altaheri H, Alsulaiman M, Muhammad G. Date fruit classification for robotic harvesting in a natural environment using deep learning. IEEE Access. 2019;7:117115–117133. doi: 10.1109/ACCESS.2019.2936536. [DOI] [Google Scholar]
  • 87.Ge Y, Xiong Y, Tenorio GL, From PJ. Fruit localization and environment perception for strawberry harvesting robots. IEEE Access. 2019;7:147642–147652. doi: 10.1109/ACCESS.2019.2946369. [DOI] [Google Scholar]
  • 88.Lehnert, C., Sa, I., McCool, C., Upcroft, B., Perez, T.: Sweet pepper pose detection and grasping for automated crop harvesting, 2428–2434. 10.1109/ICRA.2016.7487394 (2016)
  • 89.Kang, H., Zhou, H., Chen, C.: Visual perception and modeling for autonomous apple harvesting. IEEE Access, 62151–62163. 10.1109/ACCESS.2020.2984556 (2020)
  • 90.Zapotezny-Anderson PM, Lehnert C. Towards active robotic vision in agriculture: a deep learning approach to visual servoing in occluded and unstructured protected cropping environments. IFAC-PapersOnLine. 2019;52(30):120–125. doi: 10.1016/j.ifacol.2019.12.508.
  • 91.Arad B, Kurtser P, Barnea E, Harel B, Edan Y, Ben-Shahar O. Controlled lighting and illumination-independent target detection for real-time cost-efficient applications. The case study of sweet pepper robotic harvesting. Sensors. 2019;19(6):1390. doi: 10.3390/s19061390.
  • 92.Tian Y, Duan H, Luo R, Zhang Y, Jia W, Lian J, Zheng Y, Ruan C, Li C. Fast recognition and location of target fruit based on depth information. IEEE Access. 2019;7:170553–170563. doi: 10.1109/ACCESS.2019.2955566.
  • 93.Kurtser P, Ringdahl O, Rotstein N, Andreasson H. Pointnet and geometric reasoning for detection of grape vines from single frame rgb-d data in outdoor conditions. Proceedings of the Northern Lights Deep Learning Workshop 2020. 2020;1:6. doi: 10.7557/18.5155.
  • 94.McCool, C., Sa, I., Dayoub, F., Lehnert, C., Perez, T., Upcroft, B.: Visual detection of occluded crop: For automated harvesting. 2506–2512. 10.1109/ICRA.2016.7487405 (2016)
  • 95.Liu, Y.P., Yang, C.H., Ling, H., Mabu, S., Kuremoto, T.: A visual system of citrus picking robot using convolutional neural networks. 344–349. 10.1109/ICSAI.2018.8599325 (2018)
  • 96.Lehnert, C., Tsai, D., Eriksson, A., McCool, C.: 3d move to see: Multiperspective visual servoing towards the next best view within unstructured and occluded environments. 3890–3897. 10.1109/IROS40897.2019.8967918 (2019)
  • 97.Zemmour E, Kurtser P, Edan Y. Automatic parameter tuning for adaptive thresholding in fruit detection. Sensors. 2019;19(9):2130. doi: 10.3390/s19092130.
  • 98.Vitzrabin E, Edan Y. Changing task objectives for improved sweet pepper detection for robotic harvesting. IEEE Robot. Automat. Lett. 2016;1(1):578–584. doi: 10.1109/LRA.2016.2523553.
  • 99.Pothen, Z.S., Nuske, S.: Texture-based fruit detection via images using the smooth patterns on the fruit. 5171–5176. 10.1109/ICRA.2016.7487722 (2016)
  • 100.Bargoti, S., Underwood, J.P.: Image segmentation for fruit detection and yield estimation in apple orchards. Journal of Field Robotics 34(6) (2016). 10.1002/rob.21699
  • 101.Lin G, Tang Y-C, Zou X, Xiong J, Li J. Guava detection and pose estimation using a low-cost rgb-d sensor in the field. Sensors. 2019;19(2):428. doi: 10.3390/s19020428.
  • 102.Fawakherji, M., Youssef, A., Bloisi, D., Pretto, A., Nardi, D.: Crop and weeds classification for precision agriculture using context-independent pixel-wise segmentation, 146–152. 10.1109/IRC.2019.00029 (2019)
  • 103.Häni N, Roy P, Isler V. A comparative study of fruit detection and counting methods for yield mapping in apple orchards. J. Field Robot. 2020;37(2):263–282. doi: 10.1002/rob.21902.
  • 104.Liu, X., Chen, S., Aditya, S., Sivakumar, N., Dcunha, S., Qu, C., Taylor, C.J., Das, J., Kumar, V.: Robust fruit counting: Combining deep learning, tracking, and structure from motion, 1045–1052. 10.1109/IROS.2018.8594239 (2018)
  • 105.Lin G, Tang Y, Zou X, Xiong J, Fang Y. Color-, depth-, and shape-based 3d fruit detection. Precis. Agric. 2019;21:1–17. doi: 10.1007/s11119-019-09654-w.
  • 106.McCool C, Perez T, Upcroft B. Mixtures of lightweight deep convolutional neural networks: Applied to agricultural robotics. IEEE Robot. Autom. Lett. 2017;2(3):1344–1351. doi: 10.1109/LRA.2017.2667039.
  • 107.Santos, T.T., Souza, L., dos Santos, A.A., Avila, S.: Grape detection, segmentation and tracking using deep neural networks and three dimensional association. Comput. Electron. Agric., 170. 10.1016/j.compag.2020.105247 (2020)
  • 108.Koirala A, Walsh K, Wang Z, McCarthy C. Deep learning for realtime fruit detection and orchard fruit load estimation: benchmarking of mangoyolo. Precis. Agric. 2019;20:1107–1135. doi: 10.1007/s11119-019-09642-0.
  • 109.Tian Y, Yang G, Wang Z, Li E, Liang Z. Detection of apple lesions in orchards based on deep learning methods of cyclegan and yolov3-dense. Journal of Sensors. 2019;2019:7630926.
  • 110.Gonzalez S, Arellano C, Tapia JE. Deepblueberry: Quantification of blueberries in the wild using instance segmentation. IEEE Access. 2019;7:105776–105788. doi: 10.1109/ACCESS.2019.2933062.
  • 111.Ganesh P, Volle K, Burks T, Mehta S. Deep orange: Mask r-cnn based orange detection and segmentation. IFAC-PapersOnLine. 2019;52(30):70–75. doi: 10.1016/j.ifacol.2019.12.499.
  • 112.Sa, I., Ge, Z., Dayoub, F., Upcroft, B., Perez, T., McCool, C.: Deepfruits: A fruit detection system using deep neural networks. Sensors (Basel Switzerland) 16(8). 10.3390/s16081222 (2016)
  • 113.Gene-Mola J, Vilaplana V, Rosell-Polo JR, Morros JR, Ruiz-Hidalgo J, Gregorio E. Multi-modal deep learning for fuji apple detection using rgb-d cameras and their radiometric capabilities. Comput. Electron. Agric. 2019;162:689–698. doi: 10.1016/j.compag.2019.05.016.
  • 114.Hasan, M.M., Chopin, J.P., Laga, H., Miklavcic, S.J.: Detection and analysis of wheat spikes using convolutional neural networks. Plant Methods 14(100). 10.1186/s13007-018-0366-8 (2018)
  • 115.Kang, H., Chen, C.: Fast implementation of real-time fruit detection in apple orchards using deep learning. Comput. Electron. Agric., 168. 10.1016/j.compag.2019.105108 (2020)
  • 116.Kang, H., Chen, C.: Fruit detection and segmentation for apple harvesting using visual sensor in orchards. Sensors (Basel Switzerland) 19(20). 10.3390/s19204599 (2019)
  • 117.Kirk R, Cielniak G, Mangan M. L*a*b*fruits: a rapid and robust outdoor fruit detection system combining bio-inspired features with one-stage deep learning networks. Sensors. 2020;20(1):275. doi: 10.3390/s20010275.
  • 118.Dias PA, Tabb A, Medeiros H. Apple flower detection using deep convolutional networks. Comput. Ind. 2018;99:17–28. doi: 10.1016/j.compind.2018.03.010.
  • 119.Bayati, M., Fotouhi, R.: A mobile robotic platform for crop monitoring. Advances in Robotics and Automation 7(1). 10.4172/2168-9695.1000186 (2018)
  • 120.Virlet N, Sabermanesh K, Sadeghi-Tehran P, Hawkesford MJ. Field scanalyzer: an automated robotic field phenotyping platform for detailed crop monitoring. Funct. Plant Biol. 2017;44(1):143–153. doi: 10.1071/FP16163.
  • 121.Shafiekhani A, Kadam S, Fritschi FB, DeSouza GN. Vinobot and vinoculer: Two robotic platforms for high-throughput field phenotyping. Sensors. 2017;17(1):214. doi: 10.3390/s17010214.
  • 122.Gutiérrez S, Fernández-Novales J, Diago MP, Tardaguila J. On-the-go hyperspectral imaging under field conditions and machine learning for the classification of grapevine varieties. Frontiers in Plant Sci. 2018;9:1102. doi: 10.3389/fpls.2018.01102.
  • 123.Hu K, Coleman GRY, Zeng S, Wang Z, Walsh M. Graph weeds net: a graph-based deep learning method for weed recognition. Comput. Electron. Agric. 2020;174:105520. doi: 10.1016/j.compag.2020.105520.
  • 124.Christiansen P, Nielsen L, Steen K, Jorgensen R, Karstoft H. Deepanomaly: Combining background subtraction and deep learning for detecting obstacles and anomalies in an agricultural field. Sensors. 2016;16(11):1904. doi: 10.3390/s16111904.
  • 125.Isokane, T., Okura, F., Ide, A., Matsushita, Y., Yagi, Y.: Probabilistic plant modeling via multi-view image-to-image translation. 2906–2915, 10.1109/CVPR.2018.00307 (2018)
  • 126.Bietresato M, Carabin G, Vidoni R, Gasparetto A, Mazzetto F. Evaluation of a lidar-based 3d-stereoscopic vision system for crop-monitoring applications. Comput. Electron. Agric. 2016;124:1–13. doi: 10.1016/j.compag.2016.03.017.
  • 127.Williams D, Britten A, McCallum S, Jones H, Aitkenhead M, Karley A, Loades K, Prashar A, Graham J. A method for automatic segmentation and splitting of hyperspectral images of raspberry plants collected in field conditions. Plant methods. 2017;13(74):1–12. doi: 10.1186/s13007-017-0226-y.
  • 128.Daniels AJ, Poblete-Echeverría C, Opara UL, Nieuwoudt HH. Measuring internal maturity parameters contactless on intact table grape bunches using nir spectroscopy. Front. Plant Sci. 2019;10:1517. doi: 10.3389/fpls.2019.01517.
  • 129.Costa DDS, Mesa NFO, Freire MS, Ramos RP, Mederos BJT. Development of predictive models for quality and maturation stage attributes of wine grapes using vis-nir reflectance spectroscopy. Postharvest Biology and Technol. 2019;150:166–178. doi: 10.1016/j.postharvbio.2018.12.010.
  • 130.Porep, J.U., Mattes, A., Nikfardjam, M.S., Kammerer, D.R., Carle, R.: Implementation of an on-line near infrared/visible (nir/vis) spectrometer for rapid quality assessment of grapes upon receival at wineries. Australian Journal of Grape and Wine Research 21(1). 10.1111/ajgw.12120 (2015)
  • 131.Giovenzana V, Beghi R, Malegori C, Civelli R, Guidetti R. Wavelength selection with a view to a simplified handheld optical system to estimate grape ripeness. Am. J. Enol. Vitic. 2013;65:117–123. doi: 10.5344/ajev.2013.13024.
  • 132.Wendel A, Underwood J, Walsh K. Maturity estimation of mangoes using hyperspectral imaging from a ground based mobile platform. Comput. Electron. Agric. 2018;155:298–313. doi: 10.1016/j.compag.2018.10.021.
  • 133.Gutiérrez S, Wendel A, Underwood J. Ground based hyperspectral imaging for extensive mango yield estimation. Comput. Electron. Agric. 2019;157:126–135. doi: 10.1016/j.compag.2018.12.041.
  • 134.Gutiérrez S, Wendel A, Underwood J. Spectral filter design based on in-field hyperspectral imaging and machine learning for mango ripeness estimation. Comput. Electron. Agric. 2019;164(1):104890. doi: 10.1016/j.compag.2019.104890.
  • 135.Nasirahmadi, A., Wilczek, U., Hensel, O.: Sugar beet detection during harvesting using different convolutional neural network models. Agriculture 11(11). 10.3390/agriculture11111111 (2021)
  • 136.Tian, S., Wang, S., Xu, H.: Early detection of freezing damage in oranges by online vis/nir transmission coupled with diameter method and deep 1d-cnn. Comput. Electron. Agric., 193. 10.1016/j.compag.2021.106638 (2022)
  • 137.Jin X, Jie L, Wang S, Qi HJ, Li SW. Classifying wheat hyperspectral pixels of healthy heads and fusarium head blight disease using a deep neural network in the wild field. Remote Sens. 2018;10(3):395. doi: 10.3390/rs10030395.
  • 138.Mack J, Lenz C, Teutrine J, Steinhage V. High-precision 3d detection and reconstruction of grapes from laser range data for efficient phenotyping based on supervised learning. Comput. Electron. Agric. 2017;135:300–311. doi: 10.1016/j.compag.2017.02.017.
  • 139.Hafeez, A., Aslam Husain, M., Singh, S.P., Chauhan, A., Khan, M.T., Kumar, N., Chauhan, A., Soni, S.K.: Implementation of drone technology for farm monitoring and pesticide spraying: A review. Information Processing in Agriculture. 10.1016/j.inpa.2022.02.002 (2022)
  • 140.Esposito, M., Crimaldi, M., Cirillo, V., et al.: Drone and sensor technology for sustainable weed management: a review. Chem. Biol. Technol. Agric. 8(18). 10.1186/s40538-021-00217-8 (2021)
  • 141.Su J, et al. Aerial visual perception in smart farming: Field study of wheat yellow rust monitoring. IEEE Trans Industr Inform. 2021;17(3):2242–2249. doi: 10.1109/TII.2020.2979237.
  • 142.Crimaldi, M., Cristiano, V., De Vivo, A., Isernia, M., Ivanov, P., Sarghini, F.: Neural network algorithms for real time plant diseases detection using uavs. Innovative Biosystems Engineering for Sustainable Agriculture, 67. 10.1007/978-3-030-39299-4_89 (2020)
  • 143.Pflanz, M., Schirrmann, M., Nordmeyer, H.: Drone based weed monitoring with an image feature classifier. Julius-Kühn-Archiv, 84 (2018)
  • 144.Liao, J., Babiker, I., Xie, W.-F., Li, W., Cao, L.: Dandelion segmentation with background transfer learning and rgb-attention module. Comput. Electron. Agric., 202. 10.1016/j.compag.2022.107355 (2022)
  • 145.Sa, I., Popović, M., Khanna, R., Chen, Z., Lottes, P., Liebisch, F., Nieto, J., Stachniss, C., Walter, A., Siegwart, R.: Weedmap: a large-scale semantic weed mapping framework using aerial multispectral imaging and deep neural network for precision farming. Remote Sensing 10(9). 10.3390/rs10091423 (2018)
  • 146.Mazzia, V., Comba, L., Khaliq, A., Chiaberge, M., Gay, P.: Uav and machine learning based refinement of a satellite-driven vegetation index for precision agriculture. Sensors 20(9). 10.3390/s20092530 (2020)
  • 147.Candiago S, Remondino F, De Giglio M, Dubbini M, Gattelli M. Evaluating multispectral images and vegetation indices for precision farming applications from uav images. Remote Sens. 2015;7(4):4026–4047. doi: 10.3390/rs70404026.
  • 148.Che’Ya, N.N., Dunwoody, E., Gupta, M.: Assessment of weed classification using hyperspectral reflectance and optimal multispectral uav imagery. Agronomy 11(7). 10.3390/agronomy11071435 (2021)
  • 149.Mink R, Linn AI, Santel H-J, Gerhards R. Sensor-based evaluation of maize (zea mays) and weed response to post-emergence herbicide applications of isoxaflutole and cyprosulfamide applied as crop seed treatment or herbicide mixing partner. Pest Manag. Sci. 2019;76(5):1856–1865. doi: 10.1002/ps.5715.
  • 150.Zaidner G, Shapiro A. A novel data fusion algorithm for low-cost localisation and navigation of autonomous vineyard sprayer robots. Biosyst. Eng. 2016;146:133–148. doi: 10.1016/j.biosystemseng.2016.05.002.
  • 151.Gan H, Lee WS. Development of a navigation system for a smart farm. IFAC-PapersOnLine. 2018;51(17):1–4. doi: 10.1016/j.ifacol.2018.08.051.
  • 152.Biber, P., Weiss, U., Dorna, M., Albert, A.: Navigation system of the autonomous agricultural robot ‘BoniRob’. Workshop on Agricultural Robotics: Enabling Safe, Efficient, and Affordable Robots for Food Production (collocated with IROS 2012), Portugal (2012)
  • 153.Gu Y, Li Z, Zhang Z, Li J, Chen L. Path tracking control of field information- collecting robot based on improved convolutional neural network algorithm. Sensors. 2020;20(3):797. doi: 10.3390/s20030797.
  • 154.Ouellette, R., Hirasawa, K.: Mayfly: a small mapping robot for japanese office environments. In: IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 880–885. 10.1109/AIM.2008.4601777 (2008)
  • 155.Zhang, J., Maeta, S., Bergerman, M., Singh, S.: Mapping orchards for autonomous navigation. ASABE and CSBE/SCGAB Annual International Meeting, St. Joseph, Michigan, American Society of Agricultural and Biological Engineers. 10.13031/aim.20141838567 (2014)
  • 156.Libby, J., Kantor, G.: Deployment of a point and line feature localization system for an outdoor agriculture vehicle, pp 1565–1570. 10.1109/ICRA.2011.5980430 (2011)
  • 157.Jin J, Tang L. Corn plant sensing using real-time stereo vision. J. Field Robot. 2009;26(6):591–608. doi: 10.1002/rob.20293.
  • 158.Cadena C, Carlone L, Carrillo H, Latif Y, Scaramuzza D, Neira J, Reid I, Leonard JJ. Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age. IEEE Trans. Robot. 2016;32(6):1309–1332. doi: 10.1109/TRO.2016.2624754.
  • 159.Cheein FA, Steiner G, Paina GP, Carelli R. Optimized eif-slam algorithm for precision agriculture mapping based on stems detection. Comput. Electron. Agric. 2011;78(2):195–207. doi: 10.1016/j.compag.2011.07.007.
  • 160.Pierzchała M, Giguère P, Astrup R. Mapping forests using an unmanned ground vehicle with 3d lidar and graph-slam. Comput. Electron. Agric. 2018;145:217–225. doi: 10.1016/j.compag.2017.12.034.
  • 161.Nguyen TT, Kayacan E, Baedemaeker JD, Saeys W. Task and motion planning for apple harvesting robot. IFAC Proc. 2013;46(18):247–252. doi: 10.3182/20130828-2-SF-3019.00063.
  • 162.Hornung A, Wurm KM, Bennewitz M, Stachniss C, Burgard W. Octomap: an efficient probabilistic 3d mapping framework based on octrees. Auton. Robot. 2013;34(3):189–206. doi: 10.1007/s10514-012-9321-0.
  • 163.Mehta SS, Burks TF. Vision-based control of robotic manipulator for citrus harvesting. Comput. Electron. Agric. 2014;102:146–158. doi: 10.1016/j.compag.2014.01.003.
  • 164.Barth R, Hemming J, van Henten EJ. Design of an eye-in-hand sensing and servo control framework for harvesting robotics in dense vegetation. Biosyst. Eng. 2016;146:71–84. doi: 10.1016/j.biosystemseng.2015.12.001.
  • 165.Ringdahl O, Kurtser P, Edan Y. Evaluation of approach strategies for harvesting robots: Case study of sweet pepper harvesting. J. Intell. Robot. Syst. 2019;95(1):149–164. doi: 10.1007/s10846-018-0892-7.
  • 166.Ringdahl, O., Kurtser, P., Edan, Y.: Strategies for selecting best approach direction for a sweet-pepper harvesting robot. 10454, 516–525. 10.1007/978-3-319-64107-2_41 (2017)
  • 167.Camacho, J.D.G., From, P.J., Leite, A.C.: A visual servoing approach for robotic fruit harvesting in the presence of parametric uncertainties. XXII Congresso Brasileiro de Automática, Campinas/SP, Brasil: SBA. 10.20906/cps/cba2018-0541 (2018)
  • 168.Bu L, Hu G, Chen C, Sugirbay A, Chen J. Experimental and simulation analysis of optimum picking patterns for robotic apple harvesting. Sci. Hortic. 2020;261:108937. doi: 10.1016/j.scienta.2019.108937.
  • 169.Wei, J., Yi, D., Bo, X., Guangyu, C., Dean, Z.: Adaptive variable parameter impedance control for apple harvesting robot compliant picking. Complexity, 1–15. 10.1155/2020/4812657 (2020)
  • 170.Roshanianfard A, Noguchi N. Characterization of pumpkin for a harvesting robot. IFAC-PapersOnLine. 2018;51(17):23–30. doi: 10.1016/j.ifacol.2018.08.056.
  • 171.Mehta SS, MacKunis W, Burks TF. Nonlinear robust visual servo control for robotic citrus harvesting. IFAC Proceedings. 2014;47(3):8110–8115.
  • 172.Vasconez JP, Kantor GA, Cheein FAA. Human–robot interaction in agriculture: a survey and current challenges. Biosyst. Eng. 2019;179:35–48. doi: 10.1016/j.biosystemseng.2018.12.005.
  • 173.Bechar A, Vigneault C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016;149:94–111. doi: 10.1016/j.biosystemseng.2016.06.014.
  • 174.Adamides, G.: User interfaces for human-robot interaction: application on a semi-autonomous agricultural robot sprayer. Ph.D. dissertation (2016)
  • 175.Bergerman M, Maeta SM, Zhang J, Freitas GM, Hamner B, Singh S, Kantor G. Robot farmers: Autonomous orchard vehicles help tree fruit production. IEEE Robotics Automation Magazine. 2015;22(1):54–63. doi: 10.1109/MRA.2014.2369292.
  • 176.Cullen RH, Smarr C-A, Serrano-Baquero D, McBride SE, Beer JM, Rogers WA. The smooth (tractor) operator: Insights of knowledge engineering. Appl. Ergon. 2012;43(6):1122–30. doi: 10.1016/j.apergo.2012.04.002.
  • 177.Jin X, Zheng B, Pei Y, Li H. A method to estimate operator’s mental workload in multiple information presentation environment of agricultural vehicles. Engineering Psychology and Cognitive Ergonomics: Performance Emotion and Situation Awareness. 2017;10275:3–20.
  • 178.Gomez-Gil J, San-Jose-Gonzalez I, Nicolas-Alonso LF, Alonso-Garcia S. Steering a tractor by means of an emg-based human-machine interface. Sensors. 2011;11(7):7110–26. doi: 10.3390/s110707110.
  • 179.Szczepaniak J, Tanas W, Pawlowski T, Kromulski J. Modelling of agricultural combination driver behaviour from the aspect of safety of movement. Annals of Agricultural and Environmental Medicine: AAEM. 2014;21(2):403–6. doi: 10.5604/1232-1966.1108613.
  • 180.Mohan J, Lanka K, Rao NA. A review of dynamic job shop scheduling techniques. Procedia Manuf. 2019;30:34–39. doi: 10.1016/j.promfg.2019.02.006.
  • 181.Petrovic, M., Miljkovic, Z., Jokic, A.: A novel methodology for optimal single mobile robot scheduling using whale optimization algorithm. Applied Soft Computing Journal, 81. 10.1016/j.asoc.2019.105520 (2019)
  • 182.Turkyilmaz, A., Senvar, O., Unal, I., Bulkan, S.: A research survey: heuristic approaches for solving multi objective flexible job shop problems. Journal of Intelligent Manufacturing 31(4). 10.1007/s10845-020-01547-4 (2020)
  • 183.Xu, L., Jiawei, D., Ming, H.: Research on hybrid cloud particle swarm optimization for multi-objective flexible job shop scheduling problem (2017)
  • 184.Li JQ, Pan QK, Tasgetiren MF. A discrete artificial bee colony algorithm for the multi-objective flexible job-shop scheduling problem with maintenance activities. Appl. Math. Model. 2014;38(3):1111–1132. doi: 10.1016/j.apm.2013.07.038.
  • 185.Zheng Y, Li YX, Lei DM. Multi-objective swarm based neighborhood search of fuzzy flexible job shop scheduling. The international Journal of Advanced Manufacturing Technologies. 2012;60(9-12):1063–1069. doi: 10.1007/s00170-011-3646-2.
  • 186.Li JQ, Duan P, Cao J, Li XP, Pan YY. A hybrid pareto based tabu search for the distributed flexible job shop scheduling problem with e/t criteria. IEEE Access. 2018;6:5883–5897.
  • 187.Huang RH, Yang CL, Cheng WC. Flexible job shop scheduling with due window- a two-pheromone ant colony approach. Int. J. Prod. Econ. 2013;141(2):685–697. doi: 10.1016/j.ijpe.2012.10.011.
  • 188.Reddy MB, Ratnam C, Rajyalakshmi G, Manupati VK. An effective hybrid multi objective evolutionary algorithm for solving real time event in flexible job shop scheduling problem. Measurement. 2019;114:78–90. doi: 10.1016/j.measurement.2017.09.022.
  • 189.Huang X, Yang L. A hybrid genetic algorithm for multi objective flexible job shop scheduling problem considering transportation time. International Journal of Intelligent Computing and Cybernetics. 2019;12(2):154–174. doi: 10.1108/IJICC-10-2018-0136.
  • 190.Ojstersek R, Zhang H, Liu S, Buchmeister B. Improved heuristic kalman algorithm for solving multi-objective flexible job-shop scheduling problem. Procedia Manufacturing. 2018;17:895–902. doi: 10.1016/j.promfg.2018.10.142.
  • 191.Zhou Y, Yang J, Zheng L. Multi-agent based hyper-heuristics for multi-objective flexible job-shop scheduling: a case study in an aero-engine blade manufacturing plant. IEEE Access. 2019;7:21147–21176. doi: 10.1109/ACCESS.2019.2897603.
  • 192.Seyyedhasani, H., Peng, C., Jang, W.-J., Vougioukas, S.G.: Collaboration of human pickers and crop-transporting robots during harvesting – part i: Model and simulator development. Comput. Electron. Agric. 172. 10.1016/j.compag.2020.105324 (2020)
  • 193.Conesa-Muñoz J, Bengochea-Guevara JM, Andujar D, Ribeiro A. Route planning for agricultural tasks: a general approach for fleets of autonomous vehicles in site-specific herbicide applications. Comput Electron Agric. 2016;127:204–220. doi: 10.1016/j.compag.2016.06.012.
  • 194.Edwards G, Sorensen CG, Bochtis D, Munkholm LJ. Optimized schedules for sequential agricultural operations using a tabu search method. Comput. Electron. Agric. 2015;117:102–113. doi: 10.1016/j.compag.2015.07.007.
  • 195.Ahsan Z, Dankowicz H. Optimal scheduling and sequencing for large-scale seeding operations. Comput. Electron. Agric. 2019;163:104728.
  • 196.Jensen MF, Bochtis D, Sorensen CG. Coverage planning for capacitated field operations, part ii: Optimisation. Biosyst. Eng. 2015;139:149–164. doi: 10.1016/j.biosystemseng.2015.07.002.
  • 197.Santoro E, Soler EM, Cherri AC. Route optimization in mechanized sugarcane harvesting. Comput. Electron. Agric. 2017;141:140–146. doi: 10.1016/j.compag.2017.07.013.
  • 198.Cheein FA, Torres-Torriti M, Hopfenblatt NB, Prado AJ, Calabi D. Agricultural service unit motion planning under harvesting scheduling and terrain constraints. Journal of Field Robotics. 2017;34:1531–1542. doi: 10.1002/rob.21738.
  • 199.Richards, D., Patten, T., Fitch, R.C., Ball, D., Sukkarieh, S.: User interface and coverage planner for agricultural robotics. Australasian Conference on Robotics and Automation (2015)
  • 200.Mann MP, Zion B, Shmulevich I, Rubinstein D, Linker R. Combinatorial optimization and performance analysis of a multi-arm cartesian robotic fruit harvester–extensions of graph coloring. J. Intell. Robot. Syst. 2016;82(2-4):399–411. doi: 10.1007/s10846-015-0211-5.
  • 201.Barnett J, Duke M, Au CK, Lim SH. Work distribution of multiple cartesian robot arms for kiwifruit harvesting. Comput. Electron. Agric. 2020;169:105202.
  • 202.Kurtser, P., Edan, Y.: Planning the sequence of tasks for harvesting robots. Robot. Auton. Syst., 131. 10.1016/j.robot.2020.103591 (2020)
  • 203.Miller, A.T., Allen, P.K.: Examples of 3d grasp quality computations. 2, 1240–1246. 10.1109/ROBOT.1999.772531 (1999)
  • 204.Rodríguez F, Moreno JC, Sánchez JA, Berenguel M. Grasping in agriculture: State-of-the-art and main characteristics. Mechanisms and Machine Science. 2012;10:385–409. doi: 10.1007/978-1-4471-4664-3_15.
  • 205.Mu L, Cui G, Liu Y, Cui Y, Fu L, Gejima Y. Design and simulation of an integrated end-effector for picking kiwifruit by robot. Information Processing in Agriculture. 2020;7(1):58–71. doi: 10.1016/j.inpa.2019.05.004.
  • 206.Liu, J., Li, P., Li, Z.: A multi-sensory end-effector for spherical fruit harvesting robot, 258–262. 10.1109/ICAL.2007.4338567 (2007)
  • 207.Jia, B., Zhu, A., Yang, S.X., Mittal, G.S.: Integrated gripper and cutter in a mobile robotic system for harvesting greenhouse products, 1778–1783. 10.1109/ROBIO.2009.5420430 (2009)
  • 208.Dimeas F, Sako DV, Moulianitis V, Aspragathos N. Design and fuzzy control of a robotic gripper for efficient strawberry harvesting. Robotica. 2014;33(5):1085–1098. doi: 10.1017/S0263574714001155.
  • 209.Zhong H, Nof SY, Berman S. Asynchronous cooperation requirement planning with reconfigurable end-effectors. Robotics and Computer Integrated Manufacturing. 2015;34:95–104. doi: 10.1016/j.rcim.2014.11.004.
  • 210.Pedrazzoli, P., Rinaldi, R., Boer, C.R.: A rule based approach to the gripper selection issue for the assembly process. 202–207. 10.1109/ISATP.2001.928990 (2001)
  • 211.Pham DT, Gourashi NS, Eldukhri EE. Automated configuration of gripper systems for assembly tasks. Proc. Ins. Mech. Eng. Part B: J. Eng. Manuf. 2007;221:1643–1649. doi: 10.1243/09544054JEM878SC.
  • 212.Sanfilippo, F., Salvietti, G., Zhang, H.X., Hildre, H.P., Prattichizzo, D.: Efficient modular grasping: An iterative approach, 1281–1286. 10.1109/BioRob.2012.6290693 (2012)
  • 213.Brown, R.G., Brost, R.C.: A 3d modular gripper design tool. 3, 2332–2339. 10.1109/ROBOT.1997.619310 (1997)
  • 214.Balan L, Bone GM. Automated gripper jaw design and grasp planning for sets of 3d objects. Journal of Field Robotics. 2003;20:147–162.
  • 215.Velasco, V.B., Newman, W.S.: An approach to automated gripper customization using rapid prototyping technology (1996)
  • 216.Velasco, V.B., Newman, W.S.: Computer-assisted gripper and fixture customization using rapid-prototyping technology, 4, 3658–3664. 10.1109/ROBOT.1998.681393 (1998)
  • 217.Honarpardaz, M., Tarkian, M., Feng, X., Sirkett, D., Ölvander, J.: Generic automated finger design. 5, 1–9. 10.1115/DETC2016-60514 (2016)
  • 218.Sahbani A, El-Khoury S, Bidaud P. An overview of 3d object grasp synthesis algorithms. Robot. Auton. Syst. 2012;60(3):326–336. doi: 10.1016/j.robot.2011.07.016.
  • 219.Bicchi, A., Kumar, V.: Robotic grasping and contact: a review. 1, 348–353. 10.1109/ROBOT.2000.844081 (2000)
  • 220.Li J-W, Liu H, Cai H-G. On computing three-finger force-closure grasps of 2-d and 3-d objects. IEEE Trans. Robot. Autom. 2003;19(1):155–161. doi: 10.1109/TRA.2002.806774.
  • 221.Han, L., Trinkle, J.C., Li, Z.: Grasp analysis as linear matrix inequality problems. 1261–1268. 10.1109/ROBOT.1999.772534 (1999)
  • 222.Mishra B, Schwartz JT, Sharir M. On the existence and synthesis of multifinger positive grips. Algorithmica. 1987;2(1):541–558. doi: 10.1007/BF01840373.
  • 223.Liu Y-H. Qualitative test and force optimization of 3-d frictional form-closure grasps using linear programming. IEEE Trans. Robot. Autom. 1999;15(1):163–173. doi: 10.1109/70.744611.
  • 224.Borst, C., Fischer, M., Hirzinger, G.: Grasping the dice by dicing the grasp. 3, 3692–3697. 10.1109/IROS.2003.1249729 (2003)
  • 225.Miller, A.T., Knoop, S., Christensen, H.I., Allen, P.K.: Automatic grasp planning using shape primitives. 2, 1824–1829. 10.1109/ROBOT.2003.1241860 (2003)
  • 226.Ding, D., Liu, Y.-H., Wang, S.: Computing 3-d optimal form-closure grasps. 4, 3573–3578. 10.1109/ROBOT.2000.845288 (2000)
  • 227.Ding, D., Liu, Y.-H., Wang, M.Y.: On computing immobilizing grasps of 3-d curved objects. In: Proceedings 2001 IEEE International Symposium on Computational Intelligence in Robotics and Automation, pp. 11–16. 10.1109/CIRA.2001.1013165 (2001)
  • 228.Liu Y-H, Lam M-L, Ding D. A complete and efficient algorithm for searching 3-d form-closure grasps in the discrete domain. IEEE Trans. Robot. 2004;20(5):805–816. doi: 10.1109/TRO.2004.829500.
  • 229.Eizicovits D, Berman S. Efficient sensory-grounded grasp pose quality mapping for gripper design and online grasp planning. Robot. Auton. Syst. 2014;62(8):1208–1219. doi: 10.1016/j.robot.2014.03.011.
  • 230.Liu, S., Carpin, S.: Global grasp planning using triangular meshes, 4904–4910. 10.1109/ICRA.2015.7139880 (2015)
  • 231.Hemming, J., Bac, C.W., Tuijl, B., Barth, R., Bontsema, J., Pekkeriet, E., Henten, E.V.: A robot for harvesting sweet-pepper in greenhouses (2014)
  • 232.Bohg J, Morales A, Asfour T, Kragic D. Data-driven grasp synthesis - a survey. IEEE Trans. Robot. 2014;30(2):289–309. doi: 10.1109/TRO.2013.2289018.
  • 233.Kim J, Iwamoto K, Kuffner JJ, Ota Y, Pollard NS. Physically based grasp quality evaluation under pose uncertainty. IEEE Trans. Rob. 2013;29(6):1424–1439. doi: 10.1109/TRO.2013.2273846.
  • 234.Wolniakowski, A., Miatliuk, K., Kruger, N., Rytz, J.A.: Automatic evaluation of task-focused parallel jaw gripper design. International Conference on Simulation, Modeling, and Programming for Autonomous Robots, 450–461. 10.1007/978-3-319-11900-7_38 (2014)
  • 235.Fernández R, Salinas C, Montes H, Sarria J. Multisensory system for fruit harvesting robots. experimental testing in natural scenarios and with different kinds of crops. Sensors. 2014;14(12):23885–23904. doi: 10.3390/s141223885. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 236.Bac CW, Hemming J, van Tuijl BAJ, Barth R, Wais E, van Henten EJ. Performance evaluation of a harvesting robot for sweet pepper. Journal of Field Robotics. 2017;34(6):1123–1139. doi: 10.1002/rob.21709. [DOI] [Google Scholar]
  • 237.Tardáguila, J., Blasco, J., Diago, M.P.: VineRobot: A new robot for vineyard monitoring using non-invasive sensing technologies. In: 9th International Cool Climate Wine Symposium. Retrieved from https://digital.csic.es/handle/10261/148399 - Last accessed: 19-05-2022 (2016)
  • 238.Fernández-Novales J, Garde-Cerdán T, Tardáguila J, Gutiérrez-Gamboa G, Pérez-Álvarez EP, Diago MP. Assessment of amino acids and total soluble solids in intact grape berries using contactless Vis and NIR spectroscopy during ripening. Talanta. 2019;199:244–253. doi: 10.1016/j.talanta.2019.02.037.
  • 239.Aquino A, Millan B, Diago MP, Tardaguila J. Automated early yield prediction in vineyards from on-the-go image acquisition. Comput. Electron. Agric. 2018;144:26–36. doi: 10.1016/j.compag.2017.11.026.
  • 240.Fernández-Novales J, Tardaguila J, Gutiérrez S, Marañón M, Diago MP. In field quantification and discrimination of different vineyard water regimes by on-the-go NIR spectroscopy. Biosyst. Eng. 2018;165:47–58. doi: 10.1016/j.biosystemseng.2017.08.018.
  • 241.Lopes, C.M., Graça, J.D., Sastre, J., Reyes, M., Guzman, R., Braga, R., Monteiro, A., Pinto, P.A.: Vineyard yield estimation by the VINBOT robot - preliminary results with the white variety Viosinho. 10.13140/RG.2.1.3912.0886 (2016)
  • 242.Lopes, C.M., Torres, A., Guzman, R., Graça, J. D., Monteiro, A., Braga, R.P., Barriguinha, A., Victorino, G., Reys, M.: Using an unmanned ground vehicle to scout vineyards for non-intrusive estimation of canopy features and grape yield. In: 20th GiESCO International Meeting (2017)
  • 243.Astolfi P, Gabrielli A, Bascetta L, Matteucci M. Vineyard autonomous navigation in the Echord++ GRAPE experiment. IFAC-PapersOnLine. 2018;51(11):704–709. doi: 10.1016/j.ifacol.2018.08.401.
  • 244.Leu A, Razavi M, Langstädtler L, Ristić-Durrant D, Raffel H, Schenck C, Gräser A, Kuhfuss B. Robotic green asparagus selective harvesting. IEEE/ASME Trans. Mechatron. 2017;22(6):2401–2410. doi: 10.1109/TMECH.2017.2735861.
  • 245.Fernández-Novales J, Tardáguila J, Gutiérrez S, Diago MP. On-the-go Vis + SW-NIR spectroscopy as a reliable monitoring tool for grape composition within the vineyard. Molecules. 2019;24(15):2795. doi: 10.3390/molecules24152795.
  • 246.Gutiérrez S, Tardaguila J, Fernández-Novales J, Diago MP. On-the-go hyperspectral imaging for the in-field estimation of grape berry soluble solids and anthocyanin concentration. Aust. J. Grape Wine Res. 2019;25(1):127–133. doi: 10.1111/ajgw.12376.
  • 247.Gutiérrez S, Diago MP, Fernández-Novales J, Tardaguila J. On-the-go thermal imaging for water status assessment in commercial vineyards. Adv. Anim. Biosci. 2017;8(2):520–524. doi: 10.1017/S204047001700108X.
  • 248.Grimstad, L., Pham, C.D., Phan, H.T., From, P.J.: On the design of a low-cost, light-weight, and highly versatile agricultural robot. In: IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO). 10.1109/ARSO.2015.7428210 (2016)
  • 249.Harvest Croo Robotics. https://harvestcroo.com/ - Last accessed: 22-11-2022
  • 250.Dogtooth. https://dogtooth.tech/ - Last accessed: 22-11-2022
  • 251.Agrobot E-Series. http://agrobot.com - Last accessed: 22-11-2022
  • 252.OCTINION. http://octinion.com/products/agricultural-robotics/rubion - Last accessed: 22-11-2022
  • 253.SAGA Robotics. https://sagarobotics.com - Last accessed: 22-11-2022
  • 254.Grimstad, L., From, P.J.: The Thorvald II agricultural robotic system. Robotics 6(4). 10.3390/robotics6040024 (2017)
  • 255.MetoMotion. https://metomotion.com - Last accessed: 22-11-2022
  • 256.Root-AI. https://www.appharvest.com/press_release/appharvest-acquires-agricultural-robotics-and-artificial-intelligence-company-root-ai-to-increase-efficiency/ - Last accessed: 22-11-2022
  • 257.AppHarvest. https://www.appharvest.com/ - Last accessed: 22-11-2022
  • 258.ENERGID. https://www.energid.com/industries/agricultural-robotics - Last accessed: 22-11-2022
  • 259.VISION ROBOTICS. https://www.visionrobotics.com/vr-grapevine-pruner - Last accessed: 22-11-2022
  • 260.naio Technologies. https://www.naio-technologies.com/en/ted/ - Last accessed: 22-11-2022
  • 261.ViTiBOT. https://vitibot.fr/en - Last accessed: 22-11-2022

Articles from Journal of Intelligent & Robotic Systems are provided here courtesy of Nature Publishing Group
