PLOS ONE. 2023 Apr 6;18(4):e0284184. doi: 10.1371/journal.pone.0284184

Open Science Drone Toolkit: Open source hardware and software for aerial data capture

Gustavo Pereyra Irujo 1,*, Paz Bernaldo 2, Luciano Velázquez 3, Antoni Pérez 4, Celeste Molina Favero 5, Alejandrina Egozcue 6
Editor: Vona Méléder
PMCID: PMC10079087  PMID: 37023121

Abstract

Despite the increased access to scientific publications and data as a result of open science initiatives, access to scientific tools remains limited. Uncrewed aerial vehicles (UAVs, or drones) can be a powerful tool for research in disciplines such as agriculture and environmental sciences, but their use in research is currently dominated by proprietary, closed source tools. The objective of this work was to collect, curate, organize and test a set of open source tools for aerial data capture for research purposes. The Open Science Drone Toolkit was built through a collaborative and iterative process by more than 100 people in five countries, and comprises an open-hardware autonomous drone, off-the-shelf hardware, open-source software, and guides and protocols that enable the user to perform all the necessary tasks to obtain aerial data. Data obtained with this toolkit over a wheat field was compared to data from satellite imagery and a commercial hand-held sensor, finding a high correlation with both instruments. Our results demonstrate the possibility of capturing research-grade aerial data using affordable, accessible, and customizable open source software and hardware, and open workflows.

Introduction

Openness has been central to modern science since its inception [1], through the publication of theories and the data on which they are based, and the encouragement of replication, scrutiny and challenge [2]. In recent decades, information technologies have enabled the rise of an “open science” global movement that seeks not only to increase transparency and dissemination of scientific processes and products, but also to enable more widespread collaboration, participation and inclusion in science [3–5]. Although access to scientific publications and raw data has increased significantly in recent years [6–9], access to the scientific tools needed to obtain or analyze data (i.e., scientific instruments, materials and software) remains one of the main barriers to increasing participation in science production and to the replication or reproduction of published results [10,11].

Open source software is computer code that is licensed so that the user has the freedom to copy and redistribute it, have access to the source code, and make improvements to it, among other rights [12]. Similarly, open source hardware is any physical object or artifact whose design is available so that anyone can study, modify, distribute, make, and sell the design or hardware based on that design [13]. Open source research software and open scientific instruments and materials are considered to provide a series of advantages over proprietary alternatives: i) being either cost-free (in the case of software) or usually more affordable (in the case of hardware), they allow more people to participate in science, especially non-professional or budget-limited researchers; ii) reproduction or replication of published results is less constrained by a lack of access to the same tools that were originally used; iii) having access to the software code or hardware design allows for a better understanding of the functioning of the tool and the methods or algorithms that it implements; and iv) it is possible to customize the tools to adapt them to new uses or local contexts [11,14–16].

Uncrewed aerial vehicles (UAVs, usually called “drones”) can be a powerful tool for research in disciplines such as agriculture and environmental sciences, allowing the capture of high-resolution aerial imaging with great speed and flexibility [17]. Drone use in research is growing rapidly but is dominated by closed source tools: in a recent literature review of applications in agro-environmental monitoring, more than 80% of studies used fully closed-source drones, and more than 90% used proprietary closed-source software for image processing [18]. Proprietary drone solutions usually require a significant initial investment, monthly software subscriptions, or an internet connection for cloud processing, which can constitute barriers for many low-resource users [19,20], and they usually function as “black boxes” that offer users little insight into their internal workings and limited customization [21]. Also, when these solutions are implemented in developing countries, concerns have been raised regarding limited repairability and the risks of technological dependence and extractive practices [22].

In a typical use of drones in environmental or agricultural research, the drone must be reliably and precisely positioned over the studied terrain and capture images that can later be processed to obtain a high quality image of the surveyed area and extract useful data [23–25]. Although there are already many open source hardware and software tools that can be used in each of the individual steps of this process, our question was whether it was possible to perform all these steps using open tools. The objective of this work was to address this question by collecting, curating, organizing and testing a comprehensive set of open source tools for aerial data capture for research purposes. The result of these actions is the Open Science Drone Toolkit (OSDT), which is presented here in detail and is also available online at https://vuela.cc/en/toolkit.

Toolkit design process

The OSDT was developed as part of “Vuela” [26,27], a research-action project that aimed to counter the lack of access to the creation of scientific and technological knowledge by exploring an alternative way of developing scientific tools. The toolkit was built through a collaborative and iterative process, involving the work of more than 100 people between 2017 and 2019, in more than 30 local, in-person workshops in five countries (Argentina, Brazil, Chile, Paraguay and Uruguay), as well as ongoing online collaboration. Workshop participants were school students, traditional scientists, technicians, hobbyists, journalists, local community members and self-taught software developers, and included both academics and people with no formal academic or technology background, and people with and without experience making or using drones. Work was also carried out in several languages: Spanish, Haitian Créole, French, Portuguese and English.

The Open Science Drone Toolkit is a set of hardware and software tools, guides, and protocols that enable the user to perform all the necessary tasks to obtain aerial data, as detailed in Table 1. These steps represent a ‘typical’ use case, but can be modified according to the specific research question.

Table 1. Tasks that can be performed using the Open Science Drone Toolkit in order to obtain data from aerial images.

0. Assemble the toolkit: Build or acquire the drone, camera and other hardware components, install the software components, and perform the necessary configurations, following the toolkit assembly guide.
1. Identify the study area: Specify the boundaries of the area that will be surveyed, either in situ or using georeferenced aerial or satellite imagery.
2. Design the flight plan: Design the flight path the drone will follow over the study area, as well as flight altitude and speed, taking into account the desired ground sampling resolution (cm per pixel), the overlap required for image stitching, and the technical constraints of the drone and camera.
3. Select the camera settings: Select the optimal camera parameters (shutter exposure, ISO sensitivity, automatic shutter interval) according to conditions at the time of the flight.
4. Perform a satellite-guided flight: Perform a satellite-guided autonomous flight over the study area, following the designed flight plan and capturing the necessary images with a fixed set of camera parameters.
5. Geotag the captured images: Assign coordinates to each image according to the precise location recorded by the drone at capture time.
6. Process the images to obtain a mosaic: Process the captured images to obtain an orthorectified and georeferenced image of the complete surveyed area.
7. Analyze the mosaic to obtain data: Process the mosaic image to obtain data relevant to the research question, for the complete image or for specific areas.
8. Manage and share data: Organize and visualize all the generated data (raw images, mosaic, area boundaries, flight plan and telemetry records) and metadata (flight name and description, date and time, location, etc.), and bundle all files together for efficient archiving and sharing.

One of the first objectives of the project was to put into practice one of the less commonly exercised freedoms of open source hardware: the freedom to modify an existing design. Instead of developing a drone from scratch, the project started by replicating, testing, and identifying potential improvements for an already available open-source drone called “Flone” [28]. The original design had limited capabilities for performing the required tasks listed in Table 1, so a series of changes were needed to use the drone for research purposes: increasing the range that the drone could safely cover, adding satellite navigation capability, increasing the payload capacity, and improving the stability of the camera. This iterative process of hardware development was carried out in conjunction with a careful selection of open source software tools (and the development of new ones) to perform each of the tasks listed in Table 1, and the development of protocols and detailed user guides. The resulting set of tools is described in the following section.

Toolkit components

The components of the OSDT are listed in Table 2. The main component is the open-hardware drone, developed especially for this toolkit. Other hardware components of the toolkit are not open source, but are mostly off-the-shelf equipment that can be readily replaced. The software components of the toolkit are all open-source software projects, which have been selected for being suitable for each task, or developed especially for the toolkit. Finally, the documentation includes assembly and usage guides for the toolkit.

Table 2. Summary of the hardware, software, and documentation components of the Open Science Drone Toolkit.

Hardware
  Computer (generic PC with Windows operating system): tasks 0 to 8
  Smartphone (generic smartphone with Android operating system): task 1
  Camera (Canon brand camera compatible with CHDK software): tasks 3, 4
  Drone (OVLI drone): task 4
  Radio transmitter (generic 6-channel radio transmitter): task 4
  Batteries and charger (generic 3-cell lithium-polymer battery and balance charger): task 4

Software
  Location recorder app (GPSLogger): task 1
  Drone ground station (Mission Planner): tasks 2, 4, 5
  Camera control (CHDK): tasks 3, 4
  Drone autopilot (ArduCopter): task 4
  Image processing (OpenDroneMap): task 6
  Orthomosaic processing and analysis (QGIS): task 7
  Data management (Bitácora): task 8

Documentation
  Toolkit assembly guide: task 0
  Toolkit usage guide: tasks 1 to 8

The OVLI drone (an acronym for “Objeto Volador Libre”, which means “Free Flying Object” in Spanish) is a quadcopter (i.e., a helicopter with four propellers), equipped with an autopilot board with accelerometer, gyroscope, barometer, and GNSS (Global Navigation Satellite System) sensors that allow fully autonomous flight. The autopilot is an open source Pixhawk board [29,30], running the open source ArduPilot/ArduCopter firmware [31,32]. The OVLI has a frame diameter of 395 mm and weighs 0.773 kg without batteries. Its frame is assembled from MDF (Medium Density Fibreboard) cut with a laser cutter according to a design file, which can be easily edited to modify the drone structure. This material was chosen because it is widely available, low cost, and easy to assemble, repair and modify. The final design of the OVLI is shown in Fig 1, alongside the original “Flone” drone on which it was based.

Fig 1. The Flone and OVLI drones.

(A) Original “Flone” design [28] upon which further versions of the open hardware drone were developed. (B) The “OVLI” drone that was developed as part of the Open Science Drone Toolkit.

The OVLI drone has a payload capacity of around 500 g, enough for a high-resolution RGB (Red-Green-Blue, i.e., visible spectrum) camera, a multi-spectral camera, or other sensors. Maximum flight time is 11 minutes, using a 5000 mAh battery and carrying a 141 g camera as payload, measured from take-off until the low-battery alarm was activated, with approximately 30% of battery capacity remaining.

The OVLI drone can be operated manually through a radio controller, but for research purposes it is usually more convenient to fly it autonomously using a pre-programmed flight plan. Planning usually begins by identifying the area to be surveyed (task #1 in Table 1), which can be done by walking the perimeter of the area with a smartphone running the open-source app GPSLogger (Table 2). The resulting coordinates file is loaded into the open-source software Mission Planner (Table 2) to design the flight plan (task #2 in Table 1), according to the desired image resolution and considering the maximum flight time of the drone and other constraints. This flight plan is uploaded to the OVLI drone, which can then fly autonomously while capturing the images (task #4 in Table 1). During flight, Mission Planner is also used to view live telemetry data, such as the position of the drone, battery voltage, altitude, and speed.
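As a rough guide to how these flight-plan parameters interact, the sketch below computes ground sampling distance and flight-line spacing from altitude and camera geometry. This is a minimal illustration in Python; the sensor and lens values are approximate assumptions for a 12-megapixel compact camera of the class used in the toolkit, not measured specifications.

```python
# Minimal flight-plan geometry sketch. Sensor and lens values are rough
# assumptions for a 12-megapixel, 1/2.3" compact camera.
SENSOR_W_MM = 6.17   # assumed sensor width (mm)
IMAGE_W_PX = 4000    # assumed image width (pixels)
FOCAL_MM = 5.0       # assumed focal length at widest zoom (mm)

def gsd_cm_per_px(altitude_m: float) -> float:
    """Ground sampling distance (cm/pixel) at a given flight altitude."""
    return (SENSOR_W_MM * altitude_m * 100.0) / (FOCAL_MM * IMAGE_W_PX)

def line_spacing_m(altitude_m: float, side_overlap: float) -> float:
    """Spacing between parallel flight lines for a given side overlap."""
    footprint_w_m = SENSOR_W_MM * altitude_m / FOCAL_MM  # image footprint width
    return footprint_w_m * (1.0 - side_overlap)

print(f"GSD at 50 m: {gsd_cm_per_px(50):.1f} cm/pixel")
print(f"Line spacing at 50 m, 70% side overlap: {line_spacing_m(50, 0.7):.1f} m")
```

At a 50 m altitude this yields roughly 1.5 cm/pixel, consistent with the approximately 2 cm/pixel orthomosaic reported in the example use case below.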

The camera selected for the toolkit is an RGB, 12-megapixel pocket camera (Canon PowerShot ELPH100HS, or equivalent). This kind of camera was selected for two main reasons: 1) “pocket” or “point and shoot” cameras usually have a mechanical shutter, which means that when an image is captured all the pixels are exposed at the same time, whereas many “action cameras” commonly used on drones have an electronic “rolling shutter”, in which the sensor captures the image line by line, potentially introducing image distortions [33]; 2) most Canon cameras can be “hacked” by means of the open-source CHDK (Canon Hack Development Kit) software [34], which allows setting the camera to capture images automatically, and camera parameters to be set manually (e.g., shutter speed and ISO values; task #3 in Table 1) to capture sharp, well-exposed images suitable for further processing and data extraction [35]. This type of camera does not record the location of the images (which is necessary for later obtaining a georeferenced mosaic), so this information has to be retrieved from the flight log of the drone. This process is called geotagging (task #5 in Table 1), and can be performed using the Mission Planner ground station software. This kind of off-the-shelf RGB camera has been shown to be useful for the measurement of vegetation indices [36]. Moreover, the camera filters can be modified by replacing the standard near-infrared (NIR) blocking filter with an appropriate filter, resulting in a low-cost camera capable of detecting two visible bands and one NIR band [37,38].
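Mission Planner handles geotagging automatically from the flight log; as a minimal illustration of the underlying idea, the sketch below linearly interpolates the drone's logged positions at an image's capture time. The log entries and timestamps are hypothetical.

```python
import bisect
from datetime import datetime

# Hypothetical flight-log entries: (timestamp, lat, lon, alt) parsed
# from the drone's log.
log = [
    (datetime(2020, 12, 5, 14, 0, 0), -37.76431, -58.51562, 50.1),
    (datetime(2020, 12, 5, 14, 0, 2), -37.76425, -58.51549, 50.0),
]

def interpolate_position(capture_time, log):
    """Linearly interpolate (lat, lon, alt) at an image capture time."""
    times = [entry[0] for entry in log]
    i = bisect.bisect_left(times, capture_time)
    if i == 0:
        return log[0][1:]
    if i == len(log):
        return log[-1][1:]
    (t0, *p0), (t1, *p1) = log[i - 1], log[i]
    f = (capture_time - t0).total_seconds() / (t1 - t0).total_seconds()
    return tuple(a + f * (b - a) for a, b in zip(p0, p1))

# Capture time would normally be read from the image's EXIF data.
print(interpolate_position(datetime(2020, 12, 5, 14, 0, 1), log))
```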

The captured images then need to be merged to obtain a rectified and georeferenced image of the complete surveyed area, known as an orthomosaic (task #6 in Table 1). Currently, a suitable open-source tool to perform this task is OpenDroneMap [39], which has been shown to provide high quality results comparable to those of widely-used commercial packages [21]. The next step is to extract information from the orthomosaic image; this step depends largely on the research question being addressed. The open source software QGIS [40] can be used to open the georeferenced orthomosaic and calculate vegetation indices, measure areas, and perform multiple other data extraction tasks (task #7 in Table 1).
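As a sketch of how the orthomosaic step can be scripted, the snippet below invokes OpenDroneMap through its Docker image. The folder layout follows the ODM quickstart, but the paths and project name are hypothetical, and the exact options should be checked against the ODM documentation for the version in use.

```python
import subprocess
from pathlib import Path

# Hypothetical datasets folder, expected to contain
# my_flight/images/*.JPG with the captured photos.
datasets = Path("/home/user/drone")

subprocess.run([
    "docker", "run", "--rm",
    "-v", f"{datasets}:/datasets",
    "opendronemap/odm",
    "--project-path", "/datasets",
    "my_flight",
], check=True)
# On success, the orthomosaic is typically written to
# my_flight/odm_orthophoto/odm_orthophoto.tif
```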

The open-source application “Bitácora” (“logbook” in Spanish) (https://vuela.cc/en/bitacora), developed especially for the OSDT, helps in visualizing and organizing all the files, images and metadata generated during the whole process, for archiving, sharing or further processing. The user only needs to save all the files generated in a flight (survey area polygon, flight plan, captured images, mosaic, elevation model, etc.) in a folder, and the program will automatically generate a map visualizing the files, and a table with flight information (flight date and time, location, altitude, speed, names of relevant files; Fig 2). This information is also saved in open formats compatible with other software (flight information table in csv format, flight map in png and kml formats).
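Bitácora itself is a graphical application, but the bundling idea it implements can be illustrated with a few lines of standard-library Python: scan a flight folder and write a machine-readable table of its contents. The folder name and fields below are hypothetical and far simpler than the metadata the actual application records.

```python
import csv
from pathlib import Path

# Hypothetical flight folder containing images, flight plan, mosaic, etc.
flight_dir = Path("flights/2020-12-05_balcarce_wheat")

# One row per file: name, type (by extension), and size.
rows = [
    {"file": p.name, "type": p.suffix.lstrip(".").lower(), "size_bytes": p.stat().st_size}
    for p in sorted(flight_dir.iterdir())
    if p.is_file()
]

with open(flight_dir / "flight_files.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["file", "type", "size_bytes"])
    writer.writeheader()
    writer.writerows(rows)
```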

Fig 2. Screenshot from the “Bitácora” software.

(A) The main window, showing the list of logs already registered in the logbook. (B-C) Individual flight windows showing details for selected flights.

The toolkit documentation includes: 1) an “assembly guide” with step-by-step instructions for building the OVLI drone, setting up and configuring the hardware components, and installing the software; and 2) a “usage guide”, with instructions for flying the drone, programming an autonomous mission, programming the camera, and processing the images. Both are available as openly licensed documents (using a Creative Commons license allowing users to redistribute and build upon the material, with attribution), ready for download in PDF and HTML format, and also as live documents (in Google Docs) open for suggestions. The guides have a simple layout that allows for automatic machine translation into many languages, readily available from the project website.

Example use case

An example use case is presented here in which the OSDT was used to obtain data on the spatial variability in the maturity of a wheat crop, assessed through a vegetation index that quantifies the “greenness” of the crop canopy. A 6000 m² area was surveyed in a wheat field sown in late July 2020 in Balcarce, Buenos Aires Province, Argentina. This particular field was selected for having uneven maturity of the crop due to variability in soil depth. Following the steps in the toolkit guide, the area of interest was first delimited (Fig 3A), a flight plan was designed (Fig 3B), and the OVLI drone was flown on the 5th of December 2020, when the crop was in the grain filling stage (Zadoks 7.7 stage). A total of 151 images were captured at a flight altitude of 50 m (Fig 3C), and 104 of them (discarding those captured during takeoff and landing) were then used to obtain an orthomosaic with a resolution of 2 cm/pixel (Fig 3D).

Fig 3. Aerial data capture and analysis steps followed in the presented example use case.

(A) Identification of the study area, done first in situ with a location recorder app (bright red line) and then manually refined by drawing a rectangular polygon (dark red area); (B) design of the flight plan to cover the study area using a “grid” or “lawn-mower” pattern; (C) flight and image capture, represented through the actual flight path of the drone (blue line) and the position of each captured image (blue circles); (D) orthomosaic obtained by joining the captured images; and (E) vegetation index (VARI) calculated from the orthomosaic data. The background image is a 10 x 10 m resolution satellite image captured 1 day after the drone flight (retrieved from Sentinel Hub EO Browser under a CC-BY 4.0 license [45]).

Vegetation indices are transformations of data obtained from optical sensors, usually based on the plant’s increased reflectance in the green and/or infrared wavelengths, that can be used to quantify spatial and temporal variations in vegetation characteristics [41]. A vegetation index (Visible Atmospherically Resistant Index, VARI) that has been previously used to estimate wheat growth and phenology [42,43] was calculated from the red, green and blue channels of the orthomosaic, in order to quantify the “greenness” of the crop as an indicator of the degree of maturity (Fig 3E). This data obtained from the drone images was compared to the Normalized Difference Vegetation Index (NDVI), which is the most widely used vegetation index [44], obtained from two sources: a handheld sensor and satellite data.
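VARI is computed from the red, green and blue bands as VARI = (G - R) / (G + R - B) [42]. A minimal sketch of this calculation using the rasterio library is shown below; the file names are hypothetical and the band order is assumed to be R, G, B.

```python
import numpy as np
import rasterio

# Read the RGB orthomosaic (band order assumed to be R, G, B).
with rasterio.open("orthomosaic.tif") as src:
    r, g, b = (src.read(i).astype("float32") for i in (1, 2, 3))
    profile = src.profile

# VARI = (G - R) / (G + R - B); suppress warnings where the
# denominator is zero (those pixels become NaN).
with np.errstate(divide="ignore", invalid="ignore"):
    vari = (g - r) / (g + r - b)

# Write the index as a single-band float raster.
profile.update(count=1, dtype="float32")
with rasterio.open("vari.tif", "w", **profile) as dst:
    dst.write(vari, 1)
```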

Satellite data from the Sentinel 2 constellation (European Space Agency) was retrieved from the Sentinel Hub EO Browser under a CC-BY 4.0 license [45]. Data from bands B04 (red, central wavelength = 665 nm) and B08 (near infrared, central wavelength = 842 nm) for the 6th of December 2020 (1 day after the drone image capture) at 10 m/pixel resolution was used to calculate NDVI values for the surveyed area (Fig 4A). For the comparison between satellite (NDVI) and drone (VARI) data, 140 areas of 10 x 10 m, equivalent to the satellite image pixels, were delimited in the processed orthomosaic (Fig 4C), and the mean VARI value was measured in each of them. One caveat to this comparison is that the georeferencing of the orthomosaic was based on the drone GNSS sensor, which usually has an accuracy of about 2 m [46], so the correspondence between these 10 x 10 m areas and the satellite image pixels might not have been complete.
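A minimal sketch of this comparison is shown below: NDVI = (NIR - Red) / (NIR + Red) is computed from the two Sentinel-2 bands, and the drone VARI raster from the previous sketch is block-averaged into 10 x 10 m cells. File names are hypothetical, and the rasters are assumed to be cropped and aligned to the same grid.

```python
import numpy as np
import rasterio

# NDVI from Sentinel-2 bands B04 (red) and B08 (NIR).
with rasterio.open("S2_B04.tif") as red_src, rasterio.open("S2_B08.tif") as nir_src:
    red = red_src.read(1).astype("float32")
    nir = nir_src.read(1).astype("float32")
with np.errstate(divide="ignore", invalid="ignore"):
    ndvi = (nir - red) / (nir + red)

# Block-average the drone VARI raster into 10 x 10 m cells.
with rasterio.open("vari.tif") as src:
    vari = src.read(1)

cell = 500  # 10 m cells at 2 cm/pixel
h = (vari.shape[0] // cell) * cell
w = (vari.shape[1] // cell) * cell
blocks = vari[:h, :w].reshape(h // cell, cell, w // cell, cell)
vari_cell_means = np.nanmean(blocks, axis=(1, 3))  # one mean per 10 x 10 m cell
```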

Fig 4. Comparison between data obtained with the OSDT and data obtained from satellite imagery and a handheld sensor.

(A) NDVI satellite image from Sentinel 2; (B) VARI image from drone orthomosaic; (C) VARI drone image showing the 10 x 10 m areas, equivalent to the satellite image pixels, delimited in the processed orthomosaic; (D) transects used for measurement with the hand-held sensor; (E) relationship between NDVI from the satellite image and VARI from the drone; and (F) relationship between NDVI from the hand-held sensor and VARI from the drone. Satellite image data was retrieved from Sentinel Hub EO Browser under a CC-BY 4.0 license [45].

On the 28th of November 2020 (seven days before the drone image capture) a hand-held sensor (Greenseeker, N-tech Industries, USA) was used to measure NDVI in six transects parallel to the crop rows, each of them 70 meters long and spaced 18 meters apart (Fig 4D). To aid in the correspondence between sensor and drone data, a visible mark that could be easily identified in the aerial images was placed at the start of the first transect, and each subsequent transect was started based on ground measurements relative to that reference. The sensor was held one meter above the canopy, resulting in a field of view of about 60 cm. Around 700 data points were recorded in each transect, equivalent to around one point every 10 cm. For the comparison between the handheld NDVI sensor and drone (VARI) data, NDVI data was averaged every two meters (yielding 35 data points per transect, and 210 in total). Similarly, 35 areas of 2 x 2 m along each transect were delimited in the processed orthomosaic, and the mean VARI value was measured in each of them.
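The 2 m averaging can be expressed as a simple binning by distance along the transect, as in the sketch below; the distances and NDVI values are placeholders, not the measured data.

```python
import numpy as np

# ~700 readings spaced ~10 cm apart along a 70 m transect.
distance_m = np.arange(0, 70, 0.1)
ndvi = 0.6 + 0.1 * np.sin(distance_m / 10.0)  # placeholder values

# Assign each reading to a 2 m bin (indices 0..34) and average.
bin_idx = (distance_m // 2).astype(int)
ndvi_2m = np.array([ndvi[bin_idx == i].mean() for i in range(bin_idx.max() + 1)])
print(len(ndvi_2m))  # 35 points per 70 m transect
```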

VARI data obtained from the drone orthomosaic were plotted against NDVI data from the satellite image and the handheld sensor, and an equation of the form y = ax^b - c was fitted for each set of data. The NDVI index usually has a curvilinear relationship with the green leaf area index, while the VARI index has been shown to have a rather linear relationship [42]; a curvilinear relationship between NDVI and VARI can therefore be expected. A high correlation was found in both cases, with R² values of 0.84 and 0.88 when drone data was compared to satellite and hand-held sensor data, respectively (Fig 4E and 4F). A single curve could also be fitted to both datasets, with an R² of 0.86 (not shown).
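A fit of this form and its R² can be reproduced with scipy, as sketched below. The paired values are hypothetical, and the direction of the fit (NDVI as a function of VARI) is an assumption made for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b, c):
    """Curve of the form y = a * x**b - c."""
    return a * np.power(x, b) - c

# Hypothetical paired observations (mean VARI per cell vs. NDVI).
vari = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
ndvi = np.array([0.35, 0.48, 0.57, 0.63, 0.68, 0.72])

params, _ = curve_fit(model, vari, ndvi, p0=(1.0, 0.5, 0.0), maxfev=10000)
pred = model(vari, *params)
r2 = 1 - np.sum((ndvi - pred) ** 2) / np.sum((ndvi - np.mean(ndvi)) ** 2)
print(f"a={params[0]:.2f}, b={params[1]:.2f}, c={params[2]:.2f}, R2={r2:.2f}")
```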

Discussion

This paper reports the results of developing a complete toolkit in order to demonstrate the possibility of capturing research-grade aerial data using open source software and hardware. The use case example presented shows its suitability for tasks that can be useful for many research questions (and also commercial applications, e.g., farming). The wide range of possible applications of aerial imaging in research, however, cannot be fully covered by any single toolkit. Nevertheless, the open nature of the OSDT allows for its components to be used separately or be replaced by alternative tools, either open-source or proprietary, as necessary. For example, if the area of interest is significantly larger than the one shown in the example, the flight time of the OVLI drone would not be sufficient. In that case, a “fixed wing” type of drone, such as the open-source “Asa-Branca-I” [47], could provide the capability of covering more than 100 ha in a single flight (albeit with lower image resolution). And if the cost of a drone is a limitation, or if drones cannot be used due to regulations, aerial images can also be captured using kites [48]. It is also possible to use the image processing, analysis and management software tools of the toolkit with aerial images obtained with proprietary drones.

Concerns about the quality and reliability of open source tools (especially hardware) can sometimes limit their use [49]. In this work we attempted to overcome this limitation by comparing the results obtained with the OSDT to data from satellite imagery and a commercial hand-held sensor, finding a high correlation with both instruments. The two sensors used for comparison have been among the most popular tools used by farmers for crop monitoring for around 20 years [50], and both make use of the most widely used vegetation index (NDVI) [44]; they can therefore be considered a reliable benchmark. Studies with off-the-shelf RGB cameras such as the one used in the OSDT have shown that they can yield robust measurements of vegetation indices [36], which can be further improved through radiometric calibration [51], with results comparable to those of multispectral cameras that can cost several times more [52–55], and with higher spatial resolution [56]. Although not shown in this paper, the OSDT can also be used to generate digital elevation maps and 3D point clouds of the studied terrain. Studies using similar tools have shown that it is possible to obtain results similar to those of expensive LiDAR (Light Detection And Ranging) systems, when appropriate ground control points are used [57,58].

Another concern about open source research tools is their sustainability [59], especially when they lack an associated business model to provide funding, considering that long-term availability is important for reproducibility and replicability [60]. Nevertheless, open source is usually considered a requisite for sustainable research software, as it ensures the possibility of continuous validation, reuse and improvement [61,62], while proprietary software has been found to be an obstacle to reproducibility [63]. The issue of sustainability of open hardware has not been studied extensively [64], but it has been argued that open source hardware can provide more long-term security to research projects, owing to the possibility of in-house repair should the original provider go out of business [14]. We therefore argue that using the OSDT for aerial data capture for research purposes could be considered a more sustainable option than proprietary commercial systems.

While there have been significant technical improvements in drones and sensors in recent years, little attention has been paid to the management and storage of the increasingly large and complex datasets that result from drone operations [65]. There are few standards for drone data management, sharing or publication, which makes collaboration and reproducibility difficult [66]. Commercial drone packages usually offer complete solutions, but generally at the cost of expensive licenses and less interoperability with other tools or possibility of customization. Open source tools offer a more flexible but fragmented landscape, and users can be hindered by the need to deal with many individual components. In the OSDT, the software “Bitácora” was developed with the aim of overcoming this issue by centralizing all the files and data generated in the different steps of the aerial data capture process, which would otherwise have to be managed and visualized using many different tools. As open source software, it can also be extended to incorporate other data formats and file types (e.g., flight plans for proprietary drones, images from multispectral cameras, etc.). It also aids the user in collecting all the generated data in a single folder, adding the corresponding metadata, which helps in openly sharing research data in reusable and interoperable formats. “Bitácora” can automatically extract part of the metadata required by the Minimum Information Framework for drone users (proposed by [66]), thus helping users make their data “FAIR” (findable, accessible, interoperable and reusable). Another open source software with a similar goal is “DroneDB” [67], which aids especially in sharing datasets of images, orthomosaics and other drone products through a cloud interface, but without providing flight metadata. “Bitácora” is therefore a small but key component of the OSDT, since it helps “bundle” the toolkit together and use it to further the goals of open science.

Conclusions

This paper presented the Open Science Drone Toolkit, which enables the user to perform all the tasks necessary to obtain aerial data. Data obtained with this toolkit over a wheat field was compared to data from satellite imagery and a commercial hand-held sensor, finding a high correlation with both instruments. These results demonstrate the possibility of capturing research-grade aerial data using affordable, accessible and customizable open source software and hardware, and open workflows.

Supporting information

S1 File. Alternative language article.

Complete manuscript in Spanish: “Herramientas de código abierto para la captura de datos aéreos mediante drones”.

(PDF)

Acknowledgments

We would like to acknowledge the contributions of the participants of the Vuela project workshops and online collaborators that made building this toolkit possible, especially Loulou Jude, Daniela Muñoz, Lot Amorós, Guillermo Pereyra Irujo, Nicolás Narváez, Carla Alvial, Fernando Yamada, John Arancibia, Constanza Alberio, Vicente Dimuro, and Stevens Azima. We want to thank Junta de Vecinos Teniente Merino Alto, Junta de Vecinos Francisco Werchez, Universidad Católica, INIA Rayentué (Chile), INTA Balcarce, Club Social de Innovación Balcarce, R’lyeh Hacklab, INTA Marcos Juárez, and Universidad Nacional de Cuyo (Argentina), IPTA Capitán Miranda (Paraguay), Universidade Federal do Rio Grande do Sul (Brazil), and INIA La Estanzuela (Uruguay) for providing the venues for the workshops. We also want to thank Abril Pereyra Molina and Julián Pereyra Molina for their assistance with field measurements.

Data Availability

Open Science Drone Toolkit details are publicly available from Vuela (https://vuela.cc/), and data for the example use case is publicly available from the OSF repository (https://osf.io/t7y6x/).

Funding Statement

Funding for this project was provided by Mozilla Foundation (Mozilla Science Mini Grant, foundation.mozilla.org) to GPI and PB, Knowledge/Culture/Ecologies (KCE2017, Western Sydney University, westernsydney.edu.au) to PB, Shuttleworth Foundation (Flash Grant, shuttleworthfoundation.org) to PB, Programa Cooperativo para el Desarrollo Tecnológico Agroalimentario y Agroindustrial del Cono Sur (PROCISUR, www.procisur.org.uy) to GPI, Instituto Nacional de Tecnología Agropecuaria (PNCYO-1124072 and 2019-PD-E3-I060, inta.gob.ar) to GPI. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

References

1. Hull D. Openness and Secrecy in Science: Their Origins and Limitations. Sci Technol Hum Values. 1985;10(2):4–12.
2. Boulton G, Campbell P, Collins B, Elias P, Hall W, Laurie G, et al. Science as an open enterprise. London: The Royal Society; 2012.
3. Bahlai C, Bartlett L, Burgio K, Fournier A, Keiser C, Poisot T, et al. Open Science Isn’t Always Open to All Scientists. Am Sci. 2019;107(2):78.
4. UNESCO. UNESCO Recommendation on Open Science. UNESCO; 2021. Available from: https://unesdoc.unesco.org/ark:/48223/pf0000379949.locale=en.
5. Wolff B, Schlagwein D. From Open Science to Open Source (and beyond): A Historical Perspective on Open Practices without and with IT. In: 17th International Symposium on Open Collaboration. New York, NY, USA: Association for Computing Machinery; 2021. p. 1–11. (OpenSym 2021). doi: 10.1145/3479986.3479990
6. Boulton G, Rawlins M, Vallance P, Walport M. Science as a public enterprise: the case for open data. The Lancet. 2011;377(9778):1633–5. doi: 10.1016/S0140-6736(11)60647-8
7. Forero DA, Curioso WH, Patrinos GP. The importance of adherence to international standards for depositing open data in public repositories. BMC Res Notes. 2021;14(1):405. doi: 10.1186/s13104-021-05817-z
8. Himmelstein DS, Romero AR, Levernier JG, Munro TA, McLaughlin SR, Greshake Tzovaras B, et al. Sci-Hub provides access to nearly all scholarly literature. eLife. 2018;7:e32822. doi: 10.7554/eLife.32822
9. Piwowar H, Priem J, Larivière V, Alperin JP, Matthias L, Norlander B, et al. The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles. PeerJ. 2018;6:e4375. doi: 10.7717/peerj.4375
10. Arancio J, Dosemagen S. Bringing Open Source to the Global Lab Bench. Issues in Science and Technology. 2022. Available from: https://issues.org/open-source-science-hardware-gosh-arancio-dosemagen/.
11. Walters WP. Code Sharing in the Open Science Era. J Chem Inf Model. 2020;60(10):4417–20. doi: 10.1021/acs.jcim.0c01000
12. Open Source Initiative. The Open Source Definition. 2007. Available from: https://opensource.org/osd.
13. Open Source Hardware Association. Open Source Hardware (OSHW) Statement of Principles 1.0. 2016. Available from: https://www.oshwa.org/definition/.
14. Chagas AM. Haves and have nots must find a better way: The case for open scientific hardware. PLOS Biol. 2018;16(9):e3000014. doi: 10.1371/journal.pbio.3000014
15. Diederich B, Müllenbroich C, Vladimirov N, Bowman R, Stirling J, Reynaud EG, et al. CAD we share? Publishing reproducible microscope hardware. Nat Methods. 2022;1–5.
16. Ravindran S. How DIY technologies are democratizing science. Nature. 2020;587(7834):509–11. doi: 10.1038/d41586-020-03193-5
17. Chabot D. Trends in drone research and applications as the Journal of Unmanned Vehicle Systems turns five. J Unmanned Veh Syst. 2018;6(1):vi–xv.
18. Eskandari R, Mahdianpari M, Mohammadimanesh F, Salehi B, Brisco B, Homayouni S. Meta-analysis of Unmanned Aerial Vehicle (UAV) Imagery for Agro-environmental Monitoring Using Machine Learning and Statistical Models. Remote Sens. 2020;12(21):3511.
19. Paneque-Gálvez J, Vargas-Ramírez N, Napoletano BM, Cummings A. Grassroots Innovation Using Drones for Indigenous Mapping and Monitoring. Land. 2017;6(4):86.
20. Vargas-Ramírez N, Paneque-Gálvez J. The Global Emergence of Community Drones (2012–2017). Drones. 2019;3(4):76.
21. Pell T, Li JYQ, Joyce KE. Demystifying the Differences between Structure-from-Motion Software Packages for Pre-Processing Drone Data. Drones. 2022;6(1):24.
22. Hanrahan BV, Maitland C, Brown T, Chen A, Kagame F, Birir B. Agency and Extraction in Emerging Industrial Drone Applications: Imaginaries of Rwandan Farm Workers and Community Members. Proc ACM Hum-Comput Interact. 2021;4(CSCW3):233:1–233:21.
23. Assmann JJ, Kerby JT, Cunliffe AM, Myers-Smith IH. Vegetation monitoring using multispectral sensors—best practices and lessons learned from high latitudes. J Unmanned Veh Syst. 2019;7(1):54–75.
24. Rusnák M, Sládek J, Kidová A, Lehotský M. Template for high-resolution river landscape mapping using UAV technology. Measurement. 2018;115:139–51.
25. Tmušić G, Manfreda S, Aasen H, James MR, Gonçalves G, Ben-Dor E, et al. Current Practices in UAS-based Environmental Monitoring. Remote Sens. 2020;12(6):1001.
26. Bernaldo P, Pereyra Irujo G. Proyecto “Vuela.” Liinc Em Rev. 2018;14(1).
27. Proyecto Vuela. Proyecto Vuela: ciencia libre con drones. 2018. Available from: http://vuela.cc/.
28. Amorós L, Varona N, Boronat R, Rangholia C. Flone: una plataforma para que los smartphones puedan volar. 2015. Available from: https://flone.cc/.
29. Meier L, Tanskanen P, Heng L, Lee GH, Fraundorfer F, Pollefeys M. PIXHAWK: A micro aerial vehicle design for autonomous flight using onboard computer vision. Auton Robots. 2012;33(1):21–39.
30. Pixhawk Project. Pixhawk: the hardware standard for open-source autopilots. 2014. Available from: https://pixhawk.org/.
31. ArduPilot Project. ArduPilot. 2016. Available from: https://ardupilot.org.
32. Short J, Mackay R, Robustini M. ArduCopter. 2015. Available from: https://github.com/ArduPilot/ardupilot/blob/master/ArduCopter/Copter.cpp.
33. Vautherin J, Rutishauser S, Schneider-Zapp K, Choi HF, Chovancova V, Glass A, et al. Photogrammetric accuracy and modeling of rolling shutter cameras. In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences. Copernicus GmbH; 2016. p. 139–46. Available from: https://www.isprs-ann-photogramm-remote-sens-spatial-inf-sci.net/III-3/139/2016/.
34. CHDK Development Team. Canon Hack Development Kit. 2007. Available from: https://chdk.fandom.com/wiki/CHDK.
35. O’Connor J, Smith MJ, James MR. Cameras and settings for aerial surveys in the geosciences: Optimising image data. Prog Phys Geogr Earth Environ. 2017;41(3):325–44.
36. Svensgaard J, Jensen SM, Westergaard JC, Nielsen J, Christensen S, Rasmussen J. Can reproducible comparisons of cereal genotypes be generated in field experiments based on UAV imagery using RGB cameras? Eur J Agron. 2019;106:49–57.
37. Sankaran S, Zhou J, Khot LR, Trapp JJ, Mndolwa E, Miklas PN. High-throughput field phenotyping in dry bean using small unmanned aerial vehicle based multispectral imagery. Comput Electron Agric. 2018;151:84–92.
38. Jewan SYY, Pagay V, Billa L, Tyerman SD, Gautam D, Sparkes D, et al. The feasibility of using a low-cost near-infrared, sensitive, consumer-grade digital camera mounted on a commercial UAV to assess Bambara groundnut yield. Int J Remote Sens. 2022;43(2):393–423.
39. OpenDroneMap Authors. OpenDroneMap: A command line toolkit to generate maps, point clouds, 3D models and DEMs from drone, balloon or kite images. 2020. Available from: https://github.com/OpenDroneMap/ODM.
40. QGIS Development Team. QGIS: a Free and Open Source Geographic Information System. 2002. Available from: https://www.qgis.org/.
41. Bannari A, Morin D, Bonn F, Huete AR. A review of vegetation indices. Remote Sens Rev. 1995;13(1–2):95–120.
42. Gitelson AA, Kaufman YJ, Stark R, Rundquist D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens Environ. 2002;80(1):76–87.
43. Zhou M, Ma X, Wang K, Cheng T, Tian Y, Wang J, et al. Detection of phenology using an improved shape model on time-series vegetation index in wheat. Comput Electron Agric. 2020;173:105398.
44. Jiang Z, Huete AR, Chen J, Chen Y, Li J, Yan G, et al. Analysis of NDVI and scaled difference vegetation index retrievals of vegetation fraction. Remote Sens Environ. 2006;101(3):366–78.
45. Sinergise Laboratory. EO Browser. 2022. Available from: https://apps.sentinel-hub.com/eo-browser/?zoom=18&lat=-37.76428&lng=-58.51554&datasetId=S2L2A&fromTime=2020-12-06&toTime=2020-12-07.
46. Guo W, Carroll ME, Singh A, Swetnam TL, Merchant N, Sarkar S, et al. UAS-Based Plant Phenotyping for Research and Breeding Applications. Plant Phenomics. 2021;2021:9840192. doi: 10.34133/2021/9840192
47. Mesquita GP, Rodríguez-Teijeiro JD, Oliveira RR de, Mulero-Pázmány M. Steps to build a DIY low-cost fixed-wing drone for biodiversity conservation. PLOS ONE. 2021;16(8):e0255559. doi: 10.1371/journal.pone.0255559
48. Anderson K, Griffiths D, DeBell L, Hancock S, Duffy JP, Shutler JD, et al. A Grassroots Remote Sensing Toolkit Using Live Coding, Smartphones, Kites and Lightweight Drones. PLOS ONE. 2016;11(5):e0151564. doi: 10.1371/journal.pone.0151564
49. Parker A, Dosemagen S, Molloy J, Bowser A, Novak A. Open Hardware: An Opportunity to Build Better Science. The Wilson Center; 2021. Available from: https://diplomacy21-adelphi.wilsoncenter.org/sites/default/files/media/uploads/documents/STIP%20Open%20Hardware%20An%20Opportunity%20to%20Build%20Better%20Science_0.pdf.
50. Mulla D, Khosla R. Historical Evolution and Recent Advances in Precision Farming. In: Lal R, Stewart BA, eds. Soil Specific Farming: Precision Agriculture. Boca Raton, FL, USA: CRC Press; 2016.
51. Svensgaard J, Jensen SM, Christensen S, Rasmussen J. The importance of spectral correction of UAV-based phenotyping with RGB cameras. Field Crops Res. 2021;269:108177.
52. Holman FH, Riche AB, Castle M, Wooster MJ, Hawkesford MJ. Radiometric Calibration of ‘Commercial off the Shelf’ Cameras for UAV-Based High-Resolution Temporal Crop Phenotyping of Reflectance and NDVI. Remote Sens. 2019;11(14):1657.
53. Ashapure A, Jung J, Chang A, Oh S, Maeda M, Landivar J. A Comparative Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using Multi-Temporal UAS Data. Remote Sens. 2019;11(23):2757.
54. Costa L, Nunes L, Ampatzidis Y. A new visible band index (vNDVI) for estimating NDVI values on RGB images utilizing genetic algorithms. Comput Electron Agric. 2020;172:105334.
55. Davidson C, Jaganathan V, Sivakumar AN, Czarnecki JMP, Chowdhary G. NDVI/NDRE prediction from standard RGB aerial imagery using deep learning. Comput Electron Agric. 2022;203:107396.
56. Herzig P, Borrmann P, Knauer U, Klück HC, Kilias D, Seiffert U, et al. Evaluation of RGB and Multispectral Unmanned Aerial Vehicle (UAV) Imagery for High-Throughput Phenotyping and Yield Prediction in Barley Breeding. Remote Sens. 2021;13(14):2670.
57. Zahawi RA, Dandois JP, Holl KD, Nadwodny D, Reid JL, Ellis EC. Using lightweight unmanned aerial vehicles to monitor tropical forest recovery. Biol Conserv. 2015;186:287–95.
58. Zhang F, Hassanzadeh A, Kikkert J, Pethybridge SJ, van Aardt J. Comparison of UAS-Based Structure-from-Motion and LiDAR for Structural Characterization of Short Broadacre Crops. Remote Sens. 2021;13(19):3975.
59. Carver JC, Weber N, Ram K, Gesing S, Katz DS. A survey of the state of the practice for research software in the United States. PeerJ Comput Sci. 2022;8:e963. doi: 10.7717/peerj-cs.963
60. Hocquet A, Wieber F. Epistemic issues in computational reproducibility: software as the elephant in the room. Euro Jnl Phil Sci. 2021;11(2):38.
61. Jiménez RC, Kuzak M, Alhamdoosh M, Barker M, Batut B, Borg M, et al. Four simple recommendations to encourage best practices in research software. F1000Research. 2017;6. doi: 10.12688/f1000research.11407.1
62. Anzt H, Bach F, Druskat S, Löffler F, Loewe A, Renard BY, et al. An environment for sustainable research software in Germany and beyond: current state, open challenges, and call for action. F1000Research. 2021;9:295.
63. Konkol M, Kray C, Pfeiffer M. Computational reproducibility in geoscientific papers: Insights from a series of studies with geoscientists and a reproduction study. Int J Geogr Inf Sci. 2019;33(2):408–29.
64. Li Z, Seering W. Does Open Source Hardware Have a Sustainable Business Model? An Analysis of Value Creation and Capture Mechanisms in Open Source Hardware Companies. Proc Des Soc Int Conf Eng Des. 2019;1(1):2239–48.
65. Wyngaard J, Barbieri L, Thomer A, Adams J, Sullivan D, Crosby C, et al. Emergent Challenges for Science sUAS Data Management: Fairness through Community Engagement and Best Practices Development. Remote Sens. 2019;11(15):1797.
66. Thomer A, Barbieri L, Wyngaard J, Swanz S. A Minimum Information Framework for capturing FAIR data with small Uncrewed Aircraft Systems. 2021. Available from: https://eartharxiv.org/repository/view/2593/.
67. DroneDB Authors. DroneDB—Effortless Aerial Data Management and Sharing. 2020. Available from: https://dronedb.app/.

Decision Letter 0

Vona Méléder

18 Jan 2023

PONE-D-22-24873 Open Science Drone Toolkit: open source hardware and software for aerial data capture. PLOS ONE

Dear Dr. Pereyra Irujo,

Thank you for submitting your manuscript to PLOS ONE. First, I would like to apologize for the delay in the review; it is not due to the quality of your work, but to the lack of available experts to review it. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Mar 04 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Vona Méléder, Ph.D.

Academic Editor

PLOS ONE

Journal requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf

and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf.

2. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability.

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.

3. We note that [Figures 2-4] in your submission contain [map/satellite] images which may be copyrighted. All PLOS content is published under the Creative Commons Attribution License (CC BY 4.0), which means that the manuscript, images, and Supporting Information files will be freely available online, and any third party is permitted to access, download, copy, distribute, and use these materials in any way, even commercially, with proper attribution. For these reasons, we cannot publish previously copyrighted maps or satellite images created using proprietary data, such as Google software (Google Maps, Street View, and Earth). For more information, see our copyright guidelines: http://journals.plos.org/plosone/s/licenses-and-copyright.

We require you to either (1) present written permission from the copyright holder to publish these figures specifically under the CC BY 4.0 license, or (2) remove the figures from your submission:

a. You may seek permission from the original copyright holder of Figures 2-4 to publish the content specifically under the CC BY 4.0 license. 

We recommend that you contact the original copyright holder with the Content Permission Form (http://journals.plos.org/plosone/s/file?id=7c09/content-permission-form.pdf) and the following text:

“I request permission for the open-access journal PLOS ONE to publish XXX under the Creative Commons Attribution License (CCAL) CC BY 4.0 (http://creativecommons.org/licenses/by/4.0/). Please be aware that this license allows unrestricted use and distribution, even commercially, by third parties. Please reply and provide explicit written permission to publish XXX under a CC BY license and complete the attached form.”

Please upload the completed Content Permission Form or other proof of granted permissions as an “Other” file with your submission.

In the figure caption of the copyrighted figure, please include the following text: “Reprinted from [ref] under a CC BY license, with permission from [name of publisher], original copyright [original copyright year].”

b. If you are unable to obtain permission from the original copyright holder to publish these figures under the CC BY 4.0 license or if the copyright holder’s requirements are incompatible with the CC BY 4.0 license, please either i) remove the figure or ii) supply a replacement figure that complies with the CC BY 4.0 license. Please check copyright information on all replacement figures and update the figure caption with source information. If applicable, please specify in the figure caption text when a figure is similar but not identical to the original image and is therefore for illustrative purposes only.

The following resources for replacing copyrighted map figures may be helpful:

USGS National Map Viewer (public domain): http://viewer.nationalmap.gov/viewer/

The Gateway to Astronaut Photography of Earth (public domain): http://eol.jsc.nasa.gov/sseop/clickmap/

Maps at the CIA (public domain): https://www.cia.gov/library/publications/the-world-factbook/index.html and https://www.cia.gov/library/publications/cia-maps-publications/index.html

NASA Earth Observatory (public domain): http://earthobservatory.nasa.gov/

Landsat: http://landsat.visibleearth.nasa.gov/

USGS EROS (Earth Resources Observatory and Science (EROS) Center) (public domain): http://eros.usgs.gov/#

Natural Earth (public domain): http://www.naturalearthdata.com/.

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: 1. This is a valuable project that will be of interest, and I believe a revised version should be published; my two main recommendations are: a) carefully address reproducibility and replicability (R&R) literature, and b) do not try to “validate” data collected from the platforms but provide comparisons

2. Abstract; given the relatively large altitude of an orbital platform, it is not clear why satellite imagery would be the most obvious modality for reference

3. Abstract; reproducibility is mentioned as a benefit of the Open Science Drone Toolkit, though this is presented as a conclusion rather than an area of inquiry; how does one ascertain reproducibility and what about replicability and other stakeholder interests?

4. Line 36; it is not clear what the authors mean when using the word “reproduction”, which has limited reference in The Royal Society (2012). The National Academies (2019) provides a much needed clarification on terminology regarding reproducibility and replicability, with the latter being mentioned far more often by The Royal Society (2012): National Academies of Sciences, Engineering, and Medicine. (2019). Reproducibility and replicability in science. National Academies Press. https://www.nap.edu/catalog/25303; these definitions and their implications would need to be addressed throughout the manuscript in order to avoid confusion with recent progress on reproducibility and replicability (R&R)

5. Line 44; if the user makes improvements to open source software, as described here, does that contribute to (or detract from) its benefits for others?

6. Lines 100-102; the statement that the hardware which is not open-hardware can be “readily replaced” doesn’t make sense in the context of the study; if the various commercial components such as Windows computers are convenient, why not commercial drones?

7. P. 6; when referencing “GPS”, this would typically refer to Global Positioning System which is a United States platform; wouldn’t this benefit from a more generic reference, given international scope?

8. Line 185; I doubt Sentinel 2 imagery can be used to “validate” drone imagery that has a spatial resolution of 2 x 2 cm; these are very different sources and it is not clear what is meant by “validate”; I would strongly suggest including these efforts as a comparison but not as a validation

9. The use of the handheld sensor is helpful, though still a comparison rather than a validation; I believe the latter would require a far more sophisticated process which draws deeply upon literature regarding error expression and propagation as well as uncertainty

10. You can reference literature that has already done extensive work on low altitude drone imaging using commercial sensors, and its comparisons with reference data; the fact that you are using an open hardware drone should not change the fundamental process of low altitude aerial image streams

Reviewer #2: Thanks for the opportunity to review this paper, I really enjoyed reading it and learning about your work! The paper is well written and provides documentation of interesting science that is well aligned with my area of interest. I have attached my comments in the document and hopefully they prove useful for you in a minor revision.

I also suggest that the figure quality needs to be increased - they are quite blurry on my screen.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Karen Joyce

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Attachment

Submitted filename: PONE-D-22-24873review.pdf

PLoS One. 2023 Apr 6;18(4):e0284184. doi: 10.1371/journal.pone.0284184.r002

Author response to Decision Letter 0


16 Feb 2023

Review Comments to the Author - Reviewer #1

Reviewer's comment: 1. This is a valuable project that will be of interest, and I believe a revised version should be published; my two main recommendations are: a) carefully address reproducibility and replicability (R&R) literature, and b) do not try to “validate” data collected from the platforms but provide comparisons

Authors' response: We appreciate the reviewer’s comments and valuable suggestions. We have addressed both of the main recommendations, as detailed in the responses to the following comments.

Reviewer's comment: 2. Abstract; given the relatively large altitude of an orbital platform, it is not clear why satellite imagery would be the most obvious modality for reference

Authors' response: Both satellite imagery and hand-held sensors (such as the one used in our study) are among the tools most frequently used by farmers to monitor their crops, have been in use for around 20 years, and can therefore safely be considered well-established, trusted reference sensors (we included a comment and reference in the Discussion section to clarify this, lines 250-252 *). An “obvious” choice could have been another UAV, but none was available to us at the time, and the myriad of currently available platforms and sensors would also have made it difficult to select one of them as a reference (the citations added in line 255 [52-55] include many of those sensors, as detailed in our response to Comment #10).

* Line numbers mentioned in our responses correspond to the revised version with tracked changes

Reviewer's comment: 3. Abstract; reproducibility is mentioned as a benefit of the Open Science Drone Toolkit, though this is presented as a conclusion rather than an area of inquiry; how does one ascertain reproducibility and what about replicability and other stakeholder interests?

Authors' response: As indicated previously, R&R terminology has been clarified, and therefore the reference to reproducibility in its generic sense was removed from the Abstract and the Conclusions. Our work does not attempt to evaluate the reproducibility of our computational methods nor the replicability of the results obtained with the OSDT, as it would be out of the scope of the paper. We have therefore revised the text and removed any references to the reproducibility or replicability of the methods and results of our paper. In the Introduction section, however, several references are cited which argue that open source hardware and software can be beneficial for R&R. These references were kept, but we clarified the terminology as per the NASEM (2019) report (details in the response to the following comment).

Reviewer's comment: 4. Line 36; it is not clear what the authors mean when using the word “reproduction”, which has limited reference in The Royal Society (2012). The National Academies (2019) provides a much needed clarification on terminology regarding reproducibility and replicability, with the latter being mentioned far more often by The Royal Society (2012): National Academies of Sciences, Engineering, and Medicine. (2019). Reproducibility and replicability in science. National Academies Press. https://www.nap.edu/catalog/25303; these definitions and their implications would need to be addressed throughout the manuscript in order to avoid confusion with recent progress on reproducibility and replicability (R&R)

Authors' response: The use of R&R terminology was revised throughout the manuscript, following the NASEM (2019) report, as detailed here:

Lines 31-32: “reproducible” was removed (open source software and open data provide increased transparency which could arguably contribute to reproducibility, but the reproducibility of our computational workflows was not assessed)

Line 36: “reproduction” was replaced by “replication” (we meant reproduction in the generic sense, but this idea is more clearly conveyed by the term replication as defined in the NASEM report)

Line 42: “reproduction” was replaced by “replication or reproduction” (access to scientific instruments, materials and software is necessary both for computational reproducibility or for attempting a complete replication of a previous study)

Line 50: “reproducibility of published results” was replaced by “reproducibility of published results or replication attempts” (both computational reproducibility using the same tools and data, and independent replication attempts can benefit from having access to the same tools used in the original studies)

Line 253: “reproducible and repeatable measurements” was replaced by “robust measurements” (the terms used in the paper cited to assess the quality of the measurements were not in line with the NASEM terminology, so they were replaced with a more general term)

Line 271: “reproducibility” was left unchanged (it refers to computational reproducibility being hindered by lack of data sharing standards)

Line 292: “reproducible” was removed (same reason as in the Abstract)

Reviewer's comment: 5. Line 44; if the user makes improvements to open source software, as described here, does that contribute to (or detract from) its benefits for others?

Authors' response: This sentence refers only to the freedoms conferred to the user when software is openly licensed. Any changes to the software can only impact other users if the modified version is further distributed. This is the same for proprietary or closed-source software, the difference being that in that case only the original developer can (legally) make those changes. In recent years, with the advent of subscription cloud platforms and software-as-a-service, the final user is usually not aware of software changes or updates, since they function as a “black box”. In the case of open-source software, the “risk” of many people being able to make changes is accompanied by the possibility of inspecting the code and effectively checking those changes against previous versions. We have not included any additional comments in the manuscript text, since these concepts apply to all openly licensed works and we therefore believe it is beyond the scope of our paper to discuss them more deeply. The reader can refer to the literature already cited [10-16] for further details on this topic.

Reviewer's comment: 6. Lines 100-102; the statement that the hardware which is not open-hardware can be “readily replaced” doesn’t make sense in the context of the study; if the various commercial components such as Windows computers are convenient, why not commercial drones?

Authors' response: In contrast with open source software, open source hardware can virtually never be 100% open source, and therefore relies on the concept of “available components” (https://ohwr.org/project/cernohl/wikis/faq#q-what-are-available-components) or “readily-available components and materials” (https://www.oshwa.org/definition/). The OSDT components referred to in lines 102-103 are those that can effectively be replaced without significantly altering the technical specifications of the system. The capabilities of the system are not expected to be significantly altered by using a different computer, smartphone or radio transmitter (since the drone flies autonomously), as long as they are compatible. This might not be the case for the camera, for which different brands or models could have widely varying capabilities, and we therefore suggest the use of a specific brand.

A commercial drone could indeed be used as a complete replacement, but this generally also requires the use of dedicated closed-source software, offers limited customization options, etc. The aim of our work was to reduce the use of proprietary technologies as much as possible, thereby maximizing the user's freedoms and the advantages of open tools.

Reviewer's comment: 7. P. 6; when referencing “GPS”, this would typically refer to Global Positioning System which is a United States platform; wouldn’t this benefit from a more generic reference, given international scope?

Authors' response: This was corrected as suggested:

Table 1: “GPS-guided flight” was replaced by “satellite-guided flight”

Table 1: “GPS coordinates” was replaced by “coordinates”

Line 94: “GPS capability” was replaced by “satellite navigation capability”

Table 2: “GPS recorder” was replaced by “Location recorder app”

Line 111: “GPS” was replaced by “GNSS (Global Navigation Satellite System)”

Line 142: “GPS location data” was replaced by “location data”

Line 193: “GPS logging app” was replaced by “location recorder app”

Reviewer's comment: 8. Line 185; I doubt Sentinel 2 imagery can be used to “validate” drone imagery that has a spatial resolution of 2 x 2 cm; these are very different sources and it is not clear what is meant by “validate”; I would strongly suggest including these efforts as a comparison but not as a validation

Reviewer's comment: 9. The use of the handheld sensor is helpful, though still a comparison rather than a validation; I believe the latter would require a far more sophisticated process which draws deeply upon literature regarding error expression and propagation as well as uncertainty

Authors' response: The objective in both cases was indeed to provide a comparison and not a validation, and it is referred to as such many times throughout the manuscript (lines 28, 203, 215, 223, 227, 248, 289). The term validation was wrongly included in this sentence, and has been corrected (line 188).

Reviewer's comment: 10. You can reference literature that has already done extensive work on low altitude drone imaging using commercial sensors, and its comparisons with reference data; the fact that you are using an open hardware drone should not change the fundamental process of low altitude aerial image streams

Authors' response: In the paragraph in lines 247-258, our results are discussed in relation to some of the (indeed extensive) literature on UAV aerial imaging, and particularly the comparison of similar sensors (RGB cameras) to other sensors (multispectral cameras and LiDAR systems). In the revised version we have included additional references (line 255) to studies which have compared RGB cameras to multiple sensors, i.e., Micasense RedEdge-M camera [54], SlantRange 3p sensor [53], MicaSense RedEdge 3, Parrot Sequoia and Sentera cameras [55], Parrot Sequoia camera, Tec5 HandySpec Field spectrometer [52]. As pointed out by the Reviewer, our work should only aspire to provide a comparison to reference instruments, since a true validation would indeed require a sophisticated procedure which is beyond the scope of our paper.
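To illustrate the kind of comparison (rather than validation) discussed here, the following minimal sketch uses invented NDVI values, not data from the study, to show how agreement between the toolkit and a second instrument can be summarized by a correlation coefficient:

```python
import numpy as np

# Minimal sketch with invented NDVI values (not data from the study):
# a comparison between two instruments summarized by Pearson correlation.
drone_ndvi = np.array([0.41, 0.55, 0.62, 0.70, 0.48, 0.66])
sensor_ndvi = np.array([0.39, 0.57, 0.60, 0.73, 0.45, 0.69])

r = np.corrcoef(drone_ndvi, sensor_ndvi)[0, 1]
print(f"Pearson r = {r:.3f}")  # high r indicates agreement, not validity
```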

---

Review Comments to the Author - Reviewer #2

Reviewer's comment: Thanks for the opportunity to review this paper, I really enjoyed reading it and learning about your work! The paper is well written and provides documentation of interesting science that is well aligned with my area of interest. I have attached my comments in the document and hopefully they prove useful for you in a minor revision.

Authors' response: We appreciate the reviewer’s comments and valuable suggestions. All the in-line comments in the manuscript have been addressed, as detailed below.

Reviewer's comment: I also suggest that the figure quality needs to be increased - they are quite blurry on my screen.

Authors' response: The figures appear in low resolution in the review PDF, but were uploaded at the required resolution to the editorial platform.

Reviewer's comments in the document:

Line 48: Comment: Might be worth noting that sometimes a disadvantage is that open hardware and software is often developed through a grant or similar and has no business model to maintain, sustain, or further develop [...]

Authors' response: We agree with the concerns about the sustainability of open source projects. We have added a paragraph on this issue in the Discussion section (lines 259-267).

Line 69: “The objective of this work was to address this question by collecting, curating, organizing and testing…” - Comment: processing? analysing?

Authors' response: We kept this phrase unchanged, since we consider that the processing and analysis of the data are included in the “testing” step.

Line 108: Comment: I like the name

Authors' response: Thanks!

Line 122: “...with approximately 30% of battery capacity remaining to return to the launch site and land safely.” - Comment: Caveat that this is based on how far away (horizontally and vertically) the drone is from the home point, noting that there should be 20% battery remaining on landing

Authors' response: We agree that the time necessary to return to the launch site and land should be considered in the total flight time, and that there should be at least 20% battery remaining after that. We tried to make a conservative estimate by considering 30% battery remaining, but it is true that our phrasing implied that the “extra” 10% would be sufficient for returning to the home point and landing in all cases, which is not correct. We therefore modified the sentence to avoid this implication (line 124).
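As a rough illustration of this kind of conservative estimate, the following minimal sketch (all battery and current-draw figures are hypothetical, not OSDT specifications) computes how long a survey can last before the 30% reserve threshold is reached:

```python
# Minimal sketch with hypothetical figures (not OSDT specifications):
# mission time available before reaching a 30% battery reserve.

BATTERY_MAH = 5000        # hypothetical battery capacity (mAh)
AVG_DRAW_A = 15.0         # hypothetical average current draw in flight (A)
RESERVE_FRACTION = 0.30   # capacity kept for return-to-launch and landing

def usable_mission_minutes(capacity_mah, avg_draw_a, reserve):
    """Minutes of survey flight before the reserve threshold is reached."""
    usable_mah = capacity_mah * (1.0 - reserve)
    return usable_mah / (avg_draw_a * 1000.0) * 60.0  # mAh / mA = hours

t = usable_mission_minutes(BATTERY_MAH, AVG_DRAW_A, RESERVE_FRACTION)
print(f"Plan the survey for at most {t:.1f} minutes")
# Whether the 30% reserve is actually enough to return home still depends
# on the drone's horizontal and vertical distance from the launch point.
```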

Line 144: “This process is called geotagging … and can be performed using the Mission Planner ground station software.” - Comment: as long as you remember to set the time properly on the camera

Authors' response: This is indeed an important thing to consider, and as such it is included in the “usage guide” of the OSDT. Also, since this is easy to forget, an alternative is to measure the time offset afterwards and use that offset in the geotagging tool.
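As a minimal sketch of this offset-based approach (the track, offset value and matching logic below are hypothetical and simplified, not Mission Planner's implementation), a measured clock offset can be applied to each photo timestamp before matching it to the nearest fix in the flight log:

```python
from bisect import bisect_left
from datetime import datetime, timedelta

# Hypothetical (timestamp, latitude, longitude) fixes from the flight log
gps_track = [
    (datetime(2023, 2, 16, 14, 0, 0), -37.7634, -58.2981),
    (datetime(2023, 2, 16, 14, 0, 1), -37.7634, -58.2980),
    (datetime(2023, 2, 16, 14, 0, 2), -37.7635, -58.2979),
]

# Offset measured afterwards, e.g. by photographing a display of GPS time
camera_offset = timedelta(seconds=-2)  # camera clock runs 2 s ahead

def geotag(photo_time):
    """Return the GPS fix closest in time to the corrected photo timestamp."""
    corrected = photo_time + camera_offset
    times = [t for t, _, _ in gps_track]
    i = bisect_left(times, corrected)
    candidates = [c for c in (i - 1, i) if 0 <= c < len(gps_track)]
    return gps_track[min(candidates, key=lambda c: abs(times[c] - corrected))]

print(geotag(datetime(2023, 2, 16, 14, 0, 3)))  # matches the 14:00:01 fix
```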

Line 202: Comment: I'm missing in this paragraph the height of the sensor from the feature, its FOV, and therefore the 'footprint' of the measurement. Also maybe something about how the measurements were geo-located?

Authors' response: The missing information was included (lines 211-214)

Lines 202-208: [various corrections]

Authors' response: The text was corrected as suggested

Line 214: Comment: Would be good to note the spatial accuracy of the GPS and how you are effectively tying 2cm (GSD from drone image) to a satellite image and field data measurements when presumably there could be meters discrepancy? I understand that you can't resolve for this within the scope of the study, but would like to see recognition of this issue

Authors' response: We included a sentence which warns about this possible discrepancy in the case of the drone-satellite comparison (lines 205-208). In the case of the handheld sensor-drone comparison, a visible physical mark combined with ground measurements and the aid of the crop rows was used to aid in the correspondence (lines 211-213).
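One common way to make a 2 cm drone raster comparable to 10 m satellite pixels despite such positional uncertainty is to average the drone pixels over each satellite pixel footprint. A minimal sketch follows, assuming (hypothetically) that both rasters are already co-registered on the same grid:

```python
import numpy as np

# Minimal sketch with fake data: aggregating a 2 cm drone NDVI raster to a
# 10 m satellite pixel footprint, assuming both grids are already aligned.
DRONE_GSD_M = 0.02
SAT_GSD_M = 10.0
BLOCK = int(SAT_GSD_M / DRONE_GSD_M)  # 500 drone pixels per satellite pixel

rng = np.random.default_rng(0)
drone_ndvi = rng.uniform(0.2, 0.9, size=(BLOCK * 3, BLOCK * 2))  # fake raster

# Mean NDVI of each BLOCK x BLOCK window, one value per satellite pixel
h, w = drone_ndvi.shape
aggregated = drone_ndvi.reshape(h // BLOCK, BLOCK, w // BLOCK, BLOCK).mean(axis=(1, 3))
print(aggregated.shape)  # (3, 2): comparable to a 3 x 2 window of satellite pixels
```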

Lines 224-225: Comment: why not potentially commercial grade? I'm guessing here that you're imagining 'research grade' is lower level - but no reason why your results can't demonstrate value to the commercial sector? If the results are comparable to proprietary systems, why not?!

Authors' response: We agree about the possibility of commercial applications, but the scope of our work and the toolkit is scientific applications. We did not imply either lower or higher level, but requirements that are specific to scientific work (e.g., not only being able to provide a valid measurement but also being transparent about the algorithms that lead to that result). We nevertheless included a small comment to suggest the possibility of commercial applications (line 237).

Line 234: Comment: I think that it's often the commercial sector that perhaps hints at the 'quality' or lack thereof of open source (perhaps a reasonable tactic to keep themselves in business, which of course is necessary for them too), but the reliability side of things can be a real issue (see my point in the intro).

Authors' response: As indicated in the response to the first comment, a paragraph on this topic was added to the Discussion section (lines 259-267).

Line 239: Comment: I would also argue that in many cases, the high spatial resolution is just as, if not more, important than gaining an extra couple of bands. Multispec cameras are notoriously low in spatial resolution, and people who don't understand this trade-off relationship are often left disappointed with their purchases. The Phantom multispec for example is 2MP vs 20MP for their standard RGB (I know you are going with open source, but just for comparison here).

Another thought actually could be to hack a Canon or similar to remove the NIR filter and then put filters back on in the wavelengths of interest - that would be more in your flavour and keep the spatial resolution of the sensor.

Note that I suggest the above as comments to consider adding in the discussion text, not for re-doing or adding to the scope of your current paper

Authors' response: We agree on the advantages of the higher spatial resolution of RGB over multispectral cameras, as clearly explained by the Reviewer. We added a comment and a reference to a paper where this is discussed in more detail (line 255). We also added references about the possibility of using modified RGB cameras for NIR imaging (lines 145-147).
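As a minimal sketch of the index computation involved (hypothetical reflectance values; real NIR-converted cameras differ in which channel records NIR and require calibration), NDVI is computed per pixel as (NIR - Red) / (NIR + Red):

```python
import numpy as np

# Minimal sketch with fake reflectance values: per-pixel NDVI from a camera
# modified so that one channel records NIR. Real conversions need calibration.
red = np.array([[0.20, 0.25], [0.22, 0.30]])  # hypothetical red band
nir = np.array([[0.60, 0.55], [0.65, 0.50]])  # hypothetical NIR band

ndvi = (nir - red) / (nir + red + 1e-9)  # epsilon guards against div by zero
print(np.round(ndvi, 2))
```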

Line 244: “...little attention has been paid to the management…” - Comment: and storage

Authors' response: We included the text as suggested.

Attachment

Submitted filename: Response to reviewers.pdf

Decision Letter 1

Vona Méléder

20 Mar 2023

PONE-D-22-24873R1

Open Science Drone Toolkit: open source hardware and software for aerial data capture

PLOS ONE

Dear Dr. Pereyra Irujo,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. The reviewers highlighted the fact that the authors have very carefully considered the reviewers' comments. This is an important study involving a significant effort. There is still a minor recommendation to be addressed before publication.

Please submit your revised manuscript by May 04 2023 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,

Vona Méléder, Ph.D.

Academic Editor

PLOS ONE

Journal Requirements:

Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: N/A

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: Thank you for the opportunity to review this new version of the manuscript. You have carefully addressed the reviewer comments. I am pleased to recommend publication with the following minor recommendation:

I still get a sense that the comparative data are being presented as "reference" and as an "evaluation" rather than simply a comparison; the above words probably require a different approach, which I don't believe is necessary; I suggest changing these words to "comparison" or to a close synonym.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

**********

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2023 Apr 6;18(4):e0284184. doi: 10.1371/journal.pone.0284184.r004

Author response to Decision Letter 1


22 Mar 2023

Review Comments to the Author - Reviewer #1:

"Thank you for the opportunity to review this new version of the manuscript. You have carefully addressed the reviewer comments. I am pleased to recommend publication with the following minor recommendation:

I still get a sense that the comparative data are being presented as "reference" and as an "evaluation" rather than simply a comparison; the above words probably require a different approach, which I don't believe is necessary; I suggest changing these words to "comparison" or to a close synonym."

Response: We appreciate the Reviewer’s recommendation for publication. As suggested, the words “reference” and “evaluation” were deleted or replaced with “comparison” or similar words, as follows:

- Lines 28-30 (Abstract): “Performance was evaluated by…” was replaced by “Data obtained with this toolkit … was compared to…”, and “both reference instruments” was changed to simply “both instruments”.

- Line 188: “two reference sources” was changed to “two sources”.

- Line 248: “both reference instruments” was replaced by “both instruments”.

- Lines 288-290: Same changes as in the Abstract.

Attachment

Submitted filename: response-to-reviewers-rev2.pdf

Decision Letter 2

Vona Méléder

27 Mar 2023

Open Science Drone Toolkit: open source hardware and software for aerial data capture

PONE-D-22-24873R2

Dear Dr. Pereyra Irujo,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Vona Méléder, Ph.D.

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Acceptance letter

Vona Méléder

29 Mar 2023

PONE-D-22-24873R2

Open Science Drone Toolkit: open source hardware and software for aerial data capture

Dear Dr. Pereyra Irujo:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Vona Méléder

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 File. Alternative language article.

    Complete manuscript in Spanish: “Herramientas de código abierto para la captura de datos aéreos mediante drones” (“Open source tools for aerial data capture using drones”).

    (PDF)

    Attachment

    Submitted filename: PONE-D-22-24873review.pdf

    Attachment

    Submitted filename: Response to reviewers.pdf

    Attachment

    Submitted filename: response-to-reviewers-rev2.pdf

    Data Availability Statement

    Open Science Drone Toolkit details are publicly available from Vuela (https://vuela.cc/), and data for the example use case is publicly available from the OSF repository (https://osf.io/t7y6x/).

