
Journal of Intelligent & Robotic Systems (2023) 107:23

https://doi.org/10.1007/s10846-022-01761-7

REGULAR PAPER

End-to-end Precision Agriculture UAV-Based Functionalities Tailored to Field Characteristics
Emmanuel K. Raptis1,2 · Marios Krestenitis2 · Konstantinos Egglezos3 · Orfeas Kypris3 ·
Konstantinos Ioannidis2 · Lefteris Doitsidis4 · Athanasios Ch. Kapoutsis2 · Stefanos Vrochidis2 ·
Ioannis Kompatsiaris2 · Elias B. Kosmatopoulos1,2

Received: 24 June 2022 / Accepted: 24 October 2022 / Published online: 27 January 2023
© The Author(s) 2023

Abstract
This paper presents a novel, low-cost, user-friendly Precision Agriculture platform that attempts to alleviate the drawbacks
of limited battery life by carefully designing missions tailored to each field’s specific, time-changing characteristics. The
proposed system is capable of designing coverage missions for any type of UAV, integrating field characteristics into the
resulting trajectory, such as irregular field shape and obstacles. The collected images are automatically processed to create
detailed orthomosaics of the field and extract the corresponding vegetation indices. A novel mechanism is then introduced
that automatically extracts possible problematic areas of the field and subsequently designs a follow-up UAV mission to
acquire extra information on these regions. The toolchain concludes with a deep-learning module developed specifically to detect weeds in the imagery of this close-inspection flight. To build this module, a new weed dataset captured from the UAV's perspective, which is publicly available for download, was collected and annotated. All the above functionalities
are enclosed in an open-source, end-to-end platform, named Cognitional Operations of micro Flying vehicles (CoFly). The
effectiveness of the proposed system was tested and validated with extensive experimentation in agricultural fields with
cotton in Larissa, Greece, during two different growing seasons.

Keywords Precision agriculture · UAVs · Coverage · Remote sensing · Site-specific inspection · Convolutional neural networks · Weed detection

Emmanuel K. Raptis
[email protected]

1 Department of Electrical and Computer Engineering, Democritus University of Thrace, Xanthi, 67100, Greece
2 Information Technologies Institute, Centre for Research & Technology Hellas, Thessaloniki, 57001, Greece
3 iKnowHow SA, Athens, 11526, Greece
4 School of Production Engineering and Management, Technical University of Crete, Chania, 73100, Greece

1 Introduction

The course of human history can be characterized as a continuous struggle for food, wealth distribution, and prosperity. The industrial revolution reformed this process by setting the basis for mass and automated production. The technological advancements in the agricultural sector reflect this constant human effort to guarantee adequate food resources. Nowadays, increasing the global productivity and quality of food resources remains a challenging task for humanity.

From an economic standpoint, the primary sector accounts for a large share of activity, especially in less heavily industrialized countries, where agriculture remains a prime source of income and export revenue. Increasing food production is a prerequisite not only for human well-being but also for economic growth. Considering that, apart from food, agriculture is also responsible for the production of fabric, wood, paper, etc., raising the overall productivity rate remains a crucial issue from an economic perspective as well. However, while the production of all these vital goods is becoming more and more demanding due to continuous population growth [41], the already finite arable land is further limited by extensive farming that deteriorates soil quality and increases the regions unsuitable for cultivation [51]. According to the Food and Agriculture Organization of the United Nations (FAO), it is estimated that

annually around 20%-40% of any agricultural region is lost due to pests and diseases [21]. This regional cultivation loss, along with the constant need for increased productivity, leads to excessive use of pesticides that clearly threatens human health and pollutes the environment.

The worldwide prevalence of the aforementioned issues has prompted a call for immediate action and improvement in the agricultural sector [26], aiming to develop more effective vegetative treatment strategies for enhancing the quality and quantity of crop yields. Towards this direction, utilizing remote sensing technology holds the promise of a feasible solution to the aforementioned challenges. Processing of sensory data based on artificial intelligence and visual analysis can provide propitious, so-called Precision Agriculture (PA) techniques for monitoring food, fabric, wood, etc. production. These PA techniques focus on building systems for automating important farming operations such as monitoring the crop's growth and performance, assessing its fertilizer rate and water level, etc. To achieve that, data acquisition from the agricultural fields can provide information to the farmers regarding crop diseases, pest infestations, water stress, and other relevant issues that affect crop productivity. By further processing the acquired sensory data, PA systems aim to correlate production techniques with early crop management decisions to reduce agrochemical use and manage natural resources more efficiently while boosting overall productivity.

However, particular challenges need to be met to allow the uptake of innovative technologies in smart farming to reach its promise [43]. The main challenge when combining innovative technologies in precision agriculture is the standardization of technology and the data synchronization of the involved assets. Operating standards across different technologies may differ, as many commercial or non-commercial services adopted in the agricultural sector have been developed based on completely different conceptual approaches and are intended to operate individually and only for a specific task (e.g., mission planning, image processing, etc.). This lack of interoperability causes monotonous and time-consuming work for the operators, since they have to manually feed the output data from one subsystem to another. Another challenging task is the data management plan. Even a small farm produces thousands of data points in a crop season, from which information must be gathered to estimate and assess its performance. Attempting to process and understand these data without a field service management system to store them, whether daily, monthly, or seasonally, can be overwhelming for both agriculturalists and farmers. Last but not least, the lack of scalability of many robot-related services, which are designed only for scanning operations, results in limited energy autonomy when operating across hundreds of acres [9]. For precision agriculture to take hold, any technological tool that is expected to assist farmers must be scalable to the size of their needs, expectations, and farming operations. Conceptually, any agricultural software that integrates robotics and computer vision techniques must have a common, holistic view of agricultural processes, enhancing crop yield management decisions by assisting even non-expert users in proper manipulations.

In this direction, utilizing intelligent devices such as Unmanned Aerial Vehicles (UAVs) has expanded the capabilities of PA techniques by enabling flexible, wieldy solutions adjustable to the requirements of each case. Based on that, integrated end-to-end platforms have been developed to automate the procedures of UAV navigation, data collection, and processing for knowledge extraction and decision support. Thanks to such platforms, common commercial UAVs have started to bring economic and environmental benefits to agricultural production systems, making productivity in the primary sector more sustainable. In the following subsection, an overview of important and recent related works regarding end-to-end platforms in the field of agriculture is presented.

1.1 Related Work

Recent technological developments, a scale-up in sensor and UAV production with a simultaneous decrease in cost, have given rise to a booming startup industry. While in the beginning applications were limited to military use, soon enough a large number of industrial and civilian use cases emerged, one of them being precision agriculture.

According to a recent report [11], the agriculture drone market will be worth $5.19 Billion by 2025, growing at an impressive Compound Annual Growth Rate (CAGR) of 31.1% from 2019. An overview and comparison among the most significant commercial agricultural products in terms of their farming technological features can be seen in the upper section of Table 1. Pix4D [35], DroneDeploy [16], and Sentera FieldAgent [5] are commercial drone mapping software suites which incorporate digital farming technologies aimed at precision agriculture and farming development. They can produce UAV flight paths, capture aerial photographs, visualize health indices, as well as provide a timeline view of previous field scans for continuous crop monitoring purposes. Agisoft Metashape [3] is advertised as a product for creating 3D maps using photogrammetry, including crop yield and plant health maps. Botlink [12] is a UAV-based agricultural software with an integrated flight planning framework that can provide high-definition 2-D and 3-D outputs as well as vegetation indices. While Blue River Technology [47] is not a UAV technology company, but rather a provider of smart farm machines to manage crops at a plant level, their product is mention-worthy, as it uses computer vision and deep learning techniques to

Table 1 Competitiveness matrix listing commercially available products and research works, including this work (CoFly), comparing them with
respect to different features

monitor each plant in the field individually, a feature which integrated UAV products are currently lacking.

Research-wise, a large number of UAV-based applications have been developed for a variety of agricultural practices, such as crop health and yield monitoring (second half of Table 1), making up the majority of agricultural applications due to their ease of use, portability, and economic viability compared to available commercial products, while others incorporate crop spraying techniques [19], [20], which are rather limited owing to the lower payloads that most autonomous UAVs can carry. According to a recent review paper [39], most of these developed systems use commercial software for 3D photogrammetry, visualization, and health index calculation (Pix4dMapper [35], Agisoft Metashape [3]). In terms of contribution, these works mostly focus on improving specific workflows; for example, estimating the overall plant volume and the soil surface for specific crop parcels [15]; monitoring and recording the reflectance of the vegetation canopy [37]; estimating chlorophyll levels in rice paddies by evaluating vegetation indices derived from hyperspectral data [48]; periodically inspecting the status of crops while obtaining multiple images of them and calculating various indices, as well as pH level and acidity [49]; evaluating the capabilities of a hyperspectral camera in identifying the nitrogen status of rice crops [53]; monitoring and mapping vineyards, as well as identifying discontinuous crop rows based on image stitching and health index computation [29].

From a motion planning view, a great majority of bespoke systems use the ROS framework [38], while others use the Ardupilot Mission Planner software [8], Pix4DCapture [35], or any other software compatible with the drone. While the aforementioned UAV flight planning tools are broadly used and open-source, they do not offer capabilities that fit each field's characteristics, such as no-fly zones and automatic UAV-based weed detection. A deeper look into available open-source UAV flight controllers and flight simulators [17] reveals that there are ample options for flight control and mission planning [4, 6, 30]; however, none of these extend their capability beyond that to offer data post-processing.

Overall, several research works include remote sensing capabilities for a variety of UAV-based applications in precision agriculture. Despite their effectiveness and usefulness, the main drawback of these systems is the fact that they are intended for specific tasks and not for a holistic view of agricultural processes. For example, some of them focus more on flight planning and health index calculation without having an integrated system, including a friendly user interface that eases and automates the utilization of the UAVs [29, 37, 48, 49]. Others utilize separate commercial software for UAV planning in terms of data acquisition and crop health index computation, making their customization in agriculture applications difficult, as they require some expertise in order to be consolidated and manageable as an overall system [15, 53]. Besides, utilizing commercial products might be unaffordable, as some of them charge by daily or monthly use. Finally, almost none of the aforementioned commercial or research works incorporate no-fly zones inside the area of interest, but support only generic coverage missions which are universally applied to all crop fields (no site-specific missions).

1.2 Contributions

In this work, we propose a holistic approach centered on a high-level, user-friendly, and centralized platform developed to perform precision agriculture practices based on UAV functionalities, optimized for real-life use and specifically tailored to each field's characteristics. The proposed platform aims at enhancing the software-dependent system abilities in various robot-related research areas, while hardware requirements are kept to the essentials so that the cost remains low.

In a nutshell, the overall workflow of the proposed platform is illustrated in Fig. 1 and consists of the following steps.

1. UAV flight planning: Initially, a Region of Interest (ROI) with a possible no-fly zone, along with the UAV's flight planning parameters, is defined. Given this user-defined information, the CoFly framework extracts an automated mission for the drone mapping project, which can be monitored live through a smartphone.
2. Computation and display of health indices: Every second, an aerial image taken by the UAV is uploaded to the CoFly software for processing. When the mission ends, all the collected visual data are merged to provide an orthomosaic as an accurate map of the ROI. Given the orthomosaic, several Vegetation Indices (VIs) maps for assessing crop health are available for display.
3. Site-specific mission: The CoFly software classifies the available VI maps and estimates the centers of the problematic areas which are considered essential for further processing and marked as Points of Interest. On top of that, an automated site-specific mission at low altitude, focused on site-specific treatment and weed detection, is designed to provide a more thorough inspection of these problematic locations.

Fig. 1 CoFly's high-level architecture

Although several UAV platforms are available and capable of providing high-end, informative representations of the field, all of them are severely restricted by the hardware (mainly battery capabilities) of the deployed UAV, leading to limited energy autonomy and operational time. The CoFly system aims to alleviate this drawback by utilizing an extra, automatic way of also creating precision UAV flights that consider the extracted field-specific knowledge (site-specific mission). In other words, the idea of this added feature is to avoid an over-detailed scanning of the whole field, and rather to combine a quick overview with cognition capabilities to accommodate solutions and alleviate the drawbacks of the UAV's limited battery life while operating across hundreds of acres, improving, thus, the operating time and energy efficiency of the monitoring process. More specifically, the novelty of this paper lies in a holistic framework for precision agriculture integrating the following unique characteristics and custom software solutions:

– Low-cost deployment by using off-the-shelf drones,

– Easy-to-use and compatible with any commercial UAV,
– Mobility and data synchronization across devices,
– UAV built-in flight planning for automated crop monitoring,
– Integrated image processing functionalities for orthomosaic extraction and vegetation health estimation,
– UAV built-in site-specific mission based on the extracted knowledge, combined with an incorporated deep learning model for weed detection,
– No-fly zones/Obstacle avoidance,
– Timeline view for storing information according to the year's harvest.

The proposed platform aims at providing the operator/farmer with a cost-effective, end-to-end integrated system to accomplish autonomously tasks that used to be completed manually (e.g. crop growth monitoring). Besides, it plays a supportive role in decision-making objectives (e.g. possible weed infestations), enhancing crop yield management decisions by assisting non-expert users in proper manipulations. A comparison of the main features of all available platforms, as well as the advantages of the CoFly approach, is presented in Table 1.

To the best of our knowledge, CoFly is the first to integrate (i) mission planning supporting no-fly zones and obstacles, (ii) plant health index calculation-visualization, and (iii) weed detection using deep learning, with (iv) a user-friendly and intuitive graphical user interface. All of the aforementioned services are built upon modules of enhanced and properly modified well-established methodologies, providing unique characteristics toward a fully functional, end-to-end field service management system.

Our work is an open-source alternative, directly comparable to commercially available platforms, aiming to maximize the uptake of PA by end-users inexperienced in UAV flight. At the same time, we offer the research community a tool to develop and test custom solutions for specific agricultural applications. The overall CoFly ecosystem is open-source and publicly available¹ to the community.

¹ https://github.com/CoFly-Project

1.3 Paper Structure

The rest of the paper is organized as follows. Section 2 presents the translation of a general coverage motion planning framework into an optimization problem and describes the developed algorithm for tackling such a problem. In this section, a simulated evaluation is performed in different scenarios to adequately analyze the performance of the developed algorithm. Section 3 presents the proposed methodologies for extracting vegetative knowledge regarding the crop's health status, along with a pixel-wise pipeline for detecting possible problematic locations. Section 4 describes the proposed motion planning algorithm for dealing with the site-specific inspection and presents the weed detection module using deep learning methods. Section 5 presents an overview of the graphical user interface and displays the overall workflow of the system's integration. Section 6 includes the real-life experiments along with the visual results and finally, in Section 7, concluding remarks are given.

2 UAV Coverage Path Planning

2.1 Problem Formulation

For the purpose of covering a continuous area, it is essential to deploy a method that provides safe and efficient paths while ensuring complete coverage of the agricultural field. To efficiently cope with challenges faced in real-world scenarios, one should analyze all the aspects that govern Coverage Path Planning (CPP) problems [24]. With the term CPP, we refer to the task of determining a path that passes through all the points of an area while avoiding any No-Fly Zones (NFZs)/Obstacles possibly defined within the operational area.

In this paper, the proposed platform deals with the problem of designing a path given:

A. Field-based info:
– a user-defined polygon (A) which contains a series of points that define the agricultural field to be scanned,
– a user-defined sub-polygon (O), if needed, representing the NFZs/Obstacles within A.

B. UAV-based info:
– the scanning density, in meters, representing the distance between two adjacent flight path lines,
– the flight direction, in degrees, defining the direction of the path within A,
– the flight altitude of the UAV for the mission,
– the speed of the UAV for the mission.

The objective is to compile all this user-defined information and extract a set of coordinates representing a safe and efficient path, so as to provide complete coverage of A. Note that the defined problem does not assume any type of UAV, but rather adapts to each UAV's capabilities by tuning the UAV-based info (as defined previously) accordingly.
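To make these user-defined inputs concrete, the sketch below collects the field-based and UAV-based info of the problem statement into a single structure; the class and field names are illustrative and not part of the CoFly codebase:

```python
from dataclasses import dataclass
from typing import List, Tuple

LatLon = Tuple[float, float]  # (latitude, longitude) in WGS84-EPSG:4326

@dataclass
class CoverageMissionSpec:
    """User-defined CPP inputs; names are illustrative, not CoFly's API."""
    field_polygon: List[LatLon]        # polygon A: the field to be scanned
    nfz_polygons: List[List[LatLon]]   # sub-polygons O: NFZs/obstacles within A
    scanning_density_m: float          # distance between adjacent flight lines
    flight_direction_deg: float        # rotation of the path pattern within A
    flight_altitude_m: float           # mission altitude
    speed_mps: float                   # mission speed

# Example: a roughly rectangular field with one small no-fly zone inside it
spec = CoverageMissionSpec(
    field_polygon=[(39.620, 22.410), (39.620, 22.420), (39.610, 22.420), (39.610, 22.410)],
    nfz_polygons=[[(39.615, 22.414), (39.615, 22.416), (39.614, 22.416), (39.614, 22.414)]],
    scanning_density_m=30.0,
    flight_direction_deg=20.0,
    flight_altitude_m=100.0,
    speed_mps=3.0,
)
```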

2.1.1 Field Approximation on Grid

To begin with, let us assume that the agricultural field that needs to be covered has an arbitrary shape and is constrained within a two-dimensional polygon A defined by a set of coordinates (x, y). As a first step, A must be represented as a grid. For the grid approximation, A is decomposed into a finite set of equal cells which do not overlap while filling the entire area of A. The number of these cells represents the required spatial resolution of the UAV's collected images for monitoring crop growth. It should be noted that, in the proposed platform, the spatial resolution level can be specified through the values of the flight altitude and scanning density.

The grid representation of A is described as follows:

U := {(x, y) : x ∈ [1, rows], y ∈ [1, cols]}   (1)

where rows, cols enumerate the rows and columns after the grid discretization of the terrain to be covered. Consequently, the size of the grid U is given by n = rows × cols. During cellular decomposition, we also assume that there are n_O obstacles, each of the size of a cell, placed in several known positions. The set of these obstacles is described as follows:

O := {(x, y) ∈ U : (x, y) is occupied}   (2)

where each cell that corresponds to any obstructed area on the actual field is marked as occupied. Evidently, the number of cells that need to be covered is reduced to l = n − n_O. Therefore, the set of unoccupied cells that should be covered is described as follows:

L = U \ O   (3)

2.1.2 Nodes Placement

Having defined the operational area U, we proceed by defining the allowed UAV movements, from each grid cell within L, to fully cover an agricultural field A. To achieve that, an undirected weighted graph G = (V, E) is introduced. Each vertex V ∈ G is placed in the center of a grid cell of set L and represented as a node. The edges among these nodes denote the allowed movements of the UAV when navigating inside L. Generally, each node is connected with four adjacent nodes following the Von Neumann neighborhood [36]. Nodes that are placed on the boundaries of the operational area, or are adjacent to an obstacle, have fewer edges.

Definition 1 Two nodes u(x_i, y_i), ν(x_j, y_j) ∈ G are considered to be adjacent if:

‖(x_i, y_i) − (x_j, y_j)‖ = sd   (4)

where sd defines the value of scanning density in meters.

Definition 2 A valid path of length m is considered any sequence of nodes

X = {l_1, l_2, . . . , l_m}   (5)

where ∃ E(l_i, l_{i+1}) ∀ i ∈ [1, m − 1].

In other words, the set of (V, E) ∈ G within L provides us with information corresponding to a sequence of adjacent nodes allocated in the centers of the unoccupied cells. Consequently, a series of nodes that follows Def. 2 can constitute a valid path that passes through all the points of U while avoiding obstacles. Hence, the ability of G to provide a valid path specifically tailored to each field's characteristics is also directly related to the characteristic hyperparameter flight direction. Modifying the flight direction feature intends to optimize the nodes' placement inside U to create a more favorable -in terms of coverage- representation of G.

The key idea is the rotation of each node within the grid U by an angle ϑ in a counter-clockwise direction around a center-point. Given the desired angle ϑ determined by the flight direction value, the rotation of each node is defined by the following equations:

x_new = c_x + (x − c_x) · cos(ϑ) − (y − c_y) · sin(ϑ)   (6)

y_new = c_y + (x − c_x) · sin(ϑ) + (y − c_y) · cos(ϑ)   (7)

where the following constraints hold:

– A is defined in a two-dimensional space,
– the center-point (c_x, c_y) is defined as the center of A,
– (x, y) ∀ V ∈ G are expressed in the Cartesian plane,
– ϑ, which controls the rotation of G over U, is in the range [0°, 360°].

By modifying the flight's direction, more favorable configurations of G inside U may appear, where the number of nodes within L is greater. Such configurations are more likely to provide paths that achieve a higher percentage of coverage for the given A. It must be emphasized that the x_new, y_new variables are rotated with respect to the user's desired scanning density.
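A minimal sketch of the grid discretization and of the node rotation of Eqs. 6-7 follows; the helper names are hypothetical, and the rotation uses the standard counter-clockwise rotation matrix around the field's center:

```python
import numpy as np

def grid_nodes(rows: int, cols: int, sd: float) -> np.ndarray:
    """Place one node at the center of every grid cell (set U; L = U \\ O follows)."""
    xs, ys = np.meshgrid(np.arange(cols), np.arange(rows))
    return np.stack([xs.ravel() * sd, ys.ravel() * sd], axis=1).astype(float)

def rotate_nodes(nodes: np.ndarray, theta_deg: float, center: np.ndarray) -> np.ndarray:
    """Counter-clockwise rotation of every node around `center` (Eqs. 6-7)."""
    t = np.radians(theta_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    return center + (nodes - center) @ rot.T

nodes = grid_nodes(rows=10, cols=14, sd=30.0)          # sd: scanning density in meters
rotated = rotate_nodes(nodes, theta_deg=20.0, center=nodes.mean(axis=0))
```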

2.1.3 Optimization Problem

Given the mathematical description presented above, the problem of finding a minimum-size collection of edges (UAV movements) that covers all the vertices of the graph can be translated into the minimization of the covering path X, so that:

minimize m
subject to (l_i, l_{i+1}) ∈ E, i = 1, . . . , m − 1   (8)
X ⊇ L

where

– m is a positive integer which denotes the cardinality of set X,
– l = |L| is the number of unoccupied cells,
– E is the set of edges-movements,
– X is a valid path.

2.2 Navigation Algorithm

Having illustrated all the prime features that constitute a completely safe and efficient path, we proceed to present a navigation algorithm in steps, as sketched in Fig. 2, for solving the optimization problem of (8). The navigation system that we propose in this paper utilizes the Spanning Tree Algorithm [23], which deploys an off-line coverage path planning algorithm so that the UAV covers any arbitrary shape while knowing in advance any related information about the location of the NFZs/Obstacles. Conceptually, this algorithm, along with the equations of Section 2.1, tackles the aforementioned challenges and achieves the overall mission objectives within five distinct steps.

Fig. 2 Spanning-Tree Coverage algorithm main steps and terminology

Algorithm 1² outlines in pseudo-code the key steps of the proposed navigation algorithm.

² https://github.com/CoFly-Project/Waypoint-Trajectory-Planning

Step 1. As a first step, in lines 1 and 2, the user's input data A and O are transformed from WGS84-EPSG:4326 into WGS84-EPSG:3857 [44]. For ease of understanding, let us analyze the usage of this transformation. To start with, EPSG standard codes are used as Spatial Reference System Identifiers for map datasets. EPSG:4326 projects the map dataset onto a coordinate system on the surface of a sphere, while EPSG:3857 projects the map dataset onto a coordinate system on the surface of a square. The reason why this transformation is needed is that A and O are initially defined as a set of latitude/longitude pairs, while scanning density is defined in meters. In latitude/longitude (EPSG:4326), distances are not measured in meters, but rather in decimal degrees. To efficiently deal with these challenges, a Web-Pseudo Mercator projection (EPSG:3857) was used to project these latitude/longitude pairs onto a square shape which supports a metric system in meters. This projection was used for dealing with the aforementioned mathematical descriptions of Section 2.1, extracting the path points, and projecting them back to latitude/longitude pairs. Having transformed the input coordinates A, O to the Cartesian plane, in lines 3-10, the entire area of A is grouped into large square-shaped cells, each of which is either entirely blocked or entirely unblocked, as shown in Fig. 2(a). It must be noted, as the algorithm's only requirement, that the areas which include stationary obstacles within the grid cannot be smaller than a large square-shaped cell.
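The round trip between EPSG:4326 and EPSG:3857 described in Step 1 can be reproduced with the pyproj library, as in the sketch below (the sample coordinates are arbitrary):

```python
from pyproj import Transformer

# Forward: WGS84 lat/lon (EPSG:4326) -> Web/Pseudo-Mercator meters (EPSG:3857)
to_meters = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
to_latlon = Transformer.from_crs("EPSG:3857", "EPSG:4326", always_xy=True)

lon, lat = 22.41, 39.62                  # one point of polygon A
x, y = to_meters.transform(lon, lat)     # metric coordinates for the planner
lon2, lat2 = to_latlon.transform(x, y)   # back-projection for the waypoint list
```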

Algorithm 1 Trajectory planner.

Step 2. In lines 11-21, after the discretization of the field, every unoccupied large square-shaped cell is translated into a node, and for each pair of nodes a number (the weight) is assigned to each edge. These nodes and edges, regulated by the flight direction feature, form the graph G = (V, E) which defines the allowed movements of the UAV (Fig. 2(b)).
Step 3. As a next step (line 23), a graph minimization methodology using Kruskal's algorithm [32] is applied to the previously formed graph G, as illustrated in Fig. 2(c). The resulting minimum spanning tree (MST) contains the minimum number of edges among all nodes (allowed movements) such that G remains fully connected. Figure 2(d) graphically depicts the resulting MST above the original discrete area of Fig. 2(b).
Step 4. The UAV then, to cover the area of interest, circumnavigates the MST [28] in a clockwise direction along the subdivided subcells of line 22, as illustrated in Fig. 2(e). The circumnavigation of the spanning tree (line 25) generates a simple closed path X = {(Lat_1, Lon_1), (Lat_2, Lon_2), . . . , (Lat_m, Lon_m)}, where m denotes the number of waypoints (Fig. 2(f)).
Step 5. Once Algorithm 1 determines a path X, in line 27, a smoothing hyperparameter called corner radius is introduced and applied at turns. Figure 3 illustrates the path smoothing process, with the black line indicating the path X generated in Step 4, while the blue line indicates an optimal X -in terms of covering time- solution after the applied smoothing process.

Fig. 3 Path smoothing process regulated by the hyperparameter corner radius
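As an illustration of the graph reduction in Steps 2-3, the following sketch builds a toy grid graph with networkx and extracts its minimum spanning tree with Kruskal's algorithm; the cell coordinates and weights are placeholders for the actual decomposition:

```python
import networkx as nx

def mst_from_grid(nodes, edges):
    """Build graph G=(V,E) and reduce it with Kruskal's algorithm (Step 3).

    `nodes` is an iterable of (x, y) cell centers; `edges` is an iterable of
    ((x1, y1), (x2, y2), weight) tuples between Von Neumann neighbors.
    """
    g = nx.Graph()
    g.add_nodes_from(nodes)
    g.add_weighted_edges_from(edges)
    return nx.minimum_spanning_tree(g, algorithm="kruskal")

# Toy 2x2 grid with unit weights
nodes = [(0, 0), (1, 0), (0, 1), (1, 1)]
edges = [((0, 0), (1, 0), 1), ((0, 0), (0, 1), 1),
         ((1, 0), (1, 1), 1), ((0, 1), (1, 1), 1)]
mst = mst_from_grid(nodes, edges)
print(sorted(mst.edges))  # 3 edges keep the 4 nodes connected
```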

Having defined the navigation algorithm, let us provide some necessary preliminaries about the flight hyperparameters (flight altitude, scanning density, flight direction, corner radius, speed) and how they affect the overall quality of aerial mapping. These hyperparameters are analyzed below.

Fig. 4 Overlap settings regulated by the hyperparameters flight altitude and scanning density. Subfigure (a) illustrates the overlap between two sequential images on the same flight path line. Subfigure (b) illustrates the overlap between two sequential images in adjacent parallel flight path lines

Flight altitude. Low flight altitudes yield the highest resolution and best precision in mapping, with more matched features per image. In contrast, flying at a higher altitude covers more space and allows the capture of common unique features in multiple images, which can be useful when mapping areas with homogeneous imagery. Consequently, flight altitude determines the overlap between two corresponding images on the same flight path line, as sketched in Fig. 4(a) (frontlap).
Scanning density. In the proposed CPP method, the distance between two sequential trajectories is determined by the value of scanning density. As a result, this distance controls the overlap between images from two adjacent flight path lines, as sketched in Fig. 4(b) (sidelap).
Flight direction. The flight direction feature allows the user to change the direction in which the drone flies and ensure that the extracted path pattern is tailored to the field. Figure 5 illustrates an example where the flight direction feature is applied to a specific complex-shaped polygon.
Corner radius. Speed deviations and 90° turns on the spot maximize the flight time of the mission and can lead in many cases to inefficient missions. Controlling the corner radius hyperparameter indirectly minimizes the flight time by allowing smooth turns where the UAV, at a turning point, does not make a 90° turn on the spot but rather moves along the perimeter of a circle with a radius equal to the value of the corner radius. Note that the value of the corner radius should stay at low levels, e.g. 0.5-1 m, to minimize the aircraft's deceleration at the turning points and at the same time avoid deviating from the resulting coverage path X.
Speed. Last but not least, speed controls how fast the UAV flies during a mission. The UAV's flight speed varies according to the flight planning parameters and image overlap settings.

Adjusting the aforementioned hyperparameters to the crop yield can lead to an optimized setup, generating a path pattern tailored to each field's characteristics and maximizing the chance of producing high-resolution maps.

Fig. 5 Flight path direction regulated by the flight direction hyperparameter based on node setup
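To show how flight altitude and scanning density jointly set the sidelap of Fig. 4(b), the following sketch assumes a simple pinhole camera with a known horizontal field of view; the FOV value and helper names are illustrative, not CoFly parameters:

```python
import math

def ground_footprint_m(altitude_m: float, fov_deg: float) -> float:
    """Width of the ground strip imaged from a given altitude (pinhole model)."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def sidelap_ratio(scanning_density_m: float, altitude_m: float, fov_deg: float) -> float:
    """Overlap between adjacent flight lines (Fig. 4(b)); 0.0 means no overlap."""
    footprint = ground_footprint_m(altitude_m, fov_deg)
    return max(0.0, 1.0 - scanning_density_m / footprint)

# E.g. a camera with a 73-degree horizontal FOV, flown at 100 m with 30 m line spacing
print(round(sidelap_ratio(scanning_density_m=30.0, altitude_m=100.0, fov_deg=73.0), 2))
```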

2.3 Simulated Evaluation

To assess the performance of the proposed navigation algorithm with respect to the quantity of coverage and the overall quality of paths, two different metrics are considered. The first metric, Percentage of Coverage (PoC), measures the accuracy in terms of coverage, and the second metric, Estimated Flight Time (EFT), measures the efficiency in terms of coverage time.

The metric PoC intends to represent the percentage of coverage directly matched to a path X within A. Given the problem definition, PoC is defined as follows:

PoC = (n_A / n_Amax) × 100%   (9)

where n_A denotes the number of nodes placed inside a polygon A which will be used for the MST construction, and n_Amax denotes the theoretical maximum number of nodes that could fit inside a given polygon A. Thus, given A and scanning density, n_Amax is formulated as follows:

n_Amax = A (m²) / scanning density² (m²)   (10)

where A is defined in m² and scanning density is defined in meters along the x-axis and the y-axis, respectively (Algorithm 1, lines 7 & 9).

The second metric, EFT, intends to represent the estimated time, in minutes, for the UAV to follow a path. As to covering time, the EFT is determined by the simulation time.

The RotorS UAV simulator framework [22] was utilized to test and evaluate the proposed navigation algorithm. RotorS is an open-source simulator for unmanned aerial vehicles based on the Robot Operating System (ROS) and the Gazebo 3D robotics simulator. All the experiments were carried out in the waypoint-navigator package [34] of the aforementioned simulator, which is a high-level waypoint-following package that simulates a trajectory through a list of waypoints formatted in WGS84-EPSG:4326. For the representation and visualization of the trajectories we used the RViz (Robot Visualization) program [45], an auxiliary tool for visualizing our waypoint data in a real domain. RotorS was hosted on a common laptop (Intel Core i7-8750H CPU) with Ubuntu 16.04 LTS and a combination of v7.7/Kinetic as the versions of the Gazebo and ROS packages, respectively.

For the evaluation of the proposed method, a set of 6 generated polygons is used. This set includes complex-shaped and non-complex-shaped polygons, with or without obstacles. The simulations were carried out with the default UAV model of RotorS equipped with GPS and IMU sensors. Moreover, it is assumed that the UAV is initially located at (Lat_1, Lon_1) of each simulated trajectory X. For all sets, the proposed method was executed with a fixed flight altitude and speed of 100 m and 3 m/s, respectively. Additional information associated with each scenario is presented in Table 2. It should be noted that each scenario was designed in such a way as to demonstrate the performance of Algorithm 1 and to test the effect of the flight direction and corner radius hyperparameters on the resulting path in terms of coverage percentage and time. The rest of the hyperparameters were not evaluated in RotorS, as they do not directly affect the path pattern but rather the image overlap settings.

For each scenario, Algorithm 1 was employed to calculate the respective UAV path solving optimization problem (8), aiming at maximizing the PoC while minimizing the EFT. The visualization of the produced trajectories X for each scenario, along with their corresponding performance, is presented in Fig. 6 and the last two columns of Table 2. The results of the evaluation procedure are reported and analyzed in the following paragraphs.

Table 2 Evaluation metrics based on the scenario descriptions

Scenario #   Area (acres)   NFZs/Obstacles (acres)   Scanning density (m)   Flight direction (deg)   Corner radius (m)   EFT [22]    PoC
1            7.39           0                        30                     0                        0                   8:31.394    93.31%
2            7.39           0                        30                     20                       0                   8:94.510    96.32%
3            7.39           0                        30                     20                       0.5                 8:44.197    96.32%
4            10.65          0                        35                     0                        0                   10:88.281   99.48%
5            10.65          0.65                     35                     0                        0                   10:11.192   96.96%
6            10.65          0.65                     35                     0                        0.5                 9:53.855    96.96%
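Equations 9-10 can be checked directly against Table 2; a minimal sketch, assuming 1 acre ≈ 4046.86 m², reproduces the PoC of Scenario #1:

```python
def poc(n_a: int, area_m2: float, scanning_density_m: float) -> float:
    """Percentage of Coverage (Eqs. 9-10): placed nodes over the theoretical maximum."""
    n_a_max = area_m2 / scanning_density_m ** 2
    return 100.0 * n_a / n_a_max

# Scenario #1: 7.39 acres, scanning density 30 m, 31 nodes placed for the MST
area_m2 = 7.39 * 4046.86
print(round(poc(31, area_m2, 30.0), 2))  # ~93.3, matching the PoC column
```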

Fig. 6 Path visualizations in RViz. Effect of the hyperparameters flight direction & corner radius on the resulting path according to each scenario description

In Scenario #1, 31 nodes with flight direction = 0 are selected for the MST construction as important regions of interest for the UAV. In Scenario #2, by only modifying the flight direction, the navigation algorithm finds a more profitable -in terms of coverage- configuration of U inside A, with 32 nodes selected as important regions of interest. As presented in the last column of Table 2, the resulting path X of Scenario #2 is better in terms of coverage than that of Scenario #1. However, due to the increased number of waypoints, the flight time of Scenario #2 is greater than that of Scenario #1. Hence, Scenario #3 is considered to demonstrate the effectiveness of smoothing the trajectory at turns. As presented in the EFT column of Table 2, by tuning the corner radius, the overall flight time of the resulting path X of Scenario #3 decreases in contrast to Scenario #2, while providing the exact same percentage of coverage. While Scenarios #1, #2 and #3 are demonstrated on a complex-shaped polygon, Scenario #4 shows that, on a non-complex polygon, Algorithm 1 achieves a high percentage of coverage even without optimizing the node placement within U (flight direction = 0). In addition, the effectiveness of Algorithm 1 is also demonstrated in Scenarios #5 and #6. In both scenarios, 27 nodes were selected for the MST construction, while 5 nodes were considered to contain NFZs/Obstacles. For both scenarios, the resulting path, either with smoothing or not, achieves an average PoC above 96%.

Overall, the developed algorithm designs the UAV paths in such a way as to achieve a high PoC both in complex and non-complex fields while keeping the flight time as low as possible, enabling the prolonged utilization of the UAV in the agricultural application.

3 Location Estimates for Crop Health Monitoring

To maximize the system's usability and incorporate novel and cognitive functionalities, the framework involves sequentially executed processes that eventually improve the crop monitoring process. After the execution of the generic coverage mission presented in Section 2, a set of RGB georeferenced images covering the whole field has been collected. This set comprises the base of the subsequent processing procedures, aiming at i) extracting knowledge regarding the crop's health status and ii) providing identified locations over the field that correspond to the centers of detected problematic areas. The following sub-sections analyze the integrated procedures.

3.1 Image Stitching Process

The aim of the current module is to merge the collected visual data and provide an orthomosaic as an accurate map of the examined area. Towards this direction, an appropriate tool for image stitching has been deployed, which is based on the following process:

1. Firstly, the telemetry from the metadata of each image is extracted in order to speed up and improve the accuracy of the stitching process.
2. Then, a sparse point cloud is generated with the Open Structure from Motion (OpenSfM) [2] library.
3. The generated point cloud is transformed to triangle meshes via the Poisson Surface Reconstruction [10] method.
4. The triangle meshes are textured over by selecting appropriate patches from the acquired images, generating the georeferenced orthoimage.

The above process can be implemented with the OpenDroneMap (ODM) [1] toolkit. In Fig. 7 the generated orthophoto of an examined area is demonstrated. The outcome of the aforementioned stitching process constitutes a comprehensive map-quality representation of the field and can be further processed as a whole to extract qualitative features, such as an estimation of the vegetation health, and provide the end-user an overview of the crop status.

Fig. 7 Extracted orthomosaic map of an examined area
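For reference, the whole stitching pipeline can be driven programmatically; the sketch below assumes the stock opendronemap/odm Docker image and a dataset folder containing an images/ sub-directory, and is not the CoFly integration itself:

```python
import subprocess

def run_odm(dataset_dir: str) -> None:
    """Run the ODM pipeline on a folder of georeferenced images.

    Assumes the stock `opendronemap/odm` Docker image; `dataset_dir` must
    contain an `images/` sub-folder. Outputs include the orthophoto under
    `odm_orthophoto/`.
    """
    subprocess.run(
        ["docker", "run", "--rm",
         "-v", f"{dataset_dir}:/datasets/code",
         "opendronemap/odm", "--project-path", "/datasets"],
        check=True,
    )

run_odm("/data/cotton_field_2021")  # hypothetical dataset path
```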

3.2 Vegetation Indices (VIs)

To enhance the capacities of the proposed framework toward a cognitive precision agriculture system, the previously defined stitched map of the under-study field is further processed to estimate the vegetation's health. Considered as the main objective and motivation of the proposed system, crop health estimation requires the utilization of suitable metrics. These estimates, widely known as Vegetation Indices (VIs), are strictly dependent on the available sensor that is mounted on the deployed UAVs. Relevant sensors operate under either the visual or a wider light spectrum, which eventually is used to identify defects and problematic crop areas.

The use of a multi-spectral camera would provide accurate and more meaningful data for estimating plant health via the calculation of some pertinent VIs. Despite their wide exploitation and their increased performance in precision agriculture applications, deploying a multi-spectral sensor significantly increases the total cost, while technical restrictions might be introduced, such as a considerably lower acquisition rate. On the other hand, employing off-the-shelf UAVs utilizing RGB cameras constitutes a low-cost and easy-to-deploy solution. To this end, VIs focused on the visual spectrum were incorporated, as the usage of a visual camera is more flexible and affordable. Therefore, multiple VIs were adopted and developed to cover the majority of aspects that are related to crop health. The employed RGB-based VIs are presented in Table 3.

Each one of the four selected VIs represents the actual reflectance of the field's vegetation in different color bands and thus can reflect different measures of crop health. More specifically, the Visible Atmospheric Resistant Index (VARI) [25], originally designed for satellite imagery, considers the blue light scattering, as the blue spectrum travels in smaller wavelengths. On the contrary, the Green Leaf Index (GLI) [33] was proposed to measure wheat cover, and all main color bands are considered in a weighted approach. Similarly, the Normalized Green-Red Difference Index (NGRDI) [27] is obtained by calculating the reflectance of the green and the red zones of the electromagnetic spectrum. A similar approach is adopted for the Normalized Green-Blue Difference Index (NGBDI) [52]; nonetheless, the estimation relies on the reflectance of the green and the blue zones. These VIs were selected as an established choice capable of reflecting and representing the under-study agriculture-related issues.

The estimated VI maps are defined within a limited range and displayed to the operator by utilizing a red-green colormap with the appropriate scale, as demonstrated in Figs. 8(a)-(d). Since each calculated VI reflects different aspects of the visual light that is incorporated, the corresponding visualization for every VI is unique, as different aspects of plant health are assumed.

Table 3 Vegetation indices (VIs)

Vegetation index   Name                                     Formula                          References
VARI               Visible atmospheric resistant index      (G − R) / (G + R − B)            [25]
GLI                Green leaf index                         (2×G − R − B) / (2×G + R + B)    [33]
NGRDI              Normalized green red difference index    (G − R) / (G + R)                [27]
NGBDI              Normalized green blue difference index   (G − B) / (G + B)                [52]
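The four indices of Table 3 are simple per-pixel band arithmetic; a minimal NumPy sketch over an RGB orthomosaic patch could look as follows (band scaling and masking of no-data pixels are omitted):

```python
import numpy as np

def rgb_vis(img: np.ndarray) -> dict:
    """Compute the four RGB-based VIs of Table 3 for an HxWx3 float image."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    with np.errstate(divide="ignore", invalid="ignore"):
        return {
            "VARI":  (g - r) / (g + r - b),
            "GLI":   (2 * g - r - b) / (2 * g + r + b),
            "NGRDI": (g - r) / (g + r),
            "NGBDI": (g - b) / (g + b),
        }

# Toy 2x2 "orthomosaic" patch with reflectance values in [0, 1]
patch = np.array([[[0.2, 0.5, 0.1], [0.4, 0.3, 0.2]],
                  [[0.1, 0.6, 0.1], [0.3, 0.3, 0.3]]])
print({k: np.round(v, 2) for k, v in rgb_vis(patch).items()})
```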

Fig. 8 Image representations of the four calculated VIs. Lower index values are colored with red tones while higher index values correspond to green tones

3.3 Location Extraction of Problematic Areas

The extracted VI image representations can provide the end-user with a concrete overview of the crop's health. Yet, to automate the procedure, individual field regions that present poor conditions, in terms of vegetation health, have to be defined. This module aims to detect these problematic areas and feed their locations to the site-specific mission module for a targeted, on-the-spot inspection.

The significance of VIs lies not only in their ability to visualize the health status of the crop but also in the quantification of this status by assigning a single value to each pixel of the processed image, usually in the range [−1, 1]. Based on that, a pixel-wise processing pipeline has been developed to detect those regions where the corresponding index value is low and thus can be considered problematic. The deployed module consists of the following processing steps:

1. Outlier detection
2. Binary thresholding
3. Connected components detection
4. Centers of mass calculation
5. Centers aggregation

Although the typical range of the presented VIs is [−1, 1], in real-world cases the actual range can differ significantly, and values close to the typical extrema can be considered outliers. Towards this direction, an initial filtering of the index array outliers takes place. At first, the histogram of the index array is calculated. Assuming that k_1 is the first histogram bin, corresponding to the interval [−1, −0.9), in case the frequency of this bin is smaller than 5% of N, where N is the number of pixels corresponding to the entire field region, then the values belonging to the specific bin range are considered outliers and −0.9 becomes the actual minimum (notated as min) of the processed index array. The process is repeated for the next bin until the outlier criterion is no longer satisfied. A similar approach is applied to estimate the maximum value (max).

Next, binary thresholding is applied to separate the problematic areas from the healthy ones. As problematic areas are considered those with pixel values in the range [min, t], where t is the binary threshold equal to t = min + 0.25(max − min). Although t is set by default to one quarter of the VI's actual range above min, one could fine-tune this value manually to achieve better, crop-oriented performance. In Fig. 9(a), the corresponding binary image acquired by thresholding the VARI index of Fig. 8(a) is depicted. Afterward, a connected components analysis is employed to detect connected components on the binary image, each of which corresponds to a single problematic area. For every detected area, the center of mass is calculated, leading to a set of points, as presented in Fig. 9(b).

These center points are considered the destination points where the site-specific mission has to take place, to provide a more thorough inspection regarding the low health conditions of the specific area. Yet, a large number of waypoints is usually prohibitive, since the flight time of a conventional drone is not adequate to operate over all these individual areas. Furthermore, there are cases where neighboring problematic areas can be considered parts of a wider area that can be examined with a single pass of the UAV asset. To overcome this issue, the k-Means clustering algorithm is employed, aiming to group the center points of smaller individual problematic regions and lead to a more manageable number of destination points. Focusing on an autonomous operation, the number of clusters k is selected automatically in the range [2, 10] based on the maximization of the Calinski-Harabasz score [13]. Furthermore, to acquire more meaningful results, a weighted clustering approach is selected, where the cluster centroids are calculated with respect to the size of each area. Figure 9(c) illustrates the outcome of the k-Means clustering. The acquired cluster centers are forwarded to the site-specific missions module as the points of interest where a further on-spot examination will be conducted. Moreover, the ability to fine-tune or exclude destination points is provided to the user, to create a more tailored site-specific mission that meets their requirements.

As mentioned in Section 3.2, the employment of RGB-based indices aims to compose a low-cost and flexible framework. However, note that the presented approach for the identification of the problematic areas can be applied in the case of multispectral indices without any modifications and thus, it can be considered a "plug-and-play" solution.
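A condensed sketch of steps 2-5 of this pipeline, using SciPy for the connected components and scikit-learn for the weighted k-Means with Calinski-Harabasz model selection, is given below; the histogram-based outlier trimming of step 1 is assumed to have already produced the min/max values:

```python
import numpy as np
from scipy import ndimage
from sklearn.cluster import KMeans
from sklearn.metrics import calinski_harabasz_score

def problem_centers(vi: np.ndarray, k_range=range(2, 11)) -> np.ndarray:
    """Steps 2-5: threshold, label components, centers of mass, weighted k-Means."""
    vmin, vmax = vi.min(), vi.max()           # after the histogram-based outlier trim
    t = vmin + 0.25 * (vmax - vmin)           # binary threshold of Section 3.3
    binary = vi <= t                          # problematic pixels
    labels, n = ndimage.label(binary)         # connected components
    centers = np.array(ndimage.center_of_mass(binary, labels, range(1, n + 1)))
    sizes = ndimage.sum(binary, labels, range(1, n + 1))  # area sizes as weights

    best, best_score = None, -np.inf
    for k in k_range:                         # pick k by Calinski-Harabasz score
        if k >= len(centers):
            break
        km = KMeans(n_clusters=k, n_init=10).fit(centers, sample_weight=sizes)
        score = calinski_harabasz_score(centers, km.labels_)
        if score > best_score:
            best, best_score = km.cluster_centers_, score
    return best if best is not None else centers
```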

Fig. 9 Demonstration of the processing pipeline for the identification of problematic areas. Initially, the index array is (a) binary thresholded, then (b) connected component analysis is conducted to detect the center of mass of every problematic region, and finally (c) k-Means clustering is employed to estimate the centers of wider problematic areas

4 Site-Specific Mission

Towards assessing the aforementioned problematic locations, a site-specific mission focused on the detected points of interest is designed. The main objective of this mission is to provide a more thorough inspection of these individual spots of the examined area that seem to be problematic in terms of plant health, and to provide valuable information to the end-user. To efficiently monitor these individual spots, visual footage from lower altitudes and different camera angles is captured, aiming to visually detect sources of crop health deterioration, such as weed plants, and to enable site-specific treatment.

4.1 Hot-Point Mission

From the motion planning perspective, visiting each of the individual points of interest can be reduced to the problem of finding the shortest path that visits all nodes (Fig. 9(c)). For ease of understanding, let us assume that the crop health indicators (VIs) of Section 3 extract k points of interest {1, 2, . . . , k}, each of which is expressed as χ_i = {Lat, Lon}, where i ∈ {1, 2, . . . , k}. Overall, the set of points of interest is defined by the following equation:

χ := [χ_1, χ_2, . . . , χ_k]   (11)

At this point, the path planning problem is formulated as the optimal round trip that guarantees the shortest possible route while passing through all points, only once. For individual coverage, we applied the Travelling Salesman Problem (TSP) algorithm [7], which is renowned for finding the shortest path within an undirected graph. The pseudo-code for monitoring each χ_i is given in Algorithm 2³.

³ https://github.com/CoFly-Project/Waypoint-Trajectory-Planning

Given the takeoff/landing point and the overall set χ, in lines 1-6, we treat the defined points as an undirected stationary graph, representing each point within the examined area as a vertex. The distance between each pair of vertices is defined by the weight of their corresponding edge. In line 7, by solving the TSP, an optimal path that visits every vertex is defined. In lines 9-10, every point-vertex of the resulting optimal path is labeled as a Hot-Point. As a next step (lines 11-15), we generate a set of m additional points with an angle ϑ ∈ [0°, 360°] and a constant radius r around a specified Hot-Point. Note that the number of points m that enclose a Hot-Point depends on an angle ϕ (line 13) which determines the angular distance between any two consecutive points on the perimeter of the circle. The smaller the value of ϕ, the greater the number of points on the perimeter of the circle. Figure 10(a) illustrates an indicative example of a circular flight that encircles a Hot-Point.

Based on Algorithm 2, the UAV performs a fixed-radius circle around each central point of interest (Hot-Point) as it follows the resulting path of the TSP algorithm, as sketched in Fig. 10(b). It should be highlighted that the resulting path starts and ends at a specific vertex (line 1), which denotes the takeoff/landing waypoint, after having visited all the other vertices of the graph exactly once. Moreover, if needed, during the execution of the mission, the user can also use the physical remote control to modify the speed, heading, and altitude of the aircraft.
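The two geometric ingredients of Algorithm 2 — the circular ring of waypoints around each Hot-Point and the TSP visiting order — can be sketched as follows; the meters-to-degrees conversion is a rough small-area approximation, and networkx's TSP heuristic stands in for the paper's solver:

```python
import math
import networkx as nx
from networkx.algorithms import approximation as approx

def hot_point_ring(lat, lon, radius_m=5.0, phi_deg=45.0):
    """Waypoints on a circle of `radius_m` around a Hot-Point, every `phi_deg`."""
    deg_lat = radius_m / 111_320.0                       # rough meters-per-degree
    deg_lon = deg_lat / math.cos(math.radians(lat))
    return [(lat + deg_lat * math.sin(math.radians(a)),
             lon + deg_lon * math.cos(math.radians(a)))
            for a in range(0, 360, int(phi_deg))]

def visit_order(points):
    """Approximate TSP tour over the Hot-Points (complete graph, euclidean weights)."""
    g = nx.complete_graph(len(points))
    for i, j in g.edges:
        g[i][j]["weight"] = math.dist(points[i], points[j])
    return approx.traveling_salesman_problem(g, cycle=True)

hot_points = [(39.620, 22.410), (39.622, 22.413), (39.618, 22.415)]
order = visit_order(hot_points)[:-1]   # drop the repeated start of the closed tour
mission = [wp for idx in order for wp in hot_point_ring(*hot_points[idx])]
```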
of Algorithm 2 in terms of covering time regarding each set

Algorithm 2 Hot-point mission.

4.1.1 A Hot-Point Approximation Study on Flight Time for UAVs

This study aims to estimate the UAV's mission time and analyze its association with the number of points of interest. For the evaluation study, 10 χ sets are used. Every set includes a different number of distinct points of interest. For each set, 100 simulation experiments, with a randomly shaped distribution of points of interest, were performed within an area of 10.77 acres (Fig. 10(b)). All sets were executed with a fixed radius r of 5 m and an angle ϕ = 45°. The simulations were carried out with a fixed altitude and speed of 10 m and 3 m/s, respectively. The overall performance of Algorithm 2 in terms of covering time for each set of χ is presented in Fig. 11.

According to Fig. 11, each of the sets χ indicates that the estimated coverage time varies and depends on the geographical locations (latitude, longitude) of the points of interest within the field. However, the key outcome of this study is that the developed algorithm, for each of the 100 simulations, achieves a persistent coverage of 3-30 points of interest within an estimated coverage time close to the average flight time of most common commercial UAVs, i.e. 25-30 min. Such a feature is of paramount importance, as it supports trajectories covering a large number of individual spots while concurrently respecting the average energy efficiency of a conventional UAV. Note that the Estimated Flight Time on the y-axis is achievable only if the user does not intervene during the execution of the Hot-Point mission. However, when users use the remote controller to adjust the flight status of the UAV, the covering time increases or decreases proportionally to the duration of each action. As a result, the Hot-Point mission, as a feasible approach for site-specific inspection, can be applied to precision agriculture practices for automated visual footage, focusing on low-cost smart farming by utilizing common commercial UAVs.

Fig. 10 An optimal circular path around points of interest

Fig. 11 Hot-Point experimental data. A box plot visualization of the data distribution for a given set χ concerning the estimated flight time to serve a Hot-Point mission

4.2 High Level Semantic Information Extraction

During the Hot-Point mission, a closer inspection of the problematic field areas by the end-user is enabled through the captured visual footage. Yet, artificial intelligence can be employed to automate this process and compose the basis of a decision support system. Towards this direction, a weed detection module is deployed to analyze the visual data collected during the Hot-Point mission and provide information regarding the detected weeds to the end-user.

Accurate weed detection is a crucial, yet challenging, task of precision agriculture, required to maintain the health of the overall crop and minimize the damage through selective treatment. Convolutional Neural Networks (CNNs) have reported remarkable results in a wide variety of computer vision tasks, especially in the fields of object detection [46] and semantic segmentation [42]. The proposed framework employs these techniques for the task of weed detection. Specifically, the module is based on DeepLabv3+ [14], a robust deep-learning model for image semantic segmentation with state-of-the-art results on the PASCAL VOC 2012 dataset [18]. Contrary to more naive image-classification approaches, where the input image as a whole is classified as containing weeds or not, the semantic segmentation approach enables the detection of multiple instances and types of weeds by producing a pixel-wise annotation of the input and thus provides accurate information regarding the location of the detected weeds. The developed module is capable of semantically segmenting weed instances depicted in input RGB images captured during the site-specific mission and thus provides critical high-level information regarding the location of detected weeds to the end-user, in order to schedule counteractions on the spot.
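As a hedged illustration of this pixel-wise inference step, the sketch below runs a two-class (background/weed) segmentation network on a single 1280 × 720 frame. It uses torchvision's DeepLabv3 with a ResNet-50 backbone as a stand-in for the paper's DeepLabv3+ module; the backbone choice, checkpoint path, and file names are assumptions for illustration, not the actual CoFly implementation:

import torch
import numpy as np
from PIL import Image
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50

NUM_CLASSES = 2  # background / weed, as in the CoFly-WeedDB annotation

model = deeplabv3_resnet50(weights=None, weights_backbone=None,
                           num_classes=NUM_CLASSES)
# Hypothetical fine-tuned checkpoint produced by the training described below.
model.load_state_dict(torch.load("weed_segmentation.pth", map_location="cpu"))
model.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def segment(image_path):
    # Return a per-pixel mask (0 = background, 1 = weed) for one RGB frame.
    img = Image.open(image_path).convert("RGB")
    x = preprocess(img).unsqueeze(0)      # (1, 3, H, W)
    with torch.no_grad():
        logits = model(x)["out"]          # (1, 2, H, W)
    return logits.argmax(dim=1).squeeze(0).numpy()

mask = segment("hotpoint_frame_001.jpg")  # hypothetical file name
print(f"weed pixels: {100 * (mask == 1).mean():.2f}%")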

4.2.1 Dataset

It is important to note that weed detection is a quite challenging task due to the nature of the problem, since the captured weed instances present a wide variety in terms of form, size, and shape. Moreover, the development of a meaningful dataset that can enclose the majority of different occasions is not an easy task, since it requires an extreme amount of man-hours and the corresponding workforce, as well as the collection of data over a wide period of time. Yet, to train and evaluate the deployed model, a custom dataset was designed for the task of weed semantic segmentation, as an initial approach to creating the required challenging dataset. Towards this direction, a set of 1280 × 720 RGB images was collected during real-life experiments over a cotton field. The acquired images depict crop rows of cotton plants where different types of weeds interfere among the planting lines. The annotation of the collected data was conducted by field experts utilizing the LabelMe annotation tool [50]. The annotation process led to a set of 201 labeled images, where depicted weed clusters are labeled as "weed" while the rest of the image is annotated as "background". The developed dataset [31] is publicly available4 to the research community. To further demonstrate the challenging nature of the specific task, Table 4 presents the number of pixels for each class, along with the number of depicted weed instances contained in the built dataset. As expected, the developed dataset is imbalanced due to the complex nature of the problem.

4 https://github.com/CoFly-Project/CoFly-WeedDB

Table 4 Number of pixels & instances per class contained in the custom-built dataset

Class         # Pixels ×10⁶    # Instances
Background    175              –
Weed          9                383

The developed dataset was split into a training (80%) and a testing (20%) set. The splitting process was repeated 3 times, aiming to create subsets with different class distributions, in order to conduct a more thorough evaluation of the employed detector.
4.2.2 Train & Evaluate

The deployed DeepLabv3+ instance was pre-trained on the PASCAL VOC 2012 dataset and further trained on the previously described custom dataset for the weed semantic segmentation task. The model was trained for 500 epochs with a batch size of 12 on a GeForce RTX 2060 Super GPU. As an optimization algorithm, the Adam solver with a learning rate equal to 10⁻³ was selected. To tackle the imbalance issue among the dataset classes, focal loss was employed. Moreover, data augmentation techniques were utilized to enhance the generalization ability of the model. Specifically, every input image of the training set was horizontally/vertically flipped and randomly resized on a scale between 0.5 and 1.5 times its initial size, each with a probability of 50%. Finally, a 320 × 320-pixel patch was randomly cropped from the original image before being fed to the model, to further augment the training data.
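To make the reported recipe concrete, the following sketch reproduces the described paired augmentation and a standard focal-loss formulation; the helper names and the specific focal-loss variant (γ = 2, no class weighting) are our assumptions rather than the exact training code:

import random
import torch
from torch import nn
from torchvision.transforms import InterpolationMode
import torchvision.transforms.functional as TF

def augment(img, mask):
    # Paired augmentation as described in the text: horizontal/vertical
    # flips and a random rescale in [0.5, 1.5], each with 50% probability,
    # followed by a random 320x320 crop.
    # img: float tensor (3, H, W); mask: long tensor (1, H, W).
    if random.random() < 0.5:
        img, mask = TF.hflip(img), TF.hflip(mask)
    if random.random() < 0.5:
        img, mask = TF.vflip(img), TF.vflip(mask)
    if random.random() < 0.5:
        s = random.uniform(0.5, 1.5)
        size = [int(img.shape[1] * s), int(img.shape[2] * s)]
        img = TF.resize(img, size)
        mask = TF.resize(mask, size, interpolation=InterpolationMode.NEAREST)
    top = random.randint(0, img.shape[1] - 320)
    left = random.randint(0, img.shape[2] - 320)
    return TF.crop(img, top, left, 320, 320), TF.crop(mask, top, left, 320, 320)

def focal_loss(logits, target, gamma=2.0):
    # Focal loss down-weights the abundant, easy background pixels.
    # logits: (B, 2, H, W); target: (B, H, W) with values in {0, 1}.
    ce = nn.functional.cross_entropy(logits, target, reduction="none")
    pt = torch.exp(-ce)                  # probability of the true class
    return ((1.0 - pt) ** gamma * ce).mean()

# Training setup reported in the text (batch size 12, Adam, lr = 1e-3):
# optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# for epoch in range(500): ...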

The developed weed detector is trained and evaluated on the three different dataset splits, while its efficiency is measured in terms of Intersection-over-Union (IoU), a standard accuracy measure for semantic segmentation tasks. In Table 5, the IoU for each class, along with the overall mean IoU (mIoU), is reported.

Table 5 Model accuracy in terms of IoU (%)

           Background    Weed     mIoU
Split 1    95.44         44.93    70.18
Split 2    95.83         43.49    69.66
Split 3    96.62         43.96    70.29

Results imply that the deployed model is capable of correctly classifying the samples of the background class with significantly high accuracy for all splits. Regarding the weed class, the accuracy is lower, mostly due to the reduced number of samples in the dataset. Yet, one can consider the reported model performance satisfying, taking into consideration that for the weed detection task it is adequate to detect a wide region where weed clusters are present, compared to other applications of semantic segmentation, such as autonomous driving, where a crisp prediction of the detected object shape is required.
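For reference, the per-class IoU and mIoU reported in Table 5 follow the standard definition, which can be computed as in this minimal NumPy sketch:

import numpy as np

def iou_per_class(pred, gt, num_classes=2):
    # Intersection-over-Union for each class, plus the mean IoU.
    # pred/gt: integer label maps of identical shape.
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        ious.append(inter / union if union else float("nan"))
    return ious, np.nanmean(ious)

# e.g. (background_iou, weed_iou), miou = iou_per_class(mask_pred, mask_gt)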
5 System Integration

5.1 Graphical User Interface

All of the aforementioned services are packed inside an end-to-end precision agriculture platform called CoFly. The CoFly software is designed with the user workflow in mind, providing an interactive environment for precision agriculture applications. The primary goal is to make scanning, imaging, and parametric analysis of crops easy and accessible via the user interface5 (UI) (Fig. 12). These functionalities are adequate for most simple missions that require surveillance of a new crop, analysis of photometric indices, and detection of intrusive plant species. To finalize the CoFly software as an integrated end-to-end platform, the connection between the autonomous navigation system, the remote sensing control system, and the graphical user interface is essential to ensure stability in such a large-scale application. The communication of data among the different modules is implemented by a Restful API service [40], which allows the different sub-systems to connect and interact with each other by retrieving the necessary data for their execution. Data transmission between sub-systems is done via .json files, which contain all the necessary information needed to run any operation of the overall system. Figure 13 illustrates the overall CoFly workflow along with the communications among its modules. It must be emphasized that each and every sub-system of Fig. 13 is integrated with the CoFly UI. A video demonstration illustrating the sequence of events among all the aforementioned services is available at the following link6.

5 https://github.com/CoFly-Project/cofly-gui
6 https://www.youtube.com/watch?v=C0hdCu-ZRQk&ab_channel=CoFlyProject
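As a schematic sketch of this pattern, the snippet below shows how a sub-system could retrieve and submit mission data as .json over HTTP; the base URL, endpoint paths, and field names are hypothetical and do not reflect the platform's actual schema:

import requests

BASE = "http://localhost:8000/api"  # hypothetical backend address

def fetch_mission(mission_id):
    # Retrieve the .json description of a mission from the backend.
    resp = requests.get(f"{BASE}/missions/{mission_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()

def push_results(mission_id, results):
    # Post a module's output (e.g. detected points of interest) back.
    resp = requests.post(f"{BASE}/missions/{mission_id}/results",
                         json=results, timeout=10)
    resp.raise_for_status()

# e.g. a knowledge-extraction module could run:
# mission = fetch_mission("midsummer-01")
# push_results("midsummer-01", {"points_of_interest": [[39.63, 22.41]]})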
5.2 CoFly-to-UAV Bidirectional Communication System

For attaining a drone mapping project defined by the GUI, the coverage and inspection tasks are executed by forwarding the objectives of the corresponding mission to the UAV. As an intermediate communication layer between the CoFly software and the UAV, a smartphone that runs a mobile application is connected to the UAV's onboard controller. Towards this direction, in this paper, to accommodate a common commercial UAV, a custom Android app adaptor has been developed with the use of the DJI API, acting as a data transceiver between Algorithms 1 & 2 and the UAV (as shown in Fig. 14). This application is responsible for forwarding the path to the UAV, as well as gathering information (telemetry, photographs, statuses, and mission logs) and transmitting it to the backend of the platform.

Note that the aforementioned Android application has been set up to provide i) a high level of autonomy for the missions and ii) a dynamic visualization of the flight data.
Fig. 12 CoFly GUI, showing the available flight settings menu bar on the left, and the map with the polygon drawn by the user

Fig. 13 An overview of the CoFly system workflow illustrating the information exchanged among its modules

Fig. 14 Custom-developed Android application adaptor
Based on this logic, the interaction between the user and the CoFly software can be done entirely from the GUI, with no need to directly control the UAV with the controller or the smartphone application. At the same time, as the mobile application also works as an on-the-field pilot by allowing the user to monitor the mission live, the user can take control and directly operate the UAV, if needed. Note that, according to the type of the UAV and its functional requirements, a different plugin as an intermediate communication layer might be required (see Adaptor in Fig. 13).
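The adaptor itself is an Android application built on the DJI API; purely to make its relay role concrete, the following sketch mimics a telemetry-forwarding loop in Python, with the backend address, endpoint, and state fields as hypothetical stand-ins for what the mobile application actually transmits:

import time
import requests

BACKEND = "http://192.168.8.1:8000/api"  # hypothetical backend behind the 4G router

def relay_telemetry(read_uav_state, mission_id, period_s=1.0):
    # Forward the UAV state to the CoFly backend once per period.
    # `read_uav_state` abstracts whatever the vehicle SDK exposes
    # (position, battery, mission status); in CoFly this role is
    # played by the Android adaptor built on the DJI API.
    while True:
        state = read_uav_state()  # e.g. {"lat": 39.63, "lon": 22.41, "battery": 87}
        requests.post(f"{BACKEND}/missions/{mission_id}/telemetry",
                      json=state, timeout=5)
        time.sleep(period_s)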
6 Real-life Experiments

This section presents two real-life experiments, where the proposed CoFly application was used to monitor, acquire data, and assess the vegetation health of real agricultural fields. The experiments were conducted in the fertile land of the Thessalian plain, and more specifically in a rural area planted with cotton located in the city of Larissa, Greece. The functionalities and technologies of the CoFly software, as well as the produced results, were evaluated by the experienced agronomists of Bios Agrosystems S.A.7, who specialize in crop production, soil control, and soil management. This section aims to evaluate the robustness and efficiency of the CoFly software in real-world precision agriculture operations.

7 http://www.bios-agrosystems.gr/en-US/

6.1 Experimental Setup

For the real-life experiments, a commercial UAV (DJI Phantom 4 Pro) equipped with an RGB camera was used for image acquisition. The UAV flight planning for monitoring and inspection was done by the developed Algorithms 1 and 2, respectively. For mission control, an Android smartphone device (Xiaomi Mi Max 2) was used to run the Android app adaptor, described in Section 5.2. Finally, a portable 4G router (tp-link M7350) was used as an intermediate communication layer between the smartphone and the CoFly software. To evaluate the efficiency of the proposed framework, two scenarios were considered. The first scenario was performed in July 2019 during the first stages of plant growth, while the second scenario was performed in September 2019 during the ripening period, when the cotton fibers had appeared. The main objectives of these two experiments were to (i) identify the optimum flight envelope to acquire the best data and provide an orthomosaic as an accurate map of the field, (ii) extract plant health information (VIs) that could give indications on the year's harvest and cotton's growth stage (timeline view), and (iii) detect the existence of weeds in a specific site of crops. In the next subsections, each experiment is analyzed and distinguished according to the flight date and cotton's developmental phase.

6.2 Mid-Summer

As to the first scenario, the cotton field during its early stage, along with the calculated path, is depicted in Fig. 15. To demonstrate the practical enforcement of the proposed navigation algorithm (Algorithm 1) in terms of avoiding NFZs/obstacles, we considered an area within the original field as restricted, meaning that the UAV is not allowed to fly above it. For the coverage mission (see Table 6), the selected flight altitude and scanning density resulted in a spatial level resolution of 90%. In addition, the operational speed was selected to be 3 m/s, which is low enough to maintain the quality of the collected images, and the gimbal pitch was selected to be -90° (meaning that the camera is always in "top-view" mode), an angle that calibrates images for 2D map outputs.
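The exact .json schema exchanged between the GUI and the navigation modules is not reproduced here; a payload of the following flavor, populated with the mid-summer coverage values of Table 6 and using hypothetical field names, illustrates what such a mission request could look like:

import json

# Hypothetical coverage-mission payload using the mid-summer values of
# Table 6; the field names are illustrative, not the platform's schema.
mission_request = {
    "mission": "coverage",
    "crop_season": "mid-summer",
    "flight_altitude_m": 30,
    "scanning_density_m": 18,
    "flight_direction_deg": 30,
    "speed_ms": 3,
    "corner_radius_m": 0.5,
    "gimbal_pitch_deg": -90,
}
print(json.dumps(mission_request, indent=2))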
Fig. 15 Calculated coverage path for mid-summer mission

Table 6 Mission details corresponding to Algorithms 1 & 2 for the monitoring and inspection tasks

                             Coverage                                                                                  Inspection
Crop season     Flight altitude (m)   Scanning density (m)   Flight direction (°)   Speed (m/s)   Corner radius (m)   Gimbal pitch (°)   r (m)   ϕ (°)
Mid-summer      30                    18                     30                     3             0.5                 -90                5       45
Early-autumn    50                    15                     30                     3             0.5                 -90                –       –

Fig. 16 Results of the knowledge extraction module from the experimental evaluation conducted on mid-summer
Fig. 17 Designed (a) site-specific mission and the corresponding (b) weed detection module output over the acquired visual footage. In (a) the take-off/landing area is notated with the black marker, while in (b) the detected weeds are highlighted with light red color

Figure 15 graphically illustrates the resulting path as calculated by Algorithm 1. A direct outcome is that the path-planning algorithm managed to efficiently alter the UAV path to meet the field characteristics. More specifically, the path (i) completely covers the area of interest, (ii) does not contain bootstrapping (path crosses), and (iii) carefully avoids the NFZs.

The images acquired during the conducted mission are forwarded to the knowledge extraction module for further processing. Initially, the orthomosaic is calculated and presented in Fig. 16(a). It can be noted that the collected data are sufficient to create a high-resolution map of the examined agricultural area, implying the robust operation of the developed path planning module. In Fig. 16(b)-(e), the image representations of the calculated VIs are presented. The calculated VIs can provide a crisp overview of the crop's health, indicating areas with relatively poor vegetation conditions. Finally, in Fig. 16(f), the result of the points of interest detection module is illustrated for the VARI index. One can see that the implemented module can efficiently detect field regions that are problematic in terms of vegetation health, where the site-specific mission should be deployed.
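Among the calculated indices, VARI is derived directly from the RGB bands as VARI = (G − R)/(G + R − B). The sketch below computes such an index map and flags the lowest-valued pixels as candidate problem regions; the quantile threshold is an illustrative choice of ours, not the module's actual detection criterion:

import numpy as np

def vari(rgb):
    # Visible Atmospherically Resistant Index per pixel for a float
    # RGB orthomosaic of shape (H, W, 3): VARI = (G - R) / (G + R - B).
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (g - r) / (g + r - b + 1e-8)  # epsilon guards empty denominators

def problematic_mask(vari_map, q=0.10):
    # Flag the lowest-VARI pixels as candidate problem regions.
    return vari_map < np.quantile(vari_map, q)

# Centroids of connected low-VARI blobs can then be georeferenced and
# forwarded to Algorithm 2 as the set of points of interest to inspect.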

Given the geolocation of the depicted points of interest, Algorithm 2 was deployed to provide a site-specific inspection, as illustrated in Fig. 17(a). Additional information corresponding to the Hot-Point mission for the inspection task can be found in Table 6. In Fig. 17(b), qualitative results
from the site-specific mission in the aforementioned areas are presented. It is noted that the deployed weed detector can sufficiently detect regions where weed instances are depicted. The advantage of a semantic segmentation approach is also illustrated in the results, since accurate spatial information regarding the existence of weeds in the field can be provided to the end-user through the selected approach.

6.3 Early-Autumn

Fig. 18 Calculated coverage path for early-autumn mission

Fig. 19 Results of the knowledge extraction module from the experimental evaluation conducted on early autumn
As to the second scenario, in early autumn, the cotton field during its boll and fiber maturation, along with the calculated path, is depicted in Fig. 18. Note that the selected navigation settings for the coverage task were adjusted to achieve a spatial level resolution of 90%, while the UAV's altitude was set to 50 m (see Table 6).

In Fig. 19(a), the extracted orthomosaic of the examined area is presented. Similarly to the previous case, the designed path enables the collection of sufficient data to provide a detailed overview of the field. Figure 19(b)-(e) illustrate the calculated VIs, pointing out regions where the vegetation conditions are poor. Finally, in Fig. 19(f), the key outcome of the problematic areas detection module is demonstrated for the VARI index. It should be noted that this experiment was conducted in early autumn, a period in which cotton is harvested and the cotton plants are fully developed. Thus, any interfering weeds are covered by the foliage of the cotton plants and cannot be visually detected by inspection from above. Furthermore, the treatment of weeds can be considered an off-season activity during this time period, since it is more effective at the early stage of the crop's life cycle, when establishing the dominance of crop plants over the weeds is crucial. Towards this direction, the site-specific mission for weed detection was not executed in this case.

7 Conclusions

UAV systems innovation is valuable in agriculture. Following leading-edge technologies, in this paper an end-to-end, low-cost precision agriculture platform based on a fully functional autonomous UAV system has been proposed. In contrast to the majority of the already existing agriculture platforms, which use cloud-based approaches for processing agricultural data and establish pricing according to the amount of information processed, the proposed platform has been designed to provide, offline and in real time at the field's edge, stand counts and assessments. Towards that purpose, coverage and inspection missions for any type of UAV are developed to provide plant health maps, as well as weed measurement and detection through an integrated deep learning module.
All of these capabilities are packed inside an end-to-end platform functioning on mobile devices, including high-level user commands and proper data representations that ease the utilization of UAVs in precision agriculture applications. The functionality of the proposed system was tested and validated with extensive experimentation in agricultural fields with cotton in Larissa, Greece during two different crop sessions: i) in mid-summer and ii) in early autumn. In both experiments, CoFly preserved all the desired features of common agricultural software, while at the same time it achieved fully autonomous inspection tasks, empowering UAV robots also as a tool for site-specific precision farming. This feature is of paramount importance in agribusiness, as it can be utilized by the farming community for information procurement, examination, and consistent observation of fields, in order to acquire contemporary field management skills. With a low budget, the CoFly software provides beneficial and highly accurate insights into the critical segments of the farming industry.

As future directions, we are interested in evolving this research by integrating multiple UAVs for monitoring and inspection tasks. In such an approach, the proposed agricultural software should be enriched with a multi-robot decision-making scheme to autonomously navigate each robot to perform individual coverage subtasks according to the overall set of the user's objectives. This work may also examine more complicated scenarios where UAVs perform pesticide spraying treatments, resulting in a robust system tailored to a variety of agricultural applications.

Acknowledgments We gratefully acknowledge the support of NVIDIA Corporation with the donation of GPUs used for this research.

Author Contributions E. K. Raptis contributed to the material preparation, UAV data collection, analysis, conceptualization, writing - original draft, visualization, and software. M. Krestenitis contributed to the material preparation, conceptualization, data curation, writing - review & editing, visualization, and software. Graphical user interface and integration were performed by K. Egglezos and O. Kypris. L. Doitsidis and A. Ch. Kapoutsis did the supervision, project administration, writing - review & editing. K. Ioannidis, E. B. Kosmatopoulos, S. Vrochidis, and I. Kompatsiaris performed the supervision, resources, and funding acquisition. All authors read and approved the final manuscript.

Funding Open access funding provided by HEAL-Link Greece. This research has been financed by the European Regional Development Fund of the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH - CREATE - INNOVATE (T1EDK-00636).

Data Availability The overall precision agriculture platform along with the corresponding modules and the dataset generated during the real-life experiments are open-source and available in the https://github.com/CoFly-Project repository.

Declarations

Conflict of Interests The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

References

1. OpenDroneMap. https://github.com/OpenDroneMap/ODM
2. OpenSfM. https://github.com/mapillary/OpenSfM
3. Agisoft: Agisoft Metashape. https://www.agisoft.com (2020) [Online; accessed 22 October 2020]
4. Aiello, G., Valavanis, K.P., Rizzo, A.: Fixed-wing UAV energy efficient 3D path planning in cluttered environments. J. Intell. Robot. Syst. 105(3), 1–13 (2022)
5. Analytics, S.F.: Agriculture Mapping Software. https://sentera.com/ (2020) [Online; accessed 22 October 2020]
6. Apostolidis, S.D., Kapoutsis, P.C., Kapoutsis, A.C., Kosmatopoulos, E.B.: Cooperative multi-UAV coverage mission planning platform for remote sensing applications. Auton. Robot. 46(2), 373–400 (2022)
7. Applegate, D.L., Bixby, R.E., Chvatal, V., Cook, W.J.: The Traveling Salesman Problem: A Computational Study. Princeton University Press (2006)
8. ArduPilot: ArduPilot Mission Planner. http://ardupilot.org/planner (2020) [Online; accessed 22 October 2020]
9. Basiri, A., Mariani, V., Silano, G., Aatif, M., Iannelli, L., Glielmo, L.: A survey on the application of path-planning algorithms for multi-rotor UAVs in precision agriculture. J. Navig. 75(2), 364–383 (2022)
10. Bolitho, M., Kazhdan, M., Burns, R., Hoppe, H.: Parallel Poisson surface reconstruction. In: Bebis, G., Boyle, R., Parvin, B., Koracin, D., Kuno, Y., Wang, J., Wang, J.X., Wang, J., Pajarola, R., Lindstrom, P., Hinkenjann, A., Encarnação, M.L., Silva, C.T., Coming, D. (eds.) Advances in Visual Computing, pp. 678–689. Springer, Berlin (2009)
11. Bombe, K.: Agriculture drone market worth $5.19 billion by 2025, growing at a CAGR of 31.1% from 2019 – global market opportunity analysis and industry forecasts by Meticulous Research. Meticulous Market Research Pvt. Ltd., GLOBE NEWSWIRE (2020) [Online; accessed 09 June 2020]
12. Botlink: Automated Drone Flight Software. https://botlink.com/ (2020) [Online; accessed 22 October 2020]
13. Caliński, T., Harabasz, J.: A dendrite method for cluster analysis. Commun. Stat. - Theory Methods 3(1), 1–27 (1974)
14. Chen, L.C., Zhu, Y., Papandreou, G., Schroff, F., Adam, H.: Encoder-decoder with atrous separable convolution for semantic image segmentation. In: Proceedings of the European Conference on Computer Vision (ECCV), pp. 801–818 (2018)
15. Christiansen, M.P., Laursen, M.S., Jørgensen, R.N., Skovsen, S., Gislum, R.: Designing and testing a UAV mapping system for agricultural field surveying. Sensors 17(12), 2703 (2017)
16. Deploy, D.: Drone Mapping Software. https://www.dronedeploy.com/ (2020) [Online; accessed 22 October 2020]
17. Ebeid, E., Skriver, M., Terkildsen, K.H., Jensen, K., Schultz, U.P.: A survey of open-source UAV flight controllers and flight simulators. Microprocess. Microsyst. 61(May), 11–20 (2018). https://doi.org/10.1016/j.micpro.2018.05.002
18. Everingham, M., Eslami, S.A., Van Gool, L., Williams, C.K., Winn, J., Zisserman, A.: The PASCAL visual object classes challenge: a retrospective. Int. J. Comput. Vis. 111(1), 98–136 (2015)
19. Faiçal, B.S., Freitas, H., Gomes, P.H., Mano, L.Y., Pessin, G., de Carvalho, A.C., Krishnamachari, B., Ueyama, J.: An adaptive approach for UAV-based pesticide spraying in dynamic environments. Comput. Electron. Agric. 138, 210–223 (2017)
20. Faiçal, B.S., Pessin, G., Geraldo Filho, P., Carvalho, A.C., Furquim, G., Ueyama, J.: Fine-tuning of UAV control rules for spraying pesticides on crop fields. In: 2014 IEEE 26th International Conference on Tools with Artificial Intelligence, pp. 527–533. IEEE (2014)
21. FAO: Building a common vision for sustainable food and agriculture: Principles and approaches (2014)
22. Furrer, F., Burri, M., Achtelik, M., Siegwart, R.: RotorS—A modular Gazebo MAV simulator framework. In: Robot Operating System (ROS), pp. 595–625. Springer (2016)
23. Gabriely, Y., Rimon, E.: Spanning-tree based coverage of continuous areas by a mobile robot. Ann. Math. Artif. Intell. 31(1-4), 77–98 (2001)
24. Galceran, E., Carreras, M.: A survey on coverage path planning for robotics. Rob. Auton. Syst. 61(12), 1258–1276 (2013)
25. Gitelson, A., Stark, R., Grits, U., Rundquist, D., Kaufman, Y., Derry, D.: Vegetation and soil lines in visible spectral space: a concept and technique for remote estimation of vegetation fraction. Int. J. Remote Sens. 23(13), 2537–2562 (2002)
26. Holden, N.M., White, E.P., Lange, M.C., Oldfield, T.L.: Review of the sustainability of food systems and transition using the internet of food. npj Sci. Food 2(1), 1–7 (2018)
27. Hunt, E.R., Cavigelli, M., Daughtry, C.S., Mcmurtrey, J.E., Walthall, C.L.: Evaluation of digital photography from model aircraft for remote sensing of crop biomass and nitrogen status. Precis. Agric. 6(4), 359–378 (2005)
28. Kapoutsis, A.C., Chatzichristofis, S.A., Kosmatopoulos, E.B.: DARP: divide areas algorithm for optimal multi-robot coverage path planning. J. Intell. Robot. Syst. 86(3-4), 663–680 (2017)
29. Karatzinis, G.D., Apostolidis, S.D., Kapoutsis, A.C., Panagiotopoulou, L., Boutalis, Y.S., Kosmatopoulos, E.B.: Towards an integrated low-cost agricultural monitoring system with unmanned aircraft system. In: 2020 International Conference on Unmanned Aircraft Systems (ICUAS), pp. 1131–1138. IEEE (2020)
30. Koutras, D.I., Kapoutsis, A.C., Kosmatopoulos, E.B.: Autonomous and cooperative design of the monitor positions for a team of UAVs to maximize the quantity and quality of detected objects. IEEE Robot. Autom. Lett. 5(3), 4986–4993 (2020)
31. Krestenitis, M., Raptis, E.K., Kapoutsis, A.C., Ioannidis, K., Kosmatopoulos, E.B., Vrochidis, S., Kompatsiaris, I.: CoFly-WeedDB: A UAV image dataset for weed detection and species identification. Data in Brief, 108575 (2022)
32. Kruskal, J.B.: On the shortest spanning subtree of a graph and the traveling salesman problem. Proc. Am. Math. Soc. 7(1), 48–50 (1956)
33. Louhaichi, M., Borman, M.M., Johnson, D.E.: Spatially located platform and aerial photography for documentation of grazing impacts on wheat. Geocarto Int. 16(1), 65–70 (2001)
34. Popović, M., Galceran, E., Khanna, R., Sa, I.: High-level waypoint-following for micro aerial vehicles (UAVs). https://github.com/ethz-asl/waypoint_navigator [Online; accessed 06 August 2018]
35. Pix4D: Professional photogrammetry and drone mapping software. https://www.pix4d.com/ (2020) [Online; accessed 22 October 2020]
36. Popovici, A., Popovici, D.: Cellular automata in image processing. In: Fifteenth International Symposium on Mathematical Theory of Networks and Systems, vol. 1, pp. 1–6. Citeseer (2002)
37. Primicerio, J., Di Gennaro, S.F., Fiorillo, E., Genesio, L., Lugato, E., Matese, A., Vaccari, F.P.: A flexible unmanned aerial vehicle for precision agriculture. Precis. Agric. 13(4), 517–523 (2012)
38. Quigley, M., Faust, J., Foote, T., Leibs, J.: ROS: an open-source robot operating system
39. Radoglou-Grammatikis, P., Sarigiannidis, P., Lagkas, T., Moscholios, I.: A compilation of UAV applications for precision agriculture. Comput. Netw. 172, 107148 (2020). https://doi.org/10.1016/j.comnet.2020.107148
40. Richardson, L., Amundsen, M., Ruby, S.: RESTful Web APIs. O'Reilly Media, Inc (2013)
41. Röös, E., Bajželj, B., Smith, P., Patel, M., Little, D., Garnett, T.: Greedy or needy? Land use and climate impacts of food in 2050 under different livestock futures. Glob. Environ. Change 47, 1–12 (2017)
42. Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., Chen, L.C.: MobileNetV2: Inverted residuals and linear bottlenecks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 4510–4520 (2018)
43. Singh, G., Yogi, K.K.: Internet of things-based devices/robots in agriculture 4.0. In: Sustainable Communication Networks and Application, pp. 87–102. Springer (2022)
44. OGP Surveying and Positioning Committee: EPSG geodetic parameter dataset. http://www.epsg.org/ (2005) [Online; accessed 12 August 2006]
45. Takaya, K., Asai, T., Kroumov, V., Smarandache, F.: Simulation environment for mobile robots testing using ROS and Gazebo. In: 2016 20th International Conference on System Theory, Control and Computing (ICSTCC), pp. 96–101. IEEE (2016)
46. Tan, M., Pang, R., Le, Q.V.: EfficientDet: Scalable and efficient object detection. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 10781–10790 (2020)
47. Technology, B.R.: Blue River Technology. https://www.bluerivertechnology.com/ (2020) [Online; accessed 22 October 2020]
48. Uto, K., Seki, H., Saito, G., Kosugi, Y.: Development of UAV-mounted miniature hyperspectral sensor system for agricultural monitoring. In: 2013 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 4415–4418. IEEE (2013)
49. Vasudevan, A., Kumar, D.A., Bhuvaneswari, N.: Precision farming using unmanned aerial and ground vehicles. In: 2016 IEEE Technological Innovations in ICT for Agriculture and Rural Development (TIAR), pp. 146–150. IEEE (2016)
50. Wada, K.: labelme: Image Polygonal Annotation with Python. https://github.com/wkentaro/labelme (2016) [Online; accessed 08 April 2018]
51. Wu, X.D., Guo, J.L., Han, M., Chen, G.: An overview of arable land use for the world economy: from source to sink via the global supply chain. Land Use Policy 76, 201–214 (2018)
52. Yang, D.: Gobi vegetation recognition based on low-altitude photogrammetry images of UAV. In: IOP Conference Series: Earth and Environmental Science, vol. 186, p. 012053. IOP Publishing (2018)
53. Zheng, H., Zhou, X., Cheng, T., Yao, X., Tian, Y., Cao, W., Zhu, Y.: Evaluation of a UAV-based hyperspectral frame camera for monitoring the leaf nitrogen concentration in rice. In: 2016 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 7350–7353. IEEE (2016)

Publisher's Note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Emmanuel K. Raptis is a Ph.D. student at the Department of Electrical and Computer Engineering (E.E.C.E) of the Democritus University of Thrace. In 2018, he received his Diploma in Electrical Engineering and, in 2020, he completed his M.Sc. in the Automatic Control Systems and Robotics Laboratory, both at Democritus University of Thrace. Since 2018, he has been working as a Research Associate at the Information Technologies Institute of the Centre for Research and Technology Hellas (CERTH). During these years, he has participated in several H2020 and National research projects. His research interests focus on the fields of Robotics, Automatic Control, Multi-Agent and Autonomous Systems, Artificial Intelligence, and Reinforcement Learning.

Marios Krestenitis received the Diploma degree in Electrical and Computer Engineering in 2015 and the M.Sc. in Informatics in 2019, both from the Aristotle University of Thessaloniki, Greece. Currently, he is a Research Assistant at the Information Technologies Institute (ITI) of the Centre for Research and Technology Hellas (CERTH). His research interests include machine learning, computer vision, digital signal processing, pattern recognition, and aerial imagery. He has been involved in various EU and National funded projects.

Konstantinos Egglezos is a Robotics & Automation Software engineer with more than 6 years of experience working alongside the executive team of IKH. Konstantinos specializes in developing software for autonomous systems & graphical interfaces. He holds a Bachelor of Science from the Department of Product and Systems Design Engineering of the University of the Aegean, and a Master of Science in Mechatronics, Robotics, and Automation Engineering from the National Technical University of Athens.

Orfeas Kypris received the B.Eng. degree in Electrical and Electronic Engineering and the M.Sc. degree in Magnetics from Cardiff University, Cardiff, U.K., in 2009 and 2010, respectively, and the Ph.D. degree in Electrical Engineering from the Department of Electrical and Computer Engineering, Iowa State University, USA, in 2015. From 2015 to 2017, he was a Post-Doctoral Researcher with the Department of Computer Science, University of Oxford, where he developed sensors for indoor localization and structural health monitoring using low-frequency magnetic fields. His research interests include nondestructive evaluation, applied electromagnetism, and robotics. He is currently a software engineer specializing in machine learning applications in the industry.

Dr. Konstantinos Ioannidis received the bachelor's and Ph.D. degrees from the Department of Electrical and Computer Engineering, Democritus University of Thrace, Greece, in 2006 and 2013, respectively. Currently, he is a Postdoctoral Research Fellow at the Centre for Research and Technology Hellas (CERTH), Information Technologies Institute (ITI). His research interests mainly include the areas of digital twins, path planning, and collective behaviour in swarm robotics, as well as various computer vision techniques focused on robotic applications such as object detection, SLAM, 3D reconstruction, activity recognition, aerial imaging, image fusion, and weed detection. Dr. Ioannidis has participated in more than 15 European and National research projects, as a deputy coordinator, scientific and technical manager, etc., and has been a member of various organization teams of conferences and workshops. He has authored more than 40 scientific journal, conference, workshop, and book chapter publications. He has served as a reviewer for several international journals and as a Technical Program Committee member in various peer-reviewed conferences/workshops.

Dr. Lefteris Doitsidis is an Assistant Professor at the School of Production Engineering & Management of the Technical University of Crete. Prior to his appointment, he was a faculty member (Associate & Assistant Professor) at the Department of Electronics, Hellenic Mediterranean University (former TEI of Crete). He was also a visiting scholar at the Department of Computer Science and Engineering, University of South Florida, FL, U.S.A. He is the author of more than 45 publications in international journals, conference proceedings, and book chapters. His research interests lie in the areas of multirobot teams, design of novel control systems for robotic applications, autonomous operation and navigation of unmanned robotic vehicles, and computational intelligence. He has been involved as a senior researcher in numerous research projects funded by European and National funds.

Athanasios Ch. Kapoutsis received the Diploma and Ph.D. degrees from the Department of Electrical and Computer Engineering, Democritus University of Thrace (DUTH), Xanthi, Greece, in 2012 and 2017, respectively. He is currently a postdoctoral researcher with the Information Technologies Institute (CERTH), Thessaloniki, Greece. During the past years, he has been involved in several EU FP7, H2020, and National funded Research and Development projects. His research is mainly focused on robotics, reinforcement learning, multi-agent systems, machine intelligence, adaptive control, and pattern recognition.

Dr. Stefanos Vrochidis received the Diploma degree in Electrical Engineering from Aristotle University of Thessaloniki, Greece, the M.Sc. degree in Radio Frequency Communication Systems from the University of Southampton, and the Ph.D. degree in Electronic Engineering from Queen Mary University of London. Currently, he is a Senior Researcher (Grade C) with the Information Technologies Institute of the Centre for Research and Technology Hellas (ITI-CERTH) and the Head of the Multimodal Data Fusion and Analytics (M4D) Group. His research interests include multimedia understanding and retrieval, multimodal fusion, computer vision, multimodal analytics, and artificial intelligence, as well as media & arts, environmental, and security applications. Dr. Vrochidis has participated in more than 80 European and National projects and has been a member of the organization teams of several conferences and workshops. He has edited 3 books and authored more than 300 related scientific journal, conference, and book chapter publications. He has served as a reviewer for several international journals and as a Technical Program Committee member in well-reputed conferences and workshops.
Dr. Ioannis (Yiannis) Kompatsiaris is the Director of the Information Technologies Institute, Research Director at CERTH-ITI, and the Head of the Multimedia Knowledge and Social Media Analytics Laboratory. His research interests include AI/Machine Learning for multimedia analysis, Semantics (multimedia ontologies and reasoning), Social Media and Big Data Analytics, Multimodal and Sensors Data Analysis, Human Computer Interfaces, e-Health, Arts and Cultural, Media/Journalism, Environmental and Security applications. He is the co-author of 178 papers in refereed journals, 63 book chapters, 8 patents, and 560 papers in international conferences. Since 2001, Dr. Kompatsiaris has participated in 88 National and European research programs, in 31 of which he has been the Project Coordinator. He has also been the PI in 15 contracts from the industry.

Elias B. Kosmatopoulos received the Diploma, M.Sc., and Ph.D. degrees from the Technical University of Crete, Greece, in 1990, 1992, and 1995, respectively. He is currently a Professor with the Department of Electrical and Computer Engineering, Democritus University of Thrace, Greece, and a collaborative Professor with the Information Technologies Institute, Centre for Research and Technology Hellas, Greece. Previously, he was a faculty member of the Department of Production Engineering and Management, Technical University of Crete (TUC), Greece, a Research Assistant Professor with the Department of Electrical Engineering-Systems, University of Southern California (USC), and a Postdoctoral Fellow with the Department of Electrical & Computer Engineering, University of Victoria, B.C., Canada. Dr. Kosmatopoulos' research interests are in the areas of learning systems, adaptive optimization and control, and their applications to the Internet of Things; energy-efficient buildings and smart grids; intelligent transportation systems; and robotic swarms. He is the author of over 55 journal papers. Prof. Kosmatopoulos has been leading many research projects funded by the European Union and the private sector, with a total budget of about 17 million Euros.
