
Article
Mapping Agricultural Soil in Greenhouse Using an
Autonomous Low-Cost Robot and Precise Monitoring
Amine Saddik 1, * , Rachid Latif 1 , Fatma Taher 2 , Abdelhafid El Ouardi 3 and Mohamed Elhoseny 4,5

1 Laboratory of Systems Engineering and Information Technology LISTI, National School of Applied Sciences,
Ibn Zohr University, Agadir 80000, Morocco
2 College of Technological Innovation, Zayed University, Dubai 144534, United Arab Emirates
3 SATIE, CNRS, ENS Paris-Saclay, Université Paris-Saclay, 91190 Gif-sur-Yvette, France
4 College of Computing and Informatics, University of Sharjah, Sharjah 27272, United Arab Emirates
5 Faculty of Computers and Information, Mansoura University, Mansoura 35516, Egypt
* Correspondence: [email protected]

Abstract: Our work is focused on developing an autonomous robot to monitor greenhouses and
large fields. This system is designed to operate autonomously to extract useful information from the
plants based on precise GPS localization. The proposed robot is based on an RGB camera for plant
detection and a multispectral camera for extracting the different spectral bands for processing, and an
embedded architecture integrating a Nvidia Jetson Nano, which allows us to perform the required
processing. Our system uses a multi-sensor fusion to manage two parts of the algorithm. Therefore,
the proposed algorithm was partitioned on the CPU-GPU embedded architecture. This allows us
to process each image in 1.94 s in a sequential implementation on the embedded architecture. The
approach followed in our implementation is based on a Hardware/Software Co-Design study to
propose an optimal implementation. The experiments were conducted on a tomato farm, and the
system showed that we can process different images in real time. The parallel implementation allows
to process each image in 36 ms, allowing us to satisfy the real-time constraints based on 5 images/s.
On a laptop, we have a total processing time of 604 ms for the sequential implementation and 9 ms
for the parallel processing. In this context, we obtained an acceleration factor of 66 for the laptop and
54 for the embedded architecture. The energy consumption evaluation showed that the prototyped
system consumes a power between 4 W and 8 W. For this reason, we opted in our case for a low-cost
embedded architecture based on the Nvidia Jetson Nano.

Keywords: autonomous robot; greenhouses; GPS localization; energy; multispectral camera;
embedded architecture; multi-sensor fusion; real-time

Academic Editor: Teodor Rusu

Received: 3 October 2022; Accepted: 13 November 2022; Published: 22 November 2022
1. Introduction
Autonomous systems have shown great advantages in all fields of technology. These
systems vary from high-complexity to low-complexity designs, depending on the tasks
expected of them. In addition, modern robots have undergone a huge revolution in terms
of autonomy and performed tasks, and more precisely a revolution in the field of
agriculture [1–3]. These robots can perform tasks ranging from simple ones to advanced
tasks that require robust algorithms. The successful performance of these robots requires
multi-sensor fusion approaches that include cameras, Light Detection and Ranging (LIDAR),
and radar. In this context, the objective of these robots is to navigate the agricultural fields
in order to extract useful information for the production of good quality agricultural
products [4]. The problem here is the high development price of these robots, which limits
their purchase by farmers and influences the production efficiency of these systems. As a
solution, robots based on low-cost sensors and systems have been proposed, so that
autonomous robots can perform complicated tasks, such as monitoring indices, counting
plants, and weed detection [5–7]. In addition, agricultural robots are divided into two types: aerial
robots and ground robots [8]. The problem of aerial robots is the energy consumption that
influences the operation of these systems. On the other hand, ground robots present an
efficient solution with high flexibility compared to aerial robots. Furthermore, soil robots
can perform local tasks in agricultural fields such as harvesting, the precise distribution of
chemicals, and others.
In the case of soil robots, we can find various solutions proposed for precision agri-
culture applications. R.P. Devanna et al. 2022 proposed a study based on a soil robot for
closed agricultural field monitoring. This work is based on a semi-supervised deep learning
model to detect pomegranates automatically. The robot developed is a semi-trainer in
order to improve the processing time compared to the other techniques developed. The
results show that the proposed system has achieved an F score of 86.42% and an IoU score
of 97.94% [9]. On the other hand, we can find the work of M. Skoczeń et al. 2021, who
developed techniques to avoid dynamic and static obstacles in agricultural fields. The
system proposed in this work is based on an RGB-D camera for depth calculation, and a soil
robot that moves in an autonomous mode. The results show that the system developed has
a distortion of 38 cm [10]. In a similar vein, we find M.R. Kamandar et al. 2022 proposing
a robot to improve hedge trimming and reduce the effort it requires. The developed robots
are built with servo motors and wheels for movement based on five degrees of freedom
to give some flexibility to the robots [11]. In addition, W. Zheng et al. 2022 proposed
a bio-inspired human approach to developing a robot able to manipulate efficiently in
agricultural fields [12]. K. Li et al. 2022 proposed a system based on an arm to process
kiwi fruit pollination. The results obtained showed that the developed arm has an accuracy
that varies between 82% and 87% [13]. In this context, we can find a variety of proposed
systems that aim to perform tasks on agricultural farms. These tasks aim to improve the
productivity of agricultural fields. Several applications have been proposed to help farmers
make decisions [14–18]. These developed systems require autonomous movement without
the farmers’ participation, making the algorithmic conception a complicated task. Several
attempts have been made in the literature to propose localization and mapping algorithms
in the automotive domain [19,20]. These algorithms have been adapted to control robots in
the agricultural field. These systems are based on simultaneous localization and mapping
(SLAM) [21]. In this context, several works have been proposed. U. Weiss et al. 2011
proposed a simulation of an autonomous robot based on a 3D lidar sensor instead of the
traditional method based on stereo cameras [22]. In another work, I. Ali et al. 2020 relied
on localization and mapping in the forest in order to build maps for surveillance [23].
Similarly, A.S. Aguiar et al. 2021 relied on SLAM algorithms for autonomous robot
movement. The approach uses a 3D construction to localize and build the map of agri-
cultural fields [24]. All of these proposed works that aim to provide robust solutions for
agricultural field monitoring are based on complex, high-cost systems, which limits the use
of these proposed approaches. The best choice is a low-cost, robust, and flexible system
that helps identify specific problems in agricultural fields, which will increase the chance
of use and help improve agricultural product productivity and yield.
Our work focuses on developing an autonomous robot for real-time plant monitoring
in open agricultural fields and in closed greenhouses. The study was based on a greenhouse
in the southern region of Morocco. This region is known for the high production of
agricultural products, such as tomatoes and pepper. These two plants require permanent
monitoring of vegetation, water, and fertilizer. Our proposed system is based on a robot
equipped with RGB and multispectral cameras (Parrot Sequoia +) and electric motors
and wheels for movement. The role of the RGB camera is to detect plants for processing,
and that of the multispectral camera is to extract images with several bands for index
processing. Additionally, we simulated our system on CATIA v2019 software for the
mechanical part and Proteus V8 for the electrical part. In addition, we tried to add the
precise localization of the plants to help the decision system. The results showed that the
robot is robust and flexible for various applications, such as weed detection and counting.
We tried to use an Nvidia Jetson Nano embedded architecture for the processing part, and
to control the motors and drivers to ensure autonomous movement, we used an Arduino
board. We aim to build a simple system of low energy consumption and cost, and our
proposed contribution is as follows.
(1) Development of an autonomous robot for real-time crop monitoring in open agricultural fields and greenhouses;
(2) A case study in a closed greenhouse based on tomato plants;
(3) An implementation on a low-cost embedded architecture based on the CUDA language;
(4) An optimization based on the Hardware/Software Co-design approach to decrease the processing time and memory consumption for real-time applications.
Our paper is organized as follows: Section 1 is a general introduction exploring the problem
to be treated; Section 2 presents the materials and methods, studying the algorithmic part
of our system; Section 3 presents the results obtained; and, finally, we conclude.

2. Materials and Methods Study


2.1. Area Study
Greenhouses present a robust solution to increase plant yield. These closed green-
houses help control several crop types to improve the performance of plants. Generally, the
monitoring is manually performed based on the experience of the farmers. This leads to
some failures in the decision-making process, affecting the crop’s productivity and reducing
agricultural product yield. Therefore, our work will focus on the tomato plant. Tomatoes
prefer humus-rich soil, full of nutrients, which warms up quickly. The plant is very demanding
and requires constant fertilization before planting and throughout its cultivation. Therefore,
tomatoes require the monitoring of vital signs, including water, nitrogen, and vegetation. For
this reason, we have evaluated the three most used indices for monitoring: the normalized
difference red edge index (NDRE), the normalized difference vegetation index (NDVI), and
the normalized difference water index (NDWI). The NDRE is based on red-edge reflection
and near-infrared (NIR) reflection to estimate the nitrogen quantity in the plant. The NDVI
is based on the red and NIR bands to estimate the vegetation state. The NDWI, based on
the green and NIR bands, estimates the water amount.
Equations (1)–(3) show the bands used to calculate these indices, with BNIR representing
the Near-infrared band, BR Red band, BG green band and BRedEdge Red-Edge band [25–27].

NDVI = \frac{B_{NIR} - B_{R}}{B_{NIR} + B_{R}}  (1)

NDWI = \frac{B_{G} - B_{NIR}}{B_{G} + B_{NIR}}  (2)

NDRE = \frac{B_{NIR} - B_{RedEdge}}{B_{NIR} + B_{RedEdge}}  (3)
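As a concrete illustration of Equations (1)–(3), the following minimal sketch computes the three indices pixel by pixel from the four Sequoia band images. It assumes the bands are already co-registered and loaded as floating-point OpenCV matrices; the file names and the helper function are illustrative and not the exact implementation used in this work.

```cpp
// Minimal sketch (not the exact implementation): per-pixel normalized difference
// of two co-registered band images, used for NDVI, NDWI, and NDRE.
#include <opencv2/opencv.hpp>

// Generic normalized difference (A - B) / (A + B), with a small epsilon to
// avoid division by zero on dark pixels.
static cv::Mat normalizedDifference(const cv::Mat& A, const cv::Mat& B)
{
    cv::Mat num = A - B;
    cv::Mat den = A + B + 1e-6f;
    cv::Mat index;
    cv::divide(num, den, index);   // element-wise division
    return index;                  // values in [-1, 1]
}

int main()
{
    // Hypothetical file names following the Sequoia band naming used in Algorithm 1.
    cv::Mat nir = cv::imread("IMG_0000_NIR.TIF", cv::IMREAD_GRAYSCALE);
    cv::Mat red = cv::imread("IMG_0000_RED.TIF", cv::IMREAD_GRAYSCALE);
    cv::Mat gre = cv::imread("IMG_0000_GRE.TIF", cv::IMREAD_GRAYSCALE);
    cv::Mat reg = cv::imread("IMG_0000_REG.TIF", cv::IMREAD_GRAYSCALE);
    nir.convertTo(nir, CV_32F); red.convertTo(red, CV_32F);
    gre.convertTo(gre, CV_32F); reg.convertTo(reg, CV_32F);

    cv::Mat ndvi = normalizedDifference(nir, red);   // Equation (1)
    cv::Mat ndwi = normalizedDifference(gre, nir);   // Equation (2)
    cv::Mat ndre = normalizedDifference(nir, reg);   // Equation (3)
    return 0;
}
```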
The greenhouse farm used in our study is located in the Ait Aamira region near Agadir.
This region is known for its high production of tomatoes and for its use of closed greenhouses
instead of open fields. The greenhouses studied in our research are located at 30°09′13″ N, 9°30′50″ W.
Figure 1 shows the greenhouses to be studied. The structure of the greenhouse is based on
a plastic cover. The greenhouse studied in this research is divided into 20 regions. Each
region contains two tomato rows, with 100 m width and 75 m length and 1.5 m between
the rows. In addition, each region is divided into two rows of 48 m each. Figure 2 shows
the greenhouse structure and Figure 3 shows real images of the
closed greenhouse used in this study.
Figure 1. Greenhouse location.

Figure 2. Greenhouse structure.

Figure 3. Greenhouse real area.

The study was conducted on the totality of the rows to build a general report of the tomato
plants' condition. The tool used for the data collection was based on the Parrot Sequoia+
multispectral camera. The idea here is to monitor each plant based on the precise GPS
coordination of the multispectral camera to deliver an accurate plant assessment that gives
each plant its index value and GPS coordinates afterward. The use of these coordinates
will help to determine which plant lacks fertilizer, water, or vegetation in a precise approach.
In addition, this type of camera gives images separated into four bands, red, green,
near-infrared, and RedEdge, with a resolution of 1280 × 960 pixels for each image, high
enough to visualize the plants. This gives the flexibility to keep just the RGB image
delivered and remove the other unused bands.

2.2. System Modelling
From the literature study, we can conclude that we have three tools for algorithm
validation. The first tool is the satellite, the second is the drone, and the third is the ground
robot. These three tools aim to support the different sensors that collect, process, and
sometimes make real-time decisions. In the case of the satellite, we cannot make decisions
simultaneously, and the low resolution of the images causes a wrong diagnosis. For this
reason, these solutions are limited to medium and large agricultural fields. Additionally,
the application side is limited because the images cannot help when applying counting
algorithms, weed detection, and different diseases. So, we are limited to two solutions in
our case, either the drones or the soil robots. Unmanned aerial vehicles have shown solid
and efficient solutions for surveillance and different applications. However, the problem
with these tools is the battery's autonomy, which does not reach 30 min of flight if no
overload has been applied. On the other hand, if we want to build a decision system using
a UAV (Unmanned Aerial Vehicle), this will increase the weight, affecting the flight time.
Similarly, UAVs are not flexible when we want to use surveillance in closed greenhouses.
These constraints make the ground robots very strong regarding accuracy and flexibility
of applications in open and closed fields such as greenhouses. For this study, we will
focus on designing a platform consisting of a soil robot that moves autonomously to
monitor vegetation, water, nitrogen, and different applications, such as counting and weed
detection. It can also make decisions in real time. This research aims to show our proposed
algorithms' applicability and utility. In this context, we have developed a system named
VSSAgri (vegetation surveillance system for precision agriculture applications). This robot
aims to validate the monitoring algorithms proposed in this work. The proposed prototype
is based on an embedded architecture and electrical motors powered by a battery. Similarly,
it offers a low-cost solution compared to those proposed in the literature.

The proposed system validation has been designed through several steps. Among
these steps is the system modeling on CATIA software to study the functional aspect before
the hardware design. As for the electrical part, we used Proteus for the functional electrical
diagram of the system. In this framework, the system is divided into two parts. The
electrical part consists of electric motors and a 12 V battery. The second part is based on the
mechanical modeling of the different components.

2.2.1. Mechanical Study
The proposed system consists of a metal support with a length of 150 cm and a width of
65 cm. The dimensions chosen in our case are based on the test we performed to validate
the system. In a real case, either in a greenhouse or an open field, we can change the
dimensions of the system. Figure 4 shows the metal support used in our case.

Figure 4. Metal support.

This support is equipped with two barriers at the extremities. The role of these barriers
is to guarantee the flow of the processing system. These barriers will support the box that
contains the RGB camera, the multispectral camera, and the embedded architecture. In this
context, we have tested several solutions based on just two cables that would ensure
movement, but the problem with these cables is that the movement of the box influences
the speed of displacement due to friction. Figure 5 shows the solution that was proposed
before the two barriers.

Figure 5. First proposed solution for box movement.

As shown in Figure 5, we have two metal cables at the extremities of the support. This
solution has shown some constraints, such as the friction of the box during the displacement.
For this reason, we have selected the use of barriers. Then, we have a metal box that contains
a power bank for the power supply of the embedded architecture, and this box carries the
different cameras used. As a second solution, we proposed the use of wheels with electric
motors for the movement in the two barriers. Figure 6 shows the box used.

Figure 6. Box used in our prototype.

Additionally, we used eight wheels with electric motors, four to drive horizontally and
the others to drive the box's support vertically. These wheels realize a scanner principle
with a horizontal and vertical displacement to guarantee a general vision of the field to be
handled. The wheels used can be changed if we have movement problems related to the
geography of the agricultural areas. For some applications, we need big wheels that will
ensure movement without problems. The collection of all these components gives a
complete system, as shown in Figure 7 with the different views in the CATIA software.

Figure 7. Different views of the prototype using the CATIA software.

Generally, the system moves in a horizontal direction from the box that contains the
cameras and the embedded architecture, as shown in Figure 7. Displacement 1 guarantees
that the box will cross the horizontal axis to process all the plants in this row. Likewise,
displacement 2 ensures the movement of all the support on the field to move to the second
row. These two mechanisms ensure the processing of the whole field. In this case, the
temporal constraint is the plants' real-time indices processing to avoid data loss. This
temporal constraint depends on the type of camera that will be used. If we want to monitor
indices such as NDVI, NDWI, and NDRE, we will need a multispectral camera with a
timelapse of 5 frames/s. For monitoring the RGB indices, and for applications such as plant
counting, weed, and disease detection, we will need an RGB camera with a timelapse of
30 frames per second. Temporal constraints in this case have been studied [5,25] based on
the different embedded architectures, either CPU-GPU or CPU-FPGA.

2.2.2. Electrical Study
This section focuses on the electrical design of our system. It is equipped with eight
electrical motors of low energy consumption. These motors are powered by a 12 V battery
that delivers the necessary voltage and current for the system to operate. Four motors are
reserved for the movement of all the metal support, and the others for the movement of
the robot. The motors used in the system are flexible and can rotate in both directions. The
fundamental design of a DC machine is described by Equations (4)–(7) and Figure 8.

Figure 8. Electrical DC machine diagram.
U_1 = R_1 \cdot I_1 + L_1 \cdot \frac{dI_1}{dt}  (4)

with U_1 the voltage of the supply field winding, R_1 the resistance of the supply field winding, L_1 the inductance of the field coil, and I_1 the current of the supply field winding.

U_2 = R_2 \cdot I_2 + L_2 \cdot \frac{dI_2}{dt} + \omega \cdot M_{SR} \cdot I_1  (5)

with U_2 the voltage of the armature coil, R_2 the resistance of the armature coil, I_2 the current of the armature coil, \omega the angular speed of the motor, and M_{SR} the mutual inductance.

M_1 + M_2 = J \cdot \frac{d\omega}{dt} + B \cdot \omega  (6)

with M_2 the moment of load, M_1 the moment of conversion, J the total engine inertia, and B the coefficient of friction.

M_1 = M_{SR} \cdot I_1 \cdot I_2  (7)
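As a quick sanity check of Equations (4)–(7), the sketch below integrates the DC machine model with a simple forward-Euler step. The numerical parameter values are illustrative assumptions, not measurements of the motors used in the prototype.

```cpp
// Forward-Euler integration of the DC machine model of Equations (4)-(7).
// All parameter values below are illustrative assumptions, not measured data.
#include <cstdio>

int main()
{
    // Illustrative parameters (SI units).
    const double R1 = 2.0, L1 = 0.05;         // field winding resistance / inductance
    const double R2 = 1.0, L2 = 0.02;         // armature resistance / inductance
    const double Msr = 0.3, J = 0.01, B = 0.001;
    const double U1 = 12.0, U2 = 12.0;        // supply voltages (12 V battery)
    const double M2 = -0.05;                  // load torque (opposing)

    double I1 = 0.0, I2 = 0.0, w = 0.0;       // states: field current, armature current, speed
    const double dt = 1e-4;

    for (int k = 0; k < 20000; ++k) {         // simulate 2 s
        const double dI1 = (U1 - R1 * I1) / L1;                    // Equation (4)
        const double dI2 = (U2 - R2 * I2 - w * Msr * I1) / L2;     // Equation (5)
        const double M1  = Msr * I1 * I2;                          // Equation (7)
        const double dw  = (M1 + M2 - B * w) / J;                  // Equation (6)
        I1 += dt * dI1; I2 += dt * dI2; w += dt * dw;
    }
    std::printf("speed after 2 s ~ %.1f rad/s\n", w);
    return 0;
}
```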

The control of the motors was based on a low-cost Arduino Nano board, and we used
only two boards for the motor drivers. The first will control the support motors and the
other those of the box. These drivers allow us to synchronize all the motors to ensure a
simultaneous movement of the support and the box. The role of the Arduino board is to
control the motors through the drivers. Figure 9 shows the electrical diagram of our system.

Figure 9. Electrical diagram.
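The sketch below illustrates, under stated assumptions, how the Arduino Nano can drive the two motor-driver channels described above (one for the box, one for the support) from simple serial commands sent by the processing unit. Pin numbers, the command protocol, and the enable/direction interface are assumptions for illustration, not the exact wiring of the prototype.

```cpp
// Illustrative Arduino Nano sketch for the front-end motor control described above:
// one driver channel moves the camera box, the other moves the metal support.
// Pin numbers and the simple direction/PWM interface are assumptions.
const int BOX_DIR = 4, BOX_PWM = 5;           // driver inputs for the box motors
const int SUPPORT_DIR = 7, SUPPORT_PWM = 6;   // driver inputs for the support motors

void setup() {
  pinMode(BOX_DIR, OUTPUT);     pinMode(BOX_PWM, OUTPUT);
  pinMode(SUPPORT_DIR, OUTPUT); pinMode(SUPPORT_PWM, OUTPUT);
  Serial.begin(9600);                          // commands arrive from the processing unit
}

void drive(int dirPin, int pwmPin, bool forward, int speed) {
  digitalWrite(dirPin, forward ? HIGH : LOW);
  analogWrite(pwmPin, speed);                  // 0-255 duty cycle
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'B') drive(BOX_DIR, BOX_PWM, true, 180);           // move the box
    if (cmd == 'S') drive(SUPPORT_DIR, SUPPORT_PWM, true, 180);   // move the support
    if (cmd == 'X') { analogWrite(BOX_PWM, 0); analogWrite(SUPPORT_PWM, 0); } // stop
  }
}
```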
2.2.3. Algorithm Study
Our proposed algorithm is based on two parts. The first one is the front end, which
controls the autonomous movement of the robot. On the other side, we have the back-end
part, which focuses on processing the indices and counting the plants with a suitable
threshold. Figure 10 shows the proposed global algorithm.

Figure 10. General algorithm overview.

The algorithm proposed in Figure 10 is based on the autonomous robot movement
control and indices processing. The front-end part ensures the movement of the different
parts of the robot based on an Arduino board, electric motors, and drivers. The algorithm
starts with the box movement; this box contains the embedded architecture that will process
the images from the multispectral and RGB cameras. We used a power bank for the power
supply of the box. On the other hand, the movement control part contains a large 12 V
battery. After the box movement, the vision system detects whether we have a plant or not;
if yes, then it applies a 1 s delay, allowing the multispectral camera to take images with
different bands for the indices processing. We have chosen a 1 s delay, depending on our
camera's timelapse (5 frames/s). If not, the system will move the box again to find the plant.
The displacement time depends on the distance between the plants. For example, if we have
plants one after the other, then every second we will have five images. In the opposite case,
the time to take a new image depends on the distance between the plants. In the case of
having located the plant, the multispectral camera takes images with several bands in order
to extract useful information. As soon as the images are ready, the system sends them to
the back-end to process the indices. The algorithmic part of the index calculation has been
studied recently [5,25]. Algorithm 1 shows the front-end process.

Algorithm 1: Front-End
1. Start algorithm
2. Inputs: Support Moving or Start
3. Function Move Box
4. Driver setting
5. Give a command to the motors
6. Run M1, M2, M3, M4
7. End Function
8. Function Detect plant
9. For I = 0 to max line
10. For J = 0 to Max columns
11. Find the green color
12. Find the plant’s morphology
13. End For
14. End For
15. End function
16. IF The plant not detected Then
17. Return to Move Box
18. End IF
19. Else The plant detected Then
20. 1 s delay
21. Image taken
22. End Else
23. Function Send images to Back-end
24. RED = IMG_210903_104621_0000_RED
25. GRE = IMG_210903_104621_0000_GRE
26. NIR = IMG_210903_104621_0000_NIR
27. REG = IMG_210903_104621_0000_REG
28. End function
29. End Algorithm
30. Output: RED, GRE, NIR, REG
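To illustrate the "Detect plant" function of Algorithm 1 (scanning the frame for green color and a plant-like region), here is a minimal OpenCV sketch; the HSV thresholds and the minimum-area criterion are assumptions chosen for illustration.

```cpp
// Illustrative sketch of the front-end plant detection (Algorithm 1, "Detect plant"):
// look for green pixels in the RGB frame and accept the detection if a sufficiently
// large green region (a rough morphology test) is found. Thresholds are assumptions.
#include <opencv2/opencv.hpp>

bool detectPlant(const cv::Mat& frameBGR)
{
    cv::Mat hsv, mask;
    cv::cvtColor(frameBGR, hsv, cv::COLOR_BGR2HSV);

    // "Find the green color": keep pixels whose hue falls in a green range.
    cv::inRange(hsv, cv::Scalar(35, 60, 60), cv::Scalar(85, 255, 255), mask);

    // "Find the plant's morphology": clean the mask and require one large blob.
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    for (const auto& c : contours)
        if (cv::contourArea(c) > 0.01 * frameBGR.total())  // at least 1% of the frame
            return true;                                   // plant found: trigger the 1 s delay
    return false;                                          // no plant: move the box again
}
```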

On the other hand, for the back-end part, a plant counting algorithm based on
classification has been added to count the number of plants that have a low or high index,
in order to have a general overview. Once the indices are calculated, the algorithm sends
the data for segmentation and counting. For the indices processing, we performed a
segmentation step that eliminates the negative values, which correspond to the absence of
vegetation (e.g., soil, earth). The NDVI, NDWI, and NDRE values calculated previously are
now between 0 and 1. The closer the value is to 1, the more the index reflects good results.
Then, we performed multiple thresholding; we considered the values between 0.2 and 0.3
as being relatively weak, so we assigned them the red color; the values between 0.3 and
0.6 were slightly better, so we assigned them the orange color; the values between 0.6 and
0.9 showed a very high level, so we assigned them a green color. These different thresholds
were applied in order to make the images more interpretable for the farmer. The system
creates a file containing the index name and value in the output. On the other side, a file
contains the images colored for each type of index; overall, we will have four files: one for
the RGB images, one for NDVI, one for NDWI, and one for NDRE. Algorithm 2 shows the
thresholding operation.

Algorithm 2: Back-End
1. Start algorithm
2. Input: Indices value
3. For I = 0 to max line
4. For J = 0 to Max columns
5. Apply thresholding process on NDVI, NDWI, and NDRE
6. Apply the colorization based on the threshold
7. n < NDVI < m, n < NDWI < m, n < NDRE < m (n and m between 0.2 and 1)
8. End For
9. End For
10. Function Classification
11. Writing results in NDVI file
12. Writing results in NDWI file
13. Writing results in NDRE file
14. End Function
15. Function Computing
16. For I = 0 to max images
17. Convert NDVI images from RGB space to HSV
18. Convert NDWI images from RGB space to HSV
19. Convert NDRE images from RGB space to HSV
20. Find the edges
21. Apply the “Watershed” function
22. End For
23. For I = 0 to max line
24. For J = 0 to Max columns
25. Complete the segmented images
26. Count the number of plants
27. End For
28. End For
29. Storage of segmented images
30. End Function
31. End Algorithm
32. Output: Number of plants
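To make the thresholding step of Algorithm 2 concrete, the following minimal sketch colors an index map with the three classes described above (0.2–0.3 red, 0.3–0.6 orange, 0.6–0.9 green) after discarding the values below the first threshold; the helper name and the exact color values are ours, for illustration only.

```cpp
// Minimal sketch of the back-end thresholding/colorization step (Algorithm 2):
// low index values (soil, earth) are discarded and the remaining values are mapped
// to red, orange, or green according to the thresholds given in the text.
#include <opencv2/opencv.hpp>

cv::Mat colorizeIndex(const cv::Mat& index)   // index: CV_32F, values in [-1, 1]
{
    cv::Mat out(index.size(), CV_8UC3, cv::Scalar(0, 0, 0));
    for (int i = 0; i < index.rows; ++i) {
        for (int j = 0; j < index.cols; ++j) {
            const float v = index.at<float>(i, j);
            if (v < 0.2f) continue;                             // below the first threshold: left unclassified
            cv::Vec3b& px = out.at<cv::Vec3b>(i, j);            // BGR pixel
            if (v < 0.3f)       px = cv::Vec3b(0, 0, 255);      // red: relatively weak
            else if (v < 0.6f)  px = cv::Vec3b(0, 165, 255);    // orange: slightly better
            else if (v < 0.9f)  px = cv::Vec3b(0, 255, 0);      // green: very high level
        }
    }
    return out;   // one such image is written to the NDVI, NDWI, or NDRE file
}
```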

The purpose of the watershed technique is to segment the image. It treats the image as
a topographic map based on the intensity of the pixels. We find semantic segmentation,
which refers to the process by which the program links each pixel of an image to a particular
class label. In our case, the result obtained would be a single class, and all the pixels
would belong to the same class, so the plants would be treated as a single object. Instance
segmentation is identical to semantic segmentation except that it treats the different objects
in the image, several objects of the same class, as distinct entities, which allows us to count
the number of objects present in the image. The methodology followed in this last part of
the algorithm is as follows, with a sketch given after the list:
(a) We convert the indices image to HSV (hue, saturation, value).
(b) We create the CV_8U version of our HSV image, then we look for the contours present
in the HSV image.
(c) We trace the contours on the original image; this tracing step is divided into two steps:
a. Tracing the markers of the foreground;
b. Tracing the background markers in white;
c. The final image is a superposition of the two tracings a and b.
(d) We perform the segmentation using the OpenCV function “Watershed”; afterwards,
we fill the labeled objects with randomly generated colors.
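A minimal OpenCV sketch of steps (a)–(d) is given below, assuming the colorized index image produced by the thresholding stage is available as an 8-bit, 3-channel matrix; the marker-seeding details are simplified with respect to the full implementation.

```cpp
// Illustrative sketch of steps (a)-(d) for instance segmentation and counting,
// assuming `colored` is the colorized index image (CV_8UC3) from the previous stage.
#include <opencv2/opencv.hpp>

int countPlants(const cv::Mat& colored)
{
    // (a) Convert the colorized index image to HSV.
    cv::Mat hsv;
    cv::cvtColor(colored, hsv, cv::COLOR_BGR2HSV);

    // (b) Build a CV_8U mask from the saturation channel and find the contours in it.
    std::vector<cv::Mat> ch;
    cv::split(hsv, ch);
    cv::Mat bin;
    cv::threshold(ch[1], bin, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(bin, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    // (c) Trace the foreground markers (one label per contour) and a background marker.
    cv::Mat markers = cv::Mat::zeros(colored.size(), CV_32S);
    for (int i = 0; i < static_cast<int>(contours.size()); ++i)
        cv::drawContours(markers, contours, i, cv::Scalar(i + 1), cv::FILLED);
    cv::circle(markers, cv::Point(5, 5), 3, cv::Scalar(255), cv::FILLED);   // background seed

    // (d) Watershed segmentation: each label now corresponds to one distinct plant.
    cv::watershed(colored, markers);
    return static_cast<int>(contours.size());   // number of plants found in the image
}
```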
The back-end part is based on image processing and is divided into three functions.
The first function is for the pre-processing of the images, the second for the indices process-
ing, and the third for the counting operation. After the algorithm finishes the back-end, it
goes back to the front end; then a test is applied to check whether the system has finished the vertical line.
If yes, it will move all the metal support. If no, it will move only the box for the next plant.

3. Results and Discussion



3.1. Test and Implementation


The processing test was based on two architectures, the first is a low-cost embedded
architecture type Nvidia Jetson Nano and the second is a laptop. Using the Jetson Nano
architecture has shown that the processing time is reduced using the CUDA language, and
we can achieve real-time processing. It is the same case for the laptop, but the problem here
is the portability of our system. Table 1 shows the specifications of the devices used.

Table 1. Architecture specification.

Device type Laptop Nvidia Jetson Nano


Processor type Intel CORE ARMv8
CPU name i7-10510U ARM A57
Base frequency 1.80 GHz 1.43 GHz
Number of cores 4 (8 threads) 4
GPU GeForce MX250 Tegra X1
GPU Architecture Pascal Maxwell
Base frequency 1519 MHz 643 MHz
Number of cores 384 128
Memory 16 GB DDR4 (Double Data Rate) 4 GB LPDDR4 (Low Power Double Data Rate)

At the first step, a sequential implementation has been proposed in order to study
the back-end part of our algorithm. On both the laptop and the Jetson Nano, the three
functions constitute the total processing time of the algorithm. Therefore, a workload
analysis based on a Hardware/Software Co-Design approach allows us to reduce the
processing time as much as possible for these three functions on both architectures, and
mainly on the Jetson Nano. Its compact size of 10 cm × 8 cm × 2.9 cm and its minimal
power consumption of 5 W, added to its limited resources, adapted memory capacity, and
relatively low cost, allow it to meet the constraining specifications of embedded systems.
After the sequential implementation, we used the CUDA parallel programming language
to accelerate the processing. Figure 11 shows the processing time distribution of the
sequential implementation based on C/C++.
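The per-function percentages reported in Figure 11 and the timings in Table 2 were obtained with profiling tools; as a simple illustration of how such a workload breakdown can be measured on the CPU side, the sketch below times the three back-end stages with std::chrono. The function names are placeholders.

```cpp
// Illustrative timing harness for the three back-end functions (names are placeholders).
#include <chrono>
#include <cstdio>

template <typename F>
double timeMs(F&& f)
{
    const auto t0 = std::chrono::steady_clock::now();
    f();
    const auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

void preProcessing()     { /* resize, band alignment, ... */ }
void indicesProcessing() { /* NDVI / NDWI / NDRE */ }
void counting()          { /* watershed + counting */ }

int main()
{
    const double t1 = timeMs(preProcessing);
    const double t2 = timeMs(indicesProcessing);
    const double t3 = timeMs(counting);
    std::printf("pre-processing %.3f ms, indices %.3f ms, counting %.3f ms\n", t1, t2, t3);
    return 0;
}
```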

Figure 11. Processing time analysis.
In Figure 11, we can conclude that on the laptop, the pre-processing part occupies
more than 64% of the processing time, while on the Jetson Nano architecture, it occupies
only 42%. After the time analysis, we decided to accelerate the three functions to decrease
the processing time and satisfy the real-time constraint. Table 2 shows a processing time
comparison on the laptop and Jetson Nano based on C/C++ and CUDA.

Table 2. Processing time comparison.

                      C/C++ (s)   CUDA (s)   Acceleration
Laptop
Pre-processing        0.3878      0.0039     99.43
Indices processing    0.0149      0.0015     9.93
Counting              0.2021      0.0036     56.13
Jetson Nano
Pre-processing        0.8334      0.0064     130.21
Indices processing    0.4321      0.0124     34.84
Counting              0.6783      0.0175     38.76
Pre-processing 0.8334 0.0064 130.21
Using the results of the performance profiler, we have gone from a processing time
equal to 387.7 ms for the pre-processing to 3.9 ms, which is an acceleration of almost
100 times. For the indices processing, we have an acceleration of 10, and, finally, the
counting has been accelerated by 56 times. For the heterogeneous embedded system Jetson
Nano, we have gone from a processing time equal to 833.4 ms for the pre-processing to
6.4 ms; an acceleration of 130 times has been achieved. For the second function, we have
gone from 432.1 ms to 12.4 ms, equivalent to an acceleration of 34 times, and, finally, the
counting has been accelerated by a value of 38 times. Figure 12 represents a temporal
synthesis of the different functions on our laptop and on the Jetson Nano.
Figure 12. Temporal synthesis of the different functions.
After the processing time analysis and acceleration, we added a detailed study on
memory consumption and processing time. The analysis results obtained have been based
on the profiler proposed by Nvidia. The tool offered us the percentage of time that these
functions take compared to the global activity level of the GPU for the functions "CUDA
memcpy H_to_D" and "CUDA memcpy D_to_H". Table 3 shows a summary of the
results obtained.

Table 3. GPU activity and data workload results.

                      GPU Activity (%)               Data Workload
                      GeForce MX250   Jetson         GeForce MX250 (GB/s)   Jetson (MB/s)
CUDA memcpy H_to_D    34.78           30.35          2.4829                 761.1025
CUDA memcpy D_to_H    18.46           15.62          2.7274                 803.36
Total                 53.24           45.97          5.21                   1564
Table 3 shows that in the case of the laptop we have a CPU→GPU transfer rate equal
to 2.4829 GB/s and a GPU→CPU rate equal to 2.7274 GB/s. The Jetson Nano embedded
system has values 3 times lower, in the order of MB/s. Our system consists of two
movements, the first for the box containing the embedded architecture and the cameras
used for processing. The second part focuses on moving all the metal support to a new row.
With this method, we can ensure that all the plants will be processed. Figure 13 shows an
overview of our system.

Figure 13. System overview.

The first step that was performed was the testing and validation of our robot in a
closed space based on three rows, to validate the mechanism of our system. Each row
contains mint, parsley, and pepper plants, respectively. The validation of this system in a
closed space does not imply accurate functionality in a real environment. For this reason,
after this validation we opted to perform a real test in order to validate our algorithmic
and systematic approach. In Figure 13, we have on the top left the developed robot and, on
the bottom, a multispectral and RGB camera view. In the bottom right image, we have the
box that contains the Arduino architecture that operates the movement part of our robot.
Figure 14 shows images collected by our robot.

Figure 14. Camera overview of our system.

After the prototype validation in the laboratory, we moved to the field validation to
evaluate the prototype performance. The results showed that the prototype works under
the same conditions and mechanism as in the laboratory. The test was conducted in an open
field and a closed greenhouse, showing our system's flexibility. In addition, our robot can
be adapted to the different precision agriculture applications by editing the back-end with
the appropriate algorithm. These applications can be weed detection, fruit and plant
counting, or disease detection. In addition, a decision-making system can be added to take
real-time actions in the agricultural field. This approach will help the farmer make precise
and fast decisions and avoid difficult problems. The developed system is characterized by
its low cost and low energy consumption. Figure 15 shows the test of our system in a real
agricultural field.

Figure 15. Test and evaluation of the system in a real field.
In Figure 15, module 1 represents the battery that powers the motors. A power bank supplies the camera and the embedded architecture. Module 2 is a box with the control part that sends the actions to the motors. Module 3 is the robot embedding the cameras and the architecture-based processing. Additionally, we added a power consumption analysis as part of the robot specifications study. Figure 16 shows the results obtained.

Figure 16. Power and current consumption of the electric motors.
In Figure 16, we have measured the current consumption and the power of the electric motors. We made several iterations in order to observe the consumption over time. The maximum power consumption is about 2.9 W, and the maximum current consumption is 0.59 A.
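These figures can be reproduced offline from the logged samples. The following is a minimal Python sketch, assuming the motor supply voltage and current are logged to a simple CSV file; the file name and the column names are illustrative and do not correspond to a specific logger used on the robot.

```python
# Minimal sketch: recompute peak/average motor power from logged samples.
import csv

def summarize_power(log_path):
    """Peak and mean power, and peak current, from a CSV of (voltage, current) samples."""
    powers, currents = [], []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            v = float(row["voltage_V"])   # assumed column: motor supply voltage
            i = float(row["current_A"])   # assumed column: measured motor current
            currents.append(i)
            powers.append(v * i)          # instantaneous power P = V * I
    return {
        "max_power_W": max(powers),
        "mean_power_W": sum(powers) / len(powers),
        "max_current_A": max(currents),
    }

# Example with a hypothetical log file: summarize_power("motor_log.csv")
```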


3.2. Experimental Results
The approach used in this work was based on evaluating several indices, namely NDVI, NDWI, and NDRE. These indices are then collected over all 20 zones to determine the regions with low indices. Afterward, we try to determine the GPS coordinates of each region with its index. The first index that has been calculated is NDVI. Figure 17 shows the evaluation of NDVI in the 20 zones.

Figure 17. NDVI evaluation based on 20 regions.


The NDVI processing was based on several plants to evaluate the variation of this index. The indices' values vary between 0.15 and 0.8. Generally, values close to 1 indicate strong vegetation (the plant has no problem at the vegetation level). Once we have calculated the variation of NDVI on several plants in the different regions, we compute the NDVI average in each region to define the regions with less vegetation. This method allows us to locate the regions with vegetation problems. Figure 18 shows the average NDVI in each region.
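The per-pixel computation behind this step can be sketched as follows. This is a minimal Python/NumPy illustration, assuming the red and near-infrared reflectance bands have already been extracted and aligned from the multispectral images; the grouping of per-plant NDVI means by zone is an assumed input produced by the plant detection step, not part of this sketch.

```python
# Minimal sketch: per-pixel NDVI and per-zone NDVI averages.
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel on aligned bands."""
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / (nir + red + 1e-6)   # small epsilon avoids division by zero

def zone_averages(plant_means_by_zone):
    """Mean NDVI per zone from the per-plant means collected in each of the 20 zones."""
    return {zone: float(np.mean(values)) for zone, values in plant_means_by_zone.items()}
```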

Figure 18. NDVI average.

The results in Figure 18 show that zones 6 and 10 give a low index compared to the other zones. The mean values vary between 0.15 for zone 6 and 0.67 for zone 9. Zones 6 and 10 present low values due to the supply system of the necessary plant components. This reflects the strong relationship between vegetation, water, and the nitrogen content in the plants. After locating the regions with a low index, we tried to locate the plants with the vegetation index. Figure 19 shows the vegetation index calculated for each plant.
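The zone-then-plant drill-down described above can be sketched as below; the 0.3 cut-off used to mark a zone as having low vegetation is illustrative only and is not a value fixed by this study.

```python
# Minimal sketch: flag low-NDVI zones, then return their per-plant values.
def low_vegetation_zones(zone_means, plant_means_by_zone, threshold=0.3):
    flagged = [zone for zone, mean in zone_means.items() if mean < threshold]
    return {zone: plant_means_by_zone[zone] for zone in flagged}

# With the averages reported above, zones 6 and 10 (means around 0.15) would be flagged.
```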

Figure 19. NDVI of each plant for 20 regions.
As a result of evaluating the different plants in the areas with a vegetation problem, we determined each plant's exact position through the images collected with the precise GPS data of the Parrot Sequoia+ camera. This localization gives us the plant where we have the vegetation problem, which will help the farmer or robots make precise decisions. Figure 20 shows images of the plants and their NDVI results. In Figure 20 are the RGB images of tomato plants collected in the greenhouse, together with the evaluation of the binarized NDVI index using the threshold T1 = 0.5. Additionally, we have tried to vary the threshold for the red images with T2 = 0.4. This threshold variation shows the plants that have an index higher than T1 or T2. This operation will help us to classify the final results. To determine the index with its proposed threshold, we need to take the plants of each type and create a test with the vegetation sensors afterward to provide each plant with its own index for decision-making.
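A minimal sketch of this binarization step is given below, assuming an NDVI raster as input. T1 = 0.5 and T2 = 0.4 are the thresholds quoted above; how the two masks are displayed or combined afterwards is an illustrative choice.

```python
# Minimal sketch: binarize an NDVI image with the two thresholds discussed above.
import numpy as np

def binarize_ndvi(ndvi_img, t1=0.5, t2=0.4):
    mask_t1 = ndvi_img > t1   # stricter threshold
    mask_t2 = ndvi_img > t2   # relaxed threshold
    return mask_t1, mask_t2

# Pixels above T2 but not above T1 point to plants whose vegetation level is borderline.
```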
Figure 20. RGB and NDVI image evaluation.

After determining the NDVI vegetation index, we need to calculate the other indices, namely NDRE and NDWI. The reason for calculating these indices is that vegetation alone is not enough to indicate that the plant is in good condition. The second evaluation was based on the NDRE index shown in Figure 21. The NDRE evaluation was based on a 0.2 threshold: regions with an index lower than 0.2 suffer from a nitrogen deficiency in the plants. Figure 21 summarizes the results obtained for the 20 evaluated zones. The results show that the index varies between 0.079 and 0.75, with the minimum in zone 10 and the maximum in zone 5. We also find zones 1 and 6 with NDRE values of 0.15 and 0.08. These regions reflect the lack of nitrogen in the plants. As with the vegetation index, zones 6 and 10 are in the same situation, and the NDRE evaluation adds another zone that suffers from a lack of nitrogen. After a first synthesis, we concluded that the suffering zones are 1, 6, and 10. For this reason, it is necessary to extend the evaluation and examine the average of the NDRE index in the different regions. Figure 22 shows the evaluation of the different areas based on each zone's average.
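A minimal sketch of the NDRE step is given below, using the usual formulation NDRE = (NIR - RedEdge) / (NIR + RedEdge) together with the 0.2 threshold discussed above; the band arrays and the per-zone means are assumed inputs.

```python
# Minimal sketch: per-pixel NDRE and the nitrogen-deficiency flag per zone.
import numpy as np

def ndre(nir, red_edge):
    nir = nir.astype(np.float32)
    red_edge = red_edge.astype(np.float32)
    return (nir - red_edge) / (nir + red_edge + 1e-6)

def nitrogen_deficient_zones(zone_means, threshold=0.2):
    """Zones whose mean NDRE is below 0.2 (nitrogen deficiency)."""
    return sorted(zone for zone, mean in zone_means.items() if mean < threshold)
```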

Figure 21. NDRE evaluation for 20 regions.

Figure 22. NDRE average.

The application of the average processing in the different zones has reinforced the synthesis elaborated in Figure 22. It also confirmed that zones 1, 6, and 10 suffer from the lack of nitrogen. Our evaluation methodology is based on a thorough evaluation that aims to determine the exact plants where we have problems. In Figure 23, we have shown an evaluation of 20 plants in regions 1, 6, and 10. The results show that the plants in regions 6 and 10 have very low values compared to zone 1: their values vary between 0.05 and 0.1, whereas zone 1 varies between 0.11 and 0.2. After evaluating the different plants, it is necessary to determine the precise localization of the plants. The third vital index opted for in our evaluation is NDWI. This index determines the amount of water in the vegetation. Figure 24 shows the evaluation of the NDWI.

Figure 23. NDRE for 1, 6 and 10 zones.
The NDWI evaluation is based on a threshold of 0.3. Areas with less than 0.3 have
a water deficiency or water absence, while areas with more than 0.3 have water. The
index evaluation showed that, like the NDRE, areas 1, 6, and 10 have low water content.
Figures 25 and 26 present the results in each zone.
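A minimal sketch of the NDWI step and of the cross-index synthesis is given below. NDWI is written here in its green/NIR form, which matches the bands available on the multispectral camera; this choice and the set-intersection helper are illustrative assumptions rather than the exact implementation of the system.

```python
# Minimal sketch: per-pixel NDWI and the intersection of problem zones across indices.
import numpy as np

def ndwi(green, nir):
    green = green.astype(np.float32)
    nir = nir.astype(np.float32)
    return (green - nir) / (green + nir + 1e-6)

def priority_zones(low_ndvi, low_ndre, low_ndwi):
    """Zones flagged by every index are the highest-priority ones for intervention."""
    return sorted(set(low_ndvi) & set(low_ndre) & set(low_ndwi))
```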

Figure 24. NDWI evaluation based on 20 regions.

Figure 25. NDWI average.

Figure 26. NDWI for 1, 6 and 10 zones.

The NDWI values in Figure 26 range from 0.618 to 0.13 for zones 18 and 10. The calculation of NDWI in the figure is based on the average of each zone. The global analysis of the greenhouse study showed that vegetation problems appeared in zones 6 and 10, while water and nitrogen problems appeared in zones 1, 6, and 10. This methodology of interpretation will help farmers monitor the agricultural fields and determine the plants and areas that suffer from various problems, which will increase the productivity of the farm. After processing the indices for each plant, we obtain the precise localization of each plant. Figure 27 shows an example of NDVI.

30°09'21''N 9°30'49''W    30°09'25''N 9°30'56''W
30°09'22''N 9°30'49''W    30°09'23''N 9°30'49''W

Figure 27. NDVI and GPS data for 6 and 10 zones.

Figure 28. Indices general view.
The GPS data shown in Figure 28 are provided by the Parrot Sequoia+ camera. Then, we can generate a map of the greenhouse with the 20 zones and the index results with the GPS data, as shown in Figure 27. This will improve the decision system associated with the closed greenhouse. In the map, the green and blue colored rectangles show zones 10 and 6.
In addition, the GPS coordinates of the plants that have a vegetation problem are given. After evaluating the different vital indices, including NDVI, NDRE, and NDWI, we obtained the different information needed in the agricultural fields. This information is the most relevant to have an overview of the health of the plants. As soon as the evaluation is finished, a global map should be generated for the farmer, containing the different information about the greenhouse. Figure 28 shows the overall map of the indices monitoring. Table 4 shows a summary of the normalized indices results in the closed greenhouse.
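How the flagged plants and their GPS coordinates can be packaged into such a map is sketched below. The record fields, the reuse of the T1 = 0.5 NDVI threshold as the cut-off, and the GeoJSON-like output are illustrative assumptions, not the format actually produced by the system.

```python
# Minimal sketch: pair low-NDVI plants with their GPS coordinates for the farmer's map.
def build_problem_map(plants, ndvi_threshold=0.5):
    """`plants` items are assumed to look like
    {"zone": 6, "plant_id": 3, "ndvi": 0.21, "lat": 30.1559, "lon": -9.5136}."""
    features = []
    for p in plants:
        if p["ndvi"] < ndvi_threshold:
            features.append({
                "type": "Feature",
                "geometry": {"type": "Point", "coordinates": [p["lon"], p["lat"]]},
                "properties": {"zone": p["zone"], "plant": p["plant_id"], "ndvi": p["ndvi"]},
            })
    return {"type": "FeatureCollection", "features": features}
```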

Table 4. Summary of normalized indices results in the closed greenhouse.

Nomenclature for the table:

-      < 0.2
+      0.2–0.4
++     0.4–0.6
+++    > 0.6

Zones    NDVI    NDWI    NDRE
1        +       +       -
2        +       +       +
3        ++      +       +
4        ++      +       ++
5        ++      ++      +++
6        -       -       -
7        +++     ++      ++
8        +++     +       +
9        +++     ++      ++
10       -       -       -
11       +++     ++      +
12       +++     ++      ++
13       ++      +       ++
14       +++     ++      ++
15       ++      +       +
16       +++     ++      ++
17       ++      +       +
18       ++      ++      +
19       ++      +       +
20       ++      ++      +
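The mapping from a zone's mean index value to the symbols of Table 4 follows directly from the nomenclature above (reading the upper class as > 0.6):

```python
# Minimal sketch: convert a mean index value into the Table 4 symbol.
def classify(value):
    if value < 0.2:
        return "-"
    if value < 0.4:
        return "+"
    if value < 0.6:
        return "++"
    return "+++"

# Examples: classify(0.15) -> "-" (zone 6 NDVI); classify(0.67) -> "+++" (zone 9 NDVI).
```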

4. Conclusions
The validation of the algorithmic approach is a very important step to test the algorithm's reliability. In real scenarios, a variety of environmental problems can influence the algorithm's operation. The development of autonomous robots helps to validate the research approach and make it useful. In this work, we have proposed an autonomous monitoring system that monitors crops in closed greenhouses and in open fields. This system delivers a map that contains an image with vegetation, water, and fertilizer information, together with GPS localization. This technique will increase the precision of monitoring, which will help us to improve the decision systems and reduce the consumption of resources required by the plant for growth. This will maximize the yield by decreasing the consumption of resources. Therefore, we added sequential and parallel implementations on a heterogeneous CPU-GPU embedded architecture in order to study the processing time and the memory consumption. The results allowed us to process each image in 1.94 s on the Jetson Nano embedded architecture. On the other hand, the laptop processed each image in 0.604 s, which does not give real-time processing based on 5 images/s. For this reason, an optimization based on the architecture-algorithm mapping approach allowed us to reduce the processing time to 9 ms for the laptop and 36 ms for the embedded architecture. This gave us real-time processing for both architectures, with an acceleration factor of 66 for the laptop and 54 for the Jetson Nano.
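As a quick consistency check of these figures: at 5 images/s the per-image budget is 200 ms, which both optimized implementations satisfy, and the acceleration factors follow directly from the measured times. A minimal sketch with the values reported above:

```python
# Minimal sketch: real-time budget check and acceleration factors.
BUDGET_S = 1.0 / 5   # 5 images/s -> 200 ms allowed per image

times_s = {           # (sequential, parallel) per-image processing times
    "laptop":      (0.604, 0.009),
    "jetson_nano": (1.94, 0.036),
}

for platform, (seq, par) in times_s.items():
    speedup = seq / par   # ~67 and ~54 with these rounded values (reported: 66 and 54)
    print(f"{platform}: speedup {speedup:.0f}x, real-time: {par <= BUDGET_S}")
```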

Author Contributions: Conceptualization, A.S.; writing—original draft preparation, A.S.;
methodology, A.S. and A.E.O.; software, A.S. and A.E.O.; validation, M.E. and F.T.; formal analysis,
R.L.; data curation, A.S.; writing—review and editing, M.E. and R.L.; visualization, F.T. All authors
have read and agreed to the published version of the manuscript.
Funding: This research is funded by the National Centre for Scientific and Technical Research of
Morocco (CNRST) (grant number: 19 UIZ2020).
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Data used in this paper are available upon request.
Acknowledgments: We owe a debt of gratitude to the National Centre for Scientific and
Technical Research of Morocco (CNRST) for their financial support and for their supervision (grant
number: 19 UIZ2020).
Conflicts of Interest: The authors declare no conflict of interest.

