Article
Mapping Agricultural Soil in Greenhouse Using an
Autonomous Low-Cost Robot and Precise Monitoring
Amine Saddik 1, * , Rachid Latif 1 , Fatma Taher 2 , Abdelhafid El Ouardi 3 and Mohamed Elhoseny 4,5
1 Laboratory of Systems Engineering and Information Technology LISTI, National School of Applied Sciences,
Ibn Zohr University, Agadir 80000, Morocco
2 College of Technological Innovation, Zayed University, Dubai 144534, United Arab Emirates
3 SATIE, CNRS, ENS Paris-Saclay, Université Paris-Saclay, 91190 Gif-sur-Yvette, France
4 College of Computing and Informatics, University of Sharjah, Sharjah 27272, United Arab Emirates
5 Faculty of Computers and Information, Mansoura University, Mansoura 35516, Egypt
* Correspondence: [email protected]
Abstract: Our work is focused on developing an autonomous robot to monitor greenhouses and large fields. This system is designed to operate autonomously to extract useful information from the plants based on precise GPS localization. The proposed robot is based on an RGB camera for plant detection, a multispectral camera for extracting the different spectral bands for processing, and an embedded architecture integrating an Nvidia Jetson Nano, which allows us to perform the required processing. Our system uses multi-sensor fusion to manage the two parts of the algorithm. Therefore, the proposed algorithm was partitioned on the CPU-GPU embedded architecture. This allows us to process each image in 1.94 s in a sequential implementation on the embedded architecture. The approach followed in our implementation is based on a Hardware/Software Co-Design study to propose an optimal implementation. The experiments were conducted on a tomato farm, and the system showed that we can process different images in real time. The parallel implementation allows us to process each image in 36 ms, which satisfies the real-time constraint of 5 images/s. On a laptop, we have a total processing time of 604 ms for the sequential implementation and 9 ms for the parallel processing. In this context, we obtained an acceleration factor of 66 for the laptop and 54 for the embedded architecture. The energy consumption evaluation showed that the prototyped system consumes a power between 4 W and 8 W. For this reason, we opted for a low-cost embedded architecture based on the Nvidia Jetson Nano.
Keywords: autonomous robot; greenhouses; GPS localization; energy; multispectral camera; embedded architecture; multi-sensor fusion; real-time
robots and ground robots [8]. The problem of aerial robots is the energy consumption that
influences the operation of these systems. On the other hand, ground robots present an
efficient solution with high flexibility compared to aerial robots. Furthermore, soil robots
can perform local tasks in agricultural fields such as harvesting, the precise distribution of
chemicals, and others.
In the case of soil robots, we can find various solutions proposed for precision agri-
culture applications. R.P. Devanna et al. 2022 proposed a study based on a soil robot for
closed agricultural field monitoring. This work is based on a semi-supervised deep learning
model to detect pomegranates automatically. The model is trained in a semi-supervised way in
order to improve the processing time compared to the other developed techniques. The
results show that the proposed system has achieved an F score of 86.42% and an IoU score
of 97.94% [9]. On the other hand, we can find the work of M. Skoczeń et al. 2021, who
developed techniques to avoid dynamic and static obstacles in agricultural fields. The
system proposed in this work is based on an RGB-D camera for depth calculation, and a soil
robot that moves in an autonomous mode. The results show that the system developed has
a distortion of 38 cm [10]. In a similar vein, M.R. Kamandar et al. 2022 proposed a robot to
reduce the effort required for hedge trimming. The developed robot is built with servo motors
and wheels for movement and has five degrees of freedom to give it some flexibility [11].
In addition, W. Zheng et al. 2022 proposed a human-inspired approach to developing a robot
able to grasp and manipulate efficiently in agricultural fields [12]. K. Li et al. 2022 proposed
a system based on a robotic arm to perform kiwifruit pollination. The results obtained showed
that the developed arm has an accuracy
that varies between 82% and 87% [13]. In this context, we can find a variety of proposed
systems that aim to perform tasks on agricultural farms. These tasks aim to improve the
productivity of agricultural fields. Several applications have been proposed to help farmers
make decisions [14–18]. These developed systems require autonomous movement without
the farmers’ participation, making the algorithmic conception a complicated task. Several
attempts have been made in the literature to propose localization and mapping algorithms
in the automotive domain [19,20]. These algorithms have been adapted to control robots in
the agricultural field. These systems are based on simultaneous localization and mapping
(SLAM) [21]. In this context, several works have been proposed. U. Weiss et al. 2011
proposed a simulation of an autonomous robot based on a 3D lidar sensor instead of the
traditional method based on stereo cameras [22]. In another work, I. Ali et al. 2020 addressed
localization and mapping in forests in order to build maps for surveillance [23]. Similarly,
A.S. Aguiar et al. 2021 relied on SLAM algorithms for autonomous robot movement; the approach
uses a 3D reconstruction to localize the robot and build the map of agricultural fields [24].
All of these works, which aim to provide robust solutions for agricultural field monitoring,
are based on complex, high-cost systems, which limits the use of the proposed approaches. The
better choice is a low-cost, robust, and flexible system that helps identify specific problems
in agricultural fields; such a system is more likely to be adopted and to help improve
agricultural productivity and yield.
Our work focuses on developing an autonomous robot for real-time plant monitoring
in open agricultural fields and in closed greenhouses. The study was based on a greenhouse
in the southern region of Morocco. This region is known for the high production of
agricultural products, such as tomatoes and pepper. These two plants require permanent
monitoring of vegetation, water, and fertilizer. Our proposed system is based on a robot
equipped with RGB and multispectral cameras (Parrot Sequoia +) and electric motors
and wheels for movement. The RGB camera detects the plants to be processed, while the
multispectral camera extracts images in several bands for index processing. Additionally, we
simulated our system in CATIA V2019 for the mechanical part and Proteus V8 for the electrical
part. In addition, we added precise localization of the plants to support the decision system.
The results showed that the robot is robust and flexible for various applications, such as
weed detection and counting. We used an Nvidia Jetson Nano embedded architecture for the
processing part, and
to control the motors and drivers to ensure autonomous movement, we used an Arduino
board. We aim to build a simple system of low energy consumption and cost, and our
proposed contribution is as follows.
(1) Development of an autonomous robot for real-time crop monitoring in open agricultural fields and greenhouses;
(2) A case study in a closed greenhouse growing tomato plants;
(3) An implementation on a low-cost embedded architecture based on the CUDA language;
(4) An optimization based on the Hardware/Software Co-Design approach, proposed to decrease the processing time and memory consumption for real-time applications.
Our paper is organized as follows: after this general introduction exploring the problem to be treated, the materials and methods section studies the algorithmic and system design of our approach; Section 3 presents the results obtained; and, finally, a conclusion is given.
$$\mathrm{NDVI} = \frac{B_{NIR} - B_{R}}{B_{NIR} + B_{R}} \qquad (1)$$

$$\mathrm{NDWI} = \frac{B_{G} - B_{NIR}}{B_{G} + B_{NIR}} \qquad (2)$$

$$\mathrm{NDRE} = \frac{B_{NIR} - B_{RedEdge}}{B_{NIR} + B_{RedEdge}} \qquad (3)$$

where $B_{NIR}$, $B_{R}$, $B_{G}$, and $B_{RedEdge}$ are the near-infrared, red, green, and RedEdge bands of the multispectral camera, respectively.
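As an illustration of Equations (1)-(3), the minimal C++/OpenCV sketch below (not the authors' code) computes the three indices pixel-wise, assuming the Sequoia bands have already been loaded as aligned single-channel floating-point matrices; the helper name and the small epsilon are illustrative assumptions.

// Minimal sketch (assumed helper, not the paper's implementation): pixel-wise
// computation of NDVI, NDWI, and NDRE from the four Sequoia bands.
#include <opencv2/opencv.hpp>

static cv::Mat normalizedDifference(const cv::Mat& a, const cv::Mat& b) {
    cv::Mat num = a - b;
    cv::Mat den = a + b + 1e-6f;   // epsilon avoids division by zero on dark pixels
    cv::Mat out;
    cv::divide(num, den, out);     // element-wise (a - b) / (a + b)
    return out;
}

void computeIndices(const cv::Mat& nir, const cv::Mat& red,
                    const cv::Mat& green, const cv::Mat& rededge,
                    cv::Mat& ndvi, cv::Mat& ndwi, cv::Mat& ndre) {
    ndvi = normalizedDifference(nir, red);       // Equation (1)
    ndwi = normalizedDifference(green, nir);     // Equation (2)
    ndre = normalizedDifference(nir, rededge);   // Equation (3)
}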
The greenhouse farm used in our study is located in the Ait Aamira region near Agadir. This region is known for its high production of tomatoes and for using closed greenhouses instead of open fields. The greenhouses studied in our research are located at 30°09′13″ N, 9°30′50″ W. Figure 1 shows the greenhouses to be studied. The structure of the greenhouse is based on a plastic cover. The greenhouse studied in this research is divided into 20 regions. Each region contains two tomato rows, over an area of 100 m width and 75 m length, with 1.5 m between the rows. In addition, each region is divided into two rows of 48 m each. Figure 2 shows the greenhouse structure and Figure 3 shows real images of the closed greenhouse used in this study.
Figure 1. Greenhouse location.
Figure 2. Greenhouse structure (1.5 m spacing between rows).
Figure 3. Greenhouse real area.
The study was conducted on the totality of the rows to build a general report of the tomato plants' condition. The tool used for the data collection is the Parrot Sequoia + multispectral camera. The idea here is to monitor each plant based on the precise GPS coordination of the multispectral camera to deliver an accurate plant assessment that gives each plant its index value and GPS coordinates afterward. The use of these coordinates will help to determine which plant lacks fertilizer, water, or vegetation in a precise approach. In addition, this type of camera gives images separated into four bands, red, green, near-infrared, and RedEdge, with a resolution of 1280 × 960 pixels for each image, which is high enough to visualize the plants. This gives the flexibility to keep just the RGB image delivered and remove the other unused bands.

2.2. System Modelling
From the literature study, we can conclude that we have three tools for algorithm validation. The first tool is the satellite, the second is drones, and the third is ground robots. These three tools aim to support the different sensors that collect, process, and sometimes make real-time decisions. In the case of the satellite, we cannot make decisions simultaneously, and the low resolution of the images causes a wrong diagnosis. For this reason, these solutions are limited to medium and large agricultural fields. Additionally, the application side is limited because the images cannot help when applying counting algorithms, weed detection, and disease detection. So, we are limited to two solutions in our case: either the drones or the soil robots. Unmanned aerial vehicles have shown solid and efficient solutions for surveillance and different applications. However, the problem with these tools is the battery's autonomy, which does not reach 30 min of flight if no overload has been applied. On the other hand, if we want to build a decision system using a UAV (unmanned aerial vehicle), this will increase the weight, affecting the flight time. Similarly, UAVs are not flexible when we want to use surveillance in closed greenhouses. These constraints make the ground robots very strong regarding accuracy and flexibility of applications in open and closed fields such as greenhouses. For this study, we will focus on designing a platform consisting of a soil robot that moves autonomously to monitor vegetation, water, and nitrogen, and to run different applications, such as counting and weed detection. It can also make decisions in real time. This research aims to show our proposed algorithms' applicability and utility. In this context, we have developed a system named VSSAgri (vegetation surveillance system for precision agriculture applications). This robot aims to validate the monitoring algorithms proposed in this work. The proposed prototype is based on an embedded architecture and electrical motors powered by a battery. Similarly, it offers a low-cost solution compared to those proposed in the literature.

The proposed system validation has been designed through several steps. Among these steps is the system modeling in CATIA software to study the functional aspect before the hardware design. As for the electrical part, we used Proteus for the functional electrical diagram of the system. In this framework, the system is divided into two parts. The electrical part consists of electric motors and a 12 V battery. The second part is based on the mechanical modeling of the different components.
2.2.1. Mechanical Study
The proposed system consists of a metal support 150 cm long and 65 cm wide. The dimensions chosen in our case are based on the test we performed to validate the system. In a real case, either in a greenhouse or an open field, we can change the dimensions of the system. Figure 4 shows the metal support used in our case.
Figure 4. Metal support.
This support is equipped with two barriers at the extremities. The role of these barriers is to guarantee the flow of the processing system. These barriers will support the box that contains the RGB camera, the multispectral camera, and the embedded architecture. In this context, we first tested a solution based on just two cables to ensure the movement. But the problem with these cables is that the friction of the box influences the speed of displacement. Figure 5 shows the solution that was proposed before the two barriers.
Figure 5. First proposed solution for box movement.
As shown in Figure 5, we have two metal cables at the extremities of the support. This solution has shown some constraints, such as the friction of the box during the displacement. For this reason, we have selected the use of barriers. Then, we have a metal box that contains a power bank for the power supply of the embedded architecture, and this box carries the different cameras used. As a second solution, we proposed the use of wheels with electric motors for the movement along the two barriers. Figure 6 shows the box used.
Figure 6. Box used in our prototype.
Additionally, we used eight wheels with electric motors: four to drive the box horizontally, and the others to drive the box's support vertically. These wheels realize a scanner principle, with a horizontal and a vertical displacement, to guarantee a general vision of the field to be handled. The wheels used can be changed if we have movement problems related to the geography of the agricultural areas. For some applications, we need big wheels that will ensure movement without problems. The collection of all these components gives a complete system, as shown in Figure 7 with the different views in the CATIA software.
Generally, the system moves in a horizontal direction from the box that contains the cameras and the embedded architecture, as shown in Figure 7. Displacement 1 guarantees that the box will cross the horizontal axis to process all the plants in this row. Likewise, displacement 2 ensures the movement of the whole support on the field to move to the second row. These two mechanisms ensure the processing of the whole field. In this case, the temporal constraint is the plants' real-time indices processing to avoid data loss. This temporal constraint depends on the type of camera that will be used. If we want to monitor indices such as NDVI, NDWI, and NDRE, we will need a multispectral camera with a timelapse of 5 frames/s. For monitoring the RGB indices, and for applications such as plant counting, weed, and disease detection, we will need an RGB camera with a timelapse of 30 frames per second. The temporal constraints in this case have been studied [5,25] based on different embedded architectures, either CPU-GPU or CPU-FPGA.
Figure 8. Electrical DC machine diagram.
$$U_1 = R_1 I_1 + L_1 \frac{dI_1}{dt} \qquad (4)$$

where $U_1$ is the supply voltage of the field winding, $R_1$ the resistance of the field winding, $L_1$ the inductance of the field coil, and $I_1$ the current of the field winding.

$$U_2 = R_2 I_2 + L_2 \frac{dI_2}{dt} + \omega\, M_{SR}\, I_1 \qquad (5)$$

where $U_2$ is the voltage of the armature coil, $R_2$ the resistance of the armature coil, $I_2$ the current of the armature coil, $\omega$ the angular speed of the motor, and $M_{SR}$ the mutual inductance.

$$M_1 + M_2 = J \frac{d\omega}{dt} + B\,\omega \qquad (6)$$

where $M_2$ is the moment of load, $M_1$ the moment of conversion, $J$ the total inertia of the engine, and $B$ the coefficient of friction.

$$M_1 = M_{SR}\, I_1 I_2 \qquad (7)$$
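To make the model of Equations (4)-(7) concrete, the following self-contained C++ sketch integrates it with a forward-Euler step. It is only an illustration: every numerical parameter value below is an assumption, not a measured value of the prototype.

// Minimal sketch (not the authors' code): forward-Euler simulation of the DC motor
// model of Equations (4)-(7). All parameter values are illustrative assumptions.
#include <cstdio>

int main() {
    const double R1 = 2.0, L1 = 0.5;      // field winding resistance [ohm], inductance [H]
    const double R2 = 1.0, L2 = 0.05;     // armature resistance [ohm], inductance [H]
    const double MSR = 0.8;               // mutual inductance [H]
    const double J = 0.02, B = 0.005;     // inertia [kg.m^2], friction coefficient
    const double U1 = 12.0, U2 = 12.0;    // supply voltages [V] (12 V battery)
    const double M2 = -0.1;               // load torque [N.m], opposing the motion

    double I1 = 0.0, I2 = 0.0, w = 0.0;   // state: field current, armature current, speed
    const double dt = 1e-4;               // integration step [s]

    for (int k = 0; k <= 20000; ++k) {    // simulate 2 s
        double M1  = MSR * I1 * I2;                         // Eq. (7): conversion torque
        double dI1 = (U1 - R1 * I1) / L1;                   // Eq. (4)
        double dI2 = (U2 - R2 * I2 - w * MSR * I1) / L2;    // Eq. (5)
        double dw  = (M1 + M2 - B * w) / J;                 // Eq. (6)
        I1 += dt * dI1;  I2 += dt * dI2;  w += dt * dw;
        if (k % 5000 == 0)
            std::printf("t=%.2fs  I1=%.2fA  I2=%.2fA  w=%.1frad/s\n", k * dt, I1, I2, w);
    }
    return 0;
}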
The control of the motors was based on a low-cost Arduino Nano board, and we used only two boards for the motor drivers. The first controls the support motors and the other the box motors. These drivers allow us to synchronize all the motors to ensure a simultaneous movement of the support and the box. The role of the Arduino board is to control the motors through the drivers. Figure 9 shows the electrical diagram of our system.
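A minimal Arduino-style C++ sketch of this control scheme is given below. It is not the authors' firmware: the pin assignments, the L298N-style driver interface, and the single-character serial protocol from the Jetson Nano are all illustrative assumptions.

// Hedged sketch (not the authors' firmware): an Arduino Nano program driving two
// DC-motor drivers (assumed L298N-style EN/IN pins).
const int EN_BOX = 5, IN1_BOX = 7, IN2_BOX = 8;    // box motors (PWM enable + direction)
const int EN_SUP = 6, IN1_SUP = 9, IN2_SUP = 10;   // support motors

void drive(int en, int in1, int in2, int speed) {  // speed in [-255, 255]
  digitalWrite(in1, speed >= 0 ? HIGH : LOW);
  digitalWrite(in2, speed >= 0 ? LOW : HIGH);
  analogWrite(en, abs(speed));
}

void setup() {
  int pins[6] = {EN_BOX, IN1_BOX, IN2_BOX, EN_SUP, IN1_SUP, IN2_SUP};
  for (int i = 0; i < 6; i++) pinMode(pins[i], OUTPUT);
  Serial.begin(9600);                               // commands arrive from the Jetson Nano
}

void loop() {
  if (Serial.available() > 0) {
    char c = Serial.read();
    if (c == 'b') drive(EN_BOX, IN1_BOX, IN2_BOX, 180);   // move the box to the next plant
    if (c == 's') drive(EN_SUP, IN1_SUP, IN2_SUP, 180);   // move the support to the next row
    if (c == 'h') {                                       // halt both movements
      drive(EN_BOX, IN1_BOX, IN2_BOX, 0);
      drive(EN_SUP, IN1_SUP, IN2_SUP, 0);
    }
  }
}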
Figure 10. General algorithm overview.
Algorithm 1: Front-End
1. Start algorithm
2. Inputs: Support Moving or Start
3. Function Move Box
4. Driver setting
5. Give a command to the motors
6. Run M1, M2, M3, M4
7. End Function
8. Function Detect plant
9. For I = 0 to max line
10. For J = 0 to Max columns
11. Find the green color
12. Find the plant’s morphology
13. End For
14. End For
15. End function
16. If the plant is not detected Then
17. Return to Move Box
18. End If
19. Else (the plant is detected)
20. 1 s delay
21. Image taken
22. End Else
23. Function Send images to Back-end
24. RED = IMG_210903_104621_0000_RED
25. GRE = IMG_210903_104621_0000_GRE
26. NIR = IMG_210903_104621_0000_NIR
27. REG = IMG_210903_104621_0000_REG
28. End function
29. End Algorithm
30. Output: RED, GRE, NIR, REG
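One possible OpenCV realization of the "Detect plant" step of Algorithm 1 (find the green color, then check the blob morphology) is sketched below; it is not the authors' code, and the HSV range and the area threshold are assumptions rather than values from the paper.

// Hedged sketch: detecting a plant in the RGB view by thresholding green pixels
// in HSV space and checking the size of the largest green blob.
#include <opencv2/opencv.hpp>
#include <algorithm>
#include <vector>

bool detectPlant(const cv::Mat& bgr) {
    cv::Mat hsv, mask;
    cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);
    // Assumed HSV range for "green" vegetation
    cv::inRange(hsv, cv::Scalar(35, 40, 40), cv::Scalar(85, 255, 255), mask);
    // Simple morphology check: keep only sufficiently large connected blobs
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    double maxArea = 0.0;
    for (const auto& c : contours) maxArea = std::max(maxArea, cv::contourArea(c));
    // Assumed threshold: a plant covers at least 2% of the frame
    return maxArea > 0.02 * bgr.rows * bgr.cols;
}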
On the other hand, for the back-end part, a plant counting algorithm based on classification has been added to count the number of plants that have a low or high index, in order to have a general overview. Once the indices are calculated, the algorithm sends the data for segmentation and counting. For the indices processing, we performed a segmentation step that eliminates the negative values, which correspond to the absence of vegetation (e.g., soil, earth, ...). The NDVI, NDWI, and NDRE values calculated previously are now between 0 and 1. The closer the value is to 1, the better the result reflected by the index. Then, we performed multiple thresholding: we considered the values between 0.2 and 0.3 as being relatively weak, so we assigned them the red color; the values between 0.3 and 0.6 were slightly better, so we assigned them the orange color; and the values between 0.6 and 0.9 showed a very high level, so we assigned them a green color. These different thresholds were applied in order to make the images more interpretable for the farmer. The system creates a file containing the index name and value in the output file. On the other side, a file contains the colored images for each type of index; overall, we will have four files: one for the RGB images, the second for NDVI, the third for NDWI, and the fourth for NDRE. Algorithm 2 shows the thresholding operation.
Algorithm 2: Back-End
1. Start algorithm
2. Input: Indices value
3. For I = 0 to max line
4. For J = 0 to Max columns
5. Apply thresholding process on NDVI, NDWI, and NDRE
6. Apply the colorization based on the threshold
7. n < NDVI < m, n < NDWI < m, n < NDRE < m (n and m between 0.2 and 1)
8. End For
9. End For
10. Function Classification
11. Writing results in NDVI file
12. Writing results in NDWI file
13. Writing results in NDRE file
14. End Function
15. Function Computing
16. For I = 0 to max images
17. Convert NDVI images from RGB space to HSV
18. Convert NDWI images from RGB space to HSV
19. Convert NDRE images from RGB space to HSV
20. Find the edges
21. Apply the “Watershed” function
22. End For
23. For I = 0 to max line
24. For J = 0 to Max columns
25. Complete the segmented images
26. Count the number of plants
27. End For
28. End For
29. Storage of segmented images
30. End Function
31. End Algorithm
32. Output: Number of plants
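A minimal C++/OpenCV sketch of the thresholding and colorization described above (and formalized in Algorithm 2) could look as follows; the BGR colors and the use of cv::Mat are implementation assumptions, not the authors' exact code.

// Hedged sketch: per-pixel colorization of an index map following the thresholds
// described in the text (0.2-0.3 red, 0.3-0.6 orange, 0.6-0.9 green).
#include <opencv2/opencv.hpp>

cv::Mat colorizeIndex(const cv::Mat& index /* CV_32F, values clipped to [0, 1] */) {
    cv::Mat out(index.size(), CV_8UC3, cv::Scalar(0, 0, 0));   // black = no vegetation
    for (int i = 0; i < index.rows; ++i) {
        for (int j = 0; j < index.cols; ++j) {
            float v = index.at<float>(i, j);
            cv::Vec3b& px = out.at<cv::Vec3b>(i, j);
            if (v >= 0.2f && v < 0.3f)       px = cv::Vec3b(0, 0, 255);    // weak: red
            else if (v >= 0.3f && v < 0.6f)  px = cv::Vec3b(0, 165, 255);  // medium: orange
            else if (v >= 0.6f && v <= 0.9f) px = cv::Vec3b(0, 255, 0);    // high: green
            // values outside the studied ranges are left black
        }
    }
    return out;
}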
The purpose of the watershed technique is to segment the image. It treats the image as a topographic map based on the intensity of the pixels. Semantic segmentation refers to the process of linking each pixel of an image to a particular class label. In our case, the result obtained would be a single class, and all the pixels would belong to the same class, so the plants would be treated as a single object. Instance segmentation is similar to semantic segmentation, except that it treats the different objects in the image, even several objects of the same class, as distinct entities, which allows us to count the number of objects present in the image. The methodology followed in this last part of the algorithm is as follows:
(a) We convert the indices image to HSV (hue, saturation, value).
(b) We create the CV_8U version of our HSV image, then we look for the contours present
in the HSV image.
(c) We trace the contours on the original image; this tracing step is divided into two steps:
a. Tracing the markers of the foreground;
b. Tracing the background markers in white;
c. The final image is a superposition of the two tracings a and b.
(d) We perform the segmentation using the OpenCV function "Watershed"; afterwards, we fill the labeled objects with randomly generated colors (a minimal code sketch of these steps is given below).
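Steps (a)-(d) can be realized with OpenCV roughly as in the sketch below (not the authors' exact code); the HSV range used to build the initial mask and the distance-transform threshold are assumptions.

// Hedged sketch: marker-based watershed counting on a colorized index image.
#include <opencv2/opencv.hpp>

int countPlants(const cv::Mat& indexBgr) {     // 8-bit, 3-channel colorized index image
    // (a) HSV conversion; keep pixels colorized as vegetation (assumed non-red hues)
    cv::Mat hsv, mask;
    cv::cvtColor(indexBgr, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, cv::Scalar(10, 40, 40), cv::Scalar(90, 255, 255), mask);

    // (b)-(c) markers: sure foreground from the distance transform, sure background by dilation
    cv::Mat dist, fg, bg, unknown;
    cv::distanceTransform(mask, dist, cv::DIST_L2, 5);
    cv::normalize(dist, dist, 0.0, 1.0, cv::NORM_MINMAX);
    cv::threshold(dist, fg, 0.4, 1.0, cv::THRESH_BINARY);   // assumed 0.4 threshold
    fg.convertTo(fg, CV_8U, 255);
    cv::dilate(mask, bg, cv::Mat(), cv::Point(-1, -1), 3);
    cv::subtract(bg, fg, unknown);                          // uncertain band between fg and bg

    cv::Mat markers;
    int nLabels = cv::connectedComponents(fg, markers);     // one label per plant blob + background
    markers += 1;                                           // image background becomes label 1
    markers.setTo(0, unknown == 255);                       // unknown region gets label 0

    // (d) watershed floods the unknown region from the markers
    cv::watershed(indexBgr, markers);
    return nLabels - 1;                                     // number of plant objects
}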
The back-end part is based on image processing and is divided into three functions. The first function is for the pre-processing of the images, the second for the indices processing, and the third for the counting operation. After the algorithm finishes the back-end, it goes back to the front-end; a test is then applied to check whether the system has finished the vertical line. If yes, it moves the whole metal support; if not, it moves only the box to the next plant.
As a first step, a sequential implementation has been proposed in order to study the back-end part of our algorithm on both the laptop and the Jetson Nano. The three functions constitute the total processing time of the algorithm. Therefore, a workload analysis based on a Hardware/Software Co-Design approach allows us to reduce the processing time as much as possible for these three functions on both architectures, and mainly on the Jetson Nano: its compact size of 10 cm × 8 cm × 2.9 cm, its minimal power consumption of 5 W, its limited resources and adapted memory capacity, and its relatively low cost allow it to meet a constraining specification related to low-cost, real-time precision agriculture applications.
[Workload distribution of the three functions: Laptop — pre-processing 64.12%, counting 33.42%, indices processing 2.46%; Jetson Nano — pre-processing 42.87%, counting 34.90%, indices processing 22.23%.]
Table 2. Processing time comparison.

                        C/C++ (s)    CUDA (s)    Acceleration
Laptop
  Pre-processing        0.3878       0.0039      99.43
  Indices processing    0.0149       0.0015      9.93
  Counting              0.2021       0.0036      56.13
Jetson Nano
  Pre-processing        0.8334       0.0064      130.21
  Indices processing    0.4321       0.0124      34.84
  Counting              0.6783       0.0175      38.76

Using the results of the performance profiler, we have gone from a processing time equal to 387.7 ms for the pre-processing to 3.9 ms, which is an acceleration of almost 100 times. For the indices processing, we have an acceleration of 10 and, finally, the counting has been accelerated by 56 times. For the heterogeneous embedded system Jetson Nano, we have gone from a processing time equal to 833.4 ms for the pre-processing to 6.4 ms; an acceleration of 130 times has been achieved. For the second function, we have gone from 432.1 ms to 12.4 ms, equivalent to an acceleration of 34 times, and, finally, the counting has been accelerated by a value of 38 times. Figure 12 represents a temporal synthesis of the different functions on our Laptop and on the Jetson Nano.
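For reference, the kind of CUDA kernel that yields such accelerations is a simple per-pixel map. The sketch below is not the authors' kernel; it computes NDVI for one 1280 × 960 band pair, and the block size is chosen arbitrarily.

// Hedged sketch: a minimal CUDA C++ per-pixel NDVI kernel and its host-side launch.
#include <cuda_runtime.h>

__global__ void ndviKernel(const float* nir, const float* red, float* ndvi, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        float den = nir[i] + red[i];
        ndvi[i] = (den > 1e-6f) ? (nir[i] - red[i]) / den : 0.0f;
    }
}

// d_nir, d_red, and d_ndvi are assumed to be device buffers of 1280 x 960 floats.
void launchNdvi(const float* d_nir, const float* d_red, float* d_ndvi) {
    int n = 1280 * 960;
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    ndviKernel<<<blocks, threads>>>(d_nir, d_red, d_ndvi, n);
    cudaDeviceSynchronize();   // wait for the kernel before timing or copying back
}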
[Figure 12. Processing times (logarithmic scale) of pre-processing, indices processing, and counting for Laptop C/C++, Laptop CUDA, Jetson Nano C/C++, and Jetson Nano CUDA.]
Table 3. GPU activity and data workload results.
Table 3 shows that in the case of the Laptop we have a CPU→GPU transfer rate equal to 2.4829 GB/s and a GPU→CPU rate equal to 2.7274 GB/s. The Jetson Nano embedded system has values on the order of MB/s, roughly a thousand times lower. Our system consists of two movements: the first moves the box containing the embedded architecture and the cameras used for processing, and the second moves the whole metal support to a new row. With this method, we can ensure that all the plants will be processed. Figure 13 shows an overview of our system.
Figure 13. System overview.
The first step performed was the testing and validation of our robot in a closed space based on three rows, to validate the mechanism of our system. Each row contains mint, parsley, and pepper plants, respectively. The validation of this system in a closed space does not imply accurate functionality in a real environment. For this reason, after this validation we opted to carry out a real test in order to validate our algorithmic and systematic approach. In Figure 13, we have on the top left the developed robot and, on the bottom, a multispectral and RGB camera view. In the bottom right image, we have the box that contains the Arduino architecture that operates the movement part of our robot. Figure 14 shows images collected by our robot.
After the prototype validation in the laboratory, we moved to the field validation to evaluate the prototype performance. The results showed that the prototype works in the same conditions and with the same mechanism as in the laboratory. The test was conducted in an open field and a closed greenhouse, showing our system's flexibility. In addition, our robot can be adapted for the different precision agriculture applications by editing the back-end with the appropriate algorithm. These applications can be either weed detection, fruit and plant counting, or disease detection. In addition, a decision-making system can be added to take real-time actions in the agricultural field. This approach will help the farmer make precise decisions.
Figure 14. Camera overview of our system.
Figure 16. Power and current consumption of the electric motors.
In Figure 16, we have measured the current and power consumption of the electric motors. We made several iterations to observe the consumption each time. The maximum power consumption is about 2.9 W, and the maximum current consumption is 0.59 A.
Figure 18. NDVI average.
The results in Figure 18 show that zones 6 and 10 give a low index compared to the other zones. Usually, the mean values vary between 0.15 for zone 6 and 0.67 for zone 9. Zones 6 and 10 present low values due to the supply system of the necessary plant components. This reflects the strong relationship between vegetation, water, and the nitrogen content in the plants. After locating the regions with a low index, we tried to locate the plants with the vegetation index. Figure 19 shows the vegetation index calculated for each plant.
[Figure 19. NDVI values per plant for zones 6 and 10.]
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
After determining the NDVI vegetation index, we need to calculate the other indices,
such as NDRE and NDWI. The reason for calculating these indices is that vegetation is not
enough to indicate that the plant is in good condition. The second evaluation was based
on the NDRE index shown in Figure 21. The NDRE evaluation was based on a 0.2 thresh-
old. Regions with an index lower than 0.2 suffer from a nitrogen deficiency in the plants.
Figure 21 summarizes the results obtained for 20 zones that have been evaluated. The
results show that the index varies between 0.079 and 0.75 in zones 10 and 5, with mini-
mum and maximum values. We also find zones 1 and 6 with NDRE values of 0.15 and
0.08. These regions reflect the nitrogen lack in plants. Similarly, for the vegetation index,
we have zone 6 and 10 that are in the same situation. On the other hand, in the evaluation
of NDRE, another zone that suffers from a lack of nitrogen has been added. After a first
synthesis, we concluded that the suffering zones are 6, 1, and 10. For this reason, it is
necessary to extend the evaluation and see the average of the NDRE index in the different
regions. Figure 22 shows the different areas’ evaluation based on each zone’s average.
Figure 20. RGB and NDVI image evaluation.
Figure 21. NDRE evaluation for 20 regions.
[Figure 22. NDRE average per zone (Z1-Z20).]
The application of the average processing in the different zones has reinforced the synthesis elaborated in Figure 22. It also confirmed that zones 6, 1, and 10 suffer from the lack of nitrogen. Our evaluation methodology is based on a thorough evaluation that aims to determine the exact plants where we have problems.
In Figure 23, we have shown an evaluation of 20 plants in regions 1, 6, and 10. The results show that the plants in regions 6 and 10 have very low values compared to zone 1. The values vary between 0.05 and 0.1. On the other hand, zone 1 varies between 0.2 and 0.11. After evaluating the different plants, it is necessary to determine the precise localization of the plants. The third vital index opted for in our evaluation is NDWI. This index determines the amount of water in the vegetation. Figure 24 shows the evaluation of the NDWI.
Figure 23. NDRE for 1, 6 and 10 zones.
The NDWI evaluation is based on a threshold of 0.3. Areas with less than 0.3 have
a water deficiency or water absence, while areas with more than 0.3 have water. The
index evaluation showed that, like the NDRE, areas 1, 6, and 10 have low water content.
Figures 25 and 26 present the results in each zone.
Figure 25. NDWI average.
Figure 26. NDWI for 1, 6 and 10 zones.
The NDWI values in Figure 26 range from 0.618 to 0.13 for zones 18 and 10. The calculation of NDWI in the figure is based on the average of each zone. The global analysis of the greenhouse study showed that vegetation problems appeared in zones 6 and 10, while water and nitrogen problems appeared in zones 1, 6, and 10. This methodology of interpretation and this study will help farmers monitor the agricultural fields and determine the plants and areas that suffer from various problems. This will increase the productivity of the farm. After processing the indices in each plant, we will have the precise localization of each plant. Figure 27 shows an example of NDVI.
Figure 27. NDVI and GPS data for 6 and 10 zones.
The GPS data shown in Figure 28 are provided by the Parrot Sequoia + camera. Then, we can generate a map of the greenhouse with the 20 zones and the index result with the GPS data, as shown in Figure 27. This will improve the decision system associated with the closed greenhouse. In the map, the green and blue colored rectangles show zones 10 and 6.
In addition, the GPS coordinates of the plants that have a vegetation problem are given. After evaluating the different vital indices, including NDVI, NDRE, and NDWI, we obtained the different information needed in the agricultural fields. This information is the most relevant to have an overview of the health of the plants. As soon as the evaluation is finished, a global map should be generated for the farmer, containing the different information about the greenhouse. Figure 28 shows the overall map of the indices monitoring. Table 4 shows a summary of the normalized indices results in the closed greenhouse.
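As an illustration of how such a per-plant report can be assembled (the back-end already writes an output file with the index name and value, and the Sequoia + provides the GPS coordinates), the following C++ sketch writes one CSV line per plant; the record structure and the file layout are assumptions, not the authors' format.

// Hedged sketch: writing a per-plant report combining the computed indices with the
// GPS coordinates delivered by the Parrot Sequoia + camera.
#include <fstream>
#include <string>
#include <vector>

struct PlantRecord {
    std::string zone;           // e.g., "Z6"
    int plantId;                // 1..20 within the zone
    double latitude, longitude; // GPS position from the multispectral camera
    double ndvi, ndwi, ndre;    // computed indices
};

void writeReport(const std::string& path, const std::vector<PlantRecord>& records) {
    std::ofstream out(path);
    out << "zone,plant,latitude,longitude,NDVI,NDWI,NDRE\n";
    for (const auto& r : records)
        out << r.zone << ',' << r.plantId << ',' << r.latitude << ',' << r.longitude
            << ',' << r.ndvi << ',' << r.ndwi << ',' << r.ndre << '\n';
}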
4. Conclusions
The validation of the algorithmic approach is a very important step to test the al-
gorithm’s reliability. In real scenarios, we can find a variety of environmental problems
that influence the algorithm’s operation. The development of autonomous robots helps to
validate the research approach and make it useful. In this work, we have proposed an au-
tonomous monitoring system that monitors crops in closed greenhouses and in open fields.
This system delivers a map that contains an image with vegetation, water, and fertilizer
information, and GPS localization. This technique will increase the precision of monitor-
ing, which will help us to improve the decision systems and reduce the consumption of
resources required by the plant for growth. This will maximize the yield by decreasing the
consumption of resources. Therefore, we added a sequential and parallel implementation
in a heterogeneous embedded architecture type CPU-GPU in order to study the processing
time and the memory consumption. The results allowed us to process each image in 1.94 s
for the Jetson Nano embedded architecture. On the other hand, the Laptop processed each
image in 0.604 s, which does not give real-time processing based on 5 images/s. For this
reason, an optimization based on the architecture-algorithm mapping approach allowed
us to reduce the processing time for the Laptop to 9 ms and the embedded architecture to
36 ms. This gave us real-time processing for both architectures with an acceleration factor
of 66 for the laptop and 54 for the Jetson Nano.
Author Contributions: Conceptualization, A.S. writing—original draft preparation, A.S. and A.S.;
methodology, A.S. and A.E.O.; software, A.S. and A.E.O.; validation, M.E. and F.T.; formal analysis,
R.L.; data curation, A.S.; writing—review and editing, M.E. and R.L.; visualization, F.T. All authors
have read and agreed to the published version of the manuscript.
Funding: This research is funded by the National Centre for Scientific and Technical Research of
Morocco (CNRST) (grant number: 19 UIZ2020).
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: Data used in this paper are available upon requests.
Acknowledgments: We owe a debt of gratitude to the National Centre for Scientific and Technical Research of Morocco (CNRST) for their financial support and for their supervision (grant number: 19 UIZ2020).
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Wang, T.; Chen, B.; Zhang, Z.; Li, H.; Zhang, M. Applications of machine vision in agricultural robot navigation: A review.
Comput. Electron. Agric. 2022, 198, 107085. [CrossRef]
2. Chebrolu, N.; Lottes, P.; Schaefer, A.; Winterhalter, W.; Burgard, W.; Stachniss, C. Agricultural robot dataset for plant classification,
localization and mapping on sugar beet fields. Int. J. Robot. Res. 2017, 36, 1045–1052. [CrossRef]
3. Abualkishik, A.Z.; Almajed, R.; Thompson, W. Evaluating Smart Agricultural Production Efficiency using Fuzzy MARCOS
method. J. Neutrosophic Fuzzy Syst. 2022, 3, 8–18. [CrossRef]
4. Tofigh, M.A.; Mu, Z. Intelligent Web Information Extraction Model for Agricultural Product Quality and Safety System. J. Intell.
Syst. Internet Things 2021, 4, 99–110. [CrossRef]
5. Saddik, A.; Latif, R.; El Ouardi, A.; Alghamdi, M.I.; Elhoseny, M. Improving Sustainable Vegetation Indices Processing on
Low-Cost Architectures. Sustainability 2022, 14, 2521. [CrossRef]
6. Saddik, A.; Latif, R.; el Ouardi, A.; Elhoseny, M.; Khelifi, A. Computer development based embedded systems in precision
agriculture: Tools and application. Acta Agric. Scand. Sect. B Soil Plant Sci. 2022, 72, 589–611. [CrossRef]
7. Amine, S.; Latif, R.; El Ouardi, A. Low-Power FPGA Architecture Based Monitoring Applications in Precision Agriculture. J. Low
Power Electron. Appl. 2021, 11, 39. [CrossRef]
8. Abualkishik, A.Z.; Almajed, R.; Thompson, W. Multi-attribute decision-making method for prioritizing autonomous vehicles in
real-time traffic management: Towards active sustainable transport. Int. J. Wirel. Ad Hoc Commun. 2021, 3, 91–101. [CrossRef]
9. Devanna, R.P.; Milella, A.; Marani, R.; Garofalo, S.P.; Vivaldi, G.A.; Pascuzzi, S.; Galati, R.; Reina, G. In-Field Automatic
Identification of Pomegranates Using a Farmer Robot. Sensors 2022, 22, 5821. [CrossRef]
10. Skoczeń, M.; Ochman, M.; Spyra, K.; Nikodem, M.; Krata, D.; Panek, M.; Pawłowski, A. Obstacle Detection System for Agricultural
Mobile Robot Application Using RGB-D Cameras. Sensors 2021, 21, 5292. [CrossRef]
11. Kamandar, M.R.; Massah, J.; Jamzad, M. Design and evaluation of hedge trimmer robot. Comput. Electron. Agric. 2022,
199, 107065. [CrossRef]
12. Zheng, W.; Guo, N.; Zhang, B.; Zhou, J.; Tian, G.; Xiong, Y. Human Grasp Mechanism Understanding, Human-Inspired Grasp
Control and Robotic Grasping Planning for Agricultural Robots. Sensors 2022, 22, 5240. [CrossRef] [PubMed]
13. Li, K.; Huo, Y.; Liu, Y.; Shi, Y.; He, Z.; Cui, Y. Design of a lightweight robotic arm for kiwifruit pollination. Comput. Electron. Agric.
2022, 198, 107114. [CrossRef]
14. Cho, B.-H.; Kim, Y.-H.; Lee, K.-B.; Hong, Y.-K.; Kim, K.-C. Potential of Snapshot-Type Hyperspectral Imagery Using Support
Vector Classifier for the Classification of Tomatoes Maturity. Sensors 2022, 22, 4378. [CrossRef] [PubMed]
15. Yang, J.; Ni, J.; Li, Y.; Wen, J.; Chen, D. The Intelligent Path Planning System of Agricultural Robot via Reinforcement Learning.
Sensors 2022, 22, 4316. [CrossRef]
16. Gao, P.; Lee, H.; Jeon, C.-W.; Yun, C.; Kim, H.-J.; Wang, W.; Liang, G.; Chen, Y.; Zhang, Z.; Han, X. Improved Position Estimation
Algorithm of Agricultural Mobile Robots Based on Multisensor Fusion and Autoencoder Neural Network. Sensors 2022, 22, 1522.
[CrossRef]
17. Yang, T.; Ye, J.; Zhou, S.; Xu, A.; Yin, J. 3D reconstruction method for tree seedlings based on point cloud self-registration. Comput.
Electron. Agric. 2022, 200, 107210. [CrossRef]
18. Duan, M.; Song, X.; Liu, X.; Cui, D.; Zhang, X. Mapping the soil types combining multi-temporal remote sensing data with texture
features. Comput. Electron. Agric. 2022, 200, 107230. [CrossRef]
19. Chghaf, M.; Rodriguez, S.; Ouardi, A.E. Camera, LiDAR and Multi-modal SLAM Systems for Autonomous Ground Vehicles: A
Survey. J. Intell. Robot Syst. 2022, 105, 2. [CrossRef]
20. Nguyen, D.D.; El Ouardi, A.; Rodriguez, S.; Bouaziz, S. FPGA implementation of HOOFR bucketing extractor-based real-time
embedded SLAM applications. J. Real-Time Image Proc. 2021, 18, 525–538. [CrossRef]
21. Ericson, S.K.; Åstrand, B.S. Analysis of two visual odometry systems for use in an agricultural field environment. Biosyst. Eng.
2018, 166, 116–125. [CrossRef]
22. Weiss, U.; Biber, P. Plant detection and mapping for agricultural robots using a 3D LIDAR sensor. Robot. Auton. Syst. 2011, 59,
265–273. [CrossRef]
23. Ali, I.; Durmush, A.; Suominen, O.; Yli-Hietanen, J.; Peltonen, S.; Atanas, J.; Finn, G. Forest dataset: A forest landscape for visual
SLAM. Robot. Auton. Syst. 2020, 132, 103610. [CrossRef]
24. Aguiar, A.S.; dos Santos, F.N.; Sobreira, H.; Cunha, J.B.; Sousa, A.J. Particle filter refinement based on clustering procedures for
high-dimensional localization and mapping systems. Robot. Auton. Syst. 2021, 137, 103725. [CrossRef]
25. Saddik, A.; Latif, R.; Elhoseny, M.; El Ouardi, A. Real-time evaluation of different indexes in precision agriculture using a
heterogeneous embedded system. Sustain. Comput. Inform. Syst. 2021, 30, 100506. [CrossRef]
26. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150.
[CrossRef]
27. McFeeters, S.K. The use of the Normalized Difference Water Index (NDWI) in the delineation of open water features. Int. J.
Remote Sens. 1996, 17, 1425–1432. [CrossRef]