
AgriEngineering

Article
Pesticide-Free Robotic Control of Aphids as Crop Pests
Virginie Lacotte 1 , Toan NGuyen 2 , Javier Diaz Sempere 2 , Vivien Novales 2 , Vincent Dufour 2 ,
Richard Moreau 2 , Minh Tu Pham 2 , Kanty Rabenorosoa 3 , Sergio Peignier 1 , François G. Feugier 4 ,
Robin Gaetani 4,5 , Thomas Grenier 6 , Bruno Masenelli 5 , Pedro da Silva 1 , Abdelaziz Heddi 1
and Arnaud Lelevé 2, *
1 Univ Lyon, INSA Lyon, INRAE, BF2I, UMR 203, 69621 Villeurbanne, France
2 Univ Lyon, INSA Lyon, Université Claude Bernard Lyon 1, Ecole Centrale de Lyon, CNRS, Ampère, UMR5005,
69621 Villeurbanne, France
3 FEMTO-ST, UBFC, CNRS, 25030 Besançon, France
4 Greenshield, 75004 Paris, France
5 Univ Lyon, INSA Lyon, CNRS, Ecole Centrale de Lyon, Université Claude Bernard Lyon 1, CPE Lyon, INL,
UMR5270, 69621 Villeurbanne, France
6 CREATIS, UMR 5220, U1294, INSA Lyon, Université Claude Bernard Lyon 1, UJM-Saint Etienne, CNRS,
Inserm, Univ Lyon, 69100 Lyon, France
* Correspondence: [email protected]

Abstract: Because our civilization has relied on pesticides to fight weeds, insects, and diseases since antiquity, the use of these chemicals has become natural and exclusive. Unfortunately, the use of pesticides has progressively had alarming effects on water quality, biodiversity, and human health. This paper proposes to improve farming practices by replacing pesticides with a laser-based robotic approach. This study focused on the neutralization of aphids, as they are among the most harmful pests for crops and complex to control. With the help of deep learning, we developed a mobile robot that spans crop rows, locates aphids, and neutralizes them with laser beams. We built a prototype with the sole purpose of validating the localization-neutralization loop on a single seedling row. The experiments performed in our laboratory demonstrate the feasibility of detecting different lines of aphids (50% detected at 3 cm/s) and of neutralizing them (90% mortality) without impacting the growth of their host plants. The results are encouraging since aphids are among the most challenging crop pests to eradicate. However, enhancements in detection, and mainly in targeting, are necessary for the system to be useful in a real farming context. Moreover, robustness regarding field conditions should be evaluated.

Keywords: farming; robotics; aphid detection; laser-based neutralization; deep learning; image-based visual servoing

Citation: Lacotte, V.; NGuyen, T.; Sempere, D.J.; Novales, V.; Dufour, V.; Moreau, R.; Pham, M.T.; Rabenorosoa, K.; Peignier, S.; Feugier, F.G.; et al. Pesticide-Free Robotic Control of Aphids as Crop Pests. AgriEngineering 2022, 4, 903–921. https://doi.org/10.3390/agriengineering4040058

Academic Editors: Roland Lenain and Eric Lucet

Received: 29 July 2022; Accepted: 22 September 2022; Published: 7 October 2022
1. Introduction
The use of pesticides has become natural and exclusive to us because our civilization has relied on them since antiquity to fight weeds, insects, and diseases, which would otherwise generate a mean 50% production loss [1]. Unfortunately, this has progressively led to alarming consequences on water quality, biodiversity, and human health. Exposure of the European population to endocrine-disrupting pesticides alone cost up to €270B in health expenses in 2017 [2]. Tang et al. [3] concluded that 64% of global agricultural land (25 × 10⁶ km²) is at risk of pesticide pollution, and 31% is at high risk. The class of neonicotinoids, pesticides responsible for the deaths of 300,000 bee colonies annually, has even been recently banned from use [4]. In its "Ecophyto" plan, the French government decided to reduce the use of agrochemicals by 50% by 2018. Unfortunately, because alternatives to these chemicals remain too scarce, the objective has been postponed to 2025.

For ten years, progress in computer science, the Internet of Things (IoT), robotics, and agronomic modes has improved farming practices, provoking two consecutive revolutions: smart farming [5,6] and now, robotic farming [7]. A major line of research in this revolution is the improvement of crop pest management thanks to the development of new tools for automated pest detection and their neutralization by more environmentally friendly means.
Thus, several studies have demonstrated the possibility of the automated detection
of multiple agricultural pests in laboratory or field conditions with a good level of accu-
racy thanks to image analysis techniques associated with artificial intelligence. Images
are captured using RGB (red-green-blue), multi-spectral or hyperspectral cameras with
variable dimensions and depths (3 dimensions, RGB-depth, etc.) [8,9]. Spectral cameras
provide high quality and quantity of data for pest and disease monitoring on crops [10–12],
detection of insect infestations on stored food [13], mosquito gender and age evaluation,
and taxonomical applications [14], with excellent results provided by automated solutions.
Nevertheless, they require complex technology, mainly tested under laboratory conditions,
which requires more research for efficient use in the field. On the contrary, RGB image
processing has been widely used to develop pest detection systems on traps or plants in the
field based on the analysis of the color composition, gray levels, shape or texture of objects
in the image [8,15]. Detection methods performed with 3D cameras, computer-based classi-
fication, and notably, deep learning algorithms [16–20] have also been investigated. Some
optical solutions for automated pest detection have been mounted on robots to explore the
feasibility of using such technology in real conditions in greenhouses [21] or in the fields
[22]. In addition, some of these robots can multitask, offering detection of pests on crops
and their neutralization. They perform localized spraying of pesticides in small quantities,
which is preferable to the traditional method that consists of spraying the entire crop when
infested areas are discovered [9,23,24].
Both mechanical and electrical-based works have already been explored [25,26]. How-
ever, very few works have been conducted on the laser-based neutralization of insect pests
as an alternative to pesticides. Hori et al. investigated the effect of UV irradiation on eggs,
larvae, pupae and adults of Drosophila melanogaster. High doses were effective at neutralizing
them, but genetic variation in insect resistance could appear in the long term.
Laser irradiation with a green light was also effective on mosquitoes, but it has never
been tested on an autonomous robot [27]. Furthermore, weed processing is now actively
commercialized [28]. Weed destruction is typically carried out by a targeted herbicide spray
or mechanically based methods. Thermal methods are sometimes used, and laser-based
destruction is also still under study [29]. As early as 2001, laser burning could reduce weed
biomass by 90% with only 20% of the energy usually used for thermal weeding [30].
To date, technologies for automated spraying or mechanical destruction of weeds are
appearing on the market either as large and costly farming robots [31] or as small-sized
ones, such as Oz (Naïo Tech). Yet, no non-chemical solution exists concerning crop pests.
This is why the Greenshield (see https://anr-greenshield.insa-lyon.eu/, accessed on 9
September 2022) project’s objective was to demonstrate the feasibility of using a laser-based
detection-neutralisation module (DNM) embedded in a mobile robot dedicated to this
purpose on aphids. This robot demonstrated its feasibility in lab conditions in
September 2021. It also demonstrated the feasibility of using the DNM with a commercially
available robot (Dino, Naio Tech) to combat weeds.
Detecting aphids in crop fields is a complex task due to their small size and con-
founding color. As with numerous other applications, a deep learning approach has been
envisaged. Indeed, this artificial intelligence approach is nowadays widespread for face
recognition [32], image segmentation [33], signal analysis and automatic interpretation, and
the control of autonomous cars. More particularly, deep learning and image processing are
two disciplines whose applications are very varied and encompass areas such as medicine,
robotics, security, and surveillance. In the frame of image processing, deep learning is
very powerful at extracting complex information from input images. Such algorithms’
structures are often based on convolutional neural networks that can capture the spatial
dependencies at a different scale in an image. They consist of stacking layers of convolution
and mathematical functions with several adjustable parameters, which seek to identify
the presence of particular patterns on the image at different scales. The tuning and the
optimization of these parameters are performed during a preliminary training phase, which
relies on the availability of a large quantity of data, thanks to specific cost functions (loss).
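As a purely illustrative sketch of this principle (not the detection network used in this work, which is described in Section 2.5), a minimal convolutional stack in PyTorch could look as follows; all layer sizes are arbitrary.

```python
import torch
import torch.nn as nn

# Minimal convolutional stack: each block looks for patterns at a coarser scale,
# and its weights are the adjustable parameters tuned during training.
class TinyConvNet(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):
        x = self.features(x)     # (N, 32, H/4, W/4)
        x = x.mean(dim=(2, 3))   # global average pooling
        return self.head(x)      # class scores

# The training phase minimizes a cost function (loss) over a large annotated data set.
model = TinyConvNet()
scores = model(torch.randn(4, 3, 64, 64))
loss = nn.CrossEntropyLoss()(scores, torch.randint(0, 2, (4,)))
loss.backward()
```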
The originality of this project lies in the embedding of various technologies in a single
farming robot necessitating a multidisciplinary approach including pest biology, laser
photonics, visual servoing, and mobile robotics. As aforementioned, the main objective
was to prove the feasibility of such an approach to replace pesticides. Therefore, this paper
describes the main requirements of this system and the related solutions we envisage.
It exposes the experimental approach to validate their consistency, and it discusses the
results. It is organized as follows. The next section details materials and methods for the
detection and neutralization of insect pests on crops. Experimental results are provided in
Section 3 and discussed in Section 4. We then conclude this paper, which summarizes the
achievements of the Greenshield project, and we evoke future directions.

2. Materials and Methods


2.1. Prototype Robot Structure
The main subsystems dedicated to the detection and neutralization of aphids are all
embedded in a mobile robot. In the future, this mobile robot should be able to autonomously
proceed in crop fields, globally browse them and permit a global detection/neutralization
action. In the frame of this study, a prototype (see Figure 1) has been designed with the
requirement of validating the detection/neutralization functions. This prototype had to
proceed at a minimum speed of 10 m per minute, work 3 h between two battery recharges,
and weigh less than 100 kg to avoid compacting the ground. To span rows, the overall
width of the wheels had to be less than 1 m, and the inner space between wheels had
to be greater than 30 cm. The robot had to be empty between the wheels to host the
detection/neutralization hardware.
We actually could not find any mobile robot off-the-shelf fitting all the requirements.
We then designed it only for detection/neutralization experimental purposes. In this
version, maintainability requirements limited the number of coverings and so drastically
decreased its ability to work in humid environments. We only covered its bottom part
according to local laser safety regulations, to prevent the laser beam from hitting and injuring
surrounding people. During the experiments, it was located on rails to travel only in straight lines
over one plant row. It is therefore not optimized for real external conditions (with humidity,
rocks, obstacles, U-turns at the end of rows ...). To free up some room on the bottom part of
the robot, wheels with embedded brushless motors and position encoders (MOTTROTXFS6-
OZOV1 by OZO) were chosen. The four wheels were connected to 2 dsPIC33EP512GM710
(Microchip) motor control boards to synchronize their actions and globally control the
robot displacements. These boards were programmed with Matlab-Simulink (Mathworks)
software using the “MPlab for Simulink” library furnished by Microchip.
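As an illustration of the low-level speed interface, the sketch below converts a target ground speed into a common angular-velocity setpoint for the four hub wheels; the wheel radius is a hypothetical value, as the actual OZO wheel dimensions are not given here.

```python
import math

WHEEL_RADIUS_M = 0.10   # hypothetical wheel radius, for illustration only

def wheel_setpoint_rpm(ground_speed_m_per_min: float) -> float:
    """Common angular-velocity setpoint (rpm) sent to the four hub motors so that
    the robot travels in a straight line at the requested ground speed."""
    v = ground_speed_m_per_min / 60.0       # m/s
    omega = v / WHEEL_RADIUS_M              # rad/s
    return omega * 60.0 / (2.0 * math.pi)   # rpm

# Minimum speed from the specification: 10 m per minute.
print(f"{wheel_setpoint_rpm(10.0):.1f} rpm")   # about 15.9 rpm with the assumed radius
```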

2.2. Insects and Plants


This study focused on the neutralization of aphids, as they are among the most harmful
pests for crops and complex to control [34]. A solution able to control their spread should
most likely work on other invertebrates such as cockroaches, mosquitoes and fruit flies.
To do so, we chose three aphid lines from parthenogenetic females in the fields, different
in color and size, providing a diverse database to develop our DNM. Thus, we used two
pea aphid lines, Acyrthosiphon pisum LL01 (green) and YR2 (pink), and one bird cherry-oat
aphid line Rhopalosiphum padi LR01 (black, 2 to 3 times smaller than A. pisum).
Aphids were kept under obligate parthenogenesis in laboratory conditions to rear
unsynchronised parthenogenetic aphid populations and synchronized nymphs N1 (first
instar) as described by Simonet et al. [35] according to study requirements. A. pisum lines
were maintained on young broad bean plants (Vicia faba L. cv. Aquadulce) and R.
padi lines were maintained on wheat seedlings (Triticum aestivum (Linnaeus) cv. Orvantis).
Insects and plants were reared in controlled conditions at 21 °C ± 2 °C and 65% relative
humidity (RH) ± 5% with a 16 h photoperiod.
Figure 1. Mobile robot prototype overview (rear, side, and top views), showing the supervision laptop, the Microchip boards, the battery, the Jetson Xavier board, and the micro-mirror + camera unit.

Organic seeds of V. faba and T. aestivum were supplied by Graines-Voltz® (Loire-


Authion, France) and Nature et Progrès® (Alès, France), respectively. All plants in this
study were grown in a commercial peat substrate Florentaise® TRH400 (Saint-Mars-du-
Désert, France) for 10 days from seed to 2–3 leaves (seedling stage for broad bean and
emergence stage before tillering for wheat). Then, the plants were used for a week for study
purposes, including insect rearing, plant and aphid detection, aphid mortality monitoring
and evaluation of the potential damage caused by laser shots on plants. The method used
for these experiments will be detailed in the following sections.

2.3. Global Process


A trolley-like robot with three embedded subsystems is proposed: the RGB-D (red-
green-blue plus depth) acquisition system and light sources, the aphid localization algo-
rithm using deep learning, and the neutralization system based on the laser.
Figure 2 illustrates the whole system organization inside the robot. The light sources,
the camera, and the seedlings featuring several aphids were collocated in a black box. In the
following, we detail each subsystem and start with a global overview of the neutralization
process.

Figure 2. Greenshield prototype description and working principle: the robot spans a crop row,
detects the aphids on leaves, and targets them using the laser beam positioned by the micro-mirrors.

The mobile robot proceeds in a straight line over a single plant row and detects aphids
during motion. When detected aphids are out of reach, the robot stops and triggers the
neutralization tasks. In the long term, the neutralization task should run continuously.
For the real-time detection of aphids, we use a 3D camera and artificial intelligence object
recognition techniques (Section 2.5). The coordinates of the aphids visible on the acquired
image are transmitted to the neutralization module.
The aphid neutralization module is mounted on the robot and consists of a (harmless)
laser pointer and a power laser source, taking the same optical path toward a pair of
micro-mirrors (see Figure 3b) that position beams in the robot’s workspace (see Figure 2).
The laser pointer is used to realize a visual servoing (detailed in Section 2.6) to target the
aphids located in the images. Once the pointer spot is stabilized on the target, the power
laser emits a pulse carrying a sufficient amount of energy to neutralize the target aphid
(by raising its temperature; see Section 2.8). It then moves on to the next targeted aphid
in the image. We describe the path optimization in Section 2.7. When all detected aphids
are treated, the robot restarts its motion with pest scanning. Several types of low-power
laser sources that can be mounted on a mobile robot have been tested to identify the best
beam tracking on the leaves. Experiments were performed allowing us to correlate the
energy emitted by the laser beam with the fecundity and mortality rates of aphids and the
efficiency of weed destruction. The detection/localization/neutralization software was
linked to the robot locomotion to dynamically enable proceeding. ROS middleware [36]
was used to manage the communication between modules.
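As an illustration of this message-passing, a minimal ROS 1 (rospy) node is sketched below; the topic name, frame name, and message type are our own assumptions and not the actual Greenshield interfaces.

```python
#!/usr/bin/env python
# Hypothetical detection-side node: publishes the 3D coordinates of each detected
# aphid so that the neutralization module can subscribe to them and aim the laser.
import rospy
from geometry_msgs.msg import PointStamped

def publish_detections(points_xyz):
    rospy.init_node("aphid_detector", anonymous=True)
    pub = rospy.Publisher("/greenshield/aphid_positions", PointStamped, queue_size=10)
    rate = rospy.Rate(10)  # matches the 10 FPS real-time target of the detection loop
    for (x, y, z) in points_xyz:
        msg = PointStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "camera_frame"
        msg.point.x, msg.point.y, msg.point.z = x, y, z
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    publish_detections([(0.02, -0.05, 0.30)])  # one dummy detection, in metres
```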

Figure 3. Camera and laser steering device: (a) ZED mini RGB-D camera (Stereo Labs); (b) two-degrees-of-freedom actuated mirrors for laser positioning (DK-M3-RS-U-2M-20-L, Newscale Technology).

2.4. RGB-D Acquisition


To localize aphids, a RGB-D camera (ZED mini, Stereo labs, see https://www.stereolabs.
com/zed-mini/, accessed on 9 September 2022) is used (see Figure 3a). Its settings were
adjusted to limit light reflections and maximize the contrast (maximum contrast and
saturation, reduced gain; see Table 1) to ensure the best image acquisition in the lighting
condition. The light sources used were white LEDs, and, to reduce the impact of the foreign
light sources, we added some light protections, as visible in Figure 4. These protections
will also serve to limit laser exposure.
Table 1. Configuration of various parameters of the ZED camera.

Parameter | Value | Explanation
Resolution | HD1080 (1920 × 1080) | A 2K resolution would consume a lot of resources and therefore would slow down the detection algorithm.
Capture speed | 30 FPS | Maximum available in the HD1080 mode.
Brightness | 1 | Low brightness to limit light reflections on the surface of the leaves.
Contrast | 6 | High contrast makes it easier to detect pink aphids on green leaves.
Hue | 0 | Default value that matches the colors perceived by human eyes.
Saturation | 8 | Maximized to let the aphids appear on green leaves.
Gamma | 2 | A low gamma level limits the white light in the picture.
Acuity | 4 | Average value, as high values generate noise on the back plane.
White balance | auto | Adapted automatically, taking into account the other fixed color parameters (hue, saturation).
Exposure | 75% | Set to keep the brightness at an acceptable level.
Gain | 10 | Adjusted to keep consistency with the other settings, with minimal added noise.
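For illustration, the settings of Table 1 could be applied with the ZED Python SDK (pyzed) as sketched below; the exact enum names correspond to SDK 3.x and may differ in other versions, and the mapping of "Acuity" to the SHARPNESS setting is our own assumption.

```python
import pyzed.sl as sl

# Sketch of configuring the ZED mini according to Table 1.
zed = sl.Camera()
init = sl.InitParameters()
init.camera_resolution = sl.RESOLUTION.HD1080   # 1920 x 1080
init.camera_fps = 30
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the ZED camera")

zed.set_camera_settings(sl.VIDEO_SETTINGS.BRIGHTNESS, 1)
zed.set_camera_settings(sl.VIDEO_SETTINGS.CONTRAST, 6)
zed.set_camera_settings(sl.VIDEO_SETTINGS.HUE, 0)
zed.set_camera_settings(sl.VIDEO_SETTINGS.SATURATION, 8)
zed.set_camera_settings(sl.VIDEO_SETTINGS.GAMMA, 2)
zed.set_camera_settings(sl.VIDEO_SETTINGS.SHARPNESS, 4)          # "Acuity" in Table 1 (assumption)
zed.set_camera_settings(sl.VIDEO_SETTINGS.WHITEBALANCE_AUTO, 1)  # automatic white balance
zed.set_camera_settings(sl.VIDEO_SETTINGS.EXPOSURE, 75)          # percentage of frame time
zed.set_camera_settings(sl.VIDEO_SETTINGS.GAIN, 10)
```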

Figure 4. Under the robot: light sources and protections, camera and neutralization systems.

2.5. Detection and Localization


The objective of this module is to detect aphids on images of crops that the robot
scans from above, localize them, and send these coordinates to the neutralization module
described below. In terms of specifications, it must meet the following three criteria:
• Detection accuracy: a maximum error of 3 mm must exist between the center of
detected aphids and their true location in the image so that the laser spot always
overlaps a part of the targeted aphid;
• Detection sensitivity: at least 60% of aphids present in the image (this level has been
set arbitrarily taking into account that natural predators should finish the work, but it
requires experimental validation);
• Real-time operation: the entire program (detection algorithm + laser control) must
run at a speed greater than 10 frames per second to permit the robot to cover a 1 ha
crop field in 24 h (in the case of a field where rows are located every 40 cm, the robot
will have to travel 25 km during 12 h, corresponding to a mean speed of 1 km/h or
29 cm/s.).
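As a quick back-of-the-envelope check of this last requirement (assuming the travel is spread over the full 24 h period), the row length and mean speed can be recomputed as follows.

```python
# Coverage requirement sanity check (assumption: travel spread over the full 24 h).
field_area_m2 = 10_000                              # 1 ha
row_spacing_m = 0.40
path_length_m = field_area_m2 / row_spacing_m       # 25,000 m of rows to cover
mean_speed_m_s = path_length_m / (24 * 3600)
print(f"{path_length_m / 1000:.0f} km at {mean_speed_m_s * 100:.0f} cm/s "
      f"({mean_speed_m_s * 3.6:.1f} km/h)")
# -> 25 km at 29 cm/s (1.0 km/h)
```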
The problem of detecting and localizing aphids of different sizes, colors and positions
in natural image sequences of various crops is particularly difficult. To our knowledge,
only artificial neural networks based on deep learning detection architectures can perform
this task in real-time. Therefore, to detect aphids in each image acquired by the camera
located under the robot as the latter proceeds over each crop row, we selected from the
current arsenal of detection architectures (EfficientDet, Yolo, FasterRCNN, U-Net with
Hausdorff distance [37]) YOLO version 4 [38] for its speed and its small-size object perfor-
mance in images. Such approaches are supervised; this means that a large data set
with manual annotations is needed for the training phase. When trained, the network was
transferred and run on an embedded GPU board (Nvidia Jetson Xavier AGX).
To do that, we first built an image database featuring the two pea aphid lines, A. pisum
LL01 (green) and YR2 (pink) on broad bean seedlings. We chose A. pisum on broad bean for
the training set because of the color diversity of the aphid lines occurring on the same crop,
which has an optimal leaf surface for plant and aphid detection. Furthermore, we used
young plants for the detection of plants and aphids because of their vulnerability to pest
attacks and their compactness compatible with the regular passage of a robot above the
rows. Thus, this technology could be useful for the detection and neutralization of pests
that can cause serious damage and economic losses to vulnerable young plants, such as
soybean aphids [39] and black cutworms on corn [40], or crops with a compact habit, such
as black bean aphids on sugar beets [41].
The training set of images was annotated using LabelMe (see https://github.com/
wkentaro/labelme, accessed on 9 September 2022) software. Around 700 images were
manually labeled, with each one featuring 20 to 30 aphids. It was necessary to manually
draw a bounding box around each aphid as well as to put a dot on the center of each aphid.
Since we did not have a sufficient amount of images for training the neural networks from
scratch, we fine-tuned our network from an already trained network (trained from MS
COCO data set) and we used data augmentation on our images during training to avoid
the phenomenon of over-fitting and to improve the generalization ability of the trained
model to new images. The following image transformations were applied randomly to
generate modified images during training:
• Horizontal or vertical image flip;
• Rotation of 10° in the positive or negative direction;
• Brightness modification;
• Hue and saturation modification;
• Contrast modification;
• Grayscale modification.
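For illustration, the listed transformations could be reproduced with the albumentations library as sketched below; the library choice and the probabilities are our own assumptions, as the text above only lists which transformations were applied.

```python
import albumentations as A

# Random augmentations corresponding to the list above, applied so that the YOLO
# bounding boxes stay consistent with the transformed image.
transform = A.Compose(
    [
        A.HorizontalFlip(p=0.5),
        A.VerticalFlip(p=0.5),
        A.Rotate(limit=10, p=0.5),          # rotation of up to 10° in either direction
        A.RandomBrightnessContrast(p=0.5),  # brightness and contrast modification
        A.HueSaturationValue(p=0.5),        # hue and saturation modification
        A.ToGray(p=0.1),                    # occasional grayscale conversion
    ],
    bbox_params=A.BboxParams(format="yolo", label_fields=["class_labels"]),
)

# Usage: image is an HxWx3 array, bboxes are (xc, yc, w, h) in relative YOLO units.
# out = transform(image=image, bboxes=bboxes, class_labels=["aphid"] * len(bboxes))
```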
We started to develop this localization function with YOLOv4 [38] (whose code is
available at https://github.com/AlexeyAB/darknet, accessed on 9 September 2022) as it
is very efficient in the detection of objects in real-time and is broadly used in applications
requiring fast detection speed, such as in drones or autonomous car vision. YOLOv4 is a
one-stage detector. Its backbone is CSPDarknet53 [42], a model tuned for image recognition.
The CSPNet framework improves the residual structure of the neural network [43] to
mitigate the gradient-vanishing problem during back-propagation, encourage the network
to reuse feature maps (with residual layers) at different
hierarchies, and reduce the set of optimization parameters. The secondary part of the
network (the neck) features a PANet (path aggregation network) [44] to mix and combine
the feature maps generated by the backbone, as a preparation step for the subsequent
detection. The YOLO detector predicts the class of the object, its bounding box, and the
probability of its presence in its bounding box at three different scales. Because
of its small feature maps, this kind of multi-scale detector struggles to detect small
objects in the image. Thus, in practice, it could not detect aphids located at a distance
greater than 30 cm from the camera.
We tested several input image sizes (640 × 640 and 512 × 512) knowing that the size
of the images acquired by the camera is 2208 × 1242 pixels to determine the best choice. We
also optimized the network process using batch inference, which was not yet implemented
in the original YOLOv4 code. To speed-up the inference time, we chose tkDNN (the
source code for tkDNN is available on Github: https://github.com/ceccocats/tkDNN,
accessed on 9 September 2022), which is a neural network library specifically designed to
work on NVIDIA Jetson boards. It is built with the primitives of cuDNN, a library from
NVIDIA regarding neural networks, and TensorRT, a high-performance inference platform
that enables high throughput and low latency for deep learning inference applications.
The TensorRT engine optimizes models by quantizing all network layer parameters
and converting them into formats that occupy less memory space. The tkDNN library
allows inference with network parameters in three different numeric formats: FP32 (a
32-bit floating point format), FP16 (a 16-bit floating point format), and INT8
(integers represented with 8 bits). In this project, we chose the FP16 format as it appeared
to be the most accurate and efficient format (see https://github.com/ceccocats/tkDNN#fps-
results, accessed on 9 September 2022). The accuracy degradation caused by the format conversion
remained below 0.02%.
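A small numerical illustration of why the FP32 to FP16 conversion is so benign: rounding typical network weights to 16-bit floats introduces only a tiny relative error (the values below are synthetic, not the actual network weights).

```python
import numpy as np

# Relative error introduced by storing FP32 weights in FP16, on synthetic weights.
rng = np.random.default_rng(0)
w32 = rng.normal(0.0, 0.05, size=100_000).astype(np.float32)
w16 = w32.astype(np.float16)
rel_err = np.abs(w32 - w16.astype(np.float32)) / (np.abs(w32) + 1e-12)
print(f"mean relative error: {rel_err.mean() * 100:.3f} %")   # a few hundredths of a percent
```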
We also studied an approach to detect smaller objects without a bounding box based
on the weighted Hausdorff distance [37] (code available on Github: https://github.com/
javiribera/locating-objects-without-bboxes, accessed on 9 September 2022). The chosen
neural network architecture was a U-Net optimized with a loss function based on this
distance. Rather than taking bounding boxes as references (ground truth), the network
seeks to localize objects according to reference points.
Once the bounding box center or an aphid point was well located on the image in the
form of pixel coordinates, it was then possible to convert these points into a 3D reference
using the Software Development Kit of the ZED mini camera. Indeed, the 2D image is
taken by only one of the two lenses of the ZED camera, and the other one is used to
interpret the points in 3 dimensions. As the captured images were in RGBXYZ format (a 2D
image with depth), we could obtain the point-cloud measurement of the neighboring points in
XYZ (3D) from a detected point on the RGB (2D) image.
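A minimal sketch of this 2D-to-3D conversion with the ZED Python SDK (pyzed) is given below, assuming an already opened camera object (see the earlier configuration sketch); function and enum names correspond to SDK 3.x and may differ in other versions.

```python
import pyzed.sl as sl

def pixel_to_xyz(zed: sl.Camera, px: int, py: int):
    """Read the 3D coordinates (camera frame, metres) under a detected aphid pixel,
    using the XYZ point cloud aligned with the left RGB image."""
    cloud = sl.Mat()
    if zed.grab() != sl.ERROR_CODE.SUCCESS:
        return None
    zed.retrieve_measure(cloud, sl.MEASURE.XYZRGBA)   # per-pixel X, Y, Z (+ packed colour)
    err, xyzrgba = cloud.get_value(px, py)
    if err != sl.ERROR_CODE.SUCCESS:
        return None
    return float(xyzrgba[0]), float(xyzrgba[1]), float(xyzrgba[2])
```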

2.6. Laser-Based Targeting


Targeting aphids from their localization in acquired images requires visual servoing.
Indeed, the 3D positions of the detected aphids furnished by the RGB-D camera were
not precise enough to efficiently target aphids. Several works have dealt with visual
servoing. The closer ones are analyzed here. In 1996, Hutchinson et al. proposed an
IBVS (image-based visual servo) approach with a PID control law. They obtained a mean
and maximum error of 2 and 3 pixels, respectively, on flying mosquito targets [45]. In
2016, Andreff et al. used the same IBVS approach in laser micro-surgery, also using a
2DOF piezoelectric mirror to steer the laser ray [46] with a simple proportional gain in the
control law, resulting in an exponential convergence of the error. In 2018, Kudryavtsev et
al. developed a visual servoing system for a three-tube concentric tube robot, using an
adaptive proportional gain approach that was successfully validated in simulation using
the visual servoing platform ViSP and then in an experimental setup [47]. The adaptive
gain increased the convergence speed while ensuring stability, avoiding overshoot for small
errors and limiting the maximum speed.
Finally, we based this study on the works described in [48]. In this specific case, the
system consisted, on the one hand, of a coarse tracking system using a pair of stereoscopic
cameras to identify the approximate 3D location of a mosquito pest target. On the other
hand, the system featured a visual servoing system based on a single high-speed camera
and a fast scanning mirror to guide the tracking laser to the target pest aiming at minimizing
pixel tracking error. The annihilating laser dose was then administered through the same
optical path. To orientate the laser beams towards the targets, we chose a 2-axis system
that orientates two micro-mirrors and permits ±25° of rotation range, with a resolution
finer than 5 µrad in closed loop and a repeatability of 100 µrad. This system features
sensors with a bandwidth of 350 Hz for rotations smaller than 0.1°. The IBVS approach
was chosen because of the properties of the laser-mirror system and the robot operation.
The exact geometric relationship between the target and the mirrors is not known given
the evident variability of the pest locations and the complicated potential estimation of
their 3D position using the camera, in contrast to an industrial application where the target
can be fixed. Three control laws were studied, consisting of two variants of an adaptive
proportional gain (AG1 and AG2) and a PID. These control laws were implemented in the
visual servoing controller in Figure 5. The PID controller, expressed in z space, was:

C_PID(z) = K_p + K_i · T / (1 − z⁻¹) + K_d · (1 − z⁻¹) / T    (1)

where K_p, K_i, and K_d are constant parameters, and T is the sampling period. The AG1 controller was:

C_AG1[k] = λ[k], with λ[k] = λ[k − 1] + c_1 · exp(−c_2 · ε[k]²)    (2)

where λ[k] is the adaptive gain, c_1 and c_2 are constant parameters, and ε[k] is the error signal (ε = p_d − p_l). The AG2 controller was:

C_AG2[k] = λ[k], with λ[k] = (λ_0 − λ_∞) · exp(−(λ̇_0 / (λ_0 − λ_∞)) · ε[k]²) + λ_∞    (3)

where λ_0 (resp. λ_∞) is the gain when ε = 0 (resp. ε → ∞), and λ̇_0 is the slope of λ(ε) at
ε = 0 [49].
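For illustration, the three control laws can be written in discrete time as follows; the gains, the sampling period, and the pixel error ε = p_d − p_l are placeholders to be tuned, not the values used in the experiments.

```python
import math

class DiscretePID:
    """Discrete PID of Equation (1): C(z) = Kp + Ki*T/(1 - z^-1) + Kd*(1 - z^-1)/T."""
    def __init__(self, kp: float, ki: float, kd: float, T: float):
        self.kp, self.ki, self.kd, self.T = kp, ki, kd, T
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err: float) -> float:
        self.integral += err * self.T
        derivative = (err - self.prev_err) / self.T
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * derivative

def gain_ag1(prev_gain: float, err: float, c1: float, c2: float) -> float:
    """Adaptive gain of Equation (2)."""
    return prev_gain + c1 * math.exp(-c2 * err ** 2)

def gain_ag2(err: float, lam0: float, lam_inf: float, lam0_dot: float) -> float:
    """Adaptive gain of Equation (3)."""
    return (lam0 - lam_inf) * math.exp(-(lam0_dot / (lam0 - lam_inf)) * err ** 2) + lam_inf
```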
The proposed control scheme runs on the Jetson Xavier board. We first tested the
targeting system with 2D scenes of virtual aphids (to isolate the targeting performance
from the aphid detection performance). We input in real-time the coordinates of virtual
aphids randomly located in the images of the camera. We did the same to evaluate the
performance of the system with moving targets by varying the coordinates of virtual aphids
in the images. The camera was located on top of a 40 cm radius disk that was explored by
the visual servoing system with a low-power laser beam (see Figure 6). During experiments,
the system orientated the micro-mirrors to make the laser spot, visible on the plane, travel
as quickly as possible to each target.

Figure 5. Targeting control scheme embedding visual servoing.


Figure 6. Visual servoing experimental setup.

2.7. Multiple Target Optimization


When multiple pest targets are present in the image, optimization is required to
minimize the global distance traveled and thus to reduce the time taken to destroy all
the targets. A popular formulation in the literature is the traveling salesman problem (TSP),
a combinatorial optimization problem that seeks the shortest possible path visiting a set of
objects. However, the TSP literature mainly considers stationary targets and ignores any
time-dependent behavior, which is critical in this application since every target moves
relative to the camera: targets eventually leave the image and may thus be missed. Helvig
et al. [50] proposed a generalization of the time-dependent TSP, which they called the
moving-target TSP. It relates to several applications where a pursuer must intercept, in
minimum time, a set of targets moving with constant velocities. They developed a theorem
for a mixture of targets approaching and receding from the origin, which is analogous to
this application, where aphids approach and recede from the origin of the image. Two
types of optimization algorithms were developed and tested: a moving-target TSP, and a
hybrid moving-target TSP with nearest neighbors.
The first one consisted of prioritizing the targets as a function of their distance to
the image origin, in increasing order. This is equivalent to prioritizing the targets with the
greatest Y-coordinate along the image axis (see Figure 7). An alternative solution was tested
in which a secondary criterion was applied, giving priority to the target nearest to the
current position, selected among the lowermost targets.
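A minimal sketch of these two ordering strategies follows; the pixel tolerance that defines the "lowermost" band is an arbitrary choice of ours.

```python
def order_targets(targets, current_pos, hybrid=True, band_px=20):
    """Order detected targets (pixel coordinates) for the laser.

    Plain moving-target strategy: treat targets with the greatest Y first (they
    leave the image first as the robot advances). Hybrid variant: among the
    lowermost targets (within band_px of the greatest Y), pick the one nearest
    to the current laser position."""
    remaining = list(targets)
    ordered, pos = [], current_pos
    while remaining:
        y_max = max(t[1] for t in remaining)
        band = [t for t in remaining if y_max - t[1] <= band_px]
        if hybrid:
            nxt = min(band, key=lambda t: (t[0] - pos[0]) ** 2 + (t[1] - pos[1]) ** 2)
        else:
            nxt = max(band, key=lambda t: t[1])
        ordered.append(nxt)
        remaining.remove(nxt)
        pos = nxt
    return ordered

# Example: three targets, laser spot currently at (100, 100).
print(order_targets([(50, 300), (400, 310), (200, 120)], (100, 100)))
```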
To evaluate the performance of the system with moving targets, we made the virtual
aphids translate (towards the bottom of the image) in real-time in the images at a constant
velocity of 5 pixels per second. We also performed experiments with blue LED stripes
located in a 40 cm × 20 cm area on a rotating disc to test the system by including some target
detection. We could not use living aphids as they constantly move in such a situation. We
similarly could not use dead ones as they do not keep their initial color for enough time.

Figure 7. Representation of the moving-target TSP method (left) and the hybrid method with the
added nearest neighbors variant (right).
2.8. Laser Choice and Dimensioning


To select the most effective laser, the correlation between laser energy and aphid
mortality was evaluated and published in a previous work [51]. For that purpose, aphids
treated with different laser energies were put back on the leaves and observed every day
for mortality counts. Careful examinations made it possible to distinguish between live
and dead aphids. As long as an aphid moved, even slightly, it was considered alive. In
contrast, dead aphids were those with darkened abdomens due to laser burns and stiff
spread feet; they were unresponsive to any stimuli.
A first experiment to determine the lethal dose (LD) necessary to achieve a 90%
mortality (LD90) at Day+1 was set up as in [27,52].
Three wavelengths were used, with each one covering a different part of the electro-
magnetic spectrum: 532 nm (visible), 1090 nm (short wavelength infrared (SWIR)), and
10.6 µm (long wavelength infrared (LWIR)) known as a CO2 laser, manufactured by CNI
Lasers, IPG and an Access Laser, respectively. Samples of 48 aphids originating from the
three lines aforementioned were used. The aphids were random unsynchronized adults,
representative of what would be found in a field without an early detection system. The
Hosmer–Lemeshow and the area under the receiver operating characteristic curve tests
were performed on the logistic fits to ensure that the retrieved fit was correct and accurate.
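For illustration, the sketch below fits such a dose-mortality logistic curve and extracts the LD90 with SciPy; the fluence and mortality numbers are made up for the example and are not the measured data (which were analyzed in R [51]).

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(dose, a, b):
    """Dose-mortality model: P(dead) = 1 / (1 + exp(-(a + b*log10(dose))))."""
    return 1.0 / (1.0 + np.exp(-(a + b * np.log10(dose))))

# Hypothetical fluences (J/cm^2) and observed mortality fractions, for illustration only.
fluence = np.array([1.0, 3.0, 10.0, 30.0, 100.0])
mortality = np.array([0.05, 0.20, 0.55, 0.85, 0.98])

(a, b), _ = curve_fit(logistic, fluence, mortality, p0=(0.0, 1.0))

# LD90: fluence at which the fitted mortality probability reaches 0.90.
ld90 = 10 ** ((np.log(0.9 / 0.1) - a) / b)
print(f"LD90 ≈ {ld90:.1f} J/cm²")
```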
Then, to analyze the mortality dynamics of aphids during their development, the same
experiment was conducted on synchronized one-day-old nymphs (N1) with a mortality
counting every day until they reached their adult stage on the 7th day, thus considering
an LD90 at Day+7. Given the great results of the previous experiment, we focused on
the 10.6 µm laser and used the A. pisum line LL01 so we could rely on their well-known
life-history traits [35].
Going further, potential transgenerational laser-induced effects were addressed as
well. To do this, we reproduced the previous experiment, but this time using an LD50
to have a good balance between the number of survivors and quantifiable laser-induced
effects and to ensure a robust statistic. The survivors of the irradiated aphid generation (F0)
at Day+7 gave birth to nymphs of the second generation (F1) every day for 15 days. For the
two generations F0 and F1, mortality was assessed every day during their development.
Then, we measured their mass and developmental delay at Day+7. Finally, we monitored
the fecundity for 15 days of F0 aphids and F1 aphids born on the 1st, 5th, 10th, and
15th days of the reproduction period of F0 aphids [51]. All the treatments were repeated
five times. Statistical tests were performed using R Studio software. The mortality rate
of the synchronized N1 aphids is presented here with a GLM Gaussian test with a 95%
confidence interval.
Moreover, the laser beam must be innocuous for host plants. Indeed, in case of
false-positive detection, the entire laser energy would strike them. Hence, the impact of
the energy delivered in our experiments has also been investigated on host plants in our
previous work [51].
V. faba and T. aestivum host plants were chosen for the A. pisum and R. padi aphid
lines, respectively. The plants were cultivated as described in Section 2.2 and shot at the
seedling stage of 2–3 leaves. We shot the host plants 4 times with different values found in
each experiment with the 10.6 µm laser. First, to ensure no underestimation, we used the
highest fluences corresponding to LD90 at Day+1 on A. pisum LL01 and R. padi adult aphids.
Then, we also used the fluence that is the most reasonable for in-field use, corresponding to
LD90 at Day+7 on N1 LL01 aphids. According to aphids’ location on their host plants, we
shot different organs, namely the adaxial and abaxial leaf surfaces, the stem, and the apex.
Samples of 20 plants were shot by fluence level and by targeted organ. On Day+7, height,
fresh mass, and leaf surface were measured. The statistical analyses detailed in the previous
study [51] did not show any significant difference between control and irradiated plants.
3. Results
A prototype has been designed and built with the sole purpose of validating the
localization-neutralization loop on a single seedling row (see Figure 8). The experiments
performed in the laboratory with this prototype demonstrate the feasibility of de-
tecting different lines of aphids (50% detected at 3 cm/s) and of neutralizing them without
impacting the growth of their host plants.

Figure 8. Detection experimental setup: (a) a picture acquired by the camera of the mobile robot; only the area contained in the blue frame (the cropped area for detection) is analyzed, and the blue squares indicate the location of the detected aphids; (b) the robot spanning a row of broad bean plants featuring aphids (front view).

3.1. Aphid Detection and Localization


3.1.1. Lighting Conditions
We could observe that the best results in aphid detection were obtained with the white
LEDs. Hence, white LED strips were installed in the robot to create a homogeneous lighting
environment, reducing the sensitivity to external conditions (see Figure 4). The camera
settings were also adjusted to limit light reflections and maximize the contrast (maximum
contrast and saturation, reduced gain; see Table 1). These results were obtained with the
robot over an artificial ground reproducing near field conditions. They should be confirmed
by real in-field experiments with various natural ambient light conditions.

3.1.2. Localization Performance


We evaluated the performance of the two convolutional neural networks (Darknet
YoloV4 and U-Net with Hausdorff) using precision and recall criteria. Precision is defined
as the proportion of correct detections among all the proposed detections. Recall
(or sensitivity) is the proportion of correct detections among all the relevant elements (the annotated aphids).
A detection was considered a true positive when the distance between its predicted position
and the reference was less than 5 pixels. This tolerance was defined by considering that the
average size of aphids on the image was 10 × 10 pixels. This corresponds to a maximum
error of 2 mm when the aphids are close to the camera (≈25 cm) and 4 mm when they
are at the camera detection limit (≈50 cm). Both algorithms were evaluated based on
84 test images. The networks were trained with the same data set of 400 training images
plus 200 validation images.
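The exact matching procedure is not detailed above; the sketch below shows one reasonable implementation of the 5-pixel criterion (greedy nearest-reference matching), together with the precision and recall formulas.

```python
import numpy as np

def match_detections(pred, gt, tol_px=5.0):
    """Greedily match predicted aphid centres to reference centres.

    A prediction is a true positive if it lies within tol_px pixels of a still
    unmatched reference point; the remaining predictions are false positives and
    the remaining references are false negatives."""
    gt = [tuple(g) for g in gt]
    tp = fp = 0
    for p in pred:
        dists = [np.hypot(p[0] - g[0], p[1] - g[1]) for g in gt]
        if dists and min(dists) <= tol_px:
            gt.pop(int(np.argmin(dists)))
            tp += 1
        else:
            fp += 1
    return tp, fp, len(gt)   # tp, fp, fn

def precision_recall(tp, fp, fn):
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall
```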
Table 2 summarizes the experimental results obtained with both proposed networks.
We manually optimized the U-Net-HD architecture and its hyper-parameters to obtain the
best results in terms of true positive detection (TP). Such optimizations were not performed
on YOLO. Despite the manual optimizations, YOLOv4 outperforms the U-Net-HD network
considering the speed, precision, and sensitivity criteria. Therefore, we worked on the
optimization of YOLO.
The effect of cropping the camera images and keeping only the relevant area (800 ×
800 pixels) led to a rise of 6% in accuracy and 5% in sensitivity compared to a whole picture.
Indeed, the YOLO network has an input dimension of 512 × 512 pixels. Sub-sampling the
input image (2208 × 1242 pixels) to a 512 × 512 pixels image blurred the objects and caused
a reduction in precision.

Table 2. Performance comparison between YOLOv4 and U-Net with Hausdorff distance (U-Net-HD)
for aphid detection. Values in bold indicate the best results for each criterion.

Criterion | YOLOv4 | U-Net-HD
FPS (Nvidia Quadro 400) | 10–11 | 2–3
True Positives (TP) | 238 | 278
False Positives (FP) | 490 | 997
False Negatives (FN) | 1371 | 1349
Precision | 0.37 | 0.15
Recall | 0.21 | 0.17

To get better accuracy, we also tiled the area of 800 × 800 pixels into 4 or 16 sub-
parts. The inference speed (frames per second) was computed when the network ended
up processing all these areas. We observed an increase in sensitivity when using 400 ×
400 pixel tiles (0.73 versus 0.51 for 800 × 800 pixel tiles). In the case of very small images
(200 × 200 pixels), the sensitivity falls to 0.53. We also noticed that this method greatly
reduces the inference speed of the model because of the large number of images to be
processed for each capture. We also observed that the code developed in C++ (gcc 7.5.0,
Ubuntu 18.04) was more efficient than that using the Python API (Python 3.6.9, Ubuntu 18.04),
being about 1.5 times faster in terms of FPS. The averaged results on our test set are visible
in Table 3. Some results are visible in Figure 9. This sample frame illustrates the results
obtained with YOLOv4. It shows a single frame from the test set, (a) showing the initial
image, (b) the known position of aphids, and (c) the resulting prediction. One can observe
that 2 aphids were not found while another non-existing one was found.

Table 3. Performance comparison for YOLOv4 with smaller images.

Network Input Size | Input Image Size | FPS (Quadro 400), Python API | FPS (Quadro 400), C++ | Precision | Recall
640 × 640 | 2208 × 1242 | 10–11 | 16 | 0.35 | 0.49
512 × 512 | 2208 × 1242 | 13–14 | 21 | 0.34 | 0.46
512 × 512 | 800 × 600 | 13–14 | 21 | 0.4 | 0.51
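A minimal sketch of the tiling strategy discussed above (splitting the 800 × 800 pixel cropped area into 400 × 400 pixel tiles and mapping detections back to full-image coordinates) is given below; it is independent of the actual Darknet/tkDNN batch-inference code.

```python
import numpy as np

def tile_image(img: np.ndarray, tile: int = 400):
    """Split the cropped detection area into fixed-size tiles, keeping each tile's
    offset so that detections can be mapped back to full-image pixel coordinates."""
    tiles = []
    h, w = img.shape[:2]
    for y0 in range(0, h, tile):
        for x0 in range(0, w, tile):
            tiles.append(((x0, y0), img[y0:y0 + tile, x0:x0 + tile]))
    return tiles

def to_global(xy, offset):
    """Convert a tile-local detection centre back to full-image coordinates."""
    return xy[0] + offset[0], xy[1] + offset[1]

# Example: an 800x800 crop yields 4 tiles of 400x400 pixels.
crop = np.zeros((800, 800, 3), dtype=np.uint8)
offsets = [off for off, _ in tile_image(crop)]
print(offsets)                            # [(0, 0), (400, 0), (0, 400), (400, 400)]
print(to_global((120, 35), offsets[3]))   # (520, 435)
```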

We could then raise the model inference speed from 9 FPS to 19 FPS without any loss of
precision. We used the tkDNN engine, FP16 floating point precision, and the C++ API on the
Jetson Xavier AGX embedded board with cropped images of 320 × 320 pixels. Some result-
ing films are visible on https://www.creatis.insa-lyon.fr/~grenier/research/GreenShield/,
(accessed on 9 September 2022). This exceeded our real-time detection requirement.

Figure 9. Example of aphid detection with YOLOv4 on a test frame: (a) original image; (b) ground truth image; (c) predicted image. Readers can observe that the prediction missed 3 aphids and found a false one.
3.2. Laser-Based Neutralization


3.2.1. Pest Neutralization
The efficiency of the laser treatment on unsynchronized adult aphids has been studied
and published in [51]. We analyzed the effect of lasers with wavelengths in the visible
(532 nm wavelength), in the near infra-red (1070 nm) and in the far infra-red (10.6 µm). We
demonstrated that the 1070 nm wavelength is inefficient for killing R. padi aphids, requiring
lethal doses four orders of magnitude higher than the two other wavelengths. For A. pisum
LL01 (green) and YR2 (pink) aphids, the 10.6 µm wavelength is more effective than the
532 nm one. For instance, with green aphids, LD90 fluence values of 53.55 J·cm−2 and
12.91 J·cm−2 are retrieved at 532 nm and 10.6 µm, respectively. Indeed, going from one laser
to another results in an LD90 more than four times lower. Hence, the 10.6 µm wavelength
was chosen for the rest of the study. These values are quite the same for the pink aphids.
However, this is not the case for R. padi (black) aphids. In this case, both wavelengths
exhibit the same efficiency. The smaller size of this species can explain this difference.
By changing the targeted aphid stage and extending the observation period, the LD90
can be lowered again. Consequently, a lower LD90 fluence means faster, more secure, and
more energy-saving treatment. Thus, targeting nymphs results in an LD90 at Day+1 that
is reduced by four times compared to adults. Moreover, counting the nymphs’ mortality
at Day+7 instead of Day+1 leads to a 2.6 times decreased LD90 value. Globally, targeting
nymphs and observing them at Day+7 allows for decreasing the fluence to 1.15 J·cm−2 .
This value is ten times lower than the previous one, while still maintaining a high
neutralization rate of 90%. Moreover, the mortality dynamics of aphids over 7 days are not
linear [51]. Indeed, most aphids are already neutralized after 3 days, and then a plateau is
reached. This means that the LD90s at Day+7 and Day+3 are almost identical. Consequently,
the irradiated aphids with a low fluence will already be dead after 3 days, thus reducing
the duration of their attack on the plants.
Regarding potential transgenerational effects, results show that laser treatment does
not induce any significant effect on the mortality of the next F1 generation [51]. Hence, the
laser-based strategy must be based on direct laser effects.

3.2.2. Targeting
The IBVS algorithm featuring a PID controller obtained the best results versus the two
variants of AG: we could obtain a mean response time t5% = 490 ms, vs. 830 and 920 ms
for AG1 and AG2, respectively (see Figure 10). With moving virtual targets,
the PID control showed slightly better performance with a mean t5% = 420 ms (vs. 590 and
870 for AG). We also performed experiments with velocities ranging from 5 to 12 pixels/s.
We observed that the system did not have the time to travel to all targets at 12 pixels/s and
over (corresponding approximately to 1 cm/s at a distance of 30 cm). Experiments with
blue LED stripes showed a mean response time t5% that greatly increased to 620 ms.

Figure 10. Evolution of t5% and t∞ response times versus initial error magnitudes for PID, AG1, and AG2 control laws.
4. Discussion
Concerning the detection of aphids, by inferring with 2208 × 1242 pixel resolution
images, we found that the speed of the Hausdorff-optimized U-Net detection was not
suitable for real-time application. As the images were captured at 15 FPS by the ZED mini
camera, and this network can process only 2 or 3 images per second, many images were
not taken into account when the robot was moving. This was due to the excessive depth
of this model, which was set to increase the ability to detect small objects. This indeed
resulted in a gain in precision but a loss in efficiency. Moreover, the accuracy of U-Net
was not better than that of YOLOv4 (0.15 vs. 0.21 for YOLOv4), as it generated
many false detections (997 false positives versus 490 for YOLOv4). We then
simplified the architecture of U-Net by reducing the number of layers and by training it
with cropped images of smaller sizes. However, its accuracy remained poor compared to
the model of Darknet YOLOv4. The simplified U-Net model failed to achieve 10% accuracy
on cropped areas of 400 × 400 pixels. Taking into account these results, we, therefore,
decided to apply YOLOv4 for the detection of aphids, with the aforementioned accuracy
and speed optimizations.
Concerning the targeting process, which aims at sending the laser spot onto detected
aphids, we could only obtain medium performance in 2D scenes, with mean travel times of
620 ms between two targets. This performance is much too low, and it was moreover obtained
in laboratory conditions with virtual aphids, knowing that plants infested with tens of
aphids should be treated faster to enable a 1 ha field treatment in 24 h at 29 cm/s. Speed
improvement should be researched for a robot spanning several rows with several detec-
tion/neutralization modules working in parallel. Faster micro-mirrors should be envisaged,
taking into account the cost and affordability of such a robot.
Additionally, to perform a precise localization of the laser spot, it was necessary to
be in a dark environment, while aphid detection required good luminosity. We then had
to alternate detection and targeting and synchronize the lighting so that each detection
was performed in the best lighting conditions. In the robot, as we employed LED stripes,
we could efficiently perform the switching, but these working conditions decreased the
operational flow. On-robot experiments showed that the performance decreases as soon
as the lighting conditions deteriorate (which may be the case during sunny days with
external rays of light coming through the entrance for plants) and when 3D objects are
present in the scene, necessitating stopping the robot. Indeed, in 3D scenes, the spot was
sometimes not visible due to the non-alignment of the laser source and the camera axis
and the presence of leaves in different depths of field. The algorithm had to work blindly
during the time the spot remains invisible. When this period was too long, the system
skipped this target and selected the next one. This greatly slowed the servoing and reduced
the performance. Therefore, during the experiments, to decrease these periods, even if we
could target aphids with the robot moving, we preferred to stop it during the targeting
phases. A study enhancing the robustness of visual servoing in the presence of occlusions
is mandatory.
Lastly, the Newscale micro-mirrors are too fragile for farming conditions. Other, more robust
solutions have to be studied, knowing that we have not yet studied the impact of the vibrations
induced by the locomotion of the robot on the ground. Regrettably, we could not validate (for
logistical reasons) the whole chain simultaneously (from aphid detection to neutralization
with the power laser): detection/localization/targeting has been validated independently
from targeting/neutralization. Globally, we could make the robot proceed at a velocity of
3 cm/s and have it detect 50% of the present aphids, then target 83% of the detected aphids
on 3 successive broad bean plants, with each one featuring 8 aphids. This is far from the
requirements but it provides the first proof of concept validating the feasibility of such
an approach.
Regarding aphid neutralization, we found that the CO2 laser is best suited to neutralize
almost all aphids with low fluences by raising their cell water temperature with LWIR at
10.6 µm. Indeed, the energy needed to have an LD90 after one day in an adult population
equal to 12.91 J·cm−2 per shot was 4 times lower than the other lasers tested. Additionally,
targeting younger aphids as N1 and extending the observation period by a few days can
lower the energy required by 10 times, resulting in 1.15 J·cm−2 . Keller et al. found a value
quite similar for mosquitoes using the same wavelength (1.1 J·cm−2 ) [27]. Additionally,
they concluded that the LD90 value is roughly stable when changing the beam size. These
results and our estimates of energy consumed for a field treatment detailed in our previous
work [51] show that the laser is not the most power-consuming component in our robot.
However, the detection must be very precise to target aphids at the youngest stages and
to differentiate them from other insects beneficial to crops. Moreover, most aphids are
located on the abaxial leaf surface, where they are not visible to the camera that films the leaves
from above. We thought of putting the camera in a lower position, aiming upwards, but we
fear that such a position would be risky in case of obstacles and would be more sensitive
to dust accumulation. Another (more interesting) approach would consist of agitating the
leaves using airflow or a mechanical system so that aphids drop from them (feigning death).
We would then just have to detect and shoot them on the ground.
Finally, in case of false positive detection, we ensured that hitting leaves would not
impact plant growth [51].

5. Conclusions
The experimental results (a video is available at https://www.creatis.insa-lyon.fr/
~grenier/research/GreenShield/, accessed on 9 September 2022) globally demonstrate, in
lab conditions, the feasibility of detecting different lines of aphids (50% detected at 3 cm/s)
and of neutralizing them by CO2 laser shots with high efficiency (90% mortality after 3 days
with 1.15 J·cm−2 for A. pisum LL01 N1) without impacting the growth of their host plants in
case of a missed shot. Showing the feasibility of this approach is encouraging as aphids are
one of the most difficult crop pests to combat. Thanks to a black environment reproduced
under the robot, the latter could be used particularly on vulnerable young plants such as
soybeans and corn or crops with a compact habit such as sugar beets. However, aphid
neutralization is closely dependent on the quality of the detection, since it is recommended
to detect aphids at the nymphal stage and to differentiate them from the beneficial insects
that one wishes to preserve on the crops. Other, bigger pests, such as fall armyworm (Spodoptera
frugiperda) and dark sword grass (Agrotis ipsilon) larvae, should pose fewer
detection issues. Future directions consist of enhancing the performance of the detection
and targeting phases to obtain characteristics closer to the requirements, making it useful
in a real farming context. Additionally, the robustness concerning field conditions is to
be studied. The most worrying aspect is the presence of vibrations due to the irregular
ground, which may decrease the performance of the positioning of the micro-mirrors and
so the quality of the pest targeting.

Author Contributions: Conceptualization, A.H., A.L., B.M., F.G.F., K.R., P.D.S., M.T.P., R.M., T.G. and
V.L.; methodology, A.L., B.M., F.G.F., K.R., M.T.P., P.D.S., R.M., S.P., T.G. and V.L.; software, J.D.S.,
T.N., V.D. and V.N.; validation, A.L., B.M., F.G.F., K.R., M.T.P., P.D.S., R.M., S.P., T.G., V.L. and V.N.;
formal analysis, B.M., K.R., M.T.P., P.D.S., R.G., R.M., S.P., T.N. and V.L.; investigation, F.G.F., J.D.S.,
R.G., S.P., T.N., V.D., V.L. and V.N.; resources, K.R., V.D., V.L. and S.P.; data curation, J.D.S., R.G.,
S.P., V.D. and V.L.; writing—original draft preparation, A.L., B.M., F.G.F., J.D.S., K.R., M.T.P., P.D.S.,
R.M., R.G., S.P., T.G., T.N., V.D., V.L. and V.N.; writing—review and editing, A.L., B.M., K.R., M.T.P.,
P.D.S., R.G., R.M., S.P., T.G. and V.L.; visualization, J.D.S., K.R., P.D.S., R.G., S.P., T.G., V.D. and V.L.;
supervision, A.L., B.M., F.G.F., K.R., M.T.P., P.D.S., R.M. and T.G.; project administration, A.H., A.L.,
B.M., F.G.F. and K.R.; funding acquisition, A.H., A.L., B.M., F.G.F. and K.R. All authors have read and
agreed to the published version of the manuscript.
Funding: This research was funded by ANR grant number ANR-17-CE34-0012.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: The data about aphids presented in this study are available in [51]. The
data about AI-based detection are available at https://www.creatis.insa-lyon.fr/~grenier/research/
GreenShield/aphid_V3.zip.
Conflicts of Interest: The authors declare no conflict of interest.

Abbreviations
The following abbreviations are used in this manuscript:

DNM detection-neutralisation module


FP16 16-bit floating point format
FPS Frames per Second
GPU Graphics Processing Unit
IBVS Image-Based Visual Servo
INT8 Integer coded with 8 bits
LD Lethal Dose
LWIR Long Wavelength Infra-Red
MDPI Multidisciplinary Digital Publishing Institute
RGB Red Green Blue
RGB-D Red Green Blue and Depth
RH Relative Humidity
SWIR Short Wavelength Infra-Red
TSP Traveling Salesman Problem
UV Ultraviolet

References
1. RISE Foundation. Crop Protection & the EU Food System: Where Are They Going, 1st ed.; RISE Foundation: Brüssel, Belgium, 2020.
2. Pesticide Action Network Europe. Endocrine Disrupting Pesticides in European Food; Pesticide Action Network Europe: Brussels,
Belgium, 2017.
3. Tang, F.H.M.; Lenzen, M.; McBratney, A.; Maggi, F. Risk of pesticide pollution at the global scale. Nat. Geosci. 2021, 14, 206–210.
4. Ellis, C.; Park, K.J.; Whitehorn, P.; David, A.; Goulson, D. The Neonicotinoid Insecticide Thiacloprid Impacts upon Bumblebee
Colony Development under Field Conditions. Environ. Sci. Technol. 2017, 51, 1727–1732. https://doi.org/10.1021/acs.est.6b04791.
5. Saiz-Rubio, V.; Rovira-Más, F. From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management. Agronomy
2020, 10, 207. https://doi.org/10.3390/agronomy10020207.
6. Phasinam, K.; Kassanuk, T.; Shabaz, M. Applicability of internet of things in smart farming. J. Food Qual. 2022, 2022, 7692922.
7. Vougioukas, S.G. Agricultural Robotics. Annu. Rev. Control. Robot. Auton. Syst. 2019, 2, 365–392. https://doi.org/10.1146/annurev-
control-053018-023617.
8. Mavridou, E.; Vrochidou, E.; Papakostas, G.A.; Pachidis, T.; Kaburlasos, V.G. Machine Vision Systems in Precision Agriculture for
Crop Farming. J. Imaging 2019, 5, 89. https://doi.org/10.3390/jimaging5120089.
9. Meshram, A.T.; Vanalkar, A.V.; Kalambe, K.B.; Badar, A.M. Pesticide spraying robot for precision agriculture: A categorical
literature review and future trends. J. Field Robot. 2022, 39, 153–171. https://doi.org/10.1002/rob.22043.
10. Žibrat, U.; Knapič, M.; Urek, G. Plant pests and disease detection using optical sensors/Daljinsko zaznavanje rastlinskih bolezni
in škodljivcev. Folia Biol. Geol. 2019, 60, 41–52.
11. Mahlein, A.K.; Kuska, M.T.; Behmann, J.; Polder, G.; Walter, A. Hyperspectral sensors and imaging technologies in phytopathology:
state of the art. Annu. Rev. Phytopathol. 2018, 56, 535–558.
12. Lacotte, V.; Peignier, S.; Raynal, M.; Demeaux, I.; Delmotte, F.; da Silva, P. Spatial–Spectral Analysis of Hyperspectral Images Reveals Early Detection of Downy Mildew on Grapevine Leaves. Int. J. Mol. Sci. 2022, 23, 10012.
https://doi.org/10.3390/ijms231710012.
13. Haff, R.P.; Saranwong, S.; Thanapase, W.; Janhiran, A.; Kasemsumran, S.; Kawano, S. Automatic image analysis and spot
classification for detection of fruit fly infestation in hyperspectral images of mangoes. Postharvest Biol. Technol. 2013, 86, 23–28.
https://doi.org/10.1016/j.postharvbio.2013.06.003.
14. Johnson, J.B.; Naiker, M. Seeing red: A review of the use of near-infrared spectroscopy (NIRS) in entomology. Appl. Spectrosc. Rev.
2020, 55, 810–839. https://doi.org/10.1080/05704928.2019.1685532.
15. Lima, M.; Leandro, M.E.; Pereira, L.; Valero, C.; Gonçalves Bazzo, C. Automatic Detection and Monitoring of Insect Pests—A
Review. Agriculture 2020, 10, 161. https://doi.org/10.3390/agriculture10050161.
16. Martineau, M.; Conte, D.; Raveaux, R.; Arnault, I.; Munier, D.; Venturini, G. A survey on image-based insect classification. Pattern
Recognit. 2017, 65, 273–284. https://doi.org/10.1016/j.patcog.2016.12.020.
17. Xie, C.; Wang, H.; Shao, Y.; He, Y. Different algorithms for detection of malondialdehyde content in eggplant leaves stressed by
grey mold based on hyperspectral imaging technique. Intell. Autom. Soft Comput. 2015, 21, 395–407.
18. Li, R.; Wang, R.; Xie, C.; Liu, L.; Zhang, J.; Wang, F.; Liu, W. A coarse-to-fine network for aphid recognition and detection in the
field. Biosyst. Eng. 2019, 187, 39–52.
19. Ebrahimi, M.A.; Khoshtaghaza, M.H.; Minaei, S.; Jamshidi, B. Vision-based pest detection based on SVM classification method.
Comput. Electron. Agric. 2017, 137, 52–58. https://doi.org/10.1016/j.compag.2017.03.016.
20. Asefpour Vakilian, K.; Massah, J. Performance evaluation of a machine vision system for insect pests identification of field crops
using artificial neural networks. Arch. Phytopathol. Plant Prot. 2013, 46, 1262–1269. https://doi.org/10.1080/03235408.2013.763620.
21. Rupanagudi, S.R.; Ranjani, B.S.; Nagaraj, P.; Bhat, V.G.; Thippeswamy, G. A novel cloud computing based smart farming system for
early detection of borer insects in tomatoes. In Proceedings of the 2015 International Conference on Communication, Information
& Computing Technology (ICCICT), Mumbai, India, 15–17 January 2015; pp. 1–6. https://doi.org/10.1109/ICCICT.2015.7045722.
22. Srisuphab, A.; Silapachote, P.; Tantratorn, W.; Krakornkul, P.; Darote, P. Insect Detection on an Unmanned Ground Rover.
In Proceedings of the TENCON 2018—2018 IEEE Region 10 Conference, Jeju, Korea, 28–31 October 2018; pp. 0954–0959.
https://doi.org/10.1109/TENCON.2018.8650312.
23. Li, Y.; Xia, C.; Lee, J. Vision-based pest detection and automatic spray of greenhouse plant. In Proceedings of the 2009 IEEE International Symposium on Industrial Electronics, Seoul, Korea, 5–8 July 2009; pp. 920–925. https://doi.org/10.1109/ISIE.2009.5218251.
24. Vibhute, A.S.; Tate Deshmukh, K.R.; Hindule, R.S.; Sonawane, S.M. Pest Management System Using Agriculture Robot. In
Techno-Societal 2020; Pawar, P.M., Balasubramaniam, R., Ronge, B.P., Salunkhe, S.B., Vibhute, A.S., Melinamath, B., Eds.; Springer
International Publishing: Cham, Switzerland, pp. 829–837. https://doi.org/10.1007/978-3-030-69925-3_79.
25. Drees, B.M.; Leroy, T.R. Evaluation of alternative methods for suppression of crape myrtle aphids. In Upper Coast 1990–1991
Entomological Result Demonstration Handbook, Texas A&M University System Edition; Texas Agricultural Extension Service: Tamu,
TX, USA, 1991; pp. 21–22.
26. Kusakari, S.i.; Okada, K.; Shibao, M.; Toyoda, H. High Voltage Electric Fields Have Potential to Create New Physical Pest Control
Systems. Insects 2020, 11, 447. https://doi.org/10.3390/insects11070447.
27. Keller, M.D.; Leahy, D.J.; Norton, B.J.; Johanson, T.; Mullen, E.R.; Marvit, M.; Makagon, A. Laser induced mortality of Anopheles
stephensi mosquitoes. Sci. Rep. 2016, 6, 20936. https://doi.org/10.1038/srep20936.
28. Obasekore, H.; Fanni, M.; Ahmed, S.M. Insect Killing Robot for Agricultural Purposes. In Proceedings of the 2019 IEEE/ASME
International Conference on Advanced Intelligent Mechatronics (AIM), Hong Kong, China, 8–12 July 2019; pp. 1068–1074.
https://doi.org/10.1109/AIM.2019.8868507.
29. Wu, X.; Aravecchia, S.; Lottes, P.; Stachniss, C.; Pradalier, C. Robotic weed control using automated weed and crop classification.
J. Field Robot. 2020, 37, 322–340. https://doi.org/10.1002/rob.21938.
30. Kaierle, S.; Marx, C.; Rath, T.; Hustedt, M. Find and Irradiate—Lasers Used for Weed Control. Laser Tech. J. 2013, 10, 44–47.
https://doi.org/10.1002/latj.201390038.
31. Asha, K.; Mahore, A.; Malkani, P.; Singh, A.K. Robotics-automation and sensor-based approaches in weed detection and control:
A review. Int. J. Chem. Stud. 2020, 8, 542–550.
32. Fuad, M.T.H.; Fime, A.A.; Sikder, D.; Iftee, M.A.R.; Rabbi, J.; Al-Rakhami, M.S.; Gumaei, A.; Sen, O.; Fuad, M.; Islam, M.N. Recent Advances in Deep Learning Techniques for Face Recognition. IEEE Access 2021, 9, 99112–99142.
https://doi.org/10.1109/ACCESS.2021.3096136.
33. Minaee, S.; Boykov, Y.; Porikli, F.; Plaza, A.; Kehtarnavaz, N.; Terzopoulos, D. Image Segmentation Using Deep Learning: A
Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 3523–3542. https://doi.org/10.1109/TPAMI.2021.3059968.
34. Minks, A.K.; Harrewijn, P. Aphids: Their Biology, Natural Enemies, and Control; Elsevier: Amsterdam, The Netherlands, 1987.
35. Simonet, P.; Duport, G.; Gaget, K.; Weiss-Gayet, M.; Colella, S.; Febvay, G.; Charles, H.; Viñuelas, J.; Heddi, A.; Calevro, F. Direct
flow cytometry measurements reveal a fine-tuning of symbiotic cell dynamics according to the host developmental needs in
aphid symbiosis. Sci. Rep. 2016, 6, 19967. https://doi.org/10.1038/srep19967.
36. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Berger, E.; Wheeler, R.; Ng, A.Y. ROS: An open-source Robot
Operating System. In ICRA Workshop on Open Source Software; IEEE: Kobe, Japan, 2009; Volume 3, p. 5.
37. Ribera, J.; Güera, D.; Chen, Y.; Delp, E.J. Locating Objects Without Bounding Boxes. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019.
38. Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. Yolov4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934.
39. Wang, X.; Fang, Y.; Lin, Z.; Zhang, L.; Wang, H. A Study on the Damage and Economic Threshold of the Soybean Aphid at the
Seedling Stage. Plant Prot. 1994, 20, 12–13.
40. Showers, W.B.; Von Kaster, L.; Mulder, P.G. Corn Seedling Growth Stage and Black Cutworm (Lepidoptera: Noctuidae) Damage
1. Environ. Entomol. 1983, 12, 241–244. https://doi.org/10.1093/ee/12.1.241.
41. Hurej, M.; Werf, W.V.D. The influence of black bean aphid, Aphis fabae Scop., and its honeydew on leaf growth and dry matter
production of sugar beet. Ann. Appl. Biol. 1993, 122, 201–214. https://doi.org/10.1111/j.1744-7348.1993.tb04027.x.
42. Wang, C.Y.; Liao, H.Y.M.; Yeh, I.H.; Wu, Y.H.; Chen, P.Y.; Hsieh, J.W. CSPNet: A New Backbone that can Enhance Learning
Capability of CNN. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops
(CVPRW), Seattle, WA, USA, 14–19 June 2020.
43. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on
Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
44. Liu, S.; Qi, L.; Qin, H.; Shi, J.; Jia, J. Path Aggregation Network for Instance Segmentation. In Proceedings of the 2018 IEEE/CVF
Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018.
45. Hutchinson, S.; Hager, G.; Corke, P. A tutorial on visual servo control. IEEE Trans. Robot. Autom. 1996, 12, 651–670.
https://doi.org/10.1109/70.538972.
46. Andreff, N.; Tamadazte, B. Laser steering using virtual trifocal visual servoing. Int. J. Robot. Res. 2016, 35, 672–694.
https://doi.org/10.1177/0278364915585585.
47. Kudryavtsev, A.V.; Chikhaoui, M.T.; Liadov, A.; Rougeot, P.; Spindler, F.; Rabenorosoa, K.; Burgner-Kahrs, J.; Tamadazte,
B.; Andreff, N. Eye-in-Hand Visual Servoing of Concentric Tube Robots. IEEE Robot. Autom. Lett. 2018, 3, 2315–2321.
https://doi.org/10.1109/LRA.2018.2807592.
48. Keller, M.D.; Norton, B.J.; Farrar, D.J.; Rutschman, P.; Marvit, M.; Makagon, A. Optical tracking and laser-induced mortality of
insects during flight. Sci. Rep. 2020, 10, 14795.
49. Lagadic Team. ViSP Tutorial: How to Boost Your Visual Servo Control Law. 2021. Available online: https://visp-doc.inria.fr/doxygen/visp-2.9.0/tutorial-boost-vs.html (accessed on 28 April 2021).
50. Helvig, C.S.; Robins, G.; Zelikovsky, A. Moving-Target TSP and Related Problems. In Algorithms—ESA '98; Bilardi, G., Italiano, G.F., Pietracaprina, A., Pucci, G., Eds.; Springer: Berlin/Heidelberg, Germany, 1998; pp. 453–464.
51. Gaetani, R.; Lacotte, V.; Dufour, V.; Clavel, A.; Duport, G.; Gaget, K.; Calevro, F.; Da Silva, P.; Heddi, A.; Vincent, D.; et al.
Sustainable laser-based technology for insect pest control. Sci. Rep. 2021, 11, 11068. https://doi.org/10.1038/s41598-021-90782-7.
52. Hori, M.; Shibuya, K.; Sato, M.; Saito, Y. Lethal effects of short-wavelength visible light on insects. Sci. Rep. 2015, 4, 7383.
https://doi.org/10.1038/srep07383.
