Article
Pesticide-Free Robotic Control of Aphids as Crop Pests
Virginie Lacotte 1 , Toan NGuyen 2 , Javier Diaz Sempere 2 , Vivien Novales 2 , Vincent Dufour 2 ,
Richard Moreau 2 , Minh Tu Pham 2 , Kanty Rabenorosoa 3 , Sergio Peignier 1 , François G. Feugier 4 ,
Robin Gaetani 4,5 , Thomas Grenier 6 , Bruno Masenelli 5 , Pedro da Silva 1 , Abdelaziz Heddi 1
and Arnaud Lelevé 2, *
1 Univ Lyon, INSA Lyon, INRAE, BF2I, UMR 203, 69621 Villeurbanne, France
2 Univ Lyon, INSA Lyon, Université Claude Bernard Lyon 1, Ecole Centrale de Lyon, CNRS, Ampère, UMR5005,
69621 Villeurbanne, France
3 FEMTO-ST, UBFC, CNRS, 25030 Besançon, France
4 Greenshield, 75004 Paris, France
5 Univ Lyon, INSA Lyon, CNRS, Ecole Centrale de Lyon, Université Claude Bernard Lyon 1, CPE Lyon, INL,
UMR5270, 69621 Villeurbanne, France
6 CREATIS, UMR 5220, U1294, INSA Lyon, Université Claude Bernard Lyon 1, UJM-Saint Etienne, CNRS,
Inserm, Univ Lyon, 69100 Lyon, France
* Correspondence: [email protected]
Abstract: Because our civilization has relied on pesticides to fight weeds, insects, and diseases since antiquity, the use of these chemicals has become natural and exclusive. Unfortunately, the use of pesticides has progressively had alarming effects on water quality, biodiversity, and human health. This paper proposes to improve farming practices by replacing pesticides with a laser-based robotic approach. This study focused on the neutralization of aphids, as they are among the most harmful pests for crops and complex to control. With the help of deep learning, we developed a mobile robot that spans crop rows, locates aphids, and neutralizes them with laser beams. We have built a prototype with the sole purpose of validating the localization-neutralization loop on a single seedling row. The experiments performed in our laboratory demonstrate the feasibility of detecting different lines of aphids (50% detected at 3 cm/s) and of neutralizing them (90% mortality) without impacting the growth of their host plants. The results are encouraging since aphids are one of the most challenging crop pests to eradicate. However, enhancements in detection and mainly in targeting are necessary to be useful in a real farming context. Moreover, robustness regarding field conditions should be evaluated.

Keywords: farming; robotics; aphid detection; laser-based neutralization; deep learning; image-based visual servoing
optimization of these parameters are performed during a preliminary training phase, which
relies on the availability of a large quantity of data, thanks to specific cost functions (loss).
The originality of this project lies in the embedding of various technologies in a single
farming robot necessitating a multidisciplinary approach including pest biology, laser
photonics, visual servoing, and mobile robotics. As aforementioned, the main objective
was to prove the feasibility of such an approach to replace pesticides. Therefore, this paper
describes the main requirements of this system and the related solutions we envisage. It presents the experimental approach used to validate their consistency and discusses the results. The remainder of the paper is organized as follows: the next section details the materials and methods for the detection and neutralization of insect pests on crops; experimental results are provided in Section 3 and discussed in Section 4; we then conclude by summarizing the achievements of the Greenshield project and evoking future directions.
(Figure 2 panels: rear, side, and top views of the robot, with the supervision laptop, microchip boards, battery, Jetson Xavier board, and micro-mirror + camera labeled.)
Figure 2. Greenshield prototype description and working principle: the robot spans a crop row,
detects the aphids on leaves, and targets them using the laser beam positioned by the micro-mirrors.
The mobile robot proceeds in a straight line over a single plant row and detects aphids
during motion. When detected aphids are out of reach, the robot stops and triggers the
neutralization tasks. In the long term, the neutralization task should run continuously.
For the real-time detection of aphids, we use a 3D camera and artificial intelligence object recognition techniques (Section 2.5). The coordinates of the aphids visible in the acquired image are transmitted to the neutralization module.
The aphid neutralization module is mounted on the robot and consists of a (harmless)
laser pointer and a power laser source, taking the same optical path toward a pair of
micro-mirrors (see Figure 3b) that position beams in the robot’s workspace (see Figure 2).
The laser pointer is used to realize a visual servoing (detailed in Section 2.6) to target the
aphids located in the images. Once the pointer spot is stabilized on the target, the power
laser emits a pulse carrying a sufficient amount of energy to neutralize the target aphid
(by raising its temperature; see Section 2.8). It then moves on to the next targeted aphid
in the image. We describe the path optimization in Section 2.7. When all detected aphids
are treated, the robot restarts its motion and resumes pest scanning. Several types of low-power laser sources that can be mounted on a mobile robot were tested to identify the one providing the best beam tracking on the leaves. Experiments were performed that allowed us to correlate the energy emitted by the laser beam with the fecundity and mortality rates of aphids and with the efficiency of weed destruction. The detection/localization/neutralization software was linked to the robot locomotion so that forward motion could be enabled or suspended dynamically. ROS middleware [36] was used to manage the communication between modules.
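As a rough illustration of how the modules exchange data over ROS, a detection node could publish aphid coordinates as follows (a minimal sketch, not the project's code; the topic name, message type, and the detect_aphids() stub are assumptions):

```python
#!/usr/bin/env python
# Minimal sketch (not the authors' code): a ROS node forwarding detected aphid
# coordinates to a hypothetical neutralization module. Topic name, message
# choice, and the detection stub are assumptions for illustration only.
import rospy
from geometry_msgs.msg import PointStamped

def detect_aphids():
    """Placeholder for the YOLOv4-based detector; returns (x, y, z) tuples in meters."""
    return [(0.10, -0.05, 0.30)]

def main():
    rospy.init_node("aphid_detection")
    pub = rospy.Publisher("/greenshield/aphid_targets", PointStamped, queue_size=10)
    rate = rospy.Rate(15)  # the ZED mini camera delivers 15 FPS in this setup
    while not rospy.is_shutdown():
        for (x, y, z) in detect_aphids():
            msg = PointStamped()
            msg.header.stamp = rospy.Time.now()
            msg.header.frame_id = "camera_frame"
            msg.point.x, msg.point.y, msg.point.z = x, y, z
            pub.publish(msg)  # consumed by the targeting/neutralization node
        rate.sleep()

if __name__ == "__main__":
    main()
```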
Figure 4. Under the robot: light sources and protections, camera and neutralization systems.
crop field in 24 h (in the case of a field where rows are spaced every 40 cm, the robot will have to travel 25 km during these 24 h, corresponding to a mean speed of about 1 km/h, i.e., 29 cm/s).
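As a quick sanity check of these figures (a worked example assuming, for illustration, a square 100 m × 100 m field of 1 ha):

$$ \frac{100\ \text{m}}{0.4\ \text{m}} = 250\ \text{rows}, \qquad 250 \times 100\ \text{m} = 25\ \text{km}, \qquad \frac{25\ \text{km}}{24\ \text{h}} \approx 1.04\ \text{km/h} \approx 29\ \text{cm/s}. $$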
The problem of detecting and localizing aphids of different sizes, colors and positions
in natural image sequences of various crops is particularly difficult. To our knowledge,
only artificial neural networks based on deep learning detection architectures can perform
this task in real-time. Therefore, to detect aphids in each image acquired by the camera
located under the robot as the latter proceeds over each crop row, we selected from the
current arsenal of detection architectures (EfficientDet, Yolo, FasterRCNN, U-Net with
Hausdorff distance [37]) YOLO version 4 [38] for its speed and its performance on small objects in images. Such approaches are supervised, meaning that a large data set
with manual annotations is needed for the training phase. When trained, the network was
transferred and run on an embedded GPU board (Nvidia Jetson Xavier AGX).
To do that, we first built an image database featuring the two pea aphid lines, A. pisum
LL01 (green) and YR2 (pink) on broad bean seedlings. We chose A. pisum on broad bean for
the training set because of the color diversity of the aphid lines occurring on the same crop,
which has an optimal leaf surface for plant and aphid detection. Furthermore, we used
young plants for the detection of plants and aphids because of their vulnerability to pest
attacks and their compactness compatible with the regular passage of a robot above the
rows. Thus, this technology could be useful for the detection and neutralization of pests
that can cause serious damage and economic losses to vulnerable young plants, such as
soybean aphids [39] and black cutworms on corn [40], or crops with a compact habit, such
as black bean aphids on sugar beets [41].
The training set of images was annotated using LabelMe (see https://github.com/
wkentaro/labelme, accessed on 9 September 2022) software. Around 700 images were
manually labeled, with each one featuring 20 to 30 aphids. It was necessary to manually
draw a bounding box around each aphid as well as to put a dot on the center of each aphid.
Since we did not have a sufficient amount of images for training the neural networks from
scratch, we fine-tuned our network from a network pre-trained on the MS COCO data set, and we used data augmentation on our images during training to avoid
the phenomenon of over-fitting and to improve the generalization ability of the trained
model to new images. The following image transformations were applied randomly to
generate modified images during training:
• Horizontal or vertical image flip;
• Rotation of 10° in the positive or negative direction;
• Brightness modification;
• Hue and saturation modification;
• Contrast modification;
• Grayscale modification.
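For illustration, an equivalent augmentation pipeline can be written with the albumentations library (a sketch only; Darknet/YOLOv4 applies its own augmentations internally, and the probabilities below are assumptions):

```python
# Sketch of an equivalent augmentation pipeline using the albumentations library.
# Illustration only, not the authors' implementation; parameter values are assumptions.
import albumentations as A

augment = A.Compose(
    [
        A.HorizontalFlip(p=0.5),            # horizontal image flip
        A.VerticalFlip(p=0.5),              # vertical image flip
        A.Rotate(limit=10, p=0.5),          # rotation of +/-10 degrees
        A.RandomBrightnessContrast(p=0.5),  # brightness and contrast modification
        A.HueSaturationValue(p=0.5),        # hue and saturation modification
        A.ToGray(p=0.1),                    # occasional grayscale modification
    ],
    bbox_params=A.BboxParams(format="yolo", label_fields=["class_labels"]),
)

# Usage: augmented = augment(image=img, bboxes=yolo_boxes, class_labels=labels)
```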
We started to develop this localization function with YOLOv4 [38] (whose code is
available at https://github.com/AlexeyAB/darknet, accessed on 9 September 2022) as it
is very efficient in the detection of objects in real-time and is broadly used in applications
requiring fast detection speed, such as in drones or autonomous car vision. YOLOv4 is a
one-stage detector. Its backbone is CSPDarknet53 [42], a model tuned for image recognition.
The CSPNet framework improves the residual structure of the neural network [43] to mitigate the vanishing-gradient problem during back-propagation, to encourage the network to reuse feature maps (with residual layers) at different hierarchies, and to reduce the set of optimization parameters. The secondary part of the network (the neck) features a PANet (path aggregation network) [44] to mix and combine the feature maps generated by the backbone, as a preparation step for the subsequent detection stage. The YOLO detector predicts the class of the object, its bounding box, and the probability of its presence in its bounding box at three different scales. Because of its small feature maps, this kind of multi-scale detector struggles to detect small objects in the image. Thus, in practice, it could not detect aphids located at a distance greater than 30 cm from the camera.
To determine the best choice, we tested several input image sizes (640 × 640 and 512 × 512), knowing that the size of the images acquired by the camera is 2208 × 1242 pixels. We
also optimized the network process using batch inference, which was not yet implemented
in the original YOLOv4 code. To speed up inference, we chose tkDNN (the source code for tkDNN is available on Github: https://github.com/ceccocats/tkDNN, accessed on 9 September 2022), a neural network library specifically designed to run on NVIDIA Jetson boards. It is built on the primitives of cuDNN, NVIDIA's neural network library, and TensorRT, a high-performance inference platform that enables high throughput and low latency for deep learning inference applications. The TensorRT engine optimizes models by quantizing all network layer parameters and converting them into formats that occupy less memory space. The tkDNN library allows inference with network parameters in three different numeric formats: FP32 (a 32-bit floating point representation), FP16 (a 16-bit floating point format), and INT8 (integers represented with 8 bits). In this project, we chose the FP16 format as it is shown to be the most accurate and efficient format (see https://github.com/ceccocats/tkDNN#fps-results, accessed on 9 September 2022). The data degradation caused by the format conversion remained below 0.02%.
We also studied an approach to detect smaller objects without a bounding box based
on the weighted Hausdorff distance [37] (code available on Github: https://github.com/
javiribera/locating-objects-without-bboxes, accessed on 9 September 2022). The chosen
neural network architecture was a U-Net optimized with a loss function based on this
distance. Rather than taking bounding boxes as references (ground truth), the network
seeks to localize objects according to reference points.
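As an illustration of the underlying idea, the (unweighted) averaged Hausdorff distance between predicted and annotated points can be computed as follows (a sketch only; the weighted loss of [37] generalizes this to the probability map produced by the U-Net):

```python
# Sketch of the (unweighted) averaged Hausdorff distance between a set of
# predicted points and a set of ground-truth points. Illustration only, not the
# code used in the project.
import numpy as np
from scipy.spatial.distance import cdist

def averaged_hausdorff(pred_pts, gt_pts):
    """pred_pts, gt_pts: arrays of shape (N, 2) and (M, 2) of pixel coordinates."""
    d = cdist(pred_pts, gt_pts)        # pairwise Euclidean distances, shape (N, M)
    term_pred = d.min(axis=1).mean()   # each prediction to its nearest ground truth
    term_gt = d.min(axis=0).mean()     # each ground truth to its nearest prediction
    return term_pred + term_gt

# Example: two predicted aphid centers vs. three annotated centers (pixels)
pred = np.array([[120.0, 64.0], [300.0, 210.0]])
gt = np.array([[118.0, 66.0], [305.0, 212.0], [50.0, 400.0]])
print(averaged_hausdorff(pred, gt))
```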
Once the bounding box center or an aphid point was well located on the image in the
form of pixel coordinates, it was then possible to convert these points into a 3D reference
using the Software Development Kit of the ZED mini camera. Indeed, the 2D image is taken by only one of the two lenses of the ZED camera, and the other one is used to interpret the points in three dimensions. As the captured images were in RGBXYZ format (a 2D image with depth), we could obtain the XYZ (3D) point-cloud measurement of the neighborhood of any point detected in the RGB (2D) image.
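For illustration, the pixel-to-3D conversion can be performed with the ZED SDK Python API as follows (a sketch under the assumption of a Python integration; the depth mode, units, and example pixel are arbitrary choices, not the project's settings):

```python
# Sketch of the pixel-to-3D conversion with the ZED SDK Python API (pyzed).
# Illustrates the principle only; integration details of the project are not reproduced.
import pyzed.sl as sl

zed = sl.Camera()
init = sl.InitParameters(depth_mode=sl.DEPTH_MODE.ULTRA, coordinate_units=sl.UNIT.METER)
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the ZED camera")

point_cloud = sl.Mat()
if zed.grab() == sl.ERROR_CODE.SUCCESS:
    # Retrieve the XYZRGBA measure aligned with the left RGB image
    zed.retrieve_measure(point_cloud, sl.MEASURE.XYZRGBA)
    u, v = 1104, 621                      # pixel coordinates of a detected aphid (example)
    err, xyzrgba = point_cloud.get_value(u, v)
    if err == sl.ERROR_CODE.SUCCESS:
        x, y, z = xyzrgba[0], xyzrgba[1], xyzrgba[2]  # 3D coordinates in meters
        print(x, y, z)
zed.close()
```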
that orientates two micro-mirrors and permits a ±25° rotation range with a closed-loop resolution finer than 5 µrad and a repeatability of 100 µrad. This system features sensors with a bandwidth of 350 Hz for rotations smaller than 0.1°. The IBVS approach
was chosen because of the properties of the laser-mirror system and the robot operation.
The exact geometric relationship between the target and the mirrors is not known, given the variability of the pest locations and the difficulty of estimating their 3D position with the camera, in contrast to an industrial application where the target can be fixed. Three control laws were studied, consisting of two variants of an adaptive
proportional gain (AG1 and AG2) and a PID. These control laws were implemented in the
visual servoing controller in Figure 5. The PID controller, expressed in z space, was:
$$ C_{\mathrm{PID}}(z) = K_p + K_i\,\frac{T}{1 - z^{-1}} + K_d\,\frac{1 - z^{-1}}{T} \qquad (1) $$

where Kp, Ki, and Kd are constant parameters, and T is the sampling period. The AG1 controller was:

$$ C_{\mathrm{AG1}}[k] = \lambda[k] \quad \text{with} \quad \lambda[k] = \lambda[k-1] + c_1\, e^{-c_2\,\varepsilon[k]^2} \qquad (2) $$

where λ[k] is the adaptive gain, c1 and c2 are constant parameters, and ε[k] is the error signal (ε = pd − pl). The AG2 controller was:

$$ C_{\mathrm{AG2}}[k] = \lambda[k] \quad \text{with} \quad \lambda[k] = (\lambda_0 - \lambda_\infty)\, e^{-\frac{\dot{\lambda}_0}{\lambda_0 - \lambda_\infty}\,\varepsilon[k]^2} + \lambda_\infty \qquad (3) $$

where λ0 (resp. λ∞) is the gain when ε = 0 (resp. ∞), and λ̇0 is the slope of λ(ε) when ε = 0 [49].
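To make the comparison concrete, the three discrete laws of Eqs. (1)-(3) can be sketched in Python as follows (an illustration only: the gains, the sampling period, and the error units are assumptions, not the values tuned on the robot):

```python
# Sketch of the three discrete control laws of Eqs. (1)-(3).
# Gains, sampling period, and error units are illustrative assumptions.
import numpy as np

T = 1.0 / 350.0  # assumed sampling period (s)

class DiscretePID:
    """Eq. (1): C(z) = Kp + Ki*T/(1 - z^-1) + Kd*(1 - z^-1)/T."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0
    def command(self, err):
        self.integral += err * T
        derivative = (err - self.prev_err) / T
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * derivative

def ag1_gain(prev_gain, err, c1=0.5, c2=1e-3):
    """Eq. (2): the gain increases over time, and faster when the error is small."""
    return prev_gain + c1 * np.exp(-c2 * err ** 2)

def ag2_gain(err, lam0=4.0, lam_inf=0.5, slope0=30.0):
    """Eq. (3): gain interpolating between lam0 (err = 0) and lam_inf (err -> infinity)."""
    return (lam0 - lam_inf) * np.exp(-(slope0 / (lam0 - lam_inf)) * err ** 2) + lam_inf

# With the adaptive laws, the mirror command is the adaptive gain times the error,
# e.g.: u = ag2_gain(err) * err
```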
The proposed control scheme runs on the Jetson Xavier board. We first tested the
targeting system with 2D scenes of virtual aphids (to isolate the targeting performance
from the aphid detection performance). We input, in real time, the coordinates of virtual aphids randomly located in the camera images. We did the same to evaluate the performance of the system with moving targets by varying the coordinates of the virtual aphids
in the images. The camera was located on top of a 40 cm radius disk that was explored by
the visual servoing system with a low-power laser beam (see Figure 6). During experiments,
the system orientated the micro-mirrors to make the laser spot, visible on the plane, travel
as quickly as possible to each target.
Figure 7. Representation of the moving-target TSP method (left) and the hybrid method with the
added nearest neighbors variant (right).
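For illustration, the nearest-neighbors variant named in the caption can be sketched as a greedy ordering of the detected targets from the current laser-spot position (an assumption-level sketch, not the hybrid moving-target TSP method of [50]):

```python
# Sketch of a greedy nearest-neighbor ordering of detected aphid targets,
# illustrating the nearest-neighbors variant named in Figure 7. Illustration only.
import numpy as np

def nearest_neighbor_order(start, targets):
    """start: (x, y) laser-spot position; targets: list of (x, y) aphid positions."""
    remaining = [np.asarray(t, dtype=float) for t in targets]
    current = np.asarray(start, dtype=float)
    order = []
    while remaining:
        dists = [np.linalg.norm(t - current) for t in remaining]
        i = int(np.argmin(dists))        # pick the closest remaining target
        current = remaining.pop(i)
        order.append(tuple(current))
    return order

print(nearest_neighbor_order((0, 0), [(5, 1), (1, 1), (3, 4)]))
# -> visits (1, 1), then (3, 4), then (5, 1)
```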
3. Results
A prototype was designed and built with the sole purpose of validating the localization-neutralization loop on a single seedling row (see Figure 8). The experiments performed in the laboratory with this prototype demonstrate the feasibility of detecting different lines of aphids (50% detected at 3 cm/s) and of neutralizing them without impacting the growth of their host plants.
(a) A picture acquired by the camera of the mobile robot; only the area contained in the blue frame is analyzed. The blue squares indicate the location of the detected aphids. (b) The robot spanning a row of broad bean plants featuring aphids (front view).
Figure 8. Detection experimental setup.
input image (2208 × 1242 pixels) to a 512 × 512 pixel image blurred the objects and caused
a reduction in precision.
Table 2. Performance comparison between YOLOv4 and U-Net with Hausdorff distance (U-Net-HD)
for aphid detection. Values in bold indicate the best results for each criterion.
                          YOLOv4      U-Net-HD
FPS (Nvidia Quadro 400)   10–11       2–3
True Positives (TP)       238         278
False Positives (FP)      490         997
False Negatives (FN)      1371        1349
Precision                 0.37        0.15
Recall                    0.21        0.17
To get better accuracy, we also tiled the 800 × 800 pixel area into 4 or 16 sub-parts. The inference speed (frames per second) was computed once the network had processed all these sub-areas. We observed an increase in sensitivity when using 400 × 400 pixel tiles (0.73 versus 0.51 for 800 × 800 pixel tiles). With very small tiles (200 × 200 pixels), the sensitivity fell to 0.53. We also noticed that this method greatly reduces the inference speed of the model because of the large number of images to be processed for each capture.
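The tiling strategy can be sketched as follows (an illustration; the tile size, the absence of overlap, and the detect() stub are assumptions, not the project's implementation):

```python
# Sketch of the tiling strategy: splitting a large capture into fixed-size tiles,
# running the detector on each tile, and mapping detections back to full-image
# coordinates. Tile size and the detect() stub are illustrative assumptions.
import numpy as np

def tile_image(image, tile=400):
    """Yield (x0, y0, tile_array) for non-overlapping tiles covering the image."""
    h, w = image.shape[:2]
    for y0 in range(0, h - tile + 1, tile):
        for x0 in range(0, w - tile + 1, tile):
            yield x0, y0, image[y0:y0 + tile, x0:x0 + tile]

def detect(tile_array):
    """Placeholder for the YOLOv4 inference call; returns (x, y, w, h) boxes in tile coordinates."""
    return []

def detect_tiled(image, tile=400):
    detections = []
    for x0, y0, sub in tile_image(image, tile):
        for (x, y, w, h) in detect(sub):
            detections.append((x + x0, y + y0, w, h))  # shift back to image coordinates
    return detections

# Example: an 800 x 800 crop split into four 400 x 400 tiles
frame = np.zeros((800, 800, 3), dtype=np.uint8)
print(len(list(tile_image(frame))))  # -> 4
```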
We also observed that the code developed in C++ (gcc 7.5.0, Ubuntu 18.04) was more efficient than the Python API (Python 3.6.9, Ubuntu 18.04), being about 1.5 times faster in terms of FPS. The averaged results on our test set are visible
in Table 3. Some results are visible in Figure 9, which shows a single frame from the test set and illustrates the results obtained with YOLOv4: (a) shows the initial image, (b) the known positions of the aphids, and (c) the resulting prediction. One can observe that two aphids were not found while a non-existing one was detected.
Using the tkDNN engine, FP16 floating point precision, and the C++ API on the Jetson Xavier AGX embedded board with cropped images of 320 × 320 pixels, we could then raise the model inference speed from 9 FPS to 19 FPS without any loss of precision. Some resulting videos are visible at https://www.creatis.insa-lyon.fr/~grenier/research/GreenShield/ (accessed on 9 September 2022). This exceeded our real-time detection expectations.
(a) Original image (b) Ground truth image (c) Predicted image
Figure 9. Example of aphid detection with YOLOv4 on a test frame. Readers can observe that the
prediction missed 3 aphids and found a false one.
3.2.2. Targeting
The IBVS algorithm featuring a PID controller obtained the best results versus the two variants of AG: with static virtual targets, it reached a mean response time t5% = 490 ms (vs. 830 and 920 ms for the AG variants; see Figure 10). With moving virtual targets, the PID control also showed slightly better performance, with a mean t5% = 420 ms (vs. 590 and 870 ms for the AG variants). We also performed experiments with target velocities ranging from 5 to 12 pixels/s. We observed that the system did not have time to travel to all targets at 12 pixels/s and above (corresponding approximately to 1 cm/s at a distance of 30 cm). Experiments with blue LED stripes showed a mean response time t5% that greatly increased, to 620 ms.
Figure 10. Evolution of t5% and t∞ response times versus initial error magnitude for the PID, AG1, and AG2 control laws.
4. Discussion
Concerning the detection of aphids, by inferring with 2208 × 1242 pixel resolution
images, we found that the speed of the Hausdorff-optimized U-Net detection was not
suitable for real-time application. As the images were captured at 15 FPS by the ZED mini
camera, and this network can process only 2 or 3 images per second, many images were
not taken into account when the robot was moving. This was due to the excessive depth
of this model, which was set to increase the ability to detect small objects. This indeed
resulted in a gain in precision but a loss in efficiency. Moreover, the accuracy of U-Net was not better than that of YOLOv4 (0.15 vs. 0.21 for YOLOv4), as it generated many more false detections (997 false positives versus 490 for YOLOv4). We then
simplified the architecture of U-Net by reducing the number of layers and by training it
with cropped images of smaller sizes. However, its accuracy remained poor compared to that of the Darknet YOLOv4 model; the simplified U-Net failed to achieve 10% accuracy on cropped areas of 400 × 400 pixels. Taking these results into account, we therefore decided to use YOLOv4 for the detection of aphids, with the aforementioned accuracy and speed optimizations.
Concerning the targeting process, which aims at sending the laser spot onto detected aphids, we obtained only medium performance in 2D scenes, with mean travel times of 620 ms between two targets. This is far too slow, and it was moreover obtained in laboratory conditions with virtual aphids, knowing that plants infested with tens of aphids should be treated faster to enable a 1 ha field treatment in 24 h at 29 cm/s. Speed improvements should be researched for a robot spanning several rows with several detection/neutralization modules working in parallel. Faster micro-mirrors should be envisaged, taking into account the cost and affordability of such a robot.
Additionally, to perform a precise localization of the laser spot, it was necessary to work in a dark environment, while aphid detection required good lighting. We thus had to alternate detection and targeting and synchronize the lighting so that each detection was performed in the best lighting conditions. In the robot, as we employed LED stripes, we could efficiently perform this switching, but these working conditions decreased the
operational flow. On-robot experiments showed that the performance decreases as soon
as the lighting conditions deteriorate (which may be the case during sunny days with
external rays of light coming through the entrance for plants) and when 3D objects are
present in the scene, necessitating stopping the robot. Indeed, in 3D scenes, the spot was
sometimes not visible due to the non-alignment of the laser source and the camera axis
and the presence of leaves in different depths of field. The algorithm had to work blindly
during the time the spot remained invisible. When this period was too long, the system
skipped this target and selected the next one. This greatly slowed the servoing and reduced
the performance. Therefore, during the experiments, to decrease these periods, even if we
could target aphids with the robot moving, we preferred to stop it during the targeting
phases. A study enhancing the robustness of visual servoing in the presence of occlusions
is mandatory.
Finally, the Newscale micro-mirrors are too fragile for farming conditions. Other, more robust solutions have to be studied, knowing that we have not yet studied the impact of vibrations due to the locomotion of the robot on the ground. Regrettably, we could not validate (for
logistical reasons) the whole chain simultaneously (from aphid detection to neutralization
with the power laser): detection/localization/targeting has been validated independently
from targeting/neutralization. Globally, we could make the robot proceed at a velocity of
3 cm/s and have it detect 50% of the present aphids, then target 83% of the detected aphids
on 3 successive broad bean plants, with each one featuring 8 aphids. This is far from the
requirements but it provides the first proof of concept validating the feasibility of such
an approach.
Regarding aphid neutralization, we found that the CO2 laser is best suited to neutralize
almost all aphids with low fluences by raising their cell water temperature with LWIR at
10.6 µm. Indeed, the energy needed to reach an LD90 after one day in an adult population, equal to 12.91 J·cm−2 per shot, was 4 times lower than for the other lasers tested. Additionally,
targeting younger aphids, such as N1 nymphs, and extending the observation period by a few days can lower the required energy ten-fold, down to 1.15 J·cm−2. Keller et al. found a value
quite similar for mosquitoes using the same wavelength (1.1 J·cm−2 ) [27]. Additionally,
they concluded that the LD90 value is roughly stable when changing the beam size. These
results and our estimates of energy consumed for a field treatment detailed in our previous
work [51] show that the laser is not the most power-consuming component in our robot.
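For a sense of scale (an illustrative calculation only; the 2 mm spot diameter is an assumption, not a parameter reported here), a fluence of 1.15 J·cm−2 corresponds to a per-shot energy of

$$ E = F \times A = 1.15\ \text{J·cm}^{-2} \times \pi\,(0.1\ \text{cm})^2 \approx 36\ \text{mJ}. $$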
However, the detection must be very precise to target aphids at the youngest stages and
to differentiate them from other insects that are beneficial to crops. Moreover, most aphids are located on the abaxial (lower) leaf surface and are thus not visible to the camera, which films the leaves from above. We thought of putting the camera in a lower position, aiming upwards, but we fear that such a position would be risky in the case of obstacles and more sensitive to dust accumulation. Another (more interesting) approach would consist of agitating the leaves using airflow or a mechanical system so that the aphids drop off (feigning death). We would then just have to detect and shoot them on the ground.
Finally, in the case of a false positive detection, we verified that hitting leaves would not impact plant growth [51].
5. Conclusions
The experimental results (a video is available at https://www.creatis.insa-lyon.fr/
~grenier/research/GreenShield/, accessed on 9 September 2022) globally demonstrate, in
lab conditions, the feasibility of detecting different lines of aphids (50% detected at 3 cm/s)
and of neutralizing them by CO2 laser shots with high efficiency (90% mortality after 3 days
with 1.15 J·cm−2 for A. pisum LL01 N1) without impacting the growth of their host plants in
case of a missed shot. Showing the feasibility of this approach is encouraging as aphids are
one of the most difficult crop pests to combat. Thanks to the dark environment reproduced under the robot, the latter could be used particularly on vulnerable young plants such as soybeans and corn or on crops with a compact habit such as sugar beets. However, aphid
neutralization is closely dependent on the quality of the detection, since it is recommended
to detect aphids at the nymphal stage and to differentiate them from the beneficial insects
that one wishes to preserve on the crops. Other, larger pests, such as fall armyworm (Spodoptera frugiperda) and dark sword grass (Agrotis ipsilon) larvae, should pose fewer detection issues. Future directions consist of enhancing the performance of the detection
and targeting phases to obtain characteristics closer to the requirements, making it useful
in a real farming context. Additionally, the robustness concerning field conditions is to
be studied. The most worrying aspect is the presence of vibrations due to the irregular ground, which may degrade the positioning performance of the micro-mirrors and thus the quality of the pest targeting.
Author Contributions: Conceptualization, A.H., A.L., B.M., F.G.F., K.R., P.D.S., M.T.P., R.M., T.G. and
V.L.; methodology, A.L., B.M., F.G.F., K.R., M.T.P., P.D.S., R.M., S.P., T.G. and V.L.; software, J.D.S.,
T.N., V.D. and V.N.; validation, A.L., B.M., F.G.F., K.R., M.T.P., P.D.S., R.M., S.P., T.G., V.L. and V.N.;
formal analysis, B.M., K.R., M.T.P., P.D.S., R.G., R.M., S.P., T.N. and V.L.; investigation, F.G.F., J.D.S.,
R.G., S.P., T.N., V.D., V.L. and V.N.; resources, K.R., V.D., V.L. and S.P.; data curation, J.D.S., R.G.,
S.P., V.D. and V.L.; writing—original draft preparation, A.L., B.M., F.G.F., J.D.S., K.R., M.T.P., P.D.S.,
R.M., R.G., S.P., T.G., T.N., V.D., V.L. and V.N.; writing—review and editing, A.L., B.M., K.R., M.T.P.,
P.D.S., R.G., R.M., S.P., T.G. and V.L.; visualization, J.D.S., K.R., P.D.S., R.G., S.P., T.G., V.D. and V.L.;
supervision, A.L., B.M., F.G.F., K.R., M.T.P., P.D.S., R.M. and T.G.; project administration, A.H., A.L.,
B.M., F.G.F. and K.R.; funding acquisition, A.H., A.L., B.M., F.G.F. and K.R. All authors have read and
agreed to the published version of the manuscript.
Funding: This research was funded by ANR grant number ANR-17-CE34-0012.
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: The data about aphids presented in this study are available in [51]. The
data about AI-based detection are available at https://www.creatis.insa-lyon.fr/~grenier/research/
GreenShield/aphid_V3.zip.
Conflicts of Interest: The authors declare no conflict of interest.
Abbreviations
The following abbreviations are used in this manuscript:
References
1. RISE Foundation. Crop Protection & the EU Food System: Where Are They Going, 1st ed.; RISE Foundation: Brüssel, Belgium, 2020.
2. Pesticide Action Network Europe. Endocrine Disrupting Pesticides in European Food; Pesticide Action Network Europe: Brussels,
Belgium, 2017.
3. Tang, F.H.M.; Lenzen, M.; McBratney, A.; Maggi, F. Risk of pesticide pollution at the global scale. Nat. Geosci. 2021, 14, 206–210.
4. Ellis, C.; Park, K.J.; Whitehorn, P.; David, A.; Goulson, D. The Neonicotinoid Insecticide Thiacloprid Impacts upon Bumblebee
Colony Development under Field Conditions. Environ. Sci. Technol. 2017, 51, 1727–1732. https://doi.org/10.1021/acs.est.6b04791.
5. Saiz-Rubio, V.; Rovira-Más, F. From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management. Agronomy
2020, 10, 207. https://doi.org/10.3390/agronomy10020207.
6. Phasinam, K.; Kassanuk, T.; Shabaz, M. Applicability of internet of things in smart farming. J. Food Qual. 2022, 2022, 7692922.
7. Vougioukas, S.G. Agricultural Robotics. Annu. Rev. Control. Robot. Auton. Syst. 2019, 2, 365–392. https://doi.org/10.1146/annurev-
control-053018-023617.
8. Mavridou, E.; Vrochidou, E.; Papakostas, G.A.; Pachidis, T.; Kaburlasos, V.G. Machine Vision Systems in Precision Agriculture for
Crop Farming. J. Imaging 2019, 5, 89. https://doi.org/10.3390/jimaging5120089
9. Meshram, A.T.; Vanalkar, A.V.; Kalambe, K.B.; Badar, A.M. Pesticide spraying robot for precision agriculture: A categorical
literature review and future trends. J. Field Robot. 2022, 39, 153–171. https://doi.org/10.1002/rob.22043.
10. Žibrat, U.; Knapič, M.; Urek, G. Plant pests and disease detection using optical sensors/Daljinsko zaznavanje rastlinskih bolezni
in škodljivcev. Folia Biol. Geol. 2019, 60, 41–52.
11. Mahlein, A.K.; Kuska, M.T.; Behmann, J.; Polder, G.; Walter, A. Hyperspectral sensors and imaging technologies in phytopathology:
state of the art. Annu. Rev. Phytopathol. 2018, 56, 535–558.
12. Lacotte, V.; Peignier, S.; Raynal, M.; Demeaux, I.; Delmotte, F.; da Silva, P. Spatial–Spectral Analysis of Hyperspec-
tral Images Reveals Early Detection of Downy Mildew on Grapevine Leaves. Int. J. Mol. Sci. 2022, 23, 10012.
https://doi.org/10.3390/ijms231710012.
13. Haff, R.P.; Saranwong, S.; Thanapase, W.; Janhiran, A.; Kasemsumran, S.; Kawano, S. Automatic image analysis and spot
classification for detection of fruit fly infestation in hyperspectral images of mangoes. Postharvest Biol. Technol. 2013, 86, 23–28.
https://doi.org/10.1016/j.postharvbio.2013.06.003.
14. Johnson, J.B.; Naiker, M. Seeing red: A review of the use of near-infrared spectroscopy (NIRS) in entomology. Appl. Spectrosc. Rev.
2020, 55, 810–839. https://doi.org/10.1080/05704928.2019.1685532.
15. Lima, M.; Leandro, M.E.; Pereira, L.; Valero, C.; Gonçalves Bazzo, C. Automatic Detection and Monitoring of Insect Pests—A
Review. Agriculture 2020, 10, 161. https://doi.org/10.3390/agriculture10050161.
16. Martineau, M.; Conte, D.; Raveaux, R.; Arnault, I.; Munier, D.; Venturini, G. A survey on image-based insect classification. Pattern
Recognit. 2017, 65, 273–284. https://doi.org/10.1016/j.patcog.2016.12.020.
17. Xie, C.; Wang, H.; Shao, Y.; He, Y. Different algorithms for detection of malondialdehyde content in eggplant leaves stressed by
grey mold based on hyperspectral imaging technique. Intell. Autom. Soft Comput. 2015, 21, 395–407.
18. Li, R.; Wang, R.; Xie, C.; Liu, L.; Zhang, J.; Wang, F.; Liu, W. A coarse-to-fine network for aphid recognition and detection in the
field. Biosyst. Eng. 2019, 187, 39–52.
19. Ebrahimi, M.A.; Khoshtaghaza, M.H.; Minaei, S.; Jamshidi, B. Vision-based pest detection based on SVM classification method.
Comput. Electron. Agric. 2017, 137, 52–58. https://doi.org/10.1016/j.compag.2017.03.016.
20. Asefpour Vakilian, K.; Massah, J. Performance evaluation of a machine vision system for insect pests identification of field crops
using artificial neural networks. Arch. Phytopathol. Plant Prot. 2013, 46, 1262–1269. https://doi.org/10.1080/03235408.2013.763620.
21. Rupanagudi, S.R.; Ranjani, B.S.; Nagaraj, P.; Bhat, V.G.; Thippeswamy, G. A novel cloud computing based smart farming system for early detection of borer insects in tomatoes. In Proceedings of the 2015 International Conference on Communication, Information & Computing Technology (ICCICT), Mumbai, India, 15–17 January 2015; pp. 1–6. https://doi.org/10.1109/ICCICT.2015.7045722.
22. Srisuphab, A.; Silapachote, P.; Tantratorn, W.; Krakornkul, P.; Darote, P. Insect Detection on an Unmanned Ground Rover.
In Proceedings of the TENCON 2018—2018 IEEE Region 10 Conference, Jeju, Korea, 28–31 October 2018; pp. 0954–0959.
https://doi.org/10.1109/TENCON.2018.8650312.
23. Li, Y.; Xia, C.; Lee, J. Vision-based pest detection and automatic spray of greenhouse plant. In Proceedings of the 2009 IEEE Interna-
tional Symposium on Industrial Electronics, Seoul, Korea, 5–8 July 2009; pp. 920–925. https://doi.org/10.1109/ISIE.2009.5218251.
24. Vibhute, A.S.; Tate Deshmukh, K.R.; Hindule, R.S.; Sonawane, S.M. Pest Management System Using Agriculture Robot. In
Techno-Societal 2020; Pawar, P.M., Balasubramaniam, R., Ronge, B.P., Salunkhe, S.B., Vibhute, A.S., Melinamath, B., Eds.; Springer
International Publishing: Cham, Switzerland, pp. 829–837. https://doi.org/10.1007/978-3-030-69925-3_79.
25. Drees, B.M.; Leroy, T.R. Evaluation of alternative methods for suppression of crape myrtle aphids. In Upper Coast 1990–1991
Entomological Result Demonstration Handbook, Texas A&M University System Edition; Texas Agricultural Extension Service: Tamu,
TX, USA 1991; pp. 21–22.
26. Kusakari, S.i.; Okada, K.; Shibao, M.; Toyoda, H. High Voltage Electric Fields Have Potential to Create New Physical Pest Control
Systems. Insects 2020, 11, 447. https://doi.org/10.3390/insects11070447.
27. Keller, M.D.; Leahy, D.J.; Norton, B.J.; Johanson, T.; Mullen, E.R.; Marvit, M.; Makagon, A. Laser induced mortality of Anopheles
stephensi mosquitoes. Sci. Rep. 2016, 6, 20936. https://doi.org/10.1038/srep20936.
28. Obasekore, H.; Fanni, M.; Ahmed, S.M. Insect Killing Robot for Agricultural Purposes. In Proceedings of the 2019 IEEE/ASME
International Conference on Advanced Intelligent Mechatronics (AIM), Hong Kong, China, 8–12 July 2019; pp. 1068–1074.
https://doi.org/10.1109/AIM.2019.8868507.
29. Wu, X.; Aravecchia, S.; Lottes, P.; Stachniss, C.; Pradalier, C. Robotic weed control using automated weed and crop classification.
J. Field Robot. 2020, 37, 322–340. https://doi.org/10.1002/rob.21938.
30. Kaierle, S.; Marx, C.; Rath, T.; Hustedt, M. Find and Irradiate—Lasers Used for Weed Control. Laser Tech. J. 2013, 10, 44–47.
https://doi.org/10.1002/latj.201390038.
31. Asha, K.; Mahore, A.; Malkani, P.; Singh, A.K. Robotics-automation and sensor-based approaches in weed detection and control:
A review. Int. J. Chem. Stud. 2020, 8, 542–550.
32. Fuad, M.T.H.; Fime, A.A.; Sikder, D.; Iftee, M.A.R.; Rabbi, J.; Al-Rakhami, M.S.; Gumaei, A.; Sen, O.; Fuad, M.; Is-
lam, M.N. Recent Advances in Deep Learning Techniques for Face Recognition. IEEE Access 2021, 9, 99112–99142.
https://doi.org/10.1109/ACCESS.2021.3096136.
33. Minaee, S.; Boykov, Y.; Porikli, F.; Plaza, A.; Kehtarnavaz, N.; Terzopoulos, D. Image Segmentation Using Deep Learning: A
Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 44, 3523–3542. https://doi.org/10.1109/TPAMI.2021.3059968.
34. Minks, A.K.; Harrewijn, P. Aphids: Their Biology, Natural Enemies, and Control; Elsevier: Amsterdam, The Netherlands, 1987.
35. Simonet, P.; Duport, G.; Gaget, K.; Weiss-Gayet, M.; Colella, S.; Febvay, G.; Charles, H.; Viñuelas, J.; Heddi, A.; Calevro, F. Direct
flow cytometry measurements reveal a fine-tuning of symbiotic cell dynamics according to the host developmental needs in
aphid symbiosis. Sci. Rep. 2016, 6, 19967. https://doi.org/10.1038/srep19967.
36. Quigley, M.; Conley, K.; Gerkey, B.; Faust, J.; Foote, T.; Leibs, J.; Berger, E.; Wheeler, R.; Ng, A.Y. ROS: An open-source Robot
Operating System. In ICRA Workshop on Open Source Software; IEEE: Kobe, Japan, 2009; Volume 3, p. 5.
37. Ribera, J.; Güera, D.; Chen, Y.; Delp, E.J. Locating Objects Without Bounding Boxes. In Proceedings of the Computer Vision and
Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019.
38. Bochkovskiy, A.; Wang, C.Y.; Liao, H.Y.M. Yolov4: Optimal speed and accuracy of object detection. arXiv 2020, arXiv:2004.10934.
39. Wang, X.; Fang, Y.; Lin, Z.; Zhang, L.; Wang, H. A Study on the Damage and Economic Threshold of the Soybean Aphid at the
Seedling Stage. Plant Prot. 1994, 20, 12–13.
40. Showers, W.B.; Von Kaster, L.; Mulder, P.G. Corn Seedling Growth Stage and Black Cutworm (Lepidoptera: Noctuidae) Damage
1. Environ. Entomol. 1983, 12, 241–244. https://doi.org/10.1093/ee/12.1.241.
41. Hurej, M.; Werf, W.V.D. The influence of black bean aphid, Aphis fabae Scop., and its honeydew on leaf growth and dry matter
production of sugar beet. Ann. Appl. Biol. 1993, 122, 201–214. https://doi.org/10.1111/j.1744-7348.1993.tb04027.x.
42. Wang, C.Y.; Liao, H.Y.M.; Yeh, I.H.; Wu, Y.H.; Chen, P.Y.; Hsieh, J.W. CSPNet: A New Backbone that can Enhance Learning
Capability of CNN. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops
(CVPRW), Seattle, WA, USA, 14–19 June 2020
43. He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on
Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
44. Liu, S.; Qi, L.; Qin, H.; Shi, J.; Jia, J. Path Aggregation Network for Instance Segmentation. In Proceedings of the 2018 IEEE/CVF
Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018.
45. Hutchinson, S.; Hager, G.; Corke, P. A tutorial on visual servo control. IEEE Trans. Robot. Autom. 1996, 12, 651–670.
https://doi.org/10.1109/70.538972.
46. Andreff, N.; Tamadazte, B. Laser steering using virtual trifocal visual servoing. Int. J. Robot. Res. 2016, 35, 672–694.
https://doi.org/10.1177/0278364915585585.
47. Kudryavtsev, A.V.; Chikhaoui, M.T.; Liadov, A.; Rougeot, P.; Spindler, F.; Rabenorosoa, K.; Burgner-Kahrs, J.; Tamadazte,
B.; Andreff, N. Eye-in-Hand Visual Servoing of Concentric Tube Robots. IEEE Robot. Autom. Lett. 2018, 3, 2315–2321.
https://doi.org/10.1109/LRA.2018.2807592.
48. Keller, M.D.; Norton, B.J.; Farrar, D.J.; Rutschman, P.; Marvit, M.; Makagon, A. Optical tracking and laser-induced mortality of
insects during flight. Sci. Rep. 2020, 10, 14795.
49. Lagadic Team. ViSP Tutorial: How to Boost Your Visual Servo Control Law. 2021. Available online: https://visp-doc.inria.fr/doxygen/visp-2.9.0/tutorial-boost-vs.html (accessed on 28 April 2021).
50. Helvig, C.S.; Robins, G.; Zelikovsky, A. Moving-Target TSP and Related Problems. In Algorithms—ESA '98; Bilardi, G., Italiano, G.F., Pietracaprina, A., Pucci, G., Eds.; Springer: Berlin/Heidelberg, Germany, 1998; pp. 453–464.
51. Gaetani, R.; Lacotte, V.; Dufour, V.; Clavel, A.; Duport, G.; Gaget, K.; Calevro, F.; Da Silva, P.; Heddi, A.; Vincent, D.; et al.
Sustainable laser-based technology for insect pest control. Sci. Rep. 2021, 11, 11068. https://doi.org/10.1038/s41598-021-90782-7.
52. Hori, M.; Shibuya, K.; Sato, M.; Saito, Y. Lethal effects of short-wavelength visible light on insects. Sci. Rep. 2015, 4, 7383.
https://doi.org/10.1038/srep07383.