ABSTRACT
The development of navigation and tracking of objects using unmanned aerial vehicles (UAVs) has advanced considerably. The technology is applied across many fields, including search and rescue over both small and large areas. In this research, the UAV is a camera-based multirotor for human detection, developed and integrated into a fully autonomous system. The first stage is designing the controller; the second is integrating a companion computer, with the OpenCV library and the HOG and SVM methods installed, on the quadcopter. The system presented in this paper is able to fly stably via a PID controller; the output responses of roll, pitch, and yaw have an overshoot of 0.5%. It identifies a human-body target in real time and identifies the target location relative to the Earth's compass directions, with average test errors of latitude 1 = 0.026%, longitude 1 = 0.034%, latitude 2 = 0.016%, and longitude 2 = 0.00168%.
The Odroid XU4 is needed to control the Pixhawk in order to perform actions such as automatic vertical takeoff, altitude hold for 30 seconds, flight to the specified waypoint, and auto-landing. The serial UART driver module is required for Odroid XU4 to Pixhawk communication through MAVLink. The PID parameters are obtained from the S-shaped reaction curve, which consists of the time constant T and the delay time L.
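As an illustration of this control chain, the mission described above could be scripted on the companion computer with the pymavlink library. The following is a minimal sketch, not the code used in this work; the serial device, baud rate, and mode names are assumptions, and the waypoint reuses the coordinate point from the detection tests later in the paper.

```python
# Minimal sketch of the Odroid XU4 -> Pixhawk mission over MAVLink.
# Serial device, baud rate and mode names are assumptions.
import time
from pymavlink import mavutil

# Serial UART link from the companion computer to the Pixhawk
master = mavutil.mavlink_connection('/dev/ttyUSB0', baud=57600)
master.wait_heartbeat()            # wait until the autopilot responds

master.set_mode('GUIDED')          # accept offboard position commands
master.arducopter_arm()
master.motors_armed_wait()

# Auto vertical takeoff to the 6 m set-point used in this paper
master.mav.command_long_send(
    master.target_system, master.target_component,
    mavutil.mavlink.MAV_CMD_NAV_TAKEOFF, 0,
    0, 0, 0, 0, 0, 0, 6.0)

time.sleep(30)                     # altitude hold for 30 seconds

# Fly to the specified waypoint (coordinate point from the detection tests)
master.mav.set_position_target_global_int_send(
    0, master.target_system, master.target_component,
    mavutil.mavlink.MAV_FRAME_GLOBAL_RELATIVE_ALT_INT,
    0b0000111111111000,            # use only the position fields
    int(-5.3599526 * 1e7), int(105.3103595 * 1e7), 6,
    0, 0, 0, 0, 0, 0, 0, 0)

time.sleep(16)                     # mission-leg duration from the tests
master.set_mode('LAND')            # autoland
```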
[Fig. 4 shows the HOG extraction pipeline: input image → normalize gamma and colour → compute gradients → weighted vote into spatial and orientation cells → contrast-normalize over overlapping spatial blocks → collect HOGs over the detection window → linear SVM → person/non-person classification.]

Fig. 4. HOG Method Extraction Process
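OpenCV ships a pre-trained linear SVM for exactly this HOG pipeline. As a sketch only (the camera index and the winStride and scale values are assumed tuning parameters, not necessarily those used in this work), a webcam-based person detector could look like this:

```python
# Minimal sketch: OpenCV's HOG descriptor with its pre-trained
# people-detecting linear SVM, run on webcam frames.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)          # webcam index is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Slide the detection window over the frame at multiple scales
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
    for (x, y, w, h) in boxes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow('person detection', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
```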
non-humans. This method works when the HOG framework in OpenCV counts detections from the video captured by the webcam. In this process, the SVM serves as a separator of two classes in the input space, between -1 and +1. Class -1 is symbolized by a red box and class +1 by a yellow circle. The basic principle of the SVM is that of a linear classifier, as can be seen in Fig. 5 [7].

Fig. 5. The SVM process finds the best hyperplane to separate class -1 and class +1
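To make the two-class separation concrete, the sketch below fits a linear SVM to toy 2-D data with scikit-learn; the data, labels, and test point are invented for illustration and are not the detector used on the quadcopter.

```python
# Toy illustration of the SVM principle in Fig. 5: fit the linear
# hyperplane that best separates class -1 from class +1.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
class_neg = rng.normal(loc=[-2, -2], scale=0.8, size=(50, 2))  # class -1 (red boxes)
class_pos = rng.normal(loc=[+2, +2], scale=0.8, size=(50, 2))  # class +1 (yellow circles)

X = np.vstack([class_neg, class_pos])
y = np.array([-1] * 50 + [+1] * 50)

clf = SVC(kernel='linear').fit(X, y)
# w.x + b = 0 is the separating hyperplane
w, b = clf.coef_[0], clf.intercept_[0]
print('hyperplane normal:', w, 'bias:', b)
print('prediction for (1.5, 2.0):', clf.predict([[1.5, 2.0]])[0])
```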
E. Mathematical Model of Quadcopter
The mathematical modelling of the quadcopter essentially refers to the quadcopter's motion, i.e. its degrees of freedom. The quadcopter has 6 Degrees of Freedom (DoF), divided between two reference frames: the E-Frame, the fixed Earth axis, and the B-Frame, the quadcopter's moving axis, as can be seen in Fig. 6 [8][9]. The B-Frame and E-Frame determine the linear position ($r^E$) and angular position ($\Theta^E$) of the quadcopter:

$$r^E = [X \;\; Y \;\; Z]^T \qquad (1)$$

$$\Theta^E = [\phi \;\; \theta \;\; \psi]^T \qquad (2)$$

When flying, the quadcopter generates a rotation; the B-Frame is transformed into the E-Frame using the transfer matrix $T_\theta$:

$$\begin{bmatrix} p \\ q \\ r \end{bmatrix} = T_\theta^{-1} \begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix}, \qquad (3)$$

where the transformation matrices between the B-Frame and the E-Frame (with s, c, and t abbreviating sine, cosine, and tangent) are

$$T_\theta^{-1} = \begin{bmatrix} 1 & 0 & -s_\theta \\ 0 & c_\phi & c_\theta s_\phi \\ 0 & -s_\phi & c_\phi c_\theta \end{bmatrix} \qquad (4)$$

$$T_\theta = \begin{bmatrix} 1 & s_\phi t_\theta & c_\phi t_\theta \\ 0 & c_\phi & -s_\phi \\ 0 & s_\phi / c_\theta & c_\phi / c_\theta \end{bmatrix} \qquad (5)$$

Equations (6)-(11) give the general quadcopter equations of motion derived from the Newton-Euler method:

$$\ddot{X} = \frac{U_1}{m}\,(\sin\psi\sin\phi + \cos\psi\sin\theta\cos\phi) \qquad (6)$$

$$\ddot{Y} = \frac{U_1}{m}\,(-\cos\psi\sin\phi + \sin\psi\sin\theta\cos\phi) \qquad (7)$$

$$\ddot{Z} = -g + \frac{U_1}{m}\,\cos\theta\cos\phi \qquad (8)$$

$$\dot{p} = \frac{I_{YY} - I_{ZZ}}{I_{XX}}\,qr - \frac{J_{TP}}{I_{XX}}\,q\Omega + \frac{U_2}{I_{XX}} \qquad (9)$$

$$\dot{q} = \frac{I_{ZZ} - I_{XX}}{I_{YY}}\,pr - \frac{J_{TP}}{I_{YY}}\,p\Omega + \frac{U_3}{I_{YY}} \qquad (10)$$

$$\dot{r} = \frac{I_{XX} - I_{YY}}{I_{ZZ}}\,pq + \frac{U_4}{I_{ZZ}} \qquad (11)$$
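Equations (3)-(5) can be transcribed directly into numpy. The sketch below is an illustration only; the attitude angles and Euler rates are hypothetical example values.

```python
# Transcription of eqs. (3)-(5): Euler-angle rates <-> body rates.
import numpy as np

def T_theta_inv(phi, theta):
    """Matrix of eq. (4): maps Euler rates [phi., theta., psi.] to body rates [p, q, r]."""
    s, c = np.sin, np.cos
    return np.array([
        [1, 0,           -s(theta)],
        [0, c(phi),  c(theta) * s(phi)],
        [0, -s(phi), c(phi) * c(theta)],
    ])

def T_theta(phi, theta):
    """Matrix of eq. (5): maps body rates [p, q, r] back to Euler rates."""
    s, c, t = np.sin, np.cos, np.tan
    return np.array([
        [1, s(phi) * t(theta), c(phi) * t(theta)],
        [0, c(phi),            -s(phi)],
        [0, s(phi) / c(theta), c(phi) / c(theta)],
    ])

# Example: body rates for a pure roll-rate command at small attitude angles
phi, theta = 0.1, 0.05                  # rad, hypothetical attitude
euler_rates = np.array([0.5, 0.0, 0.0]) # [phi., theta., psi.] in rad/s
p, q, r = T_theta_inv(phi, theta) @ euler_rates   # eq. (3)
print(p, q, r)
```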
The inputs of the general quadcopter equations (6)-(11) are determined from equations (12)-(16), which express them in terms of the propeller velocities; Ω is the overall residual propeller speed that appears in the gyroscopic disturbance terms:

$$U_1 = b\,(\Omega_1^2 + \Omega_2^2 + \Omega_3^2 + \Omega_4^2) \qquad (12)$$

$$U_2 = b\,l\,(-\Omega_2^2 + \Omega_4^2) \qquad (13)$$

$$U_3 = b\,l\,(-\Omega_1^2 + \Omega_3^2) \qquad (14)$$

$$U_4 = d\,(-\Omega_1^2 + \Omega_2^2 - \Omega_3^2 + \Omega_4^2) \qquad (15)$$

$$\Omega = -\Omega_1 + \Omega_2 - \Omega_3 + \Omega_4 \qquad (16)$$
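Equations (6)-(8) and (12)-(16) can be checked numerically. In the sketch below, the thrust factor b, drag factor d, and arm length l are hypothetical values (only the mass comes from Table 2); the hover check confirms that equal propeller speeds with U1 = mg give zero acceleration.

```python
# Numeric sketch of eqs. (6)-(8) and (12)-(16). The thrust factor b,
# drag factor d and arm length l are hypothetical; m is from Table 2.
import numpy as np

b, d, l = 3.13e-5, 7.5e-7, 0.17   # hypothetical thrust/drag factors, arm length (m)
m, g = 1.82, 9.81                 # mass from Table 2, gravity

def inputs(o1, o2, o3, o4):
    """Propeller speeds (rad/s) -> U1..U4 and residual speed, eqs. (12)-(16)."""
    U1 = b * (o1**2 + o2**2 + o3**2 + o4**2)
    U2 = b * l * (-o2**2 + o4**2)
    U3 = b * l * (-o1**2 + o3**2)
    U4 = d * (-o1**2 + o2**2 - o3**2 + o4**2)
    omega = -o1 + o2 - o3 + o4
    return U1, U2, U3, U4, omega

def accelerations(U1, phi, theta, psi):
    """Translational accelerations, eqs. (6)-(8)."""
    sp, cp = np.sin(phi), np.cos(phi)
    st, ct = np.sin(theta), np.cos(theta)
    ss, cs = np.sin(psi), np.cos(psi)
    Xdd = (U1 / m) * (ss * sp + cs * st * cp)
    Ydd = (U1 / m) * (-cs * sp + ss * st * cp)
    Zdd = -g + (U1 / m) * (ct * cp)
    return Xdd, Ydd, Zdd

# Hover check: equal speeds make U2 = U3 = U4 = 0 and Zdd ~ 0 when U1 = m*g
hover_speed = np.sqrt(m * g / (4 * b))
U1, U2, U3, U4, omega = inputs(*([hover_speed] * 4))
print(accelerations(U1, 0.0, 0.0, 0.0))   # ~ (0, 0, 0)
```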
After the transfer function is acquired from the general equations (6)-(11), the pitch, roll, and yaw parts are taken to be controlled using PID. The tuning of the PID (proportional-integral-derivative) controller parameters is always based on a review of the characteristics of the regulated plant.

Fig. 8. Complete Hardware of Quadcopter

F. Physical parameters identification
The moment of inertia calculation is necessary to obtain the values of Ixx, Iyy, and Izz. The moment of inertia requires the quadcopter's physical parameter specifications, which can be seen in Table 2.

Table 2. Physical parameters of the quadcopter
Name        Value
LxF (m)     0.16
LyF (m)     0.17
LxB (m)     0.16
LyB (m)     0.17
Mass (kg)   1.82

Parameter identification uses the experimental method: the characteristics of the quadcopter plant are measured physically, and the moment of inertia is obtained with the trifilar pendulum method, which is chosen because it is very easy to implement. Measurements are performed on the X, Y, and Z axes.
After that, the moment of inertia is calculated with equation (17):

$$I_{xx,yy,zz} = \frac{M \, g \, R^2 \, T_{x,y,z}^2}{4 \pi^2 L} \qquad (17)$$

Finally, the resulting moments of inertia can be seen in Table 3.

Table 3. Moment of inertia (kg·m²)
Ixx   0.0232
Iyy   0.0228
Izz   0.0298
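As a worked check of equation (17), the sketch below computes Ixx for a trifilar pendulum; the mass M is from Table 2, while the radius R, filament length L, and measured period T are assumed example values chosen only to illustrate the magnitudes in Table 3.

```python
# Worked check of eq. (17): trifilar-pendulum moment of inertia.
# M is from Table 2; R (attachment radius), L (filament length) and
# the measured period T are hypothetical example values.
import math

M, g = 1.82, 9.81     # kg, m/s^2 (mass from Table 2)
R, L = 0.20, 1.00     # m, hypothetical pendulum geometry
T_x = 1.13            # s, hypothetical measured oscillation period about X

I_xx = (M * g * R**2 * T_x**2) / (4 * math.pi**2 * L)
print(f'Ixx = {I_xx:.4f} kg.m^2')   # ~0.0231, consistent with Table 3
```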
Fig. 10. Response of PID controller

Table 4. The values of PID
Name    P      I       D      Overshoot
Roll    1.05   6.562   0.042  6%
Pitch   1.05   6.562   0.042  5.5%
Yaw     1.44   14.4    0.036  8%

B. Implementation of Auto Take-off and Landing
The experiment was carried out by giving the auto take-off and landing command to the Odroid XU4, which had been connected to the Pixhawk. In Figure 3.5, there are three parts of the quadcopter's condition: the first is the quadcopter hovering for 10 seconds, the second is the quadcopter running the mission to the specified waypoint for 16 seconds, and the third is the quadcopter landing.

C. Object Detection
The following are test results from the autonomous quadcopter program, run with remote control through the PuTTY and VNC Viewer software on a PC. All results are displayed in Figure 3.6 with the parameters: number of objects, altitude, coordinates, flight mode, and local time. The detection results are taken from four scanning directions (north, east, south, and west) at one coordinate point, with latitude = -5.3599526 and longitude = 105.3103595.
In this coordinate point area, one sample was prepared to test the program that was created. It can be seen in Figure 3.6 that 1 person was detected in the scanning areas to the east and west.

Fig. 12. Human detection, east direction

4. CONCLUSIONS
The result of the hardware implementation is that the quadcopter can detect a human object and fly stably, with an average achieved altitude of 5.78 meters against the specified 6-meter set-point. In the coordinate test, the coordinate values measured by the system were compared with the test coordinate values, giving average errors of latitude 1 = 0.026%, longitude 1 = 0.034%, latitude 2 = 0.016%, and longitude 2 = 0.0168%. From these error values, it can be concluded that the system has a fairly high level of accuracy.
ACKNOWLEDGEMENT
Thanks to Institut Teknologi Sumatera (ITERA); this research is funded by ITERA Smart
CV – Python. Journal of Computer Application; 162(8), 17-21.
[5] Arifin, F., Daniel, R.A., Widiyanto, D., 2014. Autonomous Detection and Tracking of an Object Autonomously Using AR.Drone Quadcopter. Journal of Computer Science and Information; 7(1), 11-17.
[6] Suryadi, K., Sikumbang, S., 2015. Human Detection Using the Histogram of Oriented Gradients (HOG) Method Based on OpenCV. Jurnal Pendidikan Teknik Elektro; 4(2).
[7] Dalal, N., Triggs, B., 2005. Histograms of Oriented Gradients for Human Detection. IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05).
[8] Bresciani, T., 2008. Modelling, Identification and Control of a Quadrotor Helicopter. M.S. Thesis, Lund University, Lund, Sweden.