
Journal of Telecommunication Network (Jurnal Jaringan Telekomunikasi) Vol. 13, No. 2 (2023)

Quality of Service Analysis in Hand Gesture Identification Application Based on Convex Hull Algorithm to Control a Learning Media
Mitodius Nicho Swacaesar Setiawan1, Nurul Hidayati2*, Rachmad Saptono3
1,3 Digital Telecommunication Network Study Program, Department of Electrical Engineering, State Polytechnic of Malang, Malang City, 65141, Indonesia
2 Telecommunication Engineering Study Program, Department of Electrical Engineering, State Polytechnic of Malang, Malang City, 65141, Indonesia
[email protected], [email protected], [email protected]

Abstract— Techniques for human interaction with computers have developed very rapidly. Body movement is one of the easiest and most expressive ways to communicate, and hand movements in particular are flexible, so gestures can serve as simple identification commands. This research discusses the application of hand gesture reading as a remote control for learning media and analyzes its quality of service using the Wireshark software. The system uses a Raspberry Pi as its computing center. The Raspberry Pi reads hand gestures using the Convex Hull algorithm, which counts the number of raised fingers by taking the outermost points of the scanned hand contour. Each gesture, in the form of a number of fingers, is mapped to a keyboard command used to select answers in the learning media. On the learning-media side, a quiz system was created that uses these keyboard commands to select the answers. The Raspberry Pi is connected to the laptop using a third-party application called VNC. Based on the QoS measurement results, the throughput is 62 kbps, the packet loss is 0%, and the delay is 52.2 ms, all of which fall into the very good category. Thus, the overall quality of the network can be categorized as very good.
Keywords— Convex Hull, Hand Gesture Detection, Learning Media, OpenCV, Quality of Service.
I. INTRODUCTION

The interaction between people and computers is always changing as technology advances. At first it relied on a keyboard and mouse, but now touch screens are already common. Now that interactions are becoming more natural and users may communicate through body language, speech, facial expressions, and eye contact, the next revolution is beginning to take shape.

With the support of advanced sensors and algorithms, the computer is able to understand the user's intentions more easily, which can increase the system's activity and interaction. Even so, users and computers still mostly interact through a mouse. Given that hand gestures are simple to recognize, hand gesture recognition can be used to facilitate interaction instead of a mouse.

There are various situations during lectures that require lecturers to stay in the lecturer's room while teaching students in class. Lecturers can present material via Google Meet or Zoom, and quizzes may be given at certain points. A number of mobile applications can serve as the quiz system itself; however, such systems require students to bring their own smartphones, some devices are incompatible with the program, and internet connectivity among students is inconsistent and sometimes unavailable.

A presentation control system was developed by M. H. Khoirul using hand gesture recognition. The system uses the Support Vector Machine classification technique to classify skin tones against the background color. The camera for that experiment was a Logitech C270 webcam, and the data processing system was an Intel NUC5i7RYH. Only five keyboard commands are available in that system [1].

A hand gesture identification system developed by D. R. M. Harika can be used to control the mouse pointer. In that work, the mouse pointer was smoothed using the Kalman filter method and hand movements were identified using the color detection method [2].

An Android-based system for hand gesture recognition was developed by M. Anshary. To recognize hand motions, the system uses the convex hull method and convexity defects. The background color and light intensity in that system are adjustable in order to determine the values that work best for identifying hands [3].

A. R. Adnan used PoseNet and the K-Nearest Neighbor algorithm to construct a system for identifying human arm movements. This motion identification is used to control two switches connected over an IoT network [4].

Another example is the Convolutional Neural Network (CNN)-based hand gesture detection system developed by A. Adi. That system is implemented in wheeled robots, where 12 hand gestures control the 12 maneuvers of the robots [5].

A finger tracking device was also developed by Alan Tompunu using a Raspberry Pi 3. OpenCV and Python were used in that research. An accuracy rate of 75% and an average video processing time of 9.2 seconds were the results of that study [6].

In this project, a hand gesture identification system was developed for learning media in the form of a quiz program. The convex hull [7] and convexity defect [8] methods are used in this identification system together with the Python programming language [9] and the OpenCV library [10].

For accuracy smoothing and enhancement, the system employs an external webcam. Hand gestures captured by the webcam are converted into RGB color data [11] so that they can be processed by OpenCV, after which the background color is changed to green. From these results, the convex hull and convexity defect algorithms are applied. By seeking out the fingertip point with the least amount of skin tone, the algorithm

* Corresponding author
E-ISSN: 2654-6531, P-ISSN: 2407-0807
locates the fingertips' points. Students' answers to the displayed questions are identified by the hand movements they make.

The factors examined in this study were website system success, accuracy, hand gesture recognition processing time [12][13], and Quality of Service (QoS) [14]. The performance of the transmission system can be seen from the Quality of Service measurement, which includes delay, throughput, packet loss, and jitter [14][15]. This measurement uses the Wireshark software.

II. METHOD

A. Block Diagram System

Fig. 1 depicts a system overview that describes the hardware process of the system.

Figure 1. Block Diagram System (a Logitech C270 webcam and a projector in Room 1 are connected to a Raspberry Pi 3B, which communicates through a Wi-Fi router with a laptop running the presentation/test application in Room 2)

Hand gestures are performed by users in Room 1, and the webcam captures the movements and hand gestures demonstrated by the user. The program is a hand gesture detection system created through the Thonny IDE. The program contains a convex hull algorithm and is tasked with converting the input video data into a collection of dots that form the hand. From these dots, the gesture and the number of raised fingers used for the output process are determined. The number of fingers indicates the answer choice chosen by the user. The Raspberry Pi is the platform on which the video input data is converted into output data in the form of the answer choice chosen by the user.

B. System Planning

Figure 2. User's position from behind

In order to ensure that the convex hull only picks up the hand, the camera capture is divided by a bulkhead. This partitioning is carried out because the convex hull method can only differentiate the outermost points of the set of dots, so it cannot distinguish between hands and faces. Inside the area, there are also two further partitions on the upper-right and lower-right sides. If one finger is detected in the upper-right area, the presentation proceeds to the next question, while if one finger is detected in the lower-right area, the presentation returns to the previous question. Fig. 2 and Fig. 3 show how the system works.

Figure 3. Webcam capture display design

C. Flowchart of the Convex Hull Algorithm

The convex hull of a set of points P is denoted CH(P). In simple terms, we can suppose that the points P are a collection of nails stuck in a board and the convex hull is the smallest rubber band stretched around them so that every point of P lies inside the band. The convex hull is formed from the lines connecting the points of P that lie in the outermost area. In implementation, the convex hull can be computed with several methods, one of which is the Jarvis Wrap method. In simple terms, the Jarvis Wrap method can be explained in the following steps:
• Take one of the outermost points, for example the one with the highest or lowest x-coordinate (it could equally be the one with the highest or lowest y-coordinate).
• Find the next point whose connecting line makes the largest angle with the line drawn from the first point.
• Repeat the second step continuously until the walk returns to the starting point.

To complete the convex hull algorithm, the theory of convexity defects is needed. The convex hull approach alone is considered less exact, since one finger might occasionally be identified at both ends, so an assisting algorithm is needed to identify fingers accurately. As a result, the convexity defect method is employed in this work. The convex hull technique is used to detect knuckles, which makes determining the number of raised fingers more specific and exact. Convexity defects are the deepest concavities, or deviations, between two convex hull points. This theory calculates the angle formed between two fingers. Angles are calculated using the cosine rule in Equation 1:

c = √(a² + b² − 2ab·cos(α))   (1)

where a is the length of finger 1, b is the length of finger 2, c is the distance between finger 1 and finger 2, and α is the angle between finger 1 and finger 2. Thus, α is calculated with Equation 2:

α = acos((a² + b² − c²) / (2ab)) × 180/π   (2)

If the angle α is less than 90 degrees, then it is counted as a convexity defect. The number of convexity defects is the number of raised fingers minus 1. Figure 4 depicts the flowchart of the Convex Hull algorithm process.
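As a concrete illustration of Equations 1 and 2, the sketch below counts raised fingers from a binary hand mask with OpenCV's convexHull and convexityDefects functions. This is a minimal sketch rather than the paper's actual code: the OpenCV 4 API is assumed, and the 90-degree threshold follows the rule stated above.

```python
import math
import cv2
import numpy as np

def count_fingers(mask):
    """Count raised fingers in a binary hand mask: convexity defects whose
    valley angle (Eq. 2) is below 90 degrees are gaps between fingers, and
    the number of fingers is the number of such defects plus one."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)        # largest contour = the hand
    hull = cv2.convexHull(hand, returnPoints=False)  # hull as indices into the contour
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 1                                     # no valley found: a single finger

    gaps = 0
    for start_i, end_i, far_i, _ in defects[:, 0]:
        start = hand[start_i][0]                     # fingertip on one side of the valley
        end = hand[end_i][0]                         # fingertip on the other side
        far = hand[far_i][0]                         # deepest point of the valley
        a = np.linalg.norm(far - start)              # length of finger 1 (a in Eq. 1)
        b = np.linalg.norm(far - end)                # length of finger 2 (b in Eq. 1)
        c = np.linalg.norm(end - start)              # distance between fingertips (c in Eq. 1)
        if a == 0 or b == 0:
            continue
        cos_alpha = (a**2 + b**2 - c**2) / (2 * a * b)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_alpha))))  # Eq. 2
        if angle < 90:                               # narrow valley: a real gap between fingers
            gaps += 1
    return gaps + 1
```

Following the bulkhead rule in Section B, the same routine could be applied to the upper-right and lower-right corner regions of the frame to detect the one-finger "next question" and "previous question" gestures; the exact slice boundaries would depend on the camera resolution.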


Figure 4. Convex Hull Algorithm (flowchart of the hull-point search: starting from the far-left point of the hand contour, connect to the point that forms the largest angle, use it as the new reference point, and repeat until the reference point returns to the starting point)

Figure 5. System Flowchart Diagram (the webcam captures the hand gesture, the video is converted to BGR and then grayscale, thresholded, the hand outline is extracted, the convex hull algorithm counts the fingers, and a count of 1 to 5 fingers selects the first to fifth answer)
The convex hull algorithm works as follows: (1) the initial data needed are the boundary points, i.e. the hand contour scanned through the HSV color filter and drawn with the drawContour command; (2) the data are scanned using the convexHull command; (3) the first step is to take the starting point at the far left of the object (the extreme x-coordinate); (4) a reference line is created, namely the line x = x-coordinate of the starting point; all points with the closest x- and y-coordinates above that point are found, the first point is connected to each of its closest points, and if the angle between the two lines (the reference line and the connecting line) is the largest, the closest point forming that line is a hull point; the reference point is then changed to that point, the reference line is changed to the line connecting the starting point and the reference point, and the steps are repeated until the far-right point is reached; (5) the same thing is done after the far-right point is reached, looking for the closest point and measuring the largest angle between the two reference lines; (6) this is repeated until the reference point is back at the starting point.
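Steps (1)-(6) describe a gift-wrapping (Jarvis march) construction of the hull. A minimal, self-contained sketch of that idea in Python is shown below; it works on plain (x, y) point lists and is independent of OpenCV, and the cross-product orientation test stands in for the "largest angle" comparison in step (4), so names and details are illustrative rather than taken from the paper's code.

```python
def cross(o, a, b):
    """Cross product of vectors OA and OB; > 0 means B lies counter-clockwise of A."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def jarvis_wrap(points):
    """Return the convex hull of a list of (x, y) points using gift wrapping:
    start at the leftmost point and keep turning to the outermost candidate
    until the walk returns to the start (steps 3-6 above)."""
    if len(points) < 3:
        return list(points)
    hull = []
    start = min(points, key=lambda p: (p[0], p[1]))   # leftmost (lowest x) point
    current = start
    while True:
        hull.append(current)
        # pick any candidate, then replace it whenever another point wraps tighter
        candidate = points[0] if points[0] != current else points[1]
        for p in points:
            if p == current:
                continue
            if cross(current, candidate, p) < 0:      # p is clockwise of the candidate
                candidate = p
        current = candidate
        if current == start:                          # wrapped all the way around
            break
    return hull

# Example: the four corners of a square plus an interior point
print(jarvis_wrap([(0, 0), (4, 0), (4, 4), (0, 4), (2, 2)]))
```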

D. System Flowchart Diagram

Figure 5 illustrates the flow of system performance in the form of a flowchart. When the user makes a gesture with their hands, the camera records it in real time. The webcam video is automatically converted to BGR format (Blue, Green, and Red). After that, the convex hull algorithm is used to process the video and count how many fingers are raised in the recorded hand gesture. The identified number of fingers later becomes the answer output of the quiz. Table I lists the inputs and outputs for the hand gestures.

TABLE I
COMMAND AND HAND GESTURES
Command    Hand Gesture
Answer A   1 finger identified
Answer B   2 fingers identified
Answer C   3 fingers identified
Answer D   4 fingers identified
Answer E   5 fingers identified

E. Web Quiz System

The web quiz is built with PHP and JavaScript, and the data is stored in a local database managed through phpMyAdmin. The web page uses radio buttons that are assigned to keyboard presses, so a key press checks the corresponding radio button, as shown in Fig. 6 and Fig. 7.

Figure 6. Flowchart of the web quiz system (the quiz is picked by its quizID and minimum point; the questions, answers, correct answers, and time are fetched from the database based on the quizID; the counters C and N are initialized to 0)
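Since each finger count in Table I maps to one answer key, the Raspberry Pi side only has to translate the recognition result into the key press that the web quiz's radio buttons listen for. The sketch below illustrates that mapping; the paper does not name the key-injection library or the key labels, so the use of pyautogui and the keys 'a'-'e' here are assumptions for illustration only.

```python
# Illustrative mapping from detected finger count to an answer key press.
# Assumption: pyautogui is available and the quiz page binds the keys 'a'-'e'
# to its radio buttons (the paper only states that gestures are mapped to
# keyboard commands, not which library or keys are used).
import pyautogui

FINGERS_TO_KEY = {1: "a", 2: "b", 3: "c", 4: "d", 5: "e"}  # Table I mapping

def send_answer(finger_count):
    """Press the answer key matching the number of raised fingers."""
    key = FINGERS_TO_KEY.get(finger_count)
    if key is not None:
        pyautogui.press(key)   # simulated key press for the chosen answer
    return key
```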


Figure 7. Web Quiz Flowchart (the time counter is decremented while the C-th question, its answers, and the remaining time are displayed; the answer chosen by gesture is checked, the counters C, N, and J are updated, and the score is calculated as Point = N / number of questions × 100, giving a pass or remedial status)

Fig. 6 and Fig. 7 show the flowchart of the web quiz. The user selects a quiz on the main page and receives data in the form of the quiz and its passing threshold. All questions, answers 1-5, correct answers, and time are retrieved from the database according to the quiz id, so that the data obtained consist of the questions, answers 1-5, correct answers, and time. Then the variables C = 0 (count, the index of the question to be displayed), N = 0 (grade, the initial score), and J = 0 (answers, the number of students who have already answered a question) are declared. The (C + 1)-th question is displayed, and students choose answers according to the question that appears.

If the chosen answer matches the correct answer, N is increased by 1; if it does not, the value of J is checked first. If J ≠ 1, the timer is reset, J is increased by 1, and other students can answer again. If J = 1, the system proceeds to the next question without adding to the score. If C is not yet equal to the number of questions, C is increased by 1, the process returns to displaying question (C + 1), and the value of J is reset; but if C equals the number of questions, the process continues to the score calculation. Finally, the score is calculated with the formula Value = N / number of questions × 100 and presented on the display.
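The scoring logic above can be condensed into a few lines. The following is a minimal sketch of that flow in Python (the actual system is written in PHP/JavaScript); the data structures and helper names are illustrative, and the threshold name min_point mirrors the minPoint value in the flowchart.

```python
def run_quiz(questions, min_point, get_answer):
    """Minimal sketch of the web quiz scoring flow.

    questions  : list of dicts with a 'correct' key (the correct option)
    min_point  : passing threshold (minPoint in the flowchart)
    get_answer : callable returning the answer chosen via hand gesture
    """
    n_correct = 0
    for current in questions:                 # C walks over the questions
        attempts = 0                          # J: how many wrong answers so far
        while True:
            answer = get_answer(current)      # answer selected by gesture
            if answer == current["correct"]:
                n_correct += 1                # N += 1
                break
            attempts += 1
            if attempts > 1:                  # J = 1 already: move on, no score
                break
            # J != 1: reset the timer and let another student try again
    point = n_correct / len(questions) * 100  # Value = N / number of questions x 100
    status = "pass" if point > min_point else "remedial"
    return point, status

# Example usage with a fixed gesture answer
quiz = [{"correct": "a"}, {"correct": "c"}]
print(run_quiz(quiz, 60, lambda q: "a"))
```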
III. RESULTS AND DISCUSSION

A. Calibration of Sensors

A light intensity sensor (BH1750) is used in this research. Testing of the light intensity sensor was carried out with two measuring instruments, the AS803 digital lux meter and the BH1750 sensor, under two conditions: normal room lighting and direct light exposure. The calculated light intensity is displayed on the Thonny terminal. The BH1750 datasheet states that the sensor can measure from 0.11 lux to 100000 lux and that the average measurement error is ±20%. The error rate of the light intensity sensor is calculated with the following formula:

Error (%) = (Lux Meter AS803 Value - BH1750 Value) / (Lux Meter AS803 Value) × 100%

The error percentage is calculated between the values obtained from the BH1750 sensor and the AS803 lux meter, which are presented in Table II.

TABLE II
MEASUREMENT RESULTS OF THE BH1750 SENSOR AND AS803 LUX METER
No   BH1750 (lx)  AS803 (lx)  Deviation (lx)  Error (%)
1    30.83        31          0.17            0.6%
2    30           31          1               3.2%
3    30           30          0               0%
4    30           30          0               0%
5    30           30          0               0%
6    30           31          1               3.2%
7    30           30          0               0%
8    30           31          1               3.2%
9    275          278         3               1.07%
10   274.17       278         3.83            1.3%
11   270.83       280         9.17            3.2%
12   271.67       277         5.33            1.92%
13   271.67       277         5.33            1.92%
14   271.67       278         6.33            2.27%
15   273.33       278         4.67            1.68%
16   273.33       277         3.67            1.32%
Average                       2.78            1.56%

Based on Table II, the average deviation is 2.78 lux and the average error is 1.56%. According to the BH1750 datasheet, the sensor has a tolerance of ±20%. Therefore, the BH1750 sensor used can be considered accurate and usable, since its error does not exceed the tolerance value.
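A small script reproduces the deviation and error columns of Table II from paired readings; the reading lists below are illustrative placeholders, not the full measurement set.

```python
# Recompute the deviation and error columns of Table II from paired readings.
# The sample values below are illustrative; they are not the full data set.
bh1750 = [30.83, 30, 30, 275, 271.67]    # BH1750 readings (lx)
as803  = [31, 31, 30, 278, 277]          # AS803 lux meter readings (lx)

deviations = [abs(ref - meas) for meas, ref in zip(bh1750, as803)]
errors_pct = [abs(ref - meas) / ref * 100 for meas, ref in zip(bh1750, as803)]

print("average deviation (lx):", sum(deviations) / len(deviations))
print("average error (%):", sum(errors_pct) / len(errors_pct))
```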
B. Hardware Implementation Result

There are a few hardware components in this system. On the student side there are the Raspberry Pi, two webcams, the BH1750 light sensor, and a monitor to display the Raspberry Pi output. On the lecturer side there is only a laptop. The laptop hosts the web quiz, while the Raspberry Pi only recognizes the hand gestures that the students make in the other room, as depicted in Fig. 8 and Fig. 9.


Figure 8. Hardware implementation on the student and lecturer sides

Figure 9. Display when the web quiz and hand gesture recognition are running

C. Software Implementation Result

The software is built with PHP and JavaScript, and a local database managed through phpMyAdmin is used for this web quiz. The documentation below covers the software in the form of the quiz web pages. The web pages were created with Visual Studio Code and are installed only on the lecturer's laptop.

Testing of the login page is carried out by opening the page and filling in the username and password. If the registered username and password match, the page changes to the main page; if the username or password is not registered, a warning appears, as shown in Fig. 10.

Figure 10. Login page

On the main page there are four components that can be pressed, namely: the quiz title selection, add quiz schedule, score recap, and sign out. The homepage is depicted in Fig. 11.

Figure 11. Homepage

A pop-up page then appears, and the title, class, and quiz id input fields are filled in automatically according to the selected quiz, as presented in Fig. 12.

Figure 12. Pop-up page to set the number of students taking the quiz

If the answer is correct, the box changes color to green and shows "Answer (name of student) is correct". If the answer is wrong, the box changes color to red and shows "Incorrect answer (name of student)", as shown in Fig. 13.

Figure 13. Quiz page

The score recap page contains a class selection column and displays each student's score ordered from the highest, as highlighted in Fig. 14.

Figure 14. Point recap

Figure 15. Hand gesture recognition

D. System Testing for Hand Gesture Recognition

The hand gesture recognition accuracy test is conducted at three distances (30 cm, 50 cm, and 80 cm) and three light intensities (<50 lux, 50-100 lux, and >100 lux). The accuracy is calculated from the success and failure of hand


gesture identification. Table III shows examples of the recognition results.

TABLE III
EXAMPLE RESULTS OF HAND GESTURE RECOGNITION
(captured images of the hand with 1, 2, 3, 4, and 5 fingers up, each correctly described by the system)

The accuracy of the hand gesture recognition is calculated after obtaining results at each distance of 30 cm, 50 cm, and 80 cm with three different light intensities, namely <50 lux, 50-100 lux, and >100 lux. Table IV presents the accuracy results.

TABLE IV
HAND GESTURE RECOGNITION ACCURACY RESULTS
No   Distance   Light Intensity (Lux)   Accuracy (%)   Avg accuracy (%)
1    30 cm      <50                     88%
                50-100                  100%           96%
                >100                    100%
2    50 cm      <50                     92%
                50-100                  100%           97%
                >100                    100%
3    80 cm      <50                     88%
                50-100                  100%           96%
                >100                    100%

After testing, it is known that the accuracy percentage in the <50 lux condition is below that of the other light intensities, and that distance does not affect the accuracy of the hand gesture recognition. The test achieves 100% accuracy at every distance when the light intensity is above 50 lux, but when the light intensity is below 50 lux the accuracy drops to 88%-92% at every distance.

E. Delay Testing for Hand Gesture Recognition

The hand gesture recognition delay test is conducted at the same three distances (30 cm, 50 cm, and 80 cm) and three light intensities (<50 lux, 50-100 lux, and >100 lux). The delay is calculated as the time at which the Raspberry Pi obtains the gesture recognition result minus the time at which the Raspberry Pi receives the frame from the webcam. Table V depicts the results of calculating the delay.
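The delay definition above (recognition timestamp minus frame-arrival timestamp) can be instrumented directly around the recognition call. Below is a minimal sketch; the recognize() stub stands in for the convex hull routine of Section C, and the camera index and sample count are illustrative assumptions.

```python
import time
import cv2

def recognize(frame):
    """Stand-in for the gesture recognition step (e.g. the convex hull
    finger counting sketched earlier); returns a finger count."""
    return 0  # placeholder result

cap = cv2.VideoCapture(0)              # webcam index 0 is an assumption
delays = []
for _ in range(100):                   # sample 100 frames
    ok, frame = cap.read()
    if not ok:
        break
    t_frame = time.time()              # time the Raspberry Pi has the frame
    recognize(frame)
    t_done = time.time()               # time the recognition result is ready
    delays.append(t_done - t_frame)
cap.release()

if delays:
    print("average recognition delay (s):", sum(delays) / len(delays))
```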
TABLE V
DELAY TEST RESULTS
No   Light Intensity   Distance   Average Delay per Light Intensity and Distance   Average Delay per Light Intensity
1    <50               30         0.0273
                       50         0.0293                                           0.0280
                       80         0.0275
2    50-100            30         0.003
                       50         0.0036                                           0.0032
                       80         0.003
3    >100              30         0.0033
                       50         0.003                                            0.004
                       80         0.0056

From these results, it can be seen that the highest average delay occurs in the hand gesture tests at light intensities below 50 lux. This is due to the lack of light, which makes the Raspberry Pi take longer to carry out the identification process.

F. Quiz Web Test Using Hand Gestures Results

This test aims to find out whether the answers on the quiz web page change according to the hand gesture input performed. The test is carried out after the Raspberry Pi has been connected to the laptop, with the laptop opening the web page and answering the questions, as highlighted in Fig. 16, Fig. 17, Fig. 18, and Fig. 19.

Figure 16. Hand Gesture Recognition on 1 student

Figure 17. Hand Gesture Recognition on 2 students


Figure 18. Hand Gesture Recognition on 3 students

Figure 19. Hand Gesture Recognition on 4 students

After testing, it can be seen that the overall quiz web system is able to recognize answers based on hand gestures.

G. Quality of Service (QoS) Analysis

The QoS test is conducted to assess the quality of the network used by the laptop and the Raspberry Pi. Throughput, packet loss, and delay are the variables examined in this study. The Wireshark program was used to conduct this test on the laptop side.

1) Throughput Measurement: Throughput is the total amount of data that successfully arrives during a certain period of time, divided by the duration of that period.

Throughput = Σ delivered data / length of measurement
Throughput = 22479930 bytes / 2883.350 s = 7.796 KBps = 62.371 kbps

Based on the Wireshark application, the throughput measurement is shown in Fig. 20.

Figure 20. The result of throughput measurement

From the results obtained, the throughput value falls into the very good category (>100 bps) according to the TIPHON standard.

2) Packet Loss Measurement: Packet loss is the number of packets lost compared to the packets sent.

Packet Loss = (Σ sent packets - Σ delivered packets) / Σ sent packets × 100%
Packet Loss = (55221 - 55221) / 55221 × 100% = 0%

Based on the Wireshark application, the packet loss measurement is shown in Fig. 21.

Figure 21. The result of packet loss measurement

The packet loss test result is 0%, which means no packets failed. A packet loss of 0% falls into the very good category (<3%) according to TIPHON.

3) Delay Measurement: Delay is the amount of time it takes for a packet to travel from origin to destination; it is obtained by dividing the time span by the number of packets and is affected by distance, physical media, and congestion.

Delay = time span / packets
Delay = 2883.35 / 55221 = 0.0522 s = 52.2 ms

Based on the Wireshark application, the delay measurement is shown in Fig. 22.

Figure 22. Delay measurement

The delay obtained is 52.2 ms, which falls into the very good category (<150 ms) according to the TIPHON standard.
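The three QoS figures follow directly from the Wireshark capture summary (total bytes, total packets, and capture time span). A minimal sketch of the arithmetic, using the values reported above:

```python
# QoS metrics from the Wireshark capture summary reported in Section G.
captured_bytes = 22_479_930   # total bytes delivered during the capture
sent_packets   = 55_221       # packets sent
recv_packets   = 55_221       # packets delivered
time_span_s    = 2_883.35     # capture duration in seconds

throughput_kbps = captured_bytes / time_span_s * 8 / 1000              # ~62.4 kbps
packet_loss_pct = (sent_packets - recv_packets) / sent_packets * 100   # 0 %
delay_ms        = time_span_s / recv_packets * 1000                    # ~52.2 ms

print(f"throughput: {throughput_kbps:.1f} kbps")
print(f"packet loss: {packet_loss_pct:.1f} %")
print(f"average delay: {delay_ms:.1f} ms")
```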
IV. CONCLUSION

Following the design, implementation, and testing phases, the following conclusions can be drawn. The system is made up of several hardware and software components: the hardware consists of the Raspberry Pi, a webcam, a light intensity sensor, and a laptop, while the software is a web-based application created with PHP. Through testing, a total accuracy of 96.5% was obtained based on two parameters, distance and light intensity; from that test it can be concluded that brighter light and a closer distance give higher accuracy. In this research, a third-party program called VNC was used to connect the quiz web system (a laptop) and the Raspberry Pi. Testing showed that the Raspberry Pi can issue keyboard commands in response to the gestures made, and the keyboard commands delivered are consistent with the chosen answer option. Based on the QoS measurement results, the throughput is 62 kbps, the packet loss is 0%, and the delay is 52.2 ms, all of which fall into the very good category. Thus, the overall quality of the network can be categorized as very good.


REFERENCES

[1] M. H. Khoirul, "Sistem Pengontrol Presentasi Menggunakan Pengenalan Gestur Tangan Berbasis Fitur pada Contour Dengan Metode Klasifikasi Support Vector Machine," Jurnal Pengembangan Teknologi Informasi dan Ilmu Komputer, vol. 4, no. 4, pp. 1083-1089, 2020.
[2] D. R. M. Harika, "Rancang Bangun Pengontrol Presentasi Berbasis Slide dengan Teknik Analisis Gerakan Jari dan Tangan," JOIN (Jurnal Online Informatika), vol. 1, no. 2, 2016.
[3] M. Anshary, "Prototype Program Hand Gesture Recognize Using the Convex Hull Method and Convexity Defect on Android," JOIN (Jurnal Online Informatika), vol. 5, no. 2, pp. 205-211, 2020.
[4] A. R. Adnan, "Klasifikasi Gestur Lengan Manusia Menggunakan Metode KNN Untuk Kendali Stop Kontak Pintar Berbasis Internet of Things," e-Proceeding of Engineering Telkom University, vol. 8, no. 1, pp. 9-16, 2021.
[5] I. C. H. A. Adi, "Sistem Pengenal Isyarat Tangan Untuk Mengendalikan," Indonesian Journal of Electronics and Instrumentation Systems (IJEIS), vol. 9, no. 2, pp. 193-202, 2019.
[6] S. Qin, "Real-time Hand Gesture Recognition from Depth Images," J Sign Process Syst, 2018.
[7] A. Tompunu, "Finger Tracking and Recognition Using OpenCV Raspberry Pi 3," Proceeding Forum in Research, Science, and Technology (FIRST), 2017.
[8] Wilkinson, "A Raspberry Pi-based camera system and image processing procedure for low cost and long-term monitoring of forest canopy dynamics," Methods Ecol., vol. 12, pp. 1316-1322, 2021.
[9] J. Minichino, Learning OpenCV 3 Computer Vision with Python, Second Edition. Birmingham, UK: Packt Publishing, 2015, p. vii.
[10] K. & Atul, "The AI Learner," 9 November 2020. [Online]. Available: https://theailearner.com/2020/11/09/convexity-defects-opencv/. [Accessed 12 August 2022].
[11] Bhowmik, Interactive Displays: Natural Human-Interface Technologies. John Wiley & Sons, 2015.
[12] A. Skuric, V. Padois, N. Rezzoug, and D. Daney, "On-Line Feasible Wrench Polytope Evaluation Based on Human Musculoskeletal Models: An Iterative Convex Hull Method," IEEE Robotics and Automation Letters, vol. 7, no. 2, pp. 5206-5213, April 2022, doi: 10.1109/LRA.2022.3155374.
[13] S.-J. Horng, D.-T. Vu, T.-V. Nguyen, W. Zhou, and C.-T. Lin, "Recognizing Palm Vein in Smartphones Using RGB Images," IEEE Transactions on Industrial Informatics, vol. 18, no. 9, pp. 5992-6002, Sept. 2022, doi: 10.1109/TII.2021.3134016.
[14] P. E. Mas'udia, C. A. Pratama, D. Purwati, Y. Ratnawati, M. Sarosa, and N. Hidayati, "Rancang Bangun dan Analisis QoS pada Sistem Informasi Penjualan Obat dengan Layanan Antar-Jemput Berbasis Android," Techno.Com, vol. 21, no. 3, pp. 633-643, 2022, doi: 10.33633/tc.v21i3.6209.
[15] D. Priadi, "Pengukuran Quality of Service (QoS) Pada Aplikasi File Sharing dengan Metode Client Server Berbasis Android," Jartel, vol. 6, no. 1, pp. 39-49, May 2018.
