
VIRTUAL MOUSE

A Project Report
Submitted in partial fulfillment of the requirements for the award
of the degree of

BACHELOR OF SCIENCE

(COMPUTER SCIENCE)

By

MR. SANDEEP VIJAY GUPTA


Roll Number : 534

Under the esteemed guidance of

Ms. Anisha Asirvatham
Assistant Professor

NAGINDAS KHANDWALA COLLEGE


(Empowered Autonomous College)
(Affiliated to University of Mumbai)
MUMBAI - 400 064
MAHARASHTRA
2023 – 24
NAGINDAS KHANDWALA COLLEGE (Autonomous)
(Empowered Autonomous College)
(Affiliated to University of Mumbai)
MUMBAI - 400 064
MAHARASHTRA

DEPARTMENT OF COMPUTER AND INFORMATION SCIENCE

CERTIFICATE

This is to certify that the project titled "VIRTUAL MOUSE" is the bonafide work of Mr. SANDEEP VIJAY GUPTA, bearing Roll No. 534, submitted in partial fulfillment of the requirements for the award of the degree of BACHELOR OF SCIENCE in COMPUTER SCIENCE from the University of Mumbai.

Internal Guide Coordinator External Examiner

Date College Seal


DECLARATION

I hereby declare that the project entitled VIRTUAL MOUSE, done at Nagindas Khandwala College, has not been duplicated or submitted to any other university for the award of any degree. To the best of my knowledge, no one other than me has submitted this work to any other university. The project is done in partial fulfillment of the requirements for the award of the degree of BACHELOR OF SCIENCE (COMPUTER SCIENCE) and is submitted as the final semester project as part of our curriculum.

Signature

SANDEEP VIJAY GUPTA


ACKNOWLEDGEMENT
I would like to acknowledge the enormous help given to me in creating this project. I am grateful to the Director of M.K.E.S., Dr. Mrs. Ancy Jose; the Director of MKES Institutions, Prof. Dr. Moushumi Datta; the Principal, Prof. Dr. Mona Mehta; and the Vice Principal & IQAC Co-ordinator, Computer and Information Science, Dr. Pallavi Tawde, for their kind cooperation in the completion of my project. For her patience, guidance, and support, I would like to thank my project guide, Ms. Anisha Asirvatham. I would also like to thank my college, Nagindas Khandwala College, for providing the proper ambiance and the right amenities that helped and supported me during the preparation of this document and the project. I wish to thank my parents for their undivided support and interest; they inspired and encouraged me to go my own way, and without them I would have been unable to complete my project. In the end, I want to thank my friends, who showed appreciation and motivated me to continue my work. Once again, thanks to everyone who made this project successful.
ABSTRACT

The mouse is one of the wonderful inventions of Human-Computer Interaction (HCI) technology. Even a wireless or Bluetooth mouse is not completely free of devices, since it still needs a battery for power and a dongle to connect it to the PC. In the proposed AI virtual mouse system, this limitation is overcome by employing a webcam or built-in camera to capture hand gestures and detect the hand tips using computer vision. The system makes use of a machine learning algorithm, with hand detection based on deep learning. Based on the recognized hand gestures, the computer can be controlled virtually to perform left click, right click, scrolling, and cursor movement without a physical mouse. Hence, the proposed system can also help avoid the spread of COVID-19 by eliminating human contact with, and dependency on, devices used to control the computer.
Table of Contents

Chapter 1: Introduction
  1.1 Background of the Project
  1.2 Scope of the Project
  1.3 Objectives of the Project
  1.4 Applicability
Chapter 2: Gap Analysis / Drawbacks of Existing System
Chapter 3: Requirements and Analysis
  3.1 Problem Definition
  3.2 Requirements Specification
  3.3 Planning and Scheduling
Chapter 4: System Design
  4.1 Schema Design
  4.2 UML Diagrams / Block Diagram / Circuit Diagram / Algorithm Design
  4.3 User Interface Design
Chapter 5: Implementation and Testing
  5.1 Code
  5.2 Testing Approach and Test Cases
References
Chapter 1: Introduction
1.1 Background:

With the development of technologies in the area of augmented reality, the devices that we use in our daily life are becoming compact, in the form of Bluetooth or wireless technologies. This paper proposes an AI virtual mouse system that makes use of hand gestures and hand tip detection to perform mouse functions on the computer using computer vision. The main objective of the proposed system is to perform the computer mouse cursor and scroll functions using a web camera or a built-in camera instead of a traditional mouse device. Hand gesture and hand tip detection using computer vision serves as the HCI [1] with the computer. With the AI virtual mouse system, we can track the fingertip of the hand gesture through the built-in camera or web camera and perform mouse cursor operations, the scrolling function, and cursor movement.

While using a wireless or Bluetooth mouse, some devices are still required, such as the mouse itself, the dongle to connect it to the PC, and a battery to power the mouse. In this paper, instead, the user relies on a built-in camera or webcam and uses his or her hand gestures to control the computer mouse operations. In the proposed system, the web camera captures and processes the frames, recognizes the various hand and hand tip gestures, and then performs the corresponding mouse function.

The Python programming language is used for developing the AI virtual mouse system, along with OpenCV, the computer vision library. In the proposed AI virtual mouse system, the model makes use of the MediaPipe package for tracking the hands and the tips of the hands, and the PyAutoGUI package for moving the cursor around the computer screen and performing functions such as left click, right click, and scrolling. The results of the proposed model showed a very high accuracy level, and the model works very well in real-world applications using only a CPU, without requiring a GPU.
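As an illustration of how these packages fit together, the following minimal sketch (not the project's full script, which appears in Chapter 5) captures a single frame, detects hand landmarks with MediaPipe, and maps the index fingertip to a screen coordinate with PyAutoGUI; parameter choices such as max_num_hands=1 are illustrative assumptions.

import cv2
import mediapipe as mp
import pyautogui

cap = cv2.VideoCapture(0)                          # built-in camera or webcam
hands = mp.solutions.hands.Hands(max_num_hands=1)  # MediaPipe hand-tracking model
screen_w, screen_h = pyautogui.size()              # desktop screen size in pixels

ok, frame = cap.read()                             # capture one frame
if ok:
    frame = cv2.flip(frame, 1)                     # mirror so the cursor follows the hand naturally
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        tip = result.multi_hand_landmarks[0].landmark[8]   # landmark 8 = index fingertip
        pyautogui.moveTo(tip.x * screen_w, tip.y * screen_h)
cap.release()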

1.2 Project Scope

• For most laptops, the touchpad is not the most comfortable or convenient pointing device.
• The virtual mouse is also known as a Virtual Multitask Mouse.
• It is a real-time application.
• It is a user-friendly application.
• The project removes the requirement of physical contact with the touchpad or mouse.

1.3 Objectives

The goal is to manage computers and other devices with gestures rather than by pointing and clicking a mouse or touching a display directly.

This approach can not only make it easier to carry out many existing tasks but also enable trickier ones, such as creating 3-D models or browsing medical imagery during surgery without touching anything.

It also reduces the cost of hardware.

1.4 Applicability
The AI virtual mouse system is useful for many applications: it can be used to reduce the space needed for a physical mouse, and it can be used in situations where a physical mouse cannot be used. The system eliminates the need for extra devices and improves human-computer interaction.

Major applications:

1. The proposed model has an accuracy of 99%, which is far greater than that of other proposed virtual mouse models, and it has many applications.

2. Amidst the COVID-19 situation, it is not safe to use devices by touching them, because doing so may spread the virus; the proposed AI virtual mouse can be used to control the PC mouse functions without a physical mouse.

3. The system can be used to control robots and automation systems without the use of additional devices.

4. 2D and 3D images can be drawn using the AI virtual mouse system with hand gestures.

5. The AI virtual mouse can be used to play virtual reality and augmented reality based games without wireless or wired mouse devices.

6. Persons with problems in their hands can use this system to control the mouse functions of the computer.

7. In the field of robotics, the proposed system, as an HCI, can be used for controlling robots.

8. In design and architecture, the proposed system can be used for virtual prototyping.

Chapter 2: Gap Analysis

In developing the Virtual Mouse project, a careful gap analysis revealed certain areas where the project's capabilities can be enhanced to better serve its users.

Gap Analysis:-

The proposed AI virtual mouse system can be used to overcome real-world problems, such as situations where there is no space to use a physical mouse, and to help persons who have problems with their hands and cannot control a physical mouse. Also, amidst the COVID-19 situation, it is not safe to use devices by touching them, because doing so may spread the virus; the proposed AI virtual mouse overcomes these problems, since hand gesture and hand tip detection through a webcam or built-in camera is used to control the PC mouse functions.

Chapter 3: Requirements and Analysis

3.1 Problem Definition

The system can be used to control robots and automation systems without the use of additional devices.

This approach leverages advanced software and communication protocols to enable users to exert precise control over robotic operations and automation processes.

By eliminating the dependency on external gadgets, the system not only streamlines operations but also reduces costs and enhances flexibility, making it an ideal solution for various industrial applications.

3.2 Requirements Specification

The requirements of "Virtual Mouse" are outlined below:

Software Requirements – PyCharm


PyCharm is a powerful and widely-used integrated development environment (IDE) designed
primarily for Python programming. Developed by JetBrains, PyCharm provides developers
with a comprehensive and user-friendly environment for coding, debugging, testing, and
deploying Python applications. It offers a rich set of features, including code completion, syntax
highlighting, smart code analysis, and an integrated debugger. PyCharm also supports various
web development frameworks like Django and Flask, making it suitable for both web and
desktop application development. Its intuitive user interface, robust set of tools, and extensive
plugin support make it a popular choice among Python developers, contributing to increased
productivity and code quality. Whether you're a beginner or an experienced Python developer,
PyCharm can significantly simplify and enhance your development workflow.

Hardware Requirements
• Processor: Minimum 1 GHz; Recommended 2 GHz or more
• Hard Drive: Minimum 32 GB; Recommended 64 GB or more
• Memory (RAM): Minimum 1 GB; Recommended 4 GB or more
• Web Cam

3.3 Planning and Scheduling

Chapter 4: System Design

4.1 Schema Design

4.2 UML Diagrams / Block Diagram/ Circuit Diagram/ Algorithms Design

• Block Diagram

• Key points (hand landmarks; an illustrative mapping follows this list)

• Neural Network

• Flow Chart

• Gantt Chart
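The key points listed above correspond to MediaPipe's 21-point hand landmark model. As a small illustrative check (assuming the standard MediaPipe Hands solution), the two landmark indices this project relies on can be read from the library's HandLandmark enum:

import mediapipe as mp

# Landmark indices in MediaPipe's 21-point hand model used by this project.
HandLandmark = mp.solutions.hands.HandLandmark
print(int(HandLandmark.THUMB_TIP))          # 4  -> thumb tip, used for the click gesture
print(int(HandLandmark.INDEX_FINGER_TIP))   # 8  -> index fingertip, used to move the cursor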

4.3 User interface design

• Home Page

• Image Processing

• Output / Left Click

Chapter 5: Implementation and Testing.

5.1 Code

import cv2
import mediapipe as mp
import pyautogui

cap = cv2.VideoCapture(0)                          # open the default webcam
hand_detector = mp.solutions.hands.Hands()         # MediaPipe hand-tracking model
drawing_utils = mp.solutions.drawing_utils
screen_width, screen_height = pyautogui.size()     # desktop screen resolution
index_y = 0
while True:
    _, frame = cap.read()
    frame = cv2.flip(frame, 1)                     # mirror the frame so movement feels natural
    frame_height, frame_width, _ = frame.shape
    rgb_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    output = hand_detector.process(rgb_frame)      # detect hand landmarks in the frame
    hands = output.multi_hand_landmarks
    if hands:
        for hand in hands:
            drawing_utils.draw_landmarks(frame, hand)
            landmarks = hand.landmark
            for id, landmark in enumerate(landmarks):
                x = int(landmark.x * frame_width)
                y = int(landmark.y * frame_height)
                if id == 8:                        # index fingertip: move the cursor
                    cv2.circle(img=frame, center=(x, y), radius=10, color=(0, 255, 255))
                    index_x = screen_width / frame_width * x
                    index_y = screen_height / frame_height * y
                    pyautogui.moveTo(index_x, index_y)
                if id == 4:                        # thumb tip: click when close to the index fingertip
                    cv2.circle(img=frame, center=(x, y), radius=10, color=(0, 255, 255))
                    thumb_x = screen_width / frame_width * x
                    thumb_y = screen_height / frame_height * y
                    print('outside', abs(index_y - thumb_y))
                    if abs(index_y - thumb_y) < 30:
                        pyautogui.click()
                        pyautogui.sleep(1)         # pause briefly to avoid repeated clicks
    cv2.imshow('Virtual Mouse', frame)
    cv2.waitKey(1)
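The script depends on the opencv-python, mediapipe, and pyautogui packages, which can be installed with pip. When it runs, it opens the default camera, moves the cursor to follow the index fingertip (landmark 8), and performs a left click whenever the thumb tip (landmark 4) comes within about 30 screen pixels, vertically, of the index fingertip position.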

5.2 Testing Approach and Test Cases

REFERENCES:

• http://globalaccessibilitynews.com/2011/06/06/uae-students-invents-virtual-mouse-to-help-disabled-people/

• http://www.inclusive.co.uk/articles/adapting-the-computer-for-those-with-physical-disabilities--a250#

• https://ieeexplore.ieee.org/document/8934612

• http://www.onlinejournal.in/UIRV314/130.pdf

• https://interestingengineering.com/how-gesture-recognition-will-change-our-relationship-with-tech-devices

• https://www.washington.edu/doit/working-together-people-disabilities-and-computer-technology
