
School of Computer Science and Engineering

(Computer Science & Engineering)


Faculty of Engineering & Technology
Jain Global Campus, Kanakapura Taluk - 562112
Ramanagara District, Karnataka, India

2024-2025
(VI Semester)

A Project Report on

“Gesture-Controlled Virtual Mouse Using Real-Time Hand Tracking”
Submitted in partial fulfilment for the award of in-house project

BACHELOR OF TECHNOLOGY
IN
COMPUTER SCIENCE AND ENGINEERING

Submitted by

Bathala Harsha, Deekshith R, Hrishikesh U Gowda, Poornesh D, Pranshu Jain

22BTRCN047, 22BTRCN068, 22BTRCN110, 22BTRCN206, 22BTRCN211

Under the guidance of


Dr. S Jerald Nirmal Kumar
Assistant Professor
Department of Computer Science and Engineering
School of Computer Science & Engineering
Faculty of Engineering & Technology
JAIN (Deemed-to-be University)

Computer Science and Engineering


1
Department of Computer Science and Engineering
School of Computer Science & Engineering
Faculty of Engineering & Technology
Jain Global Campus, Kanakapura Taluk - 562112
Ramanagara District, Karnataka, India

CERTIFICATE

This is to certify that the project work titled “Gesture-Controlled Virtual Mouse Using
Real-Time Hand Tracking” is carried out by Bathala Harsha (22BTRCN047), Deekshith
R (22BTRCN068), Hrishikesh U Gowda (22BTRCN110), Poornesh D (22BTRCN206), and Pranshu
Jain (22BTRCN211), bonafide students of Bachelor of Technology at the School of
Computer Science & Engineering, Faculty of Engineering & Technology, JAIN (Deemed-to-be
University), Bangalore, in partial fulfillment for the award of in-house project, during the year
2024-2025.

Dr. S Jerald Nirmal Kumar
Asst. Professor
Dept. of CS&E
Date:

Dr. Mahesh TR
Program Head, Computer Science and Engineering
School of Computer Science & Engineering
Faculty of Engineering & Technology
JAIN (Deemed-to-be University)
Date:

Dr. Geetha G
Director, School of Computer Science & Engineering
Faculty of Engineering & Technology
JAIN (Deemed-to-be University)
Date:

Name of the Examiner Signature of Examiner

1.

2.

DECLARATION

We, Bathala Harsha (22BTRCN047), Deekshith R (22BTRCN068), Hrishikesh U
Gowda (22BTRCN110), Poornesh D (22BTRCN206), and Pranshu Jain (22BTRCN211), students of
6th semester B.Tech in Computer Science and Engineering at the School of Computer Science &
Engineering, Faculty of Engineering & Technology, JAIN (Deemed-to-be University),
hereby declare that the project work titled “Gesture-Controlled Virtual Mouse Using
Real-Time Hand Tracking” has been carried out by us and submitted in partial fulfilment for
the award of in-house project in Bachelor of Technology in Computer Science and Engineering
during the academic year 2024-2025. Further, the matter presented in this work has not been
previously submitted by anybody for the award of any degree or diploma to any other university,
to the best of our knowledge and belief.

Name 1: Bathala Harsha Signature
USN: 22BTRCN047
Name 2: Deekshith R Signature
USN: 22BTRCN068
Name 3: Hrishikesh U Gowda Signature
USN: 22BTRCN110
Name 4: Poornesh D Signature
USN: 22BTRCN206
Name 5: Pranshu Jain Signature
USN: 22BTRCN211

Place : Bangalore
Date :

ACKNOWLEDGEMENT

It is a great pleasure for me to acknowledge the assistance and support of the many
individuals who have been responsible for the successful completion of this project work.
First, I take this opportunity to express my sincere gratitude to the Faculty of Engineering
& Technology, JAIN (Deemed-to-be University), for providing me with a great opportunity to
pursue my Bachelor's degree in this institution.
I am deeply thankful to several individuals whose invaluable contributions have made
this project a reality. I wish to extend my heartfelt gratitude to Dr. Chandraj Roy Chand,
Chancellor, for his tireless commitment to fostering excellence in teaching and research at
Jain (Deemed-to-be-University). I am also profoundly grateful to the honorable Vice
Chancellor, Dr. Raj Singh, and Dr. Dinesh Nilkant, Pro Vice Chancellor, for their
unwavering support. Furthermore, I would like to express my sincere thanks to Dr. Jitendra
Kumar Mishra, Registrar, whose guidance has imparted invaluable qualities and skills that
will serve us well in our future endeavors.
I extend my sincere gratitude to Dr. Hariprasad S A, Director of the Faculty of
Engineering & Technology, and Dr. Geetha G, Director of the School of Computer Science &
Engineering within the Faculty of Engineering & Technology, for their constant
encouragement and expert advice. Additionally, I would like to express my appreciation to Dr.
Krishnan Batri, Deputy Director (Course and Delivery), and Dr. Deepak K. Sinha, Deputy
Director (Students & Industry Relations), for their invaluable contributions and support
throughout this project.
It is a matter of immense pleasure to express my sincere thanks to Dr. Mahesh TR,
Program Head, Computer Science and Engineering, School of Computer Science &
Engineering Faculty of Engineering & Technology for providing right academic guidance that
made my task possible.
I would like to thank our guide Dr. S Jerald Nirmal Kumar, Assistant Professor, Dept.
of Computer Science and Engineering, for sparing his valuable time to extend help in every
step of my work, which paved the way for smooth progress and fruitful culmination of the
project.
I would like to thank our Project Coordinator Dr. Rhea Sriniwas, and all the staff
members of Computer Science and Engineering for their support.

I am also grateful to my family and friends who provided me with every requirement
throughout the project.
I would like to thank one and all who directly or indirectly helped me in completing
the work successfully.

Signature of Student(s)

Gesture-Controlled Virtual Mouse Using Real-Time Hand Tracking

ABSTRACT

In today's rapidly evolving technology environment, people increasingly need
affordable, hands-free systems. As smart technologies and easy-to-use interfaces
become mainstream, systems that allow natural interaction have become essential.
The "Gesture-Controlled Virtual Mouse Using Real-Time Hand Tracking" project
combines user convenience with advanced hand-gesture recognition technology to
offer an innovative and easy-to-use way of interacting with digital devices.

The primary objective of this project is to create and deploy a gesture-based virtual
mouse system that integrates real-time hand tracking, mouse cursor movement,
mouse click recognition, and scrolling. The system takes advantage of a webcam to
recognize hand gestures, providing a natural, free-hand experience for users.
Computer vision libraries, including OpenCV and MediaPipe, are applied to track
the hand, and pyautogui is used for emulating the movement of the mouse on the
computer.

In addition, this project explores the viability of multi-hand gesture control, wherein the
right hand is used for mouse movement and clicking, while the left hand may be
used for scrolling. A mode-switching function is also provided so that the user
can interchange the roles of the right and left hands. Through this project,
sophisticated concepts in interactive user interface design, machine learning, and
computer vision are demonstrated, leading to a seamless and immersive user
experience.

TABLE OF CONTENTS

Chapter 1 08
1. Introduction 08
1.1 Background & Motivation 08
1.2 Objective 10
1.3 Delimitation of Research 11
1.4 Benefits of Research 12

Chapter 2 13
2. Literature Survey 13
2.1 Literature Review 13
2.2 Inferences Drawn from Literature Review 15

Chapter 3 16
3. Problem Formulation and Proposed Work 16
3.1 Introduction 16
3.2 Problem Statement 16
3.3 Proposed Algorithms 17
3.4 Proposed Work 18

Chapter 4 19
4. Implementation 19
Software Algorithm 19

Chapter 5 21
Results and Discussion 21

Chapter 6 23
Conclusions and Future Scope 23

References (IEEE format) xxvi

Appendices xxvii
Appendix – I xxvii
Appendix – II xxviii
Information Regarding Student xxxi
Photograph Along with Guide xxxii
Chapter 1

1. INTRODUCTION
In the modern computing world, human-computer interaction (HCI) is growing more
sophisticated, and the trend is moving toward gesture-based interfaces that enable users to
communicate with systems in a more natural way. Conventional input devices like the
mouse and keyboard are being supplemented or even substituted by more natural, hands-
free solutions. Computer vision-driven gesture-based systems are proving to be a
promising substitute for conventional input devices, offering a simple and seamless way of
commanding digital devices. The "Gesture-Controlled Virtual Mouse Using Real-Time
Hand Tracking" project aims to create a system through which people can use natural hand
movements to control their computers, with greater ease and comfort of use leading the
way. With the aid of real-time hand tracking, the system identifies specific hand movements
to simulate mouse movement, clicking, and scrolling, for an immersive and user-friendly
experience.

1.1. Background & Motivation

Breakthrough developments in computer vision and machine learning have greatly
affected human-computer interaction. Traditional methods of input, such as the mouse
and keyboard, can be limiting and laborious, especially for users with disabilities or those
who prefer a more intuitive interface. Gesture input has the potential to
remove these limitations and enable users to operate their machines without actually
touching a mouse or keyboard. The motivation for this project is to explore the possibility
of gesture-based hand interaction as an intuitive and natural way to interact with a
computer. Existing systems are often hardware-intensive or require significant training.
This project is intended to create a lightweight, easy-to-use system that can operate
with nothing more than a standard webcam, providing a simple yet stable way for users to
control their devices using gestures.


Figure: Flowchart for developing the Virtual Mouse system using hand gesture recognition
and computer vision techniques.


1.2. Objective

The main goal of this project is to design and implement a gesture-based virtual mouse
system for interacting with computers through hand gestures. The application shall:

• Offer real-time tracking of hands to identify gestures and move the mouse cursor.

• Implement left and right click capabilities through certain hand gestures.

• Provide scrolling capability triggered by hand movements.

• Support multi-hand gestures, where each hand performs a different operation
(right hand for cursor movement and clicking, left hand for scrolling).

• Have a mode-switching capability to flip the hands' roles for further control flexibility.

• Display real-time on-screen instructions to guide users in operating the
system.

Overall, the system aims to balance accurate gesture recognition with intuitive user
engagement to produce a responsive, hands-free virtual mouse that improves the user experience.


1.3. Delimitation of research

To keep the project scope manageable and focused, some restrictions are
imposed:
• The system uses a webcam for the capture of hand gestures without having to use any
special hardware in the form of depth sensors.
• The system does not support voice recognition or other advanced input modes;
it relies solely on hand gestures.
• The system employs a limited number of hand movements to ensure simplicity,
focusing on real-time cursor movement, clicks, and scrolling.
• The project has been implemented for Windows operating systems and may need
additional adaptations for cross-platform use in the future.
• The system does not employ advanced error handling or advanced machine learning
model training for recognizing hand gestures; rather, it employs pre-trained models for
efficiency.
These limitations render the project feasible yet demonstrate essential principles in hand
gesture recognition and computer vision.


1.4. Benefits of research


This project offers several technical and practical advantages:

• Educational Value: It provides a rich learning opportunity in computer vision,
hand tracking, and user interface design.

• Innovative Interaction: The project gives users a new, intuitive, hands-free way of
interacting with the computer, suited to anyone who desires a more comfortable mode
of input.

• User Experience Centricity: With real-time gesture recognition, the project focuses
on creating an intuitive and user-centric experience.

• Scalability: The system design can be further scaled in the future to include
additional features, such as voice operation, multi-device support, or more complex
gestures.

• Practical Application: The project can serve as a platform upon which more
complex gesture-controlled systems are built, with uses across many industries
such as assistive technology, gaming, and design.

On a broader scale, the project explores the possibility of combining computer vision,
machine learning, and human-computer interaction to develop innovative systems that
enable more natural interaction between humans and technology.


Chapter 2

2. LITERATURE SURVEY

2.1. Literature Review

Gesture-based human-computer interaction (HCI) has gained significant attention in recent


years, offering touchless and intuitive alternatives to traditional input devices. Various
research studies have explored the development of virtual mouse systems utilizing hand
gesture recognition, computer vision, and real-time tracking algorithms. This section
reviews key contributions in this field to contextualize the proposed work.
N. R. Sathish Kumar et al. [1] present a gesture-controlled virtual mouse system that
employs computer vision and hand-tracking techniques to enable intuitive human-computer
interaction. Utilizing OpenCV and MediaPipe, the system recognizes specific hand
gestures captured via webcam, interpreting them as mouse operations such as cursor
movement, left-click, right-click, and scrolling. Real-time tracking of finger positions and
hand landmarks is used to execute mouse commands, eliminating the need for physical
contact. The authors emphasize the broad applicability of this technology, especially in
touchless interfaces and accessibility solutions. This study illustrates how gesture-based
interaction can offer an efficient, accessible, and hardware-free alternative to traditional
input devices.
Akash Singh et al. [2] explore a novel approach to replacing traditional mouse devices by
using a live webcam to recognize and process hand gestures. The proposed system uses
computer vision techniques to replicate all mouse functionalities, including cursor control,
clicks, and scrolling, through hand movements. Unlike conventional methods that rely on
hardware modifications or peripherals, this solution leverages real-time video analysis for
gesture detection and tracking. The study emphasizes how this technology represents a
significant advancement in HCI, potentially making physical input devices obsolete.
Neha Sabrin TK et al. [3] investigate the use of hand gestures as a replacement for
traditional mouse input, focusing on creating a user-friendly, budget-friendly virtual mouse
system. The authors utilize machine learning algorithms and sensor-based gesture
recognition to translate hand movements into cursor actions on the screen. Their study finds
the system to be accurate and effective in diverse environments, showcasing potential
applications beyond conventional settings, such as underwater operations, space missions,
and other remote or extreme scenarios. This work underlines the relevance of touchless
technology and its ability to enhance HCI across various fields.
Xue Xue et al. [4] propose a cost-effective simulated mouse system using dynamic hand
gesture recognition with a Kinect camera. Unlike glove-based or hardware-dependent
systems, their approach relies on identifying the palm node as a skeleton point, allowing
for simplified gesture tracking without complex motion detection algorithms. This reduces
equipment needs and user constraints while increasing the recognition rate. Their
experimental results show a gesture recognition accuracy of 99.2%, significantly
outperforming traditional hand-tracking models.
P.C.D Kalaivaani et al. [5] focus on developing a gesture-based virtual mouse system aimed
at improving HCI, particularly for users with accessibility challenges. The system uses a
basic webcam to detect and process hand movements, replacing traditional input devices.
It allows users to control cursor motions and perform click functions without touching the
hardware. This camera-based solution is especially relevant in the context of minimizing
contact with shared devices in public spaces, helping reduce the spread of contagious
diseases. The study promotes the role of such technology in creating a more inclusive and
hygienic computing environment.
P. High Court Durai et al. [6] design a touchless virtual mouse system to reduce physical
contact with digital devices in public spaces. Recognizing the health concerns raised during
the pandemic, the authors present a solution based on hand gesture recognition using
computer vision. The system captures real-time images via a webcam, processes them, and
assigns various mouse actions to corresponding hand gestures. For example, a click is
performed by joining the index and middle fingers. Implemented using Python, this model
allows for effective control of the cursor without a physical mouse, thereby promoting
hygiene and supporting the next stage of HCI evolution.
These studies collectively highlight the growing importance and feasibility of gesture-
based systems as effective alternatives to traditional input methods. The reviewed literature
emphasizes accuracy, accessibility, hygiene, and adaptability in various use-case
environments. These findings strongly support the continued development and refinement
of virtual mouse systems that leverage computer vision and real-time gesture recognition
for natural and contactless interaction.


2.2. Inferences Drawn from Literature Review

The analysis of current literature brings out the accelerated development and usability of
gesture-based Human-Computer Interaction (HCI), specifically in the field of virtual
mouse systems. A consistent finding across studies is the growing use of
computer vision and real-time hand tracking to replicate the functionality of a conventional
physical mouse. Such systems typically use technologies like OpenCV, MediaPipe, or depth
sensors like Kinect to capture hand movements, allowing control
over cursor dynamics, clicking, and scrolling actions.
One of the main drivers of these developments is the minimization of reliance on physical
input devices. This is particularly important for optimizing accessibility for individuals
with physical disabilities and minimizing touch in public or shared computing
environments, which is particularly relevant in the context of health crises like the COVID-
19 pandemic. Studies also establish that gesture-controlled systems can be used in a range
of environments, including factories, remote locations, and public kiosks, where
conventional input is impractical or undesirable.
The literature reviewed reports very high gesture recognition accuracy and system
responsiveness. For example, some systems achieve recognition rates above 99%,
indicating significant progress in reliable input interpretation. Despite these
advances, several studies identify areas for improvement, including gesture
recognition under varying lighting conditions, refined motion-smoothing algorithms,
and consistent performance across different user hand shapes and sizes.
In addition, the literature indicates the potential for expanding these systems into more
application-specific uses. Potential future enhancements include incorporating machine
learning methods for dynamic gesture recognition, as well as integrating multiple data
sources in different modalities to enhance system robustness and user flexibility.
In conclusion, the current body of work strongly underscores the promise of hand-
gesture-controlled virtual mouse systems. As computer vision and real-time processing
technologies continue to improve, gesture-control systems will provide an increasingly
seamless, natural, and touchless user experience, transforming paradigms in computing
interaction.


Chapter 3

3. PROBLEM FORMULATION AND PROPOSED WORK

3.1. Introduction

With the increasing need for more intuitive and user-friendly input systems,
conventional inputs such as the mouse and keyboard are increasingly being replaced by
gesture-based input systems. Advances in technology, particularly in computer
vision, make it possible to develop hands-free interaction systems offering a more natural
and effective method of interacting with digital devices. However, the majority of modern
gesture recognition systems struggle to deliver high accuracy, low
latency, and ease of use at the same time. The goal of this project is to build a robust
virtual mouse system that, through real-time hand tracking, allows users to move the
cursor, click, and scroll at their own discretion without any physical peripherals,
thereby lowering barriers to use and improving user comfort.

3.2. Problem Statement

The conventional method of mouse input is based largely on physical interaction, which may
be uncomfortable and inefficient, especially for disabled individuals or those wanting a more
ergonomic and intuitive interface to interact with their devices. Existing gesture-controlled
systems are often hampered by inaccurate tracking, poor gesture recognition, and
unresponsiveness. Additionally, most gesture-based systems require special
hardware or extensive calibration, limiting their utility. This project
addresses these obstacles with a real-time hand-gesture-based virtual mouse system
built on computer vision technologies such as OpenCV and MediaPipe, which efficiently
track and recognize hand movements with minimal latency. The system is user-friendly,
requiring only a standard webcam and no special hardware.


3.3 Proposed Algorithm

Gesture Detection and Hand Tracking Logic


• Hand Tracking: Hand tracking is performed in real time using the MediaPipe library to
observe the position and motion of the user's hand.
• Gesture Recognition: Specific gestures, such as pointing the index finger or opening and
closing the hand, are translated into mouse events like cursor movement, left-click, right-click,
and scrolling.
• Cursor Movement Algorithm: The system identifies the position of the hand in the
video stream and translates it into relative cursor movement on the screen. Cursor
direction and speed are derived from the hand's motion.
Mouse Actions and Scrolling Logic
• Clicking Algorithm: The system detects when the user makes a fist or extends the
index finger to simulate left or right clicks.
• Scroll Algorithm: Scrolling is triggered by sliding the index finger up or down relative
to the screen. The system distinguishes scrolling actions from other hand
movements.
• Mode Switching Algorithm: There will be a mode-switching facility whereby switching
functionality among the left and right hands will be possible, adding control flexibility.
Backend Logic and Integration
• PyAutoGUI for Simulation: The system will use PyAutoGUI to simulate scrolls, clicks,
and mouse movement based on the recognized gestures.
• Real-Time Display: Real-time feedback and guidance will be offered to the user through
webcam display so they know which gesture relates to which action.
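The decision logic above can be illustrated with a small, self-contained sketch of the click rule (tip of a finger dropping below its DIP joint). The function name, landmark choice, and coordinate convention are illustrative assumptions, not the project's actual code; the returned action would then be forwarded to PyAutoGUI.

```python
def classify_gesture(index_tip_y, index_dip_y, middle_tip_y, middle_dip_y):
    """Decide a mouse action from fingertip positions.

    Uses normalized image coordinates, where y grows downward, so
    'tip below its dip' means tip_y > dip_y.
    """
    index_folded = index_tip_y > index_dip_y    # index fingertip dropped
    middle_folded = middle_tip_y > middle_dip_y  # middle fingertip dropped
    if index_folded and not middle_folded:
        return "left_click"
    if middle_folded and not index_folded:
        return "right_click"
    return "move"  # default: just track the cursor
```

In the full system, a `"left_click"` result would trigger something like `pyautogui.click(button='left')`, subject to the click-cooldown check.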


3.4 Proposed Work

The objective of this project is to develop an efficient and easy-to-use virtual mouse system
that supports real-time mouse cursor, click, and scrolling control with hand tracking. The
project will comprise the following components:

• Real-Time Gesture Recognition and Hand Tracking: OpenCV and MediaPipe will
track the movement of the hands and convert it to make the mouse pointer move.
• Mouse Functions and Cursor Movement: The users will drive the system with simple
hand motions such as opening, closing, or pointing, which will be interpreted as cursor
movement.
• Advanced User Experience: A role-switching feature will be introduced, allowing
users to exchange their hands' roles, and real-time instructions will be displayed to
help users adapt to the gestures.
• Backend Integration: PyAutoGUI will handle mouse movement and clicking, while the
system will use OpenCV for video capture and MediaPipe for hand gesture tracking.
•User Friendliness and Accessibility: The system shall be simple to use with minimal or
no setup and calibration and shall be usable by people with physical disabilities, offering a
hands-free option in place of the traditional mouse input.

The system will provide an interactive, fluid experience that may be expanded in the future
to enable other capabilities such as voice command, multiple hand tracking, or
compatibility with other smart products. It will offer a fast, real-time, and personal solution
for controlling a mouse using hands.


Chapter 4

4. SOFTWARE ALGORITHM

Overview:
The virtual mouse system tracks and interprets hand movements from a real-time video
feed to move the mouse pointer, simulate clicks, and facilitate scrolling. The system
employs MediaPipe for hand tracking, pyautogui for emulating mouse actions, and
OpenCV for capturing and processing the video feed.

1. Hand Tracking Setup


• Library Setup:
MediaPipe's hand tracking is enabled via mp.solutions.hands to detect and
follow hand landmarks in real time. Up to two hands are detected and tracked
simultaneously, with the minimum detection and tracking confidence set to
0.8 for reliable results.
• Hand Labeling:
The application distinguishes between the right and left hand for certain actions.
Mode switching lets users assign roles (scroll versus control) to the hands
dynamically.
2. Video Capture and Screen Size
• Capture Video Feed:
The webcam captures high-definition video (1280x720). OpenCV processes the
frames to ensure smooth tracking of hand movement.
• Screen Size Calculation:
The screen size is obtained using pyautogui so the cursor moves accurately
in proportion to the hand movement.
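MediaPipe reports landmark positions as normalized [0, 1] frame coordinates, which must be scaled to the pixel dimensions that `pyautogui.size()` returns. The helper below is an illustrative sketch of that mapping, not the project's actual code:

```python
def to_screen(norm_x, norm_y, screen_w, screen_h):
    """Map normalized [0, 1] frame coordinates to screen pixels.

    In the real system, screen_w and screen_h would come from
    pyautogui.size(). Values are clamped so a hand drifting out of
    the camera frame cannot push the cursor off-screen.
    """
    nx = min(max(norm_x, 0.0), 1.0)
    ny = min(max(norm_y, 0.0), 1.0)
    return int(nx * screen_w), int(ny * screen_h)
```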
3. Control of Cursor Movement
• Smoothing the Cursor Movement:
Cursor movement is smoothed by blending the previous and current hand
positions to eliminate jitter and give the interaction a natural feel.
A smoothing factor (alpha) of 0.5 balances responsiveness and stability.
• Real-Time Mouse Movement:
The cursor trails the user's hand motion based on the ring-finger location.
The system converts the ring finger's 2D position in the frame into screen
coordinates adjusted for the screen resolution.
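The smoothing step can be sketched as a simple exponential blend. The function below is illustrative, using the alpha of 0.5 stated above; a higher alpha makes the cursor more responsive but more jittery.

```python
def smooth(prev_xy, curr_xy, alpha=0.5):
    """Blend the previous smoothed position with the newly detected one
    (exponential moving average) to suppress hand-tracking jitter."""
    px, py = prev_xy
    cx, cy = curr_xy
    return (alpha * cx + (1 - alpha) * px,
            alpha * cy + (1 - alpha) * py)
```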
4. Click Detection
• Left and Right Clicks:
Left and right clicks are detected using the index and middle fingers. A
left-click is triggered when the tip of the index finger drops below its DIP
joint, and likewise a right-click is triggered when the tip of the middle
finger drops below its DIP joint.
• Click Cooldown:
To avoid a burst of rapid clicks, a 0.3-second cooldown is enforced between
click actions.
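A minimal sketch of the cooldown logic follows. The class name and injected clock are illustrative assumptions; the real system would pass time.time() and call pyautogui.click() when the gate opens.

```python
class ClickGate:
    """Suppress repeat clicks fired within a cooldown window
    (0.3 s, matching the value stated above)."""

    def __init__(self, cooldown=0.3):
        self.cooldown = cooldown
        self.last_click = float("-inf")  # no click yet

    def try_click(self, now):
        """Return True and record the click if the cooldown has elapsed."""
        if now - self.last_click < self.cooldown:
            return False  # still cooling down: ignore this detection
        self.last_click = now
        return True       # fire the click (e.g. pyautogui.click())
```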


5. Scrolling Control
• Scroll Detection:
Scrolling is handled by the left or right hand depending on the mode settings.
The system tracks the vertical movement of the index finger and determines
the scroll direction from its movement relative to the palm.
• Scroll Sensitivity:
A scroll sensitivity factor adjusts the responsiveness of scrolling
operations. A minimum vertical displacement threshold of 0.005 (normalized
units) filters out jitter for smoother scrolling.
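The threshold test can be sketched as follows. The function and sign convention are illustrative assumptions; note that normalized image y-coordinates grow downward, so an upward finger motion is a negative delta.

```python
def scroll_step(prev_y, curr_y, threshold=0.005):
    """Turn vertical index-finger motion into a scroll direction.

    Displacements smaller than the 0.005 threshold (stated above) are
    treated as jitter and produce no scroll.
    """
    dy = curr_y - prev_y
    if abs(dy) < threshold:
        return 0               # too small: ignore as jitter
    return 1 if dy < 0 else -1  # +1 = scroll up, -1 = scroll down
```

The returned direction would then be multiplied by a sensitivity factor and passed to something like `pyautogui.scroll()`.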
6. Mode Switching
• Hand Role Assignment:
The system lets users dynamically swap the control and scroll roles between
the left and right hands via the 'M' key, offering flexibility for different
user preferences.
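A minimal sketch of the role swap, assuming an illustrative class and key value (the real system would read the key code from cv2.waitKey):

```python
class HandRoles:
    """Track which hand drives the cursor and which scrolls; pressing
    'M' (as described above) swaps the two roles."""

    def __init__(self):
        self.cursor_hand = "Right"  # default: right hand moves/clicks
        self.scroll_hand = "Left"   # default: left hand scrolls

    def handle_key(self, key):
        if key in ("m", "M"):
            self.cursor_hand, self.scroll_hand = (
                self.scroll_hand, self.cursor_hand)
```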
7. Real-Time Instructions and Display
• Usage Instructions:
A semi-transparent overlay at the top of the screen displays real-time usage
instructions that guide the user. The instructions change with the active mode
(left-hand or right-hand control/scroll).
• Video Feed Display:
The instructions update dynamically so the user always knows which hand
currently controls the cursor and which scrolls.
8. Real-Time Updates and UI
• Camera Preview:
The video stream is resized and shown in its own window, giving the user real-time feedback on hand movements and gestures and improving the overall experience.
• Exit and Mode Switching:
Users can exit the application with the 'Q' key, while the 'M' key switches the control/scroll roles between the left and right hands without interruption.
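The key handling reduces to a small dispatch on the value returned by OpenCV's cv2.waitKey; a display-free sketch of that state logic (function and dictionary keys are our own names):

```python
def handle_key(key, state):
    """Update application state for the 'Q' (quit) and 'M' (swap hands) keys.

    `state` is a dict such as {"running": True, "control_hand": "right"};
    a copy is returned so the caller's original is left untouched."""
    state = dict(state)
    if key in (ord('q'), ord('Q')):
        state["running"] = False
    elif key in (ord('m'), ord('M')):
        state["control_hand"] = "left" if state["control_hand"] == "right" else "right"
    return state

s = {"running": True, "control_hand": "right"}
s = handle_key(ord('m'), s)  # swap: the left hand now controls the cursor
print(s)  # prints {'running': True, 'control_hand': 'left'}
```

In the live loop the same function would be fed `cv2.waitKey(1) & 0xFF` each frame, and the loop would exit once `state["running"]` becomes False.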
9. Final Notes
• Hand Tracking and Gesture Recognition:
MediaPipe provides robust hand tracking with precise hand-landmark detection. The system focuses on the meaningful landmarks (the index, middle, and ring fingers) to carry out actions such as clicks and scrolling.
• Usability:
The system provides smooth, natural control of the cursor, clicking, and scrolling from basic hand movements, improving interaction without a traditional mouse or touchpad.


Chapter 5

5. RESULTS AND DISCUSSIONS

5.1 Model Performance

• Accuracy of Gestures:
The virtual mouse system converts hand movements into gestures with high precision, allowing the cursor to follow the user's hand without noticeable lag. Real-time tracking yields a smooth and responsive interface.

• Click and Scroll Detection:
The click and scroll detection functions are sensitive and accurate, and each action (left click, right click, scroll) produces reliable feedback.

• User Experience:
The system's intuitive, hands-free nature provides an engaging experience with a simple setup.

5.2 Feature Importance

• Hand Landmarks:
The index, middle, and ring fingers are the most significant hand landmarks used to identify gestures. The relative positions of these landmarks help distinguish movement, scrolling, and click gestures.

• Smoothing Factor:
The alpha smoothing factor plays a critical role in producing natural, smooth cursor movement.

5.3 Visualization of Results

• Gesture Heatmap:
A heatmap of cursor positions can be plotted to visualize hand movements, confirming that the system tracks the hand's motion correctly.
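One way to build such a heatmap is to bin the stream of normalized cursor positions into a fixed grid; a minimal sketch (the grid size is an arbitrary choice, and plain lists are used so the snippet has no dependencies):

```python
def accumulate_heatmap(positions, bins=32):
    """Count normalized (x, y) positions into a bins x bins grid.

    Hot cells reveal where the hand lingered; a uniform spread suggests
    the tracker followed the hand across the whole frame."""
    heat = [[0] * bins for _ in range(bins)]
    for x, y in positions:
        col = min(int(x * bins), bins - 1)  # clamp x == 1.0 into the last bin
        row = min(int(y * bins), bins - 1)
        heat[row][col] += 1
    return heat

heat = accumulate_heatmap([(0.1, 0.1), (0.1, 0.1), (0.9, 0.9)])
print(sum(map(sum, heat)))  # prints 3: every sample landed in some cell
```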

• Performance Metrics:
Response time and click/scroll accuracy are the most important performance measures; plotting them over time helps gauge the effectiveness of the gesture control system.
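Response time can be logged by timing each frame's processing step; a sketch of the bookkeeping only, with the camera loop replaced by a stand-in workload (class and method names are ours):

```python
import time
from statistics import mean

class LatencyMeter:
    """Collects per-frame processing times and reports simple statistics."""
    def __init__(self):
        self.samples = []

    def record(self, fn, *args):
        """Time one call to `fn` and keep the elapsed seconds."""
        t0 = time.perf_counter()
        result = fn(*args)
        self.samples.append(time.perf_counter() - t0)
        return result

    def summary(self):
        return {"frames": len(self.samples),
                "mean_s": mean(self.samples) if self.samples else 0.0,
                "worst_s": max(self.samples, default=0.0)}

meter = LatencyMeter()
for _ in range(5):
    meter.record(sum, range(10_000))  # stand-in for the per-frame pipeline
print(meter.summary()["frames"])  # prints 5
```

In the real application `record` would wrap the detect-landmarks-and-dispatch step of each frame, and the summary could be plotted over a session.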

5.4 Discussion

• Advantages:
o Accuracy and Speed: The system delivers accurate gesture detection and quick response
time.
o Seamless Integration: The virtual mouse integrates smoothly with existing user
interfaces to improve interaction without the need for extra hardware.
o User Flexibility: Mode-switching makes users more flexible, and any hand can use the
system.
• Limitations & Future Enhancements:
o Gesture Complexity: Additional gestures can be introduced to broaden functionality
(multi-finger gestures for complex operations).
o Device Support: Subsequent versions may incorporate support for mobile devices or
compatibility with smart glasses for augmented reality experiences.
Conclusion:
The virtual mouse system successfully combines hand-gesture recognition with real-world usability, providing an intuitive input method for cursor control, click events, and scrolling with minimal delay.

Chapter 6

6. CONCLUSIONS AND FUTURE SCOPE

6.1 Conclusions

The virtual mouse project has demonstrated an innovative, hands-free method of controlling a computer using real-time tracking of hand gestures. Using OpenCV, MediaPipe, and pyautogui, the system effectively maps hand movement to cursor actions, with the additional capabilities of clicking and scrolling. The project highlights the potential of computer vision in creating intuitive human-computer interaction systems.
Major accomplishments are:

• Real-Time Gesture Recognition: The system successfully tracks hand gestures, including cursor movement, left- and right-clicking, and scrolling.
• Intuitive Control: Controlling the cursor and clicking through hand movement is a more natural user action, providing an immersive experience.
• Flexible Mode Switching: The ability to switch each hand's role between cursor control and scrolling provides flexibility and customization for the user.
• Real-Time Instructions: The on-screen instructions help the user, making the system
easy to operate and understand without particular training.
• Seamless Integration: The project integrates seamlessly into a normal computing environment, in which users can navigate their system entirely through hand movements.

In general, the project demonstrates the capability of gesture-based control systems, providing a hands-free, interactive, and intuitive way of interacting with technology.

6.2 Future Scope

While the virtual mouse project has reached a functional and stable state, there are several
avenues for improvement and future expansion:
1. Gesture Recognition Improvements
o Multi-Finger Gestures: Extend gesture recognition to support multi-finger gestures for more sophisticated actions, such as swipes or pinch-to-zoom.
o Improved Tracking: Enhance the precision and responsiveness of hand
tracking by introducing machine learning models that have been trained to detect
smaller hand motions or more subtle gestures.
o Finger Pose Detection: Add finger pose recognition so users can manipulate
tasks such as object rotation or scaling using their fingers in 3D space.
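As one concrete example of such an extension, a pinch gesture is usually detected by comparing the distance between the thumb tip and index tip (MediaPipe landmarks 4 and 8); a hedged sketch, with an illustrative threshold that would need tuning:

```python
import math

THUMB_TIP, INDEX_TIP = 4, 8
PINCH_THRESHOLD = 0.05  # normalized units; illustrative, not from the report

def is_pinching(landmarks):
    """True when the thumb tip and index tip are closer than the threshold.

    `landmarks` is a sequence of (x, y) normalized coordinates."""
    tx, ty = landmarks[THUMB_TIP]
    ix, iy = landmarks[INDEX_TIP]
    return math.hypot(tx - ix, ty - iy) < PINCH_THRESHOLD

lm = [(0.0, 0.0)] * 21
lm[THUMB_TIP] = (0.50, 0.50)
lm[INDEX_TIP] = (0.52, 0.51)   # fingertips nearly touching
print(is_pinching(lm))  # prints True
```

Tracking the pinch distance over successive frames (rather than just thresholding it) would give the continuous zoom factor needed for pinch-to-zoom.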
2. User Personalization
o Hand Preference Learning: The system can learn and retain user preferences, such as the preferred control or scroll hand, and adjust automatically based on usage patterns.
o Customizable Gestures: Allow users to define their own gestures for specific functions, such as launching a particular application or performing an action.
3. Performance Optimization
o Latency Reduction: Increase processing speed and reduce latency so that cursor movement is even smoother and clicks respond faster. This can be achieved through optimized hand-detection algorithms and more capable hardware.
o Cross-Platform Support: Make the system function smoothly across platforms
(Windows, macOS, Linux) and interact with standard operating systems'
accessibility features.
4. Advanced Interaction Features
o Voice Control Integration: Integrate voice command with gesture control to
introduce a more powerful interaction framework. For instance, users would be
able to say "click" to mimic a click without executing the associated gesture.
o Haptic Feedback: Provide haptic feedback through a wearable device so that
users receive a vibration when they undertake an activity such as a click or scroll.

o Augmented Reality Integration: Use AR to project a virtual cursor on the real
world, allowing new types of user interaction with physical devices and objects.
5. User Interface and Experience Enhancements
o Enhanced On-Screen Instructions: Improve the instructions by adding visual indicators or animations that show the user how to perform specific gestures.
o Interface Customization: Provide users with the ability to change the location,
color, and font size of on-screen instructions according to their needs.
6. Deployment and Scalability
o Cloud Integration: Host the system on the cloud to enable remote access
through devices such as smartphones and tablets by integrating with cloud
platforms such as AWS or Google Cloud.
o Mobile Version Development: Create a mobile version of the system so users can control their phone or tablet with hand gestures, providing a fully cross-platform experience.

By incorporating these enhancements, the virtual mouse project could evolve into a powerful
and versatile tool, suitable for various applications such as accessibility, gaming, and interactive
environments. The development of additional features will enable the system to offer even more
immersive, user-friendly, and practical solutions for hands-free interaction with technology.

REFERENCES

[1] N. R. S. Kumar, A. V. Reddy, B. A. Kumar, B. P. K. Reddy and A. S. K. Reddy, "Hand Gesture-Based Virtual Mouse with Advanced Controls Using OpenCV and Mediapipe," 2024 International Conference on Power, Energy, Control and Transmission Systems (ICPECTS), Chennai, India, 2024, pp. 1-6, doi: 10.1109/ICPECTS62210.2024.10780144.

[2] A. Singh, S. Sagar, S. Bhatt and T. Upadhyay, "Real Time Virtual Mouse System Using Hand Tracking Module Based on Artificial Intelligence," 2023 International Conference on Communication, Security and Artificial Intelligence (ICCSAI), Greater Noida, India, 2023, pp. 36-39, doi: 10.1109/ICCSAI59793.2023.10421198.

[3] N. S. TK and A. Karande, "Real-Time Virtual Mouse using Hand Gestures for Unconventional Environment," 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), Delhi, India, 2023, pp. 1-6, doi: 10.1109/ICCCNT56998.2023.10308331.

[4] X. Xue, W. Zhong, L. Ye and Q. Zhang, "The simulated mouse method based on dynamic hand gesture recognition," 2015 8th International Congress on Image and Signal Processing (CISP), Shenyang, China, 2015, pp. 1494-1498, doi: 10.1109/CISP.2015.7408120.

[5] P. C. D. Kalaivaani, R. Kumaresan, A. J. Manow Ranjith and S. Nandha Kumar, "Hand Gestures for Mouse Movements with Hand and Computer Interaction model," 2023 International Conference on Computer Communication and Informatics (ICCCI), Coimbatore, India, 2023, pp. 1-6, doi: 10.1109/ICCCI56745.2023.10128273.

[6] P. High Court durai, S. Jayasathyan, V. Shanjai Sethupathy and M. Bharathi, "Implementation of Real Time Virtual Clicking using OpenCV," 2022 8th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, 2022, pp. 729-732, doi: 10.1109/ICACCS54159.2022.9785293.
APPENDIX-I

SYSTEM ARCHITECTURE

APPENDIX - II
SOURCE CODE
INFORMATION REGARDING STUDENT(S)

STUDENT NAME: DEEKSHITH R
EMAIL ID: deekshithdeekshi0408@gmail.com
PERMANENT ADDRESS: #1535, 4th Cross, Ganapati Temple Road, Sannakkibayalu, Kamakshipalya, Bangalore - 79
PHONE NUMBER: 7483323197

STUDENT NAME: POORNESH D
EMAIL ID: dayananddayanand65@gmail.com
PERMANENT ADDRESS: #5, 11th Cross, Sanjeevni Nagar, near Veeranjaneya Temple, Hegganalli Cross, Sunkadkatte, Bangalore - 91
PHONE NUMBER: 8310245665

STUDENT NAME: HRISHIKESH U GOWDA
EMAIL ID: hrishikeshgowda266@gmail.com
PERMANENT ADDRESS: Hale Gabbadi, S/O: G K Umashanakar, Harohalli Hobali, Kaggala Halli, Ramanagar, Karnataka - 562112
PHONE NUMBER: 9972421527

STUDENT NAME: PRANSHU JAIN
EMAIL ID: jainpranshu415@gmail.com
PERMANENT ADDRESS: Parmarwara Sagwara, near Yogendra Giri, Sagwara, Dungarpur - 314025
PHONE NUMBER: 7023482220

STUDENT NAME: BATHALA HARSHA
EMAIL ID: harshabathala21@gmail.com
PERMANENT ADDRESS: 144, Valley View Apartments, Canara Bank Colony, Chandra Layout, Bengaluru - 560040
PHONE NUMBER: 6362017792
Photograph Along with Guide
