Built with OpenCV, MediaPipe, TensorFlow, and Kivy for the user interface, Virtual Mouse allows mouse control through several hand gestures, each mapped to a mouse function such as movement, dragging, clicking, scrolling, or zooming. These mappings, along with additional settings, can be changed and configured at runtime through the Kivy application.
For each camera frame, the Virtual Mouse app follows the same pipeline: detect the hand, recognize the current gesture, and map it to the corresponding mouse action.
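At a high level, the per-frame loop looks roughly like the sketch below. This is only an illustrative outline, not the project's actual code: it assumes frames are read with OpenCV, hand landmarks come from MediaPipe Hands, and the placeholder functions `classify_gesture` and `perform_action` stand in for the trained classifier and the mouse-control logic.

```python
import cv2
import mediapipe as mp

# Illustrative sketch of the per-frame pipeline (placeholder functions, not the project's code).
hands = mp.solutions.hands.Hands(max_num_hands=1,
                                 min_detection_confidence=0.7,
                                 min_tracking_confidence=0.6)

def classify_gesture(landmarks):
    """Placeholder for the trained gesture classifier."""

def perform_action(gesture, landmarks):
    """Placeholder for mapping the recognized gesture to a mouse action."""

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)                    # mirror so movement feels natural
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB input
    results = hands.process(rgb)
    if results.multi_hand_landmarks:
        landmarks = results.multi_hand_landmarks[0].landmark  # 21 (x, y, z) points
        perform_action(classify_gesture(landmarks), landmarks)
cap.release()
```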
The dataset used for training the model was manually recorded and is available in training/collected_dataset.zip.
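For reference, a gesture classifier for this kind of landmark data could be trained along the lines of the sketch below. This is a hedged illustration, not the contents of training.ipynb: the file name landmarks.csv, the label column, and the network shape are assumptions about how the recorded data might be organized.

```python
import pandas as pd
import tensorflow as tf
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelEncoder

# Assumed layout: one row per frame, 21 landmarks x 3 coordinates, plus a gesture label column.
df = pd.read_csv("training/landmarks.csv")   # hypothetical file extracted from collected_dataset.zip
X = df.drop(columns=["label"]).values
y = LabelEncoder().fit_transform(df["label"])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1],)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(len(set(y)), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=30, validation_data=(X_test, y_test))
```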
A quick guide to the program's available settings.
- Detection Confidence: Sets the minimum confidence percentage for a hand detection to be considered successful (see the sketches after this list).
  - High Detection Confidence: Ensures that only highly confident hand detections are considered, which significantly reduces false detections but may miss more valid hand detections.
  - Low Detection Confidence: Allows more detections to be considered valid, which can capture more hand movements but may include more false detections.
- Tracking Confidence: Sets the minimum confidence percentage for the hand landmarks to be tracked successfully across frames.
  - High Tracking Confidence: Ensures stable tracking of hand landmarks, which can result in smoother and more accurate gesture recognition but may lose track of hands more easily.
  - Low Tracking Confidence: Allows for more continuous tracking of hand landmarks even with lower confidence, which can maintain tracking better but may introduce some jitter.
- Detection Responsiveness: Adjusts how fast the program reacts to gesture changes. It has four values: Instant, Fast, Normal, and Slow.
  - Instant: The program reacts immediately to gesture changes, providing the quickest response time, but may be less stable.
  - Fast: The program reacts quickly to gesture changes, balancing speed and stability.
  - Normal: The program reacts at a moderate speed, providing a stable and responsive experience.
  - Slow: The program reacts more slowly to gesture changes, prioritizing stability over speed.
- Toggle Relative Mouse: Turns relative mouse mode on or off. This can also be toggled with a gesture (the two modes are illustrated in a sketch after this list).
  - ON: Introduces touchpad-like behavior, moving the mouse relative to its previous position and the set sensitivity. This works with multiple screens.
  - OFF: Maps the hand's position in the camera frame to a screen position based on the screen size. This does not work with more than one screen.
- Relative Mouse Sensitivity: Sets the mouse sensitivity to use when relative mouse mode is on.
- Scroll Sensitivity: Sets the sensitivity or speed of scrolling when using the set gesture.
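The two confidence settings correspond directly to the standard parameters of MediaPipe Hands, so their effect can be sketched as follows. This is a minimal illustration, under the assumption that the percentage values from the settings screen are simply scaled to MediaPipe's 0.0 to 1.0 range:

```python
import mediapipe as mp

detection_confidence = 80  # Detection Confidence from the settings screen, in percent (example value)
tracking_confidence = 60   # Tracking Confidence, in percent (example value)

hands = mp.solutions.hands.Hands(
    max_num_hands=1,
    min_detection_confidence=detection_confidence / 100,  # gate for accepting a new hand detection
    min_tracking_confidence=tracking_confidence / 100,    # gate for keeping landmarks tracked across frames
)
```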
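Similarly, the difference between relative and absolute mouse mode can be illustrated with pyautogui (one of the project's listed dependencies). This is only a sketch, not the app's implementation; the sensitivity scale and the `prev_x`/`prev_y` bookkeeping are assumed for the example:

```python
import pyautogui

screen_w, screen_h = pyautogui.size()  # size of the primary screen
sensitivity = 2.0                      # Relative Mouse Sensitivity (assumed scale)
prev_x, prev_y = None, None            # previous normalized hand position

def move_cursor(hand_x, hand_y, relative_mode):
    """hand_x and hand_y are normalized (0..1) hand coordinates from the camera frame."""
    global prev_x, prev_y
    if relative_mode:
        # ON: touchpad-like movement relative to the previous hand position, scaled by sensitivity.
        if prev_x is not None:
            dx = (hand_x - prev_x) * screen_w * sensitivity
            dy = (hand_y - prev_y) * screen_h * sensitivity
            pyautogui.moveRel(dx, dy)
        prev_x, prev_y = hand_x, hand_y
    else:
        # OFF: map the camera-frame position straight onto the primary screen's coordinates.
        pyautogui.moveTo(hand_x * screen_w, hand_y * screen_h)
```

Because the relative branch only issues offsets, the cursor can wander across any monitor, while the absolute branch is tied to the primary screen's dimensions, which matches the single-screen limitation noted above.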
You can customize which gestures perform specific mouse actions (e.g., left click, right click, scroll, idle) through the settings in the Kivy application. This enables you to tailor the control scheme to your needs.
To modify gesture mappings:
- Open the application and navigate to the settings screen.
- Select the "Gesture Settings" option.
- Change the mouse actions for the desired gestures.
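Conceptually, the result of this configuration is a lookup from gesture labels to mouse actions. The snippet below is a purely hypothetical illustration of that idea; the gesture names, action names, and pyautogui calls are example choices, not the app's actual settings format:

```python
import pyautogui

# Hypothetical gesture -> action mapping, as it might be set up in the Gesture Settings screen.
gesture_actions = {
    "index_up": "move",
    "fist": "left_click",
    "peace_sign": "right_click",
    "open_palm": "idle",
}

def dispatch(gesture):
    action = gesture_actions.get(gesture, "idle")
    if action == "left_click":
        pyautogui.click()
    elif action == "right_click":
        pyautogui.rightClick()
    # "move", "scroll", zoom, etc. would be handled by the cursor and scroll logic.
```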
- Python
- Pip
- You can run these commands on Linux to install Python, pip, python3-venv, and other dependencies:
sudo apt-get update
sudo apt install python3
sudo apt install python3-pip
sudo apt install python3-venv
sudo apt-get install python3-tk python3-dev
- Install Python from the official website
- Check if pip is already installed by running:
pip help
- If pip is not installed, please check this guide for installing pip on Windows
- Clone the repository
git clone https://github.com/RamezzE/VirtualMouse-HandTracking.git
- Navigate to the project folder
cd VirtualMouse-HandTracking
- Create and activate a Python virtual environment
  - Linux
python3 -m venv venv
source venv/bin/activate
  - Windows
python -m venv venv
venv\Scripts\activate
- Install necessary pip packages
pip install -r requirements.txt
- If the above command does not work or throws an error, run the command below instead
pip install numpy mediapipe scikit-learn kivy[base] mouse pyautogui pyaml opencv-python
- If you want to run training.ipynb, then install these extra packages as well
pip install ipykernel tensorflow pandas xgboost
- Run the main application file
python main.py
- Optionally, if you'd like to run the script directly without launching the Kivy application, you can use the alternative main file
python main_no_gui.py
Most of the icons used are provided by Icons8.
This project is licensed under the MIT License.