Hand Landmark Control System

Overview

This project implements a real-time hand gesture recognition system using computer vision and machine learning. It allows users to control mouse movements, lock the screen, and play sound effects through specific hand gestures detected via a webcam.

The system uses MediaPipe for hand landmark detection, OpenCV for video capture and display, PyAutoGUI for mouse and system control, and Pygame for audio playback.
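
As a rough illustration of how these pieces fit together (not the exact contents of hand.py), a minimal MediaPipe/OpenCV landmark loop looks like this:

    # Minimal hand-landmark loop: capture frames, detect landmarks, draw them.
    import cv2
    import mediapipe as mp

    mp_hands = mp.solutions.hands
    mp_draw = mp.solutions.drawing_utils

    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand in results.multi_hand_landmarks:
                    mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
            cv2.imshow("Hand Landmark Control System", frame)
            if cv2.waitKey(1) & 0xFF == ord('q'):
                break
    cap.release()
    cv2.destroyAllWindows()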

Features

  • Mouse Control: Control the computer mouse using the left hand's index finger for cursor movement and thumb for clicking.
  • Screen Lock: Lock the Windows screen by displaying the middle finger gesture with the left hand.
  • Sound Playback: Play a sound effect by showing a thumbs up gesture with the left hand.
  • Gesture Toggle: Enable or disable gesture recognition for safety.
  • GPU Support: Optional GPU acceleration for improved performance and accuracy.
  • Real-time Feedback: Debug information displayed on-screen for gesture status.

Requirements

  • Python 3.8 or higher
  • Webcam
  • Windows operating system (for screen lock functionality)
  • Required Python packages:
    • mediapipe
    • opencv-python
    • pyautogui
    • pygame
    • numpy
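
If you need to recreate requirements.txt, the package list above translates directly (versions left unpinned here):

    mediapipe
    opencv-python
    pyautogui
    pygame
    numpy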

Installation

  1. Clone or download the repository:

    git clone https://github.com/0nsec/hand_controler.git
    cd hand_controler
    
  2. Install the required packages:

    pip install -r requirements.txt
    
  3. Ensure the sound file effects/tuturu_1.mp3 exists in the project directory.

Usage

Run the main script:

python hand.py

To enable GPU acceleration (higher model complexity for better accuracy):

python hand.py --gpu

A window will open showing the webcam feed with hand landmarks and debug information.
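
How the --gpu flag is wired up is an implementation detail of hand.py; one plausible sketch (the argument handling here is an assumption) is to raise MediaPipe's model_complexity when the flag is set, which trades speed for accuracy:

    # Hypothetical sketch: map a --gpu flag to the heavier MediaPipe hand model.
    import argparse
    import mediapipe as mp

    parser = argparse.ArgumentParser()
    parser.add_argument("--gpu", action="store_true",
                        help="use the larger, more accurate hand model")
    args = parser.parse_args()

    hands = mp.solutions.hands.Hands(
        model_complexity=1 if args.gpu else 0,  # 1 = larger, more accurate model
        max_num_hands=2,
        min_detection_confidence=0.7,
    )

Whether inference actually runs on the GPU depends on the MediaPipe build installed on your system.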

Controls

  • Mouse Control:
    • Extend your left index finger to move the cursor.
    • Bring your left thumb close to the index finger to click.
  • Screen Lock:
    • With your left hand, extend the middle finger while keeping the other fingers curled.
  • Sound Playback:
    • With your left hand, raise your thumb clearly above the other fingers while keeping them curled.
  • General:
    • Press 'g' to toggle gesture recognition on/off.
    • Press 'm' to toggle mouse control on/off.
    • Press 'q' to quit the application.
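
The gestures above come down to simple geometric checks on MediaPipe's 21 hand landmarks (thumb tip = 4, index tip = 8, middle tip = 12). A rough sketch of how such checks can be written (the thresholds are illustrative, not the values used in hand.py):

    # Illustrative gesture heuristics on normalized landmark coordinates
    # (image y grows downward, so "extended upward" means a smaller y).
    import math

    THUMB_TIP, INDEX_TIP, MIDDLE_TIP, RING_TIP, PINKY_TIP = 4, 8, 12, 16, 20
    INDEX_PIP, MIDDLE_PIP, RING_PIP, PINKY_PIP = 6, 10, 14, 18

    def is_click(landmarks, threshold=0.05):
        """Thumb tip close to index tip -> treat as a mouse click."""
        thumb, index = landmarks[THUMB_TIP], landmarks[INDEX_TIP]
        return math.hypot(thumb.x - index.x, thumb.y - index.y) < threshold

    def is_middle_finger(landmarks):
        """Middle finger extended (tip above its PIP joint) while the others stay curled."""
        extended = landmarks[MIDDLE_TIP].y < landmarks[MIDDLE_PIP].y
        curled = all(landmarks[tip].y > landmarks[pip].y
                     for tip, pip in [(INDEX_TIP, INDEX_PIP),
                                      (RING_TIP, RING_PIP),
                                      (PINKY_TIP, PINKY_PIP)])
        return extended and curled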

Troubleshooting

  • Gestures not detected: Ensure good lighting and clear hand visibility in the camera frame.
  • Mouse not moving: Check that mouse control is enabled ('m' key) and your left hand is detected.
  • Sound not playing: Verify the effects/tuturu_1.mp3 file exists and Pygame is installed correctly.
  • Performance issues: Lower camera resolution or close other applications if the system lags.
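
If performance is the bottleneck, the capture resolution can be lowered directly in OpenCV before the frame loop, for example:

    # Request a smaller capture size to reduce per-frame processing cost.
    import cv2

    cap = cv2.VideoCapture(0)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)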

Technical Details

  • Hand detection uses MediaPipe's hand landmark model with 21 keypoints per hand.
  • Gesture recognition is based on relative positions of finger landmarks.
  • Mouse control implements smoothing for natural cursor movement.
  • Screen lock uses Windows API calls for secure locking.
  • Audio playback supports MP3 files via Pygame.
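
To make the last three points concrete, here is a hedged sketch of the underlying calls; the smoothing factor and screen mapping are illustrative, and hand.py may organize this differently:

    # Illustrative snippets for cursor smoothing, screen lock, and sound playback.
    import ctypes
    import pyautogui
    import pygame

    screen_w, screen_h = pyautogui.size()
    prev_x, prev_y = screen_w / 2, screen_h / 2
    SMOOTHING = 0.2  # 0 = frozen, 1 = no smoothing (value is illustrative)

    def move_cursor(norm_x, norm_y):
        """Exponentially smooth the normalized fingertip position and move the mouse."""
        global prev_x, prev_y
        target_x, target_y = norm_x * screen_w, norm_y * screen_h
        prev_x += (target_x - prev_x) * SMOOTHING
        prev_y += (target_y - prev_y) * SMOOTHING
        pyautogui.moveTo(prev_x, prev_y)

    def lock_screen():
        """Lock the Windows session via the Win32 API."""
        ctypes.windll.user32.LockWorkStation()

    def play_sound():
        """Play the bundled MP3 effect through Pygame's mixer."""
        pygame.mixer.init()
        pygame.mixer.music.load("effects/tuturu_1.mp3")
        pygame.mixer.music.play()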
