This project enables users to control their computer using hand gestures, eliminating the need for traditional input hardware like a mouse or keyboard. It uses a webcam to detect and interpret hand movements in real time.
Developed in Python with computer vision and automation libraries, this project explores Human-Computer Interaction (HCI) through AI-based gesture recognition.
- 🖱️ Virtual Mouse: Move the cursor, click, and scroll using intuitive finger gestures.
- ⌨️ Virtual Keyboard: Interact with an on-screen keyboard by pointing at keys with finger gestures.
- 📷 Real-Time Detection: Smooth and fast hand tracking using a regular webcam.
- 💻 No Physical Input Devices Required: Designed to simulate mouse and keyboard control through hand movement alone.
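As a rough sketch of how the cursor-control step can work, the snippet below maps a fingertip position from webcam-frame coordinates to screen coordinates and applies simple exponential smoothing to reduce jitter. The helper names, frame size, and screen size here are illustrative assumptions, not the project's actual API; in the real pipeline the fingertip position would come from MediaPipe/cvzone hand landmarks and the cursor move would be done with pyautogui.

```python
# Illustrative sketch (assumed helper names, not the project's actual API).
# In the real pipeline, (x, y) would come from a MediaPipe/cvzone hand
# landmark, and the mapped point would be passed to pyautogui.moveTo().

def map_to_screen(x, y, frame_size=(640, 480), screen_size=(1920, 1080)):
    """Scale a fingertip position from webcam-frame coords to screen coords."""
    fw, fh = frame_size
    sw, sh = screen_size
    sx = x / fw * sw
    sy = y / fh * sh
    # Clamp so the cursor never leaves the screen.
    return min(max(sx, 0), sw - 1), min(max(sy, 0), sh - 1)

def smooth(prev, new, alpha=0.3):
    """Exponential smoothing: higher alpha follows the hand more closely."""
    return prev + alpha * (new - prev)

# Example: fingertip detected at (320, 240), the centre of a 640x480 frame,
# lands at the centre of a 1920x1080 screen.
sx, sy = map_to_screen(320, 240)
print(sx, sy)  # 960.0 540.0
```

Smoothing trades responsiveness for stability: a small `alpha` damps webcam noise but makes the cursor lag the hand slightly.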
- Language: Python 3.x
- Libraries Used: opencv-contrib-python, mediapipe, pyautogui, cvzone, pynput, numpy, tkinter
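Assuming a standard Python 3 environment, the libraries above can typically be installed with pip (tkinter is not on PyPI; it ships with most Python installers):

```shell
pip install opencv-contrib-python mediapipe pyautogui cvzone pynput numpy
```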