A virtual keyboard that lets users type with hand gestures via webcam, implemented with OpenCV and CVZone. The application uses hand tracking to detect fingertip positions and enables typing through hover and pinch gestures.
- Real-time hand tracking
- Virtual keyboard layout
- Typing through hand gestures
- Interactive visual feedback
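The virtual keyboard layout in the feature list boils down to a grid of key rectangles that can be drawn with OpenCV and hit-tested against a fingertip. A minimal sketch of one way to represent it; `KEY_ROWS`, `build_buttons`, and the pixel sizes are illustrative assumptions, not the repository's actual code:

```python
# Illustrative layout helper (not the project's actual code): each key in a
# QWERTY grid gets an on-screen rectangle used for drawing and hit-testing.
KEY_ROWS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM"]

def build_buttons(key_size=85, gap=15, origin=(50, 50)):
    """Return (char, x, y, w, h) tuples, one per key, row by row."""
    ox, oy = origin
    buttons = []
    for row_idx, row in enumerate(KEY_ROWS):
        for col_idx, ch in enumerate(row):
            x = ox + col_idx * (key_size + gap)
            y = oy + row_idx * (key_size + gap)
            buttons.append((ch, x, y, key_size, key_size))
    return buttons
```

Each rectangle can then be passed to `cv2.rectangle` for display and compared against the index fingertip each frame.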
- Python 3.7+
- OpenCV
- CVZone
- NumPy
- Clone the repository
git clone https://github.com/yourusername/virtual-hand-gesture-keyboard.git
cd virtual-hand-gesture-keyboard
- Create a virtual environment (optional but recommended)
python -m venv venv
source venv/bin/activate # On Windows use `venv\Scripts\activate`
- Install required dependencies
pip install -r requirements.txt
- opencv-python
- cvzone
- numpy
Run the script with:
python main.py
- Hover your hand over a key to highlight it
- Pinch (bring thumb and index finger close together) to select a key
- The selected text appears in the text box at the bottom
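The hover-then-pinch interaction above reduces to two checks per frame: is the index fingertip inside a key's rectangle, and are the thumb and index tips close enough to count as a click? A minimal sketch; the function names and the 40-pixel threshold are illustrative assumptions, not values from the source:

```python
import math

def fingertip_over_key(tip, rect):
    """True if the fingertip (x, y) lies inside the key rectangle (x, y, w, h)."""
    px, py = tip
    x, y, w, h = rect
    return x <= px <= x + w and y <= py <= y + h

def is_pinch(thumb_tip, index_tip, threshold=40):
    """Count the gesture as a click when the two tips are closer than threshold pixels."""
    dx = thumb_tip[0] - index_tip[0]
    dy = thumb_tip[1] - index_tip[1]
    return math.hypot(dx, dy) < threshold
```

In practice the threshold may need tuning per camera and hand size, which is why calibration is listed under limitations below.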
- Requires good lighting
- May need calibration for different hand sizes
- Dependent on webcam quality
- OpenCV
- CVZone
- Inspired by computer vision interaction techniques
MediaPipe is a Google-developed, open-source framework for building cross-platform AI-powered multimedia solutions. For hand tracking, it:
- Detects up to two hands simultaneously (the default; configurable)
- Tracks 21 hand landmarks in real-time
- Provides 3D coordinate tracking
- Works across different platforms
- Offers high reported accuracy (~95% average precision in palm detection)
- Low computational requirements
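The 21-landmark model mentioned above assigns a fixed index to each point on the hand; the fingertip indices below follow MediaPipe's published numbering (wrist = 0, thumb tip = 4, index tip = 8, and so on). These indices are what the keyboard reads to locate the hovering finger; the helper function is an illustrative assumption:

```python
# MediaPipe's 21-point hand model uses fixed landmark indices; these are the
# fingertip indices relevant to the keyboard (numbering per MediaPipe's docs).
NUM_LANDMARKS = 21
WRIST = 0
FINGERTIPS = {"thumb": 4, "index": 8, "middle": 12, "ring": 16, "pinky": 20}

def fingertip_xy(lm_list, finger):
    """Pick one fingertip's (x, y) from a 21-entry landmark list of (x, y) pairs."""
    return lm_list[FINGERTIPS[finger]]
```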
Key Uses:
- Gesture recognition
- Virtual interfaces
- Computer vision applications
- Interactive experiences
How It Works in Our Virtual Keyboard:
- Tracks index finger for hovering
- Measures thumb-index distance for clicks
- Enables touchless typing through hand gestures
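One detail worth noting about the click step: the thumb-index distance stays below the threshold for many consecutive frames, so a naive per-frame check would type the same key repeatedly. A common fix, sketched below, is to fire only on the frame where the pinch first closes (the class name is an assumption, not from the source):

```python
class PinchClicker:
    """Turn a per-frame pinch flag into discrete click events: fire once
    when the pinch closes, then stay silent until the fingers separate."""

    def __init__(self):
        self._was_pinched = False

    def update(self, pinched):
        # A click is the rising edge: pinched now, but not on the previous frame.
        clicked = pinched and not self._was_pinched
        self._was_pinched = pinched
        return clicked
```

Each frame, `update()` is fed the boolean from the distance check; only the rising edge produces a keystroke.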
The framework simplifies complex machine learning tasks, making advanced computer vision accessible and efficient.