Home
- What Is HandWave?
- What Is HandWave For?
- What Technologies Underlie HandWave?
- How Does HandWave Accomplish Its Goals?
- Authors
- Licenses and Issues
What Is HandWave?
In the rapidly evolving landscape of AI and Machine Learning, innovation is inevitable. HandWave is a cutting-edge application that harnesses hand gestures and facial recognition to enable computer control. Using camera-based gesture and face detection, HandWave simplifies computer interaction, particularly in remote presentation scenarios, serving as a versatile tool that enhances accessibility and boosts productivity.
History of HandWave
HandWave traces its roots back to the original idea of remote presentation control, built on the Hand Gesture Recognition library from Google MediaPipe. Initially developed by a team of four Metropolia students in 2023, the project has continually evolved and expanded its capabilities. Over time, it has integrated additional libraries and features, aiming to deliver both convenience and pioneering solutions for future endeavors.
What Is HandWave For?
HandWave is designed to do the following:
- Transform hand movements into intuitive commands for navigating the computer.
- Track face landmarks to prevent unintentional keystrokes.
- Select the screens or applications to which gestures apply.
- Record the screen and save recordings to the local drive.
- Support near real-time monitoring, recording, and operational control.
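The first two points combine into a simple control flow: a recognized gesture triggers a command only while a face is detected in frame. The sketch below models that dispatch in plain TypeScript; the gesture labels and command names are illustrative assumptions, not HandWave's actual configuration.

```typescript
// Illustrative gesture-to-command table. The labels follow the style of
// MediaPipe's canned gesture names, but the mapping itself is a made-up example.
const gestureCommands: Record<string, string> = {
  Thumb_Up: "next_slide",
  Thumb_Down: "previous_slide",
  Open_Palm: "pause_recording",
};

// A command fires only while a face is visible, mirroring HandWave's use of
// face landmark tracking to suppress unintentional keystrokes.
function resolveCommand(gesture: string, faceDetected: boolean): string | null {
  if (!faceDetected) return null;         // no presenter in frame: ignore gestures
  return gestureCommands[gesture] ?? null; // unrecognized gestures are ignored too
}
```

For example, `resolveCommand("Thumb_Up", true)` resolves to a command, while the same gesture with no face in frame resolves to `null`.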
What Technologies Underlie HandWave?
HandWave consists of:
- Front-end: React, TypeScript, Vite, Google Mediapipe
- Back-end: Electron, Sequelize, SQLite3, nut-js
How Does HandWave Accomplish Its Goals?
HandWave operates by:
- Leveraging the HaGRID Dataset, an invaluable resource for the development of hand gesture recognition systems.
- Incorporating Google Mediapipe solutions, such as the Hand Gesture Recognizer and Face Landmarking, to effectively detect and interpret gestures and facial features.
- Using TypeScript as the primary programming language, with Vite as the module bundler.
- Using Electron.js to ensure cross-platform compatibility across various operating systems (Windows, Linux, Mac).
- Employing nut.js for comprehensive desktop automation, including mouse manipulation, keyboard input, and on-screen image recognition.
- Utilizing Sequelize and SQLite3 to manage the database and persist custom gestures with associated keystrokes.
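The last point — persisting custom gestures with associated keystrokes — amounts to a small name-to-keystroke table. In HandWave that table lives in SQLite via Sequelize; the snippet below is a minimal in-memory stand-in with the same shape. The `CustomGesture` record and its upsert semantics are assumptions for illustration, not the app's actual schema.

```typescript
// Hypothetical record shape for a user-defined gesture binding.
interface CustomGesture {
  name: string;      // gesture label, e.g. "Open_Palm"
  keystroke: string; // keys to send when the gesture fires, e.g. "Ctrl+S"
}

// In-memory stand-in for a Sequelize/SQLite3 table: upsert keyed by gesture name.
class GestureStore {
  private rows = new Map<string, CustomGesture>();

  save(binding: CustomGesture): void {
    this.rows.set(binding.name, binding); // re-binding a gesture overwrites it
  }

  find(name: string): CustomGesture | undefined {
    return this.rows.get(name);
  }

  all(): CustomGesture[] {
    return [...this.rows.values()];
  }
}
```

With Sequelize, the equivalent would typically be a model declared via `sequelize.define` over an SQLite storage file, with `save` backed by an upsert so each gesture name maps to exactly one keystroke.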
Authors
- Tomi Jumppanen
- Roope Laine
- Anton Tugushi
- Dat Pham
Licenses and Issues
Tutorials and documentation for our application are available on this wiki. If you encounter any missing information, please create a new issue here, and our team is committed to offering a timely resolution. Your feedback is valued and helps us improve.