
Human-Computer Interaction (HCI) via Hand Gesture

Welcome to the Human-Computer Interaction via Hand Gesture project! This project provides an intuitive way for users to interact with computer systems using hand gestures.

Project Structure

1. Model Trainer Module

The model trainer module is responsible for training the machine learning model for hand gesture recognition. It includes:

  • Jupyter Notebook: Contains the code for training the machine learning model using collected hand gesture data.
  • app.py: A script for collecting datasets by capturing hand gesture coordinates.
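The internals of app.py are not shown here, but a common way to store captured hand coordinates (used, for example, in the kinivi/hand-gesture-recognition-mediapipe project this repository acknowledges) is to make each frame's 21 MediaPipe landmarks wrist-relative and scale-normalized before saving them. A minimal sketch, assuming `(x, y)` landmark pairs as input; the function name and exact normalization are illustrative, not taken from app.py:

```python
def preprocess_landmarks(landmarks):
    """Convert (x, y) landmark pairs into a flat, wrist-relative,
    scale-normalized feature list suitable for writing to a dataset row."""
    # Use the first landmark (the wrist in MediaPipe's ordering) as the origin.
    base_x, base_y = landmarks[0]
    relative = [(x - base_x, y - base_y) for x, y in landmarks]

    # Flatten to [x0, y0, x1, y1, ...] and normalize by the largest magnitude
    # so the features are independent of hand size and distance to the camera.
    flat = [value for point in relative for value in point]
    max_abs = max(map(abs, flat)) or 1.0  # avoid division by zero
    return [value / max_abs for value in flat]
```

This keeps the features invariant to where the hand appears in the frame and how large it is, which generally makes the downstream classifier easier to train.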

2. Dataset

The dataset directory contains a collection of coordinates for hand gestures along with their corresponding labels. These datasets are used for training and testing the machine learning model.
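The exact file format inside the dataset directory is not documented here. A hedged sketch of a loader, assuming each row of a CSV file holds a gesture label followed by the flattened coordinate values (the file name and column layout are illustrative assumptions):

```python
import csv

def load_dataset(path):
    """Read rows of 'label, x0, y0, x1, y1, ...' into parallel lists."""
    labels, features = [], []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue  # skip blank lines
            labels.append(int(row[0]))
            features.append([float(value) for value in row[1:]])
    return labels, features
```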

3. Src (Source Code)

The src directory is the heart of our project and contains all the necessary modules and packages for the application to work. It includes:

  • Main Code: This module utilizes the trained machine learning model and provides a working window with three modes:
    • Tracker Mode: For hand tracking.
    • Gesture Mode: For predicting hand gestures.
    • Interaction Mode: For interacting with the system based on recognized gestures.
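A minimal sketch of how switching between the three modes might be wired into the application's key-handling loop; the actual key bindings in the main code are not documented here, so the 't'/'g'/'i' keys below are illustrative choices:

```python
# Illustrative key-to-mode mapping; in an OpenCV loop the key would
# typically come from cv2.waitKey(1).
MODES = {ord("t"): "tracker", ord("g"): "gesture", ord("i"): "interaction"}

def select_mode(key, current_mode):
    """Switch to the mode bound to `key`, or keep the current mode."""
    return MODES.get(key, current_mode)
```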

Getting Started

To get started with the project:

  1. Clone the repository to your local machine.
  2. Navigate to the directory containing the project files.
  3. Set up the environment and dependencies required for running the application.
  4. Train the machine learning model using the provided Jupyter Notebook.
  5. Run the application and start interacting with the system using hand gestures!
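The steps above might look like the following on a Unix-like shell; the dependency list is an assumption (the typical MediaPipe/OpenCV/Jupyter stack), not taken from the repository:

```shell
# Clone and enter the project.
git clone https://github.com/Callmeamann/Human-Computer-Interaction-via-Hand-Gesture.git
cd Human-Computer-Interaction-via-Hand-Gesture

# Create an isolated environment and install assumed dependencies.
python -m venv .venv && source .venv/bin/activate
pip install mediapipe opencv-python numpy jupyter
```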

Creator

  • Aman Gusain

Acknowledgments

  • Ivan Goncharov: A special thanks to Ivan Goncharov for his invaluable guidance and insights shared through his YouTube channel. His expertise has been instrumental in shaping and improving this project.

  • kinivi/hand-gesture-recognition-mediapipe: We acknowledge and appreciate the work done by the contributors to the hand-gesture-recognition-mediapipe GitHub repository. This project served as a valuable learning resource, providing inspiration and insights for our own implementation.

Feel free to explore the project and contribute to its development! Thank you for your interest in HCI via Hand Gesture.
