Inspiration
Video: https://youtu.be/UZ5P2KzhZBg
What it does
Syne is a TensorFlow-based sign language recognition system that enables people who are mute to communicate efficiently with the outside world.
How we built it
The system gathers the (x, y, z) coordinates of each of the 15 joints in a hand using a Leap Motion sensor, then maps them to 3D vectors relative to the position of the palm. These vectors are normalized so that the size of the hand does not affect the accuracy of the system. Our neural network takes in the resulting 45 data points (15 joints × 3 coordinates) and, using three dense hidden layers, classifies the hand gesture as one of the 26 letters. The model was trained on thousands of readings per letter and achieved a validation accuracy of over 99% during training.
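The pipeline above can be sketched in code. This is a minimal illustration, not our exact implementation: the layer widths (128/64/32), the max-norm scaling scheme, and the function names are assumptions; the source only specifies 45 palm-relative inputs, three dense hidden layers, and 26 output classes.

```python
import numpy as np
import tensorflow as tf

NUM_JOINTS = 15   # joints tracked per hand by the Leap Motion sensor
NUM_CLASSES = 26  # one class per letter, A-Z


def to_feature_vector(joints, palm):
    """Convert raw (x, y, z) joint positions into a 45-element feature vector.

    Joints are expressed relative to the palm, then scaled by the largest
    joint distance so hand size does not influence the features.
    (The scaling choice here is an assumption for illustration.)
    """
    rel = np.asarray(joints, dtype=np.float32) - np.asarray(palm, dtype=np.float32)
    scale = np.linalg.norm(rel, axis=1).max()
    if scale > 0:
        rel /= scale
    return rel.reshape(-1)  # 15 joints * 3 coordinates = 45 features


def build_model():
    """45 inputs -> three dense hidden layers -> 26-way softmax."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(NUM_JOINTS * 3,)),
        tf.keras.layers.Dense(128, activation="relu"),  # hidden layer sizes are
        tf.keras.layers.Dense(64, activation="relu"),   # illustrative guesses
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

With labeled readings collected per letter, such a model would be trained with an ordinary `model.fit(features, labels, validation_split=...)` call.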
Accomplishments that we're proud of
This system could be set up at any facility, not only allowing people to communicate more easily in their day-to-day lives, but also opening up a plethora of employment and other opportunities. We believe in empowering those with disabilities to carry out everyday tasks independently and seamlessly. With Syne, we eliminate the need to rely on translators and open up a world of possibilities.
Built With
- leap-motion
- machine-learning
- matrices
- python
- tensor-flow
- vectors