Inspiration

Modern pose-estimation models make fast, real-time prediction of people's limb positions possible. We hope to make this technology easier to integrate into existing control schemes and games, giving anyone the ability to replace a keyboard-controlled game or action with gesture control.

What it does

Our Gesture Controller lets users interact directly with computers through gesture recognition instead of a cumbersome keyboard. To make it easy to use, the program lets users create new sets of gestures and bind them to key actions. Any task that currently requires a keyboard can instead be driven by easily defined gestures, which opens up a wide range of use cases.
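The binding step can be pictured as a simple lookup table from recognized gestures to key actions. This is only an illustrative sketch; the gesture names, key names, and function are hypothetical, not the project's actual configuration format:

```python
# Hypothetical gesture-to-key bindings (illustrative only).
GESTURE_BINDINGS = {
    "arms_raised": "space",  # e.g. jump in a platformer
    "lean_left": "a",
    "lean_right": "d",
}

def key_for_gesture(gesture_name):
    """Return the key bound to a recognized gesture, or None if unbound."""
    return GESTURE_BINDINGS.get(gesture_name)
```

A real controller would pass the returned key to a keyboard-emulation layer so the target game or application sees an ordinary key press.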

How we built it

We used the WrnchAI pose-estimation algorithm to provide the positions of the body's joints at runtime. We then combined these joint positions with user input to define gestures. This lets us run the program and use gesture control to interact with the computer.
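Because the controller works from still poses, recognition can be as simple as comparing live joint positions against a recorded template. A minimal sketch, assuming joints arrive as normalized (x, y) coordinates keyed by name; the joint names and tolerance value are our assumptions, not the WrnchAI API:

```python
import math

def matches_pose(current, template, tolerance=0.1):
    """Return True if every joint in the template is within `tolerance`
    (Euclidean distance in normalized coordinates) of the corresponding
    joint in the current frame."""
    for joint, (tx, ty) in template.items():
        if joint not in current:
            return False
        cx, cy = current[joint]
        if math.hypot(cx - tx, cy - ty) > tolerance:
            return False
    return True
```

Recording a new gesture then amounts to saving the current joint dictionary as a template; checking a frame against every saved template is cheap enough to run even at the low frame rates we saw.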

Challenges we ran into

Hardware limitations: our laptops can only run the CPU version of WrnchAI, and do so at a rate of about 2 fps. This prevented us from tracking motion-based gestures and limited us to still poses. Faster pose estimation (e.g. on a GPU) would raise the frame rate enough to allow far more responsive control.

Accomplishments that we're proud of

It works! Our Gesture Controller allows for more natural, human control of computers without the use of a keyboard.

What we learned

We learned the Wrnch API, picked up some GUI programming, and learned the hard way that Python 2 is best avoided.

What's next for Gesture Controller

Stronger hardware would let us extend the system to understand motion, not just still poses. Further improvements could come from adding recognition of hand and facial features.
