Inspiration

Our inspiration was to improve the learning experience with technology through intuitive controls, efficient motions, and minimal design. We also recognize the diversity of learning styles among individuals and wanted to push the boundaries of conventional learning.

What it does

We used the Leap Motion device to detect hand gestures and translate them into practical digital actions. It serves as a highly customizable controller that can be integrated with other devices and personalized to the user's preferences.

How we built it

We processed the Leap Motion's three-dimensional vectorized data to define custom hand gestures, analyzing the positions, directions, and movements of individual fingers. We then used a real-time Firebase back end to connect the controller with other devices.
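As a rough illustration of gesture definition from per-finger direction vectors, the sketch below classifies an "open palm" by checking that each finger points roughly perpendicular to the palm normal. The vectors, gesture name, and threshold are hypothetical stand-ins, not the actual SynHaptic gesture set or the Leap Motion SDK's API.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

def angle_between(a, b):
    # Angle in radians between two 3-D vectors.
    return math.acos(dot(a, b) / (norm(a) * norm(b)))

def is_open_palm(finger_directions, palm_normal, max_curl_rad=0.5):
    """Hypothetical 'open palm' check: all five fingers extended, i.e.
    each finger direction is nearly perpendicular to the palm normal."""
    if len(finger_directions) != 5:
        return False
    for d in finger_directions:
        # An extended finger points roughly 90 degrees from the palm normal.
        if abs(angle_between(d, palm_normal) - math.pi / 2) > max_curl_rad:
            return False
    return True

# Hypothetical frame: palm faces up, all five fingers point forward.
palm_normal = (0.0, 1.0, 0.0)
fingers = [(0.0, 0.0, -1.0)] * 5
print(is_open_palm(fingers, palm_normal))  # True
```

In practice the per-frame vectors would come from the Leap Motion SDK's frame data rather than hard-coded tuples.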

Challenges we ran into

Creating accurate, consistent, and distinct hand gestures was much more challenging than we anticipated. However, by working directly with the measured data, we eventually created reliable gestures.
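One common way to make per-frame detections consistent, sketched below, is to confirm a gesture only after it persists for several consecutive frames, filtering out single-frame sensor noise. The class name, frame count, and gesture labels here are illustrative assumptions, not SynHaptic's actual implementation.

```python
class GestureDebouncer:
    """Fire a gesture only after it is detected in N consecutive frames
    (a hypothetical smoothing step; N is an assumed tuning parameter)."""

    def __init__(self, required_frames=8):
        self.required = required_frames
        self.current = None   # gesture seen in the most recent frame
        self.count = 0        # how many consecutive frames it has been seen

    def update(self, detected):
        # `detected` is the gesture name for this frame, or None.
        if detected == self.current:
            self.count += 1
        else:
            self.current = detected
            self.count = 1
        if detected is not None and self.count == self.required:
            return detected   # gesture confirmed, reported exactly once
        return None

# Example: a noisy stream where "swipe" drops out for one frame.
deb = GestureDebouncer(required_frames=3)
stream = ["swipe", None, "swipe", "swipe", "swipe", "swipe"]
events = [deb.update(f) for f in stream]
print(events)  # [None, None, None, None, 'swipe', None]
```

The dropout at frame 2 resets the counter, so the gesture is only reported once three uninterrupted detections accumulate.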

Accomplishments that we're proud of

We are proud of implementing a fully operational, unique, and nontraditional approach to technology education.

What's next for SynHaptic

We would like to improve the human-computer interaction with additional gestures and more accurate sensing. We also envision applying SynHaptic's gesture-based learning to the classroom, supporting multiple users (students and teachers) as well as curriculum- or concept-based features.
