
Brgraul/START-Hack

START-Hack 2019

The problem 💔

To create a better world and increase mutual understanding, everyone should have similar cognitive opportunities, or at least a common ground in this regard. 80% of human communication is non-verbal, and blind people are deprived of that valuable part. This is a barrier to emotional connection, and we are here to bring it down.

Our solution 📈

Our Pheel Smartglasses provide audio feedback on the expressions and reactions of the people in front of you. They feature an integrated camera and continuously send images to the Azure Cloud for sentiment analysis. The result is communicated back to the user as speech, delivered through a discreet bone conduction interface.
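The core of that feedback loop is reducing the per-face emotion scores returned by the cloud to a single spoken cue. Below is a minimal sketch of that step: the score keys follow the emotion attributes documented for the Azure Face API, while the adjective mapping and sentence phrasing are our own illustration, not the exact wording used in the project.

```javascript
// Adjective forms for the Face API emotion keys (illustrative mapping).
const ADJECTIVE = {
  anger: 'angry', contempt: 'contemptuous', disgust: 'disgusted',
  fear: 'afraid', happiness: 'happy', neutral: 'neutral',
  sadness: 'sad', surprise: 'surprised',
};

// Pick the emotion with the highest confidence score.
function dominantEmotion(scores) {
  return Object.keys(scores).reduce((best, key) =>
    scores[key] > scores[best] ? key : best
  );
}

// Build the sentence that is later handed to the text-to-speech service.
function feedbackSentence(scores) {
  const emotion = dominantEmotion(scores);
  return `The person in front of you looks ${ADJECTIVE[emotion] || emotion}.`;
}

// Example with scores shaped like the Face API's emotion attribute:
// feedbackSentence({ anger: 0.01, happiness: 0.92, neutral: 0.05, sadness: 0.02 })
// → "The person in front of you looks happy."
```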

The Implementation 👾

Over the course of these 31 hours, we built a working web application that performs sentiment analysis on webcam input and turns the result into actionable audio feedback for the user.

  • Backend -> Node.js
  • Frontend -> React.js
  • Sentiment Analysis -> Azure Face API
  • Text to Speech -> Azure TTS API

Our Vision 🌎

Our bigger picture is to implement this framework in a pair of glasses that discreetly scans the environment and relays feedback to the user in need. Using cameras for face detection and bone conduction for communicating the feedback - enabling emotions for everyone!

#ThinkBigStartSmall
