Inspiration
As a team, we were interested in creating something that could make social interaction more intimate and meaningful. We began building VibeWatch as an application that gives entertainers and presenters real-time information about the emotional state of their audience. As we got further into development, we realized how wholesome VibeWatch could be: we used our computer vision application to incentivize random acts of kindness and foster a kinder, healthier social environment.
What it does
VibeWatch uses computer vision to monitor key facial markers and label a face with one of eight emotions (happiness, sadness, anger, surprise, disgust, contempt, fear, confusion). VibeWatch tracks these emotions in real time through a convenient UI that resizes emojis to emphasize the prevailing emotion of the audience. We also provide analytics on emotional state over time, broken down by audience demographics (e.g., a woman in her 30s or a man in his 20s). VibeWatch also rewards its users for carrying out Random Acts of Kindness: making someone smile on camera results in a small charitable payment to the user, rewarding them for putting a smile on others' faces.
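As a minimal sketch, the prevailing audience emotion could be computed like this, assuming each detected face carries the per-emotion confidence scores (values in [0, 1]) that the Face API returns; the function name and sample data are illustrative, not our actual code:

```javascript
// Sum each emotion's score across every face in view and take the highest
// total as the audience's prevailing emotion.
function prevailingEmotion(faces) {
  const totals = {};
  for (const face of faces) {
    for (const [emotion, score] of Object.entries(face.emotion)) {
      totals[emotion] = (totals[emotion] || 0) + score;
    }
  }
  // Highest summed score wins; null when no faces are in view.
  let best = null;
  for (const [emotion, total] of Object.entries(totals)) {
    if (best === null || total > best[1]) best = [emotion, total];
  }
  return best && best[0];
}

// Example: two mostly-happy faces outweigh one surprised face.
const sampleFaces = [
  { emotion: { happiness: 0.9, surprise: 0.1 } },
  { emotion: { happiness: 0.8, surprise: 0.2 } },
  { emotion: { happiness: 0.1, surprise: 0.9 } },
];
console.log(prevailingEmotion(sampleFaces)); // happiness
```

The winning emotion then drives the UI, with its emoji resized relative to the others.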
How we built it
VibeWatch uses Microsoft Azure's Face API and parses the JSON responses for any number of faces in the field of view, each with attributes like gender, age, and emotion. We wrote our server script and backend in Node.js, and built our front end with ES6 JavaScript and React. We also used Ethereum to create our own cryptocurrency called "GoodVibes". Our smart contract rewards users for making others smile by transferring a small amount of the cryptocurrency to them.
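The Face API round trip looks roughly like the sketch below, assuming a Node 18+ runtime with a global `fetch`; the endpoint region, environment variable, and helper names are placeholders rather than our actual configuration:

```javascript
// Placeholders: substitute your own Cognitive Services region and key.
const FACE_ENDPOINT = 'https://YOUR_REGION.api.cognitive.microsoft.com';
const FACE_KEY = process.env.AZURE_FACE_KEY;

// POST a raw image buffer to the detect endpoint; the response is a JSON
// array with one object per face, each carrying age, gender, and emotion.
async function detectFaces(imageBuffer) {
  const url = `${FACE_ENDPOINT}/face/v1.0/detect` +
              '?returnFaceAttributes=age,gender,emotion';
  const res = await fetch(url, {
    method: 'POST',
    headers: {
      'Ocp-Apim-Subscription-Key': FACE_KEY,
      'Content-Type': 'application/octet-stream',
    },
    body: imageBuffer,
  });
  return res.json();
}

// Turn one face's attributes into the demographic label shown in the UI,
// e.g. "female in her 30s".
function demographicLabel({ age, gender }) {
  const decade = Math.floor(age / 10) * 10;
  const pronoun = gender === 'female' ? 'her' : 'his';
  return `${gender} in ${pronoun} ${decade}s`;
}
```

Keeping the parsing helpers pure, as with `demographicLabel` here, makes the per-face attributes easy to reuse for both the live emoji view and the demographic analytics.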
Challenges we ran into
Getting the Face API to display live, real-time results was somewhat challenging. Setting up our own cryptocurrency was also new territory for us, though a fun experience.
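The transfer logic behind a "GoodVibes"-style token can be sketched in plain JavaScript mirroring ERC-20 transfer semantics; the real version lives in an Ethereum smart contract, and the class, account names, and amounts here are purely illustrative:

```javascript
// In-memory model of the token ledger: a balance per account, with mint
// and transfer operations matching what the on-chain contract enforces.
class GoodVibes {
  constructor() {
    this.balances = new Map();
  }

  mint(account, amount) {
    this.balances.set(account, this.balanceOf(account) + amount);
  }

  balanceOf(account) {
    return this.balances.get(account) || 0;
  }

  // Move `amount` tokens from `from` to `to`, as the contract does when a
  // user is rewarded for making someone smile on camera.
  transfer(from, to, amount) {
    if (this.balanceOf(from) < amount) throw new Error('insufficient balance');
    this.balances.set(from, this.balanceOf(from) - amount);
    this.balances.set(to, this.balanceOf(to) + amount);
  }
}

// Example: a reward pool pays a user for a detected smile.
const token = new GoodVibes();
token.mint('rewardPool', 100);
token.transfer('rewardPool', 'user1', 5);
console.log(token.balanceOf('user1')); // 5
```

The balance check before the transfer is the key invariant: the on-chain contract rejects a reward payout the pool cannot cover, and this sketch does the same.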
Accomplishments that we're proud of
We are proud of how our app came together with such a wholesome and kind spirit. It is fun for people of all ages, has legitimate uses in adapting content (video/audio) to emotional body language, and has the potential to make society a friendlier, more helpful place.
What we learned
We learned how to use the Azure Face API and how to set up our own cryptocurrency.
What's next for VibeWatch
We would like to make VibeWatch's ability to change media content based on emotion more robust and wide-ranging. For example, VibeWatch could lower the music volume or change the station when a driver seems tired, distracted, or angry; it could also gauge a theater audience's reactions over time, monitor customer service quality, and more.