Inspiration

Research has shown that mental disorders can be screened for by analyzing facial expressions and the emotions they convey when a person is exposed to triggers (see, for example, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3743655/). So the idea was to build an app that not only entertains you or tries to brighten your mood based on your emotions, but also lets you know when to seek help.

What it does

The app takes a picture and uses the Microsoft Face and Emotion APIs to analyze the user's emotions; the YouTube API then combines the detected emotion with the user's preferences to play music or videos. The data from the Microsoft APIs is also checked to see whether negative emotions are intense enough to warrant help; if so, the app uses New York State's clinics data to populate a list view of health centers the user can visit. The app also lets the user talk to someone for emotional support.
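A minimal sketch of the picture-to-playback pipeline, assuming the Microsoft Emotion API's response shape at the time (a JSON list of faces, each with a `scores` dict of emotion name to confidence). The emotion-to-query mapping and the `analyze_image` helper are illustrative assumptions, not the app's exact code:

```python
import json
import urllib.request

# Endpoint and header names as documented for the Microsoft Emotion API
# (Project Oxford era); the API has since been retired.
EMOTION_API_URL = "https://api.projectoxford.ai/emotion/v1.0/recognize"

# Hypothetical mapping from the dominant emotion to a YouTube search query.
QUERY_FOR_EMOTION = {
    "happiness": "upbeat music",
    "sadness": "soothing music",
    "anger": "calming music",
    "fear": "relaxing music",
    "neutral": "popular music",
}


def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)


def youtube_query(scores):
    """Map the dominant emotion to a search query, defaulting to 'popular music'."""
    return QUERY_FOR_EMOTION.get(dominant_emotion(scores), "popular music")


def analyze_image(image_bytes, api_key):
    """POST the photo to the Emotion API and return the first face's scores."""
    request = urllib.request.Request(
        EMOTION_API_URL,
        data=image_bytes,
        headers={
            "Ocp-Apim-Subscription-Key": api_key,
            "Content-Type": "application/octet-stream",
        },
    )
    with urllib.request.urlopen(request) as response:
        faces = json.load(response)
    return faces[0]["scores"] if faces else {}
```

The resulting query string would then be passed to the YouTube Data API's search endpoint to pick a video to play.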

Doctors can also use the app during ERP (Exposure and Response Prevention) therapy to analyze a patient's emotions and see whether they are affected by exposure to triggers.

How I built it

I built the app using the Microsoft Face and Emotion APIs, the YouTube API, and the New York State Health Centers data.
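The "does the user need help?" check described above can be sketched as a simple threshold over the Emotion API's scores. The set of negative emotions matches the API's emotion names, but the 0.7 threshold is an illustrative assumption, not the app's actual value:

```python
# Emotions from the Microsoft Emotion API's scores that we treat as negative.
NEGATIVE_EMOTIONS = {"sadness", "anger", "fear", "disgust", "contempt"}


def needs_support(scores, threshold=0.7):
    """Flag the user for the health-center list when any negative emotion is intense."""
    return any(scores.get(emotion, 0.0) >= threshold for emotion in NEGATIVE_EMOTIONS)
```

When this returns `True`, the app populates the list view with nearby clinics from the New York State data.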

Challenges I ran into

Getting the Microsoft API to work in conjunction with the YouTube API

Accomplishments that I'm proud of

Finishing the application

What I learned

It is important to check in on our mental state from time to time, because many people with mental disorders never get help.

What's next for EmoSnap

Creating a different UI for doctors.
