Inspiration
As a team, we noticed a shared desire for a playlist that truly reflected our emotions, without the need for time-consuming curation. We wanted a way to instantly connect with music that felt personal and understood our moods. This urge to experience music that resonates with us, effortlessly and authentically, inspired us to create this platform.
What it does
Our platform uses AI to generate personalized playlists based on images users upload. By analyzing facial expressions and emotional cues from these images, it curates music that aligns with the user's current mood, creating an intuitive and engaging way to discover new songs.
How we built it
We used a combination of image recognition technology and AI-driven algorithms to interpret emotions from photos. Specifically, we used Python 3.6+, the Spotify API, the Spotify Million database, and the DeepFace library.
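The pipeline described above can be sketched roughly as follows. This is an illustrative assumption, not our exact implementation: the `MOOD_TARGETS` values and the `recommendation_params` helper are hypothetical, and the DeepFace call shown in the comment follows that library's documented `analyze` API.

```python
# Sketch of an emotion-to-playlist pipeline (illustrative assumptions,
# not our exact tuning).
#
# DeepFace detects the dominant emotion in an uploaded photo, e.g.:
#   from deepface import DeepFace
#   result = DeepFace.analyze(img_path="selfie.jpg", actions=["emotion"])
#   emotion = result[0]["dominant_emotion"]

# Map each DeepFace emotion label to Spotify audio-feature targets.
# (valence = musical positiveness, energy = intensity; both 0.0-1.0;
# the numbers below are hypothetical placeholders.)
MOOD_TARGETS = {
    "happy":    {"target_valence": 0.9, "target_energy": 0.8},
    "sad":      {"target_valence": 0.2, "target_energy": 0.3},
    "angry":    {"target_valence": 0.3, "target_energy": 0.9},
    "fear":     {"target_valence": 0.2, "target_energy": 0.6},
    "surprise": {"target_valence": 0.7, "target_energy": 0.7},
    "disgust":  {"target_valence": 0.3, "target_energy": 0.5},
    "neutral":  {"target_valence": 0.5, "target_energy": 0.5},
}

def recommendation_params(emotion: str, limit: int = 20) -> dict:
    """Build query parameters for a Spotify track-recommendation request,
    falling back to neutral targets for unrecognized labels."""
    targets = MOOD_TARGETS.get(emotion, MOOD_TARGETS["neutral"])
    return {"limit": limit, **targets}
```

Keeping the emotion-to-feature mapping as plain data makes it easy to iterate on the curation logic without touching the image-analysis or API code.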
Challenges we ran into
Our first hackathon has been a challenging yet invaluable learning experience. One of our biggest hurdles was integrating various tools effectively, which required extensive trial and error to develop an efficient algorithm. As we tested our code, we encountered several bugs: the emotion reader needed fine-tuning for accuracy, the playlist cover appeared blue-scaled, and song selection had to be refined so playlists offered diversity beyond titles that merely matched the detected emotion. Once those issues were addressed, we faced another major challenge: seamlessly connecting the backend and frontend. Fully connecting the two is still a work in progress, but we did our best on both sides!
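The blue-scaled cover bug is a classic symptom of mixing OpenCV's BGR channel order with libraries that expect RGB. A minimal sketch of the kind of fix involved (assuming the image is held as a NumPy array, as `cv2.imread` returns):

```python
import numpy as np

def bgr_to_rgb(image: np.ndarray) -> np.ndarray:
    """Reverse the channel axis. OpenCV's cv2.imread returns BGR,
    while most other tools (PIL, matplotlib, browsers) expect RGB;
    skipping this swap makes warm images look blue-tinted."""
    return image[..., ::-1]

# Tiny 1x1 "red" pixel in BGR order (B=0, G=0, R=255):
bgr_pixel = np.array([[[0, 0, 255]]], dtype=np.uint8)
rgb_pixel = bgr_to_rgb(bgr_pixel)
# rgb_pixel is now [[[255, 0, 0]]] -- red in RGB order.
```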
Accomplishments that we're proud of
We successfully integrated AI into our project for the first time, overcoming numerous challenges along the way. Our team learned how to apply advanced technologies to create something meaningful and personalized, and we’re proud of how we navigated technical hurdles to bring our vision to life.
What we learned
We learned a lot about full-stack development and all the intricacies that are involved, and how to iterate on an algorithm to improve its accuracy. We also gained a deeper understanding of how to work with AI and image recognition technologies.
What's next for Facify
We plan to build a more creative and functional UI, generate more fun titles for playlists, and support more complex emotion recognition.