About the Project

What Inspired Us

The idea for DriveTunes came from a simple thought: music has the power to elevate moods, and our emotions often dictate the kind of music we want to listen to. As avid music lovers and tech enthusiasts, we wondered—what if your playlist could adapt to your mood without you having to touch your phone? Whether you're cruising down the highway or stuck in traffic, the right song can transform the experience. This inspiration fueled our vision to create a seamless, hands-free music app that adjusts to your emotions in real time.

How We Built It

Building DriveTunes was an exciting journey that merged multiple technologies to create a cohesive system:

  • Spotify Integration: Using the Spotipy library, we connected the app to users' Spotify accounts to fetch playlists and play songs.
  • Emotion Detection: We utilized EmoNet, a powerful emotion recognition model, and combined it with OpenCV to process camera inputs and detect user moods based on facial expressions.
  • App Development: We built the app's interface using Flutter and Dart, ensuring a clean, user-friendly experience that seamlessly works across platforms.
  • Gesture and Voice Recognition: To add even more interactivity, we integrated hand gesture detection with the Gemini library and voice commands for intuitive control.
  • Challenges of APIs: Since APIs like Spotify's often change or get deprecated, we made sure to adapt and future-proof our implementation.

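The Spotify side of the flow above can be sketched with Spotipy. Note that the mood-to-playlist mapping and the playlist names below are illustrative assumptions, not the app's actual configuration:

```python
# Hypothetical mapping from a detected mood to one of the user's playlist
# names; the real app's mapping may differ.
MOOD_PLAYLISTS = {
    "happy": "Feel-Good Drive",
    "sad": "Mellow Miles",
    "angry": "Calm Commute",
    "neutral": "Daily Mix",
}

def playlist_for_mood(mood: str) -> str:
    """Pick a playlist name for a detected mood, falling back to neutral."""
    return MOOD_PLAYLISTS.get(mood.lower(), MOOD_PLAYLISTS["neutral"])

if __name__ == "__main__":
    # Spotipy is imported here so the pure mapping helper above is usable
    # without it installed. The OAuth flow reads SPOTIPY_CLIENT_ID,
    # SPOTIPY_CLIENT_SECRET, and SPOTIPY_REDIRECT_URI from the environment.
    import spotipy
    from spotipy.oauth2 import SpotifyOAuth

    sp = spotipy.Spotify(auth_manager=SpotifyOAuth(
        scope="playlist-read-private user-modify-playback-state"))
    target = playlist_for_mood("happy")
    for pl in sp.current_user_playlists()["items"]:
        if pl["name"] == target:
            sp.start_playback(context_uri=pl["uri"])  # plays on the active device
            break
```

Keeping the mood-to-playlist decision separate from the Spotipy calls also makes it easy to swap in a different backend when an API changes.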
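On the vision side, raw per-frame predictions are noisy, so it helps to smooth them before switching playlists. The sketch below pairs a standard OpenCV capture loop with a simple majority-vote smoother; `predict_emotion` is a placeholder for the EmoNet forward pass, whose exact API we don't reproduce here:

```python
from collections import Counter, deque

class MoodSmoother:
    """Majority vote over the last `window` per-frame emotion labels,
    so a single misclassified frame doesn't change the music."""
    def __init__(self, window: int = 30):
        self.history = deque(maxlen=window)

    def update(self, label: str) -> str:
        self.history.append(label)
        return Counter(self.history).most_common(1)[0][0]

def run_camera_loop(predict_emotion) -> None:
    """Feed webcam frames through a face detector and an emotion classifier.
    `predict_emotion(face_img) -> str` stands in for the EmoNet call."""
    import cv2  # imported lazily so MoodSmoother is usable without OpenCV

    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    smoother = MoodSmoother(window=30)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
            mood = smoother.update(predict_emotion(frame[y:y + h, x:x + w]))
            print("stable mood:", mood)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
```

Smoothing over roughly a second of frames keeps playlist switches deliberate rather than jittery.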
What We Learned

Throughout this project, we gained valuable experience and deepened our knowledge in several areas:

  • Spotipy: Managing Spotify APIs to fetch playlists, handle playback, and customize user experiences.
  • EmoNet: Understanding emotion recognition and integrating it effectively with a real-time input system.
  • Flutter and Dart: Building cross-platform apps with smooth interfaces and responsive designs.
  • OpenCV: Processing video streams for facial recognition and enhancing accuracy.
  • Gesture and Voice Recognition: Experimenting with libraries like Gemini and exploring how voice commands enhance hands-free operation.
  • Never Trust APIs: We learned the hard way that APIs can be deprecated or change without notice, requiring adaptability and fallback plans.

Challenges We Faced

Like any ambitious project, DriveTunes came with its share of challenges:

  • Deprecated APIs: We encountered outdated APIs, especially with Spotify and some third-party emotion detection tools, which forced us to refactor code multiple times.
  • Android Studio and Package Management: Managing dependencies in Flutter and configuring Android Studio for smooth builds was more tedious than we anticipated.
  • Error Handling: From edge cases in emotion detection to Spotify API rate limits, we faced numerous bugs that required creative debugging and robust error handling.
  • Integration Overload: Combining emotion detection, voice recognition, and gesture control in a single app was complex and required careful resource management.

The Takeaway

DriveTunes was not just a project but a journey of innovation and perseverance. We learned to navigate technical challenges, adapt to unforeseen changes, and leverage the power of diverse technologies to bring our idea to life. In the end, it’s incredibly rewarding to see an idea turn into an app that can make car rides more enjoyable and personal.

We hope DriveTunes inspires others to explore how technology can make everyday experiences smarter and more delightful!

Built With

  • Flutter & Dart
  • Python (Spotipy, OpenCV)
  • EmoNet
  • Spotify Web API
