Inspiration

We kept asking ourselves: what advice would you give your younger self? That question stuck with us because everyone has an answer, but no one has a way to actually capture that wisdom as they live it. We wanted to build something that preserves your struggles and growth in real time, so when you finally figure something out, that knowledge doesn't just disappear. It becomes something you can pass on.

What it does

Echo is a voice journal app that turns your daily experiences into an archive that matters. You speak your thoughts into your phone, and Echo transcribes them, lets you add photos, and shows you patterns you might be missing: what's weighing on you, whether it's recurring, and the emotional themes running through your life. The app connects you with others navigating similar struggles, and if you're in distress, it surfaces real resources. After a year, you get a complete picture of your journey and the chance to mentor someone facing what you've already overcome.

How we built it

We focused on making the experience feel natural, not clinical. The voice-to-text system needed to be fast and accurate so users could just talk without friction. We built pattern recognition that analyzes journal entries for recurring themes and emotional patterns. The matching algorithm compares themes across users to suggest meaningful connections. We integrated crisis resources that trigger based on the frequency and intensity of entries. The year-end wrapped feature required building a data visualization system that could turn 365 entries into a coherent narrative.
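The matching and crisis-trigger logic above can be sketched roughly like this. This is a simplified illustration, not our production code: the entry shape, the per-entry `themes` and `intensity` fields, and all thresholds here are assumptions for the sake of the example.

```typescript
// Hypothetical entry shape; real entries live in Cloud Firestore.
interface JournalEntry {
  userId: string;
  themes: string[]; // e.g. ["burnout", "family"], extracted per entry
  intensity: number; // 0-1 emotional-intensity score from analysis
  createdAt: Date;
}

// Jaccard similarity between two users' recent theme sets.
function themeOverlap(a: Set<string>, b: Set<string>): number {
  const shared = [...a].filter((t) => b.has(t)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : shared / union;
}

// Suggest users whose recent themes overlap past an (assumed) cutoff,
// strongest matches first.
function suggestMatches(
  me: { userId: string; themes: Set<string> },
  others: { userId: string; themes: Set<string> }[],
  threshold = 0.3
): string[] {
  return others
    .filter((o) => o.userId !== me.userId)
    .map((o) => ({ id: o.userId, score: themeOverlap(me.themes, o.themes) }))
    .filter((m) => m.score >= threshold)
    .sort((a, b) => b.score - a.score)
    .map((m) => m.id);
}

// Crisis resources trigger on both frequency and intensity: here,
// three or more high-intensity entries within a rolling week (assumed numbers).
function shouldSurfaceResources(
  entries: JournalEntry[],
  windowDays = 7
): boolean {
  const cutoff = Date.now() - windowDays * 24 * 60 * 60 * 1000;
  const recent = entries.filter((e) => e.createdAt.getTime() >= cutoff);
  return recent.filter((e) => e.intensity >= 0.8).length >= 3;
}
```

Keeping both signals separate is the point: a single intense entry or a long run of mild ones should not trip the crisis path, which is how we avoided being alarmist.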

Challenges we ran into

The biggest challenge was balancing AI insights with genuine human connection. We didn't want Echo to feel like it was diagnosing people or replacing therapy. We also struggled with the matching algorithm, making sure it connected people meaningfully without being intrusive or overwhelming. Privacy was critical since we're handling sensitive mental health data. And figuring out when to surface crisis resources without being alarmist took a lot of iteration.

Accomplishments that we're proud of

We built a system that actually listens. The pattern recognition works well enough that users genuinely see things about themselves they hadn't noticed. The year-end wrapped feature turned out more powerful than we expected. People saw their growth in a way that felt validating and real. And we're proud that Echo doesn't try to be a therapist. It's a tool that helps you understand yourself and connect with others who get it.

What we learned

Mental health technology has to earn trust every single day. People will only be vulnerable if the experience feels safe and human. We learned that AI works best when it amplifies human insight rather than replacing it. We also learned that people crave connection around their struggles more than we expected. The matching feature became one of the most requested aspects during testing.

What's next for Echo

We want to expand the matching algorithm to allow for group connections, not just one-on-one. We're exploring partnerships with mental health organizations to make sure our crisis resources are comprehensive. Long term, we want Echo to help people see not just their own patterns, but how their experiences fit into larger community narratives. And we're thinking about ways for mentors to stay connected with the people they help, building a real support network that grows over time.

Built With

  • modular-firebase-sdk
  • real-time-chat
  • audio-recording
  • cloud-firestore
  • expo-router
  • expo.io
  • firebase
  • firebase-authentication
  • firebase-security-rules
  • firestore-security-rules
  • git
  • react-native
  • typescript