Inspiration
While scrolling YouTube for ideas, I came across an animated video from the Veterans Health Administration. It tells the story of a man named Frank. On the outside, he looks strong and functional, but inside, he's struggling. To cope, he starts drinking more often and smoking more weed, too. Long story short, he meets a doctor who suggests a treatment called Cognitive Behavioral Therapy (CBT). Watching that, I knew I wanted to build something that helps therapists and doctors extend care beyond the session and supports people during the exact moments when relapse usually happens. And that’s where LAST.CALL began.
What it does
LAST.CALL is designed for urgent moments. It helps people who are dealing with addiction-related urges when: no one is monitoring them, they are outside therapy sessions, the urge hits late at night, or they are overwhelmed by stress and cravings.
Instead of a static habit tracker, LAST.CALL acts as a real-time distraction and grounding tool, guiding users to pause, breathe, and regain control before an urge turns into action. The app walks users through personalized structured steps inspired by CBT principles, delivered through calm, guided audio, helping them survive the moment, not solve their entire life.
How we built it
- The Senses: Using Gemini 2.5 Flash, the app analyzes 15 seconds of the user’s facial expressions and voice. It doesn't just "hear" words; it detects the micro-expressions of anxiety and the tremors of a craving.
- The Brain: A Python (FastAPI) backend processes this emotional data. If the system detects high distress, it pivots the recovery roadmap instantly, selecting the most effective CBT grounding technique for that specific state.
- The Voice: To maintain a supportive atmosphere, I integrated ElevenLabs voice agents to deliver these intervention steps, ensuring the user feels heard and guided by a "human" presence.
- The Interface: The frontend is a fast, responsive React + Vite app, with MongoDB securely storing emotional summaries so users can see their growth over time.
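As a rough sketch of the "pivot" step, the backend logic might look like the function below. The emotion labels, distress threshold, and technique names here are illustrative assumptions, not the exact mapping LAST.CALL ships with:

```python
# Hypothetical sketch of the backend's technique-selection step.
# Labels, thresholds, and technique names are illustrative only.

CBT_TECHNIQUES = {
    "panic": "box breathing (4-4-4-4)",
    "craving": "urge surfing: observe the urge without acting on it",
    "rumination": "5-4-3-2-1 sensory grounding",
    "default": "paced breathing with a guided body scan",
}

def select_grounding_technique(distress: float, dominant_emotion: str) -> dict:
    """Pick a CBT grounding step from the analyzed 15-second clip.

    distress: 0.0-1.0 score derived from face + voice analysis.
    dominant_emotion: label such as "panic", "craving", "rumination".
    """
    if distress < 0.4:
        # Low distress: stay on the user's normal recovery roadmap.
        return {"pivot": False, "technique": None}
    # High distress: swap in the grounding exercise that fits this state.
    technique = CBT_TECHNIQUES.get(dominant_emotion, CBT_TECHNIQUES["default"])
    return {"pivot": True, "technique": technique}
```

In the app, a function like this would sit behind a FastAPI route, with its output handed to the ElevenLabs voice agent to narrate.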
Challenges we ran into
- I initially tried building a fully autonomous, live ElevenLabs generation pipeline after Gemini produced the steps. However, this caused long loading times and instability. I had to rethink the architecture and switch to a more reliable approach.
- I also attempted to implement a RAG (Retrieval-Augmented Generation) pipeline using MongoDB vector search to pull clinical data. However, the extra overhead of embedding and vector search made the app significantly slower.
- Balancing emotional sensitivity with technical automation was difficult, especially making sure the system felt supportive, not robotic.
Accomplishments that we're proud of
- Turning a deeply human and emotional problem into a functional, real-time system.
- Successfully integrating multimodal AI (vision + voice) into a meaningful wellness use case.
- Building a complete end-to-end product, from the first camera frame to the final guided breath and intervention.
What we learned
Almost everything in this project was new to me. Beyond the technical side, I learned that designing for mental health requires restraint. Not everything needs to be optimized or automated. Sometimes, helping someone pause for a few seconds is already a win, especially in a CBT context.
What's next for LAST.CALL
- Vector search (RAG): I want to implement a RAG pipeline so the AI can pull from a large library of clinical CBT protocols for even more precise guidance.
- Therapist Integration: Building a dashboard where (with consent) doctors can see their patients' "High-Risk" patterns to provide better care during sessions.
- Wearable Support: Detecting cravings through heart rate spikes before the user even picks up their phone.
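For the planned RAG step, a minimal retrieval sketch using MongoDB Atlas Vector Search could look like this. The index name, field names, and vector size are assumptions, and the query embedding itself would come from a real embedding model:

```python
# Sketch of the planned RAG retrieval step over a library of CBT protocols.
# Index name, field names, and vector size are hypothetical.

def build_vector_search_pipeline(query_vector: list[float], limit: int = 3) -> list[dict]:
    """Build a MongoDB Atlas $vectorSearch aggregation pipeline that
    retrieves the CBT protocol snippets closest to the user's state."""
    return [
        {
            "$vectorSearch": {
                "index": "cbt_protocols_index",   # assumed Atlas index name
                "path": "embedding",              # assumed vector field
                "queryVector": query_vector,
                "numCandidates": 100,
                "limit": limit,
            }
        },
        # Keep only the text the LLM needs as retrieved context.
        {"$project": {"_id": 0, "protocol_text": 1}},
    ]

# Server-side usage: collection.aggregate(build_vector_search_pipeline(vec))
```

The retrieved snippets would then be appended to the Gemini prompt so the generated intervention steps stay anchored to clinical CBT material.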
Built With
- elevenlabs
- fast-api
- gemini-api
- javascript
- mongodb
- ngrok
- python
- react
- vite