Inspiration

JustRun was inspired by the desire to create a seamless experience for runners. It aims to remove the awkward fumbling of using your phone mid-run and to improve the real-time feedback runners get.

What it does

JustRun is a bidirectional voice-first coaching app that integrates with your digital fitness ecosystem. Simply begin with "Hey Coach!" followed by a question to get immediate, relevant feedback. It connects with Strava to import your routes/history and Whoop (via API) to monitor your heart rate and recovery state. It knows your route. If you ask "Where do I turn?", it checks your GPS coordinates against the GPX track and gives you specific directions.

The entire experience is designed to be hands-free. No tapping buttons; just run and talk.

Technology Used

**Backend (The Brain)**

  1. Whisper (Transcription): We use OpenAI's Whisper model for accurate speech-to-text transcription even in windy outdoor conditions, so the user can ask questions live during the run.
  2. Whoop API: We integrated the WHOOP API (a popular fitness wearable) to pull sleep scores, recovery scores, heart rate zones, and more.
  3. FastAPI (Python): We used FastAPI for a high-performance asynchronous backend. We built a custom WebSocket architecture that relays audio and text between the user and our AI agents.
  4. OpenAI (Intelligence): We use OpenAI's models with Function Calling (Tool Use). The LLM has access to real-time tools that fetch the user's past runs, current location, route guidance, and live heart rate data before answering.
  5. ElevenLabs (The Voice): After OpenAI produces a response, we stream it back to the user in a human-like voice using ElevenLabs' ultra-low latency text-to-speech API.
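To make the tool-use step concrete, here is a minimal sketch of what the function-calling tool definitions might look like. The tool names, descriptions, and parameters below are illustrative assumptions, not JustRun's actual schema:

```python
# Illustrative OpenAI function-calling tool definitions.
# Tool names and parameter fields are assumptions, not the app's real schema.
get_heart_rate_tool = {
    "type": "function",
    "function": {
        "name": "get_live_heart_rate",
        "description": "Return the runner's current heart rate and zone from Whoop.",
        "parameters": {"type": "object", "properties": {}, "required": []},
    },
}

get_next_turn_tool = {
    "type": "function",
    "function": {
        "name": "get_next_turn",
        "description": "Return the next turn instruction on the loaded GPX route.",
        "parameters": {
            "type": "object",
            "properties": {
                "lat": {"type": "number", "description": "Current latitude"},
                "lon": {"type": "number", "description": "Current longitude"},
            },
            "required": ["lat", "lon"],
        },
    },
}

TOOLS = [get_heart_rate_tool, get_next_turn_tool]
```

These definitions would be passed as `tools=TOOLS` to the chat completions call; when the model emits a tool call, the backend runs the matching function, appends the result to the conversation, and only the final text answer is streamed on to ElevenLabs.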

**Frontend (The Body)**

  1. React Native & Expo: Built for iOS and Android, focusing on a minimal, high-contrast UI that is visible in bright sunlight.
  2. Real-Time Telemetry: The app aggregates GPS location and accelerometer data, streaming it to the backend 10x per second.
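A telemetry frame streamed at that rate might look like the following sketch (written in Python for consistency with the backend; the field names and wire format are illustrative, not the app's actual protocol):

```python
import json
import time
from dataclasses import dataclass, asdict

SEND_INTERVAL_S = 0.1  # 10 samples per second, matching the app's stream rate


@dataclass
class TelemetrySample:
    # Field names are illustrative; the real wire format may differ.
    ts: float            # epoch seconds when the sample was taken
    lat: float           # GPS latitude
    lon: float           # GPS longitude
    accel: tuple         # (x, y, z) accelerometer reading in m/s^2


def encode(sample: TelemetrySample) -> str:
    """Serialize one sample into the JSON text frame sent over the WebSocket."""
    return json.dumps(asdict(sample))


sample = TelemetrySample(ts=time.time(), lat=40.7128, lon=-74.0060,
                         accel=(0.1, 9.8, 0.2))
frame = encode(sample)
```

On the server, a FastAPI WebSocket endpoint would decode each frame and update the run state that the LLM's tools read from.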

**Integrations:** Strava API: We implemented the full OAuth2 flow to import GPX route data and analyze past run performance.
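The core of that flow is exchanging the authorization code for tokens at Strava's token endpoint. A minimal stdlib-only sketch (function names are our own; the endpoint and payload fields follow Strava's documented OAuth2 token exchange):

```python
import json
from urllib import parse, request

STRAVA_TOKEN_URL = "https://www.strava.com/oauth/token"


def build_token_payload(client_id: str, client_secret: str, code: str) -> dict:
    """Payload fields per Strava's OAuth2 token-exchange endpoint."""
    return {
        "client_id": client_id,
        "client_secret": client_secret,
        "code": code,
        "grant_type": "authorization_code",
    }


def exchange_code(client_id: str, client_secret: str, code: str) -> dict:
    """Swap the authorization code from the redirect URI for access tokens."""
    data = parse.urlencode(build_token_payload(client_id, client_secret, code)).encode()
    with request.urlopen(request.Request(STRAVA_TOKEN_URL, data=data)) as resp:
        # Response includes access_token, refresh_token, and expires_at.
        return json.load(resp)
```

With the resulting access token, the app can call Strava's activity endpoints to pull route and history data.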

Challenges we ran into

Latency: Seconds matter when you're running past a turn. Optimizing the "Audio -> Whisper -> OpenAI -> ElevenLabs -> Audio" pipeline to be near-instantaneous was a major engineering hurdle. We cut latency by keeping OpenAI's responses short and by switching to lighter models with faster response times.

Context Management: The AI needs to know a lot about the runner: "Where am I?", "What is my average pace?", "How does this compare to last week?". Managing this state in real time and feeding it into the LLM context window without overflowing it required careful design.
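One way to keep the context window from overflowing is to pack run-state facts in priority order against a fixed budget. This is a simplified sketch of the idea; the priority list, field names, and character budget are illustrative choices, not JustRun's actual implementation:

```python
def build_context(facts: dict, budget_chars: int = 800) -> str:
    """Pack run-state facts into a compact prompt snippet, dropping
    lower-priority lines once the character budget is exhausted.
    (Priority order and budget are illustrative, not the app's real values.)
    """
    # Highest-priority facts first: the model should always know pace and position.
    priority = ["current_pace", "distance_km", "heart_rate",
                "next_turn", "last_week_avg_pace"]
    lines, used = [], 0
    for key in priority:
        if key not in facts:
            continue
        line = f"{key}: {facts[key]}"
        if used + len(line) > budget_chars:
            break  # budget exhausted; lower-priority facts are dropped
        lines.append(line)
        used += len(line)
    return "\n".join(lines)
```

The resulting snippet would be injected into the system prompt on each turn, so the LLM always sees fresh state without the conversation history growing unboundedly.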

Map Matching: Determining exactly where a runner is on a route (and when to trigger a turn instruction) using noisy GPS data is surprisingly difficult.
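The simplest baseline for this is snapping each GPS fix to the nearest GPX track point by great-circle distance. The sketch below shows that baseline (function names are our own); a production matcher would also project onto track segments and use heading and the previous match to break ties on routes that cross themselves:

```python
from math import asin, cos, radians, sin, sqrt


def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))


def match_to_track(lat: float, lon: float, track: list) -> tuple:
    """Snap a (possibly noisy) GPS fix to the nearest GPX track vertex.

    track is a list of (lat, lon) pairs; returns (index, distance_m).
    """
    return min(
        ((i, haversine_m(lat, lon, p[0], p[1])) for i, p in enumerate(track)),
        key=lambda t: t[1],
    )
```

Once the runner's index on the track is known, triggering a turn instruction is a matter of looking ahead a few vertices for the next significant bearing change.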

Integration Headaches: With a team working across a complex Python backend and a React Native frontend, we faced significant "integration hell." Reconciling state management logic between the server and the mobile app led to gnarly merge conflicts. It taught us the hard way that defining clear API schemas early is crucial.

What's next for JustRun

This is only the beginning. JustRun can be expanded in various ways:

  1. Integration of fitness devices other than WHOOP (e.g. Apple Watch, Garmin)
  2. Integration of sports other than running (e.g. Biking)
  3. Emergent Behaviours: Let the app learn the behaviour of the user, so that it can personalise the experience for them, suggesting tailored runs based on past patterns
