Swing Sensei

Swing Sensei is a personal AI-powered badminton coach that makes training more accessible, interactive, and actionable. We wanted to build something that could help players improve even when they do not have access to a coach, a training partner, or formal lessons. Instead of generic advice, Swing Sensei gives users live, targeted feedback on their form and then follows up with a detailed AI-generated summary after each session.

Inspiration

Badminton is a fast and technical sport, but quality coaching is not always easy to access, and learning something without guidance can be difficult and frustrating. Newer players often do not know what to correct, and even experienced players can struggle to notice small mistakes in their own form without outside feedback. We were inspired by the idea of making coaching more accessible through computer vision and AI.

We wanted to create a tool that feels like a supportive training partner: one that watches your swing, gives you real-time guidance, and helps you improve over time. The goal was to bridge the gap between solo practice and personalized coaching.

What it does

Swing Sensei offers two main experiences through its interface:

  • A landing page that introduces the project and gives users access to training resources
  • A live training mode where users can log in, practice their swing, and receive AI-powered feedback

Users can create an account or log in with email, with authentication handled through Supabase. Once inside the training experience, they can start a live session and choose to focus feedback on specific form areas such as the arm or elbow.

During a session, the camera tracks the user's motion and analyzes their swing in real time. Swing Sensei then gives immediate voice feedback such as:

  • "Lift arm"
  • "Good swing"

After the session, the recorded movement data is sent to the Gemini API, which generates a more detailed feedback summary. This gives users both instant corrections in the moment and reflective coaching afterward.
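The post-session step can be sketched roughly as follows. The function names, prompt wording, stat fields, and model choice below are illustrative assumptions, not Swing Sensei's actual code; the Gemini call uses the `google-generativeai` Python client.

```python
# Sketch of the post-session summary step, assuming swing measurements
# have already been aggregated into a small stats dict. All names and
# values here are illustrative, not Swing Sensei's actual code.

def build_summary_prompt(stats: dict) -> str:
    """Turn aggregated swing measurements into a coaching prompt."""
    return (
        "You are a badminton coach. Summarize this practice session in "
        "plain, encouraging language and give two concrete corrections.\n"
        f"Swings analyzed: {stats['swings']}\n"
        f"Average elbow angle at contact: {stats['avg_elbow_angle']:.0f} degrees\n"
        f"Average wrist speed: {stats['avg_wrist_speed']:.2f} (normalized units)\n"
        f"Most frequent live cue: {stats['top_cue']}"
    )

def summarize_session(stats: dict, api_key: str) -> str:
    """Send the session prompt to Gemini and return the text summary."""
    import google.generativeai as genai  # pip install google-generativeai
    genai.configure(api_key=api_key)
    model = genai.GenerativeModel("gemini-1.5-flash")  # model name is an assumption
    return model.generate_content(build_summary_prompt(stats)).text
```

Keeping the prompt builder separate from the API call makes the summarization step easy to test without network access.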

How we built it

We built Swing Sensei by combining frontend, backend, machine learning, and AI summarization into one workflow.

Frontend

We used React to create the main user-facing interface, including the landing page, login flow, and navigation between resources and live training. Since our motion-analysis pipeline was already working well in Streamlit, we embedded that component into the React experience using an iframe so we could quickly connect a polished frontend with a functional computer vision backend.

Authentication and backend

We used Supabase for authentication and user account management. Users can sign up and log in with email, and their credentials are stored securely so they can return to the platform for future sessions.

Computer vision and motion tracking

For pose detection, we used MediaPipe to extract body landmarks from the live camera feed. From those landmarks, we computed geometric measurements such as joint angles and wrist motion velocity to evaluate swing quality.

These measurements let us reason about whether a swing is too low, too slow, or mechanically inconsistent.
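The geometry step can be sketched with plain coordinates. MediaPipe returns normalized (x, y) positions per landmark; here simple tuples stand in for those landmarks, and the sample values are made up for illustration.

```python
import math

# Sketch of the measurements computed from pose landmarks: the angle at
# a joint and the wrist's frame-to-frame speed. Plain (x, y) tuples
# stand in for MediaPipe landmarks; sample values are illustrative.

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c, e.g. shoulder-elbow-wrist."""
    ang = math.degrees(
        math.atan2(c[1] - b[1], c[0] - b[0]) - math.atan2(a[1] - b[1], a[0] - b[0])
    )
    ang = abs(ang)
    return 360 - ang if ang > 180 else ang

def wrist_speed(prev, curr, dt):
    """Wrist displacement per second between two frames (normalized units)."""
    return math.hypot(curr[0] - prev[0], curr[1] - prev[1]) / dt

# Example: shoulder, elbow, wrist positions from one frame
shoulder, elbow, wrist = (0.50, 0.40), (0.58, 0.52), (0.70, 0.50)
print(round(joint_angle(shoulder, elbow, wrist), 1))  # prints 114.2
```

A fully extended arm reads as roughly 180 degrees at the elbow, so low angle values are a quick signal that the arm is staying too bent or too low through the swing.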

Voice and AI feedback with Gemini API and ElevenLabs

We used the Gemini API to transform session output into a natural, user-friendly coaching summary, converting the session's recorded movement analysis into feedback that is easier for users to understand and apply in their next practice session.

To make the coaching experience feel more immediate and interactive, we used ElevenLabs AI to deliver spoken feedback during training sessions. Instead of only showing visual prompts on screen, Swing Sensei can respond with audio cues such as "lift arm" or "good swing" in real time. This made the experience feel more like working with an actual coach and helped users stay focused on their movement without constantly looking back at the screen.
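The mapping from per-swing measurements to short spoken cues could look something like the sketch below. The thresholds and cue strings are illustrative assumptions, not Swing Sensei's tuned values; the returned string is what would be handed to the ElevenLabs text-to-speech step.

```python
# Sketch of mapping per-swing measurements to short voice cues.
# Thresholds and cue strings are illustrative assumptions; the chosen
# cue would be passed to ElevenLabs for speech synthesis.

def pick_cue(elbow_angle: float, wrist_speed: float) -> str:
    """Choose one short spoken cue for the swing just completed."""
    if elbow_angle < 100:   # arm staying too bent / too low through contact
        return "lift arm"
    if wrist_speed < 0.8:   # not enough acceleration through the swing
        return "swing faster"
    return "good swing"

print(pick_cue(elbow_angle=150, wrist_speed=1.2))  # prints: good swing
```

Keeping cues to two or three words matters for real-time coaching: by the time a long sentence finishes playing, the player is already mid-way through the next swing.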

Challenges we ran into

One of our biggest challenges was connecting multiple layers of the project into a smooth end-to-end experience. We were working across:

  • a React frontend
  • a Streamlit-based live analysis system
  • authentication with Supabase
  • real-time pose tracking with MediaPipe
  • AI summarization with Gemini

Each part worked differently, so integration was a major challenge. In particular, embedding Streamlit into a React workflow forced us to think carefully about how to make the final product feel cohesive rather than like a set of disconnected tools.

Another challenge was turning raw pose information into meaningful feedback. It is one thing to detect body landmarks, but it is much harder to decide what counts as a "good swing" versus a weak or misaligned one. We had to think about which measurements were most useful, such as elbow positioning, arm angle, and motion speed, and how to map those into feedback that would actually help a player improve.

We also ran into practical issues around authentication, environment variables, and getting our full stack to work consistently across local development and deployment.

Accomplishments that we're proud of

We are proud that Swing Sensei does more than just detect movement. It creates a full coaching loop:

  1. the user signs in,
  2. practices live,
  3. receives real-time voice feedback,
  4. and gets a post-session AI summary with more detailed insights.

We are also proud of how interdisciplinary the project became. It brought together UI design, authentication, security-minded development, computer vision, geometry, and generative AI into a single experience.

Most importantly, we built something that feels practical. Swing Sensei is not just a demo of pose detection but also a step toward making personalized sports feedback more accessible.

What we learned

This project taught us a lot about building across the full stack under hackathon time pressure. We learned how to:

  • connect React and Streamlit into one product flow
  • use Supabase for login and account persistence
  • work with MediaPipe pose landmarks for real-time motion analysis
  • translate technical measurements into user-friendly AI feedback with Gemini

We also learned that building a good AI experience is not just about accuracy. It is about communication. A user does not want a wall of numbers; they want clear, timely advice they can act on immediately.

What's next for Swing Sensei

In the future, we would love to expand Swing Sensei beyond basic swing feedback. Some next steps we are excited about include:

  • supporting more shot types such as clears, drops, and smashes
  • adding progress tracking over time for returning users
  • generating personalized drills based on repeated mistakes
  • improving the accuracy of feedback with more motion features and training data
  • making the platform more robust and polished for real-world use

Our long-term vision is to make Swing Sensei feel like a true AI training companion for badminton players at any level.

Built With

  • React
  • Streamlit
  • Supabase
  • MediaPipe
  • Gemini API
  • TypeScript / JavaScript
  • Computer vision
  • Pose estimation
