Inspiration
There are many barriers to perfecting your jumpshot, including lack of coaching and real-time feedback. We wanted to create a platform for basketball players and fanatics to get fast, actionable feedback on their jumpshot to help improve their shooting mechanics.
What it does
ShotIQ allows users to upload clips of themselves shooting a basketball. Once a clip is uploaded, ShotIQ runs a pose estimation model on the player in frame, calculating the angles of their knees, ankles, and elbows during and after the jumpshot. ShotIQ compares these measurements against well-known, research-backed indicators of form quality, and uses natural language to communicate actionable feedback to the player.
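As a minimal sketch of that comparison step, something like the following could map a measured joint angle to a piece of feedback. The threshold ranges here are illustrative assumptions, not the exact research-backed values ShotIQ uses:

```python
# Hypothetical ideal angle ranges (degrees); the real values come from
# the biomechanics literature and are assumptions here.
IDEAL_RANGES = {
    "knee": (65, 75),     # at the deepest point of the dip
    "elbow": (165, 180),  # at release
}

def assess(joint: str, angle: float) -> str:
    """Compare a measured joint angle to its ideal range and
    return a short human-readable assessment."""
    lo, hi = IDEAL_RANGES[joint]
    if angle < lo:
        return f"{joint} angle {angle:.0f} degrees is below the ideal {lo}-{hi} range"
    if angle > hi:
        return f"{joint} angle {angle:.0f} degrees is above the ideal {lo}-{hi} range"
    return f"{joint} angle {angle:.0f} degrees is within the ideal {lo}-{hi} range"
```

These short assessments can then be handed to a language model to phrase as coaching advice.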
How we built it
On the frontend, we used NextJS as our framework and TailwindCSS for styling, which let us quickly build attractive UI components for uploading videos and viewing results. On the backend, we used FastAPI to build a REST API that receives video uploads and relays feedback to users. To handle pose estimation, we built a Python module using OpenCV and MediaPipe. We used the OpenAI API to express the quantitative feedback in natural language.
Challenges we ran into
Setting up our local development environments took longer than expected due to compatibility issues between varying Python versions we had installed and differing operating systems (macOS and Windows). We also learned a lot of new technologies for this project (NextJS, FastAPI, OpenCV) that most of us hadn't used before.
Accomplishments that we're proud of
We're proud that we were able to deliver a fully working, genuinely usable product in under 24 hours.
The project can take and store several videos from a user and, when prompted, process a video and give the player appropriate feedback on how to improve their shooting form. We're proud of integrating so many moving parts, from the frontend to all the functions involved in the backend, and of learning new technology in the process.
What we learned
That the real hackathon was the friends we made along the way <3
And also that playing with new tech is exciting.
We also learned a lot about the biomechanics of excellent shooting form in basketball, particularly the ideal body angles in the preparatory and release phases of the shot, which occur when the knee is at its minimum angle and the elbow is at its maximum angle, respectively. These baselines are grounded in the scientific literature and served as the basis for identifying strengths and weaknesses in a player's form. A MediaPipe computer vision model identifies the relevant body landmarks in the video; we form vectors between adjacent landmarks and use their dot products to calculate the elbow, knee, and ankle angles in each phase, then compare those angles to the baseline expectations for excellent form. Finally, we learned how to give ChatGPT the right context and interpretation of the player's movements so that it outputs encouraging but appropriately concise feedback.
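The dot-product angle calculation described above can be sketched in a few lines. This is a minimal 2-D version assuming landmark coordinates as (x, y) tuples; MediaPipe actually returns normalized 3-D landmarks, but the math is the same:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at vertex b formed by points a-b-c,
    e.g. hip-knee-ankle gives the knee angle."""
    # Vectors from the joint (b) out to the neighboring landmarks.
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos))
```

For example, `joint_angle((0, 1), (0, 0), (1, 0))` is a right angle, 90 degrees.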
What's next for ShotIQ
Ironing out the frontend and developing more detailed criteria for what makes a "good" or "bad" jump shot. We also hope to offer more holistic recommendations across a set of one player's videos instead of just one video. Adding live audio feedback while a player is shooting would be a complicated but valuable new feature for us as well.
Built With
- fastapi
- javascript
- nextjs
- opencv
- python
- react
- typescript