Inspiration
Some of our team members love running and enjoy listening to music while doing so. However, music can sometimes be distracting during a run, and it can be hard to pace your steps to it. We want to make it easier for people to listen to music while running, so they can enjoy their runs more.
What it does
Our software has two parts. The first is a mobile app that calculates your running cadence and then recommends (and ultimately plays) songs whose tempo matches it. The second pulls data from your recent workouts and analyzes it to produce insights that can be useful for future workouts.
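The core idea, cadence-to-tempo matching, can be sketched roughly as follows. This is an illustrative sketch in TypeScript rather than the actual app code (the mobile app is written in Kotlin), and all names here (`cadenceFromSteps`, `matchesCadence`, the `Track` shape) are assumptions, not the project's real API:

```typescript
// Illustrative sketch of cadence-based song matching (not the actual app code).
// Cadence (steps per minute) is estimated from step timestamps, then tracks
// whose BPM sits near the cadence, or near half/double it, are recommended.

interface Track {
  title: string;
  bpm: number;
}

// Estimate steps per minute from a window of step timestamps (milliseconds).
function cadenceFromSteps(stepTimesMs: number[]): number {
  if (stepTimesMs.length < 2) return 0;
  const spanMs = stepTimesMs[stepTimesMs.length - 1] - stepTimesMs[0];
  const intervals = stepTimesMs.length - 1; // N timestamps -> N-1 step intervals
  return (intervals / spanMs) * 60_000;
}

// A track matches if its BPM (or half/double time) is within `tolerance`
// steps-per-minute of the runner's cadence.
function matchesCadence(cadence: number, bpm: number, tolerance = 5): boolean {
  return [bpm, bpm / 2, bpm * 2].some((b) => Math.abs(b - cadence) <= tolerance);
}

function recommend(cadence: number, library: Track[]): Track[] {
  return library.filter((t) => matchesCadence(cadence, t.bpm));
}
```

Matching against half and double time is a common trick, since a 170 steps-per-minute runner can comfortably step on every other beat of an 85 BPM song.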
How we built it
We built the mobile app in Kotlin (using JetBrains IntelliJ). The web app is built with Next.js and connects to Firebase for user authentication, with Firestore as our database. On the backend, we pull users' health data (from their wearables) via the Terra API and upload it to Firestore, which the client then queries. We also experimented with the Spotify API, which we plan to integrate into the mobile app so it can change the music mid-run.
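The backend step that shapes a Terra activity payload into a per-user Firestore document might look something like the sketch below. This is a minimal, hypothetical sketch: the field names are modelled loosely on Terra's activity payload but are assumptions, not the project's actual schema, and the Firestore write itself is omitted:

```typescript
// Hypothetical sketch of flattening a Terra-style activity payload into a
// per-user workout record before writing it to Firestore. Field names are
// assumptions, not the project's real schema.

interface TerraActivity {
  metadata: { start_time: string; end_time: string };
  distance_data?: { summary?: { distance_meters?: number } };
  heart_rate_data?: { summary?: { avg_hr_bpm?: number } };
}

interface WorkoutDoc {
  userId: string;
  startTime: string;      // ISO 8601, kept as-is for display
  durationSec: number;    // derived from start/end timestamps
  distanceMeters: number; // 0 when the wearable reported no distance
  avgHeartRate: number | null;
}

function toWorkoutDoc(userId: string, a: TerraActivity): WorkoutDoc {
  const start = new Date(a.metadata.start_time).getTime();
  const end = new Date(a.metadata.end_time).getTime();
  return {
    userId,
    startTime: a.metadata.start_time,
    durationSec: Math.round((end - start) / 1000),
    distanceMeters: a.distance_data?.summary?.distance_meters ?? 0,
    avgHeartRate: a.heart_rate_data?.summary?.avg_hr_bpm ?? null,
  };
}
```

Flattening on the backend keeps the client-side query simple: the web app only reads small, uniform documents instead of parsing raw wearable payloads.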
Challenges we ran into
We ran into a few difficulties with the Terra API, especially with live data tracking. Although we could not get the live-streaming data API working within the hackathon timeframe, we used the Terra API to give users a web platform for analyzing their workout statistics after each session.
Accomplishments that we're proud of
We worked well together as a team and really enjoyed it, especially since most of us are hackathon beginners. We spent a while on the concept art and UI, including lots of research into the design layout. Working to tight deadlines on little sleep was challenging yet very rewarding when we managed to make significant progress on the project.
What we learned
We learned a lot about tools we had not used before: Kotlin for Android development for the first time, researching how to use obscure APIs with unclear documentation, the intricacies of Gradle, the difficulty of turning a front-end concept into a working implementation, and working as a team under strict time constraints. All of these are useful transferable skills for our future careers.
What's next for FootBeat
We plan to further integrate the designed front-end pages into code, creating a better user experience. We could add voice control, which would make it easier to adjust the app mid-run if the runner forgot to set it up at the start. We will let users further customize their listening experience by picking the genres they want to hear during a run. We also hope to expand the insight analysis on the web app by using a wider range of available data, such as sleep data.