Inspiration
Every term, students face the stressful task of enrolling in high-demand courses. Seats open and close within seconds, and students are often forced to constantly refresh course listings to secure a spot. We wanted to solve this inefficiency and reduce the anxiety around course enrollment, making the process smarter and more seamless.
What it does
SeatSniper is a web application that allows students to express interest in courses they want to enroll in. When a seat becomes available, an instant notification is sent via email, enabling students to quickly secure their spot.
Beyond enrollment monitoring, SeatSniper provides a comprehensive course browsing experience. Students can view courses for the next term, access summarized student reviews pulled from Reddit, and interact with an AI bot to get recommendations, prerequisites, and course insights. All of this is done within a single interface for an easy and informed course selection experience.
How we built it
We began by scraping UWaterloo's course listings for several departments to collect enrollment data. This data was exported to CSV and stored in DynamoDB for fast querying. The backend is built using FastAPI, which serves the data and manages notifications.
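The pipeline above (scraped CSV → DynamoDB items) can be sketched in a few lines. This is a minimal illustration, not our actual schema: the column names, key names, and derived `seats_open` attribute are assumptions for the example.

```python
import csv
import io

# Hypothetical CSV shape produced by the scraper; column names are assumptions.
SAMPLE = """course_code,section,capacity,enrolled
CS 246,LEC 001,120,120
CS 246,LEC 002,90,87
"""

def rows_to_items(csv_text):
    """Convert scraped CSV rows into DynamoDB-style items.

    Each item gets a partition key (course code) and a sort key (section),
    plus a derived seats_open attribute so availability checks are a
    single key lookup instead of a recomputation.
    """
    items = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        capacity = int(row["capacity"])
        enrolled = int(row["enrolled"])
        items.append({
            "pk": row["course_code"],           # partition key
            "sk": row["section"],               # sort key
            "capacity": capacity,
            "enrolled": enrolled,
            "seats_open": capacity - enrolled,  # notify when this goes > 0
        })
    return items

items = rows_to_items(SAMPLE)
print(items[1]["seats_open"])  # → 3
```

In the real app these items would be written with `boto3`'s `put_item`; the sketch stops at the item shape since that is where the CSV-to-DynamoDB mapping lives.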
The frontend is built with React and TypeScript, offering an intuitive interface for browsing courses and tracking interest. We integrated Cohere to generate summarized course reviews from Reddit posts. To reduce load and maintain performance, scraped Reddit data is cached in MongoDB for seven days, keeping reviews reasonably fresh without overloading the scraper. Additionally, we implemented an AI chatbot as a GPT wrapper: it answers questions from our database first, and falls back to a web search when the database lacks the answer.
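The seven-day cache rule can be expressed compactly. In MongoDB we would typically enforce this with a TTL index on a timestamp field; the in-memory sketch below mirrors the same freshness logic so the behaviour is easy to see. All names (`get_reviews`, `fetched_at`, the dict-based cache) are illustrative, not our production code.

```python
import time

SEVEN_DAYS = 7 * 24 * 60 * 60  # cache lifetime in seconds

# Stand-in for the MongoDB collection; keyed by course code.
cache = {}

def get_reviews(course, fetch_fn, now=None):
    """Return summarized reviews for a course, rescraping at most weekly.

    fetch_fn is the expensive path (scrape Reddit, summarize with Cohere);
    it only runs on a cache miss or when the cached entry is older than
    seven days.
    """
    now = time.time() if now is None else now
    entry = cache.get(course)
    if entry and now - entry["fetched_at"] < SEVEN_DAYS:
        return entry["reviews"]                # fresh cache hit
    reviews = fetch_fn(course)                 # miss or stale: refetch
    cache[course] = {"reviews": reviews, "fetched_at": now}
    return reviews
```

With a real TTL index (`create_index("fetched_at", expireAfterSeconds=SEVEN_DAYS)` in pymongo), MongoDB deletes stale documents itself and the application only handles the miss path.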
Challenges we ran into
- Integrating multiple services (DynamoDB, MongoDB, FastAPI, and frontend) into a seamless system.
- Mitigating hallucinated or inaccurate chatbot responses when database information was insufficient.
- Parsing complex and nested HTML structures from the course listing pages.
- Ensuring real-time notifications were reliably delivered via email.
- Summarizing Reddit reviews meaningfully despite inconsistent data formats.
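On the email-delivery challenge above: the core of an alert is just a well-formed message built the moment `seats_open` flips positive. A minimal sketch using the standard library's `email.message` follows; the addresses, subject wording, and body copy are placeholders, and actual sending (via `smtplib` or an email API) is omitted.

```python
from email.message import EmailMessage

def build_seat_alert(to_addr, course, section, seats_open):
    """Build the alert email sent when a watched section opens up.

    Returns an EmailMessage ready to hand to smtplib.SMTP.send_message
    (or an email API); sender address and wording are illustrative.
    """
    msg = EmailMessage()
    msg["To"] = to_addr
    msg["From"] = "alerts@example.com"  # placeholder sender
    msg["Subject"] = f"Seat open: {course} {section}"
    msg.set_content(
        f"{seats_open} seat(s) just opened in {course} {section}. "
        "Enroll quickly before it fills again!"
    )
    return msg

msg = build_seat_alert("student@uwaterloo.ca", "CS 246", "LEC 002", 3)
```

Keeping message construction separate from sending made it easier for us to retry delivery on transient SMTP failures without rebuilding the message.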
Accomplishments that we're proud of
- Successfully integrating all components to work together.
- Learning and implementing DynamoDB effectively.
What we learned
- How to scrape complex, dynamic web data and structure it for use in an application.
- Integrating multiple databases and APIs to support real-time notifications.
- Building a modern frontend with React/TypeScript that interacts smoothly with backend services.
What's next for SeatSniper
In no particular order, here are some features we may add in the future.
- Expand the scraper to cover all departments and courses.
- Automate scraping to update the database continuously.
- Add support for personalized notifications, such as:
  - Alerts for preferred time slots or sections.
  - Notifications when course availability drops below a threshold.
- Incorporate analytics for students to track trends in course demand and availability.
- Explore mobile app integration or push notifications for faster alerts.