The Inspiration
We’ve all been there: the alarm goes off, you hit snooze in a half-conscious daze, and suddenly it’s 45 minutes later, and you’re late for class. We wanted to build something that doesn't just make noise, but actually holds you accountable. We figured if your alarm clock can see you, it can make sure you’re actually up before it stops.
What it does
Roost is a smart alarm system that integrates a web-based dashboard with physical hardware:
Smart Scheduling: Users can set, toggle, and manage multiple alarms through a web interface powered by the web_ui Brick.
Visual Accountability: Using a camera and the YOLOv8 pose-detection model (yolo26n-pose.onnx), the system monitors the room. If it doesn't detect that you've physically left your bed within a certain window, the alarm won't stay quiet.
Hardware Feedback: The Arduino Uno Q displays the current time on a TM1637 7-segment display and controls a physical servo motor designed to "fire" a rubber band mechanism if you stay in bed too long.
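The wake-up check described above reduces to a small decision function: before the alarm time do nothing, then ring, then escalate to the servo if the user stays in bed past a grace window. Here is a minimal sketch of that logic; the function name, return values, and five-minute window are our own illustration, not the actual Roost code:

```python
from datetime import datetime, timedelta

def alarm_action(alarm_time: datetime, now: datetime,
                 in_bed: bool, grace: timedelta = timedelta(minutes=5)) -> str:
    """Decide what the hardware should do at a given moment.

    Returns one of:
      "idle" - before the alarm time
      "ring" - alarm sounding, user still within the grace window
      "fire" - grace window expired and the user is still in bed
      "off"  - user has left the bed, so the alarm can stop
    """
    if now < alarm_time:
        return "idle"
    if not in_bed:
        return "off"
    if now - alarm_time < grace:
        return "ring"
    return "fire"
```

The backend can poll this once per second and forward `"fire"` to the Arduino over the bridge, keeping all the policy on the Python side.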
How we built it
The project is split into three main layers:
Python Backend: Python handles the logic, time tracking, and vision processing. It runs a WebSocket server to stay in sync with the frontend and communicates with the Arduino using the Router Bridge.
Computer Vision/YOLO: We integrated the Ultralytics YOLO and OpenCV libraries to run pose detection. This lets the app check each frame for a person and determine their state.
Arduino Hardware: An Arduino Uno Q manages the peripherals, driving the clock display and a servo motor for the physical "alarm" action.
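Once the pose model returns keypoints, the vision layer still has to turn them into an in-bed/out-of-bed decision. A simple way to do that is to test whether a confidently detected hip keypoint falls inside a calibrated bed rectangle. This is a sketch of that post-processing step only; the bed coordinates, confidence threshold, and function name are illustrative assumptions, not values from the project:

```python
# Assumed bed region in image pixels: (x1, y1, x2, y2).
BED_REGION = (100, 200, 500, 460)

# COCO-format pose keypoint indices for the hips.
LEFT_HIP, RIGHT_HIP = 11, 12

def in_bed(keypoints, region=BED_REGION, min_conf=0.5):
    """keypoints: list of 17 (x, y, conf) triples, one per COCO keypoint.

    The person counts as 'in bed' when either hip is detected with
    sufficient confidence inside the bed rectangle.
    """
    x1, y1, x2, y2 = region
    for idx in (LEFT_HIP, RIGHT_HIP):
        x, y, conf = keypoints[idx]
        if conf >= min_conf and x1 <= x <= x2 and y1 <= y <= y2:
            return True
    return False
```

Using hips rather than the head makes the check robust to sitting up in bed, which shouldn't count as getting up.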
Challenges we ran into
Synchronization was the biggest hurdle. Keeping the web UI, the Python vision script, and the Arduino hardware all talking to each other in real time took a lot of tuning with Socket.IO and the Bridge library. We also had to account for the time-zone offset (PST) so the board's clock matched the user's actual time.
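One way to handle that offset without hard-coding "UTC-8" is Python's standard-library `zoneinfo`, which applies PST/PDT transitions automatically. A minimal sketch of the conversion the display needs (the function name is ours, not from the project):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library since Python 3.9

PACIFIC = ZoneInfo("America/Los_Angeles")

def board_time(utc_now: datetime) -> tuple[int, int]:
    """Convert an aware UTC timestamp to the (hour, minute) pair the
    7-segment display should show, honoring PST/PDT automatically."""
    local = utc_now.astimezone(PACIFIC)
    return local.hour, local.minute
```

A fixed numeric offset would drift by an hour every daylight-saving change, which is exactly the kind of bug an alarm clock can't afford.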
Accomplishments that we're proud of
We managed to get a full-stack IoT application running on the Uno Q that feels like a finished product. The computer vision is snappy, and the physical hardware response (the servo firing) adds a layer of "consequence" that standard phone alarms just don't have. And we carried it from start to finish while being mostly new to hardware.
What we learned
We learned a ton about deploying machine learning models (ONNX/YOLO) in a resource-constrained environment and about bridging the gap between high-level Python logic and low-level C++ hardware control. We also learned to navigate Linux and problem-solve through its hurdles: commands, workarounds, and software-compatibility quirks.
What's next for Roost
We’d love to add an "Aggressive Mode" where the alarm volume increases, or the servo fires more frequently, the longer the camera detects you're still in bed. We also want to implement sleep-tracking analytics so users can see their sleep patterns over time. Beyond that, adding more mechanisms and letting users customize their alarm down to the physical level would be a fresh direction for Roost.
Built With
- arduino
- arduino-router-bridge
- c++
- css3
- fastapi
- html5
- javascript
- jinja2
- numpy
- onnx
- onnx-runtime
- opencv
- python
- pytorch
- scipy
- servo-motor
- socket.io
- tm1637-7-segment-display
- torchvision
- yolov8