Inspiration

Parkinson’s is the second most common neurodegenerative disorder, affecting more than 110,000 people in Canada alone. It often steals control over one’s own movement: tremors, stiffness, and difficulty writing can make daily tasks overwhelming. And while therapy helps, access is uneven, and progress requires constant, guided practice.

The need goes beyond Parkinson’s. Stroke survivors, people with acquired brain injuries, and individuals with motor-development disorders all face similar challenges with fine-motor precision.

We want to make therapy more accessible and effective, empowering people to regain control, rebuild confidence, and improve their daily lives.

What it does

SteadyScript makes fine-motor therapy more accessible, using real-time biofeedback to help users build steadier, more controlled movements. Our system pairs a biofeedback pen with a simple web app to guide users through short, structured exercises.

Each session starts with a tremor baseline test: users hold the pen upright and steady while its LED signals their state in real time, staying lit while their grip is stable or merely variable and turning off when it becomes unstable. SteadyScript prompts users to adjust their grip and discover what feels most stable. Next, a mobility test asks users to move the pen back and forth along a line to the beat of a metronome, while the pen and app give instant visual and on-screen alerts when movement becomes less steady.

At the end, users receive a clear summary of their performance and can track their progress over time. With repeated practice and continuous feedback, SteadyScript helps retrain fine-motor control.

How we built it

SteadyScript was built as a full-stack, real-time training system combining computer vision, hardware feedback, and a responsive web interface.

On the computer vision side, we used OpenCV to process a live webcam feed at ~30 FPS. A coloured marker on the pen tip is tracked using HSV colour segmentation, contour detection, and centroid extraction. To quantify tremor, we implemented a jitter detection algorithm that maintains a rolling window of recent positions, computes a smoothed average, and measures deviation from that average to isolate high-frequency micro-movement.
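The rolling-window deviation step can be sketched as follows. The window size and the mean-based smoothing here are illustrative choices, not necessarily the exact values or filter used in SteadyScript:

```python
from collections import deque
import math

class JitterDetector:
    """Estimate high-frequency micro-movement as the deviation of the
    latest tracked point from a smoothed average of recent positions.
    Window size is an illustrative default."""

    def __init__(self, window=15):
        self.points = deque(maxlen=window)

    def update(self, x, y):
        """Add a new (x, y) centroid and return the current jitter (pixels)."""
        self.points.append((x, y))
        if len(self.points) < 2:
            return 0.0
        # Smoothed centre of the recent trajectory
        cx = sum(p[0] for p in self.points) / len(self.points)
        cy = sum(p[1] for p in self.points) / len(self.points)
        # Deviation from the smoothed centre isolates the wobble
        return math.hypot(x - cx, y - cy)
```

A perfectly steady hand yields zero deviation, while tremor shows up as a persistent gap between each new point and the smoothed centre.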

For our FOLLOW mode, where users perform drawing exercises, we designed a custom lateral jitter algorithm. Instead of penalizing intentional movement, the system computes the dominant direction of motion and measures only the perpendicular wobble, allowing us to accurately assess tremor during continuous drawing tasks without false positives.
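A minimal sketch of the lateral-jitter idea, assuming the dominant direction is estimated from the start-to-end displacement of the window (the real implementation may estimate it differently, e.g. via regression):

```python
import math

def lateral_jitter(points):
    """RMS wobble perpendicular to the dominant direction of motion.
    Intentional movement along the line contributes nothing; only
    side-to-side deviation is counted."""
    if len(points) < 2:
        return 0.0
    x0, y0 = points[0]
    x1, y1 = points[-1]
    dx, dy = x1 - x0, y1 - y0
    norm = math.hypot(dx, dy)
    if norm == 0:
        return 0.0
    # Unit vector perpendicular to the dominant direction
    px, py = -dy / norm, dx / norm
    # Project each point's offset onto the perpendicular axis
    devs = [(x - x0) * px + (y - y0) * py for x, y in points]
    return math.sqrt(sum(d * d for d in devs) / len(devs))
```

A straight stroke scores zero regardless of speed, while a zigzag of the same length scores high, which is exactly the separation between intent and tremor described above.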

The backend is built with FastAPI, using WebSockets to stream base64-encoded video frames and real-time metrics to the frontend with low latency. Session data, including jitter percentiles and stability scores, is stored locally in JSON for progress tracking.
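The per-frame message can be sketched as below; `encode_payload`, the field names, and the endpoint path are illustrative stand-ins, not the project's actual API:

```python
import base64
import json

def encode_payload(frame_jpeg: bytes, jitter: float, stability: str) -> str:
    """Package one JPEG-encoded frame plus current metrics as the JSON
    text message pushed to the frontend over the WebSocket."""
    return json.dumps({
        "frame": base64.b64encode(frame_jpeg).decode("ascii"),
        "jitter": jitter,
        "stability": stability,
    })

# Inside a FastAPI WebSocket endpoint this would be used roughly as:
#
#   @app.websocket("/ws/session")
#   async def session_stream(ws: WebSocket):
#       await ws.accept()
#       while True:
#           await ws.send_text(encode_payload(frame, jitter, state))
#           await asyncio.sleep(1 / 30)  # pace to ~30 FPS
```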

Our frontend is a React + TypeScript application built with Vite, Tailwind CSS, and Framer Motion, providing a polished and accessible UI with live video, stability gauges, adjustable metronome controls, and historical progress charts rendered using Recharts.

To enable eyes-free training, we integrated physical hardware feedback using an Arduino Uno. The backend communicates with the Arduino over a serial connection via PySerial, sending a byte whenever the stability level changes; the Arduino then toggles LEDs that indicate the stability state in real time (green for stable, yellow for warning, off for unstable), allowing users to focus on the task rather than the screen.
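The backend side of that mapping can be sketched as follows; the thresholds, byte values, and serial port name are illustrative assumptions, not the project's actual constants:

```python
# One byte per state; the Arduino sketch reads the byte and drives the LEDs.
STATE_BYTES = {"stable": b"G", "warning": b"Y", "unstable": b"O"}

def classify(jitter: float, warn: float = 4.0, bad: float = 8.0) -> str:
    """Map a continuous jitter metric (pixels) to a discrete LED state.
    Thresholds here are illustrative, not tuned values."""
    if jitter < warn:
        return "stable"
    if jitter < bad:
        return "warning"
    return "unstable"

# Sending the state over serial with PySerial (sketch):
#
#   import serial
#   arduino = serial.Serial("/dev/ttyACM0", 9600)   # port is an assumption
#   arduino.write(STATE_BYTES[classify(jitter)])
```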

Challenges we ran into

One of our biggest challenges was accurately measuring tremor during intentional movement. Early versions of our system falsely penalized users simply for moving the pen, making the feedback feel more like a game than a meaningful training tool. We addressed this by designing the lateral jitter algorithm, which separates intended motion from involuntary wobble by projecting movement onto a perpendicular axis.

Another major challenge was achieving stable real-time performance across vision processing, WebSocket streaming, and hardware feedback. Balancing frame rate, latency, and responsiveness required careful optimization, especially when encoding video frames and synchronizing metrics with the UI and Arduino signals.

Hardware integration also presented difficulties. Mapping continuous stability metrics into discrete, intuitive LED states took several iterations to avoid flickering or confusing feedback, and we had to tune thresholds carefully so that feedback felt supportive rather than discouraging.
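The flicker problem is commonly handled with hysteresis: the displayed LED state only switches after several consecutive frames agree on the new state. A minimal sketch of that idea, with an illustrative hold count (not necessarily how SteadyScript resolved it):

```python
class SmoothedState:
    """Debounce a noisy per-frame state so the LED doesn't flicker
    when the metric hovers near a threshold."""

    def __init__(self, hold=5, initial="stable"):
        self.hold = hold            # frames of agreement required to switch
        self.current = initial
        self.candidate = None
        self.count = 0

    def update(self, raw_state: str) -> str:
        if raw_state == self.current:
            self.candidate, self.count = None, 0   # agreement: reset
        elif raw_state == self.candidate:
            self.count += 1
            if self.count >= self.hold:            # sustained change: switch
                self.current, self.candidate, self.count = raw_state, None, 0
        else:
            self.candidate, self.count = raw_state, 1  # new candidate state
        return self.current
```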

Accomplishments that we're proud of

We’re proud to have built a fully functional, end-to-end system that integrates computer vision, a real-time web application, and physical hardware feedback into a cohesive training experience.

We’re especially proud of the lateral jitter detection approach, which allows SteadyScript to evaluate movement quality in a way that feels both fair and meaningful during drawing tasks. This made the tool feel like a true motor training aid rather than a simple stability test.

The physical Arduino LED feedback is another highlight. By allowing users to train without constantly looking at a screen, we preserved focus and reduced cognitive load, which is critical for accessibility-focused design. Finally, delivering a polished frontend with smooth animations, live charts, and session history within a hackathon timeframe was a significant achievement for our team.

What we learned

Through this project, we learned that measuring human motor control is fundamentally different from measuring static accuracy. Separating intentional movement from involuntary instability required careful signal processing and thoughtful task design. We gained hands-on experience with real-time computer vision pipelines, including noise reduction, rolling-window analysis, and percentile-based scoring to produce stable and interpretable metrics.

We also learned the importance of hardware as a first-class design element in accessibility projects. Physical feedback can dramatically change how usable and empowering a system feels. On the systems side, we deepened our understanding of WebSocket-based streaming, backend–frontend synchronization, and reliable serial communication with microcontrollers.

Most importantly, we learned how critical scoping and framing are when building assistive technologies—focusing on training, feedback, and empowerment rather than diagnosis or cure.

What's next for SteadyScript

  • Custom tracing modes with feedback
  • Mobile connectivity for greater accessibility
  • A fully wireless smart biofeedback pen via Bluetooth

Built With

arduino · fastapi · framer-motion · opencv · pyserial · python · react · recharts · tailwind-css · typescript · vite