Inspiration

We were inspired by the question: How can we create new modes of interaction? Many people with limb loss or paralysis struggle with everyday digital interaction, something most of us take for granted. Our goal was to make technology not just accessible, but inclusive and fun! The Human Augmentation track appealed most to us and challenged us to go beyond assistance to create an interface that expands what the human body can do.

What it does

Blinklet uses the Muse 2 EEG headband to detect blinks and eye activity, translating them into real-time keyboard strokes on the computer. With Blinklet, users can scroll through Instagram and play a variety of games entirely hands-free. It redefines accessibility by letting individuals interact with digital platforms using only natural facial and neural signals. There are currently four user inputs you can bind keys to: single blink, double blink, look left, and look right.
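As a rough illustration, the four inputs might be bound to keys with a simple lookup like the following (the names, structure, and default bindings here are hypothetical, not Blinklet's actual code):

```python
# Hypothetical sketch: mapping Blinklet's four inputs to keystrokes.
# The event names and default keys are illustrative assumptions.

DEFAULT_BINDINGS = {
    "single_blink": "space",   # e.g. jump in Dino / Flappy Bird
    "double_blink": "enter",
    "look_left": "left",
    "look_right": "right",
}

def resolve_key(event, bindings=DEFAULT_BINDINGS):
    """Return the key bound to a detected input, or None if unbound."""
    return bindings.get(event)
```

In the real backend, the resolved key would then be emitted with pyautogui (e.g. `pyautogui.press(...)`), and the front end could overwrite entries in the bindings table per game.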

How we built it

The Muse backend uses pylsl to connect to the Muse, then scans the incoming signal for user actions by combining information from all four sensors. It also starts a FastAPI server so the front end can send keybindings; when the backend detects a user input, it uses pyautogui to emit the bound keystroke. The front end is a web-based game hub built with React + Vite, featuring a navbar, game directory, user authentication, and embedded classic games. Each game (Snake, Flappy Bird, Dino, Breakout, Pong, etc.) is hosted locally in the public/ directory and displayed through reusable components like GameTemplate. We also handled responsive layout and styling fixes, a user authentication integration point, and EEG integration.

Challenges we ran into

Fine-tuning the EEG artifact detection took a while: it was hard to find the right sensor combinations and cutoffs for an accurate reading of user signals. While embedding games into our website, we also faced issues creating a structured template for them, due to dimension problems, templating, and coordinating multiple games on one site. Additionally, we ran into several database errors and misconfigurations when setting up user authentication. It's our first time using Supabase, so it took some time to understand its permissions and policies.

Accomplishments that we're proud of

- Achieved real-time blink-to-action control with very minimal delay
- It's fun, and now we can watch reels without needing our hands to scroll
- Created a functional hands-free gaming and social media interface
- Demonstrated that EEG signals and eye blinks can form a viable, accessible human-computer interface
- Successfully integrated hardware, neuroscience, and software in one project

What we learned

We learned how to set up a command-line interface in Python, connect the Muse to keybindings, and filter for user input. We also learned how to start a FastAPI server on a separate thread from the Muse logic and keep the threads in sync. On the web app side, we learned how to properly configure a database with user authentication and how to embed external games into our own website. More generally, we learned how important and practical it is to bring basic, fun experiences to people who may not have been privileged enough to enjoy them regularly.

What's next for BlinkLet

Adding more ways to interact with keybindings (for example: look up, look down, left blink, right blink, maybe triple blink), and of course improving accuracy. We also hope to find people who would genuinely benefit from our app, let them try it, and see how impactful it is for them.
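The artifact-detection step described above, scanning sensor data for blinks and collapsing them into single or double blinks, can be sketched roughly like this. The thresholds, timing windows, and function names are assumptions for illustration, not Blinklet's actual parameters:

```python
# Illustrative sketch of threshold-based blink detection, assuming the
# backend receives timestamped EEG samples (e.g. pulled from a pylsl
# inlet). Cutoff and timing values below are assumed, not tuned.

BLINK_THRESHOLD_UV = 100.0   # amplitude cutoff in microvolts (assumed)
DOUBLE_BLINK_GAP_S = 0.5     # max gap between blinks of a double blink

def detect_blinks(samples, threshold=BLINK_THRESHOLD_UV):
    """Return timestamps where the mean amplitude across the four
    sensors crosses the threshold (rising edges only)."""
    blinks = []
    above = False
    for t, channels in samples:          # channels: 4 sensor readings
        mean_amp = sum(abs(c) for c in channels) / len(channels)
        if mean_amp > threshold and not above:
            blinks.append(t)             # rising edge -> one blink
            above = True
        elif mean_amp <= threshold:
            above = False                # re-arm once signal settles
    return blinks

def classify(blinks, gap=DOUBLE_BLINK_GAP_S):
    """Collapse blink timestamps into single/double blink events."""
    events, i = [], 0
    while i < len(blinks):
        if i + 1 < len(blinks) and blinks[i + 1] - blinks[i] <= gap:
            events.append("double_blink")
            i += 2
        else:
            events.append("single_blink")
            i += 1
    return events
```

Each classified event would then be looked up in the keybinding table sent by the front end and emitted with pyautogui; running the detector and the FastAPI server on separate threads keeps keystroke emission responsive while bindings are updated.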
