About the Project

Inspiration

Technology is supposed to make communication easier, yet it has made face-to-face connection harder than ever. Around 80% of Gen Z report feeling lonely, and some are even turning to AI companions. We wanted to explore whether we could help people act on their real-world impulses instead.

What We Built

We developed a brain-computer interface (BCI) pipeline that uses your brain activity to control a robotic arm and generate a custom message for that someone you might be interested in. Using Emotiv BCIs, Brain Crush reads raw EEG signals across the alpha, low-beta, high-beta, and theta frequency bands. We then built our own interpretation layer that filters and processes these bands and converts them into input commands for our LeRobot. In parallel, we used Perplexity to create a prompt to feed into HeyGen; for our demo, we trained an AI avatar on Jeremy that delivers a message to the person we wanted to use Brain Crush on. Finally, Brain Crush is deployed on Vercel. Conceptually, the pipeline is:

$$ \text{Brain Waves} \rightarrow \text{Signal Processing} \rightarrow \text{AI Interpretation} \rightarrow \text{Discrete Intent} \rightarrow \text{Robot Action} \rightarrow \text{AI Video Generation} $$

Because neural signals are noisy and inconsistent, we converted the continuous signal into a small set of discrete outputs. In practice, this meant mapping a high-dimensional time series to a few categories (e.g., calm, focused, excited, or neutral):

$$ f: \mathbb{R}^n \rightarrow \{A_1, A_2, A_3, A_4\} $$

Each emotion activates a different robotic behaviour.
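The mapping $f$ above can be sketched as a simple rule-based classifier. Everything here is illustrative: the `BandPowers` names and all thresholds are placeholders rather than our calibrated values, which in practice would come from each wearer's baseline.

```python
from dataclasses import dataclass

# Hypothetical band-power feature vector: mean power per band in
# microvolts squared. Field names are illustrative, not our real code.
@dataclass
class BandPowers:
    alpha: float
    low_beta: float
    high_beta: float
    theta: float

def classify_state(p: BandPowers) -> str:
    """Map a continuous band-power reading to one of four discrete states.

    Thresholds are placeholders; real cutoffs need per-wearer calibration.
    """
    if p.high_beta > 300:  # strong high-beta -> "excited" (interest)
        return "excited"
    if p.low_beta > p.alpha and p.low_beta > p.theta:
        return "focused"
    if p.alpha > p.high_beta:  # alpha-dominant -> relaxed
        return "calm"
    return "neutral"
```

For example, `classify_state(BandPowers(alpha=12, low_beta=8, high_beta=350, theta=6))` returns `"excited"`.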

How We Built It

Our system consists of five main layers:

1) Brain Signal Collection. We used Emotiv Epoc X and Insight BCIs to capture real-time EEG waveform data.

2) EEG Processing. Using data we gathered from medical research, we filtered and isolated the brain waves most indicative of romantic interest, which in our case was the high-beta band. We set the interest threshold at roughly 300 µV² of band power, and classified non-interest when power stayed under 5 µV² for a sustained 5 seconds.
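The thresholding described above can be sketched roughly like this. Only the 300 µV², 5 µV², and 5-second figures come from our setup; the function name, sample rate, and polling structure are assumptions for illustration.

```python
def detect_intent(power_samples, sample_rate_hz=2.0,
                  interest_thresh=300.0, rest_thresh=5.0,
                  sustain_s=5.0):
    """Scan a stream of high-beta band-power readings (µV²).

    Returns "interest" as soon as any sample exceeds interest_thresh,
    "no_interest" once power has stayed below rest_thresh for a
    sustained window (5 s here), and None if neither condition fires.
    """
    needed = int(sustain_s * sample_rate_hz)  # consecutive samples required
    below = 0
    for p in power_samples:
        if p > interest_thresh:
            return "interest"
        below = below + 1 if p < rest_thresh else 0
        if below >= needed:
            return "no_interest"
    return None
```

At the assumed 2 Hz sample rate, ten consecutive sub-threshold readings cover the 5-second non-interest window.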

3) Intent Mapping. To ensure reliability, we constrained the output to a small number of discrete actions. Instead of attempting continuous control, we implemented a classification system that maps continuous inputs onto the set of commands we programmed into the LeRobot.

4) Robotic Execution. The robotic arm performs a physical task corresponding to the detected intent. For example:

  • Extend the arm to hand candy to your romantic interest
  • Shake its arm "no" if you weren't feeling too interested
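A minimal sketch of how a detected intent could select a robot behaviour. The action names and the dispatch helper are hypothetical; in practice each entry would trigger a pre-recorded LeRobot trajectory rather than a string.

```python
# Hypothetical dispatch table from discrete intent to a named robot
# behaviour. Keeping this table small is the whole point of the
# intent-mapping layer: only a few well-tested actions can ever run.
ACTIONS = {
    "interest": "extend_arm_with_candy",
    "no_interest": "shake_arm_no",
}

def dispatch(intent):
    """Resolve a discrete intent to a robot action, defaulting to idle."""
    return ACTIONS.get(intent, "hold_idle")
```

Unknown or noisy intents fall through to an idle action, so a misclassification never produces an unplanned movement.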

5) HeyGen. We used Perplexity to interpret the rest of the data available to us from the BCI and create a prompt for HeyGen. The prompt combines:

  • A combination of the alpha, beta, and gamma band powers
  • Rates of change (derivatives) of the fluctuations in brain activity
  • A video the user uploads of themselves talking into the camera

Then, HeyGen creates a message with an AI avatar trained to look and sound exactly like you. Basically, if you’re too shy to talk to someone at the cafe, our HeyGen avatar can do it for you.
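One way the three prompt ingredients above could be assembled into text. This is a hand-written stand-in: in our pipeline Perplexity composed the actual prompt from the same inputs, and `build_prompt` with its formatting is purely illustrative.

```python
def build_prompt(band_powers, derivatives, video_path):
    """Assemble a text prompt for the avatar service from band powers
    (µV²), their rates of change, and the user's uploaded video.

    Illustrative only -- Perplexity produced the real prompt.
    """
    bands = ", ".join(f"{k}={v:.1f}" for k, v in sorted(band_powers.items()))
    trends = ", ".join(f"{k}:{'rising' if v > 0 else 'falling'}"
                       for k, v in sorted(derivatives.items()))
    return (f"Write a short, friendly icebreaker. EEG band powers: {bands}. "
            f"Trends: {trends}. Match the speaker in {video_path}.")
```
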

Data and Simulation

We used data from a variety of sources. The main input was real-time EEG data from our BCIs. To interpret it, we used Bright Data's site scrapers to determine which specific frequency band was most effective at indicating romantic interest. We also accounted for edge cases by injecting data directly in Node-RED, smoothing out any inconsistencies from our BCI.
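The smoothing idea behind that edge-case handling can be sketched as a simple exponential moving average. The Node-RED flow itself is visual, so this Python sketch only illustrates the principle, and the smoothing factor is a guess rather than a tuned value.

```python
def ema_smooth(samples, alpha=0.3):
    """Exponential moving average over raw power readings.

    A simple way to damp the spikes seen when an electrode loses
    contact; alpha=0.3 is an illustrative smoothing factor.
    """
    out, prev = [], None
    for s in samples:
        prev = s if prev is None else alpha * s + (1 - alpha) * prev
        out.append(prev)
    return out
```
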

Challenges We Faced

1) EEG Outputs. Dry electrodes produce noisy signals, so saline solution is normally used to improve conductivity at the BCI electrodes. Since we didn't have any saline solution, we grabbed salt shakers from a dining hall to make our own.

2) Output Translation. We had to isolate individual brain waves and EEG data points and convert them into a set of outputs, researching and designing our own logic system that linked specific frequency bands to romantic interest and excitement.

3) Real-Time Integration. We had to optimize our system to minimize latency, since we're essentially controlling an entire AI-processing, computer-vision, and robotics pipeline with our thoughts (mind control).

What We Learned

This was our first time working with LeRobot robotic arms, and Sarah, Jeremy, and Claire’s first experience using EEG technology. We had to quickly learn how to configure the LeRobot environment and interpret what the raw EEG outputs meant in practice.

Something cool we all learned was how to pet a llama (shoutout to Munay!).

Built with:

  • Emotiv EEG headsets (Insight and Epoc X)
  • LeRobot Robotic Arms
  • Node-RED
  • Next.js
  • Perplexity
  • HeyGen
  • BrightData
  • Vercel
