# OpenGalea

By Team Syncer
📋 View the project on Devpost for the story, demo video, sizzle reel, and full hackathon submission.
Videos: Demo · Sizzle reel
🏆 Meta track winner @ MIT Reality Hack 2025 — OpenGalea was awarded 5 Quest 3 headsets by Meta, recognizing our brain-controlled, colocated multiplayer mixed reality system and supporting the next phase of development.
Contributors: Alif Jakir, Tsing Liu, Soo Bin Park, Yechan Ian Seo, Syed Hussain Ather
Special thanks to Vankley Yi for hardware engineering support.
> "We stuck together the Ultracortex with a Quest 3, but made space by giving the prefrontal cortex a lobotomy, and then we added a bunch of electrodes, and made a local multiplayer shared experience."
## Table of Contents
- Introduction
- Project Genesis: MIT Reality Hack
- Features
- Use Cases
- Bill of Materials (BOM)
- Hardware Setup
- Software Setup
- Machine Learning Model
- Running OpenGalea
- Colocation Implementation
- Troubleshooting
- Contributing
- License
- Acknowledgements
- Contact
- Inspirations
- Repositories & Links
## Introduction

OpenGalea is an open-source project that merges neuroscience and mixed reality into immersive, brain-controlled experiences. An 8-channel EEG system (OpenBCI Cyton) paired with the Meta Quest 3 lets users shape virtual environments with their brainwaves—no controllers required. We built it to democratize neurotechnology: at roughly one-fifteenth the cost of commercial alternatives (OpenGalea ~$1,900 vs. Galea ~$30,000), it opens the door for researchers, developers, and enthusiasts to collect meaningful datasets and build entirely brain-driven, closed-loop visual and auditory experiences. The result is a platform where thought and presence share the same space.
## Project Genesis: MIT Reality Hack

OpenGalea was conceived and built during the hardware track at MIT Reality Hack, a premier hackathon that brings together innovators in mixed reality, AI, hardware, and game development. The intensity of the hackathon—and the chance to iterate alongside other teams—pushed the project from concept to a working, multi-user system in days. At MIT Reality Hack 2025, OpenGalea was named Meta track winner, earning 5 Quest 3 headsets from Meta to scale our colocated multiplayer experiences and bring the vision to more users.
## Features

- Colocated Mixed Reality: Shared virtual environments where multiple users share the same physical space—see each other, see the same virtual objects, and collaborate in real time.
- Brain-Computer Interface: An 8-channel EEG system streams and analyzes brainwaves in real time, so attention, relaxation, and other states can drive the experience.
- Custom Machine Learning Model: A trained classifier turns raw EEG into reliable mental-state labels (e.g., Attention, Relaxation), enabling responsive, brain-driven interactions.
- UDP Communication: Low-latency links between EEG hardware, laptops, and the Quest 3 keep brain data and the MR experience in sync.
- Unity + Meta XR SDK: Full mixed reality experiences built in Unity, with colocation and shared spatial anchors so everyone shares one coherent world.
- Open-Source and Affordable: Hardware, software, and docs are open—so labs, indie devs, and makers can build without the six-figure price tag of commercial neuro-MR systems.
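The UDP link above can be sketched in a few lines. This is a minimal loopback demo, not the project's actual wire format: the JSON fields (`state`, `alpha`) and the one-datagram-per-update pattern are assumptions for illustration.

```python
import json
import socket

def send_state(sock, addr, state, alpha_power):
    """Send one classified mental-state update as a small JSON datagram.

    The {"state": ..., "alpha": ...} payload is a hypothetical format;
    the real Syncer protocol may differ.
    """
    payload = json.dumps({"state": state, "alpha": alpha_power}).encode()
    sock.sendto(payload, addr)

# Loopback demo: bind a receiver on an ephemeral port, send one update to it.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))        # port 0 = let the OS pick a free port
addr = recv_sock.getsockname()

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_state(send_sock, addr, "Relaxation", 0.82)

data, _ = recv_sock.recvfrom(1024)
msg = json.loads(data)
print(msg)  # -> {'state': 'Relaxation', 'alpha': 0.82}
```

UDP keeps per-update latency low because there is no handshake or retransmission; a dropped packet simply means the headset keeps using the previous second's state.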
## Use Cases

OpenGalea’s combination of BCI and colocated MR opens a wide range of applications:
- Gaming: Brain-controlled mechanics—focus to aim, relax to trigger—for deeper immersion and new forms of interdependence between players.
- Therapeutic Applications: Neuroadaptive environments that respond to attention and relaxation for meditation, stress reduction, and mental wellness.
- Collaborative Training: Shared virtual simulations where teams train together in the same space, with feedback tied to mental state and cooperation.
- Accessibility: Assistive interfaces that reduce reliance on physical input, enabling interaction through intention and focus.
- Non-Verbal Communication: Shared experiences where emotional or cognitive state (e.g., calm vs. focused) becomes part of how people coordinate—without saying a word.
## Bill of Materials (BOM)

Keeping the system affordable was a core goal: we want labs, educators, and indie creators to run experiments and build experiences without six-figure hardware. A detailed BOM, including component sources and costs, is available in BOM.md.
Cost Comparison:
- OpenGalea: Approximately $1,900
- Commercial Equivalent (e.g., Galea): Approximately $30,000
OpenGalea costs roughly one-sixteenth as much as comparable commercial systems ($30,000 ÷ $1,900 ≈ 15.8), bringing neuroadaptive MR out of high-budget labs and into the wild.
## Hardware Setup

### 3D Printing

- Headset Components: The front and back components of the OpenGalea headset are designed for 3D printing. STL files are available in the `OpenGalea/3d-models` directory of the Hardware Repository.
- Recommended Materials: PLA or PETG
- Print Settings:
  - Layer Height: 0.2 mm
  - Infill: 20–30%
  - Supports: as needed
- Refer to your filament and printer documentation for specific temperature settings.
### Assembly

A detailed, step-by-step assembly guide with diagrams and photos is available in the Hardware Repository's HARDWARE.md.
Key Assembly Steps:
- Prepare all components (3D printed parts, OpenBCI Cyton board, electrodes, wiring, Velcro straps).
- Assemble the Ultracortex frame (refer to OpenBCI Ultracortex Mark IV documentation if needed).
- Integrate the Cyton board onto the 3D printed back component.
- Attach electrodes and route wiring.
- Mount the front component to the Ultracortex frame.
- Add weights and Velcro straps for balance and fit.
- Connect the assembled system to the Quest 3 headset.
## Software Setup

### Prerequisites

- Operating System: Windows 10 or 11 (for the OpenBCI GUI and model training)
- Unity: Version 2022.3 or later
- Meta XR SDK: Download and import into your Unity project
- OpenBCI GUI: Download from the OpenBCI website
- Python: Version 3.9 or later (for machine learning components)
- Python Libraries: `numpy`, `scipy`, `scikit-learn`, `joblib`, `pylsl` (for Lab Streaming Layer), and `websockets`. Install them with pip:

  ```bash
  pip install numpy scipy scikit-learn joblib pylsl websockets
  ```
- Visual Studio or Rider (for C# development in Unity)
- Blender (optional, for 3D model editing)
### Installation

- Clone the Repositories:
  - Software Repository (Unity, ML, BCI):

    ```bash
    git clone https://github.com/sbpark422/Syncer.git
    cd Syncer
    ```

  - Hardware Repository (Design Files + Hardware):

    ```bash
    git clone https://github.com/Caerii/OpenGalea.git
    cd OpenGalea
    ```
- Set up the Unity Project:
  - Open Unity Hub and create a new project using Unity 2022.3 or later.
  - Import the Meta XR SDK.
  - Copy the contents of the `Assets` directory from the `Syncer` repository into your Unity project's `Assets` folder.
- Install Python Dependencies:

  ```bash
  cd Syncer/ML  # navigate to the ML directory within the Syncer repo
  pip install -r requirements.txt
  ```

  (Create a `requirements.txt` file listing all Python dependencies in the `Syncer/ML` directory.)

- Install the OpenBCI GUI:
  - Download and install the OpenBCI GUI for your operating system.
## Machine Learning Model

The ML pipeline turns raw EEG into actionable states (e.g., Attention vs. Relaxation) so the MR experience can respond in real time. Below is how we collect data, train the model, and run inference live.

### Data Collection
- EEG data was collected using the OpenBCI Cyton board and the provided Python script (`Syncer/ML/model_dev_Attention.py`).
- Data was collected for two states: "Attention" and "Relaxation."
- The dataset comprises 112,900 data points (498 seconds).
- Electrode Placement:
  - Fp1: Left frontal lobe
  - Fp2: Right frontal lobe
  - C3: Left central region
  - C4: Right central region
  - T5: Left temporal lobe
  - T6: Right temporal lobe
  - O1: Left occipital lobe
  - O2: Right occipital lobe
### Training

- A modified Random Forest algorithm was used for classification.
- Features: raw EEG data, alpha-band power, and beta-band power.
- The model was trained on the collected EEG data and saved as `Syncer/ML/random_forest_model_Attention_AlphaBeta.joblib`.
- Training scripts and details are available in the `Syncer/ML` directory.
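To make the feature-and-training step concrete, here is a runnable sketch of the alpha/beta band-power pipeline feeding a scikit-learn Random Forest. The window length, band edges, Welch parameters, hyperparameters, and the synthetic stand-in data are all assumptions for illustration; the actual scripts live in `Syncer/ML`.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 250  # Cyton sample rate in Hz

def band_power(window, fs, lo, hi):
    """Mean power in [lo, hi] Hz per channel, via Welch's method."""
    freqs, psd = welch(window, fs=fs, nperseg=min(len(window), fs), axis=0)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean(axis=0)

def features(window, fs=FS):
    """Alpha (8-12 Hz) and beta (13-30 Hz) band power for each channel."""
    return np.concatenate([band_power(window, fs, 8, 12),
                           band_power(window, fs, 13, 30)])

# Synthetic stand-in data: 40 one-second, 8-channel windows per class.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
relaxed = [np.sin(2 * np.pi * 10 * t)[:, None] * rng.uniform(2, 3, 8)
           + rng.normal(0, 0.1, (FS, 8)) for _ in range(40)]  # alpha-heavy
focused = [np.sin(2 * np.pi * 20 * t)[:, None] * rng.uniform(2, 3, 8)
           + rng.normal(0, 0.1, (FS, 8)) for _ in range(40)]  # beta-heavy

X = np.array([features(w) for w in relaxed + focused])
y = ["Relaxation"] * 40 + ["Attention"] * 40

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
# joblib.dump(clf, "random_forest_model_Attention_AlphaBeta.joblib")  # persist
print(clf.predict([features(relaxed[0])])[0])  # -> Relaxation
```

Band power works as a feature here because relaxation shows up as elevated alpha (8–12 Hz) and focused attention as elevated beta (13–30 Hz), so the two classes separate cleanly in this 16-dimensional feature space.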
### Real-Time Inference

- The `Syncer/ML/main_Attention.py` script loads the trained model and performs real-time inference on incoming EEG data from the Cyton board via LSL.
- Inference runs on 1-second windows, with results updated every second.
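The live loop pulls individual samples from an LSL stream and classifies them one second at a time. This sketch shows only the windowing logic; the sample source and classifier below are stand-ins for the LSL inlet and trained model used by the real script.

```python
from collections import deque

import numpy as np

FS = 250  # samples per second per channel

def infer_stream(sample_source, classify, fs=FS):
    """Accumulate samples into 1-second windows, yielding one label per second.

    `sample_source` yields single (n_channels,) samples; `classify` maps a
    (fs, n_channels) window to a label -- stand-ins for the pylsl inlet and
    the trained Random Forest used by main_Attention.py.
    """
    buf = deque(maxlen=fs)
    for sample in sample_source:
        buf.append(sample)
        if len(buf) == fs:
            yield classify(np.array(buf))
            buf.clear()  # non-overlapping windows, one result per second

# Demo with a dummy source and classifier: 3 seconds of 8-channel noise.
rng = np.random.default_rng(1)
source = (rng.normal(size=8) for _ in range(3 * FS))
classify = lambda w: "Attention" if w.mean() > 0 else "Relaxation"  # dummy
labels = list(infer_stream(source, classify))
print(len(labels))  # -> 3
```

Non-overlapping 1-second windows match the update cadence described above; a sliding window with overlap would give smoother output at the cost of extra inference calls.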
## Running OpenGalea

Once the hardware is assembled and the software is installed, you can run OpenGalea in single-player or multiplayer mode.
### Single-Player Mode

- Hardware Setup: Assemble the OpenGalea headset and connect it to your Quest 3. Connect the OpenBCI Cyton board to your laptop.
- Software Setup:
  - Launch the OpenBCI GUI and start streaming EEG data.
  - Run the `main_Attention.py` script from the `Syncer/ML` directory to start the machine learning model and begin real-time inference.
  - Open the OpenGalea Unity project and build/deploy it to your Quest 3.
- Start the Experience: Launch the OpenGalea app on your Quest 3.
### Multiplayer Mode

In our example app, multiple users collaborate using their BCI alpha channels: each user's alpha values are visualized as pillars, and users help a running character jump over obstacles by entering a rest state (high alpha) when the character needs to jump.
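The jump mechanic can be reduced to a threshold rule: compare each second's alpha power against a per-user resting baseline. The 1.5× factor and the calibration scheme below are illustrative assumptions, not the tuned values from the demo.

```python
def should_jump(alpha_power, baseline, factor=1.5):
    """True when current alpha power exceeds the user's resting baseline
    by `factor` -- i.e., the user has entered a rest (high-alpha) state.
    The 1.5x factor is an illustrative assumption, not the tuned value."""
    return alpha_power > factor * baseline

# Calibrate a baseline from a few seconds of resting readings, then poll
# once per second with the latest alpha band power from the classifier.
baseline = sum([0.9, 1.1, 1.0]) / 3   # mean resting alpha power
print(should_jump(1.8, baseline))     # -> True  (high alpha: jump)
print(should_jump(1.2, baseline))     # -> False (below threshold)
```

Tying the trigger to a per-user baseline matters because absolute EEG band power varies widely between people, sessions, and electrode placements.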
- Hardware Setup: Each participant needs a fully assembled OpenGalea headset, a Quest 3, and a laptop.
- Software Setup:
  - Ensure all devices are on the same Wi-Fi network.
  - Launch the OpenBCI GUI on each laptop and start streaming EEG data.
  - Run the `main_Attention.py` script on each laptop.
  - Open the OpenGalea Unity project.
  - Configure the `NetworkManager` in Unity to designate one device as the host and the others as clients.
  - Build/deploy the app to all Quest 3 devices.
- Start the Experience: Launch the OpenGalea app on all Quest 3 devices. The host should initiate the shared experience.
## Colocation Implementation

Colocation is what makes OpenGalea feel shared: everyone inhabits the same physical room and the same virtual layer, so cooperation is natural and spatial. We use Meta’s Colocation and Shared Spatial Anchors APIs to make that possible.
- Colocation Discovery: Devices discover each other over Bluetooth within ~30 feet. The Colocation API lets each headset advertise its session and find nearby peers so users can join without manual IPs or codes.
- Shared Spatial Anchors: Once connected, shared anchors lock virtual objects to the same spots in the room for every user—so a virtual pillar or character stays where you expect it, for everyone.
- Synchronization: We keep device state and object updates in sync so that all users see and interact with the same virtual world at the same time, with minimal lag or drift.
## Troubleshooting

See TROUBLESHOOTING.md for common issues and fixes (hardware, OpenBCI, Python/ML, Unity/Quest, and multiplayer). If you run into something not covered there, open an issue or reach out via Contact.
## Contributing

OpenGalea is meant to grow with the community—whether you’re into hardware, ML, Unity, or UX, there’s room to help. We welcome contributions of all kinds. Please see our Contribution Guidelines for how to get involved.
## License

This project is licensed under the MIT License.
## Acknowledgements

- MIT Reality Hack
- OpenBCI
- Meta
- Vankley Yi — hardware engineering support
## Contact

For questions or inquiries, please contact:
- [email protected] (Soo Bin)
- [email protected] (Tsing)
- [email protected] (Yechan)
- [email protected] (Hussain)
- [email protected] (Alif)
## Inspirations

We drew on sci-fi and pop culture for the feel of brain-linked collaboration:
- Pacific Rim — Pilots neurally linked to share control and intention.
- Cerebro (X-Men) — A device that reads and connects minds across space.
- Neon Genesis Evangelion — Synchronization between pilot and machine, and between pilots, as the key to cooperation.
## Repositories & Links

- Software (Unity, ML, BCI): Syncer
- Hardware (Design Files): OpenGalea
- Devpost: OpenGalea project page
- Demo video: YouTube
- Sizzle reel: YouTube
