OpenGalea: Open-Source Neuroadaptive Mixed Reality with the Meta Quest 3
Team Syncer
Team Members: Alif Jakir, Tsing Liu, Soo Bin Park, Yechan Ian Seo, Syed Hussain Ather
Special thanks to Vankley Yi for hardware engineering support
Inspiration
Human interaction with digital information has evolved significantly, driven by technological advances like graphical user interfaces, immersive virtual reality, and mixed-reality headsets. Despite these strides, open-source access to neural interfaces remains limited by the complexity and cost of the technology involved. We created OpenGalea to democratize neuroscience and enable affordable, brain-controlled experiences: it bridges the gap between neuroscience and mixed reality, empowering users with immersive, brainwave-driven interactions. We drew further inspiration from science-fiction concepts like the neural link in Pacific Rim, Cerebro from X-Men, and the synchronization in Neon Genesis Evangelion, envisioning a future where users seamlessly interact with virtual worlds using their thoughts.
What it does
OpenGalea combines EEG technology with mixed reality to enable real-time, brain-controlled interactions. Users wear an OpenBCI-based Ultracortex headset, which captures their brainwave activity. This data is processed through custom machine learning models to classify mental states (e.g., Attention or Relaxation). These classifications are then visualized in real-time in a shared mixed-reality environment, supporting both individual and cooperative interactions. For example:
- Brain-Controlled Interactions: Users can control virtual objects through their brainwaves.
- Co-located Collaboration: Co-located users can engage in synchronized interactions, enhancing collaboration.
- Multiplayer MR Experience: The Unity app supports a multiplayer MR experience in which multiple users collaborate using their alpha-band BCI signals to accomplish shared tasks. In our example app, users work together to help a running character jump over obstacles. Each user's alpha values are visualized as pillars; when the character needs to jump, users try to enter a relaxed state, and the resulting high alpha values make the character jump.
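The cooperative jump mechanic can be sketched as a simple threshold check over all co-located users' alpha values. This is an illustrative Python sketch, not the project's actual Unity code; the function name and the 0.7 threshold are assumptions.

```python
# Hypothetical sketch of the cooperative jump trigger: each user's
# normalized alpha value (0..1) drives a pillar, and the shared
# character jumps only when every user relaxes enough. The 0.7
# threshold is illustrative, not the project's tuned value.

def should_jump(alpha_values, threshold=0.7):
    """Return True when every user's alpha value clears the threshold."""
    return len(alpha_values) > 0 and all(a >= threshold for a in alpha_values)

# Example: two relaxed users trigger the jump; one distracted user blocks it.
print(should_jump([0.82, 0.75]))   # both relaxed
print(should_jump([0.82, 0.41]))   # one user not relaxed
```

Requiring all users (rather than an average) to relax makes the mechanic genuinely cooperative: one distracted player blocks the jump.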
How we built it
The development of OpenGalea involved a multi-faceted approach, encompassing hardware design, software development, machine learning, and integration with the Meta Quest 3 platform.
1. Hardware:
- OpenBCI Cyton Board: Utilized an OpenBCI Cyton board with an 8-channel EEG setup.
- Modified Ultracortex IV: Modified the Ultracortex IV design for affordability and ease of assembly while integrating it with the Quest 3 headset. The headset incorporates adjustable Velcro straps and strategically placed weights for optimal comfort and balance.
2. Software:
- Unity Development: We used Unity 2022.3 and the Meta XR SDK to create the mixed-reality experiences. We implemented Colocation and Shared Spatial Anchors for co-located interactions, and developed custom C# scripts to handle EEG data processing, object manipulation, and communication.
- OpenBCI and Meta Quest Integration: Each user wears OpenBCI electrodes and a Meta Quest headset, and each has their own laptop. The OpenBCI GUI runs on each user's laptop, networking with their Meta Quest device via UDP communication.
- Alpha Value Processing: We built two data paths in OpenBCI: one reads each user's alpha values (normalized 0 to 1) directly using OpenBCI's prefabs, and another reads predictions from our trained ML server.
- Meta SDK Features: We utilized co-location and multiplayer features from the Meta SDK, allowing different users to share one space and collaborate.
- OpenBCI GUI and UDP Streaming: We used the OpenBCI GUI to acquire raw EEG data and stream it to Unity via UDP.
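The receive side of this UDP stream (implemented in C# on the Quest) can be illustrated with a minimal Python equivalent. The port number and the `{"type": ..., "data": ...}` payload shape are assumptions here; the OpenBCI GUI's actual JSON layout depends on which networking widget is streamed.

```python
# Minimal sketch of a UDP listener for the OpenBCI GUI's networking
# output, shown in Python for illustration (the headset side is C#).
# Port 12345 and the payload keys are assumptions, not the project's
# actual configuration.
import json
import socket

def parse_packet(raw: bytes) -> dict:
    """Decode one UDP datagram from the OpenBCI GUI into a dict."""
    return json.loads(raw.decode("utf-8"))

def listen(host="127.0.0.1", port=12345):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        raw, _addr = sock.recvfrom(65535)
        packet = parse_packet(raw)
        print(packet.get("type"), packet.get("data"))

if __name__ == "__main__":
    listen()
```

UDP's fire-and-forget semantics suit this use case: a dropped band-power sample is simply superseded by the next one, so no retransmission logic is needed.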
Machine Learning:
- Data Collection and Preprocessing: We developed a Python script (model_dev_Attention.py) to collect and process raw 8-channel EEG data using LSL communication with the Cyton board. The 8 channels correspond to the following brain regions:
  - Fp1: Left frontal lobe
  - Fp2: Right frontal lobe
  - C3: Left central region
  - C4: Right central region
  - T5: Left temporal lobe
  - T6: Right temporal lobe
  - O1: Left occipital lobe
  - O2: Right occipital lobe
- We collected data for two states, "Attention" and "Relaxation," amassing a dataset of 112,900 data points (498 seconds total).
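The collection step can be sketched as follows. This is a hedged approximation of what model_dev_Attention.py does, not its actual code; the stream type `"EEG"` and the Cyton's nominal 250 Hz sampling rate are assumptions.

```python
# Hedged sketch of labeled EEG data collection over LSL: pull 8-channel
# samples from the Cyton's stream and group them into one-second windows
# tagged with the mental state being recorded ("Attention"/"Relaxation").
import numpy as np

FS = 250            # Cyton's nominal sampling rate (Hz), assumed here
N_CHANNELS = 8

def collect_window(inlet, n_samples=FS):
    """Pull one window of 8-channel EEG samples from an LSL inlet."""
    samples = []
    while len(samples) < n_samples:
        sample, _timestamp = inlet.pull_sample()
        samples.append(sample[:N_CHANNELS])
    return np.asarray(samples)              # shape: (n_samples, 8)

def record_state(label, seconds):
    """Record `seconds` one-second labeled windows for one mental state."""
    from pylsl import StreamInlet, resolve_byprop  # pip install pylsl
    streams = resolve_byprop("type", "EEG", timeout=10)
    inlet = StreamInlet(streams[0])
    return [(collect_window(inlet), label) for _ in range(seconds)]
```

Recording in alternating labeled sessions (e.g. a minute of "Relaxation", then a minute of "Attention") yields the window/label pairs the classifier trains on.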
- Model Development: We designed a custom machine learning model based on a modified Random Forest algorithm. The model incorporates raw EEG data, alpha wave band power, and beta wave band power as features.
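The feature extraction and training described above can be sketched like this. For brevity only the band-power features are shown (the actual model also uses raw EEG features), scikit-learn's stock RandomForestClassifier stands in for the team's modified variant, and the band edges (alpha 8-13 Hz, beta 13-30 Hz) are conventional assumptions.

```python
# Sketch of the band-power feature pipeline feeding a Random Forest.
# Assumptions: 250 Hz sampling, alpha = 8-13 Hz, beta = 13-30 Hz, and
# an unmodified scikit-learn classifier in place of the custom model.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

FS = 250  # assumed Cyton sampling rate, Hz

def band_power(window, fs=FS, band=(8.0, 13.0)):
    """Mean power spectral density in `band` for each of the 8 channels."""
    freqs, psd = welch(window, fs=fs, nperseg=min(len(window), fs), axis=0)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean(axis=0)           # shape: (8,)

def features(window):
    """Per-channel alpha and beta band power, concatenated."""
    alpha = band_power(window, band=(8.0, 13.0))
    beta = band_power(window, band=(13.0, 30.0))
    return np.concatenate([alpha, beta])    # shape: (16,)

def train(windows, labels):
    """Fit a 2-class (Attention vs. Relaxation) Random Forest."""
    X = np.stack([features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels)
    return clf
```

Band power is a natural feature choice here: elevated alpha power is a well-known correlate of relaxed, eyes-closed rest, while beta activity is associated with active attention.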
- Training and Saving: We trained a 2-class classification model (Attention vs. Relaxation) and saved the trained model parameters to random_forest_model_Attention_AlphaBeta.joblib.
- Real-time Inferencing: The saved model is loaded in main_Attention.py, which receives real-time 8-channel EEG data from the Cyton board via LSL. The script performs inference on 1-second windows, updating the result every second.
- UDP Output: The model's output (the predicted state, in JSON format) is sent to Unity via a UDP socket, enabling real-time interaction within the virtual environment.
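The inference-and-send loop can be sketched as below. This is a hedged approximation of main_Attention.py, not its actual code; the port, the JSON key `"state"`, and the helper names are assumptions.

```python
# Hedged sketch of the real-time inference loop: load the saved model,
# classify each one-second feature vector, and push the predicted state
# to Unity as a JSON datagram. Address and payload keys are assumptions.
import json
import socket
import joblib

UNITY_ADDR = ("127.0.0.1", 5005)  # assumed address of Unity's UDP listener

def predict_state(model, feature_vector):
    """Classify one 1-second feature vector as 'Attention' or 'Relaxation'."""
    return str(model.predict([feature_vector])[0])

def send_state(sock, state, addr=UNITY_ADDR):
    """Send the predicted state to Unity as a small JSON datagram."""
    sock.sendto(json.dumps({"state": state}).encode("utf-8"), addr)

def run(next_features, model_path="random_forest_model_Attention_AlphaBeta.joblib"):
    """next_features: callable returning one second's worth of features."""
    model = joblib.load(model_path)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:                         # one prediction per second
        send_state(sock, predict_state(model, next_features()))
```

Passing the feature source in as a callable keeps the loop testable without EEG hardware attached.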
3. Real-Time System:
- LSL Communication: Integrated LSL communication to stream EEG data in real-time.
- Real-time Inferencing Pipeline: Built a real-time inferencing pipeline that updates every second.
- UDP Transmission: Transmitted classified states to Unity via UDP sockets for visualization.
4. Mixed Reality:
- Co-located Environment: Created a co-located mixed-reality environment using Unity.
- Shared Object Interactions: Implemented shared object interactions based on synchronized brainwave activity.
Challenges we ran into
- Hardware Prototyping: Modifying the Ultracortex headset design to reduce cost while maintaining functionality and integrating it with the Quest 3 required significant effort and iterations.
- Data Synchronization: Ensuring low-latency, synchronized data streaming between EEG devices, Unity applications, and multiple headsets was technically challenging.
- Model Optimization: Fine-tuning the machine learning model to handle noisy real-world EEG data required extensive experimentation.
- Collaboration in Mixed Reality: Designing intuitive, shared interactions in a co-located mixed-reality setting demanded creative problem-solving.
- Seamless Communication: Establishing reliable communication between OpenBCI and Meta Quest was a complex task, requiring careful synchronization and data handling.
Accomplishments that we're proud of
- Successfully integrated an affordable, open-source EEG device with a high-performance mixed-reality headset: Creating a functional, open-source neuroadaptive MR system.
- Developed a real-time brainwave-driven interaction system capable of multi-user collaboration: Enabling brain-controlled interactions in shared virtual environments.
- Demonstrated the potential of co-located mixed reality for enhancing cooperation through neuroadaptive technology: Showcasing a novel approach to collaborative MR experiences.
- Reduced the cost of similar systems by more than 15x: Making advanced neurotechnology more accessible (OpenGalea costs ~$1900, compared to Galea's ~$30,000 price tag).
- Seamless communication between OpenBCI and Meta Quest: Successfully integrating two distinct hardware and software platforms.
- Implementing a custom machine learning model: Our trained ML model enhances the accuracy and adaptability of brainwave interpretation.
- Building a strong foundation for future development: OpenGalea's open-source nature allows for community contributions and further innovation.
- Winning an award at MIT Reality Hack: Our project received recognition at MIT Reality Hack, validating our efforts.
What we learned
- Hardware-Software Balance: The importance of balancing hardware affordability with technical performance.
- Biosignal Data Management: Techniques for managing and processing noisy biosignal data in real-time.
- Immersive Brain-Controlled Experiences: The potential of mixed reality to create deeply immersive, brain-controlled experiences.
- Cooperative MR Design: How to design cooperative experiences leveraging co-located mixed-reality environments.
- EEG Signal Processing: The intricacies of EEG signal processing and the challenges of obtaining high-quality data in a mobile setting.
- Meta Quest 3 Development: The complexities of developing for the Meta Quest 3 platform, including the implementation of Colocation and Shared Spatial Anchors.
- Iterative Design: The importance of iterative design and prototyping in hardware development.
- Open-Source Collaboration: The power of open-source collaboration and the potential for community-driven innovation.
- Machine Learning for Real-time Applications: The challenges and rewards of training and deploying machine learning models for real-time applications.
What's next for OpenGalea
- Expanded Applications:
- Explore use cases in gaming, therapy, and non-verbal communication.
- Develop brain-controlled tools for high-stress environments.
- Scalability:
- Extend compatibility to additional EEG hardware and mixed-reality platforms.
- Optimize the system for higher-resolution data streams.
- Open Source:
- Release hardware designs, software, and documentation to foster community-driven improvements.
- Encourage collaboration for new features and use cases.
- Enhanced User Experience:
- Improve the onboarding process for both single and multi-user modes.
- Refine the visual and tactile elements of the mixed-reality environment.
- Refine the machine learning model: Improve the accuracy and robustness of the model by expanding the training dataset and exploring more advanced architectures.
- Conduct user studies: Gather feedback from users to evaluate the system's effectiveness and identify areas for improvement.
Built With
- Unity
- Meta XR SDK
- OpenBCI Cyton
- C#
- UDP
- Python (for machine learning)
- Blender (for 3D modeling)
- LSL (Lab Streaming Layer)
Links
- GitHub Unity Repository: https://github.com/sbpark422/Syncer
- GitHub Hardware + Design Repository: https://github.com/Caerii/OpenGalea
- Demo Video: https://youtu.be/qLijOJMBI6s
- Sizzle Reel: https://youtu.be/i-78e9bFQKs
- Bill of Materials: [Link to your BOM]
