Lab Streaming Layer (LSL) integration for Muse 2014 (MU-01) EEG headbands from Interaxon Inc.
This project is not compatible with the Muse 2016 or any later Muse models, since they use completely different streaming protocols and operate over BLE (Bluetooth Low Energy), whereas the MU-01 uses Bluetooth RFCOMM, a (legacy) Bluetooth serial communication mode.
This project requires a first-gen Muse headband, model number MU-01, also known as the Muse 2014. Not to be confused with the MU-02, also known as the Muse 2016. The key differences are that the 2016 has a white power icon on its power button (the 2014's power button is blank), and the 2014 has two micro USB ports, one on each side (the 2016 only has one micro USB port).
I have only tested this script on Linux so far, but it may work on Windows/macOS, as long as you have a way of communicating with Bluetooth RFCOMM devices. On Arch Linux, for instance, I needed to install the bluez-deprecated-tools package, which provides tools like hcitool and rfcomm. This page was quite helpful for setting things up: https://wiki.archlinux.org/title/Bluetooth.
Once you have a means of RFCOMM serial communication, you will need to know the MAC address of your Muse headband. There are automated ways of discovering it, but they seem to be platform-specific, so I didn't include them in the interest of wider compatibility. Still, it should be pretty easy to find your headband's MAC address: turn it on and make sure it's in pairing mode. Pairing mode is when the lights sweep back and forth, and the headband should enter it automatically after being turned on. If it's doing something else, you can reset it by holding the power button until it turns on, then keeping it held until the lights blink a few times and turn off. You should then be able to put it into pairing mode.
Anyway, once it's in pairing mode, the device should show up as a regular Bluetooth device, with a name like "Muse-A1234", for instance. The last four characters of the name should match the last four characters of the serial number printed on your headband.
There are many ways to find the MAC address once the device is visible over Bluetooth; on Linux, bluetoothctl is a good command-line tool for this. However you do it, once you have the MAC address, write it down somewhere and you should be able to continue.
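If you'd rather not eyeball the output, the Muse's line can be picked out of `bluetoothctl devices` programmatically. A minimal sketch — the `Device <MAC> <name>` line format matches the BlueZ versions I've seen but treat it as an assumption, and the sample addresses below are made up:

```python
# Hypothetical helper: find the Muse's MAC in `bluetoothctl devices` output.
# The "Device <MAC> <name>" line format is an assumption about BlueZ output.

def find_muse_mac(bluetoothctl_output):
    """Return the MAC of the first device whose name starts with 'Muse-'."""
    for line in bluetoothctl_output.splitlines():
        parts = line.strip().split(maxsplit=2)
        if len(parts) == 3 and parts[0] == "Device" and parts[2].startswith("Muse-"):
            return parts[1]
    return None

sample = "Device AA:BB:CC:DD:EE:FF SomeSpeaker\nDevice 00:55:DA:B0:12:34 Muse-A1234"
print(find_muse_mac(sample))  # 00:55:DA:B0:12:34
```

You could feed it the real output with something like `subprocess.run(["bluetoothctl", "devices"], capture_output=True, text=True).stdout`, assuming bluetoothctl is on your PATH.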
You will need Python installed, ideally version 3.14 or above. You will also need the uv package manager for installing the Python dependencies. To avoid messing up your system packages, you should create a virtual environment first:
`uv venv` will create the virtual environment. Then, every time you want to interact with this project from the terminal, run `source .venv/bin/activate` in the project directory first. This way you can install whatever Python packages you need without affecting the rest of your system.
Once the virtual environment has been created and activated, you can run `uv sync` to download the required Python packages. If all goes well, you should be good to continue.
This project consists of two main parts:
- A streaming script (`main.py`)
- Scripts to view the stream as line graphs (`accel_viewer.py` and `eeg_viewer.py`)
As mentioned above, you will need the MAC address of your Muse headband in order to run the streaming script. As long as the headband is on and ready to connect (you may need to pair it as a Bluetooth device as well), you can run the script like this:
```
python main.py --address "00:11:22:33:44:55"
```

Replacing "00:11:22:33:44:55" with your device's MAC address, of course.
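Note that a Bluetooth MAC address is exactly six colon-separated hex octets. If you want to catch copy-paste typos before the script tries to connect, a quick format check might look like this (purely illustrative; the script itself doesn't necessarily validate this way):

```python
import re

# A Bluetooth MAC address: six two-digit hex octets separated by colons.
# This only checks the format, not whether the address belongs to a real device.
MAC_RE = re.compile(r"^[0-9A-Fa-f]{2}(:[0-9A-Fa-f]{2}){5}$")

def looks_like_mac(s):
    return bool(MAC_RE.fullmatch(s))

print(looks_like_mac("00:11:22:33:44:55"))     # True
print(looks_like_mac("00:11:22:33:44:55:66"))  # False (seven octets)
```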
It should then start the LSL stream, which you can pick up in many ways. There are many good LSL-compatible viewers available; here's a list of some popular ones: https://labstreaminglayer.readthedocs.io/info/viewers.html.
You may also run the included visualizers, one for EEG and the other for accelerometer data:
```
python eeg_viewer.py
python accel_viewer.py
```
Just make sure the streamer script is running and has fully started before attempting to run a viewer script. It may take a few seconds for the handshake to take place and the stream to become available.
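If you're writing your own consumer rather than using a viewer, you can poll for the stream instead of racing the handshake. A minimal sketch of that idea — the pylsl call mentioned in the docstring is an assumption about your consumer, not part of this project:

```python
import time

def wait_for_stream(resolve, timeout=10.0, interval=0.5):
    """Poll `resolve` until it returns a non-empty list of streams, or give up.

    `resolve` is any zero-argument callable; with pylsl installed it might be
    something like: lambda: resolve_byprop("type", "EEG", timeout=1.0)
    """
    deadline = time.monotonic() + timeout
    while True:
        streams = resolve()
        if streams:
            return streams
        if time.monotonic() >= deadline:
            return []
        time.sleep(interval)
```

This keeps retrying until the streamer's handshake finishes and the stream appears, rather than failing immediately on startup.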
While significant open-source infrastructure exists for the Muse 2016 (see especially https://github.com/alexandrebarachant/muse-lsl), almost every project I found for working with the 2014 headband requires the use of the official muse-io streaming application from the Muse SDK. That SDK is no longer available to the general public and was never open-source, so while downloads are still available from certain sites archived via the Wayback Machine, it's not ideal. There's also the Mind Monitor app, which has excellent support for the 2014 headband, albeit it is not free. This project does not seek to replace Mind Monitor or any other solution, paid or free; it merely exists as an open-source alternative for keeping these headbands alive and useful.
Funnily enough, as I was writing this, I also found https://github.com/DavidVivancos/MuseDotNet and https://web.archive.org/web/20171223012514/https://musesharp.codeplex.com/, two other open-source partial reimplementations of the Muse 2014's streaming protocol. So credit where credit's due; I obviously wasn't the first to have this idea.
This project was developed by referencing official documentation on Interaxon's website, although the specific sources were only available through the Wayback Machine. Other than what was available through the (formerly publicly accessible) documentation on the Muse's streaming protocol, this is a cleanroom implementation based on trial and error, and on comparing the results against muse-io. It's not feature-complete relative to muse-io, and is missing a lot of its more advanced features, but it does allow for decoding EEG, accelerometer, and battery status packets, which was enough for my particular needs.
I should note that this streaming script uses a scaling algorithm to convert the raw EEG data coming from the headband into microvolts. It is based on an algorithm I found in the official docs, which uses the Muse's self-reported analog front-end gain as part of a conversion factor from raw values to microvolts. I'm not 100% sure the algorithm I came up with is correct, but it does give useful-looking data that's within the general range I got when testing with muse-io. Someone who knows more about this stuff may be able to correct or improve it, but for now it's better than nothing.
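For illustration, a generic ADC-style conversion of the kind described above might look like the sketch below. To be clear, the reference voltage, midpoint offset, bit depth, and gain value here are all assumptions for the sake of the example, not the values the script actually uses:

```python
def raw_to_microvolts(raw, gain, vref=3.3, bits=16):
    """Convert a raw unsigned ADC sample to microvolts.

    Generic illustration of the idea, NOT this project's actual algorithm:
    `vref` (ADC reference voltage), the midpoint offset, and `bits` are
    all assumed values here.
    """
    midpoint = 2 ** (bits - 1)            # treat the unsigned range as bipolar
    volts_per_count = vref / (2 ** bits)  # ADC resolution in volts per count
    return (raw - midpoint) * volts_per_count / gain * 1e6

# gain=2000 is a made-up example value for the analog front-end gain
print(raw_to_microvolts(32768, gain=2000))  # 0.0 (midpoint maps to zero µV)
```

The general shape — subtract the midpoint, scale by the ADC resolution, divide by the front-end gain — is the part that matters; the constants are where the real docs (and the self-reported gain) come in.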
Some things that kind of bug me at the moment, but probably not enough to revisit this project (at least for a while), are as follows.

First of all, the Muse has different "presets" that determine which types of data it sends over. Some presets use compressed 10-bit EEG samples, but I couldn't figure out how to decode them, so I went with a "research" preset that sends 16-bit samples. I'm not sure what the specific advantages/disadvantages are, but it was the only way I could make it work without having to figure out how to parse the 10-bit samples. However, the specific preset I'm using doesn't support the "DRL/REF" packet type, which I believe can be used to determine whether the headband has been correctly positioned across someone's forehead.

I also think it would be nice to support additional EEG channels, since apparently the 2014 headband lets you plug in two additional EEG sensors, one in each micro USB port, if you make up some custom wiring. See https://hackaday.io/project/162169-muse-eeg-headset-making-extra-electrode for more info on that. Interestingly, the 2016 headband only has one AUX channel, whereas the 2014 has two. That's probably the only advantage of the 2014 I can think of, but I could see it coming in handy for a specific project or experiment.

I'm also torn on whether the battery status packets should be streamed over LSL or announced in some other way. For now they're just periodically printed to the console, which has been enough for my needs.
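As a rough sketch of what the uncompressed 16-bit path looks like conceptually — the byte order, channel count, and interleaved layout below are assumptions for illustration, not necessarily what the Muse actually sends:

```python
import struct

def unpack_eeg16(payload, n_channels=4):
    """Split a packet payload into per-channel frames of 16-bit samples.

    Assumptions (illustrative only): big-endian unsigned 16-bit values,
    channel-interleaved, `n_channels` channels per frame.
    """
    count = len(payload) // 2
    values = struct.unpack(f">{count}H", payload[: count * 2])
    # Group the flat interleaved values into frames of n_channels samples
    return [values[i : i + n_channels] for i in range(0, count, n_channels)]

print(unpack_eeg16(bytes([0, 1, 0, 2, 0, 3, 0, 4])))  # [(1, 2, 3, 4)]
```

Fixed-width unpacking like this is straightforward; the compressed 10-bit presets would need bit-level unpacking instead, which is presumably where the difficulty lies.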
Overall, I'm just glad to have scratched this mental itch. Personally, I don't have a ton of uses planned for this headband, and I know it can still work with the official SDK if I needed it for something, but this was an experiment around a decade in the making for me, since being given this headband in 2016 and not really knowing what to do with it. I've documented my journey in a blog post here if you're interested: https://overscore.media/posts/reading-my-mind-rediscovering-the-muse-2014-eeg-headband.
If you have any questions, suggestions for improvement, or if you've encountered a specific issue, please feel free to reach out by creating an issue or pull request. If you've somehow found this useful, I'd also love to hear what you used it for. At any rate, maybe this has inspired you to work on something, or to teach yourself something new. Even if you think all the fun or interesting stuff has already been done, or that old gizmo of yours will never be useful despite your best efforts, or if you have a plan but feel like you're just not making meaningful progress, let this be an encouragement that there very well may still be hope.
Sincerely,
Matthew Piercey