Reactive Brain Waves

How to use RxJS, Angular, and Web Bluetooth, along with an EEG Headset, to Do More With Your Brain

Uri Shaked
10 min read · Oct 12, 2017

A few months ago, I stumbled upon a Bluetooth Smart EEG Headset. I immediately recognized its potential for some super interesting stuff: using Web Bluetooth, I could communicate directly from my brain to web pages!

EEG, or electroencephalography, is essentially just a way to monitor electrical activity in the brain. It usually involves placing a few electrodes on your scalp to collect information about the firing of your neurons, which is then recorded on a graph. Sounds like some pretty good data to play with to me! While EEG is most often used for medical purposes, a few novel use cases have popped up here and there.

One of those novel use cases is Muse, a consumer product that is supposed to teach you how to meditate, to the tune of $250. But it’s also a solid consumer EEG device that comes with Bluetooth. And though it’s supposed to teach you how to calm your mind, my mind would only calm down after I figured out how to consume its data from my web page!

(If your mind will also not stay calm, feel free to skip to the code tutorial below ;-)

The headset comes with an Android or iOS app, and there is even a library so you can build your own app and get the raw data — but this only works for native apps, and the source code is not open (therefore my dream of controlling web pages from my mind seemed, at first, out of reach).

When I went on ng-cruise, I met Alex Castillo, who gave a talk showing how he connected an open-hardware EEG headset called OpenBCI to Angular and visualized the signals. Though impressive in itself, his setup was complicated, requiring Node.js and a WebSocket server to relay the data, which was still far from the vision I had. Later on, though, we had a hack night on the cruise where everyone tried to do cool stuff with the various hardware devices, including the EEG device, so naturally I had to give it a go.

I tried to reverse engineer the Muse Bluetooth protocol, similar to what I had done with the Magic Blue bulb. About an hour into it, I figured somebody might already have done this, so I googled one of the characteristic numbers I had discovered and found this great article, which in turn pointed to this Python library created by Alexandre Barachant. All of a sudden, I had everything I needed: that’s how muse-js was born.

So now I can connect to my Muse headset from the web and receive the EEG data (also battery level, accelerometer/gyro, etc.). Hooray!

So what will I build with it?

The Hardware

Before we dive into code, let’s first get to know the Muse headset a little better. Basically, it is a lightweight, rechargeable headband. It has 4 EEG electrodes: two on the forehead, slightly above the eyes, and two touching the ears. It also has a gyroscope and an accelerometer, so you can calculate the head orientation. I was also really happy to discover that it has another EEG sensor you can connect to your own electrode (through the Micro USB port), which I plan to try soon.

Note that there are two versions of the headset — 2014 and 2016. You definitely want the 2016 one, which uses Bluetooth Low Energy. The 2014 version speaks Classic Bluetooth and therefore can’t be used with Web Bluetooth.

Muse 2016: AF7 and AF8 are the two forehead electrodes; TP9 and TP10 are the ear electrodes

Reactive Streams with RxJS

When I built the library, I had to decide how to expose the incoming EEG data. With Web Bluetooth, an event is fired whenever a new data packet is received, and each data packet contains 12 samples from a single electrode. I could have let the user register a JavaScript function to be invoked whenever new data arrives, but I decided to go with the RxJS library (Reactive Extensions Library for JavaScript), which includes methods for transforming, composing, and querying streams of data.

The advantage of RxJS is that it offers a set of functions that allow you to manipulate and process the raw data bytes received from the Muse headset in order to convert them to something more useful (as we will be doing shortly).
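To get a feel for what this means, here is a minimal sketch (not the actual muse-js internals; the characteristic variable is an assumption) of how raw Web Bluetooth notification events can be wrapped in an RxJS observable:

import { Observable } from 'rxjs/Observable';
import 'rxjs/add/observable/fromEvent';
import 'rxjs/add/operator/map';

// `characteristic` is assumed to be a BluetoothRemoteGATTCharacteristic
// on which startNotifications() has already been called.
const packets = Observable
  .fromEvent<Event>(characteristic, 'characteristicvaluechanged')
  .map(event => (event.target as BluetoothRemoteGATTCharacteristic).value);

Instead of juggling callbacks, you get a stream of raw DataView packets that you can filter, map, and compose — which is exactly what we will do below.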

Visualize

The first thing that comes to mind to do with our new muse-js is to visualize the data. During the hack night, Alex and I started working on angular-muse — an Angular application that visualizes the EEG data as well as the head orientation.

My initial prototype for visualizing the Muse Data

In fact, if you have a Muse device and a browser that supports Web Bluetooth, you can actually open the Demo page and try it yourself!

Visualizing my brain activity with Muse, Angular and Smoothie Charts

The app is a simple way to demonstrate that the data is streaming. To be honest, though, while graphs of raw data can be appealing, you would probably lose interest rather quickly if that were all you could do with the data (though I wonder how that would be reflected in the graphs…).

In the Blink of an Eye

One of the many things EEG measures is the electrical potential (voltage) across different locations on your scalp. The measured signals are a side effect of brain activity, and can be used to detect a general state of mind (such as level of concentration, detection of unexpected stimuli, etc.).

Apart from brain activity, eye movements can also be detected, using a technique called electrooculography (luckily, my girlfriend is an optometrist and was able to educate me on the subject). The Muse device has two electrodes positioned on the forehead (called AF7 and AF8 in the standard 10–20 positioning system), which are close to the eyes, so we can easily monitor eye movement activity.

Our eye: the cornea, at the front, is positively charged; the retina, at the back, is negatively charged

We will use the signal from these electrodes for our “Hello World” EEG program — detecting blinks by monitoring eye activity.

Let’s get started!

Our strategy will be as follows: we will take the stream of incoming EEG samples from the device (muse-js provides it as an RxJS observable, as mentioned above), filter out everything but the electrode we need (AF7, which sits above the left eye), and then look for spikes in the signal, i.e., samples with an absolute value above 500 µV, which indicate a big potential change. Since the electrode is next to the eye, we expect the movement of the eyeballs to generate a significant potential difference.

While this may not be the most accurate method to detect a blink, it worked pretty well for me, and the code is simple and easy to follow (as all good “Hello World” examples should be ;-).

But before we do anything else, let’s first install muse-js into our project…

npm install --save muse-js

…and then import it into our code. In this case, it will be an Angular application — just an empty project created with the Angular CLI — but you can also follow along with React or Vue.js if you prefer, as there will be very little framework-specific code.

Next, we import muse-js into our main app component:

import { MuseClient, channelNames } from 'muse-js';

The MuseClient class interacts with the headset, while channelNames simply provides a human-readable mapping to the EEG channels.
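For example, you can use channelNames to look up an electrode index by its name (the exact channel order shown below is an assumption based on the muse-js source, so treat it as illustrative):

import { channelNames } from 'muse-js';

console.log(channelNames);
// e.g. ['TP9', 'AF7', 'AF8', 'TP10', 'AUX']
console.log(channelNames.indexOf('AF7')); // e.g. 1, the electrode above the left eye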

In our component, we will create a new instance of the MuseClient class:

this.muse = new MuseClient();

Now we get to a semi-tricky part: the logic for connecting to the headset.

Web Bluetooth requires some user interaction before we can initiate a Bluetooth connection, so we need to add a button, and only when the user clicks it will we actually connect to the headset. We will implement the connection logic inside a method called onConnectButtonClick:

async onConnectButtonClick() {
  await this.muse.connect();
  this.muse.start();
  // TODO: subscribe to EEG data
}

The connect() method of the MuseClient class initiates the connection with the headset, and then the start() method commands the headset to start sampling the EEG data and sending it down the wire.
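The button itself is not shown above; a minimal Angular template for wiring it up could look like this (a sketch, since the original markup isn’t included):

<button (click)="onConnectButtonClick()">Connect</button>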

Pairing with the Muse headset using Web Bluetooth

The next thing we need to do is subscribe to the EEG data available on the muse.eegReadings observable (where we put the TODO comment above):

const leftEyeChannel = channelNames.indexOf('AF7');

this.leftBlinks = this.muse.eegReadings
  .filter(r => r.electrode === leftEyeChannel);

The above code takes the EEG readings received from the device and keeps only those from the AF7 electrode, which is located above the left eye. Each packet contains 12 samples, so each item in the observable stream is an object with the following structure:
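Sketched as a TypeScript interface (based on the description below; the actual muse-js typings may include additional fields):

interface EEGReading {
  electrode: number; // numeric electrode index (map it via channelNames)
  timestamp: number; // relative to the start of the recording
  samples: number[]; // 12 floating-point measurements, in µV
}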

electrode will contain the numeric index of the electrode (use the channelNames array to map it to a friendlier name), timestamp contains the time the sample was taken, relative to the start of the recording, and samples is an array of 12 floating-point numbers, each one an EEG measurement in µV (microvolts).

For the next step, we want to get just the maximum value out of each packet (i.e., the measurement with the largest absolute value). We will use the RxJS map operator on the above stream to obtain it:

this.leftBlinks = this.muse.eegReadings
  .filter(r => r.electrode === leftEyeChannel)
  .map(r => Math.max(...r.samples.map(n => Math.abs(n))));

So now that we have a stream of simple numbers, we can filter it and only allow values greater than 500, which are probably the blinks we are looking for:

this.leftBlinks = this.muse.eegReadings
  .filter(r => r.electrode === leftEyeChannel)
  .map(r => Math.max(...r.samples.map(n => Math.abs(n))))
  .filter(max => max > 500);

At this stage, we have a simple RxJS pipeline for blink detection, but we still need to subscribe to it in order to actually start receiving the data. We will start with a simple console.log:

this.leftBlinks.subscribe(value => {
  console.log('Blink!', value);
});

If you run this code, you will probably see a lot of “Blink!” messages until you put the headset on, as there will be a lot of static noise. Once you have the headset on, though, you should start seeing the “Blink!” messages only when you blink or touch the area around your left eye:

Wow, it actually works!

You will probably see several “Blink!” messages whenever you blink. The reason is that blinking your eyes generates several changes in the electrical potential. To avoid having more “Blink!” events than necessary, we need to apply a debouncing filter, similar to how mechanical buttons are debounced in Arduino projects.

So let’s add the final touch: instead of logging to the console, we want to emit the value 1 whenever there is a blink, then wait half a second after the last potential change and emit the value 0. This will filter out the multiple “Blink!” events that we saw:
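The pipeline below is reconstructed from the explanation that follows, as a sketch in the RxJS 5 style used throughout this article:

import { Observable } from 'rxjs/Observable';
import 'rxjs/add/observable/merge';
import 'rxjs/add/observable/of';
import 'rxjs/add/observable/timer';
import 'rxjs/add/operator/filter';
import 'rxjs/add/operator/map';
import 'rxjs/add/operator/switchMap';

this.leftBlinks = this.muse.eegReadings
  .filter(r => r.electrode === leftEyeChannel)
  .map(r => Math.max(...r.samples.map(n => Math.abs(n))))
  .filter(max => max > 500)
  .switchMap(() =>
    Observable.merge(
      Observable.of(1),                   // a blink starts: emit 1 right away
      Observable.timer(500).map(() => 0)  // after 500 ms of quiet, emit 0
    ));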

So what does this switchMap sorcery do? Basically, whenever a new item arrives, switchMap will discard the previous stream and call the given function to produce a new stream. This new stream comprises two items: the value 1, which we emit immediately with Observable.of, and then the value 0, which is emitted after 500 milliseconds, unless, of course, a new item has arrived from the filter line, which will restart the switchMap and discard the pending 0 value.

Now we can use this leftBlinks observable to visualize our blinks! We can bind to it in our Angular templates using the async pipe:

<span [hidden]="leftBlinks|async">👁</span>

The above code will hide the eye symbol whenever you blink. Alternatively, we can toggle a CSS class and color or animate the eye symbol when you blink:

<span [class.blink]="leftBlinks|async">👁</span>

In either case, I recommend blinking only one eye at a time if you can, to make sure you can see whether or not your code is working 😜!

If we’re building a React app, we can simply subscribe to this observable and update our component state whenever a blink occurs:

this.leftBlinks.subscribe(value => {
  this.setState({ blinking: value });
});

So now we’ve made it! The “Hello World” of EEG is done!

You can find the code for the full project in the angular-muse repository mentioned above.

Summary

A few years ago, EEG required expensive, cumbersome equipment available only to hospitals and research facilities. Today, web developers like you and me can easily connect to and analyze EEG data with the same tools we use day-to-day to build websites: our web browser, RxJS, and Angular.

Even if EEG isn’t your cup of tea, you can clearly see how the new push for all kinds of “smart” consumer products has created a bunch of really neat opportunities for developers. We certainly live in exciting times!

P.S. — a big thanks to Ben Lesh for helping with the RxJS code in these examples.
