Using EEG data for a poster design

With our sketch running in Processing, we could easily export the data we were visualizing and use it as input for other applications (visualizations, physical computing, etc.).

As a way of making our work public within the lab, we created some posters using the Processing output in Illustrator. In the process we stumbled upon an interesting Moiré effect that gave the sensor graphs a watercolor-like look.





What if a book could read you?

Neurofiction is a new kind of literary experience, created with the support of New Media Scotland. It combines a consumer EEG headset with branching fiction: together, these enable stories that change themselves in response to the reader’s brain activity.

Visualizing the output

Once we had the library working with Processing, we needed to visualize four variables for each sensor:

  • The position on the scalp
  • The quality of the signal

(functions already present in the EPOC control panel)

  • The strength of the signal
  • The change of strength over time

Working with Giulia Marzin, we visualized each sensor as a polar graph that shows the strength of the signal with lines and indicates its quality with a color.
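At its core, the polar graph is just a mapping from each sample to a point: the angle places the sample around the circle, the signal strength sets the radius, and the quality picks a color. A minimal sketch of that mapping in plain Java (the names `polarX`, `polarY` and `qualityColor`, and the red-to-green color scale, are our own illustration, not the actual sketch):

```java
public class PolarSensor {
    // Map a sample to screen coordinates: angle in radians,
    // strength (0..1) scaled to a maximum radius, centered at (cx, cy).
    public static float polarX(float cx, float angle, float strength, float maxRadius) {
        return cx + (float) Math.cos(angle) * strength * maxRadius;
    }

    public static float polarY(float cy, float angle, float strength, float maxRadius) {
        return cy + (float) Math.sin(angle) * strength * maxRadius;
    }

    // Signal quality (0 = bad .. 1 = good) as a simple red-to-green indicator,
    // packed as 0xRRGGBB with the blue channel left at 0.
    public static int qualityColor(float quality) {
        int r = (int) ((1 - quality) * 255);
        int g = (int) (quality * 255);
        return (r << 16) | (g << 8);
    }
}
```

In the actual Processing sketch these coordinates would feed `line()` or `vertex()` calls, and the packed value a `fill()` or `stroke()` call.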


Meet the neural helmet

After meeting the group’s teachers, we started using Eugenio’s Emotiv EPOC headset and the MindYourOSC software to connect the helmet to Processing.

At the time of writing Emotiv offers two types of helmet: the EPOC (which gives you data about thoughts, feelings and expressions) and the EEG (which gives you all the EPOC data plus the raw sensor signals).

Eugenio illustrating the Emotiv EPOC

I had the opportunity to test the helmet first, and I was quite impressed when I calibrated the device: the software lets you interact with a virtual cube using your brain activity. After a few minutes of training I was able to perform a combination of three actions (pushing, pulling and lifting the cube).

First calibration test

As a quick test (and to show a little magic to our audience) we came up with a Processing sketch that lets the user trash pieces of paper with data coming from the sensors.

In our first experiment only one signal from the headset was used: we noticed that muscular movements were the strongest so we used the eyebrow raise value as a main input.

As you can imagine the results were quite entertaining 🙂

Raise eyebrows to change the wind!

The code to retrieve the signal data from MindYourOSC over the OSC protocol is quite straightforward:


import oscP5.*;
import netP5.*;

OscP5 oscP5;
float wind = 0;

void setup() {
  size(800, 600, OPENGL);
  // start oscP5, listening for incoming messages on port 7400
  // make sure this matches the port in Mind Your OSCs
  oscP5 = new OscP5(this, 7400);
}

void oscEvent(OscMessage theOscMessage) {
  // check if theOscMessage has an address pattern we are looking for
  if (theOscMessage.checkAddrPattern("/EXP/EYEBROW") == true) {
    // parse theOscMessage and extract the value from the OSC message arguments
    wind = theOscMessage.get(0).floatValue();
  }
}
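The sketch’s draw() loop then uses the latest wind value to push the paper pieces around. As a rough illustration of that update step in plain Java (our own simplification with a made-up drag factor, not the original sketch), each piece accumulates the eyebrow-driven wind into its velocity every frame:

```java
public class PaperPiece {
    public float x;   // horizontal position
    public float vx;  // horizontal velocity
    static final float DRAG = 0.95f; // simple air resistance

    public PaperPiece(float x) {
        this.x = x;
        this.vx = 0;
    }

    // Called once per frame with the latest wind value from oscEvent().
    public void update(float wind) {
        vx = vx * DRAG + wind;
        x += vx;
    }
}
```

Because the raised-eyebrow value only spikes briefly, the drag term lets the paper coast and settle between spikes instead of stopping dead.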



Hallucinating with table tennis balls

The Ganzfeld procedure exposes the participant to ‘unstructured’ sensations, usually by placing half ping-pong balls over the eyes, so they can only see diffuse white light, and by playing white noise through headphones.

It is probably best known for its use in parapsychology experiments, but it is also used to induce hallucinations and sensory distortions, which are much more likely to occur in the absence of clearly defined sensory input.