At this link you can find the Prezi of the talk given by Leonardo Romei at XY Lab.
With our sketch running in Processing, we could easily export the data we were visualizing and use it as input for other applications (such as visualizations, physical computing, etc.)
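One simple way to make sketch data reusable by other applications is to dump each frame's sensor readings to a CSV file. This is a minimal self-contained sketch of that idea in plain Java; the sensor names and placeholder values are assumptions for illustration, not the project's actual code.

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

public class SensorExport {
    // Hypothetical subset of the EPOC's electrode names, for illustration.
    static final String[] SENSORS = {"AF3", "F7", "F3", "FC5"};

    public static void main(String[] args) throws IOException {
        try (PrintWriter out = new PrintWriter("sensors.csv")) {
            // Header row: timestamp followed by one column per sensor.
            out.println("timestamp," + String.join(",", SENSORS));
            // In the real sketch these values would come from the headset;
            // here we write one row of placeholder readings.
            out.println("0,4200,4210,4190,4205");
        }
        List<String> lines = Files.readAllLines(Paths.get("sensors.csv"));
        System.out.println(lines.size()); // header + one data row
    }
}
```

A file like this can then be read back by another sketch, a physical-computing controller, or any tool that speaks CSV.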
As a way of making our work public within the lab, we created some posters using the Processing output in Illustrator. In the process we stumbled upon an interesting Moiré effect that gave the sensor graphs a watercolor-like look.
What if a book could read you?
Neurofiction is a new kind of literary experience, created with the support of New Media Scotland. It combines:
- off-the-shelf hardware
- machine learning
- and customised prose.
Together, these enable stories that change themselves in response to the reader’s brain activity.
Once we had the library working with Processing, we needed to visualize four variables for each sensor:
- The position on the scalp
- The quality of the signal
(functions already present in the EPOC control panel)
- The strength of the signal
- The change of strength over time
Working with Giulia Marzin, we visualized each sensor as a polar graph that shows the signal's strength with lines, plus a color indicator for its quality.
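The core of a polar graph like this is mapping each sample to a point around a circle: the sample index sets the angle, the signal strength sets the radius. A minimal sketch of that mapping in plain Java (the scaling and parameter names are assumptions, not the actual visualization code):

```java
public class PolarGraph {
    // Map one signal sample to a point on a polar graph:
    // the sample index sets the angle, the strength sets the radius.
    static double[] toPolarPoint(int i, int samples,
                                 double strength, double maxStrength,
                                 double maxRadius) {
        double angle = 2 * Math.PI * i / samples;
        // Clamp so an out-of-range reading never draws outside the graph.
        double r = maxRadius * Math.min(strength / maxStrength, 1.0);
        return new double[] { r * Math.cos(angle), r * Math.sin(angle) };
    }

    public static void main(String[] args) {
        // Sample 0 of 4 at full strength lands on the positive x axis.
        double[] p = toPolarPoint(0, 4, 100, 100, 50);
        System.out.println(Math.round(p[0]) + "," + Math.round(p[1]));
    }
}
```

In a Processing sketch the returned coordinates would feed `line()` or `vertex()` calls, with the quality value driving the stroke color.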
The next step was to get the raw signals from the headset. We couldn't find a Processing library, so we used a Java library by Samuel Halliday, a port of the open-source Emokit library originally developed in C by Cody Brocious and Kyle Machulis.
From this library we could get the strength and the quality of each electrode’s signal.
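Of the four variables listed above, the library directly supplies strength and quality; the change of strength over time has to be derived from successive readings. A minimal sketch of that derivation, assuming one tracker object per electrode (this is an illustration, not the project's actual code):

```java
public class SignalDelta {
    // Tracks the change of one sensor's strength over time,
    // the fourth variable in the visualization.
    private double previous = Double.NaN;

    double update(double strength) {
        // First reading has no predecessor, so its delta is zero.
        double delta = Double.isNaN(previous) ? 0.0 : strength - previous;
        previous = strength;
        return delta;
    }

    public static void main(String[] args) {
        SignalDelta d = new SignalDelta();
        System.out.println(d.update(10.0)); // first reading
        System.out.println(d.update(12.5)); // change since previous reading
    }
}
```

Feeding each electrode's stream through a tracker like this yields a per-sensor delta that can be drawn alongside the raw strength.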