Processing + EPOC via OSC

Related articles: AffectCircles

How would you like to create interactive art that responds to your thoughts, moods, and facial expressions? Thanks to Mind Your OSCs and oscP5, interpreting the Emotiv EPOC’s data within a Processing sketch (and by extension, Arduino) could not be easier, even with the consumer (i.e. most affordable) version of the EPOC. This effectively allows anyone to develop a great variety of (open-source, if desired) EPOC applications including physical computing, even if they have only the consumer headset [1].

Here is how it works. The EPOC headset and software read your neuroelectrical signals and interpret them as a set of predefined outputs that reflect your facial expressions, mood (excited, bored, meditative, etc.), and conscious intentions (see the EPOC documentation for more info). Mind Your OSCs formats that output as a series of OSC messages and sends them to a network port. Using oscP5, Processing can listen on that port and read the OSC messages, which can then be parsed and their values assigned to variables that drive various interactions.

To begin, you need an Emotiv EPOC and the Mind Your OSCs application, which you can download for free from the Emotiv store. If you want to experiment before committing to purchase the EPOC, you can download Emotiv’s SDK Lite for free, which includes an EPOC emulator and scripting tool called EmoComposer (also useful for testing interactions with Processing sketches without having the headset on). You also need to download and install Processing and the oscP5 library.

Connecting Processing to Mind Your OSCs

In the right-hand side of the Mind Your OSCs window, you can see the IP address and port number for data going out of Mind Your OSCs (connection info for data coming into Mind Your OSCs from the EPOC device or an emulator, etc., is displayed on the left-hand side of the window):

OSC Connection

In Processing’s setup() method, you set up a connection to the same port shown in Mind Your OSCs:

void setup() {
  //start oscP5, listening for incoming messages on port 7400
  //make sure this matches the port in Mind Your OSCs
  oscP5 = new OscP5(this, 7400);
}

Interpreting EPOC Events

Each OSC message sent from Mind Your OSCs has three parts:

  1. an Address Pattern in the form of /COG/PUSH (meaning Cognitiv suite → Push action, e.g.) [2]
  2. a Type Tag string that identifies the data type of the Argument (always ‘f’ for Mind Your OSCs, for floating point)
  3. one or more Arguments; for Mind Your OSCs, always one Argument: a floating-point number between 0 and 1 that represents the value returned by the EPOC function identified by the Address Pattern
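
Those three parts are easiest to see in the raw bytes of a packet. Here is a standalone Java sketch (independent of oscP5, for illustration only; the class and method names are my own) that encodes and decodes a message shaped like the ones Mind Your OSCs sends: an address pattern, the type tag string ",f", and a single big-endian float argument, with each string NUL-padded to a multiple of four bytes per the OSC spec:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class OscDemo {
  // Pad an OSC string with NULs to a multiple of 4 bytes
  // (OSC requires at least one terminating NUL)
  static byte[] oscString(String s) {
    byte[] raw = s.getBytes(StandardCharsets.US_ASCII);
    int padded = (raw.length / 4 + 1) * 4;
    byte[] out = new byte[padded];
    System.arraycopy(raw, 0, out, 0, raw.length);
    return out;
  }

  // Encode an OSC message with one float argument (type tag ",f")
  static byte[] encode(String address, float value) {
    byte[] addr = oscString(address);
    byte[] tags = oscString(",f");
    ByteBuffer buf = ByteBuffer.allocate(addr.length + tags.length + 4);
    buf.put(addr).put(tags).putFloat(value); // ByteBuffer is big-endian by default
    return buf.array();
  }

  // Decode the float argument back out of the packet
  static float decodeFloat(byte[] packet, String address) {
    int offset = oscString(address).length + oscString(",f").length;
    return ByteBuffer.wrap(packet, offset, 4).getFloat();
  }

  public static void main(String[] args) {
    byte[] packet = encode("/COG/PUSH", 0.75f);
    System.out.println(packet.length);                    // 20
    System.out.println(decodeFloat(packet, "/COG/PUSH")); // 0.75
  }
}
```

A "/COG/PUSH" message with one float comes to 20 bytes: 12 for the padded address pattern, 4 for the padded ",f" tag string, and 4 for the float. This is exactly the parsing that oscP5 does for you behind the scenes.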

You can use the oscEvent() method to parse selected messages from Mind Your OSCs. Here is an example that selects the messages corresponding to the Cognitiv suite’s (move) Left and Right actions and assigns their values to the variables cogLeft and cogRight, respectively:

void oscEvent(OscMessage theOscMessage) {
  // check if theOscMessage has an address pattern we are looking for
  if (theOscMessage.checkAddrPattern("/COG/LEFT") == true) {
    // parse theOscMessage and extract the value from the OSC message arguments
    cogLeft = theOscMessage.get(0).floatValue();
  } else if (theOscMessage.checkAddrPattern("/COG/RIGHT") == true) {
    cogRight = theOscMessage.get(0).floatValue();
  }
}

Remember that even though the OSC messages from Mind Your OSCs have only one Argument each, OSC messages in general can have more than one, so their Arguments are treated like an array. The statement cogLeft = theOscMessage.get(0).floatValue(); means “read the value at index 0 (i.e. the first and only value associated with a message sent from Mind Your OSCs) as a floating-point number, and assign it to the variable cogLeft.”

You can use oscP5’s plug() service to automatically forward selected messages to your own methods without having to parse them in oscEvent() (example).
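
For instance, a sketch fragment along these lines (the method name cogPush is my own choice; the rest follows oscP5’s plug(Object, String, String) signature) would route every /COG/PUSH message’s argument straight to a method:

```java
OscP5 oscP5;

void setup() {
  oscP5 = new OscP5(this, 7400);
  // forward the argument of any /COG/PUSH message to cogPush()
  oscP5.plug(this, "cogPush", "/COG/PUSH");
}

// called automatically by oscP5 whenever /COG/PUSH arrives;
// no oscEvent() parsing needed
public void cogPush(float value) {
  println("Push strength: " + value);
}
```

Note this fragment requires the oscP5 library and a running Processing sketch; it is a sketch of the idea, not a complete program.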

Example Sketch

Here is an example Processing sketch that moves a circle left or right when the operator wearing the EPOC thinks “move left” or “move right,” respectively (the Left and Right Cognitiv actions must be trained in the Emotiv software before running the sketch).

ProcessingEpocOsc1 Screen Shot

/*
 * ProcessingEpocOsc1
 * by Joshua Madara
 * demonstrates Processing + Emotiv EPOC via OSC
 * uses EPOC's Cognitiv Left and Right to move a circle
 * left or right
 */
import oscP5.*;
import netP5.*;

public float cogLeft = 0;
public float cogRight = 0;
int circleX = 240;

OscP5 oscP5;

void setup() {
  size(480, 360);
  //start oscP5, listening for incoming messages on port 7400
  //make sure this matches the port in Mind Your OSCs
  oscP5 = new OscP5(this, 7400);
}

void draw() {
  // clear the previous frame so the moving circle does not leave trails
  background(204);
  // draw graph ticks
  for (int i = 1; i <= 11; i++) {
    stroke(map(i, 1, 11, 0, 255));
    float tickX = map(i, 1, 11, 60, 420);
    line(tickX, 250, tickX, 269);
    line(tickX, 310, tickX, 329);
  }
  // draw bar graph
  drawBarGraph(cogLeft, 270);
  drawBarGraph(cogRight, 290);
  // determine whether to move circle left or right
  if ((cogLeft >= 0.5) && (circleX >= 0)) {
    circleX -= 5;
  } else if ((cogRight >= 0.5) && (circleX <= 480)) {
    circleX += 5;
  }
  // draw circle
  fill(color(25, 249, 255));
  ellipse(circleX, 150, 90, 90);
}

void drawBarGraph(float cogVal, int barY) {
  if (cogVal >= 0.5) {
    fill(color(22, 255, 113)); // green at or above the action threshold
  } else {
    fill(color(255, 0, 0)); // red below it
  }
  float len = map(cogVal, 0.0, 1.0, 0, 360);
  rect(61, barY, len, 20);
}

void oscEvent(OscMessage theOscMessage) {
  // check if theOscMessage has an address pattern we are looking for
  if (theOscMessage.checkAddrPattern("/COG/LEFT") == true) {
    // parse theOscMessage and extract the value from the OSC message arguments
    cogLeft = theOscMessage.get(0).floatValue();
  } else if (theOscMessage.checkAddrPattern("/COG/RIGHT") == true) {
    cogRight = theOscMessage.get(0).floatValue();
  }
}
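
The bar graphs and tick positions in the sketch above all rely on Processing’s map() function, which linearly rescales a value from one range into another. Here is the same arithmetic as a standalone Java sketch (the class name is mine), showing how a Cognitiv value between 0 and 1 becomes a bar length between 0 and 360 pixels:

```java
public class MapDemo {
  // Equivalent of Processing's map(value, start1, stop1, start2, stop2):
  // linearly rescale value from [start1, stop1] into [start2, stop2]
  static float map(float value, float start1, float stop1,
                   float start2, float stop2) {
    return start2 + (stop2 - start2) * (value - start1) / (stop1 - start1);
  }

  public static void main(String[] args) {
    // a Cognitiv value of 0.5 (the sketch's action threshold)
    // fills half of the 360-pixel-wide bar
    System.out.println(map(0.5f, 0.0f, 1.0f, 0, 360)); // 180.0
    // a full-strength action fills the whole bar
    System.out.println(map(1.0f, 0.0f, 1.0f, 0, 360)); // 360.0
  }
}
```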

Spectacle of the Mind

Here is video footage from Spectacle of the Mind, a Global Mind Project performance featuring Stelarc, Domenico De Clario, and Jill Orr, all using EPOCs in multimedia performance art (albeit not with Processing or Arduino, though those are capable of similar things).

Notes & References

  1. You can do the same thing using GlovePIE, but it requires a developer edition of the EPOC, or better.

2010.09.15 Update

I have been accepted to teach a six-week course in beginning interactive multimedia ritual design, at Arcanorium College, later this year or early next. Will post details as they develop.

I am working on a series of multimedia sketches titled Sleight-of-Mind Machines that will demonstrate simple human-computer interactions for magic.

The Emotiv EPOC has been hacked (thanks to Larry for the heads-up). At least one or two of the Sleight-of-Mind Machines will feature the EPOC. I also have some plans to interface it with Processing via OSC packets.

Update 2010.02.21

The EPOC and Scratch demos @ Jigsaw Renaissance went pretty well. We may be hosting some Scratch classes in the future, and I am still working on tutorials for Scratch as an introduction to multimedia ritual design. In response to members’ interests, we will be hosting a regular EPOC meetup at Jigsaw. I have been having some problems with my EPOC, including broken electrodes and frequent wireless connection drops. Hoping to get through to Emotiv’s support soon. Also still unable to get Neurovault working.

Boe-Bot (more images) I won a Boe-Bot with Gazbot accessory from Parallax, two weeks ago (contest details). A few days later, Michael Parks gave a Propeller demo @ Jigsaw, and we talked about him giving some Propeller classes there. He turned me onto 12Blocks, which is like Scratch for Propeller. All of these experiences are converging on me learning the Propeller platform in the near future. The more I consider it, the more I dig its multi-cog architecture, and the more excited I become for possibilities of creating robots having multiple personalities sharing and competing for the same resources, as well as multiprocessor robots for rituals.

Omen Antiquitatum (more images) I recently acquired some Elder Sign (Omen Antiquitatum) props from HPLHS, and the old Scary Laboratory set from the LEGO Studios collection, for a common ritual I expect to disclose more about in days to come. An unexpected score from the LEGO set was the Studio effects software which includes a cool interface for manipulating audio, and a bank of horror movie sound effects — will post pics of that, later.

Coming soon: using scents in multimedia rituals.

Question: How would you begin to design an artificial intelligence that could alter a REG’s output with the “power” of intention?

Ideas for Experimenting with EPOC + Meta-Magick

I have been thinking about ways to combine the Emotiv EPOC with Meta-Magick techniques, for experimentation (vis-à-vis ritual per se, where most of my interest in combining them resides).

1. State entity invocation. Have Person A (wearing the EPOC) invoke State Entity J while Person B records the EPOC’s activity, mapping it to one of the Cognitiv abilities. Repeat (banish, re-invoke) a few times to improve the EPOC’s response. Later, have Person A invoke State Entity J again, while Person B observes the EPOC’s response to see if it is consistent with previous responses. Alternatively, the response could be mapped to some feedback signal accessible to Person A (e.g. changing the color of a digital artifact).

2. State entity evocation. Have Person A evoke State Entity K into Space X. Have Person B (wearing the EPOC) enter into Space X while Person C records the EPOC’s activity and maps it to one of the Cognitiv abilities. Repeat to see if a consistent response can be found. Next, have Person A evoke State Entity L into Space X, then Person B enters into Space X while Person C records the EPOC’s activity and maps it to a different Cognitiv ability than for State Entity K, and observes any in/consistencies in the response — does the response mapped to State Entity K trigger? Repeat with Person A randomly alternating the entity evoked into Space X, to see what response is triggered in Person B.

3. Mirror neurons and telepathy. Have Person A relax while Person B (wearing the EPOC) observes Person A, and Person C records the EPOC’s activity. Are there any Affectiv signs that Person B relaxes in correspondence to Person A? This could be done with Person A also wearing an EPOC, to compare data from both devices. Another way would be with the Cognitiv abilities, e.g. Person A and Person B each record responses to Push and Pull, then A randomly alternates between Push and Pull, while B observes A, and Person C observes B’s responses to A — does B “push back” when A “pushes”?

Note that in (2) and (3), Person C could be replaced by an automatic recording mechanism, to eliminate her intention’s effect on the experiment. Of course, these all would need additional details such as controls, in order to qualify as bona fide scientific experiments.