No sooner had I read the piece on Ensomniac Studios’ Conductar: Moogfest app, than I stumbled onto an event posting by Columbia University’s CLOUD Lab for a Research Workshop called “How Does the Brain Respond to the City?”. The goal of the workshop, which was sponsored by the Van Alen Institute, was to gather input for a data visualization depicting the composite “mind-state” of the participants as they walked pre-defined paths through the DUMBO section of Brooklyn. This looked like an incredibly cool event, and though I was initially wait-listed, I received an email informing me that slots had opened up. On it.
If you’re wondering why we’re interested at all in a study of this nature, I’ll fill you in. There are two main reasons: First, as a company specializing in Generative / New Media Art, we are always looking for new data sources and technologies that facilitate user/audience interaction. Much of what creative coders do, in fact, is find interesting ways of taking huge data sets and mapping them to some sort of emitter — be it visual, aural, or tactile — which is then output for users to experience and interact with. So brainwave data — and the means for collecting and using it — is very intriguing.
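To make the "data set mapped to an emitter" idea concrete, here's a minimal sketch: a linear mapping that normalizes an arbitrary sensor reading into whatever parameter range a visual emitter needs (particle size, hue, opacity, and so on). The function name and the attention-to-hue example are illustrative, not from any particular framework or from the workshop itself.

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map value from [in_min, in_max] to [out_min, out_max], clamped."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))  # clamp so data outliers don't break the visual
    return out_min + t * (out_max - out_min)

# e.g., map a hypothetical 0-100 "attention" reading to a hue angle (0-360 degrees)
hue = map_range(75, 0, 100, 0, 360)  # -> 270.0
```

The clamp is the important detail in practice: raw sensor feeds are noisy, and an out-of-range spike shouldn't push the visual parameter outside its valid range.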
Secondly, our Augmented Reality music-mixing app, U-GRUVE AR, was designed to use GPS as the primary input for controlling the audio mix. Adding bio-feedback, as the Conductar team had done, would be incredible.
So that was our business motivation, in a large nutshell.
We were briefed and prepped for the data-gathering exercise by Mark Collins, Co-director of the CLOUD Lab, who gave us a run-down of the project and the technology we’d be using. Using the NeuroSky Mindwave Mobile headset, paired via Bluetooth to an app on an iPod Touch, we’d be recording a standard EEG data set, plus a couple of ‘pre-analyzed’ measurements gauging “attentiveness” and “meditativeness”, as well as eye-blink strength. The app would also record GPS and gyro-orientation values, of course, to determine course and orientation.
Mr. Collins went on to explain that they were hoping to end up with a visualization that, if all went well, would resemble a weather map of sorts, showing shifts in mental states from one area to the next, over the course of the day (I was in the 11:30 group, but the exercise spanned from 10AM to 6PM). The upper right image in the above composite is an example of what the output might look like.
Mr. Collins also emphasized that the DUMBO neighborhood had been intentionally selected for its diverse mix of industrial, commercial, residential, and scenic stimuli (hit Google Images with “dumbo brooklyn”), to allow for clearly discernible patterns.

We were then fitted with the headsets, issued an iPod Touch, and off we went with our guides (thanks to Bobby, pictured above, who protected us from cars, thugs, and strollers) on one of two paths – either a loop or an out-and-back – walking at “museum pace”, i.e. very slowly, to ensure that our minds fully processed the things we chose to focus on. Interestingly, quick peeks at the iPod app showed, more often than not, that my “Meditative” readings were higher than my “Attentive” ones. I certainly didn’t feel meditative, which led me to wonder whether the nomenclature, rather than my state of mind, was really at issue.
About twenty minutes later, we were done, and were told that we could send our collected data via email, right from the app, in the form of an Excel-compatible CSV file. How cool! A whole new dataset to play with!
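A file like that is trivial to start exploring with a few lines of Python. Note the caveat: the column names below ("attention", "meditation", lat/lon) are my assumptions about what such an export might contain, not the actual headers from the CLOUD Lab app.

```python
import csv
import io

# Hypothetical sample of the exported CSV. Real column names may differ;
# "attention" and "meditation" stand in for NeuroSky's pre-analyzed 0-100
# values, alongside GPS coordinates recorded during the walk.
sample = """timestamp,attention,meditation,lat,lon
2014-04-26T11:32:01,40,63,40.7033,-73.9881
2014-04-26T11:32:02,55,70,40.7033,-73.9880
2014-04-26T11:32:03,61,58,40.7034,-73.9880
"""

rows = list(csv.DictReader(io.StringIO(sample)))
avg_attention = sum(int(r["attention"]) for r in rows) / len(rows)
avg_meditation = sum(int(r["meditation"]) for r in rows) / len(rows)
print(f"attention: {avg_attention:.1f}, meditation: {avg_meditation:.1f}")
```

In my own walk, a quick summary like this would have confirmed what the live app display suggested: the “Meditative” average running higher than the “Attentive” one.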
We wrapped up with some post-tour note-swapping and some riffing on the multifarious ways this technology could be used. Learning that the mobile version of the headset is available for around $130 makes the possibility of integration with our apps even more plausible. Wait . . . let me say that again — you can buy a slick, compact little headset that records your brainwaves as usable data for just a little over a hundred bucks. Recalling where things were when I first saw the BioMuse demonstrated at CyberArts 1990, it’s astounding to see how far things have come. Truly.
The CLOUD Lab’s data visualization will be presented on 13 May, 2014. The Van Alen site has more information about the event and how to get tickets.