Holidays and arcades are one of my traditions. Come every holiday, I hole up in the nearest arcade and play games until my fingers go numb, usually from the recoil of the light-gun games. Sadly, in my experience, arcade culture in the UK has diminished significantly; the novelty and variety of yesteryear simply isn't there any more. Most arcades host a mixture of dated racing and light-gun games (I'm looking at you, Time Crisis) which, while fun at the time, have lost their charm. During my recent holiday, much to my surprise, I came across a brand new arcade game which really piqued my interest: Dark Escape 4D by Namco.
And why did this game catch my attention so? Well, because it was a biofeedback game, a biofeedback game at the ARCADE!
Building a rudimentary galvanic skin response sensor
Recently I’ve been developing mechanics for a range of biofeedback projects, one of which was featured, over the summer, in an art exhibit at FACT Liverpool. These projects have been developed with the general public in mind, and so I’ve been working with consumer electronics rather than the research grade devices I normally use.
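To give a flavour of just how rudimentary a GSR sensor can be: a common hobbyist approach is to pass a small voltage across two finger electrodes in series with a fixed resistor and read the midpoint with a microcontroller's ADC. The sketch below is a hypothetical illustration, not the circuit from any of the projects above; the 5 V supply, 100 kΩ resistor, 10-bit ADC range and function name are all my assumptions.

```python
# Hypothetical GSR maths: convert one 10-bit ADC sample into skin
# conductance in microsiemens.
# Assumed circuit: V_IN -> skin electrodes -> measurement node -> R_FIXED -> GND,
# with the ADC reading the voltage at the node (i.e. across R_FIXED).

V_IN = 5.0         # supply voltage in volts (assumed)
R_FIXED = 100_000  # fixed divider resistor in ohms (illustrative value)
ADC_MAX = 1023     # full-scale count for a 10-bit ADC

def skin_conductance_us(adc_reading: int) -> float:
    """Return skin conductance in microsiemens for one ADC sample."""
    v_out = V_IN * adc_reading / ADC_MAX
    if v_out <= 0 or v_out >= V_IN:
        raise ValueError("reading outside the divider's valid range")
    # Voltage divider: v_out = V_IN * R_FIXED / (r_skin + R_FIXED)
    r_skin = R_FIXED * (V_IN / v_out - 1.0)  # skin resistance in ohms
    return 1e6 / r_skin                      # conductance, microsiemens
```

With these illustrative values, an ADC reading of 512 works out at roughly 10 µS, which is in the right ballpark for tonic skin conductance at rest; sweatier (more aroused) skin means lower resistance and a higher reading.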
Imagine you are waiting to be interviewed for a job that you really want. You’d probably be nervous: fingers drumming the table, eyes restlessly scanning the room. The door opens and a man appears; he is wearing a lab coat and holding an EEG headset in both hands. He places the set on your head and says, “Your interview starts now.”
This Philip K. Dick scenario became reality for intern applicants at the offices of TBWA, an advertising firm based in Istanbul. And thankfully a camera was present to capture this WTF moment for each candidate, so the footage could be uploaded to Vimeo.
The rationale for the exercise is quite clear. The company want to appoint people who are passionate about advertising, so, working with a consultancy, they devised a test where candidates watch a series of acclaimed ads while the Epoc is used to measure their levels of ‘passion’, ‘love’ and ‘excitement’ in a scientific and numeric way. Those who exhibit the greatest passion for adverts get the job (this is the narrative of the movie; in reality one suspects/hopes they were interviewed as well).
I’ve seen at least one other blog post that expressed some reservations about the process.
Let’s take a deep breath because I have a whole shopping list of issues with this exercise.
There have been a lot of tweets and blog posts devoted to an article written recently by Don Norman for the MIT Technology Review on wearable computing. The original article is here, but in summary, Norman points to an underlying paradox surrounding Google Glass and similar devices. On the one hand, these technological artifacts are designed to enhance human abilities (allowing us to email on the move, navigate, etc.); on the other, because of inherent limitations on the human information processing system, they have significant potential to degrade aspects of human performance. Think about browsing Amazon on your glasses whilst crossing a busy street and you get the idea.
The paragraph in Norman’s article that caught my attention and is most relevant to this blog is this one.
“Eventually we will be able to eavesdrop on both our own internal states and those of others. Tiny sensors and clever software will infer their emotional and mental states and our own. Worse, the inferences will often be wrong: a person’s pulse rate just went up, or their skin conductance just changed; there are many factors that could cause such things to happen, but technologists are apt to focus upon a simple, single interpretation.”
I’ve written a couple of posts about the Emotiv EPOC over the years of writing this blog, covering user interface issues in this post and the uncertainties surrounding the device for customers and researchers here.
The good news is that research is starting to emerge in which the EPOC has been systematically compared to other devices, and perhaps some of those uncertainties can be resolved. The first study, from Ekandem et al., was published in the journal Ergonomics in 2012. You can read the abstract here (apologies to those without a university account who can’t get past the paywall). The authors performed an ergonomic evaluation of both the EPOC and the NeuroSky MindWave. Data were obtained from 11 participants, each of whom wore either a MindWave or an EPOC for 15 minutes on different days. They concluded that there was no clear ‘winner’ from the comparison. The EPOC has 14 sensor sites compared to the single site used by the MindWave, hence it took longer to set up and required more cleaning afterwards (and more consumables). No big surprises there. It follows that signal acquisition was easier with the MindWave, but the authors report that once the EPOC was connected and calibrated, its signal quality was more consistent than the MindWave’s, despite sensor placement for the former being obstructed by hair.
If there are two truisms in the area of physiological computing, they are: (1) people will always produce physiological data and (2) these data are continuously available. The passive nature of physiological monitoring and the relatively high fidelity of data that can be obtained is one reason why we’re seeing physiology and psychophysiology as candidates for Big Data collection and analysis (see my last post on the same theme). It is easy to see the appeal of physiological data in this context, to borrow a quote from Jaron Lanier’s new book “information is people in disguise” and we all have the possibility of gaining insight from the data we generate as we move through the world.
If I collect physiological data about myself, as Kiel did during the bodyblogger project, it is clear that I own that data. After all, the original ECG was generated by me and I went to the trouble of populating a database for personal use, so I don’t just own the data, I own a particular representation of the data. But if I granted a large company or government access to my data stream, who would own the data?
Following on from yesterday’s post, I quickly checked up on Innergy, Ubisoft’s entry into the biofeedback market. Announced in 2010, a year after the Vitality, the game seems to have quietly disappeared from Ubisoft’s website. The most recent news report, in 2012, indicated that work was still ongoing on the project, but no release schedule had been announced. Given the lack of PR noise this late in the year and the missing listings on Ubisoft’s webpage, I don’t expect we’ll see a 2013 release, or any release at all. This would be rather disappointing: although the revealed gameplay was very traditional for a biofeedback regime, the production values were first rate, e.g. artwork by Rolito (see Patapon on the Sony PSP), which is sorely lacking in many biofeedback programs.
It looks like Nintendo have put the Vitality sensor on indefinite hold. In answer to a question at a recent shareholder meeting, Nintendo explained that while player physiology opened interesting avenues for play, the mechanics they tried didn’t work for everybody; they failed for around 10% of the players tested. As I posted back in 2011, when Nintendo first raised this issue, the bar Nintendo had set for the percentage of players who could successfully control their physiology, 99%, was simply too high.
I attended a short conference event organised by the CEEDs project earlier this month entitled “Making Sense of Big Data.” CEEDS is an EU-funded project under the Future and Emerging Technology (FET) Initiative. The project is concerned with the development of novel technologies to support human experience. The event took place at the Google Campus in London and included a range of speakers talking about the use of data to capture human experience and behaviour. You can find a link about the event here that contains full details and films of all the talks including a panel discussion. My own talk was a general introduction to physiological computing and a statement of our latest project work.
It was a thought-provoking day because it was an opportunity to view the area of physiological computing from a different perspective. The main theme was that we are entering the age of ‘big data’, in the sense that passive monitoring of people using mobile technology grants access to a wide array of data concerning human behaviour. This is, of course, hugely relevant to physiological monitoring systems, which tend towards high-resolution data capture and may represent the richest vein of big data for indexing the human experience.