I’ve written a couple of posts about the Emotiv EPOC over the years of running this blog: one on user interface issues (this post) and another on the uncertainties the device poses for customers and researchers (here).
The good news is that research is starting to emerge in which the EPOC has been systematically compared to other devices, so perhaps some of those uncertainties can be resolved. The first study comes from Ekandem et al., published in the journal Ergonomics in 2012. You can read the abstract here (apologies to those without a university account who can’t get past the paywall). The authors performed an ergonomic evaluation of both the EPOC and the NeuroSky MindWave. Data were obtained from 11 participants, each of whom wore either a MindWave or an EPOC for 15 minutes on different days. They concluded that there was no clear ‘winner’ from the comparison. The EPOC has 14 electrode sites compared to the single site used by the MindWave, hence it took longer to set up and required more cleaning afterwards (and more consumables). No big surprises there. It follows that signal acquisition was easier with the MindWave, but the authors report that once the EPOC was connected and calibrated, its signal quality was more consistent than the MindWave’s, despite sensor placement for the former being obstructed by hair.
First of all, apologies for our blog “sabbatical” – the important thing is that we are now back with news of our latest research collaboration involving FACT (Foundation for Art and Creative Technology) and international artists’ collective Manifest.AR.
To quickly recap, our colleagues at FACT were keen to create a new commission tapping into the use of augmented reality technology and incorporating elements of our own work on physiological computing. Our last post (almost a year ago now, to our shame) described the time we spent with Manifest.AR last summer and our show-and-tell event at FACT. Fast-forward to the present, and the Manifest.AR piece, called Invisible ARtaffects, opened last Thursday as part of the Turning FACT Inside Out show.
With regard to the development of physiological computing systems, whether they are BCI applications or fall into the category of affective computing, there seem (to me) to be two distinct research communities at work. The first (and older) community consists of university-based academics, like myself, doing basic research on measures, methods and prototypes with the primary aim of publishing our work in various conferences and journals. For the most part, we are a mixture of psychologists, computer scientists and engineers, many of whom have an interest in human-computer interaction. The second community formed around the availability of commercial EEG peripherals, such as the Emotiv and the NeuroSky. Some members of this community are academics and others are developers; I suspect many are dedicated gamers. They are looking to build applications and hacks that embellish the interactive experience, with a strong emphasis on commercialisation.
There are many differences between the two groups. My own academic group is ‘old-school’ in many ways, motivated by research issues and defined by the usual hierarchies associated with specialisation and rank. The newer group is more inclusive (the tag-line on the NeuroSky site is “Brain Sensors for Everyone”); they basically want to build stuff and preferably sell it.
The biocybernetic loop is the underlying mechanic behind physiological interactive systems. It describes how physiological information is collected from a user, analysed and subsequently translated into a response at the system interface. The most common manifestation of the biocybernetic loop can be seen in traditional biofeedback therapies, whereby the physiological signal is represented as a reflective numeric or graphic (i.e. a representation that changes in real time with the signal).
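To make that mechanic concrete, here is a minimal sketch of one pass through a biofeedback-style loop. It assumes a hypothetical heart-rate sensor; `acquire_sample()`, `analyse()` and `reflect()` are illustrative names of my own, not any real device API.

```python
import random

def acquire_sample():
    # Stand-in for a sensor read; a real system would poll the device driver.
    return 60 + 10 * random.random()  # pretend heart rate in BPM

def analyse(samples):
    # Smooth the raw signal with a simple moving average.
    return sum(samples) / len(samples)

def reflect(value):
    # Represent the analysed value at the interface, here as a text bar
    # whose length tracks the signal.
    print(f"HR {value:5.1f} " + "#" * int(value - 50))

window = [acquire_sample() for _ in range(5)]  # collect
reflect(analyse(window))                       # analyse and represent
```

In a real system the three stages would run continuously, with the representation updating every time a new analysis window completes.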
In the 1990s, a team at NASA published a paper that introduced a new take on the traditional biocybernetic loop format: biocybernetic adaptation, whereby physiological information is used to adapt the system the user is interacting with, not merely reflect it. In this instance the team implemented a flight simulator that used EEG measures to control autopilot status, with the intent of regulating pilot attentiveness.
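The adaptive logic in that work rested on an EEG engagement index, beta divided by (alpha + theta). A sketch of the negative-feedback rule might look like the following; the thresholds and band-power values are purely illustrative, not NASA's actual settings.

```python
def engagement_index(beta, alpha, theta):
    # Higher beta relative to alpha + theta suggests a more engaged operator.
    return beta / (alpha + theta)

def adapt_autopilot(index, autopilot_on, low=0.2, high=0.7):
    # Negative-feedback rule (illustrative thresholds): hand control back
    # to a disengaged pilot, return it to automation when engagement is high.
    if index < low:
        return False            # autopilot off: task demand re-engages pilot
    if index > high:
        return True             # autopilot on
    return autopilot_on         # dead band: leave the mode unchanged

# Example: low beta with high alpha/theta suggests waning attentiveness.
state = adapt_autopilot(engagement_index(beta=2.0, alpha=8.0, theta=6.0), True)
print(state)  # the low index switches the autopilot off -> False
```

The dead band between the two thresholds is there to stop the system from flickering between modes when the index hovers near a single cut-off.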
Concept art for biocybernetic adaptive plane
Dr. Alan Pope was the lead author on this paper and has worked in the field of biocybernetic systems for several decades; outside the academic community he’s probably best known for his work on biofeedback gaming therapies. To our good fortune, we met Alan at a workshop we ran last year at CHI (a video of his talk can be found here), and he kindly agreed to an interview delving further into his work.
So follow us across the threshold if you will and prepare to learn more about the origins of the biocybernetic loop and its use at NASA along with its future in research and industry.
We would like to thank all participants for taking part in the ARtSENSE Visual Aesthetic Interest Survey which is now closed. We have completed the prize draw and winners have been contacted by email.
PhD student at Liverpool John Moores University, UK
Way back in 2008, I was due to go to Florence to present at a workshop on affective BCI as part of CHI. In the event, I was ill that morning and missed the trip and the workshop. As I’d prepared the presentation, I made a podcast for sharing with the workshop attendees. I dug it out of the vaults for this post because gaming and physiological computing is such an interesting topic.
The work is dated now, but basically I’m drawing a distinction between my understanding of BCI and biocybernetic adaptation. The former is an alternative means of input control within the HCI; the latter can be used to adapt the nature of the HCI. I also argue that BCI is ideally suited to certain types of game mechanics precisely because it will not work 100% of the time. I used the TV series “Heroes” to illustrate these kinds of mechanics, which I regret in hindsight, because I totally lost all enthusiasm for that show after series 1.
The original CHI paper for this presentation is available here.
Last month I gave a presentation at the Annual Meeting of the Human Factors and Ergonomics Society held at Leeds University in the UK. I stood on the podium and presented the work, but really the people who deserve most of the credit are Marjolein van der Zwaag (from Philips Research Laboratories) and my own PhD student at LJMU Elena Spiridon.
You can watch a podcast of the talk above. This work was originally conducted as part of the REFLECT project at the end of 2010, and was inspired by earlier research on affective computing in which the system makes an adaptation to alleviate a negative mood state. The rationale is that any such adaptation will have beneficial effects in terms of reducing the duration/intensity of negative mood and, in doing so, will mitigate any undesirable effects on the person’s behaviour or health.
Our study was concerned with the level of anger a person might experience on the road. We know that anger places ‘load’ on the cardiovascular system, as well as producing the undesirable behaviours associated with aggressive driving. In our study, we subjected participants to a simulated driving task that was designed to make them angry – this is a protocol that we have developed at LJMU. Marjolein was interested in the effects of different types of music on the cardiovascular system while the person is experiencing a negative mood state; for our study, she created four categories of music that varied in terms of high/low activation and positive/negative valence.
The study does not represent an investigation into a physiological computing system per se, but is rather a validation study to explore whether an adaptation, such as selecting a certain type of music when a person is angry, can have beneficial effects. We’re working on a journal paper version at the moment.
Some months ago, I wrote this post about the REFLECT project that we participated in for the last three years. In short, the REFLECT project was concerned with research and development of three different kinds of biocybernetic loops: (1) detection of emotion, (2) diagnosis of mental workload, and (3) assessment of physical comfort. Psychophysiological measures were used to assess (1) and (2), whilst physical movement (fidgeting) in a seated position was used for (3). All of this was integrated into the ‘cockpit’ of a Ferrari.
The idea behind the emotional loop was to have the music change in response to emotion (to alleviate negative mood states). The cognitive loop would block incoming calls if the driver was in a state of high mental workload, and air-filled bladders in the seat would adjust to promote physical comfort. You can read all about the project here. Above you’ll find a promotional video that I’ve only just discovered – the reason for my delayed response in posting this is probably vanity: the filming was over before I got to the Ferrari site in Maranello. The upside of my absence is that you can watch the much more articulate and handsome Dick de Waard explain the cognitive loop in the film, which was our main involvement in the project.
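As a rough illustration of the cognitive loop’s call-blocking behaviour, a sketch like the following captures the idea of deferring calls under high workload. The class name, threshold and workload numbers here are hypothetical; in REFLECT the workload diagnosis came from psychophysiological measures, not a number supplied by the caller.

```python
from collections import deque

class CognitiveLoop:
    def __init__(self, threshold=0.7):
        self.threshold = threshold
        self.deferred = deque()  # calls held back during high workload

    def on_call(self, caller, workload):
        # Block the call if the diagnosed workload exceeds the threshold.
        if workload > self.threshold:
            self.deferred.append(caller)
            return "blocked"
        return "connected"

    def release_calls(self):
        # Deliver the held calls once workload has dropped again.
        released = list(self.deferred)
        self.deferred.clear()
        return released

loop = CognitiveLoop()
print(loop.on_call("caller A", workload=0.9))  # blocked
print(loop.on_call("caller B", workload=0.3))  # connected
print(loop.release_calls())                    # ['caller A']
```

The queue matters: a blocked call is deferred rather than dropped, so the interruption arrives when the driver can afford to deal with it.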
The survey is now CLOSED. Thank you to all our participants.
We invite you to take part in the ARtSENSE Visual Aesthetic Interest Survey. The survey asks you to give subjective ratings, i.e. your thoughts and feelings, towards artworks on a number of scales. The survey is part of the ARtSENSE project which investigates augmented reality supported adaptive and personalized experience in a museum based on processing real-time sensor events.
ARtSENSE tackles an important problem in the modern use of ICT in the cultural heritage domain. It aims to bridge the gap between the digital and physical worlds in a highly flexible way, in order to enable a novel and adaptive cultural experience.
You can complete the study online, and it shouldn’t take you more than 20 minutes. You will be given feedback about the picture you have rated as most interesting and how it compares to the pictures others have rated most interesting. You can also enter a prize draw for some Amazon vouchers!
To take part in this study (you must be at least 18) and for further details, go to:
Admin: Please welcome to the site our new Physiological Computing blogger, Dr. Lennart Nacke.
Hi, I am Lennart Nacke, and from now on I will merge my affectivegaming.info blogging efforts into the Physiological Computing blog (you can sometimes also catch my blogging at Gamasutra and on my own homepage). I have been promising Kiel and Steve a post here for almost a year now (we have organised a workshop together in the meantime), so this one is overdue.
In the above video, you can see my talk about current directions in physiological game interaction and psychophysiological game evaluation. I have been deeply interested in these topics for at least the past five years, spanning my PhD and postdoc time, several presentations for research institutions and game companies, a growing list of publications, and other articles. In the meantime, physiological sensors have become much cheaper, and today we are seeing companies like NeuroSky and Emotiv reach a large number of customers with low-cost physiological sensor products. My colleague Mike Ambinder at Valve is now even looking into applications of biofeedback input for commercial game titles (PDF; some of this was demonstrated at GDC 2011). So, this is definitely an exciting field to work in. For the rest of this article (which reproduces parts of my workshop paper), I will recap my CHI workshop talk and discuss some of the applications for game interaction and game evaluation from a physiological computing perspective.