The biocybernetic loop is the underlying mechanic behind physiological interactive systems. It describes how physiological information is collected from a user, analysed and subsequently translated into a response at the system interface. The most common manifestation of the biocybernetic loop can be seen in traditional biofeedback therapies, whereby the physiological signal is represented as a reflective numeric or graphic display (i.e. a representation that changes in real time with the signal).
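The collect–analyse–reflect cycle described above can be sketched as a minimal sketch in Python. This is an illustration only: `read_sensor`, `render` and the moving-average analysis are hypothetical stand-ins for real acquisition hardware and a real display.

```python
# Minimal sketch of a traditional biofeedback loop: the interface simply
# mirrors the smoothed signal back to the user in real time.
from collections import deque

def smooth(samples):
    """Moving average over the most recent samples (the 'analyse' step)."""
    return sum(samples) / len(samples)

def biofeedback_step(read_sensor, window, render):
    """One pass around the loop: collect, analyse, reflect."""
    window.append(read_sensor())   # collect a new physiological sample
    value = smooth(window)         # analyse (here, a simple moving average)
    render(value)                  # reflect the value at the interface
    return value
```

In a real system `read_sensor` would poll an ECG/EEG amplifier and `render` would update a gauge or graphic; the loop structure itself is the point.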
In the 1990s a team at NASA published a paper that introduced a new take on the traditional biocybernetic loop format, that of biocybernetic adaptation, whereby physiological information is used to adapt the system the user is interacting with and not merely reflect it. In this instance the team had implemented a flight simulator that used measures of EEG to control the autopilot status with the intent of regulating pilot attentiveness.
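The adaptive logic can be sketched as follows. The engagement index beta/(alpha+theta) is the form commonly cited for this line of NASA work, but the control rule and any thresholds here are illustrative assumptions, not the paper's exact implementation.

```python
# Sketch of biocybernetic adaptation: when an EEG-derived engagement
# index falls, the system hands control back to the pilot, pushing the
# operator toward a more attentive state (a negative-feedback loop).

def engagement_index(beta, alpha, theta):
    """EEG band powers -> engagement index (higher = more engaged)."""
    return beta / (alpha + theta)

def autopilot_mode(index, previous_index):
    """Disengage the autopilot when engagement is declining; re-engage
    it when engagement recovers."""
    return "manual" if index < previous_index else "auto"
```

The key design choice is negative feedback: the system responds to falling engagement by giving the pilot more to do, rather than simply reporting the state back.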
Concept art for biocybernetic adaptive plane
Dr. Alan Pope was the lead author on this paper, and has worked extensively in the field of biocybernetic systems for several decades; outside the academic community he’s probably best known for his work on biofeedback gaming therapies. To our good fortune we met Alan at a workshop we ran last year at CHI (a video of his talk can be found here) and he kindly allowed us the opportunity to delve further into his work with an interview.
So follow us across the threshold if you will and prepare to learn more about the origins of the biocybernetic loop and its use at NASA along with its future in research and industry.
In our final workshop video Alan Pope presents “Movemental”: Integrating Movement and the Mental Game (PDF). For the uninitiated, Alan Pope co-authored a paper back in the early 1990s which introduced the concept of biocybernetic adaptation, which has become a key work for us in the field of Physiological Computing. It was with much excitement that we received a paper submission from Alan, and it was great to have him talk shop at the event.
Alan’s latest work with his colleague Chad Stephens describes several new methods of adapting controller interfaces using physiology, in this case a Wii game controller. I was going to release the original footage I recorded during the workshop; however, the camera failed to pick up any of the game demos that were shown. As one of my particular research fancies is biofeedback-based game mechanics (e.g. lie detection, sword fighting), I’ve remade the video with Alan’s permission using his PowerPoint presentation so the demos can be enjoyed in all their glory.
(Pope, A., Stephens, C.) “Movemental”: Integrating Movement and the Mental Game (PDF)
A videogame or simulation may be physiologically modulated to enhance engagement by challenging the user to achieve a target physiological state. A method and several implementations for accomplishing this are described.
So that’s the end of our workshop video series. I hope you’ve all enjoyed them, for now I’m going to hibernate for a month to recover from the editing process.
This week sees the release of the talks presented during the Sharing the Physiological Experience session. To view these talks and more please click here. For guidance about the session 4 talks please consult the abstracts listed below.
This release marks the end of the CHI 2011 Brain and Body Designing for Meaningful Interaction workshop videos. I’d like to thank our presenters for allowing us to share their talks on the Internet and for choosing our workshop to present their research. Without you the workshop could not have been the success it was. Hopefully these videos will go some small way to bringing your excellent research to a wider audience, and if not they can always be used to explain what exactly you do to family and friends.
This week sees the release of the talks presented during the Evaluating the User Experience session. To view these talks and more please click here. For guidance about the session 3 talks please consult the abstracts listed below.
I came across an article in a Sunday newspaper a couple of weeks ago about an artist called xxxy who has created an installation using a BCI of sorts. I’m piecing this together from what I read in the paper and what I could see on his site, but the general idea is this: person wears a portable EEG rig (I don’t recognise the model) and is placed in a harness with wires reaching up and up and up into the ceiling. The person closes their eyes and relaxes – presumably as they enter a state of alpha augmentation, they begin to levitate courtesy of the wires. The more that they relax or the longer they sustain that state, the higher they go. It’s hard to tell from the video, but the person seems to be suspended around 25-30 feet in the air.
From the point of view of an outsider, the utility and value of computer technology that provides emotional feedback to the human operator is questionable. The basic argument normally goes like this: even if the technology works, do I really need a machine to tell me that I’m happy or angry or calm or anxious or excited? First of all, the feedback provided by this machine would be redundant; I already have a mind/body that keeps me fully apprised of my emotional status, thank you. Secondly, if I’m angry or frustrated, do you really think I would be helped in any way by a machine that drew my attention to these negative emotions? Actually, that would be particularly annoying. Finally, sometimes I’m not quite sure how I’m feeling or how I feel about something; feedback from a machine that says you’re happy or angry would just muddy the waters and add further confusion.
In last week’s excellent Bad Science article from The Guardian, Ben Goldacre puts his finger on a topic that I think is particularly relevant for physiological computing systems. He quotes press reports about MRI research into “hypoactive sexual desire response” – no, I hadn’t heard of it either, it’s a condition where the person has low libido. In this study women with the condition and ‘normals’ viewed erotic imagery in the scanner. A full article on the study from the Mail can be found here but what caught the attention of Bad Science is this interesting quote from one of the researchers involved: “Being able to identify physiological changes, to me provides significant evidence that it’s a true disorder as opposed to a societal construct.”
I recently read a paper by Rosalind Picard entitled “Emotion research for the people, by the people.” In this article, Prof. Picard has some fun contrasting engineering and psychological perspectives on the measurement of emotion. Perhaps I’m being defensive, but she seemed to take more delight in poking fun at the psychologists than the engineers. The central impasse that she identified goes something like this: engineers develop sensor apparatus that can deliver a whole range of objective data whilst psychologists have decades of experience with theoretical concepts related to emotion, so why haven’t people really benefited from their union through the field of affective computing? Prof. Picard correctly identifies a reluctance on the part of the psychologists to define concepts with sufficient precision to aid the work of the engineers. What I felt was glossed over in the paper was the other side of the problem, namely the willingness of engineers to attach emotional labels to almost any piece of psychophysiological data, usually in the context of badly-designed experiments (apologies to any engineers reading this, but I wanted to add a little balance to the debate).
Well that was a disappointment. In the end Nintendo decided against demonstrating the Wii Vitality at this year’s E3. A representative of the company stated that the Vitality was a no-show because Nintendo did not believe the event was a suitable environment for the product. Disappointing, but given their press event was jam-packed with a number of AAA games and a new portable it was understandable. However, with the Vitality aimed for a late 2010 release it doesn’t give Nintendo much time to create a buzz around a product that frankly has none. In actuality I was surprised that Nintendo didn’t use their recent endorsement deal with the American Heart Association to hype Vitality pre-E3. While the product is currently being marketed towards mental health (i.e. stress management), rather than the physical health which the AHA endorsement concerns, Nintendo could easily have used the event as part of a broader health platform and so made better use of the publicity the endorsement deal attracted.
I came across this article about the Heart Chamber Orchestra on the Wired site last week. The Orchestra are a group of musicians who wear ECG monitors whilst they play – the signals from the ECG feed directly into laptops, which adapt the musical score directly and in real time. They also have some nice graphics generated by the ECG running in the background when they play (see clip below). What I think is really interesting about this project is the reflexive loop set up between the ECG, the musician’s response and the adaptation of the musical score. It really goes beyond standard biofeedback – a live feed from the ECG mutates the musical score, the player responds to technical/emotional qualities of that score, which has a second-order effect on the ECG and so on. In the Wired article, they refer to the possibility of the audience being equipped with ECG monitors to provide another input to the loop – which is truly a mind-boggling possibility in terms of a fully-functioning biocybernetic loop.
The thing I find slightly frustrating about the article and the information contained in the project website is the lack of information about how the ECG influences the musical score. In a straightforward way, an ECG will yield a beat-to-beat interval, which of course could generate a metronomic beat if averaged over the group. Alternatively each individual ECG could generate its own beat, which could be superimposed over one another. But there are dozens of ways in which ECG information could be used to adapt a musical score in real time. According to the project information, there is also a composer involved doing some live manipulations of the score, but it’s hard to figure out how much of the real-time transformation is coming from him or her and how much is directly from the ECG signal.
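The simplest mapping mentioned above – beat-to-beat intervals averaged over the group to drive a shared tempo – can be sketched in a few lines. This is speculation about one possible mechanism, not the Orchestra’s actual method; function names and the timestamp format are assumptions.

```python
# Sketch: R-R (beat-to-beat) intervals from each player's ECG, averaged
# across the group and converted to a metronomic tempo in BPM.
# Beat timestamps are assumed to be in seconds.

def rr_intervals(beat_times):
    """Inter-beat intervals from a list of R-peak timestamps."""
    return [b - a for a, b in zip(beat_times, beat_times[1:])]

def group_tempo_bpm(all_beat_times):
    """Mean R-R interval across all players, as beats per minute."""
    intervals = [iv for times in all_beat_times for iv in rr_intervals(times)]
    mean_rr = sum(intervals) / len(intervals)
    return 60.0 / mean_rr
```

The superimposed-beats alternative would skip the averaging step and emit each player’s `rr_intervals` as an independent rhythmic layer.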
I should also say that the Orchestra are currently competing for the FILE PRIX LUX prize and you can vote for them here.
Before you do, you might want to see the orchestra in action in the clip below.
Heart chamber orchestra on vimeo