Category Archives: News
Stelarc at FACT
Stelarc, the performance artist famous for his work on cybernetics and manipulating the human body, visited the Foundation for Art and Creative Technology (FACT) in Liverpool earlier this month to talk about his recent projects. We were fortunate enough to secure an interview with Stelarc to discuss the intersection between his work and the field of physiological computing.
CHI2011 Workshop – Social Media Links
The workshop’s social media links are now online.
Announcing the CHI 2011 Workshop – Brain and Body Interfaces: Designing for Meaningful Interaction
I’m proud to announce the launch of the official webpage for Brain and Body Interfaces: Designing for Meaningful Interaction, a workshop running at CHI 2011 (7–11 May 2011). You can subscribe to the workshop RSS feed here, where we will post all the latest workshop updates (social networking feeds will follow shortly).
I’ve created a subdomain to host the webpage and make it easier to remember: http://brainandbody.physiologicalcomputing.net.
Revised Physiological Computing FAQ
This is a short post to inform regular readers that I’ve made some changes to the FAQ document for the site (link to the left). Normally people alter the FAQ because the types of popular questions have changed. In our case, it is my answers to those questions that have changed in the time since I wrote my original responses – hence the need to revise the FAQ.
The original document firmly identified physiological computing with affective computing/biocybernetic adaptation. There was even a question drawing a firm division between BCI technology and physiological computing. In the revised FAQ, I’ve dropped this distinction and attempted to view BCI as part of a broad continuum of computing devices that rely on real-time physiological data for input. This change has not been made to arrogantly subsume BCI within the physiological computing spectrum, but to reconcile perspectives from different research communities working on common measures and technologies across different application domains. In my opinion, the distinctions between research topics and application domains (including my own) are largely artificial, and the advancement of this technology is best served by keeping an open mind about mash-ups and hybrid systems.
I’ve also expanded the list of indicative references to include contributions from BCI, telemedicine and adaptive automation in order to highlight the breadth of applications that are united by physiological data input.
The FAQ is written to support the naive reader, who may have stumbled across our site, but as ever, I welcome any comments or additional questions from domain experts.
Quantified Self London Video
I recently gave a talk at the inaugural Quantified Self London meetup about my experiences as The Body Blogger, for which we now have a video.
For more information about the event, you can read the review by Adriana Lukas as well as a write-up of my experience running the system while I presented. As you can imagine, my heart was pumping pretty fast.
Valve experimenting with physiological input for games
This recent interview with Gabe Newell of Valve caught our interest because it’s so rare that a game developer talks publicly about the potential of physiological computing to enhance the experience of gamers. The idea of using live physiological data feeds to adapt computer games and enhance game play was first floated by Kiel in these papers back in 2003 and 2005. Like Kiel, in my writings on this topic (Fairclough, 2007; 2008 – see publications here), I focused exclusively on two problems: (1) how to represent the state of the player, and (2) what the software could do with this representation of the player state. In other words, how can live physiological monitoring of the player inform real-time software adaptation? For example, making the game harder, changing the music, or offering help (a set of strategies that Kiel summarised under three categories: challenge me, assist me, emote me) – with these adjustments made in real time in order to enhance game play.
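To make the adaptation idea concrete, here is a minimal sketch of a “challenge me” loop in Python. Everything in it is hypothetical: the baseline value, thresholds and step sizes are invented for illustration, and a real system would read from a physiological sensor rather than a list of numbers.

```python
BASELINE_HR = 70.0   # assumed resting heart rate (beats/min) - illustrative only
SMOOTHING = 0.2      # exponential smoothing factor for the noisy live feed

def adapt_difficulty(hr_samples, difficulty=0.5):
    """Nudge game difficulty (0.0-1.0) using a live heart-rate feed.

    If the smoothed heart rate sits well below baseline, the player may be
    under-challenged, so difficulty is raised; well above baseline suggests
    over-arousal, so difficulty is lowered. Thresholds are arbitrary.
    """
    smoothed = BASELINE_HR
    for hr in hr_samples:
        # Exponentially smooth the raw feed to damp beat-to-beat noise.
        smoothed = SMOOTHING * hr + (1 - SMOOTHING) * smoothed
        if smoothed < BASELINE_HR - 5:       # under-challenged: make it harder
            difficulty = min(1.0, difficulty + 0.05)
        elif smoothed > BASELINE_HR + 15:    # over-challenged: make it easier
            difficulty = max(0.0, difficulty - 0.05)
    return difficulty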
Mobile Monitors and Apps for Physiological Computing
I always harboured two assumptions about the development of physiological computing systems that have only become apparent (to me at least) as technological innovation contradicts them. First, I thought nascent forms of physiological computing would be developed for desktop systems, where the user stays in a stationary, more-or-less sedentary position, thus minimising the probability of movement artifacts. Second, I assumed that physiological computing devices would only ever be realised as coordinated, holistic systems: specific sensors linked to a dedicated controller that provides input to adaptive software, all designed as a seamless chain of information flow.
The Body Blogger
A new page has been added to the navigation bar. “The Body Blogger” concerns our work in live blogging a user’s physiological state to the web and what impact it may have on the user as well as their audience. The body blog can be found at http://twitter.com/bodyblogger.
I just watched a TEDMED talk about the iBrain device via this link on the excellent Medgadget resource. The iBrain is a single-channel EEG recorder that collects data via ‘dry’ electrodes and stores it on a conventional handheld device such as a cellphone. In my opinion, the clever part of this technology is the application of mathematics to wring detailed information out of a limited data set – it’s a very efficient strategy.
The hardware looks to be fairly standard – a wireless EEG link to a mobile device. But its simplicity provides an indication of where this kind of physiological computing application could be going in the future – mobile monitoring for early detection of medical problems piggy-backing onto conventional technology. If physiological computing applications become widespread, this kind of proactive medical monitoring could become standard. And the main barrier to that is non-intrusive, non-medicalised sensor development.
In the meantime, Neurovigil, the company behind the product, recently announced a partnership with Swiss pharmaceutical giants Roche, who want to apply this technology to clinical drug trials. I guess the methodology encourages the drug companies to consider covert changes in physiology as a sensitive marker of drug efficacy or side-effects.
I like the simplicity of the iBrain (one channel of EEG), but the speaker makes some big claims for their analysis; the implicit ones concern the potential of EEG to identify neuropathologies. That may be possible, but I’m sceptical about whether one channel is sufficient. The company have obviously applied their pared-down analysis to sleep stages with some success, but I was left wondering what added value the device provides compared to the less-intrusive movement sensors used to analyse sleep behaviour, e.g. the Actiwatch.
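As a rough illustration of what can be squeezed out of a single EEG channel, the sketch below computes conventional band powers from a Welch spectrum. The sampling rate and band edges are generic textbook values, not Neurovigil’s (unpublished) analysis, and the signal here would come from the device in practice.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # assumed sampling rate in Hz - device-dependent

# Conventional EEG frequency bands (Hz); edges vary slightly across the literature.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Estimate power in each EEG band from one channel of raw signal."""
    # Welch's method: averaged periodograms over 2-second segments.
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        # Integrate the spectral density over the band.
        powers[name] = np.trapz(psd[mask], freqs[mask])
    return powers
```

For example, a signal dominated by a 10 Hz oscillation should show most of its power in the alpha band.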
Heart Chamber Orchestra
I came across this article about the Heart Chamber Orchestra on the Wired site last week. The Orchestra are a group of musicians who wear ECG monitors whilst they play – the signals from the ECG feed directly into laptops, which adapt the musical score in real time. They also have some nice graphics generated by the ECG running in the background while they play (see clip below). What I find really interesting about this project is the reflexive loop set up between the ECG, the musicians’ response and the adaptation of the musical score. It goes well beyond standard biofeedback – a live feed from the ECG mutates the musical score, the player responds to the technical/emotional qualities of that score, which has a second-order effect on the ECG, and so on. In the Wired article, they refer to the possibility of the audience being equipped with ECG monitors to provide another input to the loop – a truly mind-boggling possibility in terms of a fully functioning biocybernetic loop.
The thing I find slightly frustrating about the article and the information on the project website is the lack of detail about how the ECG influences the musical score. In a straightforward scheme, each ECG yields a beat-to-beat interval, which could generate a metronomic beat if averaged over the group. Alternatively, each individual ECG could generate its own beat, superimposed over the others. But there are dozens of ways in which ECG information could be used to adapt a musical score in real time. According to the project information, there is also a composer involved, doing some live manipulation of the score, but it’s hard to figure out how much of the real-time transformation comes from him or her and how much comes directly from the ECG signal.
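The first, straightforward mapping described above can be sketched in a few lines: convert each player’s beat-to-beat (R–R) intervals into a tempo, then average across the group to get one metronomic beat. The interval values below are invented for illustration; nothing here reflects the Orchestra’s actual (undisclosed) mapping.

```python
def tempo_from_rr(rr_intervals_ms):
    """Convert a list of R-R intervals (milliseconds) to beats per minute."""
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    return 60_000.0 / mean_rr  # one minute divided by the mean beat interval

def group_tempo(per_player_rr):
    """Average each player's heart-derived tempo into one shared beat."""
    tempos = [tempo_from_rr(rr) for rr in per_player_rr]
    return sum(tempos) / len(tempos)
```

So a player with 1000 ms intervals contributes a 60 bpm tempo, one with 500 ms intervals contributes 120 bpm, and the group metronome lands between them.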
I should also say that the Orchestra are currently competing for the FILE PRIX LUX prize, and you can vote for them here.
Before you do, you might want to see the orchestra in action in the clip below.