It is with a heavy heart that we announce the closure of the PhysiologicalComputing.net website. As you've probably noticed, we've not posted very much over the past year. We started this website in 2009, around the time I moved to Liverpool and began working for Steve at Liverpool John Moores University. We created the website so we could have a place to discuss our ideas about Physiological Computing in a more informal setting and share our interest with fellow researchers and the public at large. Being in the same department was a great motivator for us to publish content regularly, especially in the early years. Sadly, my time at John Moores came to an end and we are now working on different projects, which has made it difficult to maintain the momentum needed to publish regularly. As a result, we've decided to call it a day and close the website.
Over the years we've covered a lot of interesting topics, including the problems inherent in commercial EEG sensors [1, 2, 3], the use of biofeedback in the videogame industry [1, 2, 3, 4], the creation and development of Body Blogging [1, 2] and issues in designing physiological computing systems [1, 2, 3], to name but a few. We've also taken sensors out into the field and done impromptu experiments [1, 2], done a couple of interviews [1, 2], including one with Dr Alan Pope, as well as run a successful CHI workshop through the website and even produced a book, which we released last year!
We are immensely proud of what we have achieved here on PhysiologicalComputing.net and are loath to let it go, but we both feel it's time to move on to other projects. Fortunately, the website will remain online, and we hope you will check out our archives and enjoy reading our musings from over the years.
You can still find us musing to ourselves on our respective websites:
As well as find us on Twitter:
We wish to thank our readers and everyone else who has contributed to the website, including Jen, Ute, Alex and Lenart.
We hope to see you again soon, signing off,
– Kiel & Steve
It was way back in 2011, during our CHI workshop, that we first discussed the possibility of putting together an edited collection for Springer on the topic of physiological computing. It was clear to me at that time that many people associated physiological computing with implicit monitoring, as opposed to the active control that characterised BCI. When we had the opportunity to put together a collection, one idea was to extend the scope of physiological computing to include all technologies where signals from the brain and the body are used as a form of input. Some may interpret this all-inclusive relabelling of physiological computing as a provocative move. But we did not intend it as a conceptual 'land-grab'; rather, it was an attempt to be as inclusive as possible and to bring together what I still perceive to be a rather disparate and fractured research community. After all, we are all using psychophysiology in one form or another and share a common interest in sensor design, interaction mechanics and real-time measurement.
The resulting book is finally close to publication (tentative date: 4th April 2014) and you can follow this link to get the full details. We’re pleased to have a wide range of contributions on an array of technologies, from eye input to digital memories via mental workload monitoring, implicit interaction, robotics, biofeedback and cultural heritage. Thanks to all our contributors and the staff at Springer who helped us along the way.
Nintendo recently announced they're going to focus on health applications using non-wearable devices in the near future. Sadly, this is all they were willing to say at this point in time, so it's pretty much anyone's guess what they're actually working on. While Nintendo has developed successful exergames with the likes of the Wii-Fit, their entry into more physiologically driven gaming for health applications never really got off the drawing board. As such, I'd hazard a guess that they're working on a camera-based heart monitor, similar to the one supported by the new Xbox Kinect, for use in the Wii-Fit U.
Special Issue Editors
- Hugo Gamboa (Universidade Nova de Lisboa, Portugal)
- Hugo Plácido da Silva (IT – Institute of Telecommunications, Portugal)
- Kiel Gilleade (Liverpool John Moores University, United Kingdom)
- Sergi Bermúdez i Badia (Universidade da Madeira, Portugal)
- Stephen Fairclough (Liverpool John Moores University, United Kingdom)
Deadline for Submissions: 30 June 2014
Physiological data provides a wealth of information about the behavioural state of the user. These data can provide important contextual information by allowing the system to draw inferences with respect to the affective, cognitive and physical state of a person. In a computerised system this information can be used as an input control to drive system adaptation. For example, a videogame can use psychophysiological inferences of the player’s level of mental workload during play to adjust game difficulty in real-time.
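As a toy illustration of that adaptive loop (this is a sketch of the general idea, not any particular system; the function name, thresholds and difficulty scale are all invented), real-time difficulty adaptation driven by a workload estimate might look like:

```python
def adjust_difficulty(level, workload, low=0.3, high=0.7):
    """Sketch of a biocybernetic control loop: nudge game difficulty
    (integer levels 1-10) so the player's inferred mental workload,
    a normalised 0-1 estimate, stays inside a target band."""
    if workload > high:            # player overloaded: ease off
        return max(1, level - 1)
    if workload < low:             # player under-stimulated: push harder
        return min(10, level + 1)
    return level                   # within the target band: no change

# Each time a new workload estimate arrives, update the difficulty:
level = 5
level = adjust_difficulty(level, workload=0.9)  # -> 4 (player overloaded)
```

In practice the workload estimate would come from a psychophysiological classifier (e.g. over EEG features), and the band and step size would be tuned per game and per player.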
It appears Ubisoft's entry into biofeedback training isn't quite over, as I had earlier suspected. The product has been rebranded as Ozen and is now being marketed, more appropriately, to the well-being community rather than the gamer community. It's scheduled for a 2014 release; hopefully we'll get a chance to play with it soon.
Imagine you are waiting to be interviewed for a job that you really want. You'd probably be nervous, fingers drumming the table, eyes darting restlessly around the room. The door opens and a man appears; he is wearing a lab coat and holding an EEG headset in both hands. He places the set on your head and says, "Your interview starts now."
This Philip K. Dick scenario became reality for intern applicants at the offices of TBWA, an advertising firm based in Istanbul. And thankfully a camera was present to capture this WTF moment for each candidate, so this video could be uploaded to Vimeo.
The rationale for the exercise is quite clear. The company want to appoint people who are passionate about advertising, so, working with a consultancy, they devised a test in which candidates watch a series of acclaimed ads while the Epoc is used to measure their levels of 'passion', 'love' and 'excitement' in a scientific and numeric way. Those who exhibit the greatest passion for adverts get the job (this is the narrative of the movie; in reality one suspects/hopes they were interviewed as well).
I’ve seen at least one other blog post that expressed some reservations about the process.
Let’s take a deep breath because I have a whole shopping list of issues with this exercise.
A video about the exhibit Steve and I have been collaborating on with Manifest.AR is now online. The exhibit is currently being shown at FACT Liverpool until the 15th September.
The Quantified Self Europe presentation videos are now online. Enjoy!
In our final workshop video Alan Pope presents "Movemental": Integrating Movement and the Mental Game (PDF). For the uninitiated, Alan Pope co-authored a paper back in the early '90s that introduced the concept of bio-cybernetic adaptation; it has become a key work for us in the field of Physiological Computing. It was with much excitement that we received a paper submission from Alan, and it was great to have him talk shop at the event.
Alan's latest work with his colleague Chad Stephens described several new methods of adapting controller interfaces using physiology, in this case a Wii game controller. I was going to release the original footage I recorded during the workshop; however, the camera failed to pick up any of the game demos that were shown. As one of my particular research interests is biofeedback-based game mechanics (e.g. lie detection, sword fighting), I've remade the video, with Alan's permission, using his PowerPoint presentation so the demos can be enjoyed in all their glory.
(Pope, A., Stephens, C.) “Movemental”: Integrating Movement and the Mental Game (PDF)
A videogame or simulation may be physiologically modulated to enhance engagement by challenging the user to achieve a target physiological state. A method and several implementations for accomplishing this are described.
So that’s the end of our workshop video series. I hope you’ve all enjoyed them, for now I’m going to hibernate for a month to recover from the editing process.
This week sees the release of the talks presented during the Sharing the Physiological Experience session. To view these talks and more, please click here. For guidance on the session 4 talks, please consult the abstracts listed below.
This release marks the end of the CHI 2011 Brain and Body Designing for Meaningful Interaction workshop videos. I'd like to thank our presenters for allowing us to share their talks on the Internet and for choosing our workshop to present their research. Without you the workshop could not have been the success it was. Hopefully these videos will go some small way to bringing your excellent research to a wider audience, and if not, they can always be used to explain to family and friends what exactly you do.