Admin: Please welcome to the site our new Physiological Computing blogger, Dr. Lennart Nacke.
Hi, I am Lennart Nacke, and from now on I will merge my affectivegaming.info blogging efforts into the Physiological Computing blog (you can also sometimes catch my blogging at Gamasutra and on my own homepage). I have been promising Kiel and Steve to post here for almost a year now (we have organized a workshop together in the meantime), so this post is long overdue.
In the above video, you can see my talk about the current directions in physiological game interaction and psychophysiological game evaluation. I have been deeply interested in these topics for at least the past five years, spanning my PhD and postdoc time, several presentations for research institutions and game companies, a growing list of publications, and other articles. In the meantime, physiological sensors have become much cheaper, and today we are seeing companies like Neurosky and Emotiv with low-cost physiological sensor products reaching a large number of customers. My colleague Mike Ambinder at Valve is now even looking into applications of biofeedback input for commercial game titles (PDF); some of this was demonstrated at GDC 2011. So this is definitely an exciting field to work in. For the rest of this article (which reproduces parts of my workshop paper), I will recap my CHI workshop talk and discuss some of the applications for game interaction and game evaluation from a Physiological Computing perspective.
This recent interview with Gabe Newell of Valve caught our interest because it’s so rare that a game developer talks publicly about the potential of physiological computing to enhance the experience of gamers. The idea of using live physiological data feeds to adapt computer games and enhance game play was first floated by Kiel in these papers way back in 2003 and 2005. Like Kiel, in my writings on this topic (Fairclough, 2007; 2008 – see publications here), I focused exclusively on two problems: (1) how to represent the state of the player, and (2) what the software could do with this representation of the player state. In other words, how can live physiological monitoring of the player state inform real-time software adaptation? For example, making the game harder, adjusting the music, or offering help (a set of strategies that Kiel summarised in three categories: challenge me/assist me/emote me) – but making these adjustments in real time in order to enhance game play.
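To make the idea concrete, here is a minimal sketch of such a real-time adaptation loop. Everything in it – the function names, the heart-rate thresholds, the mapping from inferred state to the three adaptation categories – is illustrative, not taken from any published system:

```python
# Hypothetical sketch of real-time physiological adaptation.
# Thresholds and mappings are illustrative assumptions, not a published design.

def classify_arousal(heart_rate, baseline):
    """Crude player-state estimate from heart rate relative to a resting baseline."""
    delta = heart_rate - baseline
    if delta > 15:
        return "high"
    if delta < -5:
        return "low"
    return "medium"

def adapt_game(state):
    """Map the inferred player state onto the three adaptation categories."""
    if state == "low":
        return "challenge me"   # raise difficulty to re-engage a disengaged player
    if state == "high":
        return "assist me"      # offer help or ease off the pressure
    return "emote me"           # sustain engagement with atmospheric changes

# Example: simulated heart-rate samples against a 70 bpm resting baseline
for hr in (62, 72, 90):
    print(hr, "->", adapt_game(classify_arousal(hr, baseline=70)))
```

In a real system the state estimate would of course come from a validated psychophysiological inference rather than a single raw heart-rate threshold; the point here is only the closed loop from sensed signal to software adaptation.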
At this year’s E3, Microsoft, Nintendo and Sony all presented their own vision of how the player will interact with games in the future. Microsoft introduced Project Natal, a full-body, hands-free game controller, which had been hinted at early last month. You can check the concept video here. Sony demonstrated a wand-like motion controller which works in conjunction with the PlayStation Eye. And Nintendo revealed the Wii Vitality Sensor, a biosensor add-on for the Wii controller.
Sadly, Nintendo didn’t reveal any specific details (or games, for that matter) about how they intend to use the sensor. However, from what little they did provide, it’s likely Nintendo will start with stress management games similar in nature to the Journey to Wild Divine series. Given that the relax-to-win game format is very common in biofeedback-based stress management, I’m surprised a game demo was not forthcoming. Oh well, E3 isn’t over yet, so they might reveal some more information.
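For readers unfamiliar with the relax-to-win format, the core mechanic is simple: the game rewards progress only while the player’s arousal is falling. The sketch below is a toy illustration of that idea under my own assumptions (a smoothed arousal signal, one progress point per falling sample); it is not how Wild Divine or any shipped title implements it:

```python
# Toy relax-to-win mechanic (illustrative assumptions throughout):
# the player earns progress whenever a smoothed arousal signal is falling.

def relax_to_win(samples, window=3):
    """Return progress earned over a sequence of arousal samples.

    Each sample is smoothed with a trailing moving average of `window`
    samples; one progress point is awarded per sample on which the
    smoothed signal decreases (i.e. the player is relaxing).
    """
    progress = 0
    smoothed = []
    for i in range(len(samples)):
        recent = samples[max(0, i - window + 1): i + 1]
        smoothed.append(sum(recent) / len(recent))
        if i > 0 and smoothed[i] < smoothed[i - 1]:
            progress += 1
    return progress

# A steadily falling signal (player relaxing) earns progress;
# a rising signal (player tensing up) earns none.
print(relax_to_win([5, 4, 3, 3, 2]))  # falling arousal
print(relax_to_win([2, 3, 4, 5]))     # rising arousal
```

The smoothing window matters in practice: raw physiological signals are noisy, so rewarding every downward tick of the unsmoothed signal would make progress feel random rather than contingent on relaxation.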
Next, we’ll have a look at the type of experiences the Wii Vitality Sensor can be expected to provide.