Recent posts on the blog have concerned the topic of psychophysiology (or biometrics) and the evaluation of player experience. Based on those posts and the comments that followed, I decided to do a thought experiment.
Imagine that I work for a big software house that wants to sell as many games as possible and ensure that its product (which costs on average $3–5 million to develop per platform) is as good as it can possibly be. One of the suits from upstairs calls and asks me: “How should we be using biometrics as part of our user experience evaluation? The equipment is expensive, it’s labour-intensive to analyse, and nobody seems to understand what the data means.” (This sentiment is not exaggerated: I once presented a set of fairly ambiguous psychophysiological data to a fellow researcher, who nodded purposefully and said, “So the physiology stuff is voodoo.”)
Here’s a list of 10 things I would push for by way of a response.
Admin: Please welcome to the site our new Physiological Computing blogger, Dr. Lennart Nacke.
Hi, I am Lennart Nacke, and from now on I will be merging my affectivegaming.info blogging efforts into the Physiological Computing blog (you can sometimes also catch my blogging at Gamasutra and on my own homepage). I have been promising Kiel and Steve a post here for almost a year now (we have organized a workshop together in the meantime), so this one is overdue.
In the above video, you can see my talk about the current directions in physiological game interaction and psychophysiological game evaluation. I have been deeply interested in these topics for at least the past five years, spanning my PhD and postdoc time, several presentations for research institutions and game companies, a growing list of publications, and other articles. In the meantime, physiological sensors have become much cheaper, and today we are seeing companies like Neurosky and Emotiv reach a large number of customers with low-cost physiological sensor products. My colleague Mike Ambinder at Valve is now even looking into applications of biofeedback input for commercial game titles (PDF; some of this was demonstrated at GDC 2011). So, this is definitely an exciting field to work in. For the rest of this article (which reproduces parts of my workshop paper), I will recap my CHI workshop talk and discuss some of the applications for game interaction and game evaluation from a Physiological Computing perspective.
In April there was a rumour going around that the next Red Steel (the third in the series) might support the Wii Vitality. The gameplay in Red Steel is a mixture of first-person shooting and first-person sword fighting. In the last Red Steel, the combat system felt very similar to that of two-player fighting games like Street Fighter: in addition to the basic sword-fighting techniques you can perform with the Wii controller (e.g. blocking and striking), you could also pull off a range of special moves with different combinations of gestures and key presses. I’m a big fan of the Red Steel franchise, and I believe it would be an interesting series in which to explore biofeedback-based gameplay mechanics, as both the mythos and the physical skillset being simulated in Red Steel lend themselves well to intrinsically interesting physiological manipulations (e.g. as you’re playing a swordsman, “zen” powers aren’t too much of a stretch for your suspension of disbelief). Below I’ve made a couple of suggestions as to what biofeedback-based gameplay mechanics you might find in the next Red Steel game if it uses the Wii Vitality: