BioS-Play 2010 Workshop Experience Report – Part 1 of 2

Admin: Workshop papers can be found here.

Last month I attended the BioS-Play workshop at the Fun and Games 2010 conference over in Leuven, Belgium. I was presenting Physiology as XP – Bodyblogging to Victory, a position paper I co-wrote with Steve in which we extended the body blogging concept to computer games. In part 1 of this 2-part series of posts on BioS-Play I’ll be recounting my experiences at the conference, as well as providing my thoughts on the likely research direction physiological games will take in the future.

EDIT

The post is rather large so I’ve made a few quick links to provide readers a taster of what’s contained within.

  • EmRoll: A 2-player co-operative children’s game which uses a mixture of gestures and biological interactions to control Gamboo, a 2-headed monster. What the Xbox 360 Kinect might offer in the future.
  • Study investigating the effect of sharing physiological information in collocated and networked environments on measures of presence and emotion. Following on from Steve’s Valve post, what measurable benefits might shared physiology actually bring to multiplayer games like Left 4 Dead?
  • Workshop discussion, covering such issues as: how do we design meaningful physiological interactions, and how do we evaluate the efficacy of the user experience of a physiological interface?

The Workshop Theme

BioS-Play was aimed at exploring the use of biological signals (e.g. brain waves) in both multiplayer and social gaming environments. For full details see the workshop proposal. Over the past decade there has been an upturn in the use of this class of physiological input in computer games; however, the majority of such systems are designed for single-player experiences. This is not really surprising: although such signals have been utilised by games since the 1970s, bio-adaptive interaction was only used in a limited therapeutic capacity. It was not until the late 1990s, a period that saw the emergence of Affective Computing, that we saw player physiology being used in more interesting ways (e.g. see the MIT Media Lab Europe projects on affective feedback).

Multiplayer and social environments provide a new and exciting arena in which to explore biofeedback interaction. For example, Media Lab Europe designed a 2-player relaxation game called Relax-to-Win where players collocated in the same physical space competed against each other in a head-to-head race. Players controlled their speed via their relaxation level as measured by skin conductance, and so the calmer they were the faster they moved. This game mechanic is what you would typically find in a biofeedback therapy program, in that the game only acts as a medium to reflect the user’s physiological state and provides no intrinsic incentive for the player to interact. Essentially gameplay boils down to “move cursor here”. What makes Relax-to-Win interesting is the addition of the second player, who acts as an antagonistic stimulus to the other player’s ability to control their physiological state. In competitive environments we naturally become more aroused as the body primes for action; however, in a game where the player can only win by suppressing this reaction, the game challenge feeds off the interplay between each player’s psychophysiology (i.e. the game mechanic is embodied in the players’ physiology, which the system displays). You can find more details about this interplay in a paper I wrote in 2005: Affective Videogames and Modes of Affective Gaming.
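
To make the mechanic concrete, here’s a minimal sketch of a relax-to-win speed mapping in Python (the names, baseline values and scaling are my own assumptions, not Media Lab Europe’s implementation):

def relax_speed(scl_now, scl_baseline, max_speed=10.0):
    """Map skin conductance level (SCL) to racer speed: calmer = faster."""
    # Relaxation as the fractional drop below the resting baseline, clamped to [0, 1].
    relaxation = max(0.0, min(1.0, (scl_baseline - scl_now) / scl_baseline))
    return relaxation * max_speed

# Head-to-head race: each tick, both players advance by their relaxation-derived speed.
positions = {"p1": 0.0, "p2": 0.0}
baselines = {"p1": 8.0, "p2": 7.5}  # resting SCL in microsiemens, measured beforehand

def tick(scl_readings):
    for player, scl in scl_readings.items():
        positions[player] += relax_speed(scl, baselines[player])

Note the competitive twist lives outside the code: the other player’s presence pushes your arousal (and skin conductance) up, which is exactly what the mapping punishes.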

I’ve seen several variations of the relax-to-win game mechanic in multiplayer and social environments (e.g. Balloon Trip, which I saw at ACE 2004), as well as other types of bio-adaptive experiences. Word is Valve are also now experimenting with shared player physiology in their games (e.g. imagine, in a similar vein to Aliens, if you could see your allies’ zombie-induced panic in Left 4 Dead). At the moment we’re only exploring the tip of the proverbial iceberg in what may be possible with bio-adaptive interaction, and at BioS-Play the drills had already been laid out, ready to sample those untapped ice cores.

Position Papers

Ten papers were presented at BioS-Play, which I’ve organised into the following topics:

  • Sensor Technology and Frameworks
  • Collocated Multiplayer
  • Networked Multiplayer
  • Shared Physiological Experiences
  • Evaluating Interactions

Below I’ve listed each paper and its abstract as well as a few thoughts I jotted down during the workshop.

Sensor Technology and Frameworks

(Jercic, P., Cederholm, H.) The Future of Brain-Computer Interface for Games and Interaction Design

In this paper we discuss the potential application areas and uses of modern Brain-Computer Interface technology such as the EPOC. We divide the discussion into two subgroups, namely Game design and Interaction design, and discuss the future of these research areas in regard to such technology.

(Geurts, L., Van Hulle, M.) Spellbinder: Interactive Communication through a Brain Computer Interface

The purpose of the Spellbinder project is the development and testing of a Brain Computer Interface, consisting of a diadem with electrodes to measure electroencephalogram signals, and several interactive computer programs, that allow the user to communicate with others through brainwaves, without using speech, facial expressions or other muscle activities. Spellbinder’s primary user group consists of patients suffering from disorders such as Cerebral Palsy, stroke, ALS, MS, Parkinson’s, … who have no other means to communicate with others. Through the BCI they can convey messages, tell a story, express their feelings or play a game with other people.

(Kosunen, I., Kuikkaniemi, K., et al.) Listen to Yourself and Others – Multiuser Mobile Biosignal Sonification Platform EMOListen

EMOListen is a multiuser mobile biosignal sonification platform. Biosignal adaptation has gained a lot of recognition in recent years, both in academia and within industry. However, experiments rarely involve multiuser systems, and they do not consider audio as the adaptation target. Biosignal sonification is a relevant domain because audio can be used in adaptation with high temporal accuracy, can be associated with several biosignals, and can be used as a secondary stimulus, enabling long-term and out-of-laboratory experimentation. A multiuser mobile sonification system can be used for creating a sense of telepresence and mediated body language.

Discusses the authors’ wearable sensor framework, which allows them to deploy multi-user physiological interactive applications very quickly using Bluetooth-enabled sensors connected via mobile phone to the Cloud for application management. I was quite impressed by the framework’s design, as Cloud management will allow them to test their ideas in a whole slew of different environments (e.g. you could invite players from all over the world as long as they have a 3G phone and a Bluetooth-enabled heart monitor). Ideally, I would have liked to use a Bluetooth-enabled device for the body blogger project, as it’s easier to interface with a mobile phone, which provides an all-round effective means of real-time data collection, analysis and interaction for the individual user.
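
As a rough illustration of the kind of phone-side relay such a framework implies (my own reconstruction, not the authors’ code; the endpoint is hypothetical and the sensor read is simulated):

import random
import time
import requests  # third-party HTTP library

CLOUD_URL = "https://example.com/api/samples"  # hypothetical endpoint, not the authors' service

def read_heart_rate():
    # Stand-in for the Bluetooth heart monitor read; simulated as ~70 bpm with jitter.
    return random.gauss(70, 5)

def relay(user_id, interval_s=1.0):
    # Phone-side loop: sample the sensor, push the reading to the cloud, repeat.
    while True:
        sample = {"user": user_id, "bpm": read_heart_rate(), "ts": time.time()}
        requests.post(CLOUD_URL, json=sample, timeout=5)
        time.sleep(interval_s)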

Collocated Multiplayer

(Yamabe, T., Kosunen, I., et al.) Biofeedback Training with EmoPoker: Controlling Emotional Arousal for Better Poker Play

A game of poker is a typical example of a situation involving imperfect information: players have to make decisions under uncertainty. This uncertainty can evoke emotional arousal and lead the player to make irrational decisions. In this paper, we introduce the EmoPoker system, which aims at making the player aware of their arousal level by providing biofeedback. With the EmoPoker system, we expect that a poker player becomes able to control their own arousal, consequently improving their gaming performance. EmoPoker presents itself as an augmented reality application, and its design is based on the traditional game concept. In this paper we also introduce other possible use cases of biofeedback training.

(Zangouie, F., Gashti, M., et al.) Designing for a Rich Emotional Journey through a Game of Riddles Called EmRoll

During the last few years we have designed several systems to involve users emotionally through physical movements. In our recent work, we wanted to take the prior works one step further and make use of a combination of movement-capture sensors and bio-sensors, actively involving users’ emotional experiences. We have designed a game named EmRoll (for Emotional Rollercoaster) that poses riddles to pairs of players. The riddles can only be solved if the players are, or at least pretend to be, moving according to different emotional states (happy dancing, relaxed breathing and scared reactions). We pick up on movement, breathing and sweat from the two players.

This paper describes a 2-player co-operative children’s game which uses a mixture of gestures and biological interactions to control Gamboo, a 2-headed monster. In EmRoll each player takes control over the physical actions of one side of Gamboo, and players have to work together as a team in order to solve 3 riddles. The first riddle requires synchronised physical movements to simulate Gamboo walking. The second riddle requires players to shake off a spider which climbs onto Gamboo via their physical actions, and an increase in their level of arousal (via skin conductance) in order to prevent another spider from approaching. The third riddle requires synchronised breathing to allow Gamboo to tread water. The latter part of the second riddle was interesting, as the authors suggested players should experience fear as part of the gameplay experience, which would thereby induce the necessary increase in arousal.

I’m not sure what purpose the bio-adaptation served in this instance, as it’s not used to influence later play (e.g. control later scares). If the children are scared by the spider and successfully shake it off, they win. If the children are not scared, then they have to manipulate their level of arousal in order to prevent another spider from showing up. Given the riddle is designed to scare the players, if the children are scared as intended then the bio-adaptation is redundant, whereas if the players are not scared then the bio-adaptation loses its context. As a position paper, I might be missing some of the finer details of the game riddle, but in this situation I would instead have chosen to reflect the players’ level of arousal in Gamboo (e.g. Gamboo’s face becomes more frightened the greater the increase in arousal), thereby creating a more surreal experience rather than relying on an implicit physiological response to drive an explicit game mechanic.
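
For reference, here’s how I read the second riddle’s arousal check, sketched in Python (the window and threshold are my guesses; the paper doesn’t give parameters):

def arousal_elevated(scl_window, baseline, rise_fraction=0.15):
    """True while mean skin conductance over the recent window sits at least
    rise_fraction (here 15%) above the player's resting baseline."""
    mean_scl = sum(scl_window) / len(scl_window)
    return mean_scl >= baseline * (1.0 + rise_fraction)

# Per game tick: if not arousal_elevated(recent_scl, resting_scl), another spider approaches.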

Watching EmRoll in action, you could easily see this game at home on the Xbox 360 Kinect.

Networked Multiplayer

(Gürkök, H., Plass-Oude Bos, D., et al.) Towards Multiplayer BCI Games

Traditional brain-computer interface (BCI) research has forked to consider not only disabled patients but able-bodied users as well. Among various biosignals, electroencephalogram (EEG) is a cheap, portable and popular means of accessing brain activity. As a result, EEG-based BCIs are gradually being used in everyday applications, including games. Just as any computer product, human-computer interaction (HCI) aspects of BCI games deserve attention. In this paper, we describe our previous work on multiplayer and multimodal BCI games based on EEG along with possible future research directions.

Further game mods and evaluation work by the team that made alpha-WoW, a BCI mod for World of Warcraft.

Shared Physiological Experiences

(Chanel, G., Perlli, S., et al.) Social Interaction using Mobile Devices and Biofeedback: Effects on Presence, Attraction and Emotions

In this article we describe the preliminary results of our experiment conducted with PRESEMO, a system that makes it possible for an audience to interact during a presentation. The system uses Nokia N900 devices with heart rate monitors to create an interactive presentation and displays the heart rate of four participants in real time. Overall we measured 70 participants. During the presentation the participants were able to communicate with each other via the mobile devices and answer questions that were presented interactively. In the experiment, short movie clips were presented to the participants during which their interbeat interval, electrodermal activity and blood volume pulse were measured. Some feedback (chat conversation, answers to the questions and biofeedback) was visible to the participants during the presentation. We wanted to know whether this had an impact on emotions, interpersonal attraction, physical presence and social presence. We found that feedback had a positive effect on the dependent variables.

In this paper the effect of sharing physiological information in collocated and networked environments on measures of presence and emotion was evaluated. A group of 4 users were shown a series of non-verbal video clips and asked to comment on their experience via an IM system. 2 users were collocated, though they were only allowed to communicate via IM, whereas the other 2 users were isolated from the group and from each other.

The authors compared 4 different communication models for their effect on measures of presence and emotion:

  • No feedback — users can comment via the IM client but their texts are not shown to others in the group.
  • Text feedback — user comments are shown to everyone in the group.
  • Biofeedback — as in the no-feedback condition, comments are not shared; instead each user’s heart rate is shown to the rest of the group, as well as the correlation between users’ heart rates (see the sketch after this list).
  • Text and biofeedback — users can see each other’s comments and heart rates.
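
For the curious, the paper doesn’t give the exact computation behind the correlation display, so treat this windowed Pearson correlation between two users’ heart-rate series as my own plausible reading of it:

import numpy as np

def windowed_hr_correlation(hr_a, hr_b, window=30):
    """Pearson r between two equal-length heart-rate series, one value per window."""
    hr_a, hr_b = np.asarray(hr_a, float), np.asarray(hr_b, float)
    rs = []
    for start in range(0, len(hr_a) - window + 1, window):
        a, b = hr_a[start:start + window], hr_b[start:start + window]
        rs.append(np.corrcoef(a, b)[0, 1])  # r near 1 = hearts rising and falling together
    return rs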

In this experiment, the biofeedback condition only had a significant effect on co-presence (awareness of others). Textual feedback had the most profound effect on both measures of presence and emotion. Its effect on co-presence was also greater than that of the biofeedback condition. Text and biofeedback in combination had an effect similar to that of textual feedback on its own (i.e. no advantage in using biofeedback if textual feedback is provided).

Thinking back to Valve, these results suggest shared physiological states won’t add much to the player’s gaming experience in comparison to the existing communication framework. However, you have to bear two things in mind. First, this setup is for a passive entertainment medium, and user interaction is reminiscent of a chat engine. In a computer game (at least the ones Valve develop), players don’t communicate at the same frequency they would if they were conversing over IM. Communication is very sparse, as the player has to focus on the cognitive task they’ve been assigned; subsequently, biofeedback may offer an alternative means of communicating important information they would normally have to type out (e.g. if a player is having difficulty in Left 4 Dead, it might show up as a heightened level of arousal, and instead of the player typing out a rescue call, their panic is communicated to the rest of the team via their physiology without them having to task switch, thereby allowing the panicking player to focus on staying alive). The effectiveness of biofeedback might lose out to voice chat, which is frequently used in games in lieu of text (NB. when it works), as it’s easy to multi-task and provide status updates to team members. On the other hand, biofeedback provides a continuous data stream which voice doesn’t.
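
Sketching that Left 4 Dead scenario out (everything here is hypothetical: the names, threshold and notify call are illustrative, not any real game API):

def update_panic_indicator(player, arousal, baseline, notify_team, panic_ratio=1.4):
    """Broadcast a panic flag when arousal climbs well above resting baseline,
    sparing the player a mid-firefight task switch to the chat window."""
    panicking = arousal > baseline * panic_ratio
    if panicking != player.get("panicking", False):  # only send on state changes
        player["panicking"] = panicking
        notify_team(player["name"], panicking)  # e.g. tint the teammate's HUD portrait

# Usage each update: update_panic_indicator(p, gsr_now, gsr_rest, hud_set_panic_icon)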

Secondly, the presentation format of the players’ heart rates, and the correlation between them, may not have provided sufficient meaning for the users to have an effect on the recorded measures. As I’ve talked about in a prior post, it’s difficult to translate biological data into something you can derive meaning from. For example, visualising a month’s worth of body blogger data as a graph provided little meaning to me about my day. However, visualised as a heat map, I was quickly able to assess my daily routines (e.g. sleep, exercise, pub crawls). Subsequently, the authors’ choice in visualising the group’s physiology may have negatively affected the participants’ experience, and as the authors note, research will be required to assess different presentation formats.
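
As an aside, the heat map idea is simple to reproduce: bin a month of heart rate samples into a day-by-hour grid and routines like sleep and exercise pop out in a way a line graph never managed for me (the data layout below is assumed):

import numpy as np
import matplotlib.pyplot as plt

def plot_hr_heatmap(samples, days=31):
    """samples: iterable of (day_index, hour, bpm) tuples covering one month."""
    total = np.zeros((days, 24))
    count = np.zeros((days, 24))
    for day, hour, bpm in samples:
        total[day, hour] += bpm
        count[day, hour] += 1
    mean = np.where(count > 0, total / np.maximum(count, 1), np.nan)  # blank missing cells
    plt.imshow(mean, aspect="auto", cmap="viridis")
    plt.xlabel("hour of day")
    plt.ylabel("day of month")
    plt.colorbar(label="mean heart rate (bpm)")
    plt.show()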

(Kuikkaniemi, K., Janssen, J.) Calling Safely Through Haptic Biosignal Transfer

Talking on mobile phones while driving, be it hands-free or not, reduces driving safety dramatically. We developed a biosignal transfer system to create an empathic link between driver and caller. Our hypothesis is that this helps the caller to adapt to the driver and make the driving safer. We used electrodermal activity for measuring the driver’s arousal and a haptic vest for transferring biosignal information to the caller. Through a guessing game, we simulated constant and demanding discussion situations. A first pilot study shows that GSR biosignal transfer can quite easily be manipulated to convey information about changes in driving challenge to the caller, but the setup is also vulnerable to other stimuli such as enjoyment elicited by funny conversation.

(Gilleade, K., Fairclough, S.) Physiology as XP – BodyBlogging to Victory

Quantifying how we change over time is a powerful tool, for it allows us to better understand the impact of events and our behavior on our psychological and physiological wellbeing. With understanding we can attempt to manipulate our behaviors so to improve coping strategies and outcomes (e.g. avoid undesirable mental states). As wearable sensors become more ubiquitous we may automate the collection of physiological data on a long term basis. As we collect more information about how our bodies react in different situations we can learn new things about ourselves that were previously concealed. While we may choose to keep this information private, web technologies allow us to share our data with others introducing new means to shape our behavior. We call this process body blogging – the act of logging physiological changes over a period of time or during specific events using web technology.

We’ve recently been exploring applications of body blogging in public spaces [1]. As physiological signals have become more prevalent in computer games (e.g. Wii Vitality, Ubisoft Innergy) and as newer games often feature online connectivity there is potential for games to be used as a means to modify behavior through body blogging. In this paper we introduce the concept of body blogging through our own experiences with the technique and discuss its potential uses. We also discuss how games might be used in conjunction with body blogging to modify behavior and the challenges involved in deploying such systems.

Full citation:

Gilleade, K., Fairclough, S. H., Physiology as XP – Bodyblogging to Victory. BioS-Play Workshop at Fun and Games 2010. Leuven, Belgium.

Evaluating Interactions

(Nacke, L., Mandryk, R.) Designing Affective Games with Physiological Input

With the advent of new game controllers, traditional input mechanisms for games have changed to include gestural interfaces and camera recognition techniques, which are being further explored with the likes of Sony’s PlayStation Move and Microsoft’s Kinect. Soon these techniques will include affective input to control game interaction and mechanics. Thus, it is important to explore which game designs work best with which affective input technologies, giving special regard to direct and indirect methods. In this paper, we discuss some affective measurement techniques and development ideas for using these as control mechanisms for affective game design using psychophysiological input.

Describes the implementation of a range of different physiological game mods in a platformer called Death Trigger. For the win is the “Medusa’s Gaze” mod: via a gaze tracker, players can freeze enemies in place by staring at them for a set period of time. It was interesting to see the authors attempt to categorise physiological inputs according to their level of player control, either direct (e.g. gaze) or indirect (e.g. brainwaves). I’ve always found classifying physiological signals a pain (e.g. is it physical action, electro-physiological, biological, etc.), and sometimes the language we use becomes hazy as we all have a different interpretation of the terminology. Subsequently, taking an application view of a physiological measure, as the authors have done here, is somewhat refreshing, and from a designer’s perspective easier to communicate to others. Though, again, the nature of physiology raises problems with this classification as well (as it tends to do to them all). For example, a temperature sensor is listed as a direct input device as the player can vary temperature by applying pressure or breathing; however, this assumes the sensor is used in an explicit biofeedback interaction and not an implicit one. A fully realised categorisation system for physiological input is probably going to involve a number of factors including the interaction type, signal source, sensing apparatus and so on.
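
A dwell-to-freeze mechanic along the lines of Medusa’s Gaze might look like the following (the dwell threshold is invented; the paper doesn’t specify timings):

import time

FREEZE_AFTER_S = 1.5  # assumed dwell time before an enemy turns to stone

def update_gaze_freeze(enemy, gazed_at, now=None):
    """Track continuous gaze dwell on an enemy (a dict) and freeze it past threshold."""
    now = time.monotonic() if now is None else now
    if gazed_at:
        enemy.setdefault("gaze_start", now)
        if now - enemy["gaze_start"] >= FREEZE_AFTER_S:
            enemy["frozen"] = True
    else:
        enemy.pop("gaze_start", None)  # dwell must be continuous; glancing away resets it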

Workshop Discussion

Being a small research field, a significant number of the established researchers in the area turned up, making the workshop a fantastic forum for discussing physiologically interactive computer games. I had several memorable discussions debating the finer points of affective play with Lennart Nacke and Kai Kuikkaniemi, whom I’ve been eager to meet since reading their earlier work. During the workshop a number of discussion threads came about which I found of particular interest; these are as follows:

  • How do we design meaningful user relationships between a physiological change and the associated effect in the virtual space?

In a physiologically interactive system, changes in a physiological input are made to manipulate the virtual space (e.g. Tetris 64 varied game difficulty according to the player’s heart rate). However, certain relationships between the physical and virtual side of an interaction should theoretically be more usable if they are meaningful for the user (i.e. a natural interaction). For example, in the BCI game modification alpha-WoW, the player’s alpha activity is mapped onto the shape-shifting ability of their World of Warcraft character. During high levels of alpha activity (relaxed), the player takes the form of a serene and mystical night elf. However, during low levels of alpha activity (agitated), the player takes the form of a violent bear. As can be seen, the player’s psychophysiological states are mapped onto similar virtual game states, thereby making the relationship meaningful. Previously I talked about the concept of a meaningful interface for data visualisation (NB. not being an InfoVis guy, I suspect there is an already established term for this); however, I do believe the same principle applies to physiological interaction, and I brought this concept along in our paper as, from my perspective, it formalised what we’re all trying to do.
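
Schematically, the alpha-WoW mapping reduces to something like this (the band edges are textbook values and the thresholds are my own guesses, not the mod’s actual parameters):

import numpy as np

def relative_alpha(eeg_window, fs):
    """Fraction of EEG power in the 8-12 Hz alpha band for one short window."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), 1.0 / fs)
    alpha = spectrum[(freqs >= 8) & (freqs <= 12)].sum()
    return alpha / spectrum[freqs > 0].sum()  # exclude the DC component

def choose_form(rel_alpha, current, high=0.45, low=0.25):
    """High alpha (relaxed) = night elf; low alpha (agitated) = bear.
    The dead zone between the thresholds stops the character flip-flopping."""
    if rel_alpha >= high:
        return "night elf"
    if rel_alpha <= low:
        return "bear"
    return current  # in between: hold the current form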

Given there are no established rules as to what type of relationship provides the optimum level of experience (e.g. usability) outside a traditional biocybernetic interface (e.g. emulating an existing biological system with a computerised copy, such as a missing limb), there is a lot of investigative work going on to see what works and what doesn’t (e.g. see Designing Affective Games with Physiological Input), and so it’ll be interesting to see which interactions are found to be meaningful in the future.

  • What interaction is a physiological input most suited for?

In a BCI, the P300 signal (an event-related potential) and the steady-state visually evoked potential (SSVEP) can be used as a selection input. In a controlled gaming environment either response should be sufficient for a selection task, as any signal noise is kept to a minimum; however, once we bring the signal into a more realistic setting, their application becomes more restricted. For example, SSVEP relies on the user looking at a flashing stimulus which in turn generates the required response in the brain; subsequently, in order to make a selection the user has to wait for their chosen item to flash. While the signal-to-noise ratio is better for classifying the signal, it relies on the designer being creative with the screen space (e.g. to prevent the selection of a flashing object which is nearby the desired one). A better place for this signal might be as part of a pervasive game over a large physical environment where there is little chance of selection error; for example, imagine a BCI geocaching game where instead of collecting physical items you collect SSVEPs via a flashing LED. As BCI devices become more mainstream, we’re going to see a lot more people exploring how the properties of a signal, including its limitations, can be exploited in different gaming environments, and which work best given the circumstances. For further reading I suggest consulting an earlier paper by the authors of Towards Multiplayer BCI Games: Turning shortcomings into challenges: Brain-computer interfaces for games. This issue is also applicable to other forms of physiological input.
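
For flavour, a bare-bones SSVEP selector (the flicker frequencies, bandwidth and window length are all assumed; real systems use sturdier classifiers):

import numpy as np

def ssvep_select(eeg_window, fs, item_freqs=(8.0, 10.0, 12.0)):
    """Each selectable item flickers at its own rate; return the index of the
    flicker frequency with the most EEG power (needs a window of a second or more)."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), 1.0 / fs)
    def power_at(f, bw=0.5):
        return spectrum[(freqs >= f - bw) & (freqs <= f + bw)].sum()
    return int(np.argmax([power_at(f) for f in item_freqs]))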

  • What are the limits of an interaction?

In explicit biofeedback, changes in the biological signal are mirrored by the system. This allows the user to develop an understanding of how their actions change their unseen biological processes. For example, Mattel’s MindFlex uses frontal theta, a measure of mental effort, to control the height of a ball. As the player modifies their thought processes in an effort to move the ball, they develop a rapport between their actions and their reflection in the game. However, if the lag between the physical and virtual action is too great, then the user will not develop an understanding of which thought processes generated the desired behaviour, and subsequently control becomes more a matter of luck than skill. As such this raises the question: at what point does a physiological interaction break down and become unusable?
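
You can see the lag problem even in a toy controller: map theta to ball height through an exponential moving average, and the smoothing constant directly sets how delayed the feedback feels (all numbers invented):

def make_ball_controller(alpha=0.3):
    """Return an update function mapping each new theta reading to ball height.
    alpha near 1 = responsive feedback; alpha near 0 = heavy smoothing and long lag,
    the regime where the player can no longer tell which thoughts moved the ball."""
    state = {"height": 0.0}
    def update(theta_power):
        state["height"] += alpha * (theta_power - state["height"])
        return state["height"]
    return update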

  • How do we evaluate a physiological interface?

This is becoming more problematic as time goes by, but how do we actually evaluate whether or not a physiological adaptation is adding to the player’s gaming experience? In related research papers, when you see the implementation of a biofeedback game it tends to look pretty cool, and players respond to that. However, there is little to no assessment as to whether the input is actually adding something to the gaming experience beyond a short-term novelty (NB. I’ve been guilty of this myself). For physiological adaptation to become a mainstream gaming mechanic it needs to demonstrate it can provide a unique gaming experience, one that you can’t emulate by, say, profiling a player before play and then designing a bespoke level around them (e.g. Silent Hill: Shattered Memories).

To Be Continued

Overall it was pretty fun in Leuven, given I got to talk shop for three days straight, and this time I didn’t lose my plane ticket before I even left the house. It was rather amusing during the workshop finding out I wasn’t the only person monitoring themselves; perhaps next time we all meet I’ll bring the fNIR to add that extra level of spice. In my next BioS-Play post I’ll be talking about a few of the interesting things I saw at the conference, how well my first public talk went with me wearing the body blogging system (hint: not as well as my Quantified Self talk later in the month), and a few of the online research community ideas my fellow BioS-Play peeps and I have come up with and how we’ll be implementing them. If you have any thoughts or comments on my BioS-Play experience, I’ll be happy to hear them in the comments section below.

2 thoughts on “BioS-Play 2010 Workshop Experience Report – Part 1 of 2”


  1. Guillaume

    Thanks Kiel for this post, I am eager to read the second part about your own body blogging system. Is it coming soon?

    It was good to read your comments on the “Shared Physiological Experiences” part. I fully agree with both points, especially the first, which leads to the question “what type of physiological interaction for which interaction”. I would like to stress that in the developed protocol the idea was to display the physiological information in a raw format without giving any interpretation of the signals to the participants. In most systems, a signal is interpreted as a level of stress or an emotional state. I believe that letting the participants construct their own interpretation of the signals is interesting because:
    – most of the detection systems are still making a lot of errors and we can thus question their efficacy,
    – the user can always disagree with the proposed interpretation (independently of the true or false errors), which leads to user frustration,
    – this is closer to what we are doing every day: interpreting the emotional signs of other persons.

    We were thus interested to know if just displaying the information (in our case the heart beats + synch measure) could improve interpersonal communication. However, in this situation, it might be better to use other modalities to represent the signals. For instance, having the heart beats presented as sounds is certainly better than displaying them on a screen, since most people have heard their own or others’ hearts before.

    Thanks again for this post,

    Guillaume

    1. Kiel Gilleade Post author

      Thanks Guillaume,

      The second BioS-Play post will be coming hopefully next week. I’m getting myself EA Sports Active 2.0, and might post on that first. I’ve also been busy organising a follow-up workshop to BioS-Play at CHI, which I hope you’re able to make; see http://brainandbody.physiologicalcomputing.net. Going to have lots of fun debating the finer points of what makes for a meaningful interface for the user in any given situation.

      In the meantime I’ve just written a technical post on The Body Blogger, which you can find here.

      http://www.physiologicalcomputing.net/wordpress/?p=723

      Kind regards,

      – Kiel
