Tag Archives: body blogs

Physiological Computing: increased self-awareness or the fast track to a divided ego?

In last week’s excellent Bad Science article in The Guardian, Ben Goldacre puts his finger on a topic that I think is particularly relevant for physiological computing systems.  He quotes press reports about MRI research into “hypoactive sexual desire disorder” – no, I hadn’t heard of it either; it’s a condition where the person has a low libido.  In this study, women with the condition and ‘normals’ viewed erotic imagery in the scanner.  A full article on the study from the Mail can be found here, but what caught the attention of Bad Science is this interesting quote from one of the researchers involved: “Being able to identify physiological changes, to me provides significant evidence that it’s a true disorder as opposed to a societal construct.”


Mobile Monitors and Apps for Physiological Computing

I always harbored two assumptions about the development of physiological computing systems that have only become apparent (to me at least) as technological innovation seems to contradict them.  First of all, I thought nascent forms of physiological computing systems would be developed for desktop systems, where the user stays in a stationary and more-or-less sedentary position, thus minimising the probability of movement artifacts.  Secondly, I assumed that physiological computing devices would only ever be achieved as coordinated, holistic systems.  In other words, specific sensors linked to a dedicated controller that provides input to adaptive software, all designed as a seamless chain of information flow.


Making Data Meaningful

The problem with collecting any amount of data is figuring out how to present that data in a manner that is meaningful for its intended audience. For example, if you want to assess the physical effort you exert during a run, a plot of physiological activity (e.g. heartbeat rate) against time will provide you with a relatively simple visual representation of how your body adapts to physical stress.
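For anyone who wants to try this, here is a minimal sketch of such a plot in Python, assuming the run has already been logged to a hypothetical CSV file ("run.csv") with "seconds" and "bpm" columns; matplotlib does the drawing. It is an illustration of the idea, not a description of any particular product.

```python
# Minimal sketch: plot heartbeat rate against elapsed time for a single run.
# The file name and column names are assumptions for illustration only.
import csv
import matplotlib.pyplot as plt

minutes, bpm = [], []
with open("run.csv") as f:
    for row in csv.DictReader(f):
        minutes.append(float(row["seconds"]) / 60.0)  # elapsed time in minutes
        bpm.append(float(row["bpm"]))                 # heartbeat-rate sample

plt.plot(minutes, bpm)
plt.xlabel("Time (minutes)")
plt.ylabel("Heartbeat rate (bpm)")
plt.title("Physical effort during a run")
plt.show()
```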


The Moody Web

We’re sporting a new look this month here at Physiological Computing, several in fact, as we’ve turned the web interface into an online mood ring. Using the online heartbeat rate of our body blogger (read more about the BodyBlogger here), the colour scheme of the site is set according to the user’s current physiological state.

Currently four colour schemes are supported, plus an Offline scheme for when no heartbeat data is coming in:

  • Offline
  • Relaxed
  • Normal
  • Elevated
  • Burning

Each of the four physiological schemes is mapped onto a heartbeat-rate range as follows:

  • Relaxed: less than 60 beats per minute (bpm)
  • Normal: 60 to 80 bpm
  • Elevated: 80 to 100 bpm
  • Burning: More than 100 bpm

These ranges and their implied states have been configured for our current body blogger, who transitions through them on a daily basis (e.g. Burning while running).
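For the curious, here is a minimal sketch of how such a mapping might look in code; the function name, the rule that a missing reading means Offline, and the exact handling of the 80 and 100 bpm boundaries are my own assumptions rather than a description of the site’s actual implementation.

```python
# Sketch of the bpm-to-colour-scheme mapping described above.
# Boundary handling and the "no reading means Offline" rule are assumptions.

def scheme_for(bpm):
    """Return the colour scheme for the latest heartbeat-rate reading."""
    if bpm is None:      # no recent data from the body blogger
        return "Offline"
    if bpm < 60:
        return "Relaxed"
    if bpm <= 80:
        return "Normal"
    if bpm <= 100:
        return "Elevated"
    return "Burning"

print(scheme_for(72))   # Normal
print(scheme_for(115))  # Burning (e.g. out running)
```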

I’ll be posting more about the Moody Web later this week; for the time being, enjoy our take on adding a touch of the personal to the web.

Better living through affective computing

I recently read a paper by Rosalind Picard entitled “Emotion research for the people, by the people.”  In this article, Prof. Picard has some fun contrasting engineering and psychological perspectives on the measurement of emotion.  Perhaps I’m being defensive, but she seemed to have more fun poking at the psychologists than the engineers.  The central impasse that she identifies goes something like this: engineers develop sensor apparatus that can deliver a whole range of objective data, whilst psychologists have decades of experience with theoretical concepts related to emotion, so why haven’t people really benefited from their union through the field of affective computing?  Prof. Picard correctly identifies a reluctance on the part of psychologists to define concepts with sufficient precision to aid the work of the engineers.  What I felt was glossed over in the paper was the other side of the problem, namely the willingness of engineers to attach emotional labels to almost any piece of psychophysiological data, usually in the context of badly-designed experiments (apologies to any engineers reading this, but I wanted to add a little balance to the debate).

iBrain

I just watched a TEDMED talk about the iBrain device via this link on the excellent Medgadget resource.  The iBrain records a single channel of EEG via ‘dry’ electrodes, with the data stored on a conventional handheld device such as a cellphone.  In my opinion, the clever part of this technology is the application of mathematics to wring detailed information out of a limited data set – it’s a very efficient strategy.

The hardware looks to be fairly standard – a wireless EEG link to a mobile device.  But its simplicity provides an indication of where this kind of physiological computing application could be going in the future – mobile monitoring for early detection of medical problems piggy-backing onto conventional technology.  If physiological computing applications become widespread, this kind of proactive medical monitoring could become standard.  And the main barrier to that is non-intrusive, non-medicalised sensor development.

In the meantime, Neurovigil, the company behind the product, recently announced a partnership with the Swiss pharmaceutical giant Roche, who want to apply this technology to clinical drug trials.  I guess the methodology encourages the drug companies to consider covert changes in physiology as a sensitive marker of drug efficacy or side-effects.

I like the simplicity of the iBrain (one channel of EEG), but the speaker makes some big claims for the analysis, the implicit ones dealing with the potential of EEG to identify neuropathologies.  That may be possible, but I’m sceptical about whether one channel is sufficient.  The company have obviously applied their pared-down analysis to sleep stages with some success, but I was left wondering what added value the device provides compared to less-intrusive movement sensors used to analyse sleep behaviour, e.g. the Actiwatch.

Pre-E3: Thoughts on the Wii Vitality

With only hours left until Nintendo’s E3 press event, I’ve once again been pondering what we’ll see of the Wii Vitality. At last year’s E3 the device’s announcement didn’t exactly wow the audience. That’s not surprising, as Nintendo didn’t provide a demonstration of the device, which might have brought gamers round to the concept.  Nintendo have remained tight-lipped ever since, revealing absolutely nothing about what we might expect from the device.  Over the past year it has been suggested that the Vitality will be used to monitor the scare factor in games, alter game difficulty, and act as an input for relaxation and exercise games, as well as for lie detection, of which I discussed one particular method of implementation here.

Implementing these types of games is indeed possible using physiological measures, and you can see versions of them in the biofeedback and academic game communities, for example:

  1. The Journey to Wild Divine: a series of relaxation mini-games controlled using heartbeat rate and skin conductance.
  2. A Fitness Game Reflecting Heart Rate: a boxing game which adapts the gameplay in order to move you towards a target heartbeat rate. Enemy characters each require a different physical movement to destroy them; depending on the player’s current heartbeat rate and the target, an appropriate enemy is selected.
  3. Fairies: a target acquisition game which alters the player’s perception of the game’s difficulty according to the player’s level of arousal, as denoted by heartbeat rate.

There are many ways you can use physiological data in gameplay; it’s all a question of how you make the input meaningful for the player. For example, if the player’s relaxation level is used to switch between character states, those states have to be somewhat representative of the change: if the player is controlling an avatar with a pyrotechnic ability, a shift from a relaxed state to an agitated state could be used to trigger their fire ability, and vice versa. That would be a meaningful relationship.
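As a concrete illustration of that pyrotechnic example, here is a rough sketch in Python; the Avatar class, the 90 bpm threshold and the ability names are all hypothetical. The point is simply that the state transition, not the raw number, drives the game event.

```python
# Sketch: a shift from a relaxed to an agitated state triggers the avatar's
# fire ability, and calming down extinguishes it. Thresholds are assumptions.

class Avatar:
    def __init__(self):
        self.state = "relaxed"

    def update(self, heartbeat_rate):
        new_state = "agitated" if heartbeat_rate > 90 else "relaxed"
        if self.state == "relaxed" and new_state == "agitated":
            self.ignite()       # agitation lights the pyrotechnics
        elif self.state == "agitated" and new_state == "relaxed":
            self.extinguish()   # relaxing puts them out
        self.state = new_state

    def ignite(self):
        print("Fire ability on")

    def extinguish(self):
        print("Fire ability off")

avatar = Avatar()
for bpm in (70, 95, 97, 75):    # simulated heartbeat-rate samples
    avatar.update(bpm)
```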

The problem I have with the Vitality is in their choice of sensor: a finger-based pulse oximeter.  A pulse oximeter uses infra-red light to track changes in the volume of blood in the extremities and from this derives heartbeat rate (a rough sketch of that derivation follows the list below). If you want to support a wide selection of different play styles (e.g. relaxation, exercise, affective), a finger-based sensor would not have been my first choice, considering the following issues:

  1. A finger-based sensor limits the player’s freedom of movement. Any physical activity will shift the sensor from its recording position or may even disconnect it, both of which will create data errors. Depending on how responsive the game is to the player’s physiology, this can easily lead to erroneous game behaviour.  It will also limit how the data can be used in a given context (e.g. a game responsive to emotional physiological responses is not suited to a game involving gestures, an issue that is especially relevant for the Wii, as the system sells itself on physical interaction as the standard input method).  In addition, the physical attachment of the sensor to the player’s finger restricts movement, so physical actions may become uncomfortable (e.g. imagine playing Red Steel with a cable attached to your finger). This is not to say games involving physical actions will be taboo with the Vitality (e.g. a calorie counter in an exercise game); it just makes them harder to implement.
  2. With a finger-based sensor, use of the Nunchuk as a secondary controller is liable to become awkward, possibly eliminating it as an input device.
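For the curious, the sketch promised above: one crude way to derive heartbeat rate from a pulse-oximeter (PPG) waveform is simply to count blood-volume peaks over a known time window. The synthetic signal, the sampling rate and the 150 bpm ceiling used to space the peaks are all assumptions for illustration, not a description of how the Vitality works.

```python
# Rough sketch: estimate heartbeat rate from a PPG signal by peak counting.
import numpy as np
from scipy.signal import find_peaks

fs = 100                                    # samples per second (assumed)
t = np.arange(0, 10, 1 / fs)                # ten seconds of data
ppg = np.sin(2 * np.pi * 1.2 * t)           # fake pulse wave at ~72 bpm
ppg += 0.1 * np.random.randn(t.size)        # a little measurement noise

# Require peaks to be at least 0.4 s apart (i.e. below 150 bpm).
peaks, _ = find_peaks(ppg, distance=0.4 * fs)
heartbeat_rate = len(peaks) / (t[-1] - t[0]) * 60
print(f"Estimated heartbeat rate: {heartbeat_rate:.0f} bpm")
```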

Ideally Nintendo would have gone with either an earlobe-based pulse oximeter, thereby freeing up the hands (though physical actions would still have to be limited, as that sensor is not the most secure under intense movement), or better still a chest strap*. A chest-strap sensor provides the most secure method of measuring a player’s heartbeat rate, as the centre of the body is pretty stable under movement; I can vouch for this personally, having worn one for the last several months collecting data.

At this point these issues are pretty much moot (more like irritations in my noggin I can’t dispel), as I suspect Nintendo will launch the system with a series of relaxation games, which the Vitality is clearly geared for**. Or perhaps a lie-detection game, as I’ve talked about before.

* The problem inherent in using a chest strap is how the player may perceive it as invading their personal space.  The chest strap is an up-close-and-personal wearable device, and, given the new wireless heart-rate monitor EA Sports Active 2.0 is using (an armband-based monitor), I imagine their development staff thought so too. The finger attachment does not invade the player’s space, so there is no unease in wearing the device.  Having just seen EA’s E3 press conference, the Vitality is already looking obsolete.

** The sensor used by The Journey to Wild Divine has been used in a multitude of different game genres, my favourite being the Half-Life 2 mod Please Don’t Feed the BioZombies. However, that sensor is used in conjunction with a mouse and keyboard, and this setup doesn’t suffer from player movement to the same degree the Vitality will, given that those input devices don’t require much movement. Also, unlike the Vitality, a mouse and keyboard are placed on a flat surface, providing support for the hand the sensor is attached to.

life logging + body blogging

This article in New Scientist prompts a short follow-up to my posts on body-blogging. The article describes a camera worn around the neck that takes a photograph every 30 seconds. The potential for this device to help people suffering from dementia and related problems is huge. At perhaps a more trivial level, the camera would be a useful addition to wearable physiological sensors (see previous posts on quantifying the self). If physiological data could be captured and averaged over 30-second intervals, these data could be paired with a still image and presented as a visual timeline. This would save the body blogger from having to manually tag everything; the image also provides a nice visual prompt for recall, and the person can speculate on how their location/activity/interactions caused changes in the body. Of course, it would also work as a great tool for research, particularly for stress research in the field.
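A minimal sketch of that pairing, assuming hypothetical timestamped heartbeat-rate samples and camera timestamps: average the physiology over 30-second windows, then attach each window to the photograph taken closest to its midpoint.

```python
# Sketch: pair 30-second averages of heartbeat rate with the nearest photo.
# All data here are synthetic; file names are placeholders.
from statistics import mean

hr_samples = [(t, 70 + (t % 45) / 3) for t in range(0, 300, 5)]  # (seconds, bpm)
photo_times = list(range(0, 300, 30))                            # camera fires every 30 s

timeline = []
for start in range(0, 300, 30):
    window = [bpm for t, bpm in hr_samples if start <= t < start + 30]
    nearest = min(photo_times, key=lambda p: abs(p - (start + 15)))
    timeline.append((start, mean(window), f"photo_{nearest:04d}.jpg"))

for start, avg_bpm, photo in timeline:
    print(f"{start:>4}s  {avg_bpm:5.1f} bpm  {photo}")
```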

quantifying the self (again)

I just watched this cool presentation about blogging self-report data on mood/lifestyle and looking at the relationship with health. My interest in this topic is tied up in the concept of body-blogging (i.e. recording physiological data using ambulatory systems) – see earlier post. What’s nice about the idea of body-blogging is that it’s implicit and doesn’t require you to do anything extra, such as completing mood ratings or other self-reports. The fairly major downside to this approach comes in two varieties: (1) the technology to do it easily is still fairly expensive and associated software is cumbersome to use (not that it’s bad software, it’s just designed for medical or research purposes), and (2) continuous physiology generates a huge amount of data.

For the individual, this concept of self-tracking and self-quantifying is linked to increased self-awareness (learning how your body is influenced by everyday events), and with self-awareness come new strategies for self-regulation to minimise negative or harmful changes. My feeling is that there are certain times in our lives (e.g. following a serious illness or medical procedure) when we have a strong motivation to quantify and monitor our physiological patterns. However, I see a risk of that strategy tipping a person over into hypochondria if they feel particularly vulnerable.

At the level of the group, it’s fascinating to see the seeds of a crowdsourcing idea in the above presentation: people self-log over a period and share this information anonymously on the web. This activity creates a database that everyone can access and analyse, participants and researchers alike. I wonder if people would be as comfortable sharing heart rate or blood pressure data – provided it was submitted anonymously, I don’t see why not.

There’s enormous potential here for wearable physiological sensors to be combined with self-reported logging, and for both data sets to be brought together online. Obviously there is a fidelity mismatch: physiological data can be recorded in milliseconds, whilst self-report data is recorded in hours. But some clever software could be constructed to aggregate the physiology and put both data sets on the same time frame. The benefit of doing this, for both researcher and participant, is to explore the connections between (previously) unseen patterns of physiological response and the experience of the individual/group/population.
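To make that aggregation concrete, here is a small sketch using pandas; the once-per-second sampling rate, the hourly mood ratings and the column names are all assumptions, but it shows how the two time frames could be reconciled.

```python
# Sketch: resample high-frequency heartbeat-rate data to hourly means so it
# lines up with hourly self-report ratings. All data here are synthetic.
import numpy as np
import pandas as pd

idx = pd.date_range("2010-06-14", periods=24 * 3600, freq="s")
physiology = pd.Series(70 + 10 * np.random.randn(idx.size), index=idx, name="bpm")

mood_idx = pd.date_range("2010-06-14", periods=24, freq="h")
mood = pd.Series(np.random.randint(1, 8, size=24), index=mood_idx, name="mood")

hourly_bpm = physiology.resample("h").mean()        # put physiology on the hourly frame
combined = pd.concat([hourly_bpm, mood], axis=1)    # both data sets, same time frame
print(combined.head())
print(combined.corr())                              # crude look at the relationship
```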

For anyone who’s interested, here’s a link to another blog site containing a report from an event that focused on self-tracking technologies.