Tag Archives: physiological computing

Forum for the Community for Passive BCI

A quick post to alert people to the first forum for the Community for Passive BCI Research, which takes place from the 16th to the 18th of July at the Hanse Institute for Advanced Study in Delmenhorst, near Bremen, Germany.  This event is being organised by Thorsten Zander from the Berlin Institute of Technology.

The main aim of the forum, in his own words, “is to connect researchers in this young field and to give them a platform to share their motivations and intentions. Therefore, the focus will not be primarily set on the presentation of new scientific results, but on the discussion of current and future directions and the possibilities to shape the community.”


Book Announcement – Advances in Physiological Computing

It was way back in 2011 during our CHI workshop that we first discussed the possibility of putting together an edited collection for Springer on the topic of physiological computing.  It was clear to me at that time that many people associated physiological computing with implicit monitoring as opposed to the active control that characterised BCI.  When we had the opportunity to put together a collection, one idea was to extend the scope of physiological computing to include all technologies where signals from the brain and the body were used as a form of input.  Some may interpret this all-inclusive relabelling of physiological computing as a provocative move.  But we did not take this option as a conceptual ‘land-grab’, but rather as an attempt to be as inclusive as possible and to bring together what I still perceive to be a rather disparate and fractured research community.  After all, we are all using psychophysiology in one form or another and share a common interest in sensor design, interaction mechanics and real-time measurement.

The resulting book is finally close to publication (tentative date: 4th April 2014) and you can follow this link to get the full details.  We’re pleased to have a wide range of contributions on an array of technologies, from eye input to digital memories via mental workload monitoring, implicit interaction, robotics, biofeedback and cultural heritage.  Thanks to all our contributors and the staff at Springer who helped us along the way.

 

Reflections on the first International Conference on Physiological Computing Systems


Last week I attended the first international conference on physiological computing held in Lisbon.  Before commenting on the conference, it should be noted that I was one of the program co-chairs, so I am not completely objective – but as this was something of a watershed event for research in this area, I didn’t want to let the conference pass without comment on the blog.

The conference lasted for two-and-a-half days and included four keynote speakers.  It was a relatively small meeting with respect to the number of delegates – but that is to be expected from a fledgling conference in an area that is somewhat niche with respect to methodology but very broad in terms of potential applications.


What kind of Meaningful Interaction would you like to have? Pt 1

A couple of years ago we organised this CHI workshop on meaningful interaction in physiological computing.  As much as I felt this was an important area for investigation, I also found the topic very hard to get a handle on.  I recently revisited this problem while working on a co-authored book chapter with Kiel for our forthcoming Springer collection entitled ‘Advances in Physiological Computing’, due out next May.

On reflection, much of my difficulty revolved around the complexity of defining meaningful interaction in context.  For systems like BCI or ocular control, where input control is the key function, the meaningfulness of the HCI is self-evident.  If I want an avatar to move forward, I expect my BCI to translate that intention into analogous action at the interface.  But biocybernetic systems, where spontaneous psychophysiology is monitored, analysed and classified, are a different story.  The goal of such a system is to adapt in a timely and appropriate fashion, and evaluating the literal meaning of that kind of interaction is complex for a host of reasons.
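To make that contrast concrete, here is a minimal sketch of the biocybernetic pipeline in Python; every name and threshold below is an invented placeholder rather than a real toolkit API.  Spontaneous psychophysiology is sampled, classified into a coarse state, and the inferred state drives an adaptation; the user never issues an explicit command, which is part of why the ‘meaning’ of any single adaptation is hard to pin down.

    import time

    class UserStateClassifier:
        """Maps physiological features to a coarse psychological state."""
        def classify(self, features):
            # Stands in for a model trained offline on labelled psychophysiological data
            if features["heart_rate"] > 95 and features["skin_conductance"] > 8.0:
                return "high_arousal"
            return "baseline"

    class AdaptiveInterface:
        """The other half of the loop: responds to an inferred state, not to a command."""
        def adapt(self, state):
            if state == "high_arousal":
                print("Adaptation: reduce task demands / simplify display")
            else:
                print("No adaptation required")

    def acquire_window():
        # Placeholder for a buffered read from cardiovascular/electrodermal sensors
        return {"heart_rate": 102.0, "skin_conductance": 9.3}

    classifier, interface = UserStateClassifier(), AdaptiveInterface()
    for _ in range(3):                 # in a real system this loop runs continuously
        state = classifier.classify(acquire_window())
        interface.adapt(state)
        time.sleep(1)                  # the system adapts periodically, not on explicit intent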


Troubleshooting and Mind-Reading: Developing EEG-based interaction with commercial systems

With regards to the development of physiological computing systems, whether they are BCI applications or fall into the category of affective computing, there seem (to me) to be two distinct types of research community at work. The first (and older) community consists of university-based academics, like myself, doing basic research on measures, methods and prototypes with the primary aim of publishing our work in various conferences and journals. For the most part, we are a mixture of psychologists, computer scientists and engineers, many of whom have an interest in human-computer interaction. The second community formed around the availability of commercial EEG peripherals, such as the Emotiv and NeuroSky. Some members of this community are academics and others are developers; I suspect many are dedicated gamers. They are looking to build applications and hacks to embellish interactive experience, with a strong emphasis on commercialisation.

There are many differences between the two groups. My own academic group is ‘old-school’ in many ways, motivated by research issues and defined by the usual hierarchies associated with specialisation and rank. The newer group is more inclusive (the tag-line on the NeuroSky site is “Brain Sensors for Everyone”); they basically want to build stuff and preferably sell it.


CFP – Brain Computer Interfaces Grand Challenge 2012

The field of Physiological Computing consists of systems that use data from the human nervous system as control input to a technological system. Traditionally these systems have been grouped into two categories: those where physiological data is used as a form of input control, and those where spontaneous changes in physiology are used to monitor the psychological state of the user. The field of Brain-Computer Interfacing (BCI) traditionally conceives of the BCI as a controller for interfaces, a device which allows you to “act on” external devices as a form of input control. However, most BCIs do not provide a reliable and efficient means of input control and are difficult to learn and use relative to other available modes.

We propose to change the conceptual use of “BCI as an actor” (input control) into “BCI as an intelligent sensor” (monitor). This shift of emphasis promotes the capacity of BCI to represent spontaneous changes in the state of the user in order to induce intelligent adaptation at the interface. BCIs can increasingly be used as intelligent sensors which “read” passive signals from the nervous system and infer user states to adapt human-computer, human-robot or human-human interaction (HCI, HRI, HHI).

This perspective on BCIs challenges researchers to understand how information about the user state should support different types of interaction dynamics, from supporting the goals and needs of the user to conveying state information to other users. What adaptation to which user state constitutes opportune support? How does the feedback of the changing HCI and human-robot interaction affect brain signals? Many research challenges need to be tackled here.
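To illustrate the proposed shift in emphasis, the fragment below contrasts the two patterns in a deliberately simplified Python sketch; the decoders, thresholds and interface hooks are all hypothetical stand-ins rather than code from any actual BCI system.

    # Contrast between "BCI as actor" (explicit input control) and
    # "BCI as intelligent sensor" (passive monitoring of user state).
    # Everything below is a hypothetical placeholder, not a real BCI API.

    def decode_intent(eeg_window):
        """Stand-in for an active-BCI decoder (e.g. imagined left/right movement)."""
        return "left" if sum(eeg_window) < 0 else "right"

    def estimate_workload(eeg_window):
        """Stand-in for a passive classifier returning P(high mental workload)."""
        return min(1.0, abs(sum(eeg_window)) / 10.0)

    class Cursor:
        def __init__(self):
            self.x = 0
        def move(self, dx):
            self.x += dx

    class Interface:
        def defer_notifications(self):
            print("High workload inferred: holding notifications")
        def carry_on(self):
            print("Workload acceptable: no adaptation")

    def bci_as_actor(eeg_window, cursor):
        # The decoded signal is treated as a deliberate command ("act on" the interface).
        cursor.move(-1 if decode_intent(eeg_window) == "left" else +1)

    def bci_as_intelligent_sensor(eeg_window, interface):
        # The decoded signal is read passively and used to adapt the interaction.
        if estimate_workload(eeg_window) > 0.8:
            interface.defer_notifications()
        else:
            interface.carry_on()

    window = [0.9, 1.2, 3.1, 2.4, 1.8]   # a fake window of band-power features
    cursor, ui = Cursor(), Interface()
    bci_as_actor(window, cursor)
    bci_as_intelligent_sensor(window, ui)

In the first function a decoding error is an input error; in the second the output is only an estimate of user state, so the cost of an imperfect inference is a suboptimal adaptation rather than a failed action.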

Grand Challenge Website


BCI, biocybernetic control and gaming

Way back in 2008, I was due to go to Florence to present at a workshop on affective BCI as part of CHI. In the event, I was ill that morning and missed the trip and the workshop. As I’d prepared the presentation, I made a podcast for sharing with the workshop attendees. I dug it out of the vaults for this post because gaming and physiological computing is such an interesting topic.

The work is dated now, but basically I’m drawing a distinction between my understanding of BCI and biocybernetic adaptation. The former is an alternative means of input control within the HCI; the latter can be used to adapt the nature of the HCI. I also argue that BCI is ideally suited to certain types of game mechanics because it will not work 100% of the time. I used the TV series “Heroes” to illustrate these kinds of mechanics, which I regret in hindsight, because I totally lost all enthusiasm for that show after series 1.
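To give a flavour of the kind of mechanic I mean, here is a toy sketch in Python; the confidence values are simulated and the threshold is invented, so treat it as an illustration of the design idea rather than a working BCI game.

    # Toy example of a game mechanic that tolerates an unreliable BCI:
    # a special ability fires only when the classifier is confident enough,
    # and a failed attempt is part of the game rather than an input error.
    # The confidence values are simulated; no real EEG pipeline is assumed.

    import random

    CONFIDENCE_THRESHOLD = 0.75   # invented value: how sure the classifier must be

    def classifier_confidence():
        """Stand-in for P(player is in the target mental state) from a BCI classifier."""
        return random.random()

    def attempt_special_ability():
        confidence = classifier_confidence()
        if confidence >= CONFIDENCE_THRESHOLD:
            return f"Ability triggered (confidence {confidence:.2f})"
        return f"Nothing happens this time (confidence {confidence:.2f})"

    random.seed(2008)
    for attempt in range(5):
        print(attempt, attempt_special_ability())

Because a failed attempt is an expected outcome of the mechanic, the unreliability of the classifier becomes part of the game rather than a usability defect.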

The original CHI paper for this presentation is available here.

 

Mood and Music: effects of music on driver anger

Last month I gave a presentation at the Annual Meeting of the Human Factors and Ergonomics Society held at Leeds University in the UK.  I stood on the podium and presented the work, but really the people who deserve most of the credit are Marjolein van der Zwaag (from Philips Research Laboratories) and my own PhD student at LJMU, Elena Spiridon.

You can watch a podcast of the talk above.  The work was originally conducted as part of the REFLECT project at the end of 2010 and was inspired by earlier research on affective computing where the system makes an adaptation to alleviate a negative mood state.  The rationale here is that any such adaptation will have beneficial effects in terms of reducing the duration/intensity of negative mood and, in doing so, will mitigate any undesirable effects on the behaviour or health of the person.

Our study was concerned with the level of anger a person might experience on the road.  We know that anger causes ‘load’ on the cardiovascular system as well as undesirable behaviours associated with aggressive driving.  In our study, we subjected participants to a simulated driving task that was designed to make them angry – this is a protocol that we have developed at LJMU.  Marjolein was interested in the effects of different types of music on the cardiovascular system while the person is experiencing a negative mood state; for our study, she created four categories of music that varied in terms of high/low activation and positive/negative valence.

The study does not represent an investigation into a physiological computing system per se, but is rather a validation study to explore whether an adaptation, such as selecting a certain type of music when a person is angry, can have beneficial effects.  We’re working on a journal paper version at the moment.

REFLECT Project Promo Video

Some months ago, I wrote this post about the REFLECT project that we participated in for the last three years.  In short, the REFLECT project was concerned with research and development of three different kinds of biocybernetic loops: (1) detection of emotion, (2) diagnosis of mental workload, and (3) assessment of physical comfort.  Psychophysiological measures were used to assess (1) and (2), whilst physical movement (fidgeting) in a seated position was used for (3).  And all of this was integrated into the ‘cockpit’ of a Ferrari.

The idea behind the emotional loop was to have the music change in response to emotion (to alleviate negative mood states).  The cognitive loop would block incoming calls if the driver was in a state of high mental workload, and in the comfort loop, air-filled bladders in the seat would adjust to promote physical comfort.  You can read all about the project here.  Above you’ll find a promotional video that I’ve only just discovered – the reason for my delayed response in posting this is probably vanity: the filming was over before I got to the Ferrari site in Maranello.  The upside of my absence is that you can watch the much more articulate and handsome Dick de Waard explain the cognitive loop in the film, which was our main involvement in the project.
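For readers curious how the three loops described above fit together, here is a simplified sketch of that architecture in Python; the state estimates, thresholds and adaptations are my own placeholders and not code from the REFLECT project itself.

    # Simplified sketch of the three REFLECT biocybernetic loops described above.
    # The driver-state values and thresholds are invented for illustration.

    def emotional_loop(valence):
        # Music selection changes to alleviate a negative mood state
        return "select mood-repairing music" if valence < 0 else "keep current playlist"

    def cognitive_loop(workload):
        # Incoming calls are blocked when mental workload is high
        return "block incoming calls" if workload > 0.7 else "allow calls"

    def comfort_loop(fidgeting_rate):
        # Air-filled bladders in the seat adjust when seated movement suggests discomfort
        return "adjust seat bladders" if fidgeting_rate > 0.5 else "leave seat as is"

    # One pass of the combined system with example state estimates
    driver_state = {"valence": -0.4, "workload": 0.85, "fidgeting_rate": 0.2}
    print(emotional_loop(driver_state["valence"]))
    print(cognitive_loop(driver_state["workload"]))
    print(comfort_loop(driver_state["fidgeting_rate"]))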