Special Issue Editors
- Hugo Gamboa (Universidade Nova de Lisboa, Portugal)
- Hugo Plácido da Silva (IT – Institute of Telecommunications, Portugal)
- Kiel Gilleade (Liverpool John Moores University, United Kingdom)
- Sergi Bermúdez i Badia (Universidade da Madeira, Portugal)
- Stephen Fairclough (Liverpool John Moores University, United Kingdom)
Deadline for Submissions: 30 June 2014
Physiological data provide a wealth of information about the behavioural state of the user. These data can supply important contextual information by allowing the system to draw inferences about the affective, cognitive and physical state of a person. In a computerised system this information can serve as a control input to drive system adaptation. For example, a videogame can use psychophysiological inferences about the player's level of mental workload during play to adjust game difficulty in real time.
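The videogame example above describes a negative-feedback loop: infer workload from physiological measures, then steer difficulty back towards a comfortable band. A minimal sketch of such a loop is below; the function names, the heart-rate/HRV workload index and all thresholds are illustrative assumptions for this post, not any published system.

```python
# Hypothetical biocybernetic adaptation loop: all names, formulas and
# thresholds are illustrative assumptions, not from a real system.

def infer_workload(heart_rate, hrv):
    """Toy workload index on a 0..1 scale: higher heart rate combined
    with lower heart-rate variability is treated as higher workload."""
    hr_component = min(max((heart_rate - 60) / 60.0, 0.0), 1.0)
    hrv_component = min(max((80 - hrv) / 80.0, 0.0), 1.0)
    return 0.5 * hr_component + 0.5 * hrv_component

def adapt_difficulty(difficulty, workload, low=0.3, high=0.7, step=1):
    """Negative-feedback rule: ease off when inferred workload is high,
    ramp up when the player appears under-challenged."""
    if workload > high:
        return max(difficulty - step, 1)
    if workload < low:
        return difficulty + step
    return difficulty

# Run the loop over a few simulated (heart_rate, hrv) readings.
difficulty = 5
for hr, hrv in [(110, 15), (70, 70), (62, 95)]:
    workload = infer_workload(hr, hrv)
    difficulty = adapt_difficulty(difficulty, workload)
```

In a real system the workload index would come from validated psychophysiological measures rather than this toy formula, but the closed-loop structure (measure, infer, adapt) is the same.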
The field of Physiological Computing encompasses systems that use data from the human nervous system as control input to a technological system. Traditionally these systems fall into two categories: those where physiological data are used as a form of input control, and those where spontaneous changes in physiology are used to monitor the psychological state of the user.

The field of Brain-Computer Interfacing (BCI) has traditionally conceived of the BCI as a controller for interfaces, a device that allows you to "act on" external devices as a form of input control. However, most BCIs do not provide a reliable and efficient means of input control, and they are difficult to learn and use relative to other available modalities. We propose to change the conceptual use of "BCI as an actor" (input control) into "BCI as an intelligent sensor" (monitor). This shift of emphasis promotes the capacity of BCI to represent spontaneous changes in the state of the user in order to induce intelligent adaptation at the interface. Increasingly, BCIs can be used as intelligent sensors that "read" passive signals from the nervous system and infer user states to adapt human-computer, human-robot or human-human interaction (HCI, HRI, HHI).

This perspective on BCIs challenges researchers to understand how information about the user state should support different types of interaction dynamics, from supporting the goals and needs of the user to conveying state information to other users. What adaptation to which user state constitutes opportune support? How does feedback from the adapting interface or robot in turn affect brain signals? Many research challenges need to be tackled here.
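The "intelligent sensor" view can be sketched as a short passive pipeline: a signal-derived feature is mapped to an inferred user state, and the application decides how to react. The threshold, state labels and adaptation events below are illustrative assumptions, not part of any published BCI system.

```python
# Toy sketch of "BCI as an intelligent sensor": map a (simulated) EEG
# alpha-band power value to an inferred user state, then translate that
# state into an adaptation event the interaction layer could act on.
# Threshold, labels and events are hypothetical.

def classify_state(alpha_power):
    """Higher alpha-band power is commonly associated with relaxed
    disengagement; lower alpha power with active task engagement."""
    return "relaxed" if alpha_power > 0.6 else "engaged"

def adapt_interaction(state):
    """Translate the inferred state into an adaptation event for an
    HCI, HRI or HHI layer."""
    events = {
        "relaxed": "raise_task_demand",  # try to re-engage the user
        "engaged": "maintain_pacing",    # leave the interaction alone
    }
    return events[state]

event = adapt_interaction(classify_state(0.8))
```

Note that the BCI here never issues a command; it only reports state, and the decision about which adaptation constitutes opportune support lives entirely in the application.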
Grand Challenge Website
The deadline for submissions to this special session has been extended to 20th May.
Anton Nijholt (University of Twente) and Rob Jacob (Tufts University) are organising a special session at ICMI 2011 on "BCI and Multimodality". All ICMI sessions, including the special sessions, are plenary. Hence, holding a special session during the ICMI conference offers the opportunity to address a broad audience and make them aware of new developments and special topics. Clearly, if we look at BCI for non-medical applications, a multimodal approach is natural. We can make use of knowledge about user, task and context. Part of this information is available in advance; the rest becomes available online, in addition to EEG- or fNIRS-measured brain activity. The intended user is not disabled: he or she can use other modalities to pass commands and preferences to the system, and the system may also have information obtained from monitoring the mental state of the user. Moreover, different BCI paradigms may be employed in parallel or sequentially in multimodal (or hybrid) BCI applications.
Workshop at ACII 2011
The second workshop on affective brain-computer interfaces will explore the advantages and limitations of using neuro-physiological signals as a modality for the automatic recognition of affective and cognitive states, and the possibilities of using this information about the user state in innovative and adaptive applications. The goal is to bring researchers from the communities of brain computer interfacing, affective computing, neuro-ergonomics, affective and cognitive neuroscience together to present state-of-the-art progress and visions on the various overlaps between those disciplines.
Just a quick note: I'll be doing an update on the workshop later in the week (i.e. after I recover from the jetlag I'm hesitantly awaiting as I board my flight home). In the meantime, check out the following workshop on augmented social interaction.
Workshop at ACII 2011
Augmenting Social Interaction through Affective Computing is the first workshop on affective computing that specifically aims to improve or enhance social interaction among humans. Social interactions, whether mediated or face-to-face, can benefit significantly from advances in affective computing and social signal processing. Example application areas include mental healthcare, training and coaching, negotiation, and close intimate interactions. To address this topic, we invite submissions on the relation between human social interaction and affective computing technologies.
A late addition to the conference list is BIOSIGNALS 2010, the 3rd International Conference on Bio-Inspired Systems and Signal Processing, to be held in Valencia in January 2010. The conference includes sessions on signal processing, wearable sensors and user interfaces. Full details here
A workshop entitled "Brain, Body and Bytes" has been organised as part of CHI 2010 in Atlanta. Details are here. The same organisers have also set up a Facebook group.
The workshop on affective computing and BCI in Amsterdam this September has extended its deadline for all papers to 22nd June. Website for the workshop here
The European Future Tech conference has the catchy title “Science Beyond Fiction” and is organised by the Future & Emerging Technologies (FET) division of the European Commission. I’m involved in the REFLECT project and we’re doing a conference session about our work on 22nd April.
The PERADA project has asked me to talk about Biocybernetic Adaptation as part of a half-day Pervasive Computing workshop at the AISB conference in Edinburgh.