How should real-time analysis be defined in physiological computing?

Admin: Please welcome to the site our new Physiological Computing blogger.

Greetings, my name is Alexander Karran and I am a PhD student at the Liverpool John Moores University School of Natural Sciences and Psychology, under the supervision of Dr Stephen Fairclough. I will be working towards the research goal of developing a framework for the real-time classification of dimensions of affect, using ambulatory body sensor networks under the umbrella of psychophysiology. This work will be multidisciplinary in nature, borrowing from, and contributing to, computer science, e-health, psychology and affective computing.

My background is in computer science [1],[2], and I will be using this experience to create an ambulatory sensor network framework and algorithms for the real-time analysis and categorisation of affective physiological data in response to artistic stimuli. The aim is to use these algorithms in a real-world context, for the ArtSense project, which will use their output to inform and drive a physiological "adaptation" engine. This engine will provide a museum patron with a richer, more immersive experience of exhibits that have been enhanced with Augmented Reality content.

The challenge is to create algorithms and methods capable of analysing and categorising the minute changes in physiological signals with which the human body responds to stimuli, in such a way as to indicate a level of interest or engagement with those stimuli. The greater the level of interest, the less information is provided via augmented reality content "pushes". A moderate level of interest should spur the adaptation engine to provide more information. Low interest may indicate boredom, and could trigger bursts of information content intended to elicit a rise in the magnitude of physiological signal changes, indicating greater interest and providing a pathway for adaptive content responses.
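The interest-to-content mapping described above could be sketched as a simple decision rule. This is a minimal illustration only: the function name, the normalised interest score and the threshold values are my own placeholders, not part of the ArtSense adaptation engine.

```python
def select_content_push(interest: float) -> str:
    """Map a normalised interest score (0.0-1.0) to a content strategy.

    Thresholds are illustrative placeholders, not project values.
    """
    if interest >= 0.7:
        # High engagement: push less, avoid interrupting the experience
        return "minimal"
    elif interest >= 0.4:
        # Moderate interest: provide more information
        return "enriched"
    else:
        # Low interest / possible boredom: burst of content to
        # provoke a measurable physiological response
        return "burst"
```

In practice the interest score itself would be the output of the real-time classification algorithms discussed in this post, updated continuously from the body sensor network.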


One of the biggest challenges I have encountered thus far in pursuit of the research goal lies in how to define "real-time" analysis. From a computer science perspective, real-time is measured in processor clock cycles or, at worst, in the 1-10 millisecond range. In my review of the literature, however, I have found that "real-time" in the psychophysiological sense can range anywhere from 5-10 minutes to two weeks (in the case of pre-processed data). This massive disparity in definitions stems from the data itself. In computer science, data is clean, raw and digital, and is analysed and processed "as is" with little or no lag. Psychophysiological sensor data, by contrast, passes through more than a few bottlenecks, ranging from sensor clocks (the time it takes for a signal to aggregate sufficiently to be passed on for recording) to the human physiological response itself (a case of "perceptual" real-time): responses to stimuli must be processed by the brain and then cascade throughout the nervous system before producing a change of sufficient magnitude to be recorded.
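A rough latency budget makes the gap between the two definitions concrete. The stage names and millisecond figures below are hypothetical, chosen only to show how the bottlenecks accumulate; they are not measurements from this project.

```python
# Illustrative end-to-end latency budget for "real-time" affect
# classification. All figures are hypothetical placeholders.
latency_ms = {
    "physiological_response": 1000,  # stimulus -> measurable signal change
    "sensor_aggregation": 250,       # signal buffered before recording
    "wireless_transmission": 50,     # hop across the body sensor network
    "feature_extraction": 200,       # windowing, filtering, features
    "classification": 10,            # inference by a trained model
}

total_ms = sum(latency_ms.values())
print(f"total pipeline latency: {total_ms} ms")
```

Even with generous assumptions, the physiological response dominates the budget, so a definition of real-time anchored to processor clock cycles cannot apply unchanged in a physiological computing context.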

These bottlenecks already stretch the boundaries of a traditional definition of real-time. A further research challenge will therefore be to evaluate the most effective software and hardware approaches to speeding up the processing of sensor data, so that output information is available quickly enough for meaningful "real-time" interaction, and to create a workable model that defines real-time for use in a physiological computing context.

The design of an adaptive affective computer system thus becomes a design problem demarcated by at least three definitions of real-time, taken from different fields: computer science (processing real-time), psychophysiology (human response real-time) and, most importantly from the standpoint of a user of such a system, cognitive science (perceptual real-time). I have some thoughts on how to tackle this problem of definition, which I will outline in future posts; for now, I will part with a nod to a paper by E. L. van den Broek [3], who is also attempting to confront this problem.

[1] Haggerty, J., Karran, A.J., Lamb, D.J. & Taylor, M.J., “A Framework for the Forensic Investigation of Unstructured Email Relationship Data”, International Journal of Digital Crime and Forensics, 3 (3), pp. 1 – 18, 2011.

[2] Karran, A., Haggerty, J., Lamb, D., Taylor, M. & Llewellyn-Jones, D., “A Social Network Discovery Model for Digital Forensics Investigations”, Proceedings of the 6th Annual Workshop on Digital Forensics and Incident Analysis (WDFIA 11), Kingston, UK, 7 – 8 July, pp. 160-170, 2011.

[3] Broek, E.L. van den, Lisý, V., Janssen, J.H., Westerink, J.D.H.M., Schut, M.H. & Tuinenbreijer, K., "Affective Man-Machine Interface: Unveiling human emotions through biosignals", In A. Fred, J. Filipe & H. Gamboa (Eds.), BioMedical Engineering Systems and Technologies (Communications in Computer and Information Science, Vol. 52), pp. 21-47, Springer-Verlag, Berlin/Heidelberg, 2010.