In the shadow of the polygraph

I was reading a short article in The Guardian today about the failure of polygraph technologies (including fMRI versions and voice analysis) to deliver data robust enough to be admissible as evidence in court.  Several points made in the article prompted the thought that the development of physiological computing technologies lives, to some extent, in the shadow of the polygraph.

Think about it.  Both the polygraph and physiological computing aim to transform personal, private experience into quantifiable data that can be observed and assessed.  Both capture unconscious physiological changes that may signify hidden psychological motives and agendas, subconscious or otherwise – and of course, both involve the attachment of sensor apparatus.  This convergence means that both are notoriously difficult to validate (hence the problems with polygraph evidence in court) – and that seems true whether we’re talking about the use of the P300 for “brain fingerprinting” or the use of ECG and respiration to capture a specific category of emotion.

Whenever I give a presentation about physiological computing, I can almost sense antipathy to the concept from some members of the audience, because the first thing people think of is the polygraph, and the thoughts that logically follow are concerns about privacy, misuse and spying.  To counter these fears, I point out that physiological computing, whether it’s a game, a means of adapting a software agent or a brain-computer interface, has been developed for very different purposes.  This technology is intended for personal use; it’s about control for the individual in the broadest sense, e.g. to control a cursor, to promote reflection and self-regulation, to make software reactive, personalised and smarter, and to ensure that the data protection rights of the individual are preserved – especially if they wish to share their data with others.

But everyone knows that any signal that can be measured can be hacked, so even capturing these kinds of physiological data opens the door to spying and other profound invasions of privacy.

Which takes us, inevitably, back into the shadow of the polygraph.

I’m sure attitudes will change if the right piece of technology comes along that demonstrates the upside of physiological computing.  But if early systems don’t take data privacy seriously, as in very seriously, the public could go cold on the concept before these systems have had a chance to prove themselves in the marketplace.

For musings on a similar theme, see my previous post Designing for the Gullible.
