Tag Archives: lie-detection

The Epoc and Your Next Job Interview


Imagine you are waiting to be interviewed for a job that you really want.  You’d probably be nervous, fingers drumming the table, eyes darting restlessly around the room.  The door opens and a man appears, wearing a lab coat and holding an EEG headset in both hands.  He places the set on your head and says “Your interview starts now.”

This Philip K. Dick scenario became reality for intern applicants at the offices of TBWA, an advertising firm based in Istanbul.  And thankfully a camera was present to capture this WTF moment for each candidate, so the video could be uploaded to Vimeo.

The rationale for the exercise is quite clear.  The company want to appoint people who are passionate about advertising, so, working with a consultancy, they devised a test where candidates watch a series of acclaimed ads while the Epoc measures their levels of ‘passion’, ‘love’ and ‘excitement’ in a scientific and numeric way.  Those who exhibit the greatest passion for adverts get the job (that is the narrative of the video, at least; in reality one suspects/hopes they were interviewed as well).
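
Emotiv’s detection algorithms are proprietary, so we can only guess at what “measuring excitement in a numeric way” involves here; but one well-known approach in the EEG literature is to compute an engagement index from band powers, beta / (alpha + theta).  A minimal sketch in Python, where the function and every value are hypothetical illustrations rather than Emotiv’s actual method:

```python
def engagement_index(theta: float, alpha: float, beta: float) -> float:
    """One engagement index from the EEG literature: beta band power
    relative to alpha + theta. Higher values are typically read as
    greater engagement/arousal. Band powers are assumed to have been
    extracted from the raw EEG signal beforehand."""
    return beta / (alpha + theta)

# Hypothetical band-power values for two candidates watching the same ad
print(engagement_index(theta=4.2, alpha=6.1, beta=8.4))  # ~0.82 ("passionate"?)
print(engagement_index(theta=5.0, alpha=7.5, beta=3.9))  # ~0.31 ("indifferent"?)
```

However the measurement is actually done, the output is a single number standing in for a rich psychological state, which is worth keeping in mind for what follows.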

I’ve seen at least one other blog post that expressed some reservations about the process.

Let’s take a deep breath because I have a whole shopping list of issues with this exercise.


In the shadow of the polygraph

I was reading this short article in The Guardian today about the failure of polygraph technologies (including fMRI versions and voice analysis) to deliver data sufficiently robust to be admissible in court as evidence.  Several points made in the article prompted the thought that physiological computing technologies are developing, to some extent, in the shadow of the polygraph.

Think about it.  Both the polygraph and physiological computing aim to transform personal and private experience into quantifiable data that may be observed and assessed.  Both capture unconscious physiological changes that may signify hidden psychological motives and agendas, subconscious or otherwise – and of course, both involve the attachment of sensor apparatus.  A further point of convergence is that both are notoriously difficult to validate (hence the problems of polygraph evidence in court) – and that seems true whether we’re talking about the use of the P300 for “brain fingerprinting” or the use of ECG and respiration to capture a specific category of emotion.

Whenever I do a presentation about physiological computing, I can almost sense antipathy to the concept from some members of the audience, because the first thing people think about is the polygraph, and the thoughts that logically follow are concerns about privacy, misuse and spying.  To counter these fears, I do point out that physiological computing, whether it’s a game or a means of adapting a software agent or a brain-computer interface, has been developed for very different purposes.  This technology is intended for personal use; it’s about control for the individual in the broadest sense, e.g. to control a cursor, to promote reflection and self-regulation, to make software reactive, personalised and smarter, and to ensure that the data protection rights of the individual are preserved – especially if they wish to share their data with others.

But everyone knows that any signal that can be measured can be hacked, so merely capturing these kinds of physiological data opens the door to spying and other profound invasions of privacy.

Which takes us inevitably back into the shadow of the polygraph.

I’m sure attitudes will change if the right piece of technology comes along to demonstrate the upside of physiological computing.  But if early systems don’t take data privacy seriously, as in very seriously, the public could go cold on the concept before these systems have had a chance to prove themselves in the marketplace.

For musings on a similar theme, see my previous post Designing for the Gullible.

Did you steal my power-up? Be honest, remember your avatar sweats when you do

On Thursday the Herald Sun (via GamePolitics) reported on the possibility of lie detection games being supported by the new Wii Vitality Sensor. While I’ve not seen any reports that the Vitality Sensor measures skin conductance (i.e. the level of sweat on the inner surface of the fingers) as claimed in the article*, it did get me thinking about whether lie detection could be a fun game mechanic.

Basics of Lie Detection

Lie detection is based on the premise that when a person lies, it elicits a physiological response which can be discriminated from the truth. In a polygraph test (a type of lie detection test) this premise is used to ascertain whether a person is answering a question truthfully, using a range of autonomic measures such as pulse rate (i.e. the measure Vitality supports), skin conductance and blood pressure. In a typical polygraph test an investigator begins by asking a subject a few sample questions for which the truth is already known. This allows them to build a baseline of physiological activity representative of a question answered truthfully. Next the investigator asks questions for which the truth is not known, and from the subject’s physiological responses the investigator infers whether they have lied or not. Obviously this all assumes that lying has its own physiological discriminants. To my knowledge it doesn’t, at least not in the autonomic measures commonly used in a polygraph test. For more information I suggest consulting The Polygraph and Lie Detection (National Academy Press)**.
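
To make that baseline-then-compare procedure concrete, here is a minimal sketch in Python. Everything in it is hypothetical (the measure, the readings and the threshold); it simply illustrates the logic of flagging a response that deviates from the control-question baseline:

```python
import statistics

def build_baseline(control_readings):
    """Mean and standard deviation of a physiological measure
    (here, pulse rate in bpm) across control questions whose
    truthful answers are already known."""
    return statistics.mean(control_readings), statistics.stdev(control_readings)

def flag_response(baseline, reading, z_threshold=2.0):
    """Flag a reading as 'deceptive' when it sits more than
    z_threshold standard deviations above the baseline mean."""
    mean, sd = baseline
    return (reading - mean) / sd > z_threshold

# Control questions: the investigator knows these answers are truthful
baseline = build_baseline([72.0, 74.5, 71.8, 73.2])

# Probe questions: an elevated pulse trips the flag
print(flag_response(baseline, 84.0))  # True  (flagged as a lie)
print(flag_response(baseline, 73.5))  # False (read as truthful)
```

Note that the sketch bakes in exactly the assumption questioned above: any deviation from baseline is read as deception, when it could just as easily reflect anxiety, surprise or the stress of being tested.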

Anyways, back to whether lie detection can be a fun game mechanic. I’m going to walk you through the design of an example lie detection game, discussing the various issues and its potential for play along the way.