The Lie Behind Lie Detectors
And that's why there's a significant push underway to develop more-reliable lie-detection devices.
Functional magnetic resonance imaging, or fMRI, and electroencephalography, or EEG, are the most promising modern techniques vying to replace the polygraph. One reason researchers think these methods might be superior is that instead of using sweat and heartbeat to tell us what's going on in the mind, these technologies map the brain itself. Another reason is that both methods are better suited than the polygraph to identifying whether the subject has guilty knowledge, and this is more useful in security screening than the highly targeted interrogation required by the control-question test.
But these modern methods are less miraculous than they might seem. An fMRI test measures blood oxygenation in the brain, and the scientific hypothesis is that greater blood flow -- and thus more oxygen -- is tightly coupled with greater neural activity. If scientists can figure out which part of the brain we use to lie, the theory goes, then fMRI can tell when we are lying.
The hard part, what Georgetown Medical School associate professor of neurology Tom Zeffiro calls the "black art," is generating accurate models of the relationship between neurological activity and blood flow. The fMRI results have to account for up to 30 or 40 factors other than deception -- including heart rate, respiration and motion -- that might all cause variance in the signal. Also, the area of the brain related to deception differs a bit from individual to individual. Culture, language, personality, handedness, gender, medications and health can all affect the results.
Most importantly, fMRI is susceptible to simple countermeasures. Since fMRI measures oxygen in the brain, a subject can defeat the test by breathing deeply or by holding her breath.
EEG has some of the same problems as fMRI, and some unique challenges. An EEG measures electrical activity on the surface of the scalp, on the tip of the nose and around the eyes. The device then infers through skin, skull and hair what's happening with electrical waves in the brain.
Researchers have identified one wave shape, P300, as associated with deception. Jennifer Vendemia, a research assistant professor at the University of South Carolina, studies P300, and at the Stanford workshop she said that it's possible to see a lie by looking at this wave shape, which appears a few hundred milliseconds after a question is posed. But it's difficult to measure deception separately from other neurological phenomena like switching tasks, recalling something autobiographical or recalling something learned.
As with fMRI, the relevant wave patterns can be generalized over a pool of people but differ from person to person. Moreover, the science suffers from Zeno's paradox: As EEG measurement becomes more refined, smaller errors in the readings have larger consequences for the results. Vendemia showed the audience slides of an EEG test, and it looked to me like a child's drawing of a fleet of purposeful worms.
Under laboratory conditions, fMRI technology might be 90 percent accurate in determining whether individuals in a test group of Americans are lying about taking a watch or a ring. But it's useless for employee screening, convicting the guilty, identifying terrorists at the airport or separating innocents from enemy combatants at Guantanamo Bay -- at least at the moment.
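To see why raw accuracy doesn't translate into useful screening, here is a minimal back-of-the-envelope sketch in Python. All of the numbers -- the population size, the number of actual liars, the 90 percent sensitivity and specificity -- are assumptions for illustration, not results from any study; the point is only that when very few of the people being screened are actually lying, false positives swamp the true ones.

```python
# Hypothetical numbers for illustration only -- not from any study.
population = 10_000        # people screened
liars = 10                 # actual deceivers among them (a low base rate)
sensitivity = 0.90         # chance the test flags a real liar
specificity = 0.90         # chance the test clears a truthful person

true_positives = liars * sensitivity                        # about 9 liars flagged
false_positives = (population - liars) * (1 - specificity)  # about 999 innocents flagged

# Of everyone the test flags, how many are actually lying?
flagged = true_positives + false_positives
precision = true_positives / flagged
print(f"flagged: {flagged:.0f}, of whom {precision:.1%} are actually lying")
# -> flagged: 1008, of whom 0.9% are actually lying
```

Under these assumed numbers, the detector flags roughly a hundred innocent people for every liar it catches -- which is the sense in which a "90 percent accurate" test can still be useless for mass screening.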
At some point soon, these high-tech lie detectors will be cheap, accurate, portable and unobtrusive enough to replace the polygraph in incident investigations. But we are a long way from reading minds.
Lie detection raises a host of complicated ethical problems about autonomy and the privacy of one's own thoughts. But before we get there, we have to know whether the thing works, and what exactly it does. Being a smart consumer of security technology means asking about accuracy rates, validity, reproducibility, specificity and sensitivity.
Once these tools are on the market, there will be immense pressure to use, or rather misuse, them in Guantanamo Bay, on the battlefield, in the courtroom and at your workplace. We'll hear the usual argument about the need to trade some privacy for increased security. But that bargain is only equitable when you actually get some security in the exchange. With even the best technology, science says lie detection is still only a little better than a shot in the dark.