Today, the joke is on us. Cameras follow your car, global-positioning systems track your cellphone, software monitors your Web surfing, X-rays explore your purse, and airport scanners see through your clothes. Now comes the final indignity: machines that can look into your soul.
With the aid of functional magnetic resonance imaging (fMRI), which tracks the activation of specific parts of the brain, neuroscientists have been hard at work on Allen's fantasy. Under controlled conditions, they can tell from a brain scan which of two images you're looking at. They can tell whether you're thinking of a face, an animal or a scene. They can even tell which finger you're about to move.
But those feats barely scratch the brain's surface. Any animal can perceive objects and move limbs. To plumb the soul, you need a metaphysician. John-Dylan Haynes, a researcher at Germany's Bernstein Center for Computational Neuroscience, is leading the way. His mission, according to the center, is to predict thoughts and behavior from fMRI scans.
Haynes, a former philosophy student, is going for the soul's jugular. He's trying to clarify the physical basis of free will. "Why do we shape intentions in this way or another way?" he wonders. "Your wishes, your desires, your goals, your plans -- that's the core of your identity." The best place to look for that core is in the brain's medial prefrontal cortex, which, he points out, is "especially involved in the initiation of willed movements and their protection against interference."
To get a clear snapshot of free will, Haynes designed an experiment that would isolate it from other mental functions. No objects to interpret; no physical movements to anticipate or execute; no reasoning to perform. Participants were put in an fMRI machine and were told they would soon be shown the word "select," followed a few seconds later by two numbers. Their job was to covertly decide, when they saw the cue, whether to add or subtract the unseen numbers. Then they were to perform the chosen calculation and punch a button corresponding to the correct answer. The snapshot was taken right after the "select" cue, when they had nothing to do but choose addition or subtraction.
Until this experiment, which was reported last month in Current Biology, nobody had ever tried to take a picture of free will. One reason is that fMRI is too crude to distinguish one abstract choice from another. It can only show which parts of the brain are demanding blood oxygen. That's too coarse to distinguish the configuration of cells that signifies addition from the configuration that signifies subtraction. So Haynes used software to help the computer recognize complex patterns in the data. To dissect human thought, the computer had to emulate it.
Each participant took the test more than 250 times, choosing independently in each trial. The computer then looked at a sample of the scans, along with the final answers that revealed what choices were made. It calculated a pattern and used it to predict, from each participant's remaining scans, his or her decisions in the corresponding trials. Haynes checked the predictions -- add or subtract -- against the answers. The computer got it right 71 percent of the time.
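For readers who want to see how that train-and-predict step works in practice, here is a minimal Python sketch of the same logic on simulated data. The voxel count, the noise level, the 50/50 split and the choice of classifier are all illustrative assumptions of mine; none of these details come from the Current Biology paper.

```python
# A toy reconstruction of the decoding procedure described above: train a
# classifier on brain-activity patterns whose choices (add vs. subtract)
# are known, then predict the choices behind the remaining patterns.
# The data are simulated; the voxel count, noise level, and classifier
# are illustrative assumptions, not details reported in the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_trials, n_voxels = 256, 50            # roughly 250 trials per participant
choices = rng.integers(0, 2, n_trials)  # 0 = add, 1 = subtract (the covert decision)

# Simulate fMRI activity: a weak signal spread across many voxels plus
# heavy noise, so that no single voxel reveals the choice but the joint
# pattern does -- the situation that calls for pattern recognition.
signal_direction = rng.normal(size=n_voxels)
patterns = (np.outer(choices - 0.5, signal_direction)
            + rng.normal(scale=2.0, size=(n_trials, n_voxels)))

# Learn the pattern from half the scans (where the final answers reveal
# the choices), then predict the choices behind the held-out scans.
X_train, X_test, y_train, y_test = train_test_split(
    patterns, choices, test_size=0.5, random_state=0)
classifier = LogisticRegression(max_iter=1000).fit(X_train, y_train)
predictions = classifier.predict(X_test)

# Chance is 50 percent; the study reported about 71 percent.
print(f"Decoding accuracy: {accuracy_score(y_test, predictions):.0%}")
```

The simulation makes the key point concrete: the computer never sees the decision itself, only the pattern of activity, and its accuracy is judged against choices revealed afterward by the answers.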
I know what you're thinking: Why would anyone want a machine to read his mind? But imagine being paralyzed, unable to walk, type or speak. Imagine a helmet full of electrodes, or a chip implanted in your head, that lets your brain tell your computer which key to press. Those technologies are already here. And why endure the agony of mental hunt-and-peck? Why not design computers that, like a smart secretary, can discern and execute even abstract intentions? That's what Haynes has in mind. You want to open a folder or an e-mail, and your computer does it. Your wish is its command.
But if machines can read your mind when you want them to, they can also read it when you don't. And your will isn't necessarily the one they obey. Already, scans have been used to identify brain signatures of disgust, drug cravings, unconscious racism and suppressed sexual arousal, not to mention psychopathy and the propensity to kill.
Haynes understands the objection to these scans -- he calls it "mental privacy" -- but he buys only half of it. He doesn't like the idea of companies scanning job applicants for loyalty or scanning customers for reactions to products (an emerging practice known as neuromarketing). But where criminal justice is at stake, as in the case of lie detection, he supports using the technology. Ruling it out, he argues, would "deny the innocent people the ability to prove their innocence" and would "only protect the people who are guilty."
I hear what he's saying. I'd love to have put Khalid Sheik Mohammed through an fMRI before Sept. 11, 2001, instead of waiting six years for his confession. And I wish we'd scanned Mohamed Atta's brain before he boarded that flight out of Boston.
But what Haynes is saying -- and exposing -- is almost more terrifying than terrorism. The brain is becoming just another accessible body part, searchable for threats and evidence. We can sift through your belongings, pat you down, study your nude form through your clothes, inspect your body cavities and, if necessary, peer into your mind.
Using fMRI is just the first stage. Electrodes, infrared spectroscopy and subtler magnetic imaging are next. Scanners will shrink. Image resolution and pattern-recognition software will improve.
But don't count out free will. To make human choice predictable, you first have to constrain it so it's not really free. That's why Haynes confined his participants to arithmetic, gave them only two options and forbade them to change their minds. They could have wrecked his experiment by defying any of those conditions. So could you, if somebody came at you with a scanner or an electrode helmet. To look into your soul and get the right answer, science, too, has to cheat. Somewhere, Woody Allen is laughing. I can feel it.
That's most probably part of the EU project IQUEST: the development of a computational model for the collection, production, and analysis of a new integrated measure of opinions, attitudes, feelings, beliefs, and behaviour.
The coordinator of the project is the Italian professor Egidio Robusto of the University of Padova.
Interestingly, that professor has only two (!) publications available in the open press.
Huh?
Other participants in the project are the University of Essex, the University of Graz, Conquerer multimedia e design, and Anter Ltd.