Many philosophers have argued that people make decisions about what's right and wrong based on moral principles and rational thought. But other philosophers--and more recently, some psychologists and neuroscientists--have argued that there's more to the story. When faced with a moral dilemma, these scholars say, we rely on emotional reactions as well as our powers of reasoning. In a study of brain damage, published today, neuroscientists report evidence that emotions indeed exert a powerful influence on moral judgments.

In the new study, Antonio Damasio of the University of Southern California in Los Angeles and colleagues examined moral reasoning in six people who had damage to the ventromedial prefrontal cortex (VMPC), a brain region that regulates emotions. The researchers presented the patients with moral dilemmas that forced them to decide whether it was acceptable to sacrifice one person's life to save several others. For example, participants had to decide whether to flip a switch that diverts a runaway trolley from a track leading to five workers to a track leading to just one worker. The researchers also gauged the decisions of 12 people without brain damage and 12 patients with damage to brain regions unconnected to emotion.

Think fast. Would you push one person off a bridge to save the lives of five people? (Credit: Joshua Greene)

In the trolley scenario, most people in all three groups said it was okay to flip the switch. However, the VMPC patients' decisions diverged when the scenario required inflicting direct personal harm on one person to save several others--such as shoving a large person off a bridge to slow a trolley headed for five people. From a strictly rational point of view, it's better to save five people instead of one, but the thought of pushing an innocent person to his death is emotionally wrenching. That may explain why only about 20% of people in the control groups said they'd push. The VMPC patients, on the other hand, made the utilitarian choice about twice as often, the researchers report online today in Nature.

The findings fit nicely with other evidence that moral judgments often involve a conflict between emotion and reason and that those two competing influences rely on different networks of brain regions, says Joshua Greene, a philosopher and cognitive neuroscientist at Harvard University. But Jordan Grafman, a cognitive neuroscientist at the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland, isn't convinced that extreme moral dilemmas like the trolley problem evoke the same cognitive processes--and involve the same brain regions--as moral judgments in the real world. Even so, he says, the study "emphasizes that disciplines other than philosophy can contribute to issues related to moral behavior."