Posted on 09/13/2001 6:25:30 PM PDT by Nebullis
Princeton, N.J. -- In a study that combines philosophy and neuroscience, researchers have begun to explain how emotional reactions and logical thinking interact in moral decision-making.
Princeton University researchers reported in the Sept. 14 issue of Science that they used functional magnetic resonance imaging (fMRI) to analyze brain activity in people who were asked to ponder a range of moral dilemmas.
The results suggest that, while people regularly reach the same conclusions when faced with uncomfortable moral choices, their answers often do not grow out of the reasoned application of general moral principles. Instead, they draw on emotional reactions, particularly for certain kinds of moral dilemmas.
The results also show how tools of neuroscience are beginning to reveal the biological underpinnings of the subtlest elements of human behavior, said Joshua Greene, a graduate student in philosophy who conducted the study in collaboration with scientists in the psychology department and the Center for the Study of Brain, Mind and Behavior.
"We think of moral judgments as so ethereal," said Greene. "Now we're in a position to start looking at brain anatomy and understanding how neural mechanisms produce patterns in our behavior."
The study focused on a classic set of problems that have fascinated moral philosophers for years because of the difficulty in identifying moral principles that agree with the way people react.
One dilemma, known as the trolley problem, involves a runaway train that is about to kill five people. The question is whether it is appropriate for a bystander to throw a switch and divert the trolley onto a spur on which it will kill one person and allow the five to survive.
Philosophers compare this problem to a second scenario, sometimes called the footbridge problem, in which a train is again heading toward five people, but there is no spur. Two bystanders are on a bridge above the tracks and the only way to save the five people is for one bystander to push the other in front of the train, killing the fallen bystander.
Both cases involve killing one person to save five, but they evoke very different responses. People tend to agree that it is permissible to flip the switch, but not to push a person off the bridge. People in the study also followed this pattern. This distinction has puzzled philosophers who have not been able to find a hard and fast rule to explain why one is right and the other wrong. For each potential principle, there seems to be another scenario that undermines it.
One reason for the difficulty, said Greene, appears to be that the two problems engage different psychological processes -- some more emotional, some less so -- that rely on different areas of the brain.
"They're very similar problems -- they seem like they are off the same page -- but we appear to approach them in very different ways," said Greene.
Greene emphasized that the researchers were not trying to answer questions about what is right or wrong. Instead, given that people follow a pattern of behavior, the study seeks to describe how that behavior arises. In turn, a better understanding of how moral judgments are made may change our attitudes toward those judgments, Greene said.
The researchers conducted the study with two groups of nine people, who each answered a battery of 60 questions while undergoing MRI scanning. The researchers divided the questions into personal and non-personal categories based on the general notion that the difference between the trolley and footbridge problems may have to do with the degree of personal involvement, and ultimately the level of emotional response.
Examples of non-personal ethical dilemmas included a case of keeping money from a lost wallet and a case of voting for a policy expected to cause more deaths than its alternatives. The researchers also included non-moral questions, such as the best way to arrange a travel schedule given certain constraints and which of two coupons to use at a store.
The scanning consistently showed a greater level of activation in emotion-related brain areas during the personal moral questions than during the impersonal moral or non-moral questions. At the same time, areas associated with working memory, which has been linked to ordinary manipulation of information, were considerably less active during the personal moral questions than during the others.
The researchers also measured how long it took subjects to respond to the questions. In the few cases in which people said it is appropriate to take action in the personal moral questions -- like pushing a person off the footbridge -- they tended to take longer to make their decisions. These delays suggest that these subjects were working to overcome a primary emotional response, the researchers said.
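The response-time logic can be illustrated with a minimal sketch. This is not the study's data or code; all numbers and variable names below are hypothetical, and the comparison simply shows the kind of contrast the researchers reported: longer deliberation when subjects judged the personal action "appropriate."

```python
# Illustrative sketch with hypothetical numbers: comparing average
# response times (in seconds) on personal moral dilemmas, grouped by
# the subject's judgment.
from statistics import mean

# Subjects who said the action was appropriate (overcoming the
# emotional response) vs. those who said it was inappropriate.
rt_said_appropriate = [6.8, 7.2, 7.9, 6.5]
rt_said_inappropriate = [4.1, 3.8, 4.5, 4.0]

delay = mean(rt_said_appropriate) - mean(rt_said_inappropriate)
print(f"Mean extra deliberation time: {delay:.2f} s")
```

On these made-up values the first group averages 7.1 s against 4.1 s, a 3.0 s difference -- the pattern the researchers interpreted as effortful override of an initial emotional reaction.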
Taken together, the imaging and response time results strongly suggest that emotional responses influenced moral decision-making and were not just a coincidental effect, the researchers concluded.
Professor of psychology John Darley, a coauthor of the paper, said the result fits into a growing area of moral psychology which contends that moral decision-making is not a strictly reasoned process, as has been believed for many years. "Moral issues do not come to you with a sign saying 'I'm a moral issue; treat me in a special way,'" Darley said. Instead, they engage a range of mental processes.
Other coauthors on the paper are Brian Sommerville, a former research assistant now at Columbia University Medical School; Leigh Nystrom, a research scientist in psychology; and Jonathan Cohen, a professor of psychology at Princeton.
Cohen also is director of the University's newly established Center for the Study of Brain, Mind and Behavior, which houses the fMRI scanner used in the study, and which seeks to combine the methods of cognitive psychology with neuroscience.
"Measuring people's behavior has served psychology well for many years and will continue to do so, but now that approach is augmented by a whole new set of tools," said Cohen.
Brain imaging allows scientists to build a catalog of brain areas and their functions, which can then be cross-referenced with behaviors that employ the same processes, Cohen said. Eventually, this combination of behavioral analysis and biological neuroscience could inform questions in fields from philosophy to economics, he said.
The current study, he said, "is a really nice example of how cognitive neuroscience -- and neuroimaging in particular -- provide an interface between the sciences and the humanities."
I consider these types of studies as curiosities. There is a world of difference between a cold question and a hot event. Heroes don't ponder, they do.
Histones, the proteins around which DNA coils to form chromatin, are moving toward the forefront of epigenetic research (see also, "The Meaning of Epigenetics"). A recently floated hypothesis states that the highly modifiable amino termini, or tails, of these proteins could carry their own combinatorial codes or signatures to help control phenotype, and that parts of this code may be heritable. ...
What do you think about this?
If anything, a hot event would be more emotionally laden.
I like it.
Yes, precisely -- and also subconscious. Many "decisions" never reach the conscious level yet influence our actions. As one gains more life experience, I believe more and more of it is condensed into "feelings" that cannot be unpacked into a series of logical statements leading to a conclusion; instead there is only a vague nagging "something doesn't feel right" or "this is the right thing to do."
This will not be static like the genetic code, he says, but something that is context-dependent--meaning different things in different situations.
It's like language.
I think it is absolutely correct that the code is context dependent. The reason for this lies not in the sequence but in the structure.
How could it be otherwise? First comes first and the brain is organized volume-wise.