Free Republic
Brain imaging study sheds light on moral decision-making
Princeton University press release ^ | September 13, 2001 | Steven Schultz

Posted on 09/13/2001 6:25:30 PM PDT by Nebullis

Princeton, N.J. -- In a study that combines philosophy and neuroscience, researchers have begun to explain how emotional reactions and logical thinking interact in moral decision-making.

Princeton University researchers reported in the Sept. 14 issue of Science that they used functional magnetic resonance imaging (fMRI) to analyze brain activity in people who were asked to ponder a range of moral dilemmas.

The results suggest that, while people regularly reach the same conclusions when faced with uncomfortable moral choices, their answers often do not grow out of the reasoned application of general moral principles. Instead, they draw on emotional reactions, particularly for certain kinds of moral dilemmas.

The results also show how tools of neuroscience are beginning to reveal the biological underpinnings of the subtlest elements of human behavior, said Joshua Greene, a graduate student in philosophy who conducted the study in collaboration with scientists in the psychology department and the Center for the Study of Brain, Mind and Behavior.

"We think of moral judgments as so ethereal," said Greene. "Now we're in a position to start looking at brain anatomy and understanding how neural mechanisms produce patterns in our behavior."

The study focused on a classic set of problems that have fascinated moral philosophers for years because of the difficulty in identifying moral principles that agree with the way people react.

One dilemma, known as the trolley problem, involves a runaway train that is about to kill five people. The question is whether it is appropriate for a bystander to throw a switch and divert the trolley onto a spur on which it will kill one person and allow the five to survive.

Philosophers compare this problem to a second scenario, sometimes called the footbridge problem, in which a train is again heading toward five people, but there is no spur. Two bystanders are on a bridge above the tracks and the only way to save the five people is for one bystander to push the other in front of the train, killing the fallen bystander.

Both cases involve killing one person to save five, but they evoke very different responses. People tend to agree that it is permissible to flip the switch, but not to push a person off the bridge. People in the study also followed this pattern. This distinction has puzzled philosophers who have not been able to find a hard and fast rule to explain why one is right and the other wrong. For each potential principle, there seems to be another scenario that undermines it.

One reason for the difficulty, said Greene, appears to be that the two problems engage different psychological processes -- some more emotional, some less so -- that rely on different areas of the brain.

"They're very similar problems -- they seem like they are off the same page -- but we appear to approach them in very different ways," said Greene.

Greene emphasized that the researchers were not trying to answer questions about what is right or wrong. Instead, given that people follow a pattern of behavior, the study seeks to describe how that behavior arises. In turn, a better understanding of how moral judgments are made may change our attitudes toward those judgments, Greene said.

The researchers conducted the study with two groups of nine people, who each answered a battery of 60 questions while undergoing fMRI scanning. The researchers divided the questions into personal and non-personal categories based on the general notion that the difference between the trolley and footbridge problems may have to do with the degree of personal involvement, and ultimately the level of emotional response.

Examples of non-personal ethical dilemmas included a case of keeping money from a lost wallet and a case of voting for a policy expected to cause more deaths than its alternatives. The researchers also included non-moral questions, such as the best way to arrange a travel schedule given certain constraints and which of two coupons to use at a store.

The scanning consistently showed a greater level of activation in emotion-related brain areas during the personal moral questions than during the impersonal moral or non-moral questions. At the same time, areas associated with working memory, which has been linked to ordinary manipulation of information, were considerably less active during the personal moral questions than during the others.

The researchers also measured how long it took subjects to respond to the questions. In the few cases in which people said it was appropriate to take action in the personal moral questions -- like pushing a person off the footbridge -- they tended to take longer to make their decisions. These delays suggest that these people were working to overcome a primary emotional response, the researchers said.

Taken together, the imaging and response time results strongly suggest that emotional responses influenced moral decision-making and were not just a coincidental effect, the researchers concluded.

Professor of psychology John Darley, a coauthor of the paper, said the result fits into a growing area of moral psychology which contends that moral decision-making is not a strictly reasoned process, as has been believed for many years. "Moral issues do not come to you with a sign saying 'I'm a moral issue; treat me in a special way,'" Darley said. Instead, they engage a range of mental processes.

Other coauthors on the paper are Brian Sommerville, a former research assistant now at Columbia University Medical School; Leigh Nystrom, a research scientist in psychology; and Jonathan Cohen, a professor of psychology at Princeton.

Cohen also is director of the University's newly established Center for the Study of Brain, Mind and Behavior, which houses the fMRI scanner used in the study, and which seeks to combine the methods of cognitive psychology with neuroscience.

"Measuring people's behavior has served psychology well for many years and will continue to do so, but now that approach is augmented by a whole new set of tools," said Cohen.

Brain imaging allows scientists to build a catalog of brain areas and their functions, which can then be cross-referenced with behaviors that employ the same processes, Cohen said. Eventually, this combination of behavioral analysis and biological neuroscience could inform questions in fields from philosophy to economics, he said.

The current study, he said, "is a really nice example of how cognitive neuroscience -- and neuroimaging in particular -- provide an interface between the sciences and the humanities."
 


TOPICS: News/Current Events; Philosophy
To: Diamond
When a machine's actions are completely determined by physical forces, the moral intent cannot be known because the entire operation of the machine is based on coercion.

Coercion? I think you're getting at a determinism which excludes the uncertainty inherent in physical systems. But this is strange, coming from you that is. I might give you the same answer you might give someone who says free will is not possible with a God who is omniscient. There is an easy way around that, philosophically, so why belabor the point from a physical position where the requirement for fatalism is far less absolute?

Although you don't come out and say so, you seem to imply that morality exists outside of ourselves. I say it's an emergent function of our brain. Why? Is there a 'why' that applies here? What about 'How'? That's what this research study is aiming at. And if morality is not a function of our brains, where does morality reside and what structure of our brain perceives this morality? You are ultimately left to answer exactly the same questions you pose to me.

41 posted on 09/27/2001 9:41:34 AM PDT by Nebullis

To: Nebullis
...determinism ...excludes the uncertainty inherent in physical systems. But this is strange, coming from you that is. I might give you the same answer you might give someone who says free will is not possible with a God who is omniscient. There is an easy way around that, philosophically, so why belabor the point from a physical position where the requirement for fatalism is far less absolute?

Whether the uncertainty observed in physical systems is due to our lack of knowledge as to how they operate, or because they are intrinsically uncertain, I do not know. In neither case, though, can pure physical systems account for the nature of morality, with its attendant concepts of free personal agency, accountability, evil, praise, blame, intent, motive, volition, etc. Morality has a prescriptive nature, not a descriptive nature. Electrochemical pathways in the brain alone cannot account for any notion of moral obligation and guilt in relation to persons. We do not assign praise or blame to chemical reactions or physical forces. The uncertainty principle, if that's what it is called, cannot itself account for the personal agency requisite of morality. Physical interactions, whether they are completely deterministic or not, are not right or wrong, they just are, and so they alone do not prescribe what 'ought' to be in a moral sense.

...if morality is not a function of our brains, where does morality reside and what structure of our brain perceives this morality? You are ultimately left to answer exactly the same questions you pose to me.

I do not deny that there is a correlation between our brains and morality (Bill and Hillary excepted), but I think that morality is more than a random combination of molecules in motion. If that's what moral rules are, then there is no reason to think that Jesus lived a praiseworthy life, or that Joseph Stalin was truly guilty; they were just operating on different synapses, so to speak, in the same way that your neurotransmitters might prefer chocolate ice cream and mine vanilla. If morality is entirely and solely a product of our brains then it is also entirely subjective and relative. There is no external standard.

I think that moral rules exist, even though they don't have physical properties. They are objective, and can be discovered, like other non-physical realities such as propositions, numbers, the laws of logic, etc. I think that impersonal forces such as chemicals and electricity cannot provide a basis for morality, because there is no obligation to obey a random or impersonal force. (How could an impersonal force issue a propositional command in the first place?) So that's why I think that morality is unintelligible without a personal God as the Source of that 'furniture of the universe'. Just my two cents worth.

Cordially,

42 posted on 09/27/2001 1:00:05 PM PDT by Diamond

To: fire_eye
Anyone who's been in a bar after 1AM can tell you that whales and cows share a common heritage...

You mean, Hillary?

43 posted on 09/27/2001 1:14:24 PM PDT by jigsaw

To: Diamond
Electrochemical pathways in the brain alone cannot account for any notion of moral obligation and guilt in relation to persons.

Why not? We can change our level of guilt with a pill. A stroke will wipe out moral obligation. At most you can say that we don't yet know what, precisely, accounts for a notion of moral obligation or guilt, but we do know that the proper functioning of these features requires specific parts of the brain and specific neurochemicals.

We do not assign praise or blame to chemical reactions or physical forces.

That doesn't mean that the ability to assign praise or blame doesn't rest in these natural processes.

I do not deny that there is a correlation between our brains and morality, but I think that morality is more than a random combination of molecules in motion.

We already know that the action of these molecules is not random. In short, you say that there is something more than the molecules in our brains. There may be something outside the brain which has an effect on the brain. We could find out which parts of the brain it affects and how it does so. You see, in the end, whether we are looking for a morality receptor or a morality generator, the same questions can be asked.

I think that moral rules exist, even though they don't have physical properties. They are objective, and can be discovered, like other non-physical realities such as propositions, numbers, the laws of logic, etc.

There are moral rules which move beyond the intrinsic moral sensibility that all humans share. These also vary across cultures and are learned. But that's a different thing than what we are discussing.

Logic works because our brains work that way. Logic is a higher-order expression of the physical properties of our brains. Clever people have laid them out in handy verbal or mathematical terms so that we can communicate them to each other. But these non-physical realities are exactly the result of the activity in our brains.

44 posted on 09/28/2001 9:01:36 AM PDT by Nebullis

To: Diamond
If that's what moral rules are, then there is no reason to think that Jesus lived a praiseworthy life, or that Joseph Stalin was truly guilty; they were just operating on different synapses, so to speak, in the same way that your neurotransmitters might prefer chocolate ice cream and mine vanilla. If morality is entirely and solely a product of our brains then it is also entirely subjective and relative.

Why is it relative? That doesn't make any sense. I've heard this claim so often, and I'm always confounded by it. Why is an external standard better than an internal one? The way in which our brains function is no less absolute than a communicated external standard. Just because you prefer vanilla over chocolate doesn't mean that the taste buds are tasting anything other than vanilla or chocolate and transmitting that to your brain. A lot of people like some moral rules better than others, but that doesn't mean that the rules are changed by that preference.

45 posted on 09/28/2001 9:42:38 AM PDT by Nebullis



Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.
