Former APA president drew on research to help explain evil against the backdrop of recent Iraqi prisoner abuses at Abu Ghraib.
By MELISSA DITTMANN
Monitor Staff
October 2004, Vol 35, No. 9
Print version: page 68
As the story goes, Dr. Jekyll uses a chemical to turn into his evil alter ego, Mr. Hyde. In real life, however, no chemical may be needed: Instead, just the right dose of certain social situations can transform ordinarily good people into evildoers, as was the case with Iraqi prisoner abusers at Abu Ghraib, argued former APA president Philip G. Zimbardo, PhD, in a presidential-track program during APA's 2004 Annual Convention in Honolulu.
Indeed, Zimbardo--an emeritus psychology professor at Stanford University--highlighted how this Mr. Hyde transformation occurred among U.S. soldiers at Abu Ghraib by presenting classic psychology research on situational effects on human behavior.
Zimbardo, who will be an expert witness for several of the U.S. soldiers on trial, argued that situations pull people to act in ways they never thought possible.
"That line between good and evil is permeable," Zimbardo said. "Any of us can move across it....I argue that we all have the capacity for love and evil--to be Mother Theresa, to be Hitler or Saddam Hussein. It's the situation that brings that out."
Seduced into evil
In fact, the classic electric shock experiment by social psychologist Stanley Milgram, PhD, showed that when given an order by someone in authority, people would deliver what they believed to be extreme levels of electrical shock to other study participants who answered questions incorrectly.
Zimbardo said the experiment provides several lessons about how situations can foster evil:
*Provide people with an ideology to justify beliefs for actions.
*Make people take a small first step toward a harmful act with a minor, trivial action and then gradually increase those small actions.
*Make those in charge seem like a "just authority."
*Transform a once compassionate leader into a dictatorial figure.
*Provide people with vague and ever-changing rules.
*Relabel the situation's actors and their actions to legitimize the ideology.
*Provide people with social models of compliance.
*Allow dissent, but only if people continue to comply with orders.
*Make exiting the situation difficult.
Particularly notable, Zimbardo said, is that people are seduced into evil by dehumanizing and labeling others.
"They semantically change their perception of victims, of the evil act, and change the relationship of the aggressor to their aggression--so 'killing' or 'hurting' becomes the same as 'helping,'" he said.
For example, in a 1975 experiment by psychologist Albert Bandura, PhD, college students were told they'd work with students from another school on a group task. In one condition, they overheard an assistant calling the other students "animals" and in another condition, "nice." Bandura found students were more apt to deliver what they believed were increased levels of electrical shock to the other students if they had heard them called "animals."
People's aggression can also increase when they feel anonymous--for example if they wear a uniform, hood or mask, Zimbardo said.
"You minimize social responsibility," he explained. "Nobody knows who you are, so therefore you are not individually liable. There's also a group effect when all of you are masked. It provides a fear in other people because they can't see you, and you lose your humanity."
For example, a 1973 study by Harvard anthropologist John Watson evaluated 23 cultures to determine whether warriors who changed their appearance--such as with war paint or masks--treated their victims differently. As it turned out, in 80 percent of the cultures whose warriors changed their appearance, those warriors were found to be more destructive--for example, killing, torturing or mutilating their victims--than unpainted or unmasked warriors.
What's more, anonymity can also be conferred by the environment itself, Zimbardo noted, adding to the pleasure of destruction and vandalism and the power of being in control.
"It's not just seeing people hurt, it's doing things that you have a sense that you are controlling behavior of other people in ways that you typically don't," Zimbardo said.
Zimbardo observed the same dynamic in his own 1971 simulated jail study--the Stanford Prison Experiment--in which college students played the roles of prisoners or guards. The guards became so brutal and abusive toward the prisoners after just six days that Zimbardo ended the experiment prematurely. The experiment showed that institutional forces and peer pressure led normal student volunteer guards to disregard the potential harm of their actions on the other student prisoners.
"You don't need a motive," Zimbardo said. "All you really need is a situation that facilitates moving across that line of good and evil."
Prison abuses
The same social psychological processes--deindividuation, anonymity of place, dehumanization, role-playing and social modeling, moral disengagement and group conformity--that acted in the Stanford Prison Experiment were at play at Abu Ghraib, Zimbardo argued.
So is it a few bad apples that spoil a barrel? "That's what we want to believe--that we could never be a bad apple," Zimbardo said. "We're the good ones in the barrel." But people can be influenced, regardless of their intention to resist, he said.
The Abu Ghraib soldiers' mental state--stress, fear, boredom and heat exhaustion--coupled with no supervision, no training and no accountability, may have further contributed to their "evil" actions, he noted.
"I argue situational forces dominate most of us at various times in our lives," Zimbardo said, "even though we'd all like to believe we're each that singular hero who can resist those powerful external pressures, like Joe Darby, the whistle-blowing hero of the Abu Ghraib prison."
Read more about Zimbardo's Stanford Prison Experiment at www.prisonexp.org.
Further Reading
*Bandura, A., Underwood, B., & Fromson, M.E. (1975). Disinhibition of aggression through diffusion of responsibility and dehumanization of victims. Journal of Research in Personality, 9, 253-269.
*Milgram, S. (1974). Obedience to authority: An experimental view. New York: Harper & Row.
*Watson, J. (1973). Investigation into deindividuation using a cross-cultural survey technique. Journal of Personality and Social Psychology, 25, 342-345.