I thought this experiment was worth posting because it shows a different angle of the human condition, particularly certain behaviours found in people engaged in criminal activity.
Stanley Milgram
In 1961 and 1962, a series of experiments were carried out at Yale
University. Volunteers were paid a small sum to participate in what they
understood would be “a study of memory and learning.” In most cases, a
white-coated experimenter took charge of two of the volunteers, one of whom
was given the role of “teacher” and the other “learner.” The learner was
strapped into a chair and told he had to remember lists of word pairs. If he
couldn’t recall them, the teacher was asked to give him a small electric shock.
With each incorrect answer the voltage rose, and the teacher was forced to
watch as the learner moved from small grunts of discomfort to screams of
agony.
What the teacher didn’t know was that there was actually no current
running between his control box and the learner’s chair, and that the volunteer
“learner” was in fact an actor who was only pretending to get painful shocks.
The real focus of the experiment was not the “victim,” but the reactions of the
teacher pressing the buttons. How would he cope with administering greater
and greater pain to a defenseless human being?
The experiment, described in Obedience to Authority: An Experimental
View, is one of the most famous in psychology. Here, we take a look at what
actually happened and why the results are important.
Expectations and reality
Most people would expect that at the first sign of genuine pain on the part of
the person being shocked, the experiment would be halted. After all, it was
only an experiment. This is the response that Milgram received when, outside
the actual experiments, he surveyed a range of people on how they believed
subjects would react in these circumstances. Most predicted that the teachers
would not give shocks beyond the point where the learners asked to be freed.
These expectations were entirely in line with Milgram’s own. But what actually
happened?
Most subjects asked to act as teachers were very stressed by the experiment,
and protested to the experimenter that the person in the chair should
not have to take any more pain. The logical next step would then have been to
demand that the experiment be terminated. In reality, this rarely happened.
Despite their reservations, most people continued to follow the orders of
the experimenter and inflict progressively greater shocks. Indeed, as Milgram
noted, “a substantial proportion continued to the last shock on the generator.”
That was even when they could hear the cries of the learner, and even when
that person pleaded to be let out of the experiment.
How we cope with a bad conscience
Milgram’s experiments have caused controversy over the years; many people
are simply unwilling to accept that normal human beings would act like this.
Many scientists have tried to find holes in the methodology, but the experiment
has been replicated around the world with similar outcomes. As Milgram
noted, the results astonish people. They want to believe that the subjects who
volunteered were sadistic monsters. In fact, Milgram recruited people from a
range of social classes and professions; they were normal people put into
unusual circumstances.
Why don’t the subjects administering the “shocks” feel guilty and just opt
out of the experiment? Milgram was careful to point out that most of his subjects
knew that what they were doing was not right. They hated giving the
shocks, especially when the victim was objecting to them. Yet even though
they thought that the experiment was cruel or senseless, most were not able to
extricate themselves from it. Instead, they developed coping mechanisms to justify
what they were doing. These included:
• Getting absorbed in the technical side of the experiment. People have a strong
desire to be competent in their work. The experiment and its successful implementation
became more important than the welfare of the people involved.
• Transferring moral responsibility for the experiment to its leader. This is the
common “I was just following orders” defense found in any war crimes trial.
The moral sense or conscience of the subject is not lost, but is transformed into
a wish to please the boss or leader.
• Choosing to believe that their actions needed to be done as part of a larger,
worthy cause. Where in the past wars have been waged over religion or political
ideology, in this case the cause was science.
• Devaluing the person receiving the shocks: “If they are dumb enough not to
remember the word pairs, they deserve to be punished.” Such impugning of
intelligence or character is commonly used by tyrants to encourage followers to
get rid of whole groups of people. They are not worth much, the thinking goes,
so who really cares if they are eliminated? The world will be a better place.
Perhaps the most surprising result was Milgram’s observation that the subject’s
sense of morality did not disappear, but was reoriented, so that they felt duty
and loyalty not to those they were harming but to the person giving the
orders. The subject was not able to extricate themselves from the situation
because—amazingly—it would have been impolite to go against the wishes of
the experimenter. The subject felt they had agreed to do the experiment, so to
pull out would make them appear a promise-breaker.
The desire to please authority was seemingly more powerful than the
moral force of the other volunteer’s cries. When the subject did voice opposition
to what was going on, he or she typically couched it in the most deferential
terms. As Milgram described one subject: “He thinks he is killing
someone, yet he uses the language of the tea table.”
Why are we like this? Milgram observed that humans’ tendency to obey
authority evolved for simple survival purposes. There had to be leaders and
followers and hierarchies in order to get things done. Man is a communal animal,
and does not want to rock the boat. Worse even than the bad conscience
of harming others who are defenseless, it seems, is the fear of being isolated.
Most of us are inculcated from a very young age with the idea that it is
wrong to hurt others needlessly, yet we spend the first 20 years of our life
being told what to do, so we get used to obeying authority. Milgram’s experiments
threw subjects right into the middle of this conundrum. Should they “be
good” in the sense of not harming, or “be good” in the sense of doing what
they’re told? Most subjects chose the latter—suggesting that our brain is hardwired
to accept authority above all else.
The natural impulse not to harm others is dramatically altered when a
person is put into a hierarchical structure. On our own we take full responsibility
for what we do and consider ourselves autonomous, but once in a system
or hierarchy we are more than willing to give over that responsibility to
someone else. We stop being ourselves, and instead become an “agent” for
some other person or thing.
Milgram was influenced by the story of Adolf Eichmann, whose job it was to
engineer the death of six million Jews under Hitler. Hannah Arendt’s book
Eichmann in Jerusalem argued that Eichmann was not really a psychopath,
but an obedient bureaucrat whose distance from the actual death camps
allowed him to order the atrocities in the name of some higher goal. Milgram’s
experiments confirmed the truth of Arendt’s idea of the “banality of evil.”
That is, humans are not inherently cruel, but become so when cruelty is
demanded by authority. This was the main lesson of his study:
Ordinary people, simply doing their jobs, and without any particular hostility
on their part, can become agents in a terrible destructive process.
Obedience to Authority can make for painful reading, especially the transcript
of an interview with an American soldier who participated in the My Lai
massacre in Vietnam. Milgram concluded that there was such a thing as inherent
psychopathy, or “evil,” but that it was statistically not common. His alarm
was more about how an average person (his experiments included women, too,
who showed almost no difference in obedience compared with men), if put into the right
conditions, can do terrible things to other people—and not feel too bad about
it.
This, Milgram noted, is the purpose of military training. Trainee soldiers
are put into an environment separate from normal society and its moral
niceties and instead are made to think in terms of “the enemy.” They are
instilled with a love of “duty”; the belief that they are fighting for a great
cause; and a tremendous fear of disobeying orders: “Although its ostensible
purpose is to provide the recruit with military skills, its fundamental aim is to
break down any residues of individuality and selfhood.” Trainee soldiers are
made to become agents for a cause, rather than freethinking individuals, and
herein lies their vulnerability to dreadful actions. Other people stop being
human beings, and become “collateral damage.”
What makes one person able to disobey authority, while the rest cannot?
Disobedience is difficult. Milgram’s subjects generally felt that their allegiance
was to the experiment and the experimenter; only a few were able to break
this feeling and put the person suffering in the chair above the authority system.
There was a big gap, Milgram noticed, between protesting that harm was
being done (which nearly all subjects did), and actually refusing to go on with
the experiment. Yet this is the leap made by those few who do disobey authority
on ethical or moral grounds. They assert their individual beliefs despite the
situation, whereas most of us bend to the situation. That is the difference
between a hero who is willing to risk their own life to save others, and an
Eichmann. Culture has taught us how to obey authority, Milgram remarked, but not
how to disobey authority that is morally reprehensible.
Obedience to Authority seems to offer little comfort about human nature.
Because we evolved in clear social hierarchies over thousands of years, part of
our brain wiring makes us want to obey people who are “above” us. Yet it is
only through knowledge of this strong tendency that we can avoid getting ourselves
into situations in which we might perpetrate evil.
Every ideology requires a number of obedient people to act in its name,
and in the case of Milgram’s experiment, the ideology that awed subjects was
not religion or communism or a charismatic ruler. Apparently, people will do
things in the name of Science just as Spanish Inquisitors tortured people in the
name of God. Have a big enough “cause,” and giving pain to another living
thing can be justified without too much difficulty.
That our need to be obedient frequently overrides previous education or
conditioning toward compassion, ethics, or moral precepts suggests that the
cherished idea of human free will is a myth. On the other hand, Milgram’s
descriptions of people who did manage to refuse to give further shocks should
provide us all with hope for how we might act in a similar situation. It may be
part of our heritage to obey authority mindlessly, but it is also in our nature to
set aside ideology if it means causing pain, and to be willing to put a person
above a system.
Milgram’s experiments might have been less well known were it not for
the fact that Obedience to Authority is a gripping work of scientific literature.
This is a book that anyone interested in how the mind works should have in
their library. The genocide in Rwanda, the massacre at Srebrenica, and the
affronts to human dignity at Abu Ghraib prison are all illuminated and partly
explained by its insights.
Stanley Milgram
Born in New York City in 1933, Milgram graduated from high school in 1950
and earned a bachelor’s degree from Queens College in 1954. He majored in
political science, but decided he was more interested in psychology and took
summer courses in the subject in order to be accepted into a doctoral program
at Harvard. His PhD was taken under the supervision of eminent psychologist
Gordon Allport, on the subject of why people conform. At Princeton University,
Milgram worked with Solomon Asch, who had developed famous experiments in
social conformity.
Other areas of research included why people are willing to give up
their seats on public transport, the idea of “six degrees of separation,” and
aggression and nonverbal communication. Milgram also made documentary
films, including Obedience, on the Yale experiments, and The City and the Self,
on the impact of city living on behavior. For more information, read Thomas
Blass’s The Man Who Shocked the World: The Life and Legacy of Stanley
Milgram (2004).
Milgram died in New York in 1984.