‘When you think of the long and gloomy history of Man, you will find more hideous crimes have been committed in the name of obedience than have ever been committed in the name of rebellion.’
C. P. Snow.
Philip Zimbardo’s experiment.
In 1971, Stanford University professor Philip Zimbardo used a university basement as a mock prison. Twenty-four students took the roles of prisoners or guards. After six days Zimbardo had to stop the experiment because some guards had become sadistic, and a few prisoners had become severely stressed or depressed.
Guards had forced the prisoners to sleep on a concrete floor, refused to let them urinate or defecate, and sexually humiliated them. Just for starters.
The prisoners had felt so powerless they could do nothing about their situation. They could not even leave the experiment because in their minds they had become real prisoners.
‘Any of us could be seduced to cross that line between good and evil. Most people are good people most of the time and we’d like to believe that we always are, and we never could cross that line. My research . . . leads me to conclude that most people under specified conditions, where there is dehumanisation at work, anonymity, diffusion of responsibility . . . can be drawn across that line and do things they could never imagine themselves usually doing.’
Dr Philip Zimbardo, Professor Emeritus of Psychology at Stanford University.
(Interviewed by Natasha Mitchell on ABC Radio National’s program, ‘All in the Mind’.)
There are questions about how well Zimbardo’s experiment was conducted, and about the merit of the conclusions drawn. But even if both Zimbardo’s and Milgram’s experiments were found to be useless, the key in this chapter would still apply.
The Stanley Milgram experiment.
You, the reader, agree to participate in an experiment and are told that whatever happens, you will be paid for just turning up.
You go to a warehouse and meet another applicant. An officious man in a white coat appears and pays you both. Unbeknownst to you, the other applicant is an actor, a confederate. You both draw straws to decide which of you is to be the “teacher” and which the “learner”, but the draw is rigged so that you become the “teacher”. You watch the “learner” (the confederate) being strapped to a chair and having electrodes attached to his wrist.
You are led to a different room in which stands a fake electric shock generator with a row of 30 buttons. Each button ‘gives’ 15 volts more than the button preceding it. The last button ‘gives’ 450 volts. The buttons exceeding 180 volts are marked as dangerous, although you are assured the “learner” will receive no permanent tissue damage.
With a microphone and a list of questions you ask the learner (supposedly strapped to his chair) the questions. Every time he gives a wrong answer you are to press a button and give him a shock, increasing the voltage each time. (Of course, no shock is given, but you hear a pre-recorded gasp/scream to let you think the learner is in pain.)
The learner gets answers wrong and pretty soon you hear him screaming for mercy, but the stern man in the white coat standing behind you urges you on. ‘You must continue!’ he insists. His job is to insist you persist with the experiment. He, the “supervising professor”, assures you he takes full responsibility.
So, you persist. At 315 volts you hear a blood curdling scream and then nothing. You ask another question and receive only silence. You are told to interpret this as a wrong answer, so you press the next button. There are six questions to go. You continue to hear nothing after asking each question, so you keep increasing the number of volts. Finally, on the sixth question, you give him the maximum voltage. The experiment is over.
In 1961, forty people were tested in this way by Stanley Milgram, an assistant professor of psychology at Yale University. The experiment was designed to ascertain how normal people can come to commit atrocities.
Milgram, and the rest of the interested world, was shocked by the results. Most of the forty volunteers gave the maximum (would-be fatal) voltage. Only one person refused to continue upon reaching the 255-volt level.
Over time, a thousand people were tested in varying ways by different universities, and the results were in line with those of the first experiment.
In other words, 65% of us would knowingly give a fatal dose of electricity to another person if a stern man in a white lab coat insisted upon it!
This suggests that ordinary people can become torturers and killers if someone in authority tells them to act that way.
Would you have given that dangerous dose (exceeding 180 volts)? No? You might be right. Bear in mind that beforehand, every one of those volunteers probably would have said ‘no’.
‘The person who, with inner conviction, loathes stealing, killing and assault, may find himself performing these acts with relative ease when commanded by authority. Behaviour that is unthinkable in an individual who is acting on his own may be executed without hesitation when carried out under orders.’
Stanley Milgram, Obedience to Authority.
What were the attitudes of the people who pressed the buttons?
Many were interviewed later. Most did not regret being participants. The participants who pressed every button, including the 450 volts button, justified their actions by saying they weren’t responsible. They took the attitude, ‘I was just following orders’. Of the person receiving the shocks they thought along the lines of: ‘He agreed to it, and therefore must accept responsibility.’
One of the few people who stopped relatively early (at 255 volts) was asked who was responsible for shocking the learner against his will. His reply: ‘I would put it on myself entirely.’ He refused to assign any responsibility to the learner or to the man in the white lab coat.
One subject interviewed months later said: ‘. . . he (the man in the lab coat) said, “Just continue”, so I give him (the learner) the next jolt. And then I don’t hear no more answer from him, not a whimper or anything. I said, “Good God, he’s dead; well, here we go, we’ll finish him.” And I just continued all the way through to 450 volts. . . . I figured: well, this is an experiment and Yale (University) knows what’s going on. I’ll go through whatever they tell me to do . . . Well, I faithfully believed the man was dead, until we opened the door. When I saw him I said “Great, this is great.” But it didn’t bother me even to find that he was dead. I did a job.’
His wife later asked him, ‘Suppose the man was dead?’ He replied, ‘So he’s dead. I did my job.’
Q. ‘Why did those people give fatal electric shocks just because a man in a white lab coat insisted upon it?’
▪ We feel obliged to honour an agreement, to keep a promise.
▪ When we lose ourselves in the rules we can lose sight of the big picture.
▪ We want to prove we can do a good job.
▪ If someone else is prepared to take responsibility, we abrogate ours.
▪ We respect science and don’t want to interfere with an experiment.
▪ We can choose to see ourselves as ‘instruments’, and so cede responsibility.
▪ When someone uses firm assertive skills on us we can be easily manipulated.
▪ We forget that just because someone accepts responsibility it does not mean they will act responsibly.
Many of us will find ourselves asked to perform tasks we believe to be unethical. Many of us will give in to the pressure.
Will one of those people be you?
‘Legalistic thinking asks only “what am I permitted to do?” whereas truly moral thinking asks “what would be the right thing to do?’
So, what am I suggesting?
I’m suggesting that we strengthen our moral compass by being disobedient, in a healthy way. That is, being disobedient in order to do the right thing.
There are times in life when we experience peer pressure, or pressure from authority, to behave in ways unbecoming. We might be asked to lie, or cheat, or turn a ‘blind eye’. Some of us give in to the pressure, afraid of being perceived as a troublemaker. We don’t want to ‘make a fuss’; we don’t want to feel ‘different’. (Our deep need to belong is strong.)
In the early 2000s, staff in the Queensland Department of Health allowed a man to steal over $16 million through fraud. Some officers in the department signed fraudulent documentation at his request because they trusted him as their manager. They also failed to question discrepancies when certifying and processing his fraudulent payments.
It is hard to say ‘no’ to someone you trust, but in that department it was their job to say no. Procedures needed to be followed, and the staff ignored them. They were more interested in appearing agreeable than in protecting the department’s money, because they feared the man’s disapproval. They didn’t want to appear petty.
At least, that’s how I understand the situation.
Their decisions gave them short-term satisfaction (after all, it feels good to be helpful) but how many foolish decisions have they made in life because they wanted to appear agreeable? How often did they suspect they were being cheated by tradies, by financial advisors, or by sales staff, but rather than ask questions or decline, they looked away, afraid of being considered disagreeable? How often did they refuse to discipline their children, afraid of being considered intolerant? How often did they agree with something just to fit in?
Of course, I’m only guessing. I don’t know the staff of that department. I might be wrong when I assume that if they were too afraid to question that man’s aberrant behaviour, they would be too afraid to do anything in their lives out of the ordinary. I might be wrong in assuming that if they had been presented with Milgram’s button, they would have pressed it to the highest voltage, and kept pressing until their fingers had blisters. And come back the next day to do it again.
Instinctively we know right from wrong. If we want our lives to have depth, to have meaning, we need to be the type of person who would not push Milgram’s button.
Otherwise, we might as well just leave our moral compass in the cupboard and follow everyone else. Until we die.
‘Conformity is doing what everybody else is doing, regardless of what is right. Morality is doing what is right regardless of what everybody else is doing.’
If we can resist the pressure to seek other people’s approval, and instead actually do the thing we are meant to do: the right thing, then in the long run we benefit. We develop the ability to see the big picture, and gain a clearer picture of what matters in life. And with that knowledge we make sharp life decisions.
The benefits are long lasting and pervasive.
When we can think and act for ourselves, unswayed by adverse pressures, we can resist the collective beliefs that guide others into behaviours meaningless and unfulfilling. We can make sure our lives mean something.
And, we develop mental toughness.
Some professionals take advantage of our politeness by overcharging, hoping we don’t object. If we have developed the courage to be undaunted by their authority, we can discuss the matter with them.
Instead of blindly following our doctor’s recommendations, we can have the courage to question the treatment.
My aunt refused to hire a financial advisor who was more focused on his commissions than on her retirement. He was a ‘guilt-tripper’, but she had the courage to say no.
When we learn to be disobedient, or question authority, we can avoid turning a blind eye to bad behaviour, or worse, being corrupted. Politicians, for example, might cross the floor rather than vote for legislation they don’t agree with. Company directors could act on irregularities instead of ignoring them.
It’s about resisting pressure.
By speaking up we might ignore our ‘deep need to belong’ and feel anxious as a consequence. It’s scary to stand up and say no. But in the long run, we not only strengthen our moral compass (so that it can in turn strengthen us), we also learn to deal with that anxiety. In so doing, we reduce our capacity to become anxious. And that’s the aim of this book.
An added bonus: we discover that we can rely on ourselves. It’s a good feeling.
To stand against expectations and say ‘no’ is scary in the short-term, but having the ability to resist pressure and say ‘no’, and having the capacity to ask questions of someone in authority, will in the long-term allow us to feel powerful and in control of our life. That leads to less anxiety, and to emotional resilience.
Even when we fail to change the situation we can tell ourselves, ‘At least I had the courage to speak up.’
Being disobedient when it matters is a grand way to take responsibility for how our life unfolds.
‘Disobedience is taking charge of your own life, when you take over from a greater authority.’
Garrison Keillor, speaking to Ramona Koval on ABC Radio National’s ‘The Sunday Book Show’.
‘. . . every single breakthrough occurred because somebody decided to do something new. That first person’s actions ‘gave permission’ to others – if only to do what they already wanted to do.’
John-Paul Flintoff, How to Change the World.
‘. . . freedom is, by definition, people realizing that they are their own leaders.’
Diane Nash, quoted by John-Paul Flintoff in his book, How to Change the World.
Q. ‘Isn’t this about being assertive?’
It’s more than being assertive. I imagine many of the subjects who ‘electrocuted’ people in Stanley Milgram’s experiment could be assertive. I’m talking about the specific ability to question authority and say no. (Of course, assertiveness skills help with that.)
Practising disobedience is a key to resilience because every time we think for ourselves, we take responsibility for how our life unfolds. We realise we can rely on ourselves. That strengthens our belief that whatever happens, we can handle it.
‘It’s not what a lawyer tells me I may do; but what humanity, reason and justice tell me I ought to do.’
Edmund Burke, Second Speech on Conciliation with America, 1775.
Whistleblowers are disobedient to authority.
A whistleblower is a person who reveals dishonest activities in an organisation. The benefits gained from whistleblowing, from exercising a strong moral compass, might be outweighed by the consequences. People have suffered terribly after blowing the whistle. They have lost their jobs, been sued, and been ostracised.
I don’t know why they are not supported. I would have thought we’d give them medals.
If you do choose to be a whistleblower, try this Australian website: whistleblowers.org.au
Things we can do in daily life to develop that ‘disobedient’ streak:
1. We can practise the other keys in this book, so that when the time comes to be disobedient we are prepared. For example:
▪ Don’t lie about anything.
▪ Admit your mistakes.
▪ Apologise if necessary.
So when someone asks us to lie about our age to get a discount, for example, we are ready to refuse to do so.
2. We can get into the habit of questioning experts, and people in authority. That doesn’t mean becoming a troublemaker. It means: get into the habit of asking a question. That’s all. Ask your doctor to clarify a matter. Ask the builder why it’s done that way. It’s all good practice for a time when it is necessary to question the experts, the tradies, the salesperson, the solicitor, the financial advisor . . . or any professional who might be more interested in your money than in you.
After we have made an appointment we can steadily compile a list of questions so that when we meet the person, we’re ready.
3. Break the law. There are many examples of the law not getting it right. Instead of mindlessly complying with a bad law, we can consult our moral compass instead.
‘Rules are abstractions for controlling behaviour and eliciting compliance and conformity – challenge them when necessary: ask,
▪ who made the rule?
▪ what purpose does it serve?
▪ who maintains it?
▪ does it make sense in this specific situation?
▪ what happens if you violate it?’
Dr Philip Zimbardo, Professor Emeritus of Psychology at Stanford University, USA.