A compass will point to Magnetic North unless it’s broken or influenced by a magnet. But what if you had a compass that, no matter what you did to it, always pointed to Magnetic North? What an extraordinary and reliable compass that would be.
In the same way, what if your own moral compass were so unbreakable it could lead you to act honestly in life, regardless of any pressure to be dishonest? How powerful would you feel if you knew that you could rely on yourself to do the right thing, whatever the circumstances?
A strength like that would spread to other areas of your life. You would trust yourself and your decisions, because your direction would be clear to you. Yes, in life we make mistakes, we say the wrong thing, we make bad decisions – but you would know that beneath those blunders, those awkward words, those poor choices, was a moral compass steering you towards that ultimate goal you have set for yourself: to build yourself as a person.
It’s then you realise that those countless blunders you make in life, and all those little faults you have, don’t matter. They’re just speed-humps. White noise. Unimportant. They don’t amount to the proverbial hill of beans.
‘When you think of the long and gloomy history of Man, you will find more hideous crimes have been committed in the name of obedience than have ever been committed in the name of rebellion.’
C. P. Snow.
Philip Zimbardo’s experiment.
In 1971, Stanford University professor Philip Zimbardo used a university basement as a mock prison. Twenty-four students took the roles of prisoners or guards. After six days Zimbardo had to stop the experiment because some guards had become sadistic, and a few prisoners had become severely stressed or depressed.
Guards had forced the prisoners to sleep on a concrete floor, refused to let them urinate or defecate, and sexually humiliated them. Just for starters.
The prisoners had felt so powerless they could do nothing about their situation. They could not even leave the experiment because in their minds they had become real prisoners.
‘Any of us could be seduced to cross that line between good and evil. Most people are good people most of the time and we’d like to believe that we always are, and we never could cross that line. My research . . . leads me to conclude that most people under specified conditions, where there is dehumanisation at work, anonymity, diffusion of responsibility . . . can be drawn across that line and do things they could never imagine themselves usually doing.’
Dr Philip Zimbardo, Professor Emeritus of Psychology at Stanford University. (Interviewed by Natasha Mitchell on ABC Radio National’s program ‘All in the Mind’.)
There are questions about how well Zimbardo’s experiment was conducted, and about the merit of the conclusions drawn from it. But even if both Zimbardo’s and Milgram’s experiments were found to be worthless, the key in this chapter would still apply.
The Stanley Milgram experiment.
You, the reader, agree to participate in an experiment and are told that whatever happens, you will be paid for just turning up.
You go to a warehouse and meet another applicant. An officious man in a white coat appears and pays you both. Unbeknownst to you, the other applicant is an actor, a confederate. You both draw straws to decide which of you is to be the “teacher” and which the “learner”, but the draw is rigged so that you become the “teacher”. You watch the “learner” being strapped to a chair and having electrodes attached to his wrist.
You are led to a different room in which stands a fake electric shock generator with a row of 30 buttons. Each button ‘gives’ 15 volts more than the button preceding it. The last button ‘gives’ 450 volts. The buttons exceeding 180 volts are marked as dangerous, although you are assured the “learner” will receive no permanent tissue damage.
Using a microphone, you ask the learner (supposedly still strapped to his chair) questions from a list. Every time he gives a wrong answer you are to press a button and give him a shock, increasing the voltage each time. (Of course, no shock is given, but you hear a pre-recorded gasp or scream to make you think the learner is in pain.)
The learner gets answers wrong and pretty soon you hear him screaming for mercy, but the stern man in the white coat standing behind you urges you on. ‘You must continue!’ he insists. His job is to insist you persist with the experiment. He, the “supervising professor”, assures you he takes full responsibility.
So, you persist. At 315 volts you hear a blood-curdling scream and then nothing. You ask another question and receive only silence. You are told to interpret this as a wrong answer, so you press the next button. There are six questions to go. You continue to hear nothing after asking each question, so you keep increasing the number of volts. Finally, on the sixth question, you give him the maximum voltage. The experiment is over.
In 1961, forty people were tested in this way by Stanley Milgram, an assistant professor of psychology at Yale University. The experiment was designed to ascertain how normal people can come to commit atrocities.
Milgram, and the rest of the interested world, was shocked by the results. Most of the 40 volunteers gave the maximum (would-be fatal) voltage. Only one person refused to continue upon reaching the 255-volt level.
Over time, a thousand people were tested in varying ways, by different universities, and the results were in line with those of the first experiment.
In other words, 65% of us would knowingly give a fatal dose of electricity to another person if a stern man in a white lab coat insisted upon it!
This suggests that ordinary people can become torturers and killers if someone in authority tells them to act that way.
Would you have given that dangerous dose (exceeding 180 volts)? Probably you would say ‘no’. You might be right. But bear in mind that beforehand, every one of those volunteers probably would have said ‘no’ as well.
‘The person who, with inner conviction, loathes stealing, killing and assault, may find himself performing these acts with relative ease when commanded by authority. Behaviour that is unthinkable in an individual who is acting on his own may be executed without hesitation when carried out under orders.’
Stanley Milgram, Obedience to Authority (1974).
What were the attitudes of the people who pressed the buttons?
Many were interviewed later. Most did not regret participating. The participants who pressed every button, including the 450-volt button, justified their actions by saying they weren’t responsible. They took the attitude, ‘I was just following orders’. Of the person receiving the shocks, they thought along the lines of: ‘He agreed to it, and therefore must accept responsibility.’
One of the few people who stopped relatively early (at 255 volts) was asked who was responsible for shocking the learner against his will. His reply: ‘I would put it on myself entirely.’ He refused to assign any responsibility to the learner or to the man in the white lab coat.
One subject interviewed months later said: ‘. . . he (the man in the lab coat) said, “Just continue”, so I give him (the learner) the next jolt. And then I don’t hear no more answer from him, not a whimper or anything. I said, “Good God, he’s dead; well, here we go, we’ll finish him.” And I just continued all the way through to 450 volts. . . . I figured: well, this is an experiment and Yale (University) knows what’s going on. I’ll go through whatever they tell me to do . . . Well, I faithfully believed the man was dead, until we opened the door. When I saw him I said “Great, this is great.” But it didn’t bother me even to find that he was dead. I did a job.’
His wife later asked him, ‘Suppose the man was dead?’ He replied, ‘So he’s dead. I did my job.’
Q. ‘Why did people give fatal electric shocks just because a man in a white lab coat insisted upon it?’
▪We feel obliged to honour an agreement, to keep a promise.
▪When we lose ourselves in the rules we can lose sight of the big picture.
▪We want to prove we can do a good job.
▪If someone else is prepared to take responsibility, we abrogate ours.
▪We respect science and don’t want to interfere with an experiment.
▪When we choose to see ourselves as ‘instruments’ we can cede responsibility.
▪When someone uses firm assertiveness skills on us we can be easily manipulated.
▪We forget that just because someone accepts responsibility it does not mean they will act responsibly.
‘Legalistic thinking asks only “What am I permitted to do?” whereas truly moral thinking asks “What would be the right thing to do?”’
What am I suggesting? I’m suggesting that we strengthen our moral compass by being disobedient, in a healthy way. That is, being disobedient in order to do the right thing.
There are times in life when we experience peer pressure, or pressure from authority, to behave in unbecoming ways. We might be asked to lie, or cheat, or turn a ‘blind eye’. Some of us give in to the pressure, afraid of being perceived as a troublemaker. We don’t want to ‘make a fuss’; we don’t want to feel ‘different’. (Our deep need to belong is strong.)
In the early 2000s, some staff members in the Queensland Department of Health allowed a colleague to defraud the department of millions of dollars. He didn’t follow the protocols, but it is hard to say ‘no’ to someone you trust. Yet in that department it was their job to say ‘no’. Procedures needed to be followed, and the staff ignored those procedures. They were more interested in appearing agreeable than in protecting the department’s money.
At least, that’s how I understand the situation.
Their decisions gave them short-term satisfaction. After all, it feels good to be helpful. But it prompts the question: how many foolish decisions might any of us make because we want to appear agreeable? How often do we suspect we are being cheated by tradies, by financial advisors, or by sales staff, but instead of asking questions, or declining, we succumb, afraid of being considered disagreeable? How often do people refuse to discipline their children, afraid of being considered intolerant? How often do we agree with something just to fit in? How often do we let a racist joke slide because we don’t want to appear prudish?
Of course, I’m only guessing. I haven’t met the negligent staff of that department. I might be wrong to assume that if they had been presented with Milgram’s button they would have pressed it to the highest voltage, and kept pressing until their fingers had blisters. And come back next day to press it again.
Instinctively we know right from wrong. If we want our lives to have depth and substance, we need to be the type of person who would not push Milgram’s button.
If we can resist the pressure to seek other people’s approval, and instead do the right thing, then in the long run, we benefit. We develop the ability to see the big picture, and gain a clearer picture of what matters in life. With that knowledge we make sharper life decisions.
The benefits are long-lasting and pervasive.
‘Conformity is doing what everybody else is doing, regardless of what is right. Morality is doing what is right regardless of what everybody else is doing.’
When we can think and act for ourselves, unswayed by adverse pressures, we can resist the collective beliefs that might guide others into behaviours that are meaningless and unfulfilling. We can make sure our lives mean something.
And, we develop a mental toughness.
Some professionals take advantage of our politeness by overcharging, hoping we don’t object. If we have developed the courage to be undaunted by their authority, we can discuss the matter with them.
Instead of blindly following our doctor’s recommendations, we will have the courage to question the treatment.
My aunt refused to hire a financial advisor who was more focused on his commissions than on her retirement. He was a ‘guilt-tripper’, but she had the courage to say no.
When we learn to be disobedient, or to question authority, we can avoid being cheated, or misled, or corrupted. Politicians, for example, might cross the floor rather than vote for legislation they don’t agree with. Company directors could act on irregularities instead of ignoring them.
It’s about resisting pressure.
By speaking up we might ignore our ‘deep need to belong’ and feel anxious as a consequence. It’s scary to stand up and say no! But in the long run, having the ability to resist pressure and say ‘no’, and having the capacity to ask questions of someone in authority, will allow us to feel powerful and in control of our life.
When we strengthen our moral compass (so that it can in turn strengthen us), we reduce our tendency to become anxious in the first place. And that’s the aim of this book.
An added bonus: we discover that we can rely on ourselves. It’s a good feeling.
‘. . . every single breakthrough occurred because somebody decided to do something new. That first person’s actions ‘gave permission’ to others – if only to do what they already wanted to do.’
John-Paul Flintoff, How to Change the World.
To stand against expectations and say ‘no’ is scary in the short term, but the long-term reward is the feeling of being powerful and in control of our life. That leads to less anxiety, and to emotional resilience.
‘Disobedience is taking charge of your own life: taking over from a greater authority.’
Garrison Keillor, speaking to Ramona Koval on ABC Radio National’s ‘The Sunday Book Show’.
Q. ‘But when we stand up for what is right, don’t we usually fail?’
It’s the act of standing up which is important. It’s that act of defiance which adds to our inner authority.
‘. . . freedom is, by definition, people realizing that they are their own leaders.’
Diane Nash, quoted by John-Paul Flintoff in his book, How to Change the World.
Q. ‘Isn’t this about being assertive?’
It’s more than being assertive. I imagine many of the subjects who ‘electrocuted’ people in Stanley Milgram’s experiment could be assertive. I’m talking about the specific ability to question authority and say no. (Of course, assertiveness skills help with that.)
Practising disobedience is a key to resilience because every time we think for ourselves, we take responsibility for how our life unfolds. We realise we can rely on ourselves. That strengthens our belief that whatever happens, we will handle it.
Whistleblowers are disobedient to authority.
A whistleblower is a person who reveals dishonest activities in an organisation. The benefits gained from whistleblowing, from exercising a strong moral compass, might be outweighed by the consequences. People have suffered terribly after blowing the whistle: they have lost their jobs, been sued, and been ostracised. I don’t know why they are not supported. I would have thought we’d be giving them medals.
If you are an Australian and do choose to be a whistleblower, try this website: whistleblowers.org.au
‘It’s not what a lawyer tells me I may do; but what humanity, reason and justice tell me I ought to do.’
Edmund Burke, Second Speech on Conciliation with America, 1775.
Things we can do in daily life to develop that ‘disobedient’ streak:
1. We can practise the other tips in this book, so when the time does come to be disobedient we are prepared. For example, don’t lie, admit your mistakes, and apologise when necessary.
So, when someone asks us to lie about our age to get a discount, for example, we are ready to refuse to do so.
2. We can get into the habit of questioning experts, and people in authority. That doesn’t mean becoming a troublemaker. It means: get into the habit of asking a question. That’s all. Ask your doctor to clarify a matter. Ask the builder why it’s done that way. It’s all good practice for a time when it is necessary to question the experts, the tradies, the sales person, the solicitor, the financial advisor . . . or any professional who might be more interested in your money than in you.
3. Break the law. There are many examples of the law not getting it right. Instead of mindlessly complying with a bad law, we can consult our moral compass.
‘Rules are abstractions for controlling behaviour and eliciting compliance and conformity – challenge them when necessary: ask,
▪ who made the rule?
▪ what purpose does it serve?
▪ who maintains it?
▪ does it make sense in this specific situation?
▪ what happens if you violate it?’
Dr Philip Zimbardo, Professor Emeritus of Psychology at Stanford University, USA.
The last word: One day you will find yourself asked to perform an unethical task. Many people would give in to the pressure. Will you be one of those people?