I addressed graduating ANU students today, speaking about doubt and uncertainty, scepticism and risk-taking, experimenting and being prepared to make a mistake.
‘The Spirit Which is Not Too Sure It’s Right’
ANU Graduation Address
12 July 2012
In 1931, the British Air Ministry decided to experiment by commissioning a new fighter aircraft.[1] The bureaucrats wanted aviation engineers to abandon past orthodoxies and create something entirely new.
The initial prototypes were disappointing. But then a company called Supermarine approached the ministry with a radical new design. A public servant by the name of Henry Cave-Browne-Cave decided to bypass the regular process and order it. The new plane was the Supermarine Spitfire.
The Spitfire was one of the greatest technological breakthroughs in aviation history. One British pilot called it ‘a perfect flying machine’. It overturned the conventional aviation wisdom of the day, which held that countries should focus on bomber fleets.
It’s no exaggeration to say that without the Spitfire, Britain might not have been able to fight off the Luftwaffe and win the Battle of Britain. Asked what he needed to beat the British, a German ace told Hermann Göring, ‘I should like an outfit of Spitfires’.
As economist Tim Harford points out, without the Spitfire, Germany might have occupied Britain. The course of world history was changed because a public servant decided to experiment with something new.
Today I want to speak with you about the virtues of experimenting and taking risks, and their flipsides: making mistakes and being wrong. I want to argue that having doubt is a good thing; that a little modesty is a smart way to live. As US judge Learned Hand famously said, ‘The spirit of liberty is the spirit which is not too sure that it is right’.
* * * * *
In 1984, a young psychologist called Philip Tetlock had the job of summing up expert opinion on how the Soviet Union might react to Ronald Reagan’s Cold War policies. He was struck by how often the leading US experts flat-out contradicted one another, so he designed an experiment.
Tetlock asked 300 expert commentators to make specific forecasts about the future.[2] Then he waited to see their results. Across nearly 30,000 predictions, he found that the experts were about as accurate as dart-throwing monkeys.[3]
Among these professional pundits, the least accurate were those who viewed the world through the lens of a single idea – what philosopher Isaiah Berlin once called ‘hedgehogs’. As new facts came in, these pundits stuck inflexibly to their initial views. Those who did a better job were the group that Berlin called ‘foxes’, who based their analysis on observing as much as possible. They were much more willing to change their analysis as the world shifted.
It’s fun to laugh at the inaccuracy of professional pundits, but Tetlock’s findings have lessons for us too. You should remember what you said in the past, but you shouldn’t be slavishly bound to it. If it helps, remember that there are virtually no atoms in your body that were there seven years ago.
It’s ok to change your mind. And when you do, you might as well admit it. As Keynes once put it when asked why he had changed his position on monetary policy during the Great Depression: ‘When the facts change, I change my mind. What do you do, sir?’
Investor Nassim Taleb argues that when it comes to adjusting to a changing world, some people are better than others.[4] Entrepreneurs are very good at it. Senior businesspeople are often too reluctant to admit a mistake. Politicians, Taleb argues, are the worst of all.
In her splendid book On Doubt, ABC journalist Leigh Sales writes that ‘Politics is littered with the carcases of the indecisive.’[5] In 2004, US President George W Bush used the ‘flip-flopper’ tag to devastating effect on rival John Kerry. Yet it’s hardly radical to imagine that the world would be a better place if Bush had been a little more self-reflective.
A good way of achieving this is to surround yourself with people who disagree with one another. Abraham Lincoln is one of the greatest leaders in history partly because he chose a cabinet whose members argued with one another – what historian Doris Kearns Goodwin called ‘a team of rivals’.[6]
And yet it is all too easy to see groupthink on all sides of politics. Take the case of anthropogenic climate change, where scientific evidence has grown stronger – while political support has weakened.[7] As psychologist Jonathan Haidt puts it, ‘once group loyalties are engaged, you can’t change people’s minds by utterly refuting their arguments’.[8]
It shouldn’t be this way. Any politician who is truly committed to evidence-based policymaking ought to be willing to admit when their policy doesn’t work.[9]
* * * * *
Looking out over this audience, I know that each of you has the best education that time, money, and Australia’s national university can deliver. You are extraordinarily well-prepared.
And yet for all that preparation, none of you has any guarantee of where you will end up. Each of your careers will be shaped by luck.
So you should enter the world of full-time work with a willingness to experiment, and a recognition that the optimal job match may not be the first one you try.
My friends who work on the economics of marriage argue that the same principles apply there too.
On the social side, join more than one club. If you want to invest, buy shares in more than one company. In 1990, Harry Markowitz won the Nobel Prize for his work on portfolio investment strategies – formalising the old adage ‘don’t put all your eggs in the one basket’. It’s not bad advice in other contexts too.
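For the quantitatively minded, here is a worked illustration of the intuition behind that adage; the simplifying assumptions of independent returns with equal mean and variance are mine, not Markowitz’s. If you split your money equally across n such assets, each with variance \(\sigma^2\), your expected return is unchanged but the variance of your portfolio is

\[
\operatorname{Var}\!\left(\frac{1}{n}\sum_{i=1}^{n} R_i\right)
= \frac{1}{n^2}\sum_{i=1}^{n}\operatorname{Var}(R_i)
= \frac{\sigma^2}{n},
\]

which shrinks as n grows: spreading your eggs across more baskets lowers the risk you bear without lowering the return you can expect.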
Allow yourself some safety to experiment. When things don’t work out, learn from the experience. You must ‘make peace with your losses’.[10] This isn’t easy. In her book Being Wrong, Kathryn Schulz compares the feeling of being wrong about something fundamental to feeling like a toddler lost in Manhattan.[11]
But if you can master the art of experimentation and learning from your mistakes, you’ll achieve a great deal. Without the willingness to risk failure, you may never truly succeed.[12]
You should also be open to serendipity. Accidents can lead to breakthroughs. In 1928, Alexander Fleming’s dirty laboratory led him to discover the world’s first antibiotic in a contaminated petri dish.
Serendipity is literally in our DNA. Evolution is a series of random experiments carried out by nature, and each of us is the product of millions of years of them.
When experiments succeed, the result can be an extraordinary breakthrough like the Spitfire. But very often, experiments fail. That shouldn’t stop you from pursuing life with a spirit of sceptical experimentation.
Apply the same principles to those around you. Don’t try to surround yourself with people who are infallible; surround yourself with people who learn from their errors. In your workplace, try to create an atmosphere in which people are able to take risks. Never assume that the most senior person in an organisation has nothing to learn from the most junior.
You may have driven your parents and your lecturers crazy by asking ‘Why? Why? Why?’. Don’t stop now – it’s always worth asking whether things can be done better.
Being sceptical doesn’t mean lacking passion. You can be passionate about the change you want to see in the world – yet willing to be guided by evidence on the right way to achieve your ideals.
Leigh Sales points out that many of the great breakthroughs in history have begun from a position of scepticism. Copernicus asked whether the earth sat at the centre of the universe. Martin Luther asked whether God’s forgiveness could be purchased with money. Mary Wollstonecraft asked why women didn’t have rights. Nelson Mandela asked why South African blacks were kept separate. Each refused to accept the prevailing wisdom.
As the saying goes, the reasonable person adapts themselves to the world; the unreasonable person adapts the world to them. Therefore all progress depends on unreasonable people. So go forth, and be unreasonable.
[1] This account is drawn from Tim Harford, 2011, Adapt: Why Success Always Starts with Failure, Hachette, London.
[2] Philip Tetlock, 2005, Expert Political Judgment: How Good Is It? How Can We Know?, Princeton University Press, Princeton, NJ.
[3] The phrase ‘dart-throwing monkeys’ comes from a review essay by Louis Menand, ‘Everybody’s an Expert’, New Yorker, 5 December 2005.
[4] Nassim Taleb, interviewed by Russ Roberts on EconTalk, 16 January 2012. http://www.econtalk.org/archives/2012/01/taleb_on_antifr.html
[5] Leigh Sales, 2010, On Doubt, Melbourne University Publishing, Melbourne.
[6] Doris Kearns Goodwin, 2006, Team of Rivals: The Political Genius of Abraham Lincoln, Simon & Schuster, New York.
[7] From 2006 to 2012, the share of Australians who agree that ‘global warming is a serious and pressing problem. We should begin taking steps now even if this involves significant costs’ fell from 68% to 36%, while the share who say ‘until we are sure that global warming is really a problem, we should not take any steps that would have economic costs’ more than doubled, from 7% to 18%: The Lowy Institute Poll 2012: Public Opinion and Foreign Policy.
[8] Ezra Klein, ‘Unpopular Mandate’, New Yorker, 25 June 2012, pp. 30–33.
[9] For one example, see Andrew Leigh, ‘Lessons Important For Us All’, The Chronicle, 3 July 2012. http://www.andrewleigh.com/blog/?p=2868
[10] Daniel Kahneman and Amos Tversky, quoted in Tim Harford, 2011, Adapt: Why Success Always Starts with Failure, Hachette, London. Another way of putting this is that you should avoid the sunk cost fallacy.
[11] Kathryn Schulz, 2011, Being Wrong: Adventures in the Margin of Error, Granta Books, London (cited in Tim Harford, 2011, Adapt: Why Success Always Starts with Failure, Hachette, London).
[12] Had time permitted, I would at this point have embarked upon a lengthy paean to randomised policy trials.