“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that long-term prediction of future climate states is not possible.” – IPCC Third Assessment Report, 2001.
Despite this admission, the Intergovernmental Panel on Climate Change (IPCC) continues to have confidence in a scenario for the future which is so disturbing that nations are being asked to take drastic action to mitigate the climate “catastrophe” that awaits us. While actual evidence suggests that the climate is undergoing natural cyclical change and that the “man-made” impacts are small, policy makers are determined to act on models and scenarios from “experts” which remain unsupported by compelling “real world” evidence.
Indeed, the expert “consensus” position is based on selective use of evidence, some of it from peer-reviewed journals and some not, and on expert groupthink. Psychologists have studied this phenomenon and developed a thorough understanding of just how wrong experts can be.
Philip Tetlock, author of Expert Political Judgment[1] and a professor of psychology at the University of Pennsylvania, provides strong
empirical evidence for just how bad we are at predicting. He conducted a
long-running experiment that asked nearly 300 political experts to make a
variety of forecasts about dozens of countries around the world. After tracking
the accuracy of about 80,000 predictions over the course of 20 years, Tetlock
found:
That experts thought they knew
more than they knew. That there was a systematic gap between subjective
probabilities that experts were assigning to possible futures and the objective
likelihoods of those futures materializing … With respect to how they did
relative to, say, a baseline group of Berkeley undergraduates making
predictions, they did somewhat better than that. How did they do relative to
purely random guessing strategy? Well, they did a little bit better than that,
but not as much as you might hope …
The psychologist Daniel Kahneman, who won the Nobel Prize in Economics for his work on decision-making, has looked at the issue of “experts” and why they often get things wrong. In his book Thinking, Fast and Slow[2] he points to several aspects of their psychology as factors, but highlights two in particular as the primary causes of experts getting it wrong: the illusion of understanding and the illusion of validity.
The illusion of understanding refers to the idea that the
world is more knowable than it actually is. In particular, experts believe that
they have an in-depth and insightful understanding of the past and this enables
them to better understand the future. They use what Kahneman refers to as the WYSIATI rule – “what you see is all there is” – and this provides the basis for their confidence.
For example, according to some notable economists, it must be the case that high levels of government indebtedness (a debt-to-GDP ratio above 90% is the most recent version of this[3]) stifle the economy and reduce investor and entrepreneurial confidence. Or, according to some climatologists, it is obvious that human-generated CO2 is the major cause of climate change. Both of these understandings are based on a particular view of historical data and “facts” and an extrapolation of these views into the future.
The views exist independently of the evidence to support
them. Just as financial advisers are confident that they can predict the future behaviour of stocks, so macroeconomists are confident that their views on austerity have the weight of history behind them. Those committed to the view that human-produced CO2 is the primary cause of climate change are not deterred by evidence that it may not be, or that climate change has stalled for the last seventeen years.
Experts are sustained in their beliefs by a professional culture that supports them. Austerians (those who believe that austerity is the only way) have their own network of support, as do the Keynesians who oppose them. Anthropocene climatologists, who believe that man is the primary cause of global warming, have their own network of support among climate change researchers and politicians, while skeptical climate scientists have theirs. All remain ignorant of their ignorance and are sustained in their belief systems by selective use of evidence and by the support of stalwarts. These supportive networks and environments help sustain the illusion of validity. It is an illusion because evidence that contradicts the views of the “experts” is dismissed and denied – the expert position, whatever it may be, is valid simply because they are experts.
Indeed, to use Isaiah Berlin’s 1953 essay on Tolstoy (The Hedgehog and the Fox), Austerians and Anthropocene climatologists are “hedgehogs” – they know one big thing, they know what they know within a coherent framework, they bristle with impatience towards those who don’t see things their way and they are exceptionally focused on their forecasts. For these experts a “failed prediction” is an issue of timing, of the kind of evidence being adduced and so on – it is never because the prediction itself was wrong. Austerians who look at the failure of their policies in Europe, for example, suggest that the austerity did not go far enough; Anthropocene climatologists see the lack of warming over the last seventeen years as proof that they are right – the timing is just a little out. Even the climatologist trapped in thick ice in the Antarctic in December 2013, who set out to study the thinning ice-cap, claims he simply went to the wrong place – “climate change is happening and the ice is melting”, he says, as he is lifted off the thick ice by helicopter.
Tetlock’s work, cited above, is a powerful testimony to
these two illusions – understanding and validity. His results are devastating
for the notion of “the expert”. According to Kahneman, “people who spend
their time, and earn their living, studying a particular topic produce poorer
predictions than dart-throwing monkeys”.
Tetlock observes that “experts in demand were more overconfident than those who eked out existences far from the limelight”. We can see this in spades in both economics and climate change.
James Hansen, recently retired from NASA and seen as one of the world’s leading Anthropocene climatologists, makes predictions and claims that cannot be supported by the evidence he himself collected and was responsible for. For example, he suggested that “in the last decade it's warmed only about a tenth of a degree as compared to about two tenths of a degree in the preceding decade” – a claim not supported by the data set which he was responsible for. This overconfidence and arrogance comes from being regarded as one of the leading climate scientists in the world – the evidence is not as important as the claim or the person making it. Hansen suffers from the illusion of skill.
Kahneman recognizes people like Hansen. He suggests
“…overconfident
professionals sincerely believe they have expertise, act as experts and look
like experts. You will have to struggle to remind yourself that they may be in
the grip of an illusion.”
There are other psychological features of the expert that are worthy of reflection: for example, the way “groupthink” starts to permeate a discipline, such that those outside the group cannot be heard as rational or meaningful – they are dismissed as “deniers” or “outsiders” – and the power of a group (which will claim consensus as if this ends scientific debate) to close ranks, limit the scope of conversation or act as gatekeeper for it. Irving Janis documented the characteristics of groupthink in his 1982 study of policy disasters and fiascoes[4]. He suggests these features:
- Illusion of invulnerability – Creates excessive optimism that encourages taking extreme risks. We can see this in the relentless pursuit of austerity throughout Europe.
- Collective rationalization –
Members discount warnings and do not reconsider their assumptions. We see
this in relation to both climate change and austerity economics.
- Belief in inherent morality –
Members believe in the rightness of their cause and therefore ignore the
ethical or moral consequences of their decisions. Austerians appear to willfully ignore the level of unemployment and the idea of a lost generation of young workers, especially in Greece and Spain. Anthropocene climate researchers generally present themselves as morally superior.
- Stereotyped views of out-groups –
Negative views of “enemy” make effective responses to conflict seem
unnecessary. Climate “deniers” commonly face suggestions that they be
prosecuted or punished in some way[5].
- Direct pressure on dissenters –
Members are under pressure not to express arguments against any of the
group’s views. This has occurred in the climate change research community, since grants appear to favour those who adopt the view that man-made CO2 is the primary cause of climate change.
- Self-censorship – Doubts and
deviations from the perceived group consensus are not expressed.
- Illusion of unanimity – The
majority view and judgments are assumed to be unanimous. This is
especially the case in “consensus” (sic)
climate change science and amongst Austerians.
- Self-appointed ‘mindguards’ –
Members protect the group and the leader from information that is
problematic or contradictory to the group’s cohesiveness, view, and/or
decisions.
All of these characteristics can be seen to be in play in the two examples used throughout this chapter – the economics of austerity and man-made global warming.
There is also the issue of the focusing
illusion. Kahneman sums this up in a single statement: “nothing in
life is as important as you think it is when you are thinking about it”.
“Government debt is the most important economic challenge facing society today,” says a well-known economist; “climate change is a life and death issue,” says US Secretary of State John Kerry. Neither of these statements is true for anyone unless they are obsessive.
Society faces a great many challenges. Much depends on our own preoccupations and on the focus we choose for the concerns we have. Some are more concerned about the future of Manchester United or Chelsea football clubs than they are about debt, deficits or climate change. The illusion is that one person’s focus is, by definition, better than another’s simply because they are an expert in the field.
Nassim Taleb makes a very compelling argument against forecasting in several of his books, most notably in The Black Swan[6]. He explains that we can make use of very short-term guesses or predictions, but long-term forecasts are nothing more than pure guesswork. We ascribe far too much predictability to the truly unpredictable. It is very common for our human brains to believe we are recognizing patterns in what is only a random sequence of events. Experts have tried to overcome these human fallacies with tools such as quantitative modeling. However, even these models play to our biases. We believe that models that have accurately predicted the future in the past are likely to predict the future going forward. But that is no more true than believing me when I tell you that a coin will land heads up just because I accurately predicted it would do so the last ten times.
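To make that last point concrete, here is a minimal simulation sketch (my illustration, not Taleb’s). It assumes nothing more than a hypothetical pool of forecasters guessing coin flips at random, and shows how easily a lucky streak can masquerade as skill.

```python
import random

# A pool of "forecasters" who guess coin flips purely at random.
# Question: how many end up with a perfect 10-flip record by luck alone?
random.seed(42)

n_forecasters = 10_000
n_flips = 10

perfect_records = 0
for _ in range(n_forecasters):
    # Each guess matches the actual flip with probability 0.5.
    correct = sum(random.random() < 0.5 for _ in range(n_flips))
    if correct == n_flips:
        perfect_records += 1

print(f"Forecasters with a perfect {n_flips}-flip record: {perfect_records}")
print(f"Expected by pure chance: {n_forecasters / 2**n_flips:.1f}")
# Roughly ten of the ten thousand random guessers will look like coin-flip
# "experts", yet each still has only a 50% chance on the next flip.
```

A track record built on chance predicts nothing; the same caution applies to models whose past “accuracy” is offered as evidence that their next long-range forecast will hold.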
So beware of predictions, especially those made by experts. New Year’s Eve and New Year’s Day are the prime season for predictions. You have been warned.
[1] Tetlock, P. E. (2006) Expert Political Judgment: How Good Is It? How Can We Know? Princeton, NJ: Princeton University Press.
[2] Kahneman, D. (2011) Thinking, Fast and Slow. Toronto: Doubleday Canada.
[3] Reinhart, C. and Rogoff, K. (2009) This Time Is Different: Eight Centuries of Financial Folly. Princeton, NJ: Princeton University Press.
[4] See Janis, I. L. (1982) Groupthink: Psychological Studies of Policy Decisions and Fiascoes, 2nd ed. Boston: Houghton Mifflin.
[5] It has been suggested that those who deny that climate change is caused by human activity should be put to death. See http://joannenova.com.au/2012/12/death-threats-anyone-austrian-prof-global-warming-deniers-should-be-sentenced-to-death
[6] Taleb, N. (2010) The Black Swan: The Impact of the Highly Improbable. New York: Random House.
1 comment:
This is a valuable summary of the underlying psychological issues of taking strong predictive stances but, of course, the groupthink characteristics can be applied to both proponents and deniers of anthropogenic climate change. We all appear to be vulnerable in varying degrees to confirmation bias. But I still find it hard to overlook the Ehrlich formula I = P × A × T that drives human impact exponentially within a finite planetary space. Population may level off, according to the UN prediction, at around 10.1 billion, but affluence and technology appear unstoppable. Climate change might be an aspect of this basic clash between unlimited growth and finite carrying capacity, but it is only one. How about a look at oil supply and demand trends? I wonder how you respond to http://www.homerdixon.com/ ? He does worry about the impact of GHG emissions but sees the decline of oil as an even greater threat to business as usual.