Gigerenzer’s Risk Savvy

Gerd Gigerenzer, Risk Savvy: How to Make Good Decisions, Allen Lane, 2014.

Gerd challenges the notion that people aren’t very good at making decisions, and need to be nudged into making better ones. He thus differs from Kahneman and advocates of ‘nudge’. His alternative vision is of a participatory democracy with citizens being better educated and informed, particularly on risk and uncertainty.

He notes:

  • Even for ‘tame’, quantifiable probabilities, people (including experts such as surgeons) routinely misunderstand them, particularly when this benefits those presenting the information. But with suitable presentation they do very much better. (Thus there may be a role for governments to ensure that information is presented appropriately, as in adverts.)
  • There is a difference between uncertainty and probability that matters but that is not widely understood. (He gives some examples of how this can be improved upon.)
  • That mathematics teaching on probability and statistics is inadequate, encouraging (or at least not correcting) attitudes that are inappropriate for broader uncertainty.

Actually, like many, GG seems to confuse mathematics as such with mathematics as it is widely taught, and hence to suppose that mathematics itself is concerned only with probability, not uncertainty. I beg to differ.

Part 1: The Psychology of Risk

1 Are People Stupid?

Terrorists use our brains

After 9/11:

[An] estimated sixteen hundred Americans lost their lives on the road due to their decision to avoid the risk of flying.
This death toll is six times higher than the total number of passengers (256) who died on board the four fatal flights.

Gerd seems to see the decisions not to fly as a dangerous over-reaction driven by fear, following this bad habit:

If many people die at one point in time, react with fear and avoid that situation.

He says:

Only when we are risk savvy can we resist terrorist manipulation  … To get there [we need to know] the actual risk of flying.

How many miles would you have to drive until the risk of flying is the same as in a nonstop flight [from New York to Washington]? [Twelve miles].

Unfortunately Gerd is not comparing like with like. If the death toll on the roads was roughly 400 in 3 months and you expected another terrorist attack, similar to 9/11, within 3 months, then driving looks quite sensible. Gerd seems to assume that the risk is one attack per year, but this seems rather arbitrary.
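To see how sensitive the comparison is to the assumed attack rate, here is a rough back-of-envelope calculation using only the figures quoted above; the attack-rate values are my illustrative assumptions, not estimates.

```python
# Back-of-envelope sensitivity check for the fly-vs-drive comparison.
# Figures from the text: ~1,600 extra road deaths attributed to avoiding
# flying, 256 passenger deaths on the four flights. The assumed attack
# rate is the free parameter that drives the whole comparison.

ROAD_DEATHS_PER_YEAR = 1600   # estimated extra road deaths (from the text)
DEATHS_PER_ATTACK = 256       # passengers on the four fatal flights

def expected_flight_deaths(attacks_per_year):
    """Expected passenger deaths per year if similar attacks were to
    recur at the assumed rate (illustrative only, not a risk model)."""
    return DEATHS_PER_ATTACK * attacks_per_year

# Gerd's implicit assumption of roughly one attack per year makes
# driving look far worse; an expectation of one attack per quarter
# narrows the gap considerably.
assumed_rates = {"one_per_year": 1, "one_per_quarter": 4}
results = {name: expected_flight_deaths(r) for name, r in assumed_rates.items()}
```

The point is not that either number is right, but that the conclusion flips with an assumption the book leaves implicit.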

Becoming Risk Savvy

Nonage is the inability to use one’s own understanding without another’s guidance.

I agree that this is a problem. In essence ‘nudge’ advocates guidance on what to do, which may be hidden, whereas Gerd suggests helping people really to understand their situations, which would preferably be ‘open’.

2 Certainty is an Illusion

My security blanket, please

Humans appear to have a need for certainty. … In an uncertain world, we cannot plan everything ahead: [we] can only cross each bridge when we come to it … .

This reprises Savage. Mathematically, there are different types of uncertainty. For some we appear to be able to cross our bridges before we reach them.

[Technologies] from mathematical stock prediction methods to medical imaging machines, compete for the confidence promised by religion and authority.

Gerd’s phraseology is conventional, but as a mathematician I wonder if it might not be misconstrued. It seems to me that we should always have complete confidence in mathematical methods, so long as they are genuinely mathematical. But how could there be any such thing as a ‘mathematical stock prediction method’? I think it means ‘stock prediction method that uses some mathematics’, but then why should we trust it? We might trust that the chicken in a meal is safe, but does that mean that we should trust the whole meal?

Risk and uncertainty

I use [the term risk] for a world where all alternatives, consequences, and probabilities are known. … Most of the time, however, we live in a changing world where some of these are unknown: where we face unknown risks, or uncertainty.

To use Savage’s terminology, in small worlds we have risk, but most of our problems concern large worlds, with ‘true non-measurable uncertainty’.

The secret of intuition: Unconscious rules of thumb

Calculated intelligence may do the job for known risks, but in the face of uncertainty, intuition is indispensable.

Here Gerd seems to take a very narrow view of mathematical decision theory, as if it were only applicable to small worlds. It is true that students are often taught to deal with small worlds first, and if they go no further they may be left with the impression that decision theory applies only to small worlds or (worse) that it is somehow ‘mathematical’, ‘logical’ or ‘pragmatic’ to treat all worlds as if they were small.

But Boole, for example, considers cases where one has possible probability distributions, subject to constraints. If these happen to imply unique values for the probabilities then one has a small world, and can apply the usual methods. In a large world one ends up with imprecise utilities. Sometimes (as in the 9/11 example, below) one can still identify a less undesirable course of action; at other times one can’t, but there are a variety of methods (see my blog) that can be used for different cases.

Thus it is not the case that mathematics can’t cope with large worlds. If you can specify the problem mathematically, you can use mathematics, albeit imprecisely. If you can’t, I suspect that an understanding of the relevant mathematics might still be helpful. Intuition and experience are typically vital in formulating the problem, even in small-world situations, so I think we need both mathematics and intuition, not ‘either or’.
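To make the Boolean idea concrete, here is a minimal sketch of my own (not from the book): when a probability is only constrained to an interval, expected utility becomes an interval, and sometimes one action still dominates.

```python
# Decision-making with imprecise probabilities, in the spirit of Boole:
# P(event) is only known to lie in [p_lo, p_hi], so expected utilities
# become intervals rather than single numbers.

def expected_utility_interval(p_lo, p_hi, u_if_event, u_if_not):
    """Range of expected utility when P(event) lies in [p_lo, p_hi].
    Expected utility is linear in p, so the extremes occur at the
    endpoints of the interval."""
    at_lo = p_lo * u_if_event + (1 - p_lo) * u_if_not
    at_hi = p_hi * u_if_event + (1 - p_hi) * u_if_not
    return min(at_lo, at_hi), max(at_lo, at_hi)

# Two actions, with P(event) known only to lie in [0.1, 0.4]
# (all utilities here are made-up illustrative numbers):
a = expected_utility_interval(0.1, 0.4, u_if_event=-100, u_if_not=0)
b = expected_utility_interval(0.1, 0.4, u_if_event=-10, u_if_not=0)
# b's whole interval lies above a's, so b dominates despite the
# imprecision; had the intervals overlapped, the 'small world'
# machinery alone could not decide.
```

This is exactly the case where one can still identify a less undesirable course of action without pinning down a precise probability.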

 The turkey illusion

Put yourself in the mind of a turkey. [The man who has fed you for the last few hundred days is approaching.] Will he feed you again? Using probability theory, you can calculate the chance that this will happen [using] the rule of succession … .

Again, Gerd makes a category error: how on earth can you apply probability theory here? He seems to have an intuition that probability applies. You could suppose that on each day there was some constant probability of being fed, and then use the rule of succession to put an upper bound on that conditional probability; but why should you? You can’t just forget the condition and claim that the result is at all mathematical. A general point from the theory of knowledge is that you shouldn’t trust extrapolations beyond your effective experience base. It seems to me that the turkey’s intuition could be tempered by learning some relevant mathematics.
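For completeness, the rule of succession Gerd invokes is easy to state; the sketch below (mine, not the book’s) makes the conditioning explicit: the answer is only meaningful under the independent-trials, fixed-probability assumption that fails for the turkey.

```python
from fractions import Fraction

def rule_of_succession(successes, trials):
    """Laplace's rule of succession: P(next success) = (s + 1) / (n + 2).
    Valid ONLY under the assumption of independent trials with a fixed
    unknown probability -- exactly the assumption the turkey cannot
    check, and which fails on the day before Thanksgiving."""
    return Fraction(successes + 1, trials + 2)

# After 100 days of being fed, the conditional 'probability' of being
# fed tomorrow is 101/102 -- impressively close to certainty, and
# impressively irrelevant if the world is about to change.
p_fed_tomorrow = rule_of_succession(100, 100)
```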

The quest for certainty

Gerd has 3 rules:

    1. RISK ≠ UNCERTAINTY. The best decision under risk is not [always] the best decision under uncertainty.
    2. RULES OF THUMB ARE NOT DUMB. In an uncertain world, simple rules of thumb can lead to better decisions than fancy calculations.
    3. LESS IS MORE. Complex problems do not always require complex solutions. Look for simple solutions first.

These are very reasonable, but elsewhere Gerd seems to suppose that:

  • Rules of thumb are always better in uncertain worlds than ‘fancy calculations’.
  • Mathematics is just about ‘fancy calculations’.

3 Defensive Decision Making

Risk aversion is closely tied to the anxiety of making errors. … Risk aversion is already fostered in schools, where children are discouraged from finding solutions to mathematics problems themselves and possibly make errors in the process. Instead, they are told the answer and tested on whether they can memorize and apply the formula.

Gerd never seems to define risk aversion, but it is probably true that most UK schools ‘teach to the test’ rather than encouraging pupils to do mathematics for themselves. The remedy might be more and better mathematical education, not less.

4 Why Do We Fear What’s Unlikely to Kill Us?

People’s goals have shifted steadily since the end of World War II towards more and more extrinsic goals [e.g. material rewards and others’ opinions]. [People] who report more internal control tend to fare better in life than those that don’t.

 Part II: Getting Risk Savvy

5 Mind Your Money

Where is the stock market going?

[Analysts] underestimate the volatility of the stock market and of the exchange rate. At fault for one are the mathematical models that they use. These treat the highly unpredictable financial market as if the risks were predictable. As a consequence, the forecasts consistently miss the large upswings and downswings and only do well if nothing remarkable happens – that is, when last year’s trend continues.

If we compare the turkey example, we could obviously fit a stochastic model to the amount of feed distributed, and we might accept that such a model could be called ‘mathematical’. But what basis is there for making predictions from it, except with the caveat ‘so long as the observed trends continue’? Naïve induction may be pragmatic, but it isn’t mathematical, no matter how ‘fancy’ the mathematics used.

Anyone can become a market guru

Some people

predict a downturn so many times that eventually it is bound to come true … .

Gerd is scathing about this. But suppose that we continually have some low subjective probability of the current trend not continuing and are forced to make a prediction. We can either be pragmatic and simply extrapolate, or we can try to predict what would happen if the situation should change. But the common-sense language of prediction does not allow us to express what we actually anticipate. This is not a limitation of the mathematics, but of the notion of prediction. (Actually, I would advocate a fancier model, but this simpler one makes the point.)

In practice one can identify and model various indicators such that if ‘current trends continue’ the indicators should stay within some identified ranges, so that deviations are indications that trends are not continuing and hence a crash could be imminent. (This will be more reliable with more and better indicators.)
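As a sketch of this indicator idea (my illustration; the band width and data are arbitrary), one might estimate from a calm period the range in which an indicator ‘should’ stay while trends continue, and flag excursions:

```python
# Crude control-band check: estimate a band from a reference window in
# which an indicator 'should' stay while current trends continue, and
# flag excursions as evidence that the trend may be breaking.

def trend_break_alerts(history, recent, k=3.0):
    """Return the values in `recent` lying more than k standard
    deviations from the mean of `history`."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / n
    sd = variance ** 0.5
    return [x for x in recent if abs(x - mean) > k * sd]

# An indicator that was stable around 1.0 suddenly jumps:
calm = [0.9, 1.0, 1.1, 1.0, 0.95, 1.05]
alerts = trend_break_alerts(calm, [1.02, 1.6, 0.98])
```

The reliability of such alerts depends, as noted, on having more and better indicators; a single band is easily fooled.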

How to beat a Nobel Prize Portfolio

The [mean-variance] portfolio maximizes the gain (mean) for a given risk or minimizes the risk (variance) for a given return.

But many numerate people use a simple rule of thumb:

Allocate your money equally to each of [the] funds.

Gerd presents the first rule as ‘mathematical’, since it involves more complicated calculations and has a mathematical proof that it is optimal for investment under risk. He regards the rule of thumb as a ‘heuristic’, since it is simple and relies more on intuition than mathematics. But he uses some quite reasonable mathematical arguments to show that in practice one never has enough data to justify the precise optimization of the mean-variance portfolio, and that the rule of thumb is less sensitive to model errors. (The variance is often a poor indicator of the potential for radical change, and unusually low variance can be a sign of an impending change.)
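The sensitivity is easy to illustrate. In this sketch of mine (two uncorrelated assets, made-up numbers), the mean-variance weights swing with every revision of the estimated means, while 1/N ignores them entirely:

```python
# For uncorrelated assets, the mean-variance weights are proportional
# to (estimated mean) / (estimated variance) -- so they inherit all the
# noise in those estimates. The 1/N heuristic has nothing to estimate.

def mean_variance_weights(mus, variances):
    """Normalised mu/sigma^2 weights (uncorrelated-assets case)."""
    raw = [m / v for m, v in zip(mus, variances)]
    total = sum(raw)
    return [r / total for r in raw]

def one_over_n(n):
    """The naive equal-allocation rule of thumb."""
    return [1.0 / n] * n

# A one-percentage-point revision of an estimated mean swings the
# 'optimal' portfolio from 50/50 to 60/40...
w_before = mean_variance_weights([0.05, 0.05], [0.04, 0.04])
w_after = mean_variance_weights([0.06, 0.04], [0.04, 0.04])
# ...while the heuristic is unmoved:
w_naive = one_over_n(2)
```

This is, I think, the substance of Gerd’s ‘mathematical argument for the heuristic’: robustness to estimation error, not an abandonment of mathematics.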

6 Leadership and Intuition

Do executives make gut decisions?

In a conference on business education,

[an economist] and I recommended scrapping much of the business curriculum and in its place teaching how to make successful decisions in an uncertain world. Against the objections of the faculty [of economics and business], who preferred more mathematically elegant theories … .

This suggests that decision theory as typically taught to economics and business students is ‘mathematically elegant’. But it seems to me that no matter how mathematically elegant one’s theories of probability and statistics are, teaching people to ignore significant uncertainties is neither mathematical nor elegant, but is silly and crude.

Gerd notes that executives are reluctant to act on their gut instinct, even when it is reasonable:

[Three] reasons were mentioned again and again:

  1. Rational justification is expected, intuition is not.
  2. Group decision making conflicts with gut feelings.
  3. The deep anxiety of not having considered all facts.

On the first point, the executives, like Gerd, seem to have a narrow view of what is ‘rational’. If they had a broader view they might find it easier to justify their assessments. Sometimes it is even rational to act on instinct. For example, when driving I may instinctively look for a hazard without pausing to think what might justify my looking. I have learned that acting on instinct in this case is useful (and arguably essential). Is this not rational?

The second point is often, I think, a matter of language and the nature of group-decision making. If I have a huge amount of experience and deep theoretical insight I may find it difficult to justify something which is actually perfectly rational. In both of these first two, the language of probability can be deeply constraining if one is actually faced with significant uncertainty and others are not used to dealing with it. Gerd’s book may help here, but more may be needed.

The third is a little strange. Is not intuition a fact? I think it means ‘facts that can be written down and verified by others’, in which case language and experience are once again important.

Less is more

In an uncertain world, the reason [that simple methods do better than complex ones] is that complex strategies are led astray by taking into consideration scores of details, many of which are irrelevant.

Actually, it seems to me that the complex strategies are optimised for cases with no uncertainty, whereas the simple strategies have been selected for their robustness against uncertainty.

Common misconceptions about intuition

Gerd makes the following points:

Logic (or statistics) is best for dealing with known risks, while good intuitions and rules of thumb are indispensable in an uncertain world. [Bookkeeping] is only best in a world of known risk, not under uncertainty.

This overlooks the kind of uncertainty that is represented by Boole or von Neumann and Morgenstern’s Theory of Games.

7 Fun and Games

This has a good discussion of both the idealised and real Monty Hall problems. The first has quantified risk, the other uncertainty. It cites the minimax rule as a ‘rule of thumb’, but it is also recommended by mathematical game theory, as above. Some of the other heuristics seem useful and genuinely not mathematical.

9 What Doctors Need to Know

We don’t need more paternalism, however; we need to teach people tools for thinking.

Including relevant mathematical tools?

The rest of this chapter is interesting and challenging.

10 Health Care: No decision About Me Without Me

Often you are given things like priors and likelihoods but are interested in the posteriors. Mathematically, this is straightforward to work out using Bayes’ rule, but people aren’t very good at it. Frequency data, presented using ‘icon boxes’, can be used to communicate the posteriors.

But there is a problem. What if the prior for you is significantly different? (E.g. there is a much greater prevalence for your family, ethnicity, occupation or region, or there is an epidemic.) With the existing data you could in principle work out the relevant posterior. Within icon boxes, you would need to make some adjustment, which would be harder. Maybe doctors should have an app for it?
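A hypothetical app of the kind suggested need only re-run Bayes’ rule with the tailored prior. Here is a minimal sketch of mine; the test characteristics are made-up illustrative numbers, not from the book:

```python
from fractions import Fraction

def positive_predictive_value(prevalence, sensitivity, false_positive_rate):
    """P(condition | positive test) via Bayes' rule. Re-running this
    with a different prevalence is exactly the 'tailored prior'
    adjustment discussed in the text."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# Illustrative test: 90% sensitivity, 9% false-positive rate.
# Population prevalence of 1%...
base = positive_predictive_value(Fraction(1, 100), Fraction(9, 10), Fraction(9, 100))
# ...versus, say, a family history raising the prior to 10%:
high = positive_predictive_value(Fraction(1, 10), Fraction(9, 10), Fraction(9, 100))
# The posterior jumps from under 10% to over 50% -- a standard icon box
# built for the population prevalence would seriously mislead this patient.
```

The frequencies behind an icon box could be regenerated from the same three inputs, which is why retaining the likelihood information matters.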

11 Banks, Cows, and Other Dangerous Things

The bankers’ new clothes

[Bankers] weave intricate, magnificent … “risk models” that promise safety that doesn’t exist. These … are said to measure risk precisely.

What Gerd says here is what mathematicians were saying before 2008. But, to be pedantic, suppose that bankers had had perfect risk models. By Gerd’s definition these only deal with the measurable risk, so any predictions are contingent on a crash not happening. That is, they assume ‘safety’, not promise it. Gerd also talks about Basel II. What he doesn’t say is that it set up a regime under which risk modellers competed to down-play risk, a process that in hindsight hardly seems sensible.

 The Turkey illusion

Finance mathematics has its roots in games of chance, that is, in known risk. The seductive power of this mathematics is that it allows risk to be estimated with just one number.

Gerd’s glossary has:

If the uncertainty associated with an event can be quantified on the basis of empirical observations or causal knowledge (physical design), the uncertainty is called risk.

Assuming that Gerd means ‘quantified by a single number’, then it seems reasonable for finance mathematics to attempt to estimate it. The problem surely is that financial uncertainty is not just risk.

Such calculations provide much greater assurance than warranted, a steady illusion of certainty. That’s why they don’t help to prevent disaster. Rather they themselves are a potential hazard and a cause of financial meltdown.

But if financial mathematical methods only address risk, they ignore uncertainty, and in particular the possibility that the future is not just a variation on the past. (It is not true that the further you drive without refuelling, the lower the probability of running out of fuel.) So financial mathematics as such cannot cause disaster, but perhaps a misinterpretation of the results did. Simple mathematics could have helped. For example, if every 18 months or so you have an event that according to your model should only happen about once every hundred years, your model should not be relied upon to predict long-run (tail) behaviours.
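The arithmetic behind that sanity check is simple. Under a Poisson model with a ‘hundred-year’ rate, one can compute directly how improbable the observed cluster is (my sketch, with illustrative numbers):

```python
from math import exp, factorial

def prob_at_least(k, rate_per_year, years):
    """P(at least k events in the period) under a Poisson model with
    the stated rate -- a quick check of a model's tail claims against
    what has actually been observed."""
    lam = rate_per_year * years
    p_fewer = sum(exp(-lam) * lam**i / factorial(i) for i in range(k))
    return 1 - p_fewer

# If 'once-per-hundred-years' events (rate 0.01/yr) have in fact
# occurred about four times in six years, the model assigns that
# history a negligible probability -- so its tail predictions should
# not be trusted.
p = prob_at_least(4, 0.01, 6)
```

No fancy statistics is needed to conclude that such a model is mis-calibrated in the tails; the observation alone condemns it.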

Risk can be calculated when …:

  • low uncertainty: the world is stable and predictable.
  • few alternatives: not too many risk factors to be estimated.
  • high amount of data available to make these estimations.

This advice mirrors that of Keynes, for example. It shows why financial mathematics must be mute about financial stability so long as it confines itself to risk. Stability is an assumption, not a conclusion.

Part III Start Early (Gerd’s Conclusions)

12 Revolutionize School

 Two First Principles for Teaching

Teach Problem Solving in the Real World

[Statistical] thinking is the most useful branch of mathematics for life – and the one that children find most interesting. Teaching statistical thinking means giving people tools for problem solving in the real world. … Instead of mechanically solving a dozen problems with the help of a particular formula, children and adolescents should be asked to find solutions to real-life problems. … we need to teach the teachers.

Also, they should learn how to interpret proposed solutions, for example by answering the question ‘under what circumstances might this result be seriously misleading?’

Don’t teach for Tests, Teach for Life

[People] behind education-reform movements typically assume that teachers and pupils alike need carrots and sticks to persuade or nudge them into doing their best. A new breed of corporate-world managers assume that business plans are the road toward better schools, replacing experienced teachers by less experienced ones with lower salaries or by online instruction, or by paying teachers by the average test scores of their pupils.

Some good rules of thumb are given as an alternative:

  • Hire well and let them do their job.
  • Don’t just test last month’s topic; include what was learned before and what was yet to be learned.

In terms of the UK, it seems to me that teaching of the type being advocated was once fashionable, and that carrots and sticks have had to be used to pressure teachers and other educators away from such quaint practices, to be more business-like and accountable. Even new teachers, it seems to me, would appreciate Gerd’s points and would welcome the opportunity to do better by his standards, rather than by the current standards. Gerd’s approach would also seem sensible very much more widely. Perhaps businesses should have re-formed to be more like schools?

Health Literacy

[One’s propensity to get various cancers depends on lifestyle.]

Isn’t this true for many health issues? Doesn’t this point to the need for tailored ‘icon boxes’, and hence the need to continue providing likelihood information, but to make it more comprehensible (perhaps using an app to produce tailored icon boxes)?

Everyone Can Learn to Deal with Risk and Uncertainty

[Coercing] and nudging people like a herd of sheep instead of making them competent is not a promising vision of democracy.

[It] is possible to improve competence in dealing with risk and uncertainty.

“Liberty cannot be preserved without a general knowledge among the people” [US Pres. John Adams, 1765]. This vision is known as participatory democracy.

My Conclusions

I agree that improving people’s understanding of uncertainty and their ability to deal with it, including the use of appropriate heuristics, would make a big difference in finance and health, and I would add crisis management, international relations and justice. I agree that we need to re-reform schooling, but think that many teachers (and headmasters and governors) could be part of the solution, given a suitable overall framework. Gerd hints at what such a framework should be, but no more. My own view is that the ‘reforms’ of the last few decades moved education into a dystopian vision of ‘the modern era’, and what we need is something to replace that vision. But what? Similarly for hospitals and many businesses.

Gerd is consistently critical of nudging. This is sometimes interpreted as a criticism of Kahneman, but most of the criticism is of excessive nudging, which Kahneman may not support. Gerd sees his icon boxes as ‘informing the citizen’ but one could equally well see them as a nudge. The critical thing to me is to discourage misunderstandings, by whatever means. One would also wish to encourage better positive understanding, but first we would need to identify what that is. Here Gerd’s ideas seem more sensible than paternalistic nudges.

As a mathematician, I accept that Gerd’s portrayal of the impact of mathematics on much Western practice is reasonable. But Gerd repeatedly sets up a choice between inappropriate mathematics and heuristics. What about appropriate mathematics? I think we need more, and more appropriate, mathematics, not less. Gerd’s implicit assumption seems to be that there is no more appropriate mathematics. I dispute this!

An example

Take Gerd’s example of the decision to fly or drive post 9/11. The risks of driving are reasonably quantifiable and moderate, in the sense that driving is something that most people do most of the time without a second thought, and while it results in a large loss of life, this is a cost that society as a whole is aware of and appears to accept. As I argued above, the risk of flying would have been worse, possibly much worse, if a similar repeat attack were expected within the next three months or less. From a conventional probability theory perspective you are required to estimate the probability of a repeat attack, hence compute the overall risk, and then choose the least risky option. But it is equally mathematical to follow Boole, von Neumann etc. and ask ‘can I rule out the possibility that another attack is intended soon?’, and if not to avoid flying. Gerd (and others) would regard this as ‘risk averse’. But actually it is uncertainty aversion, and why not?
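The ‘rule out’ approach can be made precise as a minimax comparison over scenarios. A toy sketch of mine, with purely illustrative outcome numbers:

```python
# Minimax over scenarios: instead of weighting scenarios by a single
# estimated probability, compare each option's worst case across the
# set of scenarios that cannot be ruled out.

def worst_case(outcomes):
    """An option's worst outcome over the scenarios considered."""
    return min(outcomes.values())

# Illustrative outcomes (higher is better) under the two scenarios
# that could not be ruled out after 9/11: no further attack, or a
# repeat attack soon.
options = {
    "fly":   {"no_attack": -0.1, "attack_soon": -50.0},
    "drive": {"no_attack": -1.0, "attack_soon": -1.0},
}

# Choose the option whose worst case is least bad. No estimate of the
# attack probability is required -- only the judgement that the
# 'attack_soon' scenario cannot be ruled out.
choice = max(options, key=lambda name: worst_case(options[name]))
```

This is the von Neumann-style reasoning referred to above: mathematical, but not probabilistic in the ‘small world’ sense.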

See Also

Savage is one of the key founders of modern statistics. He distinguishes between small and large worlds. Small worlds are where one has risk, large where one has uncertainty. Gerd observes that finance and health care have large world aspects. He may be right that in practice statistics are often applied as if one is in a small world, but this is not because statistics assumes that all worlds are small.

Gigerenzer’s ‘homo heuristicus’, and an interview with the Harvard Business Review.

I also have some notes on rationality and uncertainty and related psychology.

 

Dave Marsay

 

 
