Examples of Uncertainty in Real Decisions

Uncertainty beyond numeric probability is apparent in many familiar decisions. The focus here is on cases that may be familiar, where overlooked uncertainty seems to have led to important mistakes. See Sources of Uncertainty for an overview of the situations and factors considered.

Financial crash

Before the financial crash of 2007/8, finance was largely considered from the point of view that risk is variability. Keynes was ignored, both his economics and his mathematics of uncertainty and risk. After the crash Keynes’ economics and Keynesian economics came to the fore, and his ‘Knightian uncertainty’ became more widely recognized. It seems clear that the conditions and factors above – largely based on Keynes’ work – were operative. An approach to uncertainty that sought to uncover the key factors might have been more helpful than treating them as sources of variability and probability distributions.

UK Miscarriages of Justice

Emotion and assessment

The UK’s most notorious miscarriages of justice often share some of the following characteristics:

  • An event evokes public outrage (and hence tends to be rare).
  • There is intense pressure to find and punish those guilty.
  • Suspects who lie outside the mainstream of society are found.

Thus one tends not to have the conditions that support reliable probability judgements.

In the Birmingham Six case, a key piece of evidence was a forensic test said to show that one of the six had handled explosives ‘with a 99% certainty’. An appeal was turned down on these reflexive grounds:

“If they won, it would mean that the police were guilty of perjury; that they were guilty of violence and threats; that the confessions were involuntary and improperly admitted in evidence; and that the convictions were erroneous. That would mean that the Home Secretary would have either to recommend that they be pardoned or to remit the case to the Court of Appeal. That was such an appalling vista that every sensible person would say, ‘It cannot be right that these actions should go any further.’”

In their final appeal it was recognized that a similar forensic result could have been obtained if the suspect had handled playing cards. Similar forensic problems bedevilled other cases, such as that of the Maguire Seven.
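
The problem can be seen as one of confusing the reliability of a test with the probability of guilt. Below is a minimal Python sketch, with entirely hypothetical numbers, of how an innocent source of positive results (such as handling playing cards) undermines a test that is ‘99% certain’ only in the sense that it rarely misses explosives:

  # Hypothetical numbers only: a test with P(positive | handled explosives) = 0.99
  # can still leave the probability of guilt low if innocent materials also
  # trigger positives.
  p_positive_if_explosives = 0.99   # claimed reliability of the forensic test
  p_positive_if_innocent = 0.10     # hypothetical: e.g. recent handling of playing cards
  p_prior = 0.05                    # hypothetical prior probability of having handled explosives

  # Bayes' rule for P(handled explosives | positive test).
  numerator = p_positive_if_explosives * p_prior
  denominator = numerator + p_positive_if_innocent * (1 - p_prior)
  posterior = numerator / denominator

  print(f"P(handled explosives | positive test) = {posterior:.2f}")  # about 0.34

With these made-up figures the posterior is about 0.34, nowhere near ‘99% certainty’; the point is the structure of the calculation, not the particular numbers.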

Bayesian reasoning

The case R v T raised some relatively mundane issues of estimation. The weight of evidence depends on an estimate of the likelihood of the evidence on the supposition that the suspect is innocent. In R v T footmarks found at the scene of a murder matched an associate’s shoes. The original forensic scientist approximated the prevalence of the shoes using whole-population statistics. But for many crimes the perpetrators are likely to be drawn from some local population whose members resemble one another more than they resemble the general population, so typical forensic evidence is more probable under the appropriate local population than under the population as a whole: if the print of a particular shoe is found, that shoe is likely to be more common among the associates of the victim than in the population as a whole.
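
To illustrate why the choice of reference population matters, here is a minimal Python sketch with hypothetical prevalence figures, not those from the case:

  # Hypothetical numbers: how the reference population changes the likelihood
  # ratio for a matching shoe print.
  def likelihood_ratio(p_match_if_guilty, prevalence_if_innocent):
      # LR = P(evidence | suspect guilty) / P(evidence | suspect innocent)
      return p_match_if_guilty / prevalence_if_innocent

  p_match_if_guilty = 1.0  # assume the guilty party's shoe would certainly match

  # Whole-population statistics: suppose 1 in 500 people own this model of shoe.
  lr_whole = likelihood_ratio(p_match_if_guilty, 1 / 500)

  # Local population (e.g. the victim's associates): suppose 1 in 20 own it.
  lr_local = likelihood_ratio(p_match_if_guilty, 1 / 20)

  print(f"Likelihood ratio, whole-population prevalence: {lr_whole:.0f}")  # 500
  print(f"Likelihood ratio, local-population prevalence: {lr_local:.0f}")  # 20

Using the whole-population figure overstates the weight of evidence by a factor of 25 in this made-up example; which reference population is appropriate is itself a source of uncertainty.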

Weapons of Mass Destruction

Most Westerners, at least, regarded it as probable or highly probable that Saddam Hussein had WMD, leading to the decision to invade Iraq, after which none were found. From a probability perspective this may seem to be just bad luck. But it does seem odd that an assessment made on such a large and varied evidence base was so wrong.

This is clearly an area where probability estimation does not meet the conditions needed to be non-contentious: Saddam was not a randomly selected dictator. Thus one might have been prompted to look for the specific factors. There was some evidence, at the time, of:

  • complexity, particularly reflexivity
  • vagueness
  • source unreliability (widely blamed).

This might have prompted more detailed consideration, for example, of Saddam’s motivation: if he had no WMD, what did he have to lose by letting it be known? It seems unlikely that a routine sensitivity analysis would have been as insightful.

Stockwell

Two weeks after London’s 7/7 bombings, and a day after an attempted bombing, Jean Charles de Menezes was mistaken for a bomber and shot at Stockwell tube station. This case has some similarities to the miscarriages of justice above. As the Gold Commander made clear at the inquest, the key test was the balance of probability between the suspect being about to cause another atrocity and an innocent man being killed. The standard is thus explicitly probabilistic, rather than one of ‘reasonable doubt’.

The suspect was being followed by ‘James’s team’, and James said that ‘it was probably him [the known terrorist]’. From then on nothing suggested the suspect’s innocence, and he was shot before he could, as was feared, detonate a bomb.

The inquest did not particularly criticise any of those involved, but from an uncertainty perspective the following give pause for thought:

  • the conditions were far from routine
  • there were some similarities with known miscarriages of justice in terrorist cases
  • the specific factors above were present

More particularly:

  • The Gold Commander had access to relevant information that James lacked, which appears not to have been taken into account.
  • James regarded the request for a ‘probability assessment’ (as against hard evidence) as improper, and only provided one under pressure.
  • In assessing the probability, nothing that James’s team had seen (apart from some nervousness) suggested that the suspect was a terrorist. The main thing they had been told was that the suspect had come out of the flat of the known terrorist, but by then the Gold Commander knew that the terrorist’s flat had a shared doorway, so the probability assessment should have been reduced accordingly (a rough sketch of the arithmetic follows this list).
  • Those who shot the suspect were relying on James’s judgement, but were unaware of the circumstances in which he had given it.
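
As a rough illustration of the shared-doorway point, here is a Python sketch with entirely hypothetical numbers; the real prior and likelihoods were, of course, never quantified:

  # Hypothetical numbers: the effect of the shared doorway on the probability
  # that the man seen leaving was the known terrorist.
  flats_behind_doorway = 9  # hypothetical number of flats sharing the entrance
  p_from_terrorist_flat = 1 / flats_behind_doorway

  # Likelihood of the observed behaviour (nothing distinctive apart from some
  # nervousness) under each hypothesis; also hypothetical.
  p_behaviour_if_terrorist = 0.9
  p_behaviour_if_innocent = 0.5

  # Bayes' rule for P(terrorist | behaviour, seen leaving the shared doorway).
  numerator = p_behaviour_if_terrorist * p_from_terrorist_flat
  denominator = numerator + p_behaviour_if_innocent * (1 - p_from_terrorist_flat)
  posterior = numerator / denominator

  print(f"P(suspect is the terrorist) = {posterior:.2f}")  # about 0.18 with these numbers

With these made-up figures the balance of probability points the other way; the point is only that the shared doorway ought to have entered the assessment.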

With hindsight it may be significant that:

  • The suspect had got off the bus at Brixton, found the station to be closed, and got back on. The station was closed due to a security alert, but to those following him, who did not know this, the behaviour may have seemed to be anti-surveillance. [The inquest found that this innocent behaviour did not contribute to the death.]
  • The Gold Commander was in a reflexive situation: if the suspect was not shot then it must have been assessed that ‘on the balance of probability’ the suspect was innocent, in which case he ought not to have been followed.

Time was pressing, but a fuller consideration of uncertainty might have led to:

  • James being asked to supply descriptions of, and/or likelihoods for, what he had seen against the terrorist and innocent hypotheses, rather than ‘final’ probabilities.
  • Consideration being given to innocent explanations for the suspect’s behaviour.

More

Ulrich Beck opined (1992) that the ‘true uncertainty’ (Knightian) aspects of risk, particularly the reflexive aspects, are being mishandled, with widespread adverse consequences. Naomi Klein has expressed a similar view. Here are some relatively mundane specifics.

Economic Recovery from 2007/8

Robert Skidelsky, an advocate of Keynes and his view of uncertainty, has noted:

Keynes thought that the chief implicit assumption underlying the classical theory of the economy was that of perfect knowledge. “Risks,” he wrote, “were supposed to be capable of an exact actuarial computation. The calculus of probability … was supposed to be capable of reducing uncertainty to the same calculable status as certainty itself.”

For Keynes, this is untenable: “Actually…we have as a rule only the vaguest idea of any but the most direct consequences of our acts.” This made investment, which is always a bet on the future, dependent on fluctuating states of confidence. Financial markets, through which investment is made, were always liable to collapse when something happened to disturb business confidence. Therefore, market economies were inherently unstable.

Unless we start discussing economics in a Keynesian framework, we are doomed to a succession of crises and recessions. If we don’t, the next one will come sooner than we think.

Climate Change

Much of the climate change ‘debate’ seems to be driven by preconceived ideas and special interests, but these positions tend to align with different views on uncertainty.

Mobile phone cancer risk

The International Agency for Research on Cancer (IARC), part of the World Health Organization (WHO), has issued a press release stating that it:

has classified radiofrequency electromagnetic fields as possibly carcinogenic to humans (Group 2B), based on an increased risk for glioma, a malignant type of brain cancer, associated with wireless phone use.

… The conclusion means that there could be some risk, and therefore we need to keep a close watch for a link between cell phones and cancer risk.

Here ‘Group 2B: possibly carcinogenic to humans’ means: “This category is used for agents for which there is limited evidence of carcinogenicity … .” Thus it is possible that there is no carcinogenicity.

The Understanding Uncertainty blog has noted how the British media confused the issues, giving the impression that there was an increased risk of cancer. But from a probability perspective, what does ‘could be some risk’ mean? If the probability of risk r is p(r) then (from a standard Bayesian viewpoint) the overall risk is ∫ r·p(r) dr, which is positive unless there is definitely no risk. Thus if ‘there could be some risk’ then there is some risk. On the other hand, if we assess the risk as an interval, [0, small], then it is clear that there could be no risk, but (as the IARC suggests) further research is required to reduce the uncertainty. A toy calculation below makes the contrast concrete.
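
For instance, in Python (the numbers are purely illustrative):

  # Hypothetical numbers: a point Bayesian summary of 'could be some risk'
  # versus an interval assessment.
  p_no_risk = 0.8     # P(excess risk r = 0)
  p_small_risk = 0.2  # P(excess risk r = 0.001)
  small_risk = 0.001

  # Standard Bayesian summary: the expected risk, i.e. r weighted by p(r).
  expected_risk = p_no_risk * 0.0 + p_small_risk * small_risk
  print(f"Expected risk: {expected_risk:.4f}")  # 0.0002, positive: 'there is some risk'

  # Interval assessment: report the range of risks consistent with the evidence.
  risk_interval = (0.0, small_risk)
  print(f"Risk interval: {risk_interval}")  # includes 0: 'there could be no risk'

The expected risk is positive unless all the probability is placed on zero, whereas the interval makes explicit that zero has not been ruled out.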

The IARC also stated that:

The Working Group did not quantitate the risk; however, one study of past cell phone use (up to the year 2004), showed a 40% increased risk for gliomas in the highest category of heavy users (reported average: 30 minutes per day over a 10‐year period).

This is presumably the worst case to hand (balancing apparent effect and weight of evidence), so that (confusion of language apart) it is easy to interpret the release in terms of uncertainty, noting the link to heavy usage. It is unfortunate that the British media did not do so: maybe we do need a more nuanced language?

See Also

Reasoning under uncertainty methods, biases and uncertainty, metaphors, scaling

David Marsay