Risk and Uncertainty Communication

David Spiegelhalter, Risk and Uncertainty Communication, Annu. Rev. Stat. Appl. 2017. 4:31–60

Abstract

This review briefly examines the vast range of techniques used to communicate risk assessments arising from statistical analysis. … There are some tentative final conclusions, but the primary message is to acknowledge expert guidance, be clear about objectives, and work closely with intended audiences.

My Comments

This provides extensive, useful, authoritative guidance on the ‘state of the art’ for communicating lack of certainty to the public. I am reviewing it as a mathematician who has been consulted on these issues, particularly with regard to actual and potential crises.

Introduction

Background

[We] need to consider what we mean by the ill-defined terms in the title: risk, uncertainty, and communication.

What is Risk?

In everyday English usage, “risk” generally refers to undesirable things that might happen. In contrast, the official ISO 31000 (ISO 2009) definition of risk is the more balanced, if slightly vacuous, “effect of uncertainty on objectives.”

From a theoretical perspective, rational decision-making in the face of risk comprises four basic stages (Savage 1951):

  1. structuring the list of actions, and the possible consequences of actions,
  2. giving a value to those possible futures,
  3. assigning a probability for each possible consequence, given each action, and
  4. establishing the rational decision as that which maximizes the expected benefit.

This theory holds in situations of perfect contextual knowledge, such as gambling on roulette when we know the probabilities and possible gains or losses. These perfect-chance problems are the only type in which we might talk about a truly known risk, and even then we need to make assumptions about the fairness of the wheel and the integrity of the casino. Savage (1972) introduced the idea of an idealized small world with which a statistical model deals, but the real, big world is a far messier place, in which we do not have access to all the information necessary to carry out his normative rational behavior. So risk assessment, and therefore risk communication, has to deal with this additional uncertainty.
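
As a toy illustration of these four stages in a genuinely ‘small world’, here is a minimal sketch (my own, using the standard European-roulette numbers rather than anything from the review) that lists the actions and their consequences, attaches values and probabilities, and picks the action that maximizes expected benefit:

```python
# A minimal sketch of Savage's four stages in a perfect-chance "small world":
# betting one unit on red at European roulette. Probabilities and payoffs are
# the standard textbook ones, assumed here purely for illustration.

# Stage 1: actions and their possible consequences;
# Stages 2-3: values and probabilities, as (probability, gain) pairs.
actions = {
    "bet_on_red": [(18 / 37, +1.0), (19 / 37, -1.0)],
    "do_not_bet": [(1.0, 0.0)],
}

def expected_benefit(consequences):
    # Stage 4: expected benefit of an action.
    return sum(p * v for p, v in consequences)

for action, consequences in actions.items():
    print(f"{action}: expected benefit = {expected_benefit(consequences):+.4f}")

# The 'rational' decision maximizes expected benefit.
best = max(actions, key=lambda a: expected_benefit(actions[a]))
print("Rational choice in this small world:", best)
```

In this small world the expected benefit of betting is about −1/37 per unit staked, so ‘do not bet’ is the rational choice; the point of the review is that most real problems are not this tidy.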

What is Uncertainty?

[Those] with an economics and social science background often adopt the distinction (Knight 1921) between risk, in which there is agreed quantification due to extensive data and good understanding of a controlled environment, and uncertainty, for when this is not feasible.

Perhaps the simplest three-level categorizations are Donald Rumsfeld’s (much derided at the time) “known knowns, known unknowns, and unknown unknowns” (Rumsfeld 2002); the known, the unknown, and the unknowable in economics (Diebold et al. 2010); or simply risk, uncertainty, and ignorance.

Sticking to this three-level categorization, but focusing on statistical modeling of risks, it is useful to think in terms of the following:

  • Aleatory uncertainty: inevitable unpredictability of the future due to unforeseeable factors, fully expressed by classical probabilities
  • Epistemic uncertainty: uncertainty about the structure and parameters of statistical models, expressed, for example, through Bayesian probability distributions, default parameter values, safety factors, and sensitivity analyses to assumptions (Morgan et al. 2009). The crucial aspect is that this is still modelled and quantified uncertainty.
  • Ontological uncertainty: uncertainty about the entire modeling process as a description of reality. By definition, this is not part of the modelled uncertainty, and can only be expressed as a qualitative and subjective assessment of the coverage of the model, conveying with humility the limitations of our knowledge.

When it comes to communicating the consequences of aleatory and epistemic uncertainty, the standard tools are to provide point estimates of risks, accompanied by distributions, ranges, or a list of alternatives driven by sensitivity analyses. The final, unmodeled uncertainty is more challenging, and some application areas have adopted summary qualitative scales to communicate the confidence in their modeling.
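
To make these standard tools concrete, here is a minimal sketch (entirely my own; the counts and priors are invented to echo the 1.8% example later on) of a point estimate accompanied by a Bayesian credible interval, repeated under a different prior as a crude sensitivity analysis:

```python
# A hypothetical illustration of "point estimate plus range plus sensitivity
# analysis". The data (18 cases out of 1000 men) and the priors are invented.
import random
import statistics

cases, n = 18, 1000  # hypothetical counts

def posterior_summary(prior_a, prior_b, draws=100_000, seed=1):
    """Beta-Binomial model: point estimate and 95% credible interval for the risk."""
    random.seed(seed)
    a, b = prior_a + cases, prior_b + n - cases
    samples = sorted(random.betavariate(a, b) for _ in range(draws))
    mean = statistics.fmean(samples)
    lo, hi = samples[int(0.025 * draws)], samples[int(0.975 * draws)]
    return mean, lo, hi

# Sensitivity to an assumption (the prior): epistemic uncertainty that is
# still modelled and quantified.
for label, (a0, b0) in {"uniform prior": (1, 1), "sceptical prior": (1, 99)}.items():
    mean, lo, hi = posterior_summary(a0, b0)
    print(f"{label}: risk ~ {mean:.1%} (95% interval {lo:.1%} to {hi:.1%})")
```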

In most decision settings the usual ‘pragmatism’ is to start ‘rationally’ by taking a very simplistic approach to uncertainty, and only to acknowledge more challenging types when that simplistic approach has very clearly failed. In Savage’s terms, we act as if faced with a ‘small world’ and only acknowledge additional uncertainty when forced to. This is where I come in, although occasionally the problem is due to a technical mistake in applying small-world rationality.

Given any attempt by anyone to ‘communicate uncertainty’, I recommend considering not just uncertainty as they conceive it, but how it might impact on your objectives. By default I assume additional uncertainty and am very careful not to seem to endorse any decision that fails to take account of such uncertainty unless and until a ‘best effort’ has been made to identify and address all ‘possible’ sources. Even then, the issue needs to be kept under review.

Often one can commend a decision ‘as long as we have identified and adequately assessed all the relevant factors’. A key distinction to be made is between factors that were ignored, unforeseen or in principle unforeseeable. In this respect I find the above characterisations of uncertainty puzzling, but agree that the real challenge is in the ‘unmodeled uncertainty’.

What is Good Communication?

“A risk communication is successful to the extent that it contributes to the outcomes its sponsor desires”

Three main phases in the history of risk communication were distinguished by Leiss (1996): Phase I was identified with expertise, in that experts produced numerical risk assessments based on the best available knowledge, and felt that simply communicating these to the public in a clear way would lead people to agree with the rational decisions being proposed.

Leiss’s Phase II, termed trust, [is] based on the belief that people will only accept appropriate risks if they can be persuaded to trust the source of information. … This leads us to Leiss’s Phase III, which is based on the idea of authorities becoming trusted by demonstrating trustworthiness.

I have mostly been involved where ‘the sponsor’ wants to be sure that the risks have been adequately characterized in order to support trustable communication, both to specialists and to the public. In my experience people are mostly trustworthy at communicating the risks that they recognize: the problem is often that they may not recognize key additional uncertainty, such as the ‘ontological’. Indeed, ‘experts’ seem motivated to deny such uncertainty. Where communication is to ‘the public’, or to others who are unused to dealing with radical risk, there is also a dilemma:

The audience expect ‘experts’ to have taken account of all the uncertainty, so acknowledging that there is (or may be) residual uncertainty risks the audience losing faith in the experts.

Hence a common paradox:

Anyone who claims to be an expert (in the sense that the public demand) probably isn’t (in the sense that is required to deal with ‘big world’ problems).

It is thus often impossible, in the long run, to ‘demonstrate trustworthiness’. My solution to this dilemma would be to educate the public – or at least professionals engaged in decision-making – on ‘radical uncertainty’. But this may not be practical.

Structure of This Review

I avoid discussing risk communication at times of acute crises, when numerical assessments would generally be inappropriate. I also avoid financial risk communication, which requires its own review.

COMMUNICATING PROBABILITIES OF EVENTS

E.g.:

“1.8 out of 100 men of European ethnicity will develop Psoriasis between the ages of 60 and 69.”

My interpretation of the conclusions (below) is that it might be better to say:

In the last 10 years, in Europe, 1.8 out of 100 men of European ethnicity reaching 69 had developed Psoriasis since they were 60, and there is no reason to think this will change any time soon.

and worse to say:

The probability that men of European ethnicity will develop Psoriasis between the ages of 60 and 69 is 1.8%

Positive and Negative Framing

Some important insights from behavioural psychology on public perception are presented. But my concern is more with communication between more educated people.

Ways of Expressing Chances

More important than the choice of format is being absolutely clear as to what the probability actually means (Morgan et al. 2009), which requires careful specification of the reference class (Gigerenzer & Galesic 2012).
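
As a small illustration of this point (my own wording and numbers, not the review’s), the helper below renders the same chance as a percentage, a frequency, and a ‘1 in X’ statement, keeping the reference class explicit in every sentence:

```python
# Hypothetical formatting helper: the same probability in three formats,
# each tied to an explicit reference class.

def express_chance(p, reference_class, denominator=100):
    expected = p * denominator
    return [
        f"{p:.1%} of {reference_class}",
        f"{expected:g} out of {denominator} {reference_class}",
        f"about 1 in {round(1 / p)} {reference_class}",
    ]

for line in express_chance(0.018, "men of European ethnicity aged 60-69"):
    print(line)
```

Which of these works best depends on the audience; the common failure is omitting the reference class altogether.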

Communicating Chronic Risks

COMMUNICATING UNCERTAINTY ABOUT NUMERICAL RISKS

In the next two sections we consider epistemic uncertainty, also known as second-order uncertainty, about the numerical risks in a modeling framework, and ontological uncertainty about the whole modeling process due to limited scientific understanding.

This seems clearer than the characterisation above.

COMMUNICATING CONFIDENCE IN THE ANALYTIC PROCESS

When it comes to communicating limited scientific understanding, there have been numerous suggestions for expressing overall confidence in an analysis based on the quality and strength of the evidence. Approaches have been reviewed by Spiegelhalter & Riesch (2011), and include the following:

  • Explicit model uncertainty: A fully Bayesian procedure has been promoted as a means of weighting alternative models (Morgan et al. 2009), but this essentially reverts to epistemic uncertainty (a toy sketch of such weighting follows this list).
  • Qualitative scales expressed as strength of evidence: This is generally based on hierarchies-of-evidence scales. For example, the US Preventive Services Task Force places its assessments of the net benefit of health-care interventions on a high/moderate/low certainty scale, whereas the UK Crime Reduction Toolkit and the Teaching/Learning Toolkit both use 5-point scales for strength of evidence. The GRADE (Grading of Recommendations, Assessment, Development and Evaluation) scale (Balshem et al. 2011) is widely used in health applications, placing the certainty of the evidence on a high/moderate/low/very low scale.
  • Acknowledged limitations and their possible impact: For example, the European Food Safety Authority conducted an exhaustive review of alternative quantitative methods for communicating epistemic uncertainty, but also recommends making qualitative assessments of the impact on a final conclusion from different unmodeled sources of uncertainty (EFSA 2016).
  • Acknowledged ignorance: Unknown unknowns by definition cannot even be listed, and so strategies that are resilient to surprises are called for. This requires sufficient humility to admit the possibility of being wrong, sometimes known as Cromwell’s Law after Oliver Cromwell’s celebrated plea to the Church of Scotland: “I beseech you, in the bowels of Christ, think it possible you may be mistaken” (Carlyle 1871, p. 18). So-called “black swans” (Taleb 2007) need not be unthought-of events: If they are simply more extreme than anything that has occurred before, they can be modeled by an appropriate heavy-tailed distribution.
  • Unacknowledged or meta-ignorance: This occurs when we do not even consider the possibility of error (Bammer & Smithson 2008), and is to be avoided.
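
To make the first and fourth bullets above concrete, here is a toy sketch (my own; the threshold, degrees of freedom, and model weights are invented, and it uses the scipy library) that compares the chance of an extreme outcome under a thin-tailed and a heavy-tailed model, and then reports a weighted mixture rather than committing to either model:

```python
# Toy comparison: thin-tailed vs heavy-tailed models for an extreme outcome,
# plus explicit weighting of the alternative models. All numbers are invented.
from scipy import stats

threshold = 10.0          # an outcome far beyond anything seen before
loc, scale = 0.0, 1.0     # both models share the same centre and spread

# Model A: thin-tailed normal; Model B: heavy-tailed Student-t (3 d.f.).
p_normal = stats.norm.sf(threshold, loc=loc, scale=scale)
p_heavy = stats.t.sf(threshold, df=3, loc=loc, scale=scale)

# Explicit model uncertainty: assumed weights (e.g. from expert judgement or
# Bayesian model comparison) combine the models instead of picking one.
weights = {"normal": 0.7, "student_t(3)": 0.3}
p_mixture = weights["normal"] * p_normal + weights["student_t(3)"] * p_heavy

print(f"P(extreme) under normal model    : {p_normal:.2e}")
print(f"P(extreme) under heavy-tailed t  : {p_heavy:.2e}")
print(f"P(extreme) under weighted mixture: {p_mixture:.2e}")
```

The particular numbers do not matter; the point is that the heavy-tailed model assigns the ‘extreme’ outcome a probability roughly twenty orders of magnitude higher, so the choice of model, and the weight given to each, dominates what is communicated.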

The scales are important, but often well dealt with by ‘experts’. The problems I am faced with are more often:

  • How to be resilient in the face of possible but insufficiently understood events that are not just more extreme versions of those explicitly considered.
  • Retaining trust when we admit our inadequate understanding in a ‘big world’ situation.

CONCLUSIONS

1. General issues when communicating risks based on statistical analysis

  • Be clear about objectives.
  • Segment audience into target groups and identify their needs, beliefs, and skills.
  • Develop, test, and evaluate material with target groups.
  • Build trust by being trustworthy.
  • Use plain language and limit information to only what is necessary.
  • Allow for different levels of interest, knowledge, and numeracy, for example, a top gist level, then numerical information, and then evidence and uncertainty.
  • Have the humility to admit uncertainty.

2. Communicating numerical risks

  • Use absolute risks (but also provide relative risks when dealing with potential catastrophic events).
  • For single unique events, use percent chance if possible, or if necessary, “1 in X.”
  • When appropriate, express chance as a proportion, a frequency, or a percentage—it is crucial to be clear about the reference class.
  • To avoid framing bias, provide percentages or frequencies both with and without the outcome.
  • Keep the denominator fixed when making comparisons with frequencies, and use an incremental risk format (see the small sketch after this list).
  • Be explicit about the time interval.
  • Be aware that comparators can create an emotional response.
  • For more knowledgeable audiences, consider providing quantitative epistemic uncertainty about the numbers and qualitative assessment of confidence in the analysis.
  • More sophisticated metrics can be provided for technical audiences, but this only serves to exclude others.
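
As a tiny worked example of the absolute-risk and fixed-denominator advice (my own invented numbers, reusing the 1.8% baseline and a hypothetical 50% relative increase):

```python
# Invented numbers: a baseline risk of 1.8% and a hypothetical exposure that
# raises it by 50% in relative terms.
baseline = 0.018
relative_increase = 0.50                      # the "50% higher risk" headline
exposed = baseline * (1 + relative_increase)

denominator = 1000                            # keep the denominator fixed
print(f"Relative framing : risk is {relative_increase:.0%} higher")
print(f"Absolute framing : {baseline * denominator:g} vs "
      f"{exposed * denominator:g} per {denominator} people")
print(f"Incremental risk : {(exposed - baseline) * denominator:g} extra cases "
      f"per {denominator} people")
```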

This is ‘state of the art’ and useful as far as it goes. It leaves open:

  • How to ensure that the appreciation of risks is adequate and appropriate for the target audience.
  • In particular, how to deal with ‘ontological uncertainty’, other than by acknowledging it.
  • How to cope with the dilemmas and paradoxes above.
  • What to do in an ‘acute crisis’.

My Comments

My approach to uncertainty has usually been to interpret Whitehead et al in a specific context. To be effective, it is usually necessary to establish what understanding my interlocutors already have of uncertainty. Whilst experts act as if their subject is a ‘small world’, many have bigger lives outside which can be exploited to develop a fuller understanding. This is often very individual. It would be helpful to have some more widespread understanding of uncertainty, particularly in an acute crisis. But what? Technically, the work of Whitehead et al, as discussed on my blog, seems as good as it gets and better than nothing. But it is notoriously obscure. I am looking out for something better. This review may be a start, but would need significant supplementation. I shall think about it.

Dave Marsay
