Uncertainty is not just probability

My paper, based on the discussion paper referred to in a previous post, has just been published. On Facebook it is described as:

An understanding of Keynesian uncertainties can be relevant to many contemporary challenges. Keynes was arguably the first person to put probability theory on a sound mathematical footing. …

So it is not just for economists. I could be tempted to discuss the wider implications.

Comments are welcome here, at the publisher’s web site or on Facebook. I’m told that it is also discussed on Google+, Twitter and LinkedIn, but I couldn’t find it – maybe I’ll try again later.

Dave Marsay

Evolution of Pragmatism?

A common ‘pragmatic’ approach is to keep doing what you normally do until you hit a snag, and (only) then to reconsider. Whereas Lamarckian evolution would lead to the ‘survival of the fittest’, with everyone adapting to the current niche and tending to yield a homogeneous population, Darwinian evolution has survival of the maximal variety of all those who can survive, with characteristics only dying out when they are not viable. This evolution of diversity makes for greater resilience, which is maybe why ‘pragmatic’ Darwinian evolution has evolved.

The products of evolution are generally also pragmatic, in that they have virtually pre-programmed behaviours which ‘unfold’ in the environment. Plants grow and procreate, while animals have a richer variety of behaviours, but still tend just to do what they do. But humans can ‘think for themselves’ and be ‘creative’, and so have the possibility of not being just pragmatic.

I was at a (very good) lecture by Alice Roberts last night on the evolution of technology. She noted that many creatures use tools, but humans seem to be unique in that at some critical population mass the manufacture and use of tools becomes sustained through teaching, copying and co-operation. It occurred to me that much of this could be pragmatic. After all, until recently development has been very slow, and so may well have been driven by specific practical problems rather than continual searching for improvements. Also, the more recent upswing of innovation seems to have been associated with an increased mixing of cultures and decreased intolerance for people who think for themselves.

In biological evolution mutations can lead to innovation, so evolution is not entirely pragmatic, but their impact is normally limited by the need to fit the current niche, so evolution typically appears to be pragmatic. The role of mutations is more to increase the diversity of behaviours within the niche, rather than innovation as such.

In social evolution there will probably always have been mavericks and misfits, but the social pressure has been towards conformity. I conjecture that such an environment has favoured a habit of pragmatism. These days, it seems to me, a better approach would be more open-minded, inclusive and exploratory, but possibly we do have a biologically-conditioned tendency to be overly pragmatic: to mistake conventions for facts and heuristics for laws of nature, and not to challenge widely-held beliefs.

The financial crash of 2008 was blamed by some on mathematics. This seems ridiculous. But the post Cold War world was largely one of growth with the threat of nuclear devastation much diminished, so it might be expected that pragmatism would be favoured. Thus powerful tools (mathematical or otherwise) could be taken up and exploited pragmatically, without enough consideration of the potential dangers. It seems to me that this problem is much broader than economics, but I wonder what the cure is, apart from better education and more enlightened public debate.

Dave Marsay


The End of a Physics Worldview (Kauffman)

Thought-provoking, as usual. This video goes beyond his previous work, but in the same direction. His point is that it is a mistake to think of ecologies and economies as if they resembled the typical world of Physics. An earlier written version is at npr, followed by a later development.

He builds on Kant’s notion of wholes, noting (as Kant did before him) that the existence of such wholes is inconsistent with classical notions of causality.  He ties this in to biological examples. This complements Prigogine, who did a similar job for modern Physics.

Kauffman is critical of mathematics and ‘mathematization’, but seems unaware of the mathematics of Keynes and Whitehead. Kauffman’s view seems much the same as that of Bergson and Smuts, which in the late 1920s defined ‘modern science’. To me the problem behind the financial crash lies not in science or mathematics or even in economics, but in the brute fact that politicians and financiers were wedded to a pre-modern (pre-Kantian) view of economics and mathematics. Kauffman’s work may help enlighten them about the need for modern mathematics, but not about its potential role.

Kauffman notes that at any one time there are ‘adjacent possibles’ that may come to pass in the near future, and that – conceptually – one could associate a probability distribution with these possibilities. But as new possibilities come to pass, new adjacent possibilities arise. Kauffman supposes that it is not possible to know what these are; hence one cannot have a probability distribution over them, much of information theory makes no sense, and one cannot reason effectively. The challenge, then, is to discover how we do, in fact, reason.

Kauffman does not distinguish between short and long run. If we do so then we see that if we know the adjacent possible then our conventional reasoning is appropriate in the short-term, and Kauffman’s concerns are really about the long-term: beyond the point at which we can see the potential possibles that may arise. To this extent, at least, Kauffman’s post-modern vision seems little different from the modern vision of the 1920s and 30s, before it was trivialized.

Dave Marsay

From Being to Becoming

I. Prigogine, From Being to Becoming: Time and Complexity in the Physical Sciences, WH Freeman, 1980 

 See new page.

Summary

“This book is about time.” But it has much to say about complexity, uncertainty, probability, dynamics and entropy. It builds on his Nobel lecture, re-using many of the models and arguments, but taking them further.

Being is classically modelled by a state within a landscape, subject to a fixed ‘master equation’ describing changes with time. The state may be an attribute of an object (classical dynamics) or a probability ‘wave’ (quantum mechanics). [This unification seems most fruitful.] Such change is ‘reversible’ in the sense that if one reverses the ‘arrow of time’ one still has a dynamical system.

Becoming refers to more fundamental, irreversible, change, typical of ‘complex systems’ in chemistry, biology and sociology, for example. 

The book reviews the state of the art in theories of Being and Becoming, providing the hooks for its later reconciliation. Both sets of theories are phenomenological – about behaviours. Prigogine shows that not only is there no known link between the two theories, but that they are incompatible.

Prigogine’s approach is to replace the notion of Being as being represented by a state, analogous to a point in a vector space, by that of an ‘operator’ within something like a Hilbert Space. Stable operators can be thought of as conventional states, but operators can become unstable, which leads to non-statelike behaviours. Prigogine shows how in some cases this can give rise to ‘becoming’.

This would, in itself, seem a great and much needed subject for a book, but Prigogine goes on to consider the consequences for time. He shows how time arises from the operators. If everything is simple and stable then one has classical time. But if the operators are complex then one can have a multitude of times at different rates, which may be erratic or unstable. I haven’t got my head around this bit yet.

Some Quotes

Preface

… the main thesis …can be formulated as:

  1. Irreversible processes are as real as reversible ones …
  2. Irreversible processes play a fundamental constructive role in the physical world …
  3. Irreversibility … corresponds … to an embedding of dynamics within a vaster formalism. [Processes instead of points.] (xiii)

The classical, often called “Galilean,” view of science was to regard the world as an “object,” to try to describe the physical world as if it were being seen from the outside as an object of analysis to which we do not belong. (xv)

… in physics, as in sociology, only various possible “scenarios” can be predicted. [One cannot predict actual outcomes, only identify possibilities.] (xvii)

Introduction

… dynamics … seemed to form a closed universal system, capable of yielding the answer to any question asked. (3)

… Newtonian dynamics is replaced by quantum mechanics and by relativistic mechanics. However, these new forms of dynamics … have inherited the idea of Newtonian physics: a static universe, a universe of being without becoming. (4)

The Physics of Becoming

The interplay between function, structure and fluctuations leads to the most unexpected phenomena, including order through fluctuations … . (101)

… chemical instabilities involve long-range order through which the system acts as a whole. (104)

… the system obeys deterministic laws [as in classical dynamics] between two bifurcation points, but in the neighbourhood of the bifurcation points fluctuations play an essential role and determine the “branch” that the system will follow. (106) [This is termed ‘structurally unstable’.]

… a cyclic network of reactions [is] called a hypercycle. When such networks compete with one another, they display the ability to evolve through mutation and replication into greater complexity. …
The concept of structural stability seems to express in the most compact way the idea of innovation, the appearance of a new mechanism and a new species, … . (109)

… the origin of life may be related to successive instabilities somewhat analogous to the successive bifurcations that have led to a state of matter of increasing coherence. (123)

As an example, … consider the problem of urban evolution … (124) … such a model offers a new basis for the understanding of “structure” resulting from the actions (choices) of the many agents in a system, having in part at least mutually dependent criteria of action. (126)

… there are no limits to structural instability. Every system may present instabilities when suitable perturbations are introduced. Therefore, there can be no end to history. [DJM emphasis.] … we have … the constant generation of “new types” and “new ideas” that may be incorporated into the structure of the system, causing its continual evolution. (128)

… near bifurcations the law of large numbers essentially breaks down.
In general, fluctuations play a minor role … . However, near bifurcations they play a critical role because there the fluctuation drives the average. This is the very meaning of the concept of order through fluctuations … . (132)

… near a bifurcation point, nature always finds some clever way to avoid the consequences of the law of large numbers through an appropriate nucleation process. (134)

… For small-scale fluctuations, boundary effects will dominate and fluctuations will regress. … for large-scale fluctuations, boundary effects become negligible. Between these limiting cases lies the actual size of nucleation. (146)

… We may expect that in systems that are very complex, in the sense that there are many interacting species or components, [the degree of coupling between the system and its surroundings] will be very large, as will be the size of the fluctuation which could start the instability. Therefore … a sufficiently complex system is generally in a metastable state. (147) [But see Comments below.]

… Near instabilities, there are large fluctuations that lead to a breakdown of the usual laws of probability theory. (150)

The Bridge from Being to Becoming

[As foreshadowed by Bohr] we have a new form of complementarity – one between the dynamical and thermodynamic descriptions. (174)

… Irreversibility is the manifestation on a macroscopic scale of “randomness” on a microscopic scale. (178)

Contrary to what Boltzmann attempted to show there is no “deduction” of irreversibility from randomness – they are only cousins! (177)

The Microscopic Theory of Irreversible Processes

The step made … is quite crucial. We go from the dynamical system in terms of trajectories or wave packets to a description in terms of processes. (186)

… Various mechanisms may be involved, the important element being that they lead to a complexity on the microscopic level such that the basic concepts involved in the trajectory or wave function must be superseded by a statistical ensemble. (194)

The classical order was: particles first, the second law later – being before becoming! It is possible that this is no longer so when we come to the level of elementary particles and that here we must first introduce the second law before being able to define the entities. (199)

The Laws of Change

… Of special interest is the close relation between fluctuations and bifurcations which leads to deep alterations in the classical results of probability theory. The law of large numbers is no longer valid near bifurcations and the unicity of the solution of … equations for the probability distribution is lost. (204)

This mathematization leads us to a new concept of time and irreversibility … . (206)

… the classical description in terms of trajectories has to be given up either because of instability and randomness on the microscopic level or because of quantum “correlations”. (207)

… the new concept implies that age depends on the distribution itself and is therefore no longer an external parameter, a simple label as in the conventional formula.
We see how deeply the new approach modifies our traditional view of time, which now emerges as a kind of average over “individual times” of the ensemble. (210)

For a long time, the absolute predictability of classical mechanics, or the physics of being, was considered to be an essential element of the scientific picture of the physical world. … the scientific picture has shifted toward a new, more subtle conception in which both deterministic features and stochastic features play an essential role. (210)

The basis of classical physics was the conviction that the future is determined by the present, and therefore a careful study of the present permits the unveiling of the future. At no time, however, was this more than a theoretical possibility. Yet in some sense this unlimited predictability was an essential element of the scientific picture of the physical world. We may perhaps even call this the founding myth of classical science.
The situation is greatly changed today. … The incorporation of the limitation of our ways of acting on nature has been an essential element of progress. (214)

Have we lost essential elements of classical science in this recent evolution [of thought]? The increased limitation of deterministic laws means that we go from a universe that is closed to one that is open to fluctuations, to innovations.

… perhaps there is a more subtle form of reality that involves both laws and games, time and eternity. (215) 

Comments

Relationship to previous work

This book can be seen as a development of the work of Kant, Whitehead and Smuts on emergence, although – curiously – it makes little reference to them [pg xvii]. In their terms, reality cannot logically be described in terms of point-like states within spaces with fixed ‘master equations’ that govern their dynamics. Instead, it needs to be described in terms of ‘processes’. Prigogine goes beyond this by developing explicit mathematical models as examples of emergence (from being to becoming) within physics and chemistry.

Metastability

According to the quote above, sufficiently complex systems are inherently metastable. Some have supposed that globalisation inevitably leads to an inter-connected, and hence complex, and hence stable world. But globalisation could instead lead to homogenization or fungibility, a reduction in complexity, and hence to an increased vulnerability to fluctuations. As ever, details matter.

See Also

I. Prigogine and I. Stengers, Order out of Chaos, Heinemann, 1984.
This is an update of a popular work on Prigogine’s theory of dissipative systems. He provides an unsympathetic account of Kant’s Critique of Pure Reason, supposing Kant to hold that there are “a unique set of principles on which science is based”, without making reference to Kant’s concept of emergence or to the role of communities. But he does set his work within the framework of Whitehead’s Process and Reality. Smuts’ Holism and Evolution, which draws on Kant and mirrors Whitehead, is also relevant, as a popular and influential account of the 1920s, helping to define the then ‘modern science’.

Dave Marsay

Reasoning and natural selection

Cosmides, L. & Tooby, J. (1991). Reasoning and natural selection. Encyclopedia of Human Biology, vol. 6. San Diego: Academic Press

Summary

Argues that logical reasoning, by which it seems to mean classical induction and symbolic reasoning, is not favoured by evolution. Instead one has reasoning particular to the social context. It argues that in typical situations it is either not possible or not practical to consider ‘all hypotheses’, and that the generation of hypotheses to consider is problematic. It argues that this is typically done using implicit specific theories. Has a discussion of the ‘green and blue cabs’ example.

Comment

In real situations one can, at best, assume induction, and one lacks the ‘facts’ needed to perform symbolic reasoning. Logically, then, empirical reasoning would seem more suitable. Keynes, for example, considers the impact of not being able to consider ‘all hypotheses’.

While the case against classical rationality seems sound, the argument leaves the way open for an alternative rationality, e.g. one based on Whitehead and Keynes.

See Also

Later work

Better than rational, uncertainty aversion.

Other

Reasoning, mathematics.

Dave Marsay

Better than Rational

Cosmides, L. & Tooby, J. (1994). Better than rational: Evolutionary psychology and the invisible hand. American Economic Review, 84 (2), 327-332.

Summary

[Mainstream psychologists and behaviourists have studied] “biases” and “fallacies” – many of which are turning out to be experimental artifacts or misinterpretations (see G. Gigerenzer, 1991). [Gigerenzer, G. “How to Make Cognitive Illusions Disappear: Beyond Heuristics and Biases,” in W. Stroebe and M. Hewstone, eds., European Review of Social Psychology, Vol. 2. Chichester, U.K.: Wiley, 1991, pp. 83-115.]

… 

One point is particularly important for economists to appreciate: it can be demonstrated that “rational” decision-making methods (i.e., the usual methods drawn from logic, mathematics, and probability theory) are computationally very weak: incapable of solving the natural adaptive problems our ancestors had to solve reliably in order to reproduce (e.g., Cosmides and Tooby, 1987; Tooby and Cosmides, 1992a; Steven Pinker, 1994).

…  sharing rules [should be] appealing in conditions of high variance, and unappealing when resource accrual is a matter of effort rather than of luck (Cosmides and Tooby, 1992).

Comment

They rightly criticise ‘some methods’ drawn from mathematics etc., but some have interpreted this as meaning that “logic, mathematics, and probability theory are … incapable of solving the natural adaptive problems our ancestors had to solve reliably in order to reproduce”. But this leads them to overlook relevant theories, such as Whitehead’s and Keynes’.

See Also

Relevant mathematics, Avoiding unknown probabilities, Kahneman on biases

NOTE

This has been copied to my bibliography section under ‘rationality and uncertainty’, ‘more …’, where it has more links. Please comment there.

Dave Marsay

Which Mathematics of Uncertainty for Today’s Challenges?

This is a slight adaptation of a technical paper presented to an IMA conference 16 Nov. 2009, in the hope that it may be of broader interest. It argues that ‘Knightian uncertainty’, in Keynes’ mathematical form, provides a much more powerful, appropriate and safer approach to uncertainty than the more familiar ‘Bayesian (numeric) probability’.

Issues

Conventional Probability

The combination of inherent uncertainty and the rate of change challenges our capabilities.

There are gaps in the capability to handle both inherent uncertainty and rapid change.

Keynes et al. suggest that there is more to uncertainty than random probability. We seem to be able to cope with high volumes of deterministic or probabilistic data, or with low volumes of less certain data, but to have problems at the margins. This leads to the questions:

  • How complex is the contemporary world?
  • What is the perceptual problem?
  • What is contemporary uncertainty like?
  • How is uncertainty engaged with?

Probability arises from a definite context

Objective numeric probabilities can arise through random mechanisms, as in gambling. Subjective probabilities are often adequate for familiar situations where decisions are short-term, with at worst only a cumulative long-term impact. This is typical of the application of established science and engineering, where one has a kind of ‘information dominance’ and there are only variations within an established frame / context.

Contexts

Thus (numeric) probability is appropriate where:

  • Competition is coherent and takes place within a stable, utilitarian, framework.
  • Innovation does not challenge the over-arching status quo or ‘world view’.
  • We only ever need to estimate the current parameters within a given model.
  • Uncertainty can be managed. Uncertainty about estimates can be represented by numbers (probability distributions), as if they were principally due to noise or other causes of variation.
  • Numeric probability is multiplied by value to give a utility, which is optimised.
  • Risk is only a number, negative utility.

Uncertainty is measurable (in one dimension) where one has so much stability that almost everything is measurable.

Probability Theory

Probability theories typically build on Bayes’ rule [Cox]:

P(H|E) = P(H).(P(E|H)/P(E)),

where P(E|H) denotes the ‘likelihood’, the probability of evidence, E, given a hypothesis, H. Thus the final probability is the prior probability times the ‘likelihood ratio’.

The key assumptions are that:

  • The selection of evidence for a given hypothesis, H, is indistinguishable from a random process with a proper numeric likelihood function, P( · |H).
  • The selection of the hypothesis that actually holds is indistinguishable from random selection from a set {Hi} with ‘priors’ P(Hi) – that can reasonably be estimated – such that
    • P(Hi ∩ Hj) = 0 for i ≠ j (non-intersection)
    • P(∪i Hi) = 1 (completeness).

It follows that P(E) = Σi P(E|Hi).P(Hi) is well-defined.
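
For illustration only (not part of the original paper), here is a minimal Python sketch of this update for a finite, exhaustive set of hypotheses; the priors and likelihoods are made-up numbers:

    # Minimal sketch: Bayes' rule over an exhaustive, mutually exclusive
    # set of hypotheses {Hi}. Priors and likelihoods are assumed values.
    def bayes_update(priors, likelihoods):
        """priors: {H: P(H)}, likelihoods: {H: P(E|H)} -> {H: P(H|E)}."""
        p_e = sum(likelihoods[h] * priors[h] for h in priors)   # P(E)
        return {h: likelihoods[h] * priors[h] / p_e for h in priors}

    priors = {"H1": 0.7, "H2": 0.3}        # must sum to 1 (completeness)
    likelihoods = {"H1": 0.2, "H2": 0.6}   # P(E|Hi) for the observed E
    print(bayes_update(priors, likelihoods))   # approx {'H1': 0.44, 'H2': 0.56}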

H may be composite, so that there are many proper sub-hypotheses, h ⇒ H, with different likelihoods, P(E|h). It is then common to use the Bayesian likelihood,

P(E|H) = ∫h⇒H P(E|h).dP(h|H),

or

P(E|H) = P(E|h), for some representative hypothesis h.

In either case, hypotheses should be chosen to ensure that the expected likelihood is maximal for the true hypothesis.
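
Again purely for illustration, with made-up sub-hypothesis likelihoods and weights, the two choices can be contrasted as follows:

    # Sketch: two ways of assigning a likelihood to a composite hypothesis H.
    sub_likelihoods = {"h1": 0.1, "h2": 0.5, "h3": 0.9}   # P(E|h), assumed
    weights = {"h1": 0.5, "h2": 0.3, "h3": 0.2}           # P(h|H), assumed

    # Bayesian (mixture) likelihood: P(E|H) = sum over h of P(E|h).P(h|H)
    mixture = sum(sub_likelihoods[h] * weights[h] for h in weights)

    # Representative-hypothesis likelihood: let one h stand for H
    representative = sub_likelihoods["h2"]

    print(mixture, representative)   # about 0.38 and 0.5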

Bayes noted a fundamental problem with such conventional probability: “[Even] where the course of nature has been the most constant … we can have no reason for thinking that there are no causes in nature which will ever interfere with the operations of the causes from which this constancy is derived.”

Uncertainty in Contemporary Life

Uncertainty arises from an indefinite context

Uncertainty may arise through human decision-making, adaptation or evolution, and may be significant for situations that are unfamiliar or for decisions that may have long-term impact. This is typical of the development of science in new areas, and of competitions where unexpected innovation can transform aspects of contemporary life. More broadly still, it is typical of situations where we have a poor information position or which challenge our sense-making, and where we could be surprised, and so need to alter our framing of the situation. For example, where others can be adaptive or innovative and hence surprising.

Contexts

  • Competitions, cooperations, collaborations, confrontations and conflicts all nest and overlap messily, each with their own nature.
  • Perception is part of multiple co-adaptations.
  • Uncertainty can be shaped but not fully tamed. Only the most careful reasoning will do.
  • Uncertainty and utility are imprecise and conditional. One can only satisfice, not optimise.
  • Critical risks arise from the unanticipated.

Likelihoods, Evidence

In Plato’s Republic the elite make the rules, which form a fixed context for the plebs. But in contemporary life the rulers only rule with the consent of the ruled, and in so far as the rules of the game ’cause’ (or at least influence) the behaviour of the players, the participants have reason to interfere with those causes; in many cases we expect it: it is how things get done. J.M. Keynes and I.J. Good (under A.M. Turing) developed techniques that may be used for such ‘haphazard’ situations, as well as random ones.

The distinguishing concepts are: the law of evidence, generalized weight of evidence (woe), and iterative fusion.

If a datum, E, has a distribution f(·) over a possibility space then, for any distribution g(·) over that space,

∫ log(f(E)).f(E) ≥ ∫ log(g(E)).f(E).

I.e. the cross-entropy is no more than the entropy. For a hypothesis H in a context, C, such that the likelihood function g = PH:C is well-defined, the weight of evidence (woe) due to E for H is defined to be:

W(E|H:C) ≡ log(PH:C(E)).

Thus the ‘law of evidence’: the expected woe for the truth is never exceeded by that for any other hypothesis. (But the evidence may indicate that many or none of the hypotheses fit.) For composite hypotheses, the generalized woe is:

W(E|H:C) ≡ suph⇒H {W(E|h:C)}.

This is defined even for a haphazard selection of h.
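
A minimal Python sketch of these two definitions, with assumed likelihood values, may help fix ideas:

    import math

    # Sketch: woe for a precise hypothesis, and generalized woe for a
    # composite hypothesis (supremum over its sub-hypotheses).
    def woe(likelihood):
        """W(E|H:C) = log P(E|H:C) for a precise hypothesis."""
        return math.log(likelihood)

    def generalized_woe(sub_likelihoods):
        """W(E|H:C) = sup over h => H of W(E|h:C)."""
        return max(woe(p) for p in sub_likelihoods.values())

    print(woe(0.3))                                 # about -1.20
    print(generalized_woe({"h1": 0.1, "h2": 0.5}))  # about -0.69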

Let ds(·) be a discounting factor for the source, s [Good]. If one has independent evidence, Es, from different sources, s, then typically the fusion equation is:

W(E|H:C, ds) ≤ Σs{ds(W(Es|H:C))},

with equality for precise hypotheses. Together, generalized woe and fusion determine how woe is propagated through a network, where the woe for a hypothesis is dependent on an assumption which itself has evidence. The inequality forces iterative fusion, whereby one refines candidate hypotheses until one has adequate precision. If circumstantial evidence indicates that the particular situation is random, one could take full account of it, to obtain the same result as Bayes, or discount [Good].
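
The following sketch illustrates the fusion step under the simplifying assumptions that each discount ds acts as a multiplicative factor on the woe and that the hypotheses are precise, so the inequality becomes an equality; the numbers are again made up:

    import math

    # Sketch: fusing independent sources s with discounts d_s, assuming
    # precise hypotheses so that the fused woe is the discounted sum.
    def fused_woe(likelihoods_by_source, discounts):
        """likelihoods_by_source: {s: P(Es|H)}; discounts: {s: d_s in [0, 1]}."""
        return sum(discounts[s] * math.log(likelihoods_by_source[s])
                   for s in likelihoods_by_source)

    sources = {"s1": 0.4, "s2": 0.7}     # P(Es|H), assumed
    discounts = {"s1": 1.0, "s2": 0.5}   # second source partially discounted
    print(fused_woe(sources, discounts))   # about -1.09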

In some cases it is convenient, as Keynes does, to use an interval likelihood or woe, taking the infimum and supremum of possible values. The only assumption is that the evidence can be described as a probabilistic outcome of a definite hypothesis, even if the overall situation is haphazard. In practice, the use of likelihoods is often combined with conjectural causal modelling, to try to get at a deep understanding of situations.

Examples

Crises

[Figure omitted: typical crisis dynamics]

Above is an informal attempt to illustrate typical crisis kinematics, such as those of the financial crisis of 2007/8. It is intended to capture the notion that conventional probability calculations may suffice for long periods, but that over-dependence on such classical constructs can lead to shocks or crises. To avoid or mitigate these, more attention should be given to uncertainty [Turner].

An ambush

Uncertainty is not necessarily esoteric or long-term. It can be found wherever the assumptions of conventional probability theory do not hold, in particular in multilevel games. I would welcome more examples that are simple to describe, relatively common and where the significance of uncertainty is easy to show.

Deer need to make a morning run from A to B. Routes r, s and t are possible. A lion may seek to ambush them. Suppose that the indicators of potential ambushes are equal for the three routes. Now, in the last month route r has been used 25 times, s 5 times and t never, all without incident. What is the ‘probability’ of an ambush on each of the three routes?

Let A=“The Lion deploys randomly each day with a fixed probability distribution, p”. Here we could use a Bayesian probability distribution over p, with some sensitivity analysis.
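
For example (a sketch only: the Beta(1,1) prior and the treatment of each run as an independent trial are my assumptions, not part of the argument), hypothesis A admits a standard conjugate update:

    # Sketch: under hypothesis A, treat the chance of an ambush on a route
    # as a fixed unknown p with a Beta(1,1) prior, updated on the observed
    # ambush-free runs (25 on r, 5 on s, 0 on t).
    def beta_posterior_mean(ambushes, clear_runs, a=1.0, b=1.0):
        """Posterior mean of p under a Beta(a, b) prior."""
        return (a + ambushes) / (a + b + ambushes + clear_runs)

    for route, runs in {"r": 25, "s": 5, "t": 0}.items():
        print(route, round(beta_posterior_mean(0, runs), 3))
    # r 0.037, s 0.143, t 0.5

The answer for t is dominated by the prior, which is one reason a sensitivity analysis matters here.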

But this is not the only possibility. Alternatively, let B =“The Lion has reports about some of our runs, and will adapt his deployments.” We could use a Bayesian model for the Lion, but with less confidence. Alternatively, we could use likelihoods.

Route s is intermediate in characteristics between the other two. There is no reason to expect an ambush at s that doesn’t apply to one of the other two. On the other hand, if the ambush is responsive to the number of times a route is used then r is more likely than s or t, and if the ambush is on a fixed route, it is only likely to be on t. Hence s is the least likely to have an ambush.

Consistently selecting routes using a fixed probability distribution is not as effective as a muddling strategy [Binmore], which varies the distribution, supporting learning and avoiding an exploitable equilibrium.
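
As a rough sketch of what a muddling strategy might look like (the jitter mechanism and the starting weights are illustrative assumptions, not Binmore’s construction):

    import random

    # Sketch: perturb the route weights each day rather than sampling from a
    # single fixed distribution that an adaptive lion could learn and exploit.
    def muddled_choice(base_weights, jitter=0.5):
        """Randomly perturb the weights, then sample a route."""
        routes = list(base_weights)
        perturbed = [base_weights[r] * random.uniform(1 - jitter, 1 + jitter)
                     for r in routes]
        return random.choices(routes, weights=perturbed)[0]

    base = {"r": 0.4, "s": 0.4, "t": 0.2}   # assumed starting weights
    print([muddled_choice(base) for _ in range(7)])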

Concluding Remarks

Conventional (numeric) probability, utility and rationality all extrapolate based on a presumption of stability. If two or more parties are co-adapting or co-evolving, any equilibria tend to be punctuated, and so a more general approach to uncertainty, information, communication, value and rationality is indicated, as identified by Keynes, with implications for ‘risk’.

Dave Marsay, Ph.D., C.Math FIMA, Fellow ISRS

References:

Bayes, T. An Essay towards solving a Problem in the Doctrine of Chances (1763), Philosophical Transactions of the Royal Society of London 53, 370–418. Regarded by most English-speakers as ‘the source’.

Binmore, K. Rational Decisions (2009), Princeton University Press. Rationality for ‘muddles’, citing Keynes and Turing. Also http://else.econ.ucl.ac.uk/papers/uploaded/266.pdf .

Cox, R.T. The Algebra of Probable Inference (1961) Johns Hopkins University Press, Baltimore, MD. The main justification for the ‘Bayesian’ approach, based on a belief function for sets whose results are comparable. Keynes et al deny these assumptions. Also Jaynes, E.T. Probability Theory: The Logic of Science (1995) http://bayes.wustl.edu/etj/prob/book.pdf .

Good, I.J. Probability and Weighting of Evidence (1950), Griffin, London. Describes the basic techniques developed and used at Bletchley Park. Also Explicativity: A Mathematical Theory of Explanation with Statistical Applications (1977) Proc. R. Soc. Lond. A 354, 303-330, etc. Covers discounting, particularly of priors. More details have continued to be released up until 2006.

Hodges, A. Alan Turing (1983) Hutchinson, London. Describes the development and use of ‘weights of evidence’, “which constituted his major conceptual advance at Bletchley”.

Keynes, J.M. Treatise on Probability (1920), MacMillan, London. Fellowship essay, under Whitehead. Seminal work, outlines the pros and cons of the numeric approach to uncertainty, and develops alternatives, including interval probabilities and the notions of likelihood and weights of evidence, but not a ‘definite method’ for coping with uncertainty.

Smuts, J.C. The Scientific World-Picture of Today, British Assoc. for the Advancement of Science, Report of the Centenary Meeting. London: Office of the BAAS. 1931. (The Presidential Address.) A view from an influential guerrilla leader, General, War Cabinet Minister and supporter of ‘modern’ science, who supported Keynes and applied his ideas widely.

Turner, The Turner Review: A regulatory response to the global banking crisis (2009). Notes the consequences of simply extrapolating, ignoring non-probabilistic (‘Knightian’) uncertainty.

Whitehead, A.N. Process and Reality (1929: 1979 corrected edition) Eds. D.R. Griffin and D.W. Sherburne, Free Press. Whitehead developed the logical alternative to the classical view of uniform unconditional causality.