What logical term or concept ought to be more widely known?

Various, “What scientific term or concept ought to be more widely known?”, Edge, 2017.

INTRODUCTION: SCIENTIA

Science—that is, reliable methods for obtaining knowledge—is an essential part of psychology and the social sciences, especially economics, geography, history, and political science. …

Science is nothing more nor less than the most reliable way of gaining knowledge about anything, whether it be the human spirit, the role of great figures in history, or the structure of DNA.

Contributions

As against others on:

(This is as far as I’ve got.)

Comment

I’ve grouped the contributions according to whether or not I think they give due weight to the notion of uncertainty as expressed in my blog. Interestingly, Steven Pinker seems not to give it due weight in his article, whereas Nicholas G. Carr credits him with some profound insights (in the first of the second batch). So maybe I am not reading them right.

My own suggestion would be Turing’s theory of ‘Morphogenesis’. Its particular predictions seem to have been confirmed ‘scientifically’, but it is essentially a logical/mathematical theory. If, as the introduction suggests, science is “reliable methods for obtaining knowledge”, then it seems to me that logic and mathematics are more reliable than empirical methods, and deserve some special recognition. I must concede, though, that it may be hard to tell logic from pseudo-logic, and that unless one can do so my distinction is potentially dangerous.

Morphogenesis

The second law of thermodynamics, and much common-sense rationality, assume a situation in which the law of large numbers applies. But Turing adds to the second law’s notion of random dissipation a notion of relative structuring (as in gravity), to show that ‘critical instabilities’ are inevitable. These are inconsistent with the law of large numbers, so the assumptions of the second law of thermodynamics (and much else) cannot always hold. The universe cannot be ‘closed’ in the sense the second law requires.
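
Turing’s mechanism can be made concrete in a few lines of code. Below is a minimal sketch of a one-dimensional reaction-diffusion system (the Gray-Scott model with commonly used parameter values, standing in for Turing’s original equations, so purely illustrative): diffusion alone would smooth everything out, but coupling dissipation with a structuring reaction amplifies tiny fluctuations into persistent pattern.

```python
import numpy as np

# Minimal 1-D Gray-Scott reaction-diffusion sketch (illustrative only;
# the parameter values are commonly used ones, not Turing's).
n, steps, dt = 256, 10000, 1.0
Du, Dv, F, k = 0.16, 0.08, 0.035, 0.060

u = np.ones(n)                        # near-uniform field
v = np.zeros(n)
v[n // 2 - 10 : n // 2 + 10] = 0.25   # a small local perturbation
u += 0.01 * np.random.rand(n)         # tiny fluctuations

def laplacian(x):
    # periodic diffusion operator on a unit grid
    return np.roll(x, 1) + np.roll(x, -1) - 2 * x

for _ in range(steps):
    uvv = u * v * v
    u += dt * (Du * laplacian(u) - uvv + F * (1 - u))
    v += dt * (Dv * laplacian(v) + uvv - (F + k) * v)

# A clearly positive spread means structure has formed rather than dissipated.
print("spatial std of u:", round(u.std(), 3))
```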

Implications

The assumptions of the second law seem to leave no room for free will, hence no reason to believe in our agency, and hence no point in any of the contributions to Edge: things are what they are and we do what we do. But Pinker does not go so far: he simply notes that if things inevitably degrade we do not need to beat ourselves up, or look for scapegoats, when things go wrong. But this can be true even if the second law does not apply. If we take Turing seriously then a seemingly permanent status quo can contain the reasons for its own destruction, so that turning a blind eye and doing nothing can mean sleepwalking to disaster. Pinker concludes:

[An] underappreciation of the Second Law lures people into seeing every unsolved social problem as a sign that their country is being driven off a cliff. It’s in the very nature of the universe that life has problems. But it’s better to figure out how to solve them—to apply information and energy to expand our refuge of beneficial order—than to start a conflagration and hope for the best.

This would seem to follow more clearly from the theory of morphogenesis than from the second law. Turing’s theory also goes some way towards suggesting, or even explaining, the items in the second batch. So I commend it.

Dave Marsay

Uncertainty is not just probability

I have just had my paper published, based on the discussion paper referred to in a previous post. On Facebook it is described as:

An understanding of Keynesian uncertainties can be relevant to many contemporary challenges. Keynes was arguably the first person to put probability theory on a sound mathematical footing. …

So it is not just for economists. I could be tempted to discuss the wider implications.

Comments are welcome here, at the publisher’s web site, or on Facebook. I’m told that it is also discussed on Google+, Twitter and LinkedIn, but I couldn’t find it – maybe I’ll try again later.

Dave Marsay

Evolution of Pragmatism?

A common ‘pragmatic’ approach is to keep doing what you normally do until you hit a snag, and (only) then to reconsider. Whereas Lamarckian evolution would lead to the ‘survival of the fittest’, with everyone adapting to the current niche and so tending to yield a homogeneous population, Darwinian evolution has survival of the maximal variety of all those who can survive, with characteristics dying out only when they are not viable. This evolution of diversity makes for greater resilience, which is maybe why ‘pragmatic’ Darwinian evolution has evolved.
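
A toy simulation may help, under loudly-flagged assumptions of my own: traits are single numbers, the ‘niche’ is an interval of viable traits, and the ‘Lamarckian’ strategy is caricatured as everyone adapting to the niche centre. When the niche shifts, the diverse (Darwinian) population has far more survivors.

```python
import numpy as np

rng = np.random.default_rng(0)

def generation(pop, lo, hi, homogenize, size=200, mut=0.05):
    pop = pop[(pop >= lo) & (pop <= hi)]   # only the viable survive
    if homogenize:                         # caricature: all adapt to the niche centre
        pop = np.full(len(pop), (lo + hi) / 2)
    # survivors reproduce with small mutations
    return rng.choice(pop, size=size) + rng.normal(0, mut, size)

for homogenize in (True, False):
    pop = rng.uniform(0, 1, 200)
    for _ in range(50):
        pop = generation(pop, 0.3, 0.7, homogenize)
    # the niche now shifts abruptly to [0.6, 1.0]
    survivors = np.sum((pop >= 0.6) & (pop <= 1.0))
    print("homogeneous:" if homogenize else "diverse:   ", survivors, "survive the shift")
```

The details are arbitrary, but the mechanism is the point: retained variety is wasted effort in a fixed niche and a lifesaver in a changed one.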

The products of evolution are generally also pragmatic, in that they have virtually pre-programmed behaviours which ‘unfold’ in the environment. Plants grow and procreate, while animals have a richer variety of behaviours, but still tend just to do what they do. But humans can ‘think for themselves’ and be ‘creative’, and so have the possibility of not being just pragmatic.

I was at a (very good) lecture by Alice Roberts last night on the evolution of technology. She noted that many creatures use tools, but humans seem to be unique in that at some critical population mass the manufacture and use of tools becomes sustained through teaching, copying and co-operation. It occurred to me that much of this could be pragmatic. After all, until recently development has been very slow, and so may well have been driven by specific practical problems rather than continual searching for improvements. Also, the more recent upswing of innovation seems to have been associated with an increased mixing of cultures and decreased intolerance for people who think for themselves.

In biological evolution mutations can lead to innovation, so evolution is not entirely pragmatic, but their impact is normally limited by the need to fit the current niche, so evolution typically appears to be pragmatic. The role of mutations is more to increase the diversity of behaviours within the niche than to innovate as such.

In social evolution there will probably always have been mavericks and misfits, but the social pressure has been towards conformity. I conjecture that such an environment has favoured a habit of pragmatism. These days, it seems to me, a better approach would be more open-minded, inclusive and exploratory, but possibly we do have a biologically-conditioned tendency to be overly pragmatic: to mistake conventions for facts and heuristics for laws of nature, and not to challenge widely-held beliefs.

The financial crash of 2008 was blamed by some on mathematics. This seems ridiculous. But the post-Cold War world was largely one of growth, with the threat of nuclear devastation much diminished, so it might be expected that pragmatism would be favoured. Thus powerful tools (mathematical or otherwise) could be taken up and exploited pragmatically, without enough consideration of the potential dangers. It seems to me that this problem is much broader than economics, but I wonder what the cure is, apart from better education and more enlightened public debate.

Dave Marsay

The End of a Physics Worldview (Kauffman)

Thought-provoking, as usual. This video goes beyond his previous work, but in the same direction. His point is that it is a mistake to think of ecologies and economies as if they resembled the typical world of Physics. A previous written version is at npr, followed by a later development.

He builds on Kant’s notion of wholes, noting (as Kant did before him) that the existence of such wholes is inconsistent with classical notions of causality. He ties this in with biological examples. This complements Prigogine, who did a similar job for modern Physics.

Kauffman is critical of mathematics and ‘mathematization’, but seems unaware of the mathematics of Keynes and Whitehead. Kauffman’s view seems the same as that due to Bergson and Smuts, which in the late 1920s defined ‘modern science’. To me the problem behind the financial crash lies not in science or mathematics or even in economics, but in the brute fact that politicians and financiers were wedded to a pre-modern (pre-Kantian) view of economics and mathematics. Kauffman’s work may help enlighten them on the need, but not on the potential role for modern mathematics.

Kauffman notes that at any one time there are ‘adjacent possibles’ that may come to pass in the near future, and that – conceptually – one could associate a probability distribution with these possibilities. But as new possibilities come to pass, new adjacent possibilities arise. Kauffman supposes that it is not possible to know what these will be, and hence that one cannot have a probability distribution, that much of information theory makes no sense, and that one cannot reason effectively. The challenge, then, is to discover how we do, in fact, reason.
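
To see why a fixed probability distribution struggles here, consider a toy model (my own illustration, not Kauffman’s): each realized outcome spawns new ‘adjacent’ outcomes that no prior could have named, so novel outcomes always carry prior probability zero.

```python
import random

random.seed(1)

possible = {"A", "B"}              # today's adjacent possibles
prior = {"A": 0.5, "B": 0.5}       # a distribution over them

for step in range(6):
    outcome = random.choice(sorted(possible))
    p = prior.get(outcome, 0.0)    # novel outcomes were not in the prior's support
    print(f"step {step}: {outcome!r} occurred; prior assigned it {p}")
    # each realized outcome opens two new adjacent possibles
    possible |= {outcome + "x", outcome + "y"}

# After a few steps most outcomes ('Ax', 'Bxy', ...) had prior probability 0:
# the support keeps changing, so no fixed distribution can be maintained.
```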

Kauffman does not distinguish between the short and the long run. If we do so then we see that, so long as we know the adjacent possibles, our conventional reasoning is appropriate in the short term, and that Kauffman’s concerns are really about the long term: beyond the point at which we can see the potential possibles that may arise. To this extent, at least, Kauffman’s post-modern vision seems little different from the modern vision of the 1920s and 30s, before it was trivialized.

Dave Marsay

From Being to Becoming

I. Prigogine, From Being to Becoming: Time and Complexity in the Physical Sciences, W. H. Freeman, 1980.

 See new page.

Summary

“This book is about time.” But it has much to say about complexity, uncertainty, probability, dynamics and entropy. It builds on his Nobel lecture, re-using many of the models and arguments, but taking them further.

Being is classically modelled by a state within a landscape, subject to a fixed ‘master equation’ describing changes with time. The state may be an attribute of an object (classical dynamics) or a probability ‘wave’ (quantum mechanics). [This unification seems most fruitful.] Such change is ‘reversible’ in the sense that if one reverses the ‘arrow of time’ one still has a dynamical system.

Becoming refers to more fundamental, irreversible change, typical of ‘complex systems’ in chemistry, biology and sociology, for example.

The book reviews the state of the art in theories of Being and of Becoming, providing the hooks for their later reconciliation. Both sets of theories are phenomenological – about behaviours. Prigogine shows not only that there is no known link between the two theories, but that they are incompatible.

Prigogine’s approach is to replace the notion of Being as represented by a state, analogous to a point in a vector space, with that of an ‘operator’ within something like a Hilbert space. Stable operators can be thought of as conventional states, but operators can become unstable, which leads to non-statelike behaviours. Prigogine shows how in some cases this can give rise to ‘becoming’.

This would, in itself, seem a great and much-needed subject for a book, but Prigogine goes on to consider the consequences for time. He shows how time arises from the operators. If everything is simple and stable then one has classical time. But if the operators are complex then one can have a multitude of times at different rates, which may be erratic or unstable. I haven’t got my head around this bit yet.
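
For readers who, like me, find the state-versus-process distinction slippery, here is a crude numerical contrast (my sketch, not Prigogine’s formalism, with assumed rate values): a reversible dynamical state can be run backwards to its start, while a master-equation ‘process’ forgets its initial condition.

```python
import numpy as np

dt, steps = 0.001, 5000

# 'Being': a harmonic oscillator. Reversing the time step retraces the path.
x, v = 1.0, 0.0
for _ in range(steps):
    x, v = x + dt * v, v - dt * x
for _ in range(steps):
    x, v = x - dt * v, v + dt * x
print("back near the start:", abs(x - 1.0) < 0.01 and abs(v) < 0.01)

# 'Becoming': a two-state master equation dp/dt = Q p (assumed rates).
# Every initial distribution relaxes to the same equilibrium; the
# history is lost, and the evolution cannot sensibly be reversed.
Q = np.array([[-1.0, 2.0],
              [ 1.0, -2.0]])
for p0 in ([1.0, 0.0], [0.0, 1.0]):
    p = np.array(p0)
    for _ in range(steps):
        p = p + dt * (Q @ p)
    print("from", p0, "->", np.round(p, 3))   # both end near [0.667, 0.333]
```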

Some Quotes

Preface

… the main thesis …can be formulated as:

  1. Irreversible processes are as real as reversible ones …
  2. Irreversible processes play a fundamental constructive role in the physical world …
  3. Irreversibility … corresponds … to an embedding of dynamics within a vaster formalism. [Processes instead of points.] (xiii)

The classical, often called “Galilean,” view of science was to regard the world as an “object,” to try to describe the physical world as if it were being seen from the outside as an object of analysis to which we do not belong. (xv)

… in physics, as in sociology, only various possible “scenarios” can be predicted. [One cannot predict actual outcomes, only identify possibilities.] (xvii)

Introduction

… dynamics … seemed to form a closed universal system, capable of yielding the answer to any question asked. (3)

… Newtonian dynamics is replaced by quantum mechanics and by relativistic mechanics. However, these new forms of dynamics … have inherited the idea of Newtonian physics: a static universe, a universe of being without becoming. (4)

The Physics of Becoming

The interplay between function, structure and fluctuations leads to the most unexpected phenomena, including order through fluctuations … . (101)

… chemical instabilities involve long-range order through which the system acts as a whole. (104)

… the system obeys deterministic laws [as in classical dynamics] between two bifurcation points, but in the neighbourhood of the bifurcation points fluctuations play an essential role and determine the “branch” that the system will follow. (106) [This is termed “structurally unstable”.]

… a cyclic network of reactions [is] called a hypercycle. When such networks compete with one another, they display the ability to evolve through mutation and replication into greater complexity. …
The concept of structural stability seems to express in the most compact way the idea of innovation, the appearance of a new mechanism and a new species, … . (109)

… the origin of life may be related to successive instabilities somewhat analogous to the successive bifurcations that have led to a state of matter of increasing coherence. (123)

As an example, … consider the problem of urban evolution … (124) … such a model offers a new basis for the understanding of “structure” resulting from the actions (choices) of the many agents in a system, having in part at least mutually dependent criteria of action. (126)

… there are no limits to structural instability. Every system may present instabilities when suitable perturbations are introduced. Therefore, there can be no end to history. [DJM emphasis.] … we have … the constant generation of “new types” and “new ideas” that may be incorporated into the structure of the system, causing its continual evolution. (128)

… near bifurcations the law of large numbers essentially breaks down.
In general, fluctuations play a minor role … . However, near bifurcations they play a critical role because there the fluctuation drives the average. This is the very meaning of the concept of order through fluctuations … . (132)

… near a bifurcation point, nature always finds some clever way to avoid the consequences of the law of large numbers through an appropriate nucleation process. (134)

… For small-scale fluctuations, boundary effects will dominate and fluctuations will regress. … for large-scale fluctuations, boundary effects become negligible. Between these limiting cases lies the actual size of nucleation. (146)

… We may expect that in systems that are very complex, in the sense that there are many interacting species or components, [the degree of coupling between the system and its surroundings] will be very large, as will be the size of the fluctuation which could start the instability. Therefore … a sufficiently complex system is generally in a metastable state. (147) [But see Comments below.]

… Near instabilities, there are large fluctuations that lead to a breakdown of the usual laws of probability theory. (150)

The Bridge from Being to Becoming

[As foreshadowed by Bohr] we have a new form of complementarity – one between the dynamical and thermodynamic descriptions. (174)

… Irreversibility is the manifestation on a macroscopic scale of “randomness” on a microscopic scale. (178)

Contrary to what Boltzmann attempted to show, there is no “deduction” of irreversibility from randomness – they are only cousins! (177)

The Microscopic Theory of Irreversible Processes

The step made … is quite crucial. We go from the dynamical system in terms of trajectories or wave packets to a description in terms of processes. (186)

… Various mechanisms may be involved, the important element being that they lead to a complexity on the microscopic level such that the basic concepts involved in the trajectory or wave function must be superseded by a statistical ensemble. (194)

The classical order was: particles first, the second law later – being before becoming! It is possible that this is no longer so when we come to the level of elementary particles and that here we must first introduce the second law before being able to define the entities. (199)

The Laws of Change

… Of special interest is the close relation between fluctuations and bifurcations which leads to deep alterations in the classical results of probability theory. The law of large numbers is no longer valid near bifurcations and the unicity of the solution of … equations for the probability distribution is lost. (204)

This mathematization leads us to a new concept of time and irreversibility … . (206)

… the classical description in terms of trajectories has to be given up either because of instability and randomness on the microscopic level or because of quantum “correlations”. (207)

… the new concept implies that age depends on the distribution itself and is therefore no longer an external parameter, a simple label as in the conventional formula.
We see how deeply the new approach modifies our traditional view of time, which now emerges as a kind of average over “individual times” of the ensemble. (210)

For a long time, the absolute predictability of classical mechanics, or the physics of being, was considered to be an essential element of the scientific picture of the physical world. … the scientific picture has shifted toward a new, more subtle conception in which both deterministic features and stochastic features play an essential role. (210)

The basis of classical physics was the conviction that the future is determined by the present, and therefore a careful study of the present permits the unveiling of the future. At no time, however, was this more than a theoretical possibility. Yet in some sense this unlimited predictability was an essential element of the scientific picture of the physical world. We may perhaps even call this the founding myth of classical science.
The situation is greatly changed today. … The incorporation of the limitation of our ways of acting on nature has been an essential element of progress. (214)

Have we lost essential elements of classical science in this recent evolution [of thought]? The increased limitation of deterministic laws means that we go from a universe that is closed to one that is open to fluctuations, to innovations.

… perhaps there is a more subtle form of reality that involves both laws and games, time and eternity. (215) 

Comments

Relationship to previous work

This book can be seen as a development of the work of Kant, Whitehead and Smuts on emergence, although – curiously – it makes little reference to them [pg xvii]. In their terms, reality cannot logically be described in terms of point-like states within spaces with fixed ‘master equations’ that govern their dynamics. Instead, it needs to be described in terms of ‘processes’. Prigogine goes beyond this by developing explicit mathematical models as examples of emergence (from being to becoming) within physics and chemistry.

Metastability

According to the quote above, sufficiently complex systems are inherently metastable. Some have supposed that globalisation inevitably leads to an interconnected, hence complex, and hence stable world. But globalisation could lead to homogenization or fungibility, a reduction in complexity and hence an increased vulnerability to fluctuations. As ever, details matter.
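
Bifurcations and the law of large numbers

The quoted claims (pp. 132, 204) that the law of large numbers breaks down near bifurcations can be illustrated with a toy stochastic model (my own, far simpler than Prigogine’s examples): an ensemble started at an unstable point, where small fluctuations pick the branch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ensemble of dx/dt = mu*x - x^3 + small noise, started at the unstable
# point x = 0 of a pitchfork bifurcation (illustrative parameters).
mu, dt, steps, runs = 1.0, 0.01, 2000, 1000
x = np.zeros(runs)
for _ in range(steps):
    x += dt * (mu * x - x**3) + 0.05 * np.sqrt(dt) * rng.standard_normal(runs)

# Each realization settles near +1 or -1; early fluctuations decide which.
# The ensemble average (~0) describes no individual system: near the
# bifurcation, averaging loses its usual meaning.
print("ensemble mean:", round(x.mean(), 3), "| typical |x|:", round(np.abs(x).mean(), 3))
```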

See Also

I. Prigogine and I. Stengers, Order out of Chaos, Heinemann, 1984.
This is an update of a popular work on Prigogine’s theory of dissipative systems. He provides an unsympathetic account of Kant’s Critique of Pure Reason, supposing Kant to hold that there are “a unique set of principles on which science is based”, without making reference to Kant’s concept of emergence, or of the role of communities. But he does set his work within the framework of Whitehead’s Process and Reality. Smuts’ Holism and Evolution, which draws on Kant and mirrors Whitehead, is also relevant, as a popular and influential account of the 1920s, helping to define the then ‘modern science’.

Dave Marsay

Reasoning and natural selection

Cosmides, L. & Tooby, J. (1991). Reasoning and natural selection. Encyclopedia of Human Biology, vol. 6. San Diego: Academic Press.

Summary

Argues that logical reasoning, by which the authors seem to mean classical induction and symbolic reasoning, is not favoured by evolution. Instead one has reasoning particular to the social context. It argues that in typical situations it is either not possible or not practical to consider ‘all hypotheses’, and that the generation of hypotheses to consider is problematic. It argues that this is typically done using implicit specific theories. There is a discussion of the ‘green and blue cabs’ example (see below).
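
For reference, the usual version of the cabs problem (after Kahneman and Tversky; I assume the standard figures here) has 85% Green cabs, 15% Blue, and a witness who reports ‘Blue’ and is 80% reliable. A direct application of Bayes’ rule shows why neglecting the base rate misleads:

```python
# The 'cabs' base-rate problem in its usual form (assumed standard figures).
p_blue, p_green = 0.15, 0.85          # base rates of cab colours
p_say_blue_if_blue = 0.80             # witness reliability
p_say_blue_if_green = 0.20            # witness error rate

posterior = (p_say_blue_if_blue * p_blue) / (
    p_say_blue_if_blue * p_blue + p_say_blue_if_green * p_green)
print(f"P(Blue | witness says Blue) = {posterior:.2f}")   # ~0.41, under one half
```

Despite the witness being 80% reliable, the cab is still more likely Green than Blue.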

Comment

In real situations one cannot assume induction, and one lacks the ‘facts’ needed to perform symbolic reasoning. Logically, then, empirical reasoning would seem more suitable. Keynes, for example, considers the impact of not being able to consider ‘all hypotheses’.

While the case against classical rationality seems sound, the argument leaves the way open for an alternative rationality, e.g. one based on Whitehead and Keynes.

See Also

Later work

Better than rational, uncertainty aversion.

Other

Reasoning, mathematics.

Dave Marsay

Better than Rational

Cosmides, L. & Tooby, J. (1994). Better than rational: Evolutionary psychology and the invisible hand. American Economic Review, 84 (2), 327-332.

Summary

[Mainstream psychologists and behaviourists have studied] “biases” and “fallacies” – many of which are turning out to be experimental artifacts or misinterpretations (see G. Gigerenzer, 1991). [Gigerenzer, G. “How to Make Cognitive Illusions Disappear: Beyond Heuristics and Biases,” in W. Stroebe and M. Hewstone, eds., European Review of Social Psychology, Vol. 2. Chichester, U.K.: Wiley, 1991, pp. 83-115.]

… 

One point is particularly important for economists to appreciate: it can be demonstrated that “rational” decision-making methods (i.e., the usual methods drawn from logic, mathematics, and probability theory) are computationally very weak: incapable of solving the natural adaptive problems our ancestors had to solve reliably in order to reproduce (e.g., Cosmides and Tooby, 1987; Tooby and Cosmides, 1992a; Steven Pinker, 1994).

…  sharing rules [should be] appealing in conditions of high variance, and unappealing when resource accrual is a matter of effort rather than of luck (Cosmides and Tooby, 1992).

Comment

They rightly criticise some of the methods drawn from mathematics etc., but some readers have interpreted this as meaning that “logic, mathematics, and probability theory are … incapable of solving the natural adaptive problems our ancestors had to solve reliably in order to reproduce”. But this leads them to overlook relevant theories, such as Whitehead’s and Keynes’s.

See Also

Relevant mathematics, Avoiding unknown probabilities, Kahneman on biases

NOTE

This has been copied to my bibliography section under ‘rationality and uncertainty’, ‘more …’, where it has more links. Please comment there.

Dave Marsay