Ashby’s Works

W. Ross Ashby's two books below "introduced exact and logical thinking into the brand new discipline of cybernetics and were highly influential. … [He] was president of the Society for General Systems Research from 1962 to 1964. … [He] was one of the original members of the Ratio Club."


An Introduction to Cybernetics

Chapman and Hall 1956.

This is best known for the "Law of Requisite Variety", stating that "variety absorbs variety": it defines the minimum number of states necessary for a controller to control a system of a given number of states. In response, Conant and Ashby (1970) produced the "Good Regulator theorem", stating that "every good regulator of a system must be a model of that system".
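As a rough illustration of the counting form of the law (a sketch of my own, not Ashby's notation): if a disturbance can take D distinct values and the regulator has only R distinct responses, then even a best-possible regulator cannot confine the outcome to fewer than ceil(D/R) distinct values.

```python
import math

def min_outcome_variety(disturbance_states: int, regulator_states: int) -> int:
    """Counting form of the Law of Requisite Variety: a regulator with R distinct
    responses, facing D distinct disturbances, cannot hold the outcome of the
    regulated system to fewer than ceil(D / R) distinct values."""
    return math.ceil(disturbance_states / regulator_states)

print(min_outcome_variety(100, 10))   # 10 -- variety 10 in the regulator leaves variety >= 10 in the outcome
print(min_outcome_variety(100, 100))  # 1  -- only requisite variety can pin the outcome to a single state
```

Only by matching the variety of the disturbances ("variety absorbs variety") can the regulator hold the outcome to a single state.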

Preface

It is the author’s belief that if the subject [Cybernetics] is founded in the common-place and well understood, and is then built up carefully, step by step, there is no reason why the worker with only elementary mathematical knowledge should not achieve a complete understanding of its basic principles.

It lays the foundation for a general theory of complex regulating systems, developing further the ideas of Design for a Brain.

Part Three: Regulation and Control

11: Requisite Variety

THE LAW OF REQUISITE VARIETY

Only variety can destroy variety.

12: The Error-Controlled Regulator

THE MARKOVIAN MACHINE

A “machine” is essentially a system whose behaviour is sufficiently law-abiding or repetitive for us to be able to make some prediction about what it will do. … [The Markov machine] is one whose states change with time not by a single-valued transformation but by a matrix of transition probabilities. For it to remain the same absolute system the values of the probabilities must be unchanging.

A Markovian machine has various forms of stability, which correspond to those mentioned in Chapter 5.

As is now well known, a system around a state of equilibrium behaves as if “goal-seeking”, the state being the goal. A corresponding phenomenon appears in the Markovian case. Here, instead of the system going determinately to the goal, it seems to wander, indeterminately, among the states, consistently moving to another when not at the state of equilibrium and equally consistently stopping there when it chances upon that state. The state still appears to have the relation of “goal” to the system, but the system seems to get there by trying a random sequence of states and then moving or sticking according to the state it has arrived at. Thus, the objective properties of getting success by trial and error are shown when a Markovian machine moves to a state of equilibrium. At this point it may be worth saying that the common name of “trial and error” is about as misleading as it can be. … “Hunt and stick” seems to describe the process both more vividly and more accurately.
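A minimal sketch of such a machine (my own illustration, not Ashby's): a fixed matrix of transition probabilities in which one state is a state of equilibrium, i.e. transforms to itself with probability 1. Started elsewhere, the machine "hunts" at random among the states and "sticks" when it chances upon the equilibrium.

```python
import random

# Transition probabilities of a three-state Markovian machine (states 0, 1, 2).
# Row i gives the probabilities of moving from state i to states 0, 1 and 2.
# State 2 is a state of equilibrium: its transition is "to itself" with probability 1.
P = [
    [0.5, 0.3, 0.2],
    [0.4, 0.4, 0.2],
    [0.0, 0.0, 1.0],
]

def run(start, seed=0):
    """Follow the machine from `start` until it reaches the state of equilibrium."""
    rng = random.Random(seed)
    state, trajectory = start, [start]
    while state != 2:
        state = rng.choices([0, 1, 2], weights=P[state])[0]
        trajectory.append(state)
    return trajectory

print(run(0))  # e.g. [0, 0, 1, 2] -- "hunt and stick": it wanders, then stops at the goal
```

For the machine to remain the same absolute system, the matrix P must not change while it runs.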

MARKOVIAN REGULATION

Regulation that uses Markovian machinery can therefore now be regarded as familiar and ordinary.

GAMES AND STRATEGIES

What happens, he may ask, when regulation and control are attempted in systems of biological size and complexity? What happens, for instance, when regulation and control are attempted in the brain or in a human society? Discussion of this question will occupy the remaining chapters.

13: Regulating the Very Large System

13/1. Regulation and control in the very large system is of peculiar interest to the worker in any of the biological sciences, for most of the systems he deals with are complex and composed of almost uncountably many parts. The ecologist may want to regulate the incidence of an infection in a biological system of great size and complexity, with climate, soil, host’s reactions, predators, competitors, and many other factors playing a part. The economist may want to regulate against a tendency to slump in a system in which prices, availability of labour, consumer’s demands, costs of raw materials, are only a few of the factors that play some part. The sociologist faces a similar situation. And the psychotherapist attempts to regulate the working of a sick brain that is of the same order of size as his own, and of fearful complexity. These regulations are obviously very different from those considered in the simple mechanisms of the previous chapter. At first sight they look so different that one may well wonder whether what has been said so far is not essentially inapplicable.

Regulation in biological systems certainly raises difficult problems —that can be admitted freely. … Largeness in itself is not the source … What is usually the main cause of difficulty is the variety in the disturbances that must be regulated against.

Let us therefore approach the very large system with no extravagant ideas of what is achievable.

On this earth, the whole dynamic biological and ecological system tends to consist of many subsystems loosely coupled (S.4/20); and the sub-systems themselves tend to consist of yet smaller systems, again more closely coupled internally yet less closely coupled between one another; and so on. …  Arbitrary or not, however, some boundary must always be drawn, at least in practical scientific work, for otherwise no definite statement can be made.

DESIGNING THE REGULATOR

Throughout, we shall be exemplifying the thesis of D. M. MacKay: that quantity of information, as measured here, always corresponds to some quantity, i.e. intensity, of selection, either actual or imaginable.

Thus the act of “designing” or “making” a machine is essentially an act of communication from Maker to Made, and the principles of communication theory apply to it.

Thus the making of a machine of desired properties (in the sense of getting it rather than one with undesired properties) is an act of regulation.

[A] regulator can be selected from some general set of mechanisms (many non-regulatory) only by being either the survivor of some process of natural selection or by being made (another process of selection) by another regulator.

14: Amplifying Regulation

This is the final chapter.

[If] the final regulator can be arrived at by stages (the whole selection occurring in stages) the possibility exists that the provision of a small regulator at the first stage may lead to the final establishment of a much bigger regulator (i.e. one of larger capacity) so that the process shows amplification. This is the sense in which “amplifying” regulation is to be understood. The law of Requisite Variety, like the law of Conservation of Energy, absolutely prohibits any direct and simple magnification but it does not prohibit supplementation.

Whence comes the supplementation? From random sources as in S.12/15 and from the environment itself! For it is the environment that is forced to provide much of the determination about how the organism shall act.

The last two chapters [12 and 13] have developed the subject somewhat speculatively, partly to give the reader practice in applying the earlier methods, and partly to show what lies ahead, for the prospects are exciting.

But these matters are not matters for an Introduction.

The Introduction was originally written to explain the first edition of Design for a Brain, but the second edition of that book contains some 'advanced' material which sheds some light on the speculations and caveats of the Introduction: in particular, the extent to which a very large system may have properties quite different from those of even a very large Markovian machine, and the way in which 'amplifying adaptation' supersedes 'amplifying regulation'.

Design for a Brain

Chapman and Hall 1960, 2nd edition (revised); original edition 1952.


Preface

For the deduction to be rigorous, an adequately developed logic of mechanism is essential.

There now exists a well developed logic of pure mechanism, rigorous as geometry, and likely to play the same fundamental part, in our understanding of the complex systems of biology, that geometry does in astronomy.

The conclusions reached are summarised at the end of Chapter 18, but they are likely to be unintelligible or misleading if taken by themselves; for they are intended only to make prominent the key points along a road that the reader has already traversed. They may, however, be useful as he proceeds, by helping him to distinguish the major features from the minor.

Having experienced the confusion that tends to arise whenever we try to relate cerebral mechanisms to observed behaviour, I made it my aim to accept nothing that could not be stated in mathematical form, for only in this language can one be sure, during one's progress, that one is not unconsciously changing the meaning of terms, or adding assumptions, or otherwise drifting towards confusion. The aim is proving achievable. … But the rigour and coherence depended on the mathematical form, which is not read with ease by everyone.

Preface to the Second Edition

At the time when this book was first written, information theory was just beginning to be known. Since then its contribution to our understanding of the logic of mechanism has been so great that a separate treatment of these aspects has been given in my Introduction to Cybernetics … . Its outlook and methods are fundamental to the present work.

2: Dynamic Systems

2/5. Because any real 'machine' has an infinity of variables, from which different observers (with different aims) may reasonably make an infinity of different selections, there must first be given an observer (or experimenter); a system is then defined as any set of variables that he selects from those available on the real 'machine'. It is thus a list, nominated by the observer, and is quite different in nature from the real 'machine'. Throughout the book, 'the system' will always refer to this abstraction, not to the real material 'machine'.

2/16 …  Because of its importance, science searches persistently for the state-determined. As a working guide, the scientist has for some centuries followed the hypothesis that, given a set of variables, he can always find a larger set that (1) includes the given variables, and (2) is state-determined. Much research work consists of trying to identify such a larger set, for when the set is too small, important variables will be left out of account, and the behaviour of the set will be capricious. The assumption that such a larger set exists is implicit in almost all science, but, being fundamental, it is seldom mentioned explicitly.

The assumption is now known to be false at the atomic level. We, however, will seldom discuss events at this level; and as the assumption has proved substantially true over great ranges of macroscopic science, we shall use it extensively.

Strategy for the complex system

2/17 … I suggest that it must try to be exact in certain selected cases, these cases being selected because there we can be exact. With these exact cases known, we can then face the multitudinous cases that do not quite correspond, using the rule that if we are satisfied that there is some continuity in the systems’ properties, then insofar as each is near some exact case, so will its properties be near to those shown by the exact case.

4: Stability

4/1. The words 'stability', 'steady state', and 'equilibrium' are used by a variety of authors with a variety of meanings, though there is always the same underlying theme. As we shall be much concerned with stability and its properties, an exact definition must be provided.

Given a field, a state of equilibrium is one from which the representative point does not move. When the primary operation is applied, the transition from that state can be described as 'to itself'.

(Notice that this definition, while saying what happens at the equilibrial state, does not restrict how the lines of behaviour may run around it. They may converge in to it, or diverge from it, or behave in other ways.)

Given the field of a state-determined system and a region in the field, the region is stable if the lines of behaviour from all points in the region stay within the region.

A field will be said to be stable if the whole region it fills is stable; the system that provided the field can then be called stable.
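These definitions translate directly into a check one can compute. A minimal sketch for a state-determined system given by a single-valued transformation on a finite set of states (my own example, not from the book): an equilibrium is a state whose transition is to itself, and a region is stable if the line of behaviour from every state in it stays within it.

```python
# A state-determined system on the states 0..5, given as a single-valued transformation.
T = {0: 1, 1: 2, 2: 2, 3: 4, 4: 3, 5: 0}

def equilibria(transform):
    """States of equilibrium: those whose transition is 'to itself'."""
    return [s for s, t in transform.items() if s == t]

def is_stable_region(transform, region):
    """A region is stable if every line of behaviour starting in it stays in it;
    for a single-valued transformation it is enough that no state of the region
    is carried outside it in one step."""
    return all(transform[s] in region for s in region)

print(equilibria(T))                   # [2]
print(is_stable_region(T, {0, 1, 2}))  # True  -- every line of behaviour stays inside
print(is_stable_region(T, {3, 5}))     # False -- e.g. 5 -> 0 leaves the region
```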

Two systems may be joined so that they act and interact on one another to form a single system: to know that the two systems when separate were both stable is to know nothing about the stability of the system formed by their junction: it may be stable or unstable.

4/19. The fact that the stability of a system is a property of the system as a whole is related to the fact that the presence of stability always implies some co-ordination of the actions between the parts.

[Note] that as the number of variables increases so usually do the effects of variable on variable have to be co-ordinated with more and more care if stability is to be achieved.

5: Adaptation as Stability

5/2. The suggestion that an animal's behaviour is 'adaptive' if the animal 'responds correctly to a stimulus' may be rejected at once. First, it … cannot be applied when the free-living organism and its environment affect each other reciprocally. Secondly, the definition provides no meaning for 'correctly' unless it means 'conforming to what the experimenter thinks the animal ought to do'. Such a definition is useless.

Homeostasis

5/3. I propose the definition that a form of behaviour is adaptive if it maintains the essential variables (S. 3/14) within physiological limits.

Generalised homeostasis

5/8. We can now recognise that 'adaptive' behaviour is equivalent to the behaviour of a stable system, the region of the stability being the region of the phase-space in which all the essential variables lie within their normal limits.

[This] involves the concept of a machine changing its internal organisation. So far, nothing has been said of this important concept; so it will be treated in the next chapter.

6: Parameters

6/4. The importance of distinguishing between change of a variable and change of a parameter, that is, between change of state and change of field, can hardly be over-estimated.

6/7. We now reach the main point of the chapter. Because a change of parameter-value changes the field, and because a system’s stability depends on its field, a change of parameter-value will in general change a system’s stability in some way.
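A minimal numerical illustration of the point (my own, not from the book): in the one-variable system x -> a·x the parameter a selects the field, and the equilibrium at x = 0 is stable for |a| < 1 but unstable for |a| > 1, so a change of parameter-value changes the system's stability.

```python
def settles_to_zero(a, x0=1.0, steps=100):
    """Iterate x -> a*x and report whether the line of behaviour
    approaches the state of equilibrium at x = 0."""
    x = x0
    for _ in range(steps):
        x = a * x
    return abs(x) < 1e-6

print(settles_to_zero(0.5))  # True  -- with parameter a = 0.5 the equilibrium is stable
print(settles_to_zero(1.5))  # False -- changing the parameter to 1.5 makes it unstable
```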

Equilibria of part and whole

For a whole dynamic system to be in equilibrium at a particular state it is necessary and sufficient that each part should be in equilibrium at that state, in the conditions given to it by the other parts.
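A small computational check of this claim (my own sketch): couple two parts so that each part's next state depends on its own state and on the other's. The whole is at equilibrium at a state (x, y) exactly when part X, with y held at that value, maps x to itself, and likewise for part Y with x held fixed.

```python
# Two coupled parts: x' = f(x, y) and y' = g(y, x), on the states 0..4.
def f(x, y):
    """Part X; the value of y is the condition given to it by the other part."""
    return min(x + 1, y)

def g(y, x):
    """Part Y; the value of x is the condition given to it by the other part."""
    return max(y - 1, x)

states = range(5)
# The whole is in equilibrium at (x, y) exactly when each part is in equilibrium
# there, in the conditions given to it by the other part.
whole_equilibria = [(x, y) for x in states for y in states
                    if f(x, y) == x and g(y, x) == y]
print(whole_equilibria)  # [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
```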

7: The Ultrastable System

To be adapted, the organism, guided by information from the environment, must control its essential variables, forcing them to go within the proper limits, by so manipulating the environment (through its motor control of it) that the environment then acts on them appropriately.

The process of trial and error … may be playing the invaluable part of gathering information, information that is absolutely necessary if adaptation is to be successfully achieved.

Step-functions 

Although there is no rigorous law, there is nevertheless a wide-spread tendency for systems to show changes of step-function form if their variables are driven far from some usual value.

Systems containing step-mechanisms

Suppose, in a state-determined system, that some of the variables are due to step-mechanisms, and that these are ignored while the remainder (the main variables) are observed on many occasions by having their field constructed. Then so long as no step-mechanism changes value during the construction, the main variables will be found to form a state-determined system, and to have a definite field. But on different occasions different fields may be found.

The ultrastable system

7/26. In the first edition the system described in this chapter was called 'ultrastable' … . At that time the system was thought to be unique, but further experience [in the 'Introduction', above, 12/8-20] has shown that this form is only one of a large class of related forms … .

For convenience, its definition will be stated formally. Two systems of continuous variables (that we called 'environment' and 'reacting part') interact, so that a primary feedback (through complex sensory and motor channels) exists between them. Another feedback, working intermittently and at a much slower order of speed, goes from the environment to certain continuous variables which in their turn affect some step-mechanisms, the effect being that the step-mechanisms change value when and only when these variables pass outside given limits. The step-mechanisms affect the reacting part; by acting as parameters to it they determine how it shall react to the environment.
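A minimal simulation sketch of the idea (my own construction, far simpler than Ashby's homeostat): the reacting part's behaviour is fixed by a step-mechanism acting as a parameter, and whenever the essential variable strays outside its limits the step-mechanism jumps to a new random value. This slower, intermittent feedback keeps changing the parameter until it happens upon a value under which the primary feedback holds the essential variable within limits, where it then sticks.

```python
import random

rng = random.Random(1)
LIMIT = 1.0                           # the essential variable must stay within [-LIMIT, LIMIT]
step_value = rng.uniform(-2.0, 2.0)   # step-mechanism: a parameter of the reacting part
x = 5.0                               # the essential variable, disturbed away from its limits

for t in range(200):
    # Primary feedback: environment and reacting part interact continuously;
    # under the parameter `step_value` the essential variable follows x -> step_value * x.
    x = step_value * x
    # Secondary feedback (intermittent and slower): the step-mechanism changes value
    # when, and only when, the essential variable passes outside its limits.
    if abs(x) > LIMIT:
        step_value = rng.uniform(-2.0, 2.0)
        x = max(min(x, 5.0), -5.0)    # keep the run numerically bounded

print(abs(x) <= LIMIT, round(step_value, 2))
# Typically ends with True and |step_value| < 1: the system has "hunted" through
# parameter values and "stuck" at one whose field keeps the essential variable within limits.
```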

18: Amplifying Adaptation

This is the conclusion of the main, less mathematical, section.

Thus, selection for complex equilibria … is the rule. (18/1).

Appendix: 20 Stability

[The] probability of stability is small in large systems assembled at random. (20/10)
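The flavour of that result can be checked numerically. The Monte Carlo sketch below is my own and uses the standard linear criterion (an equilibrium of x' = Ax is stable when every eigenvalue of A has negative real part) rather than Ashby's own argument: the fraction of randomly assembled linear systems that are stable falls rapidly as the number of variables grows.

```python
import numpy as np

def fraction_stable(n, trials=2000, seed=0):
    """Estimate the probability that a randomly assembled n-variable linear
    system x' = A x is stable, i.e. all eigenvalues of A have negative real part."""
    rng = np.random.default_rng(seed)
    stable = 0
    for _ in range(trials):
        A = rng.standard_normal((n, n))
        if np.all(np.linalg.eigvals(A).real < 0):
            stable += 1
    return stable / trials

for n in (1, 2, 4, 8):
    print(n, fraction_stable(n))
# The estimated fraction falls quickly with n: stability becomes increasingly
# improbable for large systems assembled at random.
```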

My Comments

This all seems consistent with the ‘process logic’ of Whitehead and subsequent work by Keynes and Ashby’s fellow Ratio Club member, Turing. As such it is a considerable advance on the first edition, whose ideas many seem to take as ‘Ashby’s Cybernetics’.

Dave Marsay
