Ellsberg’s Risk, Ambiguity, and Savage

Daniel Ellsberg, "Risk, Ambiguity, and the Savage Axioms", The Quarterly Journal of Economics (1961) 75 (4): 643–669.

I. ARE THERE UNCERTAINTIES THAT ARE NOT RISKS?

Knight maintained that … “uncertainty” prevailed and hence that numerical probabilities were inapplicable —

in situations when the decision-maker was ignorant of the statistical frequencies of events relevant to his decision; or
when a priori calculations were impossible; or
when the relevant events were in some sense unique; or
when an important, once-and-for-all decision was concerned.

Yet the feeling has persisted that, even in these situations, people tend to behave “as though” they assigned numerical probabilities, or “degrees of belief,” to the events impinging on their actions.

A number of sets of constraints on choice-behavior under uncertainty have now been proposed, all more or less equivalent or closely similar in spirit, having the implication that — for a “rational” man — all uncertainties can be reduced to risks.’ (Ramsey, Savage, de Finetti, … .)

The propounders of these axioms tend to be hopeful that the rules will be commonly satisfied, at least roughly and most of the time, because they regard these postulates as normative maxims, widely-acceptable principles of rational behavior.

A side effect of the axiomatic approach is that it supplies, at last (as Knight did not), a useful operational meaning to the proposition that people do not always assign, or act “as though” they assigned, probabilities to uncertain events.

One could emphasize here either that the postulates failed to be acceptable in those circumstances as normative rules, or that they failed to predict reflective choices; I tend to be more interested in the latter aspect, Savage no doubt in the former. … But from either point of view, it would follow that there would be simply no way to infer meaningful probabilities for those events from their choices, and theories which purported to describe their uncertainty in terms of probabilities would be quite inapplicable in that area (unless quite different operations for measuring probability were devised).

I propose to indicate a class of choice-situations in which many otherwise reasonable people neither wish nor tend to conform to the Savage postulates, nor to the other axiom sets that have been devised.

But the implications of such a finding, if true, are not wholly destructive. First, both the predictive and normative use of the Savage or equivalent postulates might be improved by avoiding attempts to apply them in certain, specifiable circumstances where they do not seem acceptable. Second, we might hope that it is precisely in such circumstances that certain proposals for alternative decision rules and nonprobabilistic descriptions of uncertainty (e.g., by Knight, Shackle, Hurwicz, and Hodges and Lehmann) might prove fruitful. I believe, in fact, that this is the case.

II. UNCERTAINTIES THAT ARE NOT RISKS

The important finding is that, after rethinking all their “offending” decisions in the light of the axioms, a number of people who are not only sophisticated but reasonable decide that they wish to persist in their choices.
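The post does not restate the choices in question, so as a reminder: the best-known pattern in Ellsberg's paper involves an urn holding 30 red balls and 60 balls that are black or yellow in unknown proportion. Most subjects prefer to bet on red rather than black, yet prefer "black or yellow" to "red or yellow", and no single probability assignment makes both preferences consistent with expected pay-off. A minimal check in Python (the 100-unit stake is arbitrary and purely illustrative):

```python
# Ellsberg's three-colour urn: 90 balls, of which 30 are red and 60 are
# black or yellow in an unknown proportion. Each bet pays 100 if a ball
# of the named colour(s) is drawn, 0 otherwise.
# Typical choices: I (red) over II (black), and IV (black or yellow)
# over III (red or yellow).

def expected_payoff(probs, colours, stake=100):
    return stake * sum(probs[c] for c in colours)

consistent = []
for n_black in range(61):            # every possible composition of the urn
    probs = {"red": 30 / 90, "black": n_black / 90, "yellow": (60 - n_black) / 90}
    I, II = expected_payoff(probs, ["red"]), expected_payoff(probs, ["black"])
    III = expected_payoff(probs, ["red", "yellow"])
    IV = expected_payoff(probs, ["black", "yellow"])
    if I > II and IV > III:
        consistent.append(n_black)

print(consistent)                    # [] : no composition rationalises both choices
```

The list is empty because the first preference requires fewer than 30 black balls while the second requires more than 30; that is the sense in which these choices cannot be reduced to risks.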

III. WHY ARE SOME UNCERTAINTIES NOT RISKS?

Responses from confessed violators indicate that the difference is not to be found in terms of the two factors commonly used to determine a choice situation, the relative desirability of the possible pay-offs and the relative likelihood of the events affecting them, but in a third dimension of the problem of choice: the nature of one’s information concerning the relative likelihood of events.

The subject can always ask himself: “What is the likelihood that the experimenter has rigged this urn? Assuming that he has, what proportion of red balls did he probably set? If he is trying to trick me, how is he going about it? What other bets is he going to offer me? What sort of results is he after?”

Ellsberg considers that taking account of the above factors can be justified as follows:

In terms of my best estimates of probabilities, action I has almost as high an expectation as action II. But if my best guesses should be rotten, which wouldn’t surprise me, action I gives me better protection; the worst expectation that looks reasonably possible isn’t much worse than the “best guess” expectation, whereas with action II it looks possible that my expectation could really be terrible.

Ellsberg advocates taking:

… actions whose expected values are relatively insensitive to the particular distribution in that range, without giving up too much in terms of the “best guess” distribution. That strikes me as a sensible, conservative rule to follow. What’s wrong with it?

There is, in fact, no obvious basis for asserting that it will lead him in the long run to worse outcomes than he could expect if he reversed some of his preferences to conform to the Savage axioms.
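One way to make the quoted rule concrete, along the lines of the Hodges and Lehmann "restricted Bayes" idea that Ellsberg cites, is to score an action by a weighted mix of its best-guess expected pay-off and its worst expected pay-off over the distributions that still seem reasonably possible, with the weight reflecting one's confidence in the best guess. Below is a rough Python sketch of such a rule; the pay-off table, the candidate distributions, and the confidence weight are invented for illustration.

```python
# Hedged sketch of a "conservative" criterion of the kind described above:
# rho measures confidence in the best-guess distribution; (1 - rho) weights
# the worst expectation over the distributions still considered plausible.

def expectation(payoffs, dist):
    return sum(p * x for p, x in zip(dist, payoffs))

def conservative_score(payoffs, best_guess, plausible_dists, rho):
    best = expectation(payoffs, best_guess)
    worst = min(expectation(payoffs, d) for d in plausible_dists)
    return rho * best + (1 - rho) * worst

# Two actions over the same two states: action II has the higher best-guess
# expectation but is far more sensitive to which distribution is right.
best_guess = (0.5, 0.5)
plausible = [(0.3, 0.7), (0.5, 0.5), (0.7, 0.3)]   # the "reasonably possible" range
action_I = (90, 110)     # pay-offs insensitive to the true distribution
action_II = (0, 220)     # great if state 2 is likely, terrible otherwise

for rho in (1.0, 0.6):
    s1 = conservative_score(action_I, best_guess, plausible, rho)
    s2 = conservative_score(action_II, best_guess, plausible, rho)
    print(f"rho={rho}: action I -> {s1:.1f}, action II -> {s2:.1f}")
```

With full confidence (rho = 1.0) the ranking is by best-guess expectation alone and action II wins (110 vs 100); with reduced confidence (rho = 0.6) the insensitive action I wins (98.4 vs 92.4), which is exactly the reasoning quoted above.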

… the ambiguities surrounding the outcome of a proposed innovation, a departure from current strategy, may be much more noticeable.

The decision rule discussed … will definitely bias the choice away from such ambiguous ventures and toward the strategy with “known risks.” Thus the rule is “conservative” in a sense more familiar to everyday conversation than to statistical decision theory; it may often favor traditional or current strategies, even perhaps at high risk, over innovations whose consequences are undeniably ambiguous. This property … does not seem irrelevant to one’s attitude toward the behavior.

However, in this case we are assuming conservatism, not pessimism; our subject does not actually expect the worst, but he chooses to act “as though” the worst were somewhat more likely than his best estimates of likelihood would indicate.

In effect, he “distorts” his best estimates of likelihood, in the direction of increased emphasis on the less favorable outcomes and to a degree depending on his confidence in his best estimate.
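Since expected pay-off is linear in the probabilities, the weighted rule sketched earlier can indeed be read as using a single "distorted" distribution: a mixture of the best estimate and the distribution least favourable to the action being scored, with weights given by one's confidence. A quick numerical check, reusing the invented numbers from the sketch above:

```python
# The rho-weighted score equals the plain expectation under a distorted
# distribution: rho * best_guess + (1 - rho) * worst_case_for_this_action.
rho = 0.6
best_guess = (0.5, 0.5)
worst_for_II = (0.7, 0.3)      # least favourable plausible distribution for action II
action_II = (0, 220)

distorted = tuple(rho * b + (1 - rho) * w for b, w in zip(best_guess, worst_for_II))
score_via_distortion = sum(p * x for p, x in zip(distorted, action_II))
print(distorted, score_via_distortion)   # (0.58, 0.42) -> 92.4, matching the earlier score
```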

If these propositions should prove valid, the question of the optimality of this behavior would gain more interest.

It would seem incautious to rule peremptorily that the people [who do not act ‘rationally’ in the conventional sense of attending only to numerical probability] should not allow their perception of ambiguity, their unease with their best estimates of probability, to influence their decision: or to assert that the manner in which they respond to it is against their long-run interest and that they would be in some sense better off if they should go against their deep-felt preferences. If their rationale for their decision behavior is not uniquely compelling (and recent discussions with T. Schelling have raised questions in my mind about it), neither, it seems to me, are the counterarguments. Indeed, it seems out of the question summarily to judge their behavior as irrational: I am included among them.

Comments

Ellsberg gives us cause to doubt the normative correctness of the common idea that uncertainty about an event can always be summarised by a single numerical probability, and he suggests an alternative; but his arguments are not decisive.

I look at it like this: a decision rule is a form of strategy. If we select a risk with a known probability, then by the law of large numbers the risk should ‘come good’ in a proportion of instances that tends to that probability, so when we repeatedly take such risks the outcomes tend to ‘average out’. But in an ambiguous situation the proportion could fall anywhere in a range, and this averaging need not happen. For example, if an adversary creates or selects the challenges we face, we have no reason to suppose that our best estimates will be correct even ‘on average’. From a game-theoretic perspective it then makes sense to select an option whose pay-offs cannot be affected by others. Thus the convention that Ellsberg criticises, of equating a range of possible probabilities to some single average value, seems flawed: a range and its average have different implications for outcomes and hence for decisions.
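A toy simulation illustrates the asymmetry, assuming an invented range of win probabilities and an adversary who is free to choose within it each round: a known risk ‘averages out’ near its stated probability, while under ambiguity the realised proportion can be driven to the unfavourable end of the range rather than to its midpoint.

```python
import random

random.seed(0)
ROUNDS = 10_000

# Known risk: each bet wins with a fixed, known probability of 0.5.
known_wins = sum(random.random() < 0.5 for _ in range(ROUNDS))

# Ambiguity: the win probability is only known to lie in [0.3, 0.7].
# Treating the range as "0.5 on average" ignores who chooses within it:
# an adversary setting the odds each round can sit at the worst end.
adversarial_wins = sum(random.random() < 0.3 for _ in range(ROUNDS))

print(f"known risk:  {known_wins / ROUNDS:.3f}  (settles near 0.5)")
print(f"adversarial: {adversarial_wins / ROUNDS:.3f}  (settles near 0.3, not 0.5)")
```

So pricing the ambiguous range as if it were simply its midpoint would misjudge every bet against such an opponent.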

The common assumption, as in Savage’s sure-thing principle, seems to be that the random mechanisms have been constructed and selected without trickery. But this is a very stringent limitation: most cases of interest involve some form of adaptation, whether through deliberate agency, evolution, or some combination of the two. Thus even if there is no actual ‘player’ trying to trick us, life can produce some tricky challenges. Life is not just a gamble.

See Also

My notes on classical probability and broad uncertainty.

Dave Marsay


