In an earlier post I said that probability was really a common ground for three moments in the life of humanity. First, our own subjective ability to judge how likely or unlikely certain events are. It is now well known that we humans are particularly bad at some kinds of judgement about uncertain outcomes. I'm very much looking forward to reading **The Drunkard's Walk**, on my desk now, which deals with this subject, emphasising the Kahneman and Tversky discoveries in economics and psychology. Keynes himself, in his doctoral dissertation, suggests we don't have the capacity for making fine-grained probabilistic calls, but instead are capable of broadly ranking outcomes in terms of their likelihood. According to the final chapter of **The Lady Tasting Tea**, the philosopher Patrick Suppes attended a Tversky presentation highlighting just how incoherent we can be when it comes to subjective probability estimation, and came up with a version of 'approximate probability' which was not only consistent with what Tversky found in his psychological experimentation but also consistent with Kolmogorov's axioms of probability.

Second is the moment we construct randomisation machines, whose behaviour is sufficiently outside our own heads to allow experimenters and theoreticians to begin an analysis of probability *independent* of, perhaps in spite of, how limited or biased our own subjective ability is to reason about uncertainty. Third is the moment Kolmogorov mainlines these probabilistic analyses into the core of mathematics, via a creative re-interpretation of the frequentist approach as a number of manipulations of set theory.

Historically, however, step two came *before* step one. But the order of their discovery does not imply that Kahneman and Tversky created a *refinement* of the classical theory of Pascal and Fermat, or later of Kolmogorov. No, they merely discovered what has been true about human brains for probably tens of thousands of years - they merely probed our biases and limits. The right formal analysis of a financial structure or a game of chance is a *better* analysis than one which is sensitive to our own limits.

Take the famous two games of the **Chevalier De Mere** (Antoine Gombaud), the very same games which led him to engage Fermat and Pascal to find out how to split the pot fairly if a game is broken up early by mutual consent. In game 1, he bet he could get a 6 in four rolls of a die. Since this outcome is more likely than not (the probability is $1 - (5/6)^4 \approx 0.518$), it is a good game to play. Probably at some point towards the end of the life of this game, it became sufficiently widely known that a 6 in four rolls is more likely than not that no-one would play with him. So he moved on to game 2: rolling two dice 24 times, the challenge being to get a double-six. It turns out that this is slightly less likely than $\frac{1}{2}$ (the probability is $1 - (35/36)^{24} \approx 0.491$). De Mere played this game regularly and made many losses. Knowing people's psychological biases and limits doesn't really help you much here; being a human being, with all of our above-mentioned weaknesses in judging uncertainty, is also not very helpful. Knowing the analysis most certainly is helpful. Disseminating this knowledge in a sense *creates* a more rational player.

A similar point is made by Perry Mehrling about Fischer Black, the so-called CAPM man. My point is that the rational analytical approach adds value here in spite of it not initially reflecting psychological reality - culturally, if the analytical approach pays off, that idea can usurp our more primitive subjective reasonings. Finally, if the game you're playing isn't just 'out there' like De Mere's games, but instead involves a model of other players (e.g. poker), then psychological insight into human biases can become just as invaluable as knowing the card probabilities.
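The two De Mere probabilities are easy to verify yourself. A minimal Python sketch, using the complement rule (the chance of at least one success is one minus the chance of no successes) and exact rational arithmetic:

```python
from fractions import Fraction

# Game 1: at least one 6 in four rolls of a single die.
# P(no 6 in one roll) = 5/6, so P(at least one 6 in 4 rolls) = 1 - (5/6)^4.
p_game1 = 1 - Fraction(5, 6) ** 4

# Game 2: at least one double-six in 24 rolls of two dice.
# P(no double-six in one roll) = 35/36, so P = 1 - (35/36)^24.
p_game2 = 1 - Fraction(35, 36) ** 24

print(float(p_game1))  # ~0.518: better than even, so De Mere wins on average
print(float(p_game2))  # ~0.491: slightly worse than even, so De Mere loses on average
```

Using `Fraction` rather than floats keeps the calculation exact; game 1 comes out to precisely $671/1296$, and the naive intuition that 24 rolls of two dice should be "the same bet" as 4 rolls of one die is exposed as wrong.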