
Wednesday, 6 November 2019

Markowitz and expectation

One of Harry Markowitz's aha moments comes when he reads John Burr Williams on equity prices being the present value of future dividends received.  Markowitz rightly tightens this definition up to foreground the fact that the model operates under future uncertainty, so the phrase 'present value of future dividends' ought to be 'expected present value of future dividends'.  We are dealing with a probability distribution here, together with some variance expressing our current uncertainty.  When variance here is a now-fact, representing our own measure of ignorance, it fits well with a Bayesian/information-theoretic framework.

I note that in 1952 the idea of future expected volatility was dramatically under-developed.  The Black-Scholes paper and the trading of listed equity options on exchange were still two decades away, and the term 'implied volatility' was not in common finance parlance.

The other interpretation of variance in Markowitz's classic 1952 paper, Portfolio Selection, is that it ought to be the expected future variability of the stock's (or portfolio's, or asset's, or factor's) return.  That is, the first of Markowitz's two stages in selecting a portfolio is making an estimate of the expected return and expected variance of the return stream.

He says:
The process of selecting a portfolio may be divided into two stages. The first stage starts with observation and experience and ends with beliefs about the future performances of available securities. The second stage starts with the relevant beliefs about future performances and ends with the choice of portfolio. 
I'm mentioning this since I think Markowitz thought of minimum variance as a tool in the 'decision making under uncertainty' toolbox: it in effect operationalises diversification, something he comes into the discussion wanting to foreground more than it had been in the past.

What has largely happened since then is that maximum likelihood estimates of expected return and expected variance, based on historical returns, have taken precedence.  This is convenient, of course, but it doesn't need to be so.  For example, imagine that a pair of companies has just entered into an M&A arrangement: historical returns then tell only part of the story.
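To make that concrete, here is a minimal sketch of what the maximum likelihood historical approach amounts to: the sample mean and the $\frac{1}{N}$-normalised sample covariance of past returns, projected forward as the expectations.  The return series below is simulated and purely illustrative.

```python
import numpy as np

# Illustrative daily returns for two assets: rows are days, columns are assets.
rng = np.random.default_rng(seed=0)
returns = rng.normal(loc=0.0005, scale=0.01, size=(250, 2))

# Maximum likelihood (plug-in) estimates under a normality assumption:
# the sample mean, and the sample covariance normalised by N (bias=True).
mu_hat = returns.mean(axis=0)
sigma_hat = np.cov(returns, rowvar=False, bias=True)

print("expected return estimate:", mu_hat)
print("expected covariance estimate:\n", sigma_hat)
```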

Also, if you believe Shiller 1981, the realised volatility of stock prices in general over the next time period will be much greater than the volatility on show for dividends and perhaps also not much like the realised volatility for the time period just past.

Taking a step back even further, we are assuming that the relevant expected distribution has a shape which can appropriately be summarised as unimodal with finite variance, so that the first two moments give us a meaningful flavour of it.  But again, just think of the expected distribution of an acquired company's stock price halfway through an M&A deal.  This isn't likely to be normal-like, and may well be bimodal.
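A back-of-envelope sketch of that M&A point, with made-up numbers: a cash offer at 100, a fall-back price of 70 and an assumed 80% chance of completion give a distribution whose mean and standard deviation describe almost nothing the stock will actually do.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical half-completed cash deal: offer at 100, fall-back price 70,
# assumed 80% chance the deal closes. The outcome is essentially two-point.
deal_closes = rng.random(100_000) < 0.80
price = np.where(deal_closes, 100.0, 70.0)

print(f"mean {price.mean():.1f}, std {price.std():.1f}")
# The mean (~94) sits in a region of near-zero probability mass: the
# distribution is bimodal, so the first two moments give a poor flavour of it.
```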

Friday, 22 March 2013

Probability preferences : expectation is secondary

I didn't realise counting was so important to the theory of probability.  First you have the simplified sub-case where all $N$ mutually exclusive outcomes are equally probable, in which case you can use combinatorics, which is just counting with power tools, to compute probabilities.  In effect the move is to map each of these $\frac{1}{N}$-probability outcomes to the natural numbers.  Comparing probabilities then becomes a question of counting elementary outcomes in the sample space.
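As a quick illustration of counting as the engine here, assuming nothing beyond two fair dice: the probability of an event is just the count of favourable elementary outcomes over the count of the sample space.

```python
from itertools import product
from fractions import Fraction

# Equi-probable sample space of two fair dice: 36 elementary outcomes.
sample_space = list(product(range(1, 7), repeat=2))

# P(event) = favourable count / total count.
favourable = [o for o in sample_space if sum(o) == 7]
print(Fraction(len(favourable), len(sample_space)))  # 1/6
```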

Second, even in the case of a general (non-equi-probable) distribution, you can take the set of outcomes themselves and map them to numbers on the real line (or the integers).  Say you have a die with six images on it: you could map those images to six numbers.  In fact, dice normally come with this 1-to-6 mapping additionally etched onto each of the faces.  The move from odds format to ratio-of-unity format that we see in probability theory is crying out for a second number, representing some kind of value, perhaps a fair value, associated with some game or contract or activity.  In other words, now that we've partitioned the sample space into mutually exclusive outcome weights, let's look at finding numerical values associated with the various states.  When it comes to pricing a financial contract which has an element of randomness in it (usually a function of some company's stock price, which serves nicely as such a source), a careful reading of the prospectus of the derived instrument ought to be cashable out as a future value, given any particular level of the stock.
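A minimal sketch of that two-step move, with illustrative numbers: first map outcomes to stock levels with probabilities, then cash out the prospectus as a payoff function and weight it by those probabilities.

```python
from fractions import Fraction

# Step one: outcomes mapped to numbers, here five assumed future stock
# levels with illustrative probabilities summing to one.
levels = [80, 90, 100, 110, 120]
probs = [Fraction(1, 10), Fraction(2, 10), Fraction(4, 10),
         Fraction(2, 10), Fraction(1, 10)]

# Step two: the 'careful reading of the prospectus', cashed out as a
# payoff per stock level -- here a call option struck at 100.
def payoff(s, strike=100):
    return max(s - strike, 0)

fair_value = sum(p * payoff(s) for s, p in zip(levels, probs))
print(fair_value)  # 4, ignoring discounting
```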

I've seen Pascal's wager claimed as the first use of expectation and a founding moment for decision theory.  By the way, it is a poorly constructed wager, since it doesn't present-value the infinite benefit of God's love; that could make a dramatic difference to the choices made.  Anyway, Huygens himself wrote about expectations in his probability book, but for me the warm seat problem (the problem of points) represents an attempt to find a mean future value, starting from now, part-way through a game.  This is an expectation calculation, even though the word may not have been used in this context.
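The problem-of-points calculation itself is a short recursion on expectations.  A sketch, assuming each remaining round is a fair 50/50: with player A needing 2 more wins and player B needing 3, A's fair share of a 64-unit stake comes out at 44, i.e. $\frac{11}{16}$ of the pot.

```python
from fractions import Fraction
from functools import lru_cache

# Game interrupted with A needing `a` more wins and B needing `b`,
# each remaining round an independent fair 50/50.
@lru_cache(maxsize=None)
def p_a_wins(a, b):
    if a == 0:
        return Fraction(1)
    if b == 0:
        return Fraction(0)
    return Fraction(1, 2) * (p_a_wins(a - 1, b) + p_a_wins(a, b - 1))

stake = 64
share_a = p_a_wins(2, 3) * stake
print(share_a, stake - share_a)  # 44 and 20
```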

Tuesday, 12 March 2013

Probability preferences

In order to support my claim that Pascal (and to some extent Fermat) is too highly praised in the history of probability theory, I'd like to set out what I see as important in the constellation of ideas around the birth of the subject.  This is my opinion, based on what I know has happened in probability theory since the time of Cardano, Pascal, Fermat and Huygens.

Concepts of primary importance in probability theory (in the pre-Kolmogorov world of Cardano, Fermat, Pascal):
  1. Event space.
  2. Independence.
  3. Conjunction and disjunction.
  4. Equivalence class.
  5. Parallel/sequential irrelevance of future outcomes.
  6. A relation between historical observed regularities and multiple future possible worlds.
  7. A clear separation between the implementation of the random process(es) and the implementation of the activity, game, contract, etc. which utilises the source of randomness.

Concepts of secondary importance:
  1. Equi-probable event space.
  2. Expectation.
  3. Single versus multiple random sources.
  4. Law of large numbers (though it is of primary importance to the dependent subject of statistics; see the sketch after this list).
  5. i.i.d. (two or more random sources which are independent and identically distributed).
  6. A Bernoulli scheme.
  7. The binomial distribution.
  8. Stirling's approximation for $n$ factorial.
  9. The normal distribution.
  10. Information content of a random device.
  11. Identification of the activity, game, contract, etc., as purely random or additionally strategic.
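As flagged at item 4, here is a small sketch tying a few of these secondary items together, assuming nothing beyond a hypothetical biased coin: i.i.d. flips of a Bernoulli scheme, the law of large numbers pulling the observed frequency towards $p$, and the Shannon information content of the device.

```python
import numpy as np

rng = np.random.default_rng(seed=2)

# i.i.d. flips of a hypothetical biased coin (a Bernoulli scheme, p = 0.3).
p = 0.3
flips = rng.random(100_000) < p

# Law of large numbers: the running frequency settles towards p.
print("frequency after 100 flips:    ", flips[:100].mean())
print("frequency after 100,000 flips:", flips.mean())

# Information content (Shannon entropy) of the device, in bits per flip.
entropy = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
print(f"entropy: {entropy:.3f} bits")  # ~0.881
```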

I'd like to say something about each of these in turn.

Before I do, I'd like to say this: the Greeks didn't fail to develop probability theory because of a preference for theory over experimentation, as Bernstein and also David suggest, but perhaps because probabilities are ratios, and base-ten positional number notation, which makes manipulating such ratios notationally bearable, wasn't invented in India until around the eighth century A.D.  The early renaissance love of experimentation (Bacon, Galileo) may well have assisted in drawing the parallel between the outcome of a scientific experiment and the outcome of a randomisation machine.