## Monday, 28 November 2016

### A feeling for equity factors

## Wednesday, 16 November 2016

### Two interpretations of the Security Market Line

The chart documents some relationships for a particular stock or portfolio. It is interesting that for the purposes of the SML it doesn't matter whether you're considering a single stock or a portfolio of stocks with a set of weightings.

The Y axis shows the return which the CAPM, at any given moment, expects a portfolio to deliver. The X axis shows the beta of that portfolio. So each <x,y> co-ordinate represents a set of portfolios whose members all share the same expected return and the same beta. The set of portfolios behind a single point can be considered infinite, and of course there are infinitely many points on the Cartesian plane.

Only the set of portfolios which give the best return-over-risk profile lie on a single upwardly sloping line, referred to as the SML. Under the CAPM these on-line portfolios are expected to return the risk free rate plus beta times the market risk premium.

The slope of the line equals the market risk premium: the return to be expected from the market now in excess of a (fairly) risk free rate. This slope is also the Treynor ratio of the market portfolio (its excess return divided by its beta of 1).
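The line itself is just the CAPM pricing relation E[R] = R_f + β(E[R_m] − R_f). A minimal sketch, with purely illustrative rate numbers:

```python
# Security Market Line under CAPM: E[R] = R_f + beta * (E[R_m] - R_f).
# The slope (E[R_m] - R_f) is the market risk premium.
# The 2% and 8% figures are illustrative, not estimates.

def sml_expected_return(beta, risk_free=0.02, market_return=0.08):
    """Expected return of any portfolio with the given beta."""
    return risk_free + beta * (market_return - risk_free)

print(sml_expected_return(0.0))  # ~ 0.02, the risk free rate
print(sml_expected_return(1.0))  # ~ 0.08, the market return
print(sml_expected_return(2.0))  # ~ 0.14, a 2x-levered market exposure
```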

Passive index trackers have as their job the task of residing at the point <1, E[R_m]> in as cheap a way as possible. (At a beta of 1 the CAPM expected return is simply the market return; the risk free rate is already inside it.) That is to say, there are many portfolios which have a beta of (about) 1.0 and an expected return equal to the market's. Passive fund managers try to implement being at this point in as cost-effective a way as possible. Passive fund managers of leveraged offerings try to do the same thing but at betas of 2.0, 3.0, 0.5 and so on.

CAPM tells you there's no point being off the line: there you're taking diversifiable risk, and hence shouldn't be getting paid for it. You only get paid for non-diversifiable risk, that is, risk which is correlated with the market.

Active portfolio management believes that some portfolios exist above and below the SML and can be exploited to make returns greater than the market.

Deciding which value of x you'd like is not something the model can help you with. That represents an exogenous 'risk appetite' choice. Once you've made that choice the SML tells you, assuming it is based on a well-functioning and well-calibrated CAPM, how much you can expect to make.

Let's imagine you have a normal risk appetite and set x=1. There are many ways of constructing a portfolio which delivers that expected return, but the one where you're fully invested in the market portfolio is a natural choice. You could be fully invested in any number of other portfolios which do the same. Or you could under-use your notional capital, keeping part in cash and holding the rest at market weights; or you could borrow money and over-gear your unit of capital to raise the beta above 1.

That is, by using funding leverage, you can travel with a fixed market portfolio up and down the x-axis, in theory to any value of x. Of course you can't get infinite leverage in practice, but the theory assumes you can.
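In other words, a single blend parameter w (the fraction of capital in the market, with 1 − w lent or borrowed at the risk free rate) traces out the whole line. A sketch, with illustrative rates:

```python
# Travelling along the SML with funding leverage: put weight w in the market
# portfolio and (1 - w) in the risk free asset (w > 1 means borrowing).
# Portfolio beta equals w, and the expected return stays on the line.
# Rates are illustrative.

def levered_position(w, risk_free=0.02, market_return=0.08):
    beta = w  # market beta is 1.0, cash beta is 0.0
    expected_return = (1 - w) * risk_free + w * market_return
    return beta, expected_return

for w in (0.5, 1.0, 2.0):  # under-invested, fully invested, 2x geared
    beta, er = levered_position(w)
    print(f"w={w}: beta={beta}, E[R]={er:.3f}")
```

Note that (1 − w)·R_f + w·E[R_m] rearranges to R_f + w·(E[R_m] − R_f), i.e. the blend sits exactly on the SML at x = w.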

You can achieve the same by using leveraged products: equity options or equity futures, for example. These narrow your time horizon (theta decay, expiries), but in theory you don't need to worry about that.

If you instead try to stay fully invested and tilt the beta by over-weighting high beta stocks relative to the market, you will indeed see your Y value increase, but you will also be taking risk which is diversifiable, and so risk you are not getting paid for. The same level of return can be achieved more efficiently with the market portfolio and some form of leverage, and on this basis the leveraged route is theoretically to be preferred.

In practice there are costs associated with gaining any leverage to achieve a desired return. Perhaps a better model is a CAPM with funding costs baked in.

Also you won't usually see an SML drawn with negative x values, though there's no reason why not. In certain circumstances you may be seeking a portfolio which returns less than the risk free rate (perhaps even negative returns). In that case you'd see the expected return fall below the risk free rate as your beta goes negative.

A question arises in my head: long term, which is the best value of x to sit at, together with a market portfolio, if your goal is to maximise expected excess returns across all time periods and business cycles? I think this is a permutation of asking whether there's a way of forecasting the slope of the SML (the equity risk premium). If you could, then you could move to an x>1 construction when your model calls an increasing equity risk premium, and likewise to an x<1 construction when it calls a decreasing one.

What if my equity risk premium forecaster were a random process which swept randomly through the 0.9-1.1 range? Long term, would this not be equivalent to a steady 1.0? Could the algorithm have a degree of mean reversion baked in? That is to say, if a long term random peppering of the 0.9-1.1 space delivers the same result as a steady 1.0, then once your active algorithm has placed you at 0.9 for a while, might it increase the hit rate in the >1.0 space?
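The first half of that question can be checked by simulation: if the random beta is chosen independently of the realised market excess return, then E[β·x] = E[β]·E[x] = 1.0·E[x], so the sweep is equivalent to a steady 1.0 in expectation. A sketch with made-up return statistics:

```python
import random

# Compare a beta swept uniformly at random through [0.9, 1.1] (independent
# of the market) against a steady beta of 1.0. Parameters are illustrative.

random.seed(42)
periods = 200_000
premium, vol = 0.0005, 0.01  # per-period market excess return: mean and std

total_swept, total_fixed = 0.0, 0.0
for _ in range(periods):
    x = random.gauss(premium, vol)  # market excess return this period
    b = random.uniform(0.9, 1.1)    # the 'random forecaster' beta
    total_swept += b * x
    total_fixed += 1.0 * x

# The two average excess returns come out almost identical.
print(total_swept / periods, total_fixed / periods)
```

Mean reversion only helps if the beta choice becomes correlated with the future premium; an independent random sweep buys nothing but extra variance.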

So the SML is an SML for today, and the slope of that line may steepen or flatten through time, probably within a very tight range.

Calculating and predicting the equity risk premium seems perhaps an even more valuable thing to do than active equity factor portfolio modelling.

## Tuesday, 15 November 2016

### Scissors, a reference portfolio and a clear correlation divide

The Moves Like Jagger model (MLJ) is a kind of look inside a behaviour. The look inside in effect chops the behaviour in two. It acts like a pair of scissors. All it needs is a reference behaviour. The scissors then chop any to-be-analysed behaviour into two pieces: one perfectly correlated with the reference behaviour, the other perfectly uncorrelated with it.
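The scissors are just a covariance-based regression: the correlated piece is β times the reference, and the residual is uncorrelated with the reference by construction. A sketch on synthetic data:

```python
import random

# Chop a target return series into a piece perfectly correlated with a
# reference series (beta * reference) and an uncorrelated residual.
# The series are synthetic, generated with a known true beta of 1.5.

random.seed(0)
n = 10_000
ref = [random.gauss(0, 1) for _ in range(n)]
target = [1.5 * r + random.gauss(0, 0.5) for r in ref]

mean_ref = sum(ref) / n
mean_tgt = sum(target) / n
cov = sum((r - mean_ref) * (t - mean_tgt) for r, t in zip(ref, target)) / n
var = sum((r - mean_ref) ** 2 for r in ref) / n
beta = cov / var  # the angle of the scissor blades, so to speak

residual = [t - beta * r for r, t in zip(ref, target)]

# By construction the residual's covariance with the reference is zero
# (up to floating point noise): the two pieces are fully separated.
res_cov = sum((r - mean_ref) * e for r, e in zip(ref, residual)) / n
print(beta, res_cov)
```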

All that the capital asset pricing model (CAPM) adds is a statement that the behaviour of the reference is worthwhile, indeed the ideal behaviour. CAPM in effect adds morality to the scissors. It claims that you can't act better than the reference behaviour. A consequence of this is that the uncorrelated behaviour is in some sense wrong, sinful if you like. Why would you do it if the ideal behaviour is to be strived for? By making the ideal behaviour a target, you start then to see the uncorrelated behaviour as distorting, wrong, avoidable, residual. So the language of leftovers or residua enters.

When we finally get to the equity factors active management approach, a space again opens up to re-analyse the so-called residua into a superlative component and a random component. The superlative component is behaviour which is actually better than the reference behaviour (it is called alpha). Finally, after this better behaviour is separated out, it is claimed that the remainder is once again residua.

The scissors operation of covariance is the tool. CAPM is the use of the tool in a context of some assumptions around the perfection of the reference behaviour. Post-CAPM/equity factors is the use of the scissors in the context of some assumptions around the possibility of exceeding the quality of the reference behaviour.

One aspect of CAPM I have not spent much time on is the element of risk appetite. Let's pretend that the only asset available to you is the Vanguard market ETF and that you have 1 unit of capital which you've allocated to investing. No sector choices are possible. No single name choices are possible. Are you limited to receiving on average just the market return? No, because there's one investment decision you need to make which is prior to the CAPM, namely how much of your investment unit you'd like to keep in cash and what fraction you'd like to invest in the market.

The way you go about that decision is an asset allocation decision, and is a function of your appetite for risk, which is said to exist prior to the CAPM reasoning. If you have no appetite for risk you invest precisely 0% of your unit of capital in the market portfolio (and receive in return the risk free rate). In theory, you could invest 100% of your unit of capital and receive the market return. Indeed, in theory you can invest more than 100% of your unit of capital, through borrowing (funding leverage) or through the selection of assets with built-in leverage (asset leverage). Through either of these techniques, or any combination of both, you can get a return which is an amplified version of the market return.

With amplified returns though, or gearing, you run the risk of experiencing gambler's ruin early in the investing game. Gambler's ruin traditionally happens when the capital reduces to 0. Its probability can be estimated. With any kind of amplified return, there's a point before 0 where your broker, through margin calls, will effectively bring the investing game to a halt.
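The margin-call version of gambler's ruin is easy to estimate by simulation. The sketch below is a toy model under strong assumptions (i.i.d. normal market returns, funding costs ignored, a single fixed margin threshold); all parameters are illustrative:

```python
import random

# Probability that a levered market position hits a margin-call threshold
# (the broker halts the game well before wealth reaches 0) at some point
# during the horizon. Toy model: i.i.d. normal returns, no funding costs.

def ruin_probability(leverage, threshold=0.5, periods=250, trials=5_000,
                     mu=0.0003, sigma=0.01, seed=1):
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        wealth = 1.0
        for _ in range(periods):
            wealth *= 1 + leverage * rng.gauss(mu, sigma)
            if wealth <= threshold:
                ruined += 1
                break
    return ruined / trials

# More leverage means a markedly higher chance of being stopped out early.
print(ruin_probability(1.0), ruin_probability(3.0))
```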

The process by which you decide what your degree of investment and amplification in the market is going to be is an asset allocation decision: you're after all investing your unit either in the market portfolio or in the risk free asset. This decision can be made once, forever. What is the single static best allocation of cash between the risk free asset and a correspondingly over- or under-invested market portfolio? Or this decision can be time sensitive, i.e. your decision can move with time.

Insofar as the investment community does this in a more or less correlated way, it creates waves of risk-on and risk-off patterns in markets.

Making a single fixed static allocation decision is a bit like a surfer who bobs up and down in a stationary way as waves arrive at the shore. Being dynamic about it is like that same surfer standing up at some point and trying to let a wave take him to shore on a rewarding ride. The CAPM in a sense tells you nothing about which of these two approaches is best for long term returns.

## Monday, 14 November 2016

### Moves like Jagger

## Friday, 4 November 2016

### Which sector to begin

**Energy**

**Materials**

**Industrials**

**Consumer discretionary**

**Consumer staples**

**Healthcare**

**Financials**

**Information technology**

**Telecommunication services**

**Utilities**

**Real estate**

**Conclusion**

## Thursday, 3 November 2016

### Factor multi-temporality and the layer model is born

Another element of the approach I read about when I'm reading around equity factors which I don't like much is the implicit attempt to impose a single time granularity across all factors.

In their steady march to generalisation, modellers have arrived at fundamental and/or economic factor models which all sit on underlying observation data arriving at the same frequency, often end of day. In the same way as they assume or manufacture more-or-less homogeneous equity atoms with a singular universal semantics, so too do they expect their operation to work at a single speed. But some equity factors could be slow moving and some much more rapidly moving. The hope is that steady time evolution, plus a rolling window of history over which parameter updates are performed, will provide enough dynamism to let fast moving and in-play factors come to the fore. I'm thinking here, on the fundamental factor side, of momentum based factors, and on the economic factor side, of major economic announcements. Major economic announcements, scheduled and unscheduled, can move some assets, indeed some stocks, more than others.

If I had additional buckets to push stocks into, this would open up the possibility of various factor models being optimised to different speeds of the market.

A factor, after all, in its most general sense, is just a series of observables which may change the expected return of a stock. I am all for making factors be as widely scoped as possible. If a factor exists out there which isn't commonly recognised as such but which is predictive on the expected return of a stock, then it ought to be considered. Time does not move homogeneously for all factors, nor does a factor have to look like a factor to the factor community.

As well as looking at sector ETFs for a leg up into the world of factor modelling, I'll also be looking at multi-factor ETFs to see what's being implemented out there, as a way of getting straight to the kinds of factors the community keeps coming back to. I expect to see some momentum factors in there, and some value/steady growth ones. I expect a preponderance of fundamental factors too, i.e. ones based on the idea that equity factor modelling is at heart the replacement of equity analysts. But non-fundamental factors will be the heart of my approach.

This regime switching idea of mine can be described as follows. For certain periods in the evolving life of a publicly listed company, the dominant factor is the beta to the market. But at other times the regime switches and the dominant attractor factor is in-play M&A; or the stock being shorted in the lead-up to and launch of a new convertible issue; or the stock being bought up by virtue of its inclusion in (or exclusion from) a major market index; or the stock entering the week before an FOMC announcement while being particularly interest rate sensitive; or earnings season; or the period around a dividend ex date.
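The switching rule I have in mind could look something like the sketch below, where a stock sits in a default beta regime and only a sufficiently strong trigger promotes an episodic factor set. The regime names, signal scores and threshold are all hypothetical placeholders:

```python
# Default-to-beta regime selection with a deliberately high hurdle for
# episodic regimes (M&A in play, convertible issuance, index inclusion,
# pre-FOMC rate sensitivity, ...). Names and numbers are hypothetical.

EPISODIC_TRIGGER_THRESHOLD = 0.8

def active_regime(signals):
    """signals maps an episodic regime name to a trigger strength in [0, 1].
    Returns the dominant regime, defaulting to plain market beta."""
    if not signals:
        return "beta"
    regime, strength = max(signals.items(), key=lambda kv: kv[1])
    return regime if strength >= EPISODIC_TRIGGER_THRESHOLD else "beta"

print(active_regime({"ma_in_play": 0.3, "index_inclusion": 0.1}))  # beta
print(active_regime({"ma_in_play": 0.92}))                         # ma_in_play
```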

In keeping with my humility approach, I set the hurdle high for regimes other than the beta regime - since that's the least damaging position to adopt.

Of course driving this all would be an asset allocation model, which again defaulted in moments of ignorance to the set of parameters which are generally considered a good mix. This would give your stock allocation some fixed amount within which to play.

The sector/geography/ETF context would be the main habitat of a stock, and it only gets stolen away by other pseudo-sectors for a set of specific reasons. To repeat, the alternative is to include all the relevant factors on an equal footing and to let the rolling window of calibrating market parameters drive weight to the in-play factors. I think this is going to expose the model to sparse data problems, but in some primitive sense the two approaches can be considered compatible. In the always-on approach you get the benefit of a single driving equation but a weakened use of the limited data beneath it.

It is my view that one must be prepared to keep a really good model on ice, potentially for years, until the triggering signal is strong enough to activate it. I shall refer to these as episodic factor sets. Having them 'always on' and bleeding so-called explanatory power to them each day seems wrong to me.

So my model shapes up as follows: there's an asset allocation driving layer. Within that, for each asset, there's a layer which sets a target long/short ratio. (These two together represent your setting of the leverage.) When you have your asset size and long/short ratio, you set about finding the set of stocks in your sectors/regimes which ought to contribute positively to alpha. Finally, especially for short term economic or fundamental factors, your way of expressing your equity exposure can be long or short the stock itself, but also more or less complex options strategies.
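Those layers compose naturally as a pipeline. A sketch in which every name and number is a hypothetical placeholder, not a real allocation:

```python
# Layer model sketch: (1) asset allocation sizes the equity sleeve,
# (2) a gross-leverage / long-short setting fixes total exposure,
# (3) stock selection distributes that exposure across signed picks.
# All names and numbers below are hypothetical placeholders.

def build_equity_book(capital, equity_weight, gross_leverage, picks):
    """picks maps stock -> signed selection weight (longs > 0, shorts < 0)."""
    equity_capital = capital * equity_weight      # layer 1: asset allocation
    gross = equity_capital * gross_leverage       # layer 2: leverage setting
    total = sum(abs(w) for w in picks.values())
    return {s: gross * w / total for s, w in picks.items()}  # layer 3

book = build_equity_book(
    capital=1_000_000, equity_weight=0.6, gross_leverage=1.5,
    picks={"AAA": 0.5, "BBB": 0.3, "CCC": -0.2},
)
print(book)
```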

## Wednesday, 2 November 2016

### What does 'own the market' actually mean?

**factor polysemy** issue.

## Tuesday, 1 November 2016

### Equity Factors - a humble beginning

**entropic fate of specific factors**. Too young, they appear to be noise to the market, too old, they are arbitraged away.

**circadian rhythm of factor sets**.

**ecological adaptation** of your factor model.