I continue my Tuesday reading day with Endgame. There follows a brief summary of Chapter 5 of the Endgame book. "What they said; Bang!" That's about the height of Mauldin and Tepper's value added in this chapter. They merely parrot Reinhart and Rogoff, in slightly more detail. We're part way through an eight-year de-leveraging process. If this started in 2008, then perhaps we're already at the stage where real GDP begins to quicken even as the de-leveraging continues for a further five years. A key retrospectively clear signal was a housing boom accompanied by a sharp rise in debt. The distal cause is human psychology - greed and fear, as per Keynes's animal spirits. In short, irrationality, together with institutional failings (the Fed's excessive efficient-markets perspective, which denies the spotting and pricking of bubbles; housing-sector policy changes; foreign inflows from China as its workforce came on-line post-communism).
The R&R ceiling kicks in somewhere between a 90% and 100% debt/GDP ratio, at which point the bond markets bolt in an instant, on some indeterminate, unknowable triggering event. These debt/GDP ratios move like economic super-tankers - very slowly.
Each developing nation faces a game-theoretic timing issue on when best to deal with its de-leveraging burden. I'd say this chapter was slight.
Just as statistics is really an elaborate form of a particular kind of probability activity over sufficiently large numbers, so too am I beginning to see the volatility of equity derivatives and the credit spread of the fixed income world as two other distinct mathematical contexts in which probability theory gets applied. And of course, probability is useful to us insofar as it can place a number on something ultimately unknowable - albeit a known unknown ('risk', or measurable uncertainty, in the Knightian sense). My main point here is that credit spread and volatility, the two great inputs into fixed income and volatility modelling respectively, are brothers.
In my head, the analytics behind fixed income have always seemed a lot more certain than the uncertainty expressed in the volatility world - equity derivatives, convertibles, exotics, etc. This distinction ran top to bottom, since fixed income depended on more established mathematics, largely traditional calculus and algebra, whereas the newer derivatives mathematics rests on stochastic calculus, only developed in the 1950s. On top of that, fixed income operated in a world of coupon cash flows, with only the chance of default as the ultimate deus ex machina lurking in a corner somewhere. The pay-off diagrams were similarly predictable and had fewer dimensions of risk.
In reality the distinction is less clear. Indeed, by dealing with the phenomena of fixed income in a mathematically straightforward way, there's an argument that it tends to deceive practitioners into believing their world is a lot less uncertain than it actually is. There's no danger of that with the world of so-called volatility products (equity options, exotics, volatility and variance swaps).
But in both cases the quantitative analyst undertakes to model a number of uncertainties via concrete finite numbers which encode some element of uncertainty about the world, and nowhere is this more important to realise than in the idea of the credit spread.
Before elaborating, I'll generalise: what we're doing is taking a part of the world and, somewhat like scientists, trying to model it with concrete numbers. Those concrete numbers, in the context of financial contracts, represent a theory about how the life of a financial contract will play out. Now, clearly there are so many dimensions of uncertainty around two or more parties engaging in a financial contract that it ought always to be borne in mind just how many things can go wrong. In short: the sky could fall down on your head, the carpet could be pulled from under your feet, your eyes could be deceiving you, your counterparty might start playing a different game to the one you started playing, or you could discover you made a tactical or strategic error of game-play. This is as useful a broad classification as I've seen. I will refer to them as errors of the sky, carpet, eye, rule-set and game-play. The classification is of course arbitrary, and the probabilities associated with each vary from country to country and from time to time.
Examples of sky surprises. Hyper-inflation blows out all the expectations you had, when the contract was initiated, as to the value of future cash flows. A major political revolution transforms the meaning of ownership.
Examples of carpet changes. Dramatic legislative changes in property or tax law undermine implicit assumptions which went into the analysis that led you to sign the financial contract. Governments introduce new policy which re-contextualises the value of your existing base of contracts.
Examples of eye errors. The market data you relied on turns out to have been stale or plain wrong; you mis-read a key term of the contract. Examples of rule-set changes. Liquidity in your contract dries up, the government manipulates the calculation methodology of a key price index, a run of scandals leads to certain classes of contract being structurally re-priced.
Examples of game-play changes. You realise your estimate of the likelihood of a future event is dramatically wrong. An M&A action causes little-read and little-understood legal provisions to be interpreted in a dramatically unfavourable light. Corporate so-called agency issues result in unexpected corporate behaviour. Events which had been uncorrelated at the contract's inception become inextricably linked. Your institution changes the magnitude of the risk premium attached to its ownership of your financial contract.
In short, the world is a complex and unpredictable place. It ought not to be a great surprise to learn that a financial model of the contract, together with the parameters which act as inputs to the model, will at any point in the life of the contract be more or less accurate - more or less close to reality.
Of course, this knowledge is what causes model builders to try to tie their models to the always-moving markets from which elements of the state of the world can be approximated from the values of certain market prices. In other words, this knowledge of the scale of the uncertainty compels model builders to make them real-time.
There's a philosophical question lurking here. How do we evaluate a model's usefulness during the life of a contract? Is there a correspondence to reality, with the degree of closeness in proportion to the model's usefulness? Or are we not justified in speaking of such a correspondence view of science (and financial engineering) at all? Operationally, practitioners behave as if they're acting under just such a correspondence perspective, in which the name of their game is to get the model which makes the closest statement to reality.
Leaving that aside for now, the more prosaic question concerns how to articulate a model such that it contains a bunch of parameters which can, in theory at least, be tied to a moving set of market variables implied by various market prices. But how can you turn uncertainty about the always-unfolding future into concrete numbers? Well, you just can. Probabilities do it, standard deviations (volatility) do it, and credit spread does it too.
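To make that concrete: here's a minimal sketch, in Python, of one such translation - a credit spread turned into a default probability. It assumes a flat hazard rate and the rough 'credit triangle' relation (hazard ≈ spread / (1 − recovery)); the numbers are made up.

```python
import math

def implied_default_probability(spread_bps, recovery_rate, horizon_years):
    """Rough default probability implied by a flat credit spread.

    Uses the 'credit triangle' approximation: hazard = spread / (1 - recovery).
    A back-of-envelope sketch, not a calibrated model.
    """
    spread = spread_bps / 10_000.0           # basis points -> decimal
    hazard = spread / (1.0 - recovery_rate)  # flat hazard rate, per year
    survival = math.exp(-hazard * horizon_years)
    return 1.0 - survival

# A 250bp spread with 40% assumed recovery, over 5 years:
print(implied_default_probability(250, 0.40, 5))  # ~0.19
```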
There's a famous, famously lousy betting strategy referred to as the Martingale. The simplest characterisation is as follows. You're faced with a betting game based on tossing a coin. You place a bet of any size you care on the outcome of the toss. If you're right, you get your original stake back, doubled. If you lose, you lose your stake. You'll also see it referred to as 'doubling down' in the context of trading - a looser variant where you raise your bet size as the market moves continually against you.
With the Martingale algorithm assisting you with your bet size, then if you are infinitely wealthy and are prepared to toss the coin infinitely often, you can construct a winning strategy. Interestingly, the strategy has nothing whatsoever to do with actually predicting the outcome of the coin toss. It is all about how big a bet you place on each of the sequence of coin tossing games you participate in. You bet an initial stake on the first toss. If you win, you're richer by the initial stake. If you lose, you play the game again, doubling the bet size. If you then win, you get back your second stake, plus the same again. That extra second-iteration stake fully makes up for the loss of the initial stake, and you're left with an initial stake's worth of profit. Repeat and become rich beyond your wildest dreams.
In practice, the gambling institution (or your own finite wealth) will impose bet size limits, which dramatically increases the chances of gambler's ruin in a short losing streak.
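To see how brutally those limits bite, here's a toy simulation; the bankroll, base stake and table limit are made-up numbers, and the coin is fair.

```python
import random

def martingale_session(bankroll=1_000.0, base_stake=1.0, max_stake=100.0, rounds=10_000):
    """Play a fair coin with martingale sizing until ruin or the rounds run out."""
    stake = base_stake
    for _ in range(rounds):
        bet = min(stake, bankroll)       # you can never bet more than you have
        if bet <= 0:
            break                        # ruined
        if random.random() < 0.5:        # win: collect the bet, reset the sizing
            bankroll += bet
            stake = base_stake
        else:                            # lose: double up, capped by the table limit
            bankroll -= bet
            stake = min(stake * 2, max_stake)
    return bankroll

sessions = [martingale_session() for _ in range(1_000)]
print(sum(1 for s in sessions if s <= 0) / len(sessions), "fraction ruined")
```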
Lying under it is the gambler's fallacy - the belief that in an evens game, wins and losses even out. That is, people dramatically underestimate the likelihood of a long string of losses. As you burn exponentially through your cash pile, each independent evens bet doesn't care where it sits in the history of your losses or wins. Each future toss is just a win or a loss.
In the context of trading, the poorest reasonable assumption to make about you is that you are no better than a random process when calling a binary market outcome. However, this won't always be true, since markets do exhibit runs and reversals.
The doubling-down strategy might have its origin in that part of Kahneman's prospect theory which says that after a big loss you're more likely to roll the dice to break even than to take the rational path, which is to reduce your risk sizing and wait for the market to pick up again. Your bets get exponentially bigger in order to 'pay' for the thrill of reversing your luck in the very next bet. But in practice, with real trading, you might be happy recovering your losses over a number of bets.
Surely there are circumstances when a trading strategy somewhat like the Martingale one makes sense? First of all, assume you're putting on a trade with no better than evens odds, as per the original Martingale set-up. Now suppose you had 1,000,000 currency units to allocate to this bet. You could just put the lot on at once. But you don't know the future, and you can foresee a couple of trading periods where the investment moves against you. Why not put on 100 units initially? If the market moves against you, continue to buy into the position in 100-unit chunks, assuming nothing changes in your trading thesis. As long as the thesis looks OK, you're getting to buy in at a price even better than at the start, and you were happy to buy in then. You wouldn't need to increase the bet exponentially, since you don't need to pay for the thrill of a dramatic single jump to profit in one trading period. You'd be able to survive a much longer bad run, each time buying in at a lower average price. Again, assuming nothing changed in your trading thesis, you could at worst withstand 10,000 trading periods going against you before exhausting your overall trade limit of 1,000,000 units - as in the sketch below.
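A toy version of that conservative averaging-in, with the 100-unit chunks and 1,000,000 limit from above; the price is a made-up random walk and the thesis is assumed unchanged throughout.

```python
import random

def average_in(limit=1_000_000.0, chunk=100.0, max_periods=10_000):
    """Buy fixed-size chunks whenever the price sits at or below average cost.

    Constant sizing means a bad run burns through the limit linearly,
    not exponentially as the martingale does.
    """
    price, spent, units = 100.0, 0.0, 0.0
    for _ in range(max_periods):
        if spent + chunk > limit:
            break                                    # overall trade limit exhausted
        if units == 0 or price <= spent / units:     # at or below average cost: add
            spent += chunk
            units += chunk / price
        price *= 1.0 + random.choice([-0.01, 0.01])  # toy 1% move per period
    return units * price - spent                     # mark-to-market p&l

print(average_in())
```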
The above scenario concentrated on a use case which exhibited extreme downside behaviour against you and had a distinct Martingale-like feel to it, but it certainly doesn't strike me as a prospect-theory-like bias.
So if an alien came down and examined the set of trades on a market and saw Martingale-like trading patterns, they couldn't really say whether this was down to a prospect theory bias or to a healthier, conservative opening strategy.
If this is the case, then any time you come across a trading book which dismisses Martingale-like trade sizing algorithms out of hand as wrong, or as evidence of a prospect-theory kind of bias, you know they're not telling you the full picture. On the flip side, anyone recommending a Martingale-like sizing strategy outright is probably giving you bad advice.
Two more points to make on this. First, the theoretical Martingale is marked theoretical because of house limits or the finite wealth of the player. Another angle on this is to say that the real unspoken criterion here is the minimum bet size (and the frequency of bets). If there were a market where the minimum bet size was a tiny fraction of a cent, and you could trade it hundreds of thousands of times a second, then you'd be moving a lot closer to getting it to work as a winning trade sizing strategy.
Second, this dovetails with a piece of mathematics called the gambler's ruin, often seen in probability textbooks, showing how long you have before a repeated fixed gamble exhausts your initial wealth.
A Martingale is also the name given to the strap which attaches around a horse's neck and to its body, keeping its head in a narrow, forward-looking position. Apparently there was also a French village, Martigues, with famously miserly inhabitants. The analogy lay in the classical bet sizing strategy, under which the expected size of the winnings pot after the next game, given the history so far, is just the current pot, $W_t$.
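In modern notation the property reads

$$\mathbb{E}\left[\,W_{t+1} \mid W_1, \ldots, W_t\,\right] = W_t,$$

that is, the expected size of the pot after the next game, given everything seen so far, is just the current pot.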
Paul Lévy in the 1930s took the word and applied it to one of the two basic properties of randomly generated numbers which result in them being normally distributed; the other was finite variance.
Tuesday evenings these days are economics night for me. At the moment I'm reading a populist macroeconomics book, "Endgame".
I'm on chapter four, which I'll summarise as succinctly as possible here. First, I notice that when I search Amazon for a book called Endgame, this one comes up first, well ahead of Samuel Beckett's great play. Beckett's should come first.
Finally, I'm reading this book, published in 2011, in February 2013, so I get to see how the immediate future panned out for the authors' predictions. On p42 the authors state that they wrote that section in November 2010.
The authors in chapter four propose a macroeconomic thesis: many countries, primarily America, are so indebted that it will be practically impossible to innovate and grow out of their current economic endgame. Phenomena of the endgame include generally slower real GDP growth, more frequent recessions, higher real GDP growth volatility, higher equity and bond market volatility, and higher structural unemployment - particularly among the low-skilled, who have been undercut by a cheaper globalised workforce and who receive compensatory governmental transfer payments which merely extend the pain. Looking back from proximal to distal causes, they draw on elements of Minsky, the so-called post-Keynesian. This is a story of complacency and low volatility leading to credit booms - itself an elaboration of the 'animal spirits' of Keynes. In a way Minsky seems to have taken the idea out of the realm of personal psychology and embedded it in financial institutions. The authors also identify the lengthening of global supply chains as a significant cause of the kinds of global volatility mentioned above. Central bank responses will lead to continued quantitative easings, which in turn make the debt burden worse.
Next they spell out advice to readers as investors: reduce the average holding period of your investments; invest tactically. This isn't too surprising, coming from money managers. Readers as ageing citizens should count on receiving a much-reduced safety net in their retirement.
How have they done by February 2013? Well, we have just had a surprising negative Q4 GDP read for the US, and for the UK and Europe too. But equity market volatility is at five- and six-year lows, as measured by daily S&P 500 realised volatility and the VIX.
Two intriguing charts plot a relationship between the credit cycle and equity volatility. First, they chart the annual change in commercial and industrial loans over time, plotting each point against a date two years into the future - that is, the C&I growth observation for October 2010 gets plotted at October 2012 on the x-axis. On the same diagram they plot the VIX, which exhibits a very strong correlation. They do likewise with the MOVE index, this time shifted three years forward, against the Fed funds rate.
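The mechanics of such a chart are easy to reproduce. Here's a sketch of the shift-then-compare trick using entirely made-up stand-in series (the real inputs would be C&I loan growth and the VIX); the two-year lag is baked into the fake data purely so the demo has something to find.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2000-01-31", periods=180, freq="M")
credit = pd.Series(rng.standard_normal(180).cumsum(), index=idx)  # fake credit-cycle proxy
vol = credit.shift(24) + 0.5 * pd.Series(rng.standard_normal(180), index=idx)  # fake vol series

# The book's trick: push the credit proxy 24 months into the future, then compare.
frame = pd.DataFrame({"credit_led_2y": credit.shift(24), "vol": vol}).dropna()
print(frame["credit_led_2y"].corr(frame["vol"]))  # high here, by construction
```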
Both of these curves exhibit impressive correlation, especially given the seeming predictive power of the two credit-cycle proxies - commercial and industrial loan growth and the Fed funds rate. However, the leading indicators are actually predicting incredibly low volatility for 2013, which is exactly what we've been seeing. This appears to be, in the short term, exactly the opposite of their stated macroeconomic thesis of increased volatility. Certainly the Fed funds rate now is still flat on the floor, leading me to wonder whether volatility is in for several more years of flatness (VIX below 18). In any case, it doesn't support, in the short term, Mauldin and Tepper's claim.
This is my whole experience of the book so far. They look like they're reading a lot of other people's research but not quite connecting some dots which are right there in front of their eyes.
Today ought to be my 'trading research' day, but there's so much in the Kahneman book that I can't stop thinking about it, and I want to say something about what I read there recently. I've been thinking about present value, and how organisations (and the culture generally) dress up various packages of cash flows to solve a number of non-financial constraints.
The constraints I'm thinking of are things like tax planning, since some parts of the tax law in any country are in essence arbitrary, as is the seemingly clear-cut distinction between capital and income upon which a lot of the tax code is based. Second is to suit the cash flow needs of purchasers and providers of these products. Third is to exploit, or get exploited by, the ever-shifting inflationary environment in which timed money packages find themselves. And fourth is a Kahneman-inspired point.
I think the everyday packages of cash flow which underlie, for example, a modern mortgage deal are also partly created to fool potential buyers. This low-level deception goes on at a straightforward level - for example in burying a fraction of a mortgage's cost in an up-front fee so that the headline rate looks more competitive than it really is. That kind of deception is the sort of marketing deception we're all familiar with in our lives. But more sophisticated variants of it are buried in an extended discussion which begins around chapter 31 of the Kahneman book.
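Before getting to Kahneman, here's a number on the up-front fee trick - a sketch with made-up mortgage figures, using the standard level-payment annuity formula and a bisection solve. The 'effective' rate prices the same monthly payments against the money you actually received, net of the fee.

```python
def annuity_payment(principal, annual_rate, years, freq=12):
    """Level payment on a repayment mortgage (standard annuity formula)."""
    r = annual_rate / freq
    n = years * freq
    return principal * r / (1 - (1 + r) ** -n)

def effective_rate(principal, fee, headline_rate, years, freq=12):
    """The rate that equates the headline payments with the net amount lent.

    Burying a fee up front means you only really borrow principal - fee,
    so the true cost of credit exceeds the headline rate. Solved by bisection.
    """
    payment = annuity_payment(principal, headline_rate, years, freq)
    lo, hi = headline_rate, headline_rate + 0.10
    for _ in range(100):
        mid = (lo + hi) / 2
        if annuity_payment(principal - fee, mid, years, freq) < payment:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Made-up example: 200k loan, 2k up-front fee, 3.5% headline, 25 years.
print(effective_rate(200_000, 2_000, 0.035, 25))  # a touch under 3.6%
```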
"Every simple choice formulated in terms of gains and losses can be de-constructed in innumerable ways into a combination of choices, yielding preferences that are likely to be inconsistent"
I was just about to make the same point about interpreting and equating (present valuing) packages of cash flows, with a view to showing how difficult this is to do in your head, and here's Kahneman making a similar point about how our System 1 brain prefers reaching narrow-frame solutions to immediate problems, where what's needed is always a wider-frame approach. In the case of present value, collating all of the cash flows, rates, retrocessions, discount teasers, final adjustments, knock-ins, caps, floors and roll-overs which we as ordinary financial consumers are faced with is an arduous task. It is so easy to make the choice to hand with respect to a narrow frame of reference. Broad-framed decision making requires solving multiple problems in parallel in a way which is broadly optimal, not locally optimal. With Kahneman the problems in question are collections of expectation calculations (typically bets with unquestioned probabilities and cash values which need to be valued jointly, sometimes with the kind of chaining of valuations often found in moderately sophisticated financial models); with me, the problems to be solved are present value valuations, with a view to working out which deal is best.
Surely for any given country the best discount curve isn't too difficult to construct at any moment in time? Surely a high-quality date/rate data-capture web front end wouldn't be too difficult to develop? Why not a free iPhone app which took a lot of the pain away - a beautiful pocket discounter and cash flow valuer? There are hundreds of them, but they're all rather confusing to use, and they force the discount curve onto the user. The app's server should construct this for them, leaving only dates, fees and rates to be entered. Of course, many templates could be provided to assist, and there's nothing stopping the app from pre-populating competitive cash flow offerings out there in the marketplace. Many websites do this, to varying degrees of business success, but none I've found simplifies the user experience enough for it to be maximally useful.
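The computational core of such an app is tiny. A minimal sketch, with a flat 4% curve standing in for the server-built discount curve, a crude year fraction, and made-up cash flows:

```python
from datetime import date
from math import exp

def present_value(cashflows, curve, valuation_date):
    """Discount dated cash flows: cashflows is a list of (date, amount),
    curve maps a year-fraction to a continuously compounded zero rate."""
    pv = 0.0
    for d, amount in cashflows:
        t = (d - valuation_date).days / 365.25  # crude year fraction
        pv += amount * exp(-curve(t) * t)
    return pv

flows = [(date(2014, 2, 15), -1_500),    # made-up up-front fee
         (date(2015, 2, 15), 5_000),     # coupon-like flows
         (date(2016, 2, 15), 5_000),
         (date(2017, 2, 15), 105_000)]   # final coupon plus repayment
print(present_value(flows, lambda t: 0.04, date(2013, 2, 15)))
```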
James Buchanan made some telling points exposing the kinds of structural incentives which drive policy makers in the public sphere to promulgate dramatically sub-optimal policy - a kind of political version of the so-called agency problem in the corporate world. Of course, the financial world occupies a murky in-between realm, being so close to treasury departments the world over - especially when its lobbying power makes it too big to fail. Surely the Buchanan of finance is already probing these warped incentives via his own form of quasi-public-choice theory. And why not in corporations too, beyond the agency problem? Even quasi-Austrians are focussed on the unintended, emergent behaviour that follows from poorly aligned micro incentives in the quasi-public sector.
Sticking to describing packages of cash flows discounted at an appropriate rate, a clothed-naked metaphor springs to my mind. The clothed element is the financial-cultural description of the cash flows: this first one is an up-front fee, this next set is interest plus repayment of capital, that one is a final fee, and so on. The naked element strips away the semantics, leaving just the cash flows, the dates and the discount curve. In a similar vein, the clothed element of rate interpretation contains cultural artefacts around day-count conventions, business day conventions, the handling of holiday dates and weekends, accrued interest calculations, and clean and dirty pricing; the naked element is the continuously compounded, year-fraction equivalent. The operations team, finance team, legal team, trading group and management all live in the clothed realm. The quants, and the technologists who implement the quant algorithms, live in the naked realm.
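As a tiny example of the undressing: a sketch converting a simple-interest Act/360 money-market quote into its naked, continuously compounded, Act/365 year-fraction equivalent (the conventions here are chosen purely for illustration).

```python
from math import log

def naked_rate(simple_rate, days, quote_basis=360.0, year_basis=365.0):
    """Strip the day-count clothing from a simple-interest money-market quote.

    Growth factor over the period: 1 + simple_rate * days / quote_basis.
    Re-expressed as a continuously compounded rate over an Act/365 year fraction.
    """
    growth = 1.0 + simple_rate * days / quote_basis
    t = days / year_basis
    return log(growth) / t

# A 4% Act/360 quote for 182 days, undressed:
print(naked_rate(0.04, 182))  # ~0.0401
```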
The phenomena of this kind of nakedness are cleaner and fewer, if harder for the uninitiated to understand. And the process of dressing and undressing, while prone to many errors and omissions, is a technique it pays to do well.
It is nice to have some present value experience, or rules of thumb. Given that one of the long term goals of this blog, and of me personally, is to become very experienced in pricing convertible bonds, then it is useful to know what the bounds on a bond's present value tend to be.
Here's a rough and ready view. Convertibles last 5 years. Interest rates are usually in the 4% ($r=0.04$) ballpark. Like bonds, they are usually quoted with their price (and some other analytics) in par format, which for the present purposes is like pretending that they always have a face value $F$ of 100 units of the convertible's currency.
If you pretended that a security was nothing other than a single cash flow at the end of the five years ($t=5$), then the present value would be $100e^{-rt}$, or approximately 82. This number is usually referred to as the investment value or the bond floor of the convertible, since it is, in general, the present value of the fixed income side of the instrument - that is to say, ignoring volatility and optionality components.
Many convertibles have get-out clauses, both for the investor and for the issuer. The net result is that, under favourable market conditions, the convertible might only last three years. Again at 4%, this gives a bond floor of about 89.
When you've only a year left, the bond floor drifts up to 96, on its way to par, which in this theoretical and overly simple case is the final repayment price. The closer in time you get to expiry, the closer the simple bond floor goes toward 100.
If the prevailing discount rate is much lower, say 1%, then you'd get a five-year present value of 95. In summary, the five-year bond present values for interest rates of 1, 2, 3, 4, 5, 6 and 7 percent are, respectively, 95, 90, 86, 82, 78, 74 and 70. The same range of rates applied to a single cash flow only three years hence, where there isn't so much of a compounding effect, produces these bond floors: 97, 94, 91, 89, 86, 84 and 81 - all deviating less from par than the five-year instrument.
Below is a somewhat prettier table showing this, with the discount rate on the left and years running along the top:

        1y    2y    3y    4y    5y
 1%     99    98    97    96    95
 2%     98    96    94    92    90
 3%     97    94    91    89    86
 4%     96    92    89    85    82
 5%     95    90    86    82    78
 6%     94    89    84    79    74
 7%     93    87    81    76    70

There's symmetry in here, since all iso-values of $-rt$ give the same discount factor. This is, of course, just a visualisation of how the natural exponential $e^x$ plays out when $x$ is made up of two factors, $-r$ and $t$.
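The whole table is one loop over the natural exponential; a minimal sketch to reproduce it:

```python
from math import exp

rates = [0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.07]
years = [1, 2, 3, 4, 5]

print("      " + "".join(f"{t:>6}y" for t in years))  # header row
for r in rates:
    cells = "".join(f"{100 * exp(-r * t):7.0f}" for t in years)
    print(f"{r:5.0%} {cells}")
```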
While I'm at it, taking representative discount rates of 1%, 4% and 7%, how many years before the present value of 100 drops by half? About 69, 17 and 10 years respectively. Likewise for a drop to one hundredth of its value (that is, before the present value of 100 becomes 1)? Roughly 460, 115 and 66.
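Both rules of thumb drop straight out of the exponential:

$$100\,e^{-rt} = 50 \;\Rightarrow\; t_{1/2} = \frac{\ln 2}{r} \approx \frac{0.69}{r}, \qquad 100\,e^{-rt} = 1 \;\Rightarrow\; t_{1/100} = \frac{\ln 100}{r} \approx \frac{4.6}{r}.$$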
Adding intervening cash-flows by way of interest payments is more of the same, just an extra wrinkle.
If a nation state wanted to get rid of half its public debt in a decade, one way to achieve it is a nominal discount rate of 7%, since $e^{-0.07 \times 10} \approx 0.5$. If we call that 7% 2% real and 5% inflation, then a central bank running 5% inflation for a decade erodes around 40% of the debt's real value ($e^{-0.05 \times 10} \approx 0.61$); at 7% inflation it would be the full half. If a nation's debt holders are all domestic, then you've effected a transfer tax from the average lender to the average debtor, a kind of Jubilee. A foreigner looking at your country might demand more of his currency now for your currency as a result of this worry. If the debt holders are largely foreign, you are imposing the cost on them, which will have its own macro-economic consequences.
Wednesdays are my science and philosophy evenings. And I'm still on the brilliant thought-provoking Kahneman book. I want to think a bit about how you might apply some of his identified biases to the realm of trading.
The most important and general, to my mind, is the phenomenon of anchoring. In trading practice, a trader usually finds himself with an implicit or explicit profit target to reach on a monthly basis. He is also subject to monthly (and other) stop-loss limits which, if breached, trigger a formal review of his book by management and risk teams, with possible further sanctions: having to sell out of some positions, stopping trading, or in extremis, leaving the company. There are also more industry-wide targets he's expected to aim for, like a benchmark index.
All of this anchors and contextualises what they actually do.
How else could it be? Some traders know their own career average return and draw-down. But one of the benefits of being a trader is the partial clean slate you get when moving from one employer to another. A soft reset occurs on their trading record, unless of course the trader in question has a career record he can crow about, in which case he'll go to a special effort to maintain the career numbers. I would guess many traders have exploited the ability to wash away their previous return datasets.
These are what I'd call paradigmatic anchoring effects. There are also syntagmatic ones. Imagine you have a multiple-security strategy in play - for example, you are long a convertible, short some stock, and long CDS protection. In an important sense this is a singular strategy, and you would think the trader should look at the strategy p&l atomically too. But in reality, the trader will be in and out of the individual holdings regularly, and perhaps develops anchors on those holdings in isolation. Anchors at the level of the individual holding can be sub-optimal. Or to take an even simpler strategy, imagine you're long the stock, and your net quantity changes dramatically through the year as you buy and sell. Each sell or buy can carry adverse micro-anchors, relating to the p&l of the holding rather than the overall strategy.
Imagine you thought this year being long gold is a good strategy. Most likely there would be many points during the year where you'd be buying, and then at other times selling gold. At each sell, there would be a profit or a loss. If it was a loss, then you might re-anchor it against an overall profitable net performance for the long gold idea. Conversely, if you're overall wrong on gold, then this might dominate even though your most recent sell was for a profit. Being net losing on gold, according to prospect theory, would tip you into that quadrant where a person risks more than he would normally. Perhaps this is the advantage of the monthly target and stop loss - it encourages you to bank your profits metaphorically and to partly wipe the slate clean on a loss, so that you are carrying less anchoring baggage with you going into the next month's trading.
I can see how the 'doubling down' practice is grounded in prospect theory. A loss causes you to increase risk, since you concentrate on the small likelihood of 'winning back' your losing position. Perhaps you increase your bet size as you do it - a variant of the classical martingale trading strategy.
In Kahneman's language, whilst a rational agent will prefer a broad frame in judging simultaneous decisions, real humans sometimes perform sub-optimally by preferring a narrow frame. My examples above merely add the syntagmatic/paradigmatic dimension - that is, narrow framing in the context of genuinely parallel collections of decisions, and in the context of a series of related actions spread over a short period of time.
Following on from yesterday's posting about alternative models of political/economic/moral actors at the trans-individual level, I am also reminded of another posting I did a while back mentioning mercantilism. Two observations seem clear to me now. First, mercantilism, following close on the heels of Hobbes's great master-work of political economy, was also an example of an explanatory rule book pitched at the aggregate level. But second, this rule book was wrong. Just because a philosopher or political economist can invent such a trans-individual rule book for aggregate pseudo-actors, it doesn't follow that the rule book is in practice any good; mercantilism, I think, is the example. That is to say, it is an additional, open question to evaluate the field of possible aggregate-based explanations of national actors. Some will work better than others for a given moment in time. Mercantilism certainly has its own chapter in the history of the modern nation state, but it can lay no a priori claim to be the 'best' model. That's a question for scientific investigation, ideally.
Another example, theoretical and practical, involves 'The Market' as aggregate actor. The Austrian school seems generally happy to disembody when desired, pulling in a market aggregate at short notice. Recovering individual market fact-states in the face of knowledge of the aggregate state can help.
'The Market' is today, for us, just as emergent a phenomenon as mercantilism was. Could not the consensus opinion of the market be wrong too? It is strange how the Austrians live with the management of the body politic through a markets-based economy but violently criticise the aggregate-based macro of Keynes.
It seems to me you can't know without looking whether one or another aggregates-based explanation works or not. Some will, some won't. You can't write them off just because they're aggregates based per se.
Neoclassical economists at core prefer macro-economic explanations which cash out at the micro level. As I mentioned in a previous post, some of the roots of the story Keynes tells about the actor known as Leviathan can be traced back to Thomas Hobbes. But I'd like to throw in a religious dimension to all of this, which points even further back, to Machiavelli, and forward, to Nietzsche.
Hobbes's construct was a macro actor, a personification of the state, onto which desires, intentions, actions are imposed. Collections of these actors interacting at the state level are the genesis of macro-economics. This is Keynes's root. No assumption is made about any evolution from individual actors to state actors. They're just different beasts. Understanding this allows him to question simplifying projections of homespun metaphors which make sense at the individual level but which might not work at all in the same way at the state level.
But a generation or two before Hobbes, Machiavelli tried to develop a book of etiquette for the only figure who could span the individual and the state - the king or prince. There is an amazing point of connection here between two entire schools of thought which have struggled to maintain a friendly dialogue ever since. In the embodiment of the prince, you have a flesh-and-blood man being given a new rulebook for behaviour vis-a-vis his role as sovereign. It is no wonder that rulebook seemed so alien. The context behind this was, I think, an implicit understanding that the tenets of living - the rules of behaviour - embodied in Christianity (and by extension several of the other major world religions) were insufficient for the nation state emerging after the dominance of the Christian empire. Driving a state with rules which applied to individual Christians had become inadequate. Machiavelli's was an early attempt to make a new rulebook for the sovereign. Nietzsche likewise took this breakdown of the homogeneous and unquestioned presence of Christianity in Western life and tried to apply it to individuals themselves. Both thinkers take the same inspiration but write their rulebooks for the national-sovereign and the personal-sovereign respectively. The evolution of the discourse on the rulebook for the national-sovereign becomes increasingly hermetically sealed, less immoral-seeming, still alien, more amoral, formalised even. This is, in a sense, Catholic to Nietzsche's more Protestant, radical-individualist pursuit, which has echoes in the work of J.S. Mill, through Samuelson, Friedman and the mainstream neoclassical approach. The Catholic state-as-actor/institution-as-actor approach has echoes in Adam Smith, Veblen, institutional analysis and Keynes.
The rejection of the project of Keynes's aggregates based approach in favour of the micro-based approach is in a sense a rejection of the power of the state to write its own rule book, as the rule book is too detached from the individuals within it. It is a rejection of an elite tradition of technocratic statecraft, talking in riddles among themselves, preferring to see individuals as effaced details.
So much for the history of these ideas. That explains to me some of the mutual cultural hostility when you hear the modern day intellectual combatants face off in the media. Their positions may be deeply connected anyway. The science of emergence is still in its infancy. I can well imagine that the creative act of invention behind Machiavelli, Nietzsche, Hobbes, Keynes will not be the end of the story. I marvel at their inventiveness but suspect in time that strong complex connections will be found between the micro and the macro level across many social sciences. So perhaps the best way to bring Keynes together with Buchanan and Lucas is to think of them as providing joint inspiration to new generations of political economists as they recognise the gap - but also try to bridge it, with increasing levels of success - between the rulebooks of individuals and the rulebooks of the various institutions which they produce.