The Bank of England and Federal Reserve held a two-day conference last week in London on big data and machine learning. All very interesting stuff.
There was an intriguing vignette as we emerged from the conference room for the frugal lunch on the first day.
Straight ahead was a table with sandwiches, fruit and the like. Most participants made for this, so many that a long queue soon formed, stretching well out of the room. But a sharp right instead brought you to a smaller table, with identical food. The wait was very much shorter.
This illustrates important aspects of modern economic theory.
In fashion markets and on the internet, for example, products or sites can rapidly become popular for reasons not connected to their inherent qualities. They become more popular simply because they are already popular. People start to follow the crowd rather than rely on their own judgment.
The same thing can be observed in bubbles in financial and property markets. An extreme example was seen in the case of Northern Rock in the run-up to the financial crisis in 2008. The bank did in fact have enough assets to pay its liabilities. But it experienced a short-term liquidity problem and approached the government for support.
The news leaked, and within 24 hours huge queues formed outside the branches as people scrambled to get their money out. The longer the queue, the more people it drew in. The result was the first bank failure in the UK for 150 years.
This herd-like behaviour seems irrational. But an important paper by Sushil Bikhchandani and colleagues in the top-ranked Journal of Political Economy 25 years ago showed that it was perfectly compatible with the economic concept of rationality.
Suppose you have to make a decision, like the table to go to in order to pick up lunch. You might have some private information about the options. In addition, there is some public information available to all.
If enough people have already made the same choice, the public information starts to look more reliable than your own private signal. So you are likely, quite rationally, to give more weight to it when you choose.
This is exactly what happened at the central bank event. The first few people coming out of the conference used private information, and made for the table they could see. Others behind them could only see people getting lunch, and simply followed them. They used the public information about where lunch was available.
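The cascade logic can be sketched in a few lines of code. This is a toy model in the spirit of the Bikhchandani paper, not its actual formulation: each agent receives a private signal that is correct with some probability, observes all earlier choices, and switches to following the crowd once the public evidence outweighs any single private signal.

```python
import random

def run_cascade(p=0.7, n_agents=20, seed=1):
    """Simulate a simple information cascade.

    The true best option is 'A'. Each agent gets a private signal that
    is correct with probability p, sees all earlier choices, and counts
    each observed choice as one piece of evidence. Once the net public
    evidence exceeds one signal's worth, the agent ignores their own
    signal and herds -- a cascade has started.
    """
    random.seed(seed)
    choices = []
    for _ in range(n_agents):
        signal = 'A' if random.random() < p else 'B'
        # Net public evidence: +1 for each prior choice of A, -1 for B.
        net = choices.count('A') - choices.count('B')
        if net > 1:          # public evidence outweighs any single signal
            choice = 'A'     # cascade: private signal is ignored
        elif net < -1:
            choice = 'B'
        else:
            choice = signal  # otherwise follow your own signal
        choices.append(choice)
    return choices
```

Running this repeatedly shows the key point: once the first few agents happen to agree, everyone afterwards copies them, whichever table they picked.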
Those in the long queue had imperfect information, another key concept in economic theory. They were outside the room and could not see the other table in the opposite corner.
I considered approaching people in the queue to sell them information to shorten their wait. But it is a bit tricky to value information – yet another important issue in modern economics. As soon as I mentioned it, they would easily guess there was another table and find it themselves.
The conference itself was fascinating. But it was certainly gratifying to see economists behave as rational herders.
As published in City AM Wednesday 5th December 2018
Image: Northern Rock by Alex Gunningham via Wikimedia under CC BY-SA 2.0
This month saw the tenth anniversary of the collapse of Lehman Brothers, a collapse which precipitated one of the only two global financial crises of the past 150 years.
The late 2000s and the early 1930s are the only periods in which capitalism itself has trembled on the edge of the precipice.
It was in November 2008 that the Queen put her famous question about the crisis to the academics of the London School of Economics: “Why did nobody notice it?”
The answer is simple. In the models of the economy at the time, finance did not matter.
Mainstream economists did not notice the massive financial imbalances in the economy, because in their models, any problems that might link to these imbalances were assumed away.
To be of any use, all scientific models have to make simplifications of reality. But orthodox macroeconomics took a step too far. It assumed that the workings of the whole economy could be explained by analysing the theoretical behaviour of just a single decision maker. In the jargon, this is the “representative agent”.
The representative agent is a device which economists use to model the economy. It is assumed to be extremely clever, able to solve hard mathematical problems – calculating how the decisions of average consumers and companies would affect the macroeconomy.
These kinds of models go by the splendid name of “dynamic stochastic general equilibrium models”, or just plain “DSGE” to their friends. But at its most basic, the problem with such economic models was that there was only one decision maker in them.
Having just two, a “creditor” and a “debtor” for example, would have helped a lot.
Over the past decade, economists have been scrambling to incorporate other financial factors into their models, such as household debt. Key contributions to this research are discussed in the latest issue of the Journal of Economic Perspectives.
Bizarre though they may seem, DSGE models now finally recognise the potential importance of household finance in causing crashes.
A particularly interesting paper in the journal is by Atif Mian of Princeton and Amir Sufi of Chicago. Their focus is considerably wider than the crisis of the late 2000s in the United States. They quote empirical studies across some 50 countries with data going back to the 1960s. They found that a rise in household debt relative to the size of the economy is a good predictor of whether GDP growth will slow down.
Rickard Nyman, a computer scientist at UCL, and I applied machine learning algorithms to data on both public and private (households and commercial companies) sector debt in both the UK and America. We found that the recession of 2008 could have been predicted in the middle of 2007.
Perhaps the most striking result is that public sector debt played little role in causing the crisis. The driving force was the very high levels of private sector debt.
A critic might say that this is simply a case of generals fighting the last war.
True, we don’t know whether a completely different nasty event lies around the corner. But at long last, economists appreciate the fundamental importance of debt and finance in Western economies.
As published in City AM Wednesday 26th September 2018
Image: Her Majesty The Queen by UK Home Office on Flickr licensed under CC BY 2.0
Our boys make progress – and I don’t mean on Brexit.
On a visit to Glasgow last Thursday, I saw that a popular Scottish newspaper had a mock-up photo of Harry Kane lifting the cup. In massive type, the headline shrieked “This Would Be the End of the World”. Yes, it would rather put the Highland Clearances into perspective.
There is a general perception this year that the football has been more entertaining than usual. This is reflected in the fact that the average number of goals per game – 3.18 – is the highest since the 1958 finals.
The qualifiers for the last 16 generally followed the form book, with only three of them – Russia, Denmark, and Sweden – edging out teams placed above them in the FIFA rankings before the tournament started.
But the patterns in the results show once again how close many of the teams are in ability. One team has to win, though it is not obvious which one.
Germany’s own qualifying group illustrates the point. A key concept in economic theory is that of transitivity. It essentially means that preferences should be well-structured.
If I prefer product A to product B and product B to product C, the assumption is that I prefer A to C.
If we carry this over into team sports, it seems logical that if A beats B and B beats C, then A should beat C.
None of these “transitive triples”, as the jargon puts it, were observed in Group F. Mexico beat Germany, who beat Sweden. But Sweden beat Mexico. Sweden also beat South Korea, who beat Germany.
The conclusion is that the teams in this group were very evenly matched. It was largely a matter of chance rather than superior ability that Mexico and Sweden qualified.
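The absence of transitive triples can be checked directly. The sketch below uses only the five results cited above (the Mexico–South Korea match is not mentioned in the text, so it is left out):

```python
from itertools import permutations

# Group F results as cited: (winner, loser).
beats = {('Mexico', 'Germany'), ('Germany', 'Sweden'), ('Sweden', 'Mexico'),
         ('Sweden', 'South Korea'), ('South Korea', 'Germany')}

teams = {t for pair in beats for t in pair}

def transitive_triples(beats, teams):
    """Return every triple (a, b, c) where a beat b, b beat c, AND a beat c."""
    found = []
    for a, b, c in permutations(teams, 3):
        if (a, b) in beats and (b, c) in beats and (a, c) in beats:
            found.append((a, b, c))
    return found

print(transitive_triples(beats, teams))  # → [] : every chain of wins cycles back
```

Every pair of consecutive wins among these results loops back on itself – Mexico beat Germany, Germany beat Sweden, but Sweden beat Mexico – so the search comes up empty.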
In the round of 16, three of the eight games ended in draws and were decided by penalty shoot-outs. Two of the others were decided by goals deep into injury time. And one of the quarter-finals was won on penalties.
Again, the implication is that there is a great deal of randomness in the outcome. Even in England’s famous victory over Colombia, the opposition goalkeeper got his hand to the final penalty shot but could not prevent the ball entering the net. Move his hand by just a few centimetres, and he saves it.
To round off this football economics analysis, finally and frivolously, is winning the World Cup good for the economy? I looked at the eight years from 1974 when European countries won.
As a control group, I examined the US and Australia, two western economies where soccer is a minor sport. Growth in a World Cup year was higher than in the previous year seven times, and lower nine times. Growth was higher in the year after the World Cup nine times and lower seven. So the pattern here looks completely random.
In the countries which won, growth was higher in the World Cup year than in the previous year on four occasions, and lower on four. But in contrast to the control group, growth in the year after victory was lower six times out of the eight.
Winning the World Cup is bad – or so the statistics say!
As published in City AM Wednesday 11th July 2018
The Transport for London (TfL) bus experiment has proved to be overwhelmingly unpopular.
Supposedly at every bus stop (but more usually once the bus has pulled away) a disembodied voice informs the passengers that the bus is about to move.
The hated announcement is being run as a trial for four weeks. TfL will then evaluate its effect on the number of accidents on the buses themselves.
A conflict between individual and collective welfare is exposed by the reactions to the experiment.
Collectively, we do not want it to continue, but individuals have little incentive to stop it in an effective way.
For example, public-spirited individuals could fall down and claim that this was due to the motion of the bus. The statistics would then show an increase in accidents. Even the most obdurate bureaucracy would find it hard to persist with the experiment.
The “victims” would bear costs as individuals, such as the time spent reporting it, plus the risk they might actually injure themselves. But they would create a benefit for everyone else. The voice on the buses would be switched off.
The concept of the winners compensating the losers has been a fundamental principle of economic theory for at least 100 years. It is important in public policy making, in the cost-benefit analysis which is carried out to decide whether a public infrastructure project should go ahead.
This is the rationale for the soon-to-be-abolished tolls on the Severn crossings, for example. The users benefit from a much-reduced travel time, but the non-users lose by having to pay taxes to build the bridge in the first place.
In general, the problem with implementing this in full is that the gainers are small in number relative to the losers. They tend to object vociferously to the charges levied on them, so that they rarely pay the full amount of their benefit to compensate everyone else.
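The compensation principle itself boils down to a simple test: a project passes if total gains exceed total losses, so the winners could in principle pay off the losers and still come out ahead. A toy sketch, with entirely made-up numbers rather than anything from the Severn case:

```python
def kaldor_hicks_passes(gains, losses):
    """The compensation test: a project passes if total gains exceed total
    losses, so winners could in principle compensate losers and still be
    better off. Nobody need actually be paid for the test to pass."""
    return sum(gains) > sum(losses)

# Hypothetical bridge: a few users each gain a lot,
# while many taxpayers each lose a little.
user_gains = [500] * 1000         # 1,000 regular users gain £500 each
taxpayer_losses = [2] * 200_000   # 200,000 taxpayers lose £2 each

print(kaldor_hicks_passes(user_gains, taxpayer_losses))  # → True
```

The test says nothing about whether compensation is actually paid – which is exactly the gap the tolls were meant to fill.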
With the bus scheme, the reverse is the case. Large numbers benefit from the ending of the scheme, and only the small number simulating an accident would lose.
But a public body such as TfL could hardly be expected to set up a scheme which would undermine its own experiment.
We might then ask why a market has not emerged to compensate those willing to simulate a fall. In a market, individuals could be paid the full costs they incur.
With social media, setting up such a market would be easy. But there would be two main problems.
The first is that of trust. How would participants be reassured that the relevant monies would be paid? The issue of institutional trust is a fundamental reason why markets are difficult to set up in many contexts.
There is also what economists call the free-rider problem. How many of those who dislike the voice would simply leave it to others to make the payment? There would be no coordination mechanism for ensuring that everyone paid.
Annoying though it may be, the bus experiment shows that even everyday issues often raise fundamental aspects of economic theory.
As published in City AM Wednesday 31st January 2018
Do Tube strikes make Londoners better off?
At first sight, the question is simply absurd. The answer is surely “no”.
But a paper in the Quarterly Journal of Economics comes to the opposite conclusion. Cambridge economist Shaun Larcom and his colleagues analysed the two-day strike of February 2014.
They obtained detailed travel information on nearly 100,000 commuters for days before, during, and after the strike.
A key feature of the strike is that nearly half the stations remained open. So most commuters could experiment with routes different to the ones they normally use.
The project may seem barking mad. But it investigates an important issue in economic theory.
Richard Thaler’s recent Nobel Prize for behavioural economics received a lot of publicity. Behavioural economics looks for examples of people making decisions in ways which deviate from those predicted by the rational choice model of economics.
A criticism from the mainstream is that deviations might indeed be observed at a point in time. But over time, they will disappear as people learn to be rational and make the best decision.
The Tube network remains the same for long periods of time. Commuters have many opportunities to learn about it. So almost all of them should use the quickest possible route to work. If someone has just moved jobs or homes, there may be a short period of adjustment. But everyone else ought to have learned the best way to travel.
Yet Larcom and his colleagues find that a significant fraction of London commuters fail to find their optimal routes. They come to this conclusion by comparing the journeys of the people in their data set before and after the strike.
Of course, for many journeys the best route is trivially easy to discover. If you live in Richmond and work in Hammersmith, there is only the District Line. Other journeys have more options. Larcom notes that there are 13 potential ways to travel between Waterloo and King’s Cross.
The authors point out that many decisions faced by consumers are more complex and less repetitive than the commuter problem they analyse. So, in an excellent example of jargon, they state that “our estimate of suboptimal habits may be a lower bound to the problem in other contexts”.
In other words, systematic and persistent deviations from rational choice are an important feature of the real world.
Economists of course like to value everything, and there is a standard way of valuing time. The academics estimate that the time gains subsequently achieved by those who switched routes outweighed the time losses incurred by everyone else during the strike. So Londoners were better off as a result of the strike.
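The accounting behind that conclusion is straightforward: one-off time losses during the strike, set against route savings that keep accruing every working day afterwards. The numbers below are invented for illustration – they are not the paper's figures:

```python
def net_benefit(n_switchers, daily_mins_saved, working_days,
                n_commuters, strike_mins_lost, strike_days, value_per_min):
    """Net welfare effect of the strike in money terms: recurring time
    gains for commuters who found a better route, minus the one-off
    time losses everyone suffered during the strike itself."""
    gains = n_switchers * daily_mins_saved * working_days * value_per_min
    losses = n_commuters * strike_mins_lost * strike_days * value_per_min
    return gains - losses

# Illustrative only: 5,000 switchers each save 5 minutes a day over 250
# working days; 100,000 commuters each lose 20 minutes on 2 strike days;
# time valued at £0.25 a minute.
print(net_benefit(5000, 5, 250, 100000, 20, 2, 0.25))  # → 562500.0
```

The strike's losses are bounded at two days, while the gains compound indefinitely – which is why even a modest number of switchers can tip the balance positive.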
Bizarre though it may seem, the article is a good example of how economics is becoming much more empirical when thinking about individuals’ behaviour and less reliant on pure theory.