The OBR’s forecasts should be taken not just with a pinch of salt, but with the contents of an entire mine
There has been a great deal of crowing in metropolitan liberal circles over the report of the Office for Budget Responsibility (OBR), published with the Budget last week.
The OBR revised downwards its projections for GDP growth for each of the next five years. Annual average growth to 2022 is predicted to be just 1.4 per cent a year.
The OBR believes that the UK is experiencing a “negative supply shock”.
But forecasts are merely forecasts. They do not constitute scientific evidence at all. This is especially true of economic predictions.
One section of the OBR’s report which relates to facts rather than views about the future has been seized on. This is that growth in the euro area during 2017 has been both stronger than it was in 2016, and stronger than in the UK. This is represented as showing that the EU is dynamic, and the UK is fading away.
But the experience of just a few months' data – we only have official data to, at the very latest, the end of September – needs to be put into context.
Since 2007, the year immediately before the financial crisis, GDP in the UK has grown by just over 10 per cent.
This does indeed represent a decade of growth which, by historical standards, is low.
But the figure is very similar in Germany. In France, output is only around six per cent higher than it was 10 years ago. In Spain, GDP has risen by five per cent.
In Italy, however, the economy has shrunk by some five per cent since 2007. The Italians have had a decade not just of low growth, but of negative growth. They have gone backwards.
Despite over 40 years of EU membership, the UK economy remains far more synchronised with the US in terms of the year-on-year fluctuations of the business cycle.
So over this period, we see some years when economies in the EU have grown faster than the UK, and some years when they have grown more slowly. This is precisely what we would expect when the cycles are not coordinated.
The OBR itself is fully aware of the huge potential for error in economic forecasts.
Indeed, the report illustrates the uncertainty around its five-year projection of 1.4 per cent annual GDP growth in a so-called “fan chart”. This shows the potential range around the prediction, based on past errors made in official forecasts.
At worst, growth could be negative, with an annual average fall of one per cent. But at best, we could have a sustained boom with growth of over four per cent a year.
Based on how wrong past forecasts have been, the next five years could see a cumulative fall in GDP of over five per cent, or a cumulative rise of over 20 per cent.
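As a rough sanity check on those numbers (a sketch of the compounding arithmetic, not the OBR's own calculation), the cumulative effect of the best- and worst-case annual rates can be computed directly:

```python
# Compounding the fan chart's best- and worst-case annual growth rates
# over five years. The -1% and +4% annual figures are taken from the
# column; the calculation is simply (1 + g)^5 - 1.

def cumulative_growth(annual_rate: float, years: int = 5) -> float:
    """Cumulative percentage change from compounding an annual rate."""
    return ((1 + annual_rate) ** years - 1) * 100

worst = cumulative_growth(-0.01)  # annual average fall of 1 per cent
best = cumulative_growth(0.04)    # sustained boom of 4 per cent a year

print(f"Worst case over five years: {worst:.1f}%")  # roughly -5%
print(f"Best case over five years:  {best:.1f}%")   # roughly +22%
```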
The OBR’s forecasts should be taken not just with a pinch of salt, but with the contents of an entire mine.
As published in City AM Wednesday 29th November 2017
Are members of the Labour Party frontbench experts in doublethink? The concept was invented by George Orwell for his novel 1984, written in the 1940s as a critique of the Soviet Union.
Masters of doublethink can hold, for purposes of political expediency, two opposing opinions at the same time, one of which might be complete nonsense.
The Leader himself set a good example during the general election campaign when he promised to abolish all outstanding student debt. Jeremy Corbyn rather backtracked on this after the votes had been cast, when it was pointed out to him that this would cost around £100 billion – over £1,500 for every man, woman and child in the UK.
His close ally, the Shadow Chancellor, followed this up on Sunday. Asked about the cost of Labour’s re-nationalisation plans, John McDonnell said that “you don’t need a number because you swap shares for government bonds”.
Independent experts put a provisional costing of around £500 billion on McDonnell’s plans. This amounts to over 20 per cent of GDP.
Imagine you want to buy a house for £10 million but have no savings. And imagine that you somehow persuade someone to lend you the money. True, you have acquired an asset worth £10 million and have a debt of the same size. Your net wealth position is unchanged.
But you face the problem of paying the interest on the loan, the terms of which may be very onerous, reflecting your credit risk.
McDonnell argues that nationalised industries will make a profit, which will take care of the interest payments. Stretching credibility even further, Labour argues that because the interest on government bonds is currently only just over 1 per cent, the payments would not amount to much.
Yet it is obvious that the markets might want a much higher rate of interest to finance the plans of a Chancellor who wanted to add £500 billion to public debt.
Emily Thornberry, the Shadow Foreign Secretary, has also got in on the doublethink act. Challenged on TV to name any country where Labour’s policies of financing spending by issuing debt had worked, she finally came up with Germany and Sweden.
The Bank for International Settlements compiles data on the ratio of government debt to GDP. There are several ways to measure this, but on its preferred approach the ratio is currently 73 per cent in Germany and 44 per cent in Sweden. In the UK it is already 116 per cent.
Much more plausible comparators are Italy and Greece, where the ratios are 150 and 173 per cent respectively – figures which McDonnell's plans would easily reach. In Italy, GDP is still 5 per cent below its peak level in 2007, a whole decade ago. And in Greece it is 26 per cent lower.
Are the top Corbynites cynical exponents of doublethink? Less charitable people might say they are just plain dim. As so often in economics, the evidence so far does not enable us to decide between the two hypotheses. But, either way, they are bad news.
As published in City AM Wednesday 22nd November 2017
Do Tube strikes make Londoners better off?
At first sight, the question is simply absurd. The answer is surely “no”.
But a paper in the Quarterly Journal of Economics comes to the opposite conclusion. Cambridge economist Shaun Larcom and his colleagues analysed the two-day strike of February 2014.
They obtained detailed travel information on nearly 100,000 commuters for days before, during, and after the strike.
A key feature of the strike is that nearly half the stations remained open. So most commuters could experiment with routes different to the ones they normally use.
The project may seem barking mad. But it investigates an important issue in economic theory.
Richard Thaler’s recent Nobel Prize for behavioural economics received a lot of publicity. Behavioural economics looks for examples of people making decisions in ways which deviate from those predicted by the rational choice model of economics.
A criticism from the mainstream is that deviations might indeed be observed at a point in time. But over time, they will disappear as people learn to be rational and make the best decision.
The Tube network remains the same for long periods of time. Commuters have many opportunities to learn about it. So almost all of them should use the quickest possible route to work. If someone has just moved jobs or homes, there may be a short period of adjustment. But everyone else ought to have learned the best way to travel.
Yet Larcom and his colleagues find that a significant fraction of London commuters fail to find their optimal routes. They come to this conclusion by comparing the journeys of the people in their data set before and after the strike.
Of course, for many journeys the best route is trivially easy to discover. If you live in Richmond and work in Hammersmith, there is only the District Line. Other journeys have more options. Larcom notes that there are 13 potential ways to travel between Waterloo and King’s Cross.
The authors point out that many decisions faced by consumers are more complex and less repetitive than the commuter problem they analyse. So, in an excellent example of jargon, they state that “our estimate of suboptimal habits may be a lower bound to the problem in other contexts”.
In other words, systematic and persistent deviations from rational choice are an important feature of the real world.
Economists of course like to value everything, and there is a standard way of valuing time. The academics estimate that the time gains subsequently achieved by those who switched routes outweighed the time losses incurred by everyone else during the strike. So Londoners were better off as a result of the strike.
Bizarre though it may seem, the article is a good example of how economics is becoming much more empirical when thinking about individuals’ behaviour and less reliant on pure theory.
As published in City AM Wednesday 15th November 2017
Image: Underground by Elliott Brown is licensed under CC BY 2.0
The so-called “productivity puzzle” just does not go away.
The October employment figures released by the Office for National Statistics (ONS) bring it into focus.
The number of people in work rose to a new record high of 32.1m, with an increase of around one per cent compared to a year ago.
Total output, measured by GDP, continues to rise, but modestly. We do not yet have official estimates for the year to October, but GDP seems to be up by some 1.5 per cent.
Productivity is defined as output per worker, so it is only around 0.5 per cent higher than a year ago. No scientific consensus has yet emerged to explain why productivity growth continues to be so low.
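The arithmetic behind that half a per cent is straightforward: productivity growth is, to a close approximation, output growth minus employment growth. A minimal sketch using the column's figures:

```python
# Productivity growth from the column's figures: output per worker
# grows by the ratio of the GDP and employment growth factors, which
# is approximately GDP growth minus employment growth.

gdp_growth = 0.015        # GDP up some 1.5 per cent on a year ago
employment_growth = 0.01  # employment up around 1 per cent

productivity_growth = (1 + gdp_growth) / (1 + employment_growth) - 1
print(f"Productivity growth: {productivity_growth * 100:.2f}%")  # about 0.5%
```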
But there is increasing evidence that the rate of growth of output is being systematically underestimated.
The economy cannot be put in a set of scales and measured. Its size has to be estimated, and the ONS uses a wide variety of methods to do this.
The fundamental problem is that the foundations for estimating GDP were built in the 1930s and 1940s, when the economy was dominated far more by manufacturing. Measuring how many things have been produced is inherently easier than measuring services.
The ONS does not stand still, and tries to take account of the massive changes in the economy which have taken place. But the rise of the internet economy brings entirely new problems to solve.
A key one is what the futurologist Alvin Toffler many years ago called the “prosumer” sector.
Traditionally, products are developed and sold by companies, and consumed by, well, consumers.
In the prosumer sector, consumers themselves participate in the production and development of products and services.
A good example is the statistical package R. This is open source, and freely and readily downloadable by anyone.
In recent years, R has become the package of choice for young scientists in a wide range of disciplines around the world. They both use it, and contribute to its development by uploading their own algorithms.
A huge range of routines can be downloaded. Its graphics features are amazing. Software is appearing on it that has the potential to take on commercial products such as Word and PowerPoint.
It has become a very valuable tool for scientific research, using “valuable” in its everyday sense. But it is run by a small not-for-profit foundation, so in ONS terms its value is close to zero.
The problem is that R is what economic theory describes as a “public good”.
This jargon phrase applies to anything where anyone can consume it, and where the supply never runs out. No matter how many people use R, it is always available for the next person.
For most goods and services, this is just not true. When I put my swimming towel on the pool lounger, it is no longer available to you.
The prosumer sector creates a lot of output. But economics has not yet solved the question of how to value public goods.
As published in City AM Wednesday 8th November 2017
Image: Vintage Scales by Public Domain Pictures is licensed under CC0
Karl Marx famously wrote: “History repeats itself, first as tragedy, second as farce”.
The phrase might well have been coined with Catalonia in mind.
Generalissimo Franco began a military coup against the elected Spanish government in the Canary Islands in 1936. The battle spread across Spain, and Catalonia was the last redoubt of the Republic to fall, in 1939. Franco took brutal revenge. Tens of thousands were imprisoned or executed, many within living memory. The Catalan language was banned.
Now the Catalans have proclaimed independence and Spain has imposed direct rule.
We do not of course know how events will pan out this time around. Things may turn serious. Yet there is certainly a slapstick element to having two different sets of police on the streets, and two different groups of civil servants, each taking different sets of orders.
In both the late 1930s and now, economics has a potentially decisive role in the eventual outcome.
There are many reasons for Franco’s victory. An important one is that the Republican side could just not obtain enough modern armaments. Catalonia even then was the richest part of Spain, but the arms the Catalans needed were made abroad, and, as the civil war progressed, increasingly they could not afford them.
A leading element in the Catalan government was the Workers’ Party of Marxist Unification (POUM in Spanish). POUM was inspired by Leon Trotsky, in much the same way as Jeremy Corbyn and his close acolytes appear to be today.
Sympathy for the historical role of POUM goes a long way to explaining why Corbynistas are enthusiastic supporters of the contemporary Catalans.
But POUM made a catastrophic mistake: initiating a policy of expropriating private property. One effect was a major loss of confidence, and the collapse of the Republican peseta on the foreign exchanges, meaning that all imports, not just weapons, became punitively expensive.
Another generalissimo who was a political contemporary of Franco, one Joseph Stalin, described Trotskyists as a “gang of wreckers and diversionists”. In this, at least, he was surely correct.
This time, the Catalans are desperately trying to create a separate currency, using technology based on digital tokens. Their government is considering an e-residency programme such as the one in Estonia. This provides a way to operate a location-independent business online.
More traditional businesses have already voted with their feet. Almost 1,700 companies, including two big banks (Sabadell and CaixaBank), have switched their headquarters to other parts of Spain since the crisis escalated at the start of October.
The EU has made it clear that an independent Catalonia would not be a member of either the EU or the Eurozone. Staying out of the Eurozone would probably be a decided advantage, but effective expulsion from the EU could cause serious short-term dislocations.
It is not just loyalty to Spain which is leading a lot of Catalans to demonstrate against independence. Whatever the long-term outlook, the immediate economic costs would be substantial.
As published in City AM Wednesday 1st November 2017
Image: Demonstration by Màrius Montón via Wikimedia Commons is licensed under CC BY 4.0
Mark Carney, governor of the Bank of England, is getting his retaliation in early.
Faced yet again with the Bank failing to deliver its designated target of a two per cent inflation rate, in a speech last week he suggested that his remit was broader.
“We face a tradeoff between having inflation above target and the need to support, or the desirability of supporting, jobs and activity”, the governor stated.
In other words, he claimed that the Monetary Policy Committee (MPC) of the Bank should be concerned not just with inflation, but with what economists describe as the “real” economy, output and jobs.
The Federal Reserve in the US is explicitly mandated to take account of both inflation and the real economy when it sets interest rates. This is definitely not the case with the Bank of England. When Gordon Brown made it independent in 1997, its remit was unequivocal. It was to ensure that inflation was two per cent a year.
This time round, inflation is above the Bank’s target. The current level of some three per cent may even rise in the short term because the weakness of sterling is pushing up the cost of imports.
But in recent years, inflation has been below the two per cent desired rate, even falling to zero in 2015.
All this time, Bank rate has been essentially flat. The MPC cut it to just 0.5 per cent in March 2009, where it remained until the reduction to 0.25 per cent in August 2016.
To put this into perspective, when the rate fell to 1.5 per cent in January 2009, this was the first time it had been below two per cent since the Bank was created in 1694, well over 300 years ago.
So here is a puzzle for mainstream macroeconomists, whether in central banks or universities. Central banks are meant in theory to be able to control inflation by setting short term interest rates. Inflation has been low since 2009. But at the same time, the Bank rate has been at all-time record lows.
Perhaps more pertinently, inflation has fluctuated from year to year, even though interest rates have to all intents and purposes not changed. It was 4.5 per cent in 2011, and 0.7 per cent in 2016.
In short, inflation seems to lead a life of its own, independently of what the experts on the MPC either say or do.
Inflation really is a naughty boy all round. A central concept in orthodox economic thinking, encapsulated in the quote from Carney above, is that there is a tradeoff between inflation and jobs and output. The faster the economy grows and unemployment falls, the higher inflation will be.
But starting in the early 1990s, for around 15 years across the entire Western world, both inflation and unemployment experienced prolonged falls.
The idea that a central bank can control inflation by adjusting interest rates is shown by the evidence to be absurd.
It is yet another example of the limits to knowledge in orthodox macroeconomics.
As published in City AM Wednesday 25th October 2017
Image: Mark Carney by Bank of England is licensed under CC BY 2.0
The Competition and Markets Authority (CMA) published a report about price comparison sites at the end of last month. They seem simple enough, but these straightforward sites raise interesting issues for economics.
Overall, the CMA was pretty positive about the DCTs – digital comparison tools, to give them their Sunday best name. The conclusion was that “they make it easier for people to shop around, and improve competition – which is a spur to lower prices, higher quality, innovation and efficiency”.
DCTs offer two main benefits. First, they save time and effort for people by making searches and comparisons easier. Second, they make suppliers compete harder to provide lower prices and better choices to consumers. In short, they bring the real world closer to the perfectly informed consumers and perfectly competing firms in the ideal world of economic theory.
But even in this market, there is an issue which goes right to the heart of much of the information which can be accessed through the internet: how do we know whether we can trust it?
The main problem is that the comparison sites typically provide their services free of charge to consumers. They make money by charging a commission to suppliers.
This creates an incentive for a DCT to promote those suppliers which pay it the most commission. An effective way of doing this on the internet is by the order in which the information on the various suppliers is presented.
It is not that DCTs deliberately provide misleading information, or even that a site leaves off a major supplier which does not pay the particular website enough. But they can put those that pay the most at the top of the list.
Notoriously with Google searches, people rarely click through to anything which is not in the top three results of the search.
Allegedly, 60 per cent of the time, only the site which comes at the very top of the results is accessed.
Obviously on a DCT, consumers are likely to look at more. That is the whole point of using the site. But although the CMA does not provide hard data on this, it expresses a clear concern about the ways in which the sites rank the suppliers.
How the DCTs themselves set their prices raises a more general question for economics. The basic rule, described in the textbooks since time immemorial, is to set price equal to marginal cost – in other words, at the point where the revenue from one extra sale equals the cost of producing that extra item.
The standard assumption made in teaching generations of students their introductory course to economics is that as the level of output increases, marginal cost first of all falls but eventually rises.
But on the internet, once the site is set up, the cost of dealing with an extra user is effectively zero. The time-hallowed formula of economics is a recipe for bankruptcy.
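A toy example of why the rule breaks down for a website (all of the numbers here are invented for illustration):

```python
# A site with a large fixed set-up cost and zero marginal cost per
# extra user. Pricing at marginal cost - the textbook rule - covers
# none of the fixed cost. All numbers are invented.

fixed_cost = 1_000_000  # building and running the site
marginal_cost = 0.0     # cost of serving one extra user
users = 5_000_000

price = marginal_cost  # the time-hallowed formula: p = MC
profit = price * users - fixed_cost
print(f"Profit at p = MC: {profit:,.0f}")  # the fixed cost is never recovered

# To break even, revenue must come from somewhere other than p = MC:
# a flat price above marginal cost, supplier commission, or advertising.
break_even_price = fixed_cost / users
print(f"Break-even price per user: {break_even_price:.2f}")
```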
The internet is forcing companies to innovate in their pricing strategies. And it is forcing economists to rethink some of their theories.
As published in City AM Wednesday 18th October 2017
Behavioural economics has received the ultimate accolade.
Richard Thaler of the University of Chicago Business School has been awarded the Nobel Prize in economics for his work in this area.
Economics over the past 20 to 30 years has become far more empirical. Leading academic journals do still carry purely theoretical articles, but far fewer than they once did.
This shift towards the empirical takes two forms. Major advances have taken place in the heavy duty statistical theory of analysing large scale databases containing information on individuals and their decisions. This was recognised when James Heckman and Daniel McFadden were awarded the Nobel Prize in 2000.
Behavioural economics is much less technical. In any given situation, the decision which a purely rational person would take is identified. We then look at how people actually behave, and see if there are any deviations from the rational way of doing things.
Perhaps the main finding of behavioural economics is so-called prospect theory, first set out nearly 40 years ago by Daniel Kahneman. In essence, prospect theory says that people dislike making losses more than they like making gains of the same amount.
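Loss aversion is usually illustrated with the Kahneman–Tversky value function; the sketch below uses their commonly cited parameter estimates (alpha around 0.88, lambda around 2.25), where lambda greater than one captures the asymmetry:

```python
# The Kahneman-Tversky value function. alpha and lam are their widely
# quoted estimates; the essential point is lam > 1: losing 100 feels
# more than twice as bad as winning 100 feels good.

def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

print(f"Felt value of a 100 gain: {value(100):.1f}")
print(f"Felt value of a 100 loss: {value(-100):.1f}")
```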
Another important discovery is that, when weighing up how to value future costs and benefits, people often place much more weight on the present and very immediate future than standard economic theory assumes. Last month I wrote about how this helps to explain the reluctance of electorates to deal with climate change.
These two results are backed by large amounts of evidence obtained in a range of different contexts. So now they are being integrated into economic theory.
But many economists are altogether less sure about much of the rest of behavioural economics. One of the issues is that it often gives the impression of being rather ad hoc. No reason is given as to why people in one situation appear to behave rationally, but in another they do not. Very few guidelines have emerged as to when we can expect to see deviations from rationality.
Another issue is that many economists are prepared to accept that non-rational behaviour might be observed at a point in time. But in a reasonably stable situation, people will learn over time to be rational.
Behavioural economics is not just about advancing knowledge on the workings of the economy. Policy-makers have become interested.
Cass Sunstein, Thaler’s colleague, served in the Obama administration as head of the Office of Information and Regulatory Affairs. David Cameron set up the so-called “Nudge Unit” in his government based on Thaler’s ideas. Thaler claimed 10 years ago that a “nudge” could lead to “better investments for everyone, more savings for retirement, less obesity, more charitable giving, a cleaner planet, and an improved educational system”. In his 2015 book Misbehaving, he backed off the extravagance of these claims.
Still, whatever the doubts and qualifications, behavioural economics has made a big impact. An economist can no longer be said to have a good training if he or she is not familiar with its main themes.
As published in City AM Wednesday 11th October 2017
Image: Richard Thaler by Chatham House is licensed under CC BY 2.0
A red-hot topic in economics is randomised controlled trials (RCT). Esther Duflo, the MIT academic who has really driven this idea, has surely put herself in pole position for a Nobel Prize at some point.
The idea of RCTs has been imported from medicine. One group of people is selected at random to be subject to a particular policy, and the outcomes in this group are compared with those in the rest of the population, which is not.
The studies have been almost exclusively carried out in developing countries. Evaluating RCTs often involves some subtle statistical points, but they are a powerful way of identifying what really works. Their policy impact has already been substantial.
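The core comparison behind an RCT can be sketched in a few lines; the outcome figures below are simulated for illustration, not taken from any J-PAL study:

```python
# Difference in mean outcomes between a randomly assigned treatment
# group and a control group - the basic RCT estimate. The outcome
# data are simulated: treatment adds about 1 unit of benefit.

import random
import statistics

random.seed(42)
population = list(range(1000))
treated = set(random.sample(population, 500))  # random assignment

outcomes = {
    person: random.gauss(1.0 if person in treated else 0.0, 2.0)
    for person in population
}

treated_mean = statistics.mean(outcomes[p] for p in treated)
control_mean = statistics.mean(outcomes[p] for p in population if p not in treated)

# Randomisation means the groups differ, on average, only in the
# policy itself, so this difference estimates its average effect.
effect = treated_mean - control_mean
print(f"Estimated treatment effect: {effect:.2f}")
```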
Over 200m people worldwide have been reached by the scaling up of programmes evaluated by the J-PAL network, in which Duflo is the leading light. The RCT studies themselves are carried out on a small scale, evaluating very particular policies. If they succeed, they can be expanded. Examples include encouraging the take-up of school-based deworming, chlorine dispensers for safe water, and free insecticidal bed nets.
A closely related concept is known as a natural experiment. This is when we observe two contrasting policies which have been carried out in the past, either at the same time on different populations or at different times on the same one.
The policies in this case have not been deliberately designed as part of an experiment. They have been introduced as part of the political process.
But good natural experiments can be just as informative as RCTs. Indeed, they can reach the parts which RCTs cannot get to, because we can observe natural experiments which have taken place on very large scales.
By far the most important of these is the series of natural experiments on the performance of market-oriented economies compared to their centrally planned socialist rivals.
The current tensions highlight the differences between North and South Korea. In the 1950s, South Korea had living standards similar to those of African countries. Now, they are at Western levels.
Other countries which were poor in the mid-twentieth century and which have adopted the principles of market-oriented economics have also prospered.
The fall of the Berlin Wall at the end of the 1980s brought into sharp focus the contrast between East and West Germany. The Trabant was a popular car in the East, but it was of such poor quality that its value dropped to almost zero as soon as Western cars could be imported.
The major economic contest of the twentieth century was between the US and the Soviet Union, won easily by America.
India and China practised different forms of socialism until the late 1980s. The Chinese version was the more extreme – resulting, for example, in the deaths of tens of millions of people in the self-induced famines around 1960. After adopting market principles, both countries have flourished.
The outcomes of these major natural experiments are decisive. Belief in socialism in 2017 is equivalent to believing the sun goes round the Earth.
The German elections on Sunday went pretty much according to the polls. Another victory for Chancellor Merkel.
Much of the commentary has focussed on the success of the far-right Alternative für Deutschland (AfD) party. One of its leading candidates eulogised the German armed forces during the Second World War – a topic even more sensitive there than it is here.
The AfD took nearly 13 per cent of the vote and 94 seats in the Bundestag. This puts them within striking distance of the Social Democrats (SPD). The SPD share of the vote collapsed to just over 20 per cent, barely half the support it attracted only a decade ago.
The Social Democrats – the German equivalent of Labour in the UK – were annihilated not just in areas like Bavaria where traditional centre-right parties have always been strong. They lost very heavily in the old East Germany, where the AfD secured its highest level of support.
It is no accident that the rise of the AfD and the fall of the SPD go hand in hand.
A popular myth among the liberal elite in Britain is that our country has an exceptionally high level of inequality compared to the rest of Europe. This is far from being the case.
The same forces which have widened inequality here have operated in Germany, in some ways even more powerfully.
The opening up of Eastern Europe in the early 1990s has had a strong effect. Employers soon realised that economies such as Poland and the Czech Republic possessed educated labour forces, whose productivity potential had been suppressed by the gross inefficiencies inherent in planned economies. German companies opened up new production plants in the old Soviet bloc countries in Europe rather than at home.
This has been combined with the impact of both globalisation and mass immigration.
The effect on wage rates of this increase in competition in the labour market has been dramatic. Christian Dustmann at UCL has examined the evolution of wage rates in the former West Germany.
The fifteenth percentile of the wage distribution is the level at which only 15 per cent of wages are lower. In West Germany, at the fifteenth percentile, real wages have fallen almost continuously since the mid-1990s.
At the fiftieth percentile, where half get more and half get less, the reduction has been less sharp. But the fall had set in by the early 2000s.
At the eighty-fifth percentile, we see the mirror image of the fifteenth: real wages grew strongly, reaping the benefits of the recovery of the economy.
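The percentile definitions used above can be made concrete with a short sketch (the weekly wage figures are invented for illustration, not Dustmann's data):

```python
# Percentiles of a (tiny, invented) wage distribution. The 15th
# percentile is the wage below which 15 per cent of wages fall;
# the 50th is the median; the 85th has only 15 per cent above it.

import statistics

wages = [310, 350, 420, 480, 520, 560, 610, 700, 850, 1200]  # invented

# statistics.quantiles with n=100 returns the 1st..99th percentiles
percentiles = statistics.quantiles(wages, n=100)
p15, p50, p85 = percentiles[14], percentiles[49], percentiles[84]

print(f"15th percentile: {p15:.0f}")
print(f"50th percentile (median): {p50:.0f}")
print(f"85th percentile: {p85:.0f}")
```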
In the states of the old East Germany, the problems are even worse.
The sharp rise in inequality is a key reason for the collapse in support for the social democratic parties across Europe. Their traditional voters have been the ones who have been hit the most by static or falling real incomes. They have not been defended by social democrat parties, which now represent the interests of the public sector urban professional classes.
The fact that Jeremy Corbyn’s Labour did not suffer the same fate as Germany’s SPD is a clear indictment of the Conservative campaign in the election.