

What England’s greatest ever batsman tells us about how to use statistics

Posted on August 11, 2016 in Economic Theory


Cricket fans will be delighted that Joe Root is establishing himself this summer as a truly great batsman. His Test match batting average of 55.49 is bettered by only 16 players from across the world since Test cricket began in 1877. Root currently sits seventh in the England career batting averages, and he clearly has the ability to improve. But who is the greatest English batsman of all time?

The question is interesting in its own right to cricket fans. But it also raises general issues around the need to understand the background to statistics, how they are produced, and in what context they appear.

That said, statistics alone are decisive in determining the greatest Test batsman. That laurel falls unequivocally to Don Bradman. Among players whose careers have ended, only four have an average exceeding 60, and three of those sit between 60 and 61. Bradman averaged 99.94.
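The statistic doing all the work here, the batting average, is simply runs scored divided by completed innings (innings minus not-outs). A quick illustrative sketch in Python, using Bradman's oft-quoted Test career figures of 6,996 runs from 80 innings with 10 not-outs:

```python
def batting_average(runs, innings, not_outs):
    """Runs scored per completed innings, the standard cricket measure."""
    return runs / (innings - not_outs)

# Don Bradman's Test career figures
bradman = batting_average(runs=6996, innings=80, not_outs=10)
print(round(bradman, 2))  # 99.94
```

Note how the not-outs matter: dividing by all 80 innings would give a mere 87.45.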

But judging the best English batsman ever is a bit more tricky. Geoff Boycott, who bore the brunt of ferocious West Indian attacks in the 1970s, can clearly be ruled out. His average of 47.72 falls too far short of those with averages in the high 50s to be explained away by circumstances.

One candidate is the Surrey opening batsman Jack Hobbs. In the course of a long career between 1905 and 1934, he not only averaged 56.94 in Tests, but accumulated the highest number of first class career runs of anyone in the world, a total of 61,760. It is hard to see this latter record being broken. Careers are shorter than they used to be, and Hobbs’s three decades were not then untypical. And there are a lot more Test matches, which reduces the opportunities for top batsmen to make big scores against the weaker sides in the county championship. Only two out of the top 20 run makers ended their career after 1990.

Hobbs’s opening partner for England during much of the interwar period was the Yorkshireman Herbert Sutcliffe. In a career lasting from 1919 to 1945, he registered the best Test average of all England players, 60.73, and notched up a grand total of 50,607 first-class runs at a slightly better average than Hobbs. Not as well known as the Surrey man, Sutcliffe nevertheless seems to have the edge.

But delving beneath the stats, my own vote goes to Len Hutton. He stands only sixth on the all-time list of England averages at 56.67, and scored “only” 40,140 runs. Yet as a 21-year-old he set a new individual world record, with 364 against Australia at the Oval in 1938. As important context, his best years were cut off by the war. Not only that, he suffered a serious wartime injury to his arm, forcing him to alter his batting style. When Test cricket resumed, he had to face the demon combination of Lindwall and Miller, part of the great Australian team of the late 1940s, the strongest side in history up to that date.

Statistics on their own, whether in cricket or in economics, often do not tell the full story.

As published in CITY AM on Wednesday 10th August 2016.

Image: You’re out by Graham Dean is licensed under CC BY 2.0

Why the economic picture tends to be rosier than initial estimates suggest

Posted on August 4, 2016 in Economic Theory, GDP, Inflation, Markets


One of the surprises of last week was the Office for National Statistics (ONS) estimate of economic growth in the second quarter of 2016, the period from April to the end of June. In the run-up to Brexit, the economy expanded by 0.6 per cent on the first quarter of the year. This was an acceleration: the first quarter of 2016 was up only 0.4 per cent on the previous one.

The situation in the third quarter is currently confused. The GfK consumer confidence survey for July showed the biggest monthly drop since 1990. But the weakness of the pound means that exports are due for a boost. Certainly, judging from the sheer numbers of foreign tourists crowding London in the last few weeks, we are raking in the euros and the dollars.

But the ONS view on what happened in the second quarter of 2016 is by no means the last word. Quite rightly, our national statisticians are keen to provide information on what has been happening in the economy as fast as they can. So they publish what is known as the “first estimate” of GDP growth for a quarter just a few weeks later. It is this estimate which grabbed the headlines.

Most economic data published by the ONS are estimates, produced with information gathered from a wide variety of sources. But as time goes by, more information comes in for any particular quarter. Self-employment income, for example, is quite important these days, but an accurate picture is not available until after the end of the tax year, when all the returns have been submitted. So the initial estimate might very well change over time.

Looking back over the past 20 years, the average of all the first estimates of growth made over this period is a bit lower than the average of the latest estimates. So, on balance, first estimates get revised upwards, showing that the economy was more buoyant than first thought. But statistically speaking, we cannot say with real confidence that the two sets of estimates are significantly different.
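The comparison amounts to a one-sample test on the revisions (latest estimate minus first estimate). A minimal sketch with invented figures, not actual ONS data, shows the shape of the argument: the mean revision is positive, but the t-statistic is too small to rule out chance.

```python
import math
import statistics

# Invented quarterly growth figures, % on previous quarter (not ONS data)
first_estimates  = [0.3, 0.6, 0.4, 0.6, 0.4, 0.5, 0.4, 0.3]
latest_estimates = [0.4, 0.5, 0.6, 0.6, 0.2, 0.6, 0.4, 0.4]

# Revisions: latest minus first; positive means revised upwards
revisions = [l - f for f, l in zip(first_estimates, latest_estimates)]
mean_rev = statistics.mean(revisions)

# One-sample t-statistic: is the mean revision distinguishable from zero?
t = mean_rev / (statistics.stdev(revisions) / math.sqrt(len(revisions)))
print(f"mean revision {mean_rev:+.2f}pp, t = {t:.2f}")
```

With a t-statistic well below 2, the upward drift cannot be called statistically significant, which is exactly the position the real data leave us in.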

Certainly, the averages can conceal some large inaccuracies in the first estimate data. The ONS now thinks we entered the recession of the late 2000s in the second quarter of 2008, with the economy shrinking by 0.7 per cent compared to the first quarter. But the first estimate which was made showed modest growth of 0.2 per cent. The first estimates did indicate a recession in the second half of 2008, but underestimated how much the economy was contracting. In contrast, during 2009, the first estimates failed to register the speed of the economic recovery. In the winter of 2011-12, the first estimates notoriously suggested we had entered a new recession, which is not borne out by the latest data.

It is a hard life being a policy-maker. One of the problems is reading the runes about where the economy is now and where it has been in the recent past. The first estimates of GDP are better than nothing, but can on occasion be quite wrong.

 

As published in CITY AM on Wednesday 3rd August 

Image: Scale by Thomas Leuthard licensed under CC BY 2.0

Why Britain and the US are streets ahead of Europe in innovation

Posted on July 28, 2016 in Economic Theory, Euro-zone, Markets, Uncategorized


The proposed takeover of the hugely successful ARM Holdings by the Japanese giant SoftBank is in the news. Cambridge-based ARM is well placed to exploit the white-hot concept of the internet of things, highlighting the UK’s recent advances in this field.

The UK has also performed well in biotechnology. But the industry came under scrutiny last week at a Centre for the Study of Financial Innovation seminar. Geoffrey Owen, former editor of the Financial Times, and Sussex academic Michael Hopkins introduced their new book Science, the State and the City.

On a world scale, the UK is second only to the United States in biotech, outstripping everyone else in key performance indicators for the industry. Owen and Hopkins’s book, however, is prompted by the fact that we are a very long way behind the leader. For example, US scientists have 45 per cent of all the citations in life sciences in academic journals, while we have just 15 per cent. The UK government spends roughly double the amount on health research and development that our European neighbours do, but America spends at least 10 times as much as we do.

Our distant second places, in these and other areas which determine the success of a high-technology industry, feed back on each other and cumulate. As a result, the market capitalisation of US biotech firms is more than 20 times that of their UK counterparts.

Why has this happened? After all, the discovery which made all this possible, that of the double helix structure of DNA (possibly the greatest scientific discovery of the twentieth century), was made in Britain by Crick and Watson.

Owen and Hopkins carefully dismantle the myth that it is the short-term outlook of the City which is responsible. This is often compared unfavourably to the long-termist approach of Germany and Japan. But it is the allegedly short-term Anglo-Saxon economies which are by far the best performers in biotech, an industry in which the time period from scientific discovery to marketable product is at least 10 and often as much as 15 years.

They do note, however, that British academics appear more interested in publishing academic papers and securing yet more research grants than in the process of commercialisation. There is a steady flow of entrepreneurial scientists who found biotech companies, but it is very much a minority taste in the UK compared to America.

The US industry clusters, with firms concentrated in San Francisco and Boston. So does the British, mainly near Cambridge. But attempts by European governments to develop clusters in a top-down, dirigiste way have not worked. Owen and Hopkins argue that American success is based on a bottom-up, evolutionary process, in which a successful ecosystem emerges rather than being designed.

Entrepreneurial academics, teaching hospitals, and venture capital spontaneously collaborated for mutual benefit. The US government also helped, with its massive funding for research and regulatory changes which boosted the industry.

The lesson is a general one for development. The public sector can facilitate but not command success. That arises from the drive of individuals with proper incentives.

Paul Ormerod

As published in CITY AM on Wednesday 27th July 2016

Image: DNA representation by Andy Leppard is licensed under CC BY 2.0

Sorry, Prime Minister: Legislation won’t end excess in the boardroom

Posted on July 21, 2016 in Behavioural Economics, Economic Theory, Executive Pay, Infrastructure, Politics


A key platform of our new Prime Minister is to curb what she perceives to be boardroom excesses.  “It is not anti-business to suggest that big business needs to change”, she said.

One of her proposals is to allow employee and worker representatives to sit on company boards, a suggestion which has not gone down well in the corporate world.  The debacle at the Co-op, with its legion of elected directors, has been cited many times as an argument against Mrs May’s idea.  First Group already has employees on the boards not just of its component companies, but of the group plc itself.  However, given that one of the companies is Great Western trains, one of the most notoriously unreliable of all the rail operators, this has not proved to be a panacea.

May is keen on making shareholder votes on executive remuneration legally binding.  True, in the spring, a clear majority, some 60 per cent, rejected BP boss Bob Dudley’s £14 million pay.  But only 33 per cent of shareholders failed to back Martin Sorrell’s package at the WPP AGM last month, even though both Standard Life and Hermes, two of Britain’s most influential fund managers, voted against the pay report.

Board members receive emoluments.  Shop floor workers get pay.  Yet however we describe them, the gap between the two has opened up in dramatic fashion in recent decades, with no obvious economic justification.  In the United States, for example, the average compensation of CEOs in the top 350 firms is some $15 million a year.  This enormous sum is 300 times higher than the amount the companies pay to the typical worker.  In the mid-1970s, the ratio was not 300:1 but only 30:1.  Even in the mid 1990s it was around 100:1.  This latter figure would still hand the average CEO some $5 million today, not a bad sum to have.  The American economy has done well, but it also did well in the decades immediately after the Second World War, when remuneration disparities were much narrower.
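The arithmetic behind these ratios is worth making explicit. A small sketch (the $15m figure and the three ratios are from the text above; everything else follows from them):

```python
ceo_pay_now = 15_000_000          # average CEO pay, top 350 US firms ($)
ratio_now, ratio_1990s, ratio_1970s = 300, 100, 30

# Typical worker pay implied by today's 300:1 ratio
worker_pay = ceo_pay_now / ratio_now              # $50,000

# What the same worker pay would hand a CEO under earlier norms
ceo_at_1990s_ratio = worker_pay * ratio_1990s     # $5,000,000
ceo_at_1970s_ratio = worker_pay * ratio_1970s     # $1,500,000

print(f"worker ${worker_pay:,.0f}; CEO at 100:1 ${ceo_at_1990s_ratio:,.0f}")
```

The mid-1990s norm really does hand the average CEO the $5 million the column mentions; the mid-1970s norm would hand him $1.5 million.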

Legislation is the brutal option, but there is no guarantee it would work.  The fundamental challenge is to alter the set of values which has become dominant.  The social norm at the top of industry has become one in which it is perfectly acceptable to be paid very large amounts, virtually regardless of performance.  This behaviour has influenced remuneration in non-market sectors of the economy, such as the pay of top management in universities, and in particular that of Vice-Chancellors.  Their average salary in 2014/15 was £274,000.  Incredibly, the Vice-Chancellor of Falmouth University – it does in fact exist – received £285,000.

There is a simple policy which could radically alter attitudes and behaviour.  The Prime Minister should let it be known that no-one who behaves in an ostentatious way on this matter will be either knighted or elevated to the Lords.  Indeed, such honours might be reserved for captains of industry who volunteer for substantial pay reductions.  Miracles might then happen, and social norms drastically change.

Paul Ormerod

As published in CITY AM on Wednesday 20th July

Image: Home Secretary, Theresa May, speaking at the Girl Summit, by DFID – UK Department for International Development  licensed under CC BY 2.0

Why austerity must be the order of the day for May’s chancellor

Posted on July 15, 2016 in Austerity, Economic Theory, Employment, Euro-zone, GDP


On the face of it, the Brexiteers have a bit of explaining to do.

A week before the vote, Boris Johnson dismissed fears about the value of sterling, and accused the governor of the Bank of England of “talking the economy down”.

Yet the economy does seem to have stalled, property funds have had to suspend redemptions, and the pound has collapsed. So what is to be done? The chancellor – whoever Theresa May chooses for the job – faces something of a challenge to kick-start the economy and restore confidence. Fortunately, economic history provides some very clear guidelines.

In 1949, the post-war Labour government devalued the pound by 30 per cent, from a rate of $4.03 to $2.80. The latter rate held until 1967, when the Labour government of Harold Wilson reduced it further to $2.40, a fall of 14 per cent.

By the time of Margaret Thatcher’s first government, the world had moved away from fixed exchange rates to a floating system. Against the dollar, sterling fell by 13 per cent in 1981 compared to the previous year, and by a further 14 per cent in 1982.
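The quoted percentages can be checked directly from the exchange rates themselves. A quick sketch:

```python
def pct_fall(old_rate, new_rate):
    """Percentage fall in the currency from old_rate to new_rate."""
    return 100 * (old_rate - new_rate) / old_rate

# Sterling/dollar devaluations cited above
print(round(pct_fall(4.03, 2.80), 1))  # 1949: 30.5, the "30 per cent" devaluation
print(round(pct_fall(2.80, 2.40), 1))  # 1967: 14.3, Wilson's "14 per cent"
```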

In each case, the devaluation was followed by periods of strong economic growth. In the five years after the devaluation of the late 1940s, GDP grew by an average of 3.7 per cent a year. Following the Wilson devaluation, the five year average was 3.6 per cent, and the corresponding figure in the 1980s was 3.9 per cent.

These compare to the average over the post-war period of just 2.5 per cent. Obviously, a fall in the value of the pound makes our exports more competitive and enables British goods and services to compete more effectively against imports in our domestic markets. But there was an additional factor which was necessary to take advantage of the devaluations.

In both the late 1940s and mid to late 1960s, the economy was at full employment. There was a bit more slack in the early 1980s, but this was almost entirely in the old industrial areas. In each case, space had to be created to enable net exports to expand. In the current situation, we are effectively at full employment: the unemployment rate is only 5 per cent, and a record 74.2 per cent of the working age population are in work.

In each of the three historical episodes, the then chancellors did so by tightening the fiscal stance. In other words, by implementing austerity.

Under the left-wing government of the 1940s, the policy was dramatic. A budget deficit of over 6 per cent of GDP (over £100bn at today’s prices) was transformed to surpluses of 4.6 per cent in 1949 and 3.4 per cent in 1950. Roy Jenkins changed the 1967 deficit of 3.9 per cent of GDP to a surplus of 1.8 per cent in 1969. And Geoffrey Howe provoked the notorious letter signed by 364 economists by cutting public spending in the early 1980s. In each case, devaluation combined with drastic fiscal tightening worked wonders.

The lesson is clear.

As published in CITY AM on Wednesday 13th July

Image: Numbers and Finance by reynermedia licensed under CC BY 2.0

How Stalin’s right-hand man could help the UK in EU exit negotiations

Posted on July 7, 2016 in Behavioural Economics, Economic Theory, Euro-zone, Politics, Uncategorized


The topic of behavioural economics is very fashionable, though many economists remain rather sniffy about it, arguing that it often does not really add to what the discipline already knows. But one of its most distinctive and strongest results from a policy perspective is its emphasis on what is called the “architecture of choice”.

Economists love jargon phrases. But this particular one is in essence very simple. In any given context, the rules which are drawn up for the process of choice can have an absolutely decisive impact on the actual decisions.

For example, before the referendum the government struggled against opposition in the Lords to get through a bill on trade union political funds. At present, the costs of any political fund operated by a union are automatically deducted from a member’s dues. A member has to positively opt out if he or she does not want to pay it. The proposal was to make all members “opt in” so they would only pay into the fund if they take action to do so. The government has partially backed down on the measure, and it will now only apply to new members.

The reason the opposition has been so bitter is that how the choice is framed will have a dramatic effect on the outcome. The “architecture of choice” will determine whether most union members pay the political levy or whether most do not. From a purely rational perspective, the only additional cost under “opting in” is trivial: the few minutes it would take to fill the form in. But in practice, under “opt in”, most people would not bother.

The UK faces a crucial architecture of choice problem with the now notorious Article 50 of the EU’s Lisbon Treaty. In order to leave the EU, a member state has to invoke the article. Once this is done, there is a period of two years under which the terms of exit are negotiated. When the two years are up, the deal is simply what it is at the time. In theory further changes can be made, but since these would require the unanimous consent of all EU member states, it would be highly unlikely to happen.

So once we invoke Article 50, the EU has us over a barrel. The French, say, could simply sit there stalling for time and blocking all our proposals. Of course, they would never stoop so low. But if some other country did, we would just have to take what the EU gave us at the end of the two years. This is why we have to have extensive informal negotiations before Article 50 is triggered, which EU leaders say mustn’t happen. The Swiss drew up their first treaty with the EU in 1972 and are still negotiating.

The only alternative is to adopt the strategy of Molotov, Stalin’s right-hand man, at the United Nations in the middle of the last century. He simply said “no” to virtually everything. Until we get informal talks, we turn up at the Council of Ministers and veto every proposal on any subject whatsoever, regardless of its merit. A suitable job for Michael Gove, perhaps.

As published in CITY AM on Wednesday 6th July 2016

Image: LC-USZ62-135316 by National Museum of the U.S. Navy is licensed under CC 1.0

The only way could be down for shares – and Brexit is just the catalyst

Posted on June 30, 2016 in Economic Theory, Euro-zone, Financial Crisis, Inflation, Markets, Politics, Recession


The Brexit vote creates many uncertainties, exciting or frightening depending on your predilection. One thing which is certain is that the Leave victory was delivered by the less-skilled sections of the electorate.

It seems part of a more general stirring up of what we might think of as the dispossessed, those who feel left behind by globalisation. In France the Front National, in the Netherlands Geert Wilders’s Party for Freedom, in Germany Alternative für Deutschland – throughout Europe, in fact, these discontents receive an increasingly sympathetic hearing.

Equity markets have been very volatile and nervous in the face of the uncertainties which Brexit creates. But there may be a good reason for this from a longer-term perspective.

Compared to 30 years ago, stock prices both in Europe and the US are at much higher levels. A key reason underpinning this is the shift from wages to profits as a proportion of national income which has taken place. The share of wages in national income has fallen, and that of profits has risen. Profits have grown faster than the economy as a whole, and so the potential future dividend stream from shares has gone up. As a result, shares have become more valuable.

Measuring the share of wages in national income is not as straightforward as it might seem. Should it, for example, include self-employed income or the remuneration of chief executives? In February 2015, the OECD, along with the International Labour Organisation, published a detailed study of trends in the G20 economies since the early 1990s. No matter which measure was used, the data show that the wage share declined significantly in almost every member state of the G20, and nowhere was there a significant trend increase.

The changes themselves may appear small. On one measure, for example, the wage share fell from an average of 69 per cent of national income in 1990 to 65 per cent now. But in terms of, say, the UK economy, four percentage points represents nearly £80bn.
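The £80bn figure is easy to reproduce. A sketch of the back-of-envelope calculation, assuming UK national income of roughly £2 trillion (my rough figure, for the order of magnitude; the wage shares are those quoted above):

```python
wage_share_1990 = 0.69
wage_share_now = 0.65
uk_national_income = 2.0e12   # assumed: roughly £2 trillion

# Income shifted from wages to profits by the four-point fall in the share
shift = (wage_share_1990 - wage_share_now) * uk_national_income
print(f"£{shift / 1e9:.0f}bn")  # ≈ £80bn
```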

More recently, there has been a levelling off in the downward trend. The distribution of income between wages and profits has been stabilising. Does Brexit signify a tipping point, when the trends of the last few decades might start to be reversed?

The economic orthodoxy, not just in theory but in practice, has been one of open borders for both labour and capital. Both must be allowed to flow freely. But there is an increasing groundswell of public opinion against this. Donald Trump, for example, supports a 20 per cent tax on all imported goods to protect American jobs. Bernie Sanders has opposed every free trade deal which the United States has negotiated, and vowed to “take on corporations which take their jobs to China”.

It is much easier to protect wages in a world of tariff barriers and restrictions on capital movements. Boris Johnson sees Britain as a global entrepreneur, but most Brexit supporters do not. Brexit would not be the cause of a long-term downward revision to share prices, but more a symbol of why it’s happening.

As published in CITY AM on Wednesday 29th June 2016

Image: The British Question by Andrew Gustar is licensed under CC BY 2.0

The EU referendum has shown just how irrational voters are

Posted on June 23, 2016 in Economic Theory, Euro-zone, Politics


Some things never seem to change.  In the mid-16th century, in the course of her short reign Queen Mary, a daughter of Henry VIII, tried to restore Catholicism.  To this end, she arranged to marry King Philip of Spain, at a time when Spain dominated Europe.  The Spanish ambassador in London sent back a gloomy report.  The English, he said, drink vast quantities of alcohol and hate all foreigners.

In the spirit of continuity, people from other countries continue to be baffled by us.  We await the results of the referendum with keen anticipation.  But it is not easy to explain to someone from Spain, as I had to do this week, that the areas of the UK which might benefit from Brexit are very likely to vote Remain, whilst those which would struggle will vote Leave.

Rational choice seems to have flown out of the window.  The former mining valleys of South Wales, for example, whose income per head is one of the lowest in Western Europe, benefit substantially from an inflow of EU funds.  Yet it appears that they will vote strongly to quit the EU.  In contrast, London and the South East, which have a dynamic, innovative economy, are keen to stay in the EU.

It is obvious to anyone who has visited these places that the Thames Valley, for example, has a more successful and outward looking economy than, say, the old mill towns of the North of England.  But how do we actually measure this?  George Osborne is trying to redress regional imbalances with his Northern Powerhouse, and trying to boost the long term potential of the whole of the UK with his Productivity Plan.  The problem is that the data produced by the Office for National Statistics are wholly inadequate for judging whether a local area has started to perform better as a result of his initiatives, or whether it has even got worse.

We can get an idea of the relative dynamism of each local authority area in the UK by looking at the stability of relative unemployment rates, both across the UK as a whole and within each of the UK’s regions.  If an area is performing relatively badly, its unemployment rate is likely to be higher not just than the UK average, but also than those of other areas in its own region, such as Yorkshire or Eastern England.

The numbers make depressing reading.  Comparing unemployment rates in the 382 local authority areas in the UK in 2005 with the rates in 2015, the correlation is 0.88.  In plain English, little change.  If you were doing badly before the crash, you were still doing badly after it.  Even if we go back to 1990, in general the poor areas are still poor, and the rich ones rich.

The map of local unemployment rates in Britain continues to be a mirror image of the referendum poll numbers.  Another source of bafflement for Spain and the rest of the EU in the early 21st century.

Paul Ormerod

As published in CITY AM on Wednesday 22nd June

Image: Polling Card by Abi Begum is licensed under CC BY 2.0

Old Europe’s poor innovation record is a harbinger of long-term stagnation

Posted on June 16, 2016 in Euro-zone, Markets


The economic debate around Brexit has been disappointing.  Far too many of the points focus on the short term.  Would Brexit precipitate a sterling crisis?  Well, if it did, at some point the currency would bounce back.  Would it tip us into a recession?  Maybe, but recessions come to an end.

The key economic question, not just in the Brexit debate but for the West as a whole, is what is going to happen to the long-term growth rate of the economy.  It is the long-term growth rate which determines living standards, and hence how much we can afford as a society to spend on health, education and pensions.  Long-term growth rates reflect the underlying productive potential of the economy.

On this criterion, the Leave camp seems to have the debate in the bag.  There is a strong consensus across economists, regardless of their views on Brexit, that the main determinant of long-term growth is innovation.

The European Commission pays a great deal of lip service to this, but Europe in general still lags considerably behind America in terms of innovation.  Innovation is by definition disruptive.  It creates new companies and industries, but at the same time destroys existing ones.  The willingness of a country to embrace rather than resist change is crucial.

Old Europe, to use Donald Rumsfeld’s notorious phrase for the original, core members of the EU, has an abysmal record.  True, the Germans implemented important structural reforms in their labour market in the 2000s, mirroring those introduced here by Mrs Thatcher in the 1980s.  But long term growth rates in Old Europe have been falling now for nearly fifty years.

The average growth rate over a sufficiently long period is a good indicator of long-term potential.  Over twenty years, for example, the short-term booms and busts will even themselves out.  In the 1950s and 1960s, growth in the core EU economies was very high, at 7 per cent a year, reflecting the huge post war boom.  In 2015, the average over the past two decades was barely above 1 per cent.  In contrast, the 20 year average growth rate in the UK has been pretty stable.  In 1970, it was 2.8 per cent a year.  It is now 2.4 per cent.

Remaining shackled to a system which appears to prefer regulation to innovation does not seem such a good bet.  But we have to take into account the likely response of the major players in the EU to Brexit.  The fear is that the UK deciding to leave the EU would trigger similar responses across the continent.  It would certainly give huge encouragement to already strong anti-EU feeling in many countries.

So the only rational response is to be punitive towards us, to try to make life as difficult as possible over as long a period as possible.  Whether they love us or loathe us, the EU would have no alternative.  Couples getting divorced may wish the process were harmonious.  But in reality it is often nasty and messy, and the bitterness can last for years.

Paul Ormerod

As published in CITY AM on Wednesday 15th June 2016

Image: Economy by Stefan Powell is licensed under CC BY 2.0

The poor state of macro justifies scepticism with Brexit disaster forecasts

Posted on June 9, 2016 in Economic Theory, Employment, Euro-zone, Financial Crisis, GDP, Inflation


David Cameron has tried to frame the Brexit debate into one based on economics.  Standing with him is the overwhelming consensus of economists themselves, from academics to the International Monetary Fund (IMF).  Their pronouncements are not having that much impact on the electorate if the polls are to be believed.

There is justification for this public scepticism. The arguments relate to what might happen to the economy at the aggregate, or macro level.  How much will GDP rise or fall, how many jobs will be lost or created, what will happen to trade, to inflation?

At the individual level, or micro level as economists call it, a great deal of progress has been made in the past twenty years or so. But at the overall, macro level, mainstream economics has if anything gone backwards. Concepts such as rational behaviour and equilibrium have been incorporated into the thinking of macro economists, at the very time that their micro colleagues are challenging them.

Olivier Blanchard, until recently chief economist at the International Monetary Fund, has real form on the perils of believing orthodox macro economics. In August 2008, for example, just three weeks before Lehman Brothers collapsed and the worst recession since the 1930s burst on the world, he published a paper claiming that the state of macroeconomics was “good”.

The relationship between inflation and unemployment is a central building block of macroeconomics.  Economists even have a special phrase for it, the so-called ‘Phillips curve’, named after the LSE-based academic who discovered it in the 1950s. In theory, the curve says that the lower unemployment is, the higher inflation will be.  This is the subject of Blanchard’s latest offering in the American Economic Review.

The Phillips curve is not just of academic interest. The Monetary Policy Committee, for example, has an inflation target, and unless they know what the curve looks like, they are not going to be able to do a very good job.

Blanchard sets out a formidable-looking mathematical model. He then employs statistical techniques in conjunction with the theory, much as the UK Treasury did with its estimates of the trade costs of Brexit, and claims that “the US Phillips curve is alive and well”.

Up to a point, Lord Copper. For one of Blanchard’s conclusions is that “The standard error of the residual in the relation is large, especially in comparison to the low level of inflation”. Translated into English, this simply means that his model does a poor job at explaining what has been going on. This is hardly surprising.  The unemployment rate peaked in the US at just under 10 per cent in 2010. Since then it has halved to stand at 5.0 per cent.  But inflation is slightly lower, at 1.2 per cent compared to the 1.6 per cent average in 2010.  The story is just the same in the UK and Germany. Since the crisis, unemployment has fallen sharply, and inflation has edged down. Macro models are by far the weakest part of economics.

Paul Ormerod

As published in CITY AM on Wednesday 8th June

Image: Exit by Shannon Clark is licensed under CC BY 2.0