
The national accounts are the new JK Rowling

A potential candidate for the world’s most boring book is the Office for National Statistics’ National Accounts: Sources and Methods.  This book, all 502 pages of it, is currently available in hardback on Amazon for just 1p.  It does exactly what it says in the title.  It gives a detailed description of how the data in the national accounts – variables such as GDP, inflation, earnings – are estimated.

These data series are the building blocks on which economic policy is based.  The Bank of England has a mandate to keep inflation around 2 per cent.  The Chancellor frets if the latest GDP growth figures are weak.  All these indicators depend upon the detailed processes described in the book for gathering information, sifting it, and deciding what it means.

To an economist at least, the tedium of the book is interrupted by occasional pearls of wisdom.  There is huge uncertainty around any particular number which is produced.  Sources and Methods suggests that the potential margin of error around an estimate of GDP is plus or minus 1 per cent.  To put this in context, average annual GDP growth in the UK over the past 30 years is just 2.3 per cent.  For some of the more obscure statistics, the error margin is believed to be as large as plus or minus 15 per cent.

But national accounts have suddenly become sexy.  Sir Charles Bean, former Deputy Governor of the Bank of England, has been tasked by George Osborne to carry out an independent review of the quality and delivery of UK economic statistics.  There is a particular focus on the challenges of measuring the modern economy, with a rapidly rising proportion of all economic activity taking place via the Internet.  Measuring value in cyber society, with its completely innovative range of products and services, is a major intellectual problem.

The latest issue of one of the American Economic Association’s flagship journals, the Journal of Economic Perspectives, carries an article on communicating uncertainty in official economic statistics.  The initial estimates for any statistic are invariably revised over time.  These revisions are often large, so that the early estimates offer a misleading view of the economy to policymakers.  In the first quarter of 2014, for example, there was an unexpected fall in American GDP on the previous quarter.  Initially believed to be just 1 per cent at an annual rate, the number was revised to a much larger drop of 2.9 per cent.

The problem goes much wider. For example, national accounts statisticians rely quite a lot on surveys.  But ‘non-response’ can be serious.  In poverty statistics for the US, for example, over the 2002-12 period between 7 and 9 per cent of households in the sample yielded no data at all by not responding.  A massive 41 to 47 per cent gave incomplete data by not filling in all of the survey.

Statistics in general is suddenly fashionable. High starting salaries go to graduates who can analyse Big Data.  And the boring old national accounts face exciting new challenges.

As published in City AM on Wednesday 15th September

Image: Harry Potter by Halle Stoutzenberger licensed under CC BY 2.0


Whatever it is, Corbynomics is not mainstream

A group of economists hit the headlines last week with their claim that Jeremy Corbyn’s policies are supported by mainstream economics.  Perhaps the best known of them is David Blanchflower, a Monetary Policy Committee member when Gordon Brown was Chancellor.  He predicted before the 2010 General Election that under the Conservatives, unemployment would rise from 2.5 million to 4 million, and that even 5 million was ‘not inconceivable’.  The actual number now is 1.85 million.  Still, economic forecasting is a notoriously difficult exercise.

The claim that Corbyn represents orthodox economic thinking is not easy to sustain.  It is not possible to find a single article in a leading academic journal which recommends nationalising large swathes of the economy, particularly without compensation.  Indeed, completely opposite themes are stressed, such as the importance of competition and markets, and respect for the principle of contracts and the rule of law.

To be fair, the Corbynistas only endorse his tax and spend policies.  They claim that support for fiscal and monetary expansion is now the economic mainstream.  But they fail to take into account one of the most fundamental concepts in mainstream macroeconomics, the so-called Lucas critique.  This esoteric idea, quite unknown to the general public, has profound practical implications.  

Many Keynesian economists try to assess the impact of policy changes in the following way.  They take the key aggregate variables in an economy, such as personal consumer spending, exports, unemployment and the like, and use advanced statistical techniques to correlate them with other variables.  Data from the past twenty or thirty years are used, to get enough observations.  What emerges is the average impact over this period of changes in one variable on another.  To take a simple, purely illustrative example, we might find that if sterling fell by 10 per cent, on average over the past the value of exports increased by 5 per cent.

All these statistical relationships are bundled together in a computer, and questions can then be asked.  What might happen if public spending were increased?  The complex interrelationships in the programme are calculated, and the answer pops out.  Forty years ago, the Chicago-based Nobel Laureate Robert Lucas made his critique.  Changes in policy may very well change the average relationships which previously existed.  The past is not necessarily a guide to the impact of a policy change.  President Hollande discovered the practical power of this point when he put tax rates up to 75 per cent.  He was presumably advised, on the basis of evidence from the past, that this would raise revenue.  But hundreds of thousands of the most enterprising French citizens did not pay the tax at all.  They simply left the country.
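
To make the mechanics concrete, here is a minimal sketch in Python of the kind of exercise described above. The numbers are invented and simply mirror the column’s purely illustrative sterling-and-exports example; it is not any forecaster’s actual model.

```python
# Illustrative sketch only: made-up data mirroring the column's example
# (a 10 per cent fall in sterling associated, on average, with a 5 per cent
# rise in exports). Not any forecaster's actual model.
import numpy as np

rng = np.random.default_rng(0)

# Thirty years of annual "history": percentage changes in sterling and exports.
sterling_change = rng.normal(0, 5, 30)
export_change = -0.5 * sterling_change + rng.normal(0, 2, 30)

# Step 1: estimate the average historical relationship by least squares.
slope, intercept = np.polyfit(sterling_change, export_change, 1)
print(f"Estimated: a 10% fall in sterling goes with {-10 * slope:+.1f}% exports")

# Step 2: use that average relationship to "forecast" the effect of a policy.
devaluation = -10.0
print(f"Model's forecast for the policy: {intercept + slope * devaluation:+.1f}%")

# The Lucas critique in one line: if the policy itself changes behaviour
# (exporters, buyers and markets respond to the new regime), the true slope
# shifts, and the coefficient estimated from the past no longer applies.
```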

The idea that Corbyn’s policies on state control of enterprise can be separated from his fiscal and monetary proposals is not one which bears more than a moment’s scrutiny.  The Lucas critique applies in spades. Any analysis which pontificates on the effects of the latter without taking into account the whole gamut of his policies can hardly be taken seriously.

As published in City AM on Wednesday 2nd September 2015

 

Image: “Jeremy Corbyn” by Garry Knight is licensed under CC BY 2.0


History shows why robots won’t destroy our jobs

Economics is often described as the dismal science, but it contains plenty of cheerful material. A paper by the leading American economic historian Joel Mokyr made for exuberant holiday reading. Written for the Journal of Economic Perspectives, one of the discipline’s top journals, it is written entirely in plain English and contains not a single mathematical symbol. Mokyr examines the history of anxieties about the economic impact of technology since the late 18th century.

We are living through precisely such a phase of worry at present, as fears abound that robots will destroy our jobs and take over the world. There is nothing new under the sun. The same concern was widespread two centuries ago, when it was feared that the machinery installed in new-fangled factories would create mass unemployment. Mokyr points out neatly that people get anxious at the same time about a problem which has completely opposite implications: namely, that we are running out of ideas, and the progress of technology will grind to a halt. The great English economist David Ricardo addressed exactly this question in the early 19th century in his Principles of Political Economy. Many leading economists in the United States share the concern today.

The most famous group objecting to machinery two hundred years ago were the Luddites, who went round smashing it up, along with any unfortunate mill owner they could get their hands on. But the slightly later Captain Swing riots were also widespread, particularly in rural areas, and were often even more dangerous. Mokyr notes that the modern equivalent is the Occupy Wall Street movement, an altogether tamer creature. It turns out that the Swing riots were mainly directed not against the new threshing machines used by farmers, but against the use of cheap immigrant labour from Ireland. Hello? And in any event, the main complaint made by the working class in the first half of the 19th century was about the exceptionally long hours they were required to work, an observation difficult to square with claims that jobs were being eliminated on a large scale.

In the end, the fears of the Luddites that machinery would impoverish workers were not realised, and the main reasons are well understood. Technological change increased the demand for other types of labour that were complementary to the new technologies. So, for example, large numbers of supervisors and managers were needed for the vast new factories and companies. Product innovation created completely new markets which demanded completely new types of job. And the process has continued. As Mokyr says, ‘Nineteenth-century political economists lacked an ability to predict new job categories like the personal fashion consultants, cyber security experts, and online-reputation managers of the twenty-first century’.

In fact, the demand for labour has held up far more than was expected. Between 1900 and 1930, for example, weekly hours in American manufacturing fell from 59.6 to 50.6. A simple extrapolation, beloved of doom merchants, would imply only 25.4 hours would be worked by 2015. Of course, innovation is disruptive. But over the 250-year history of capitalism, its positive effects have greatly outweighed the negative ones of job destruction.
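
That extrapolation can be checked with a couple of lines of arithmetic. The sketch below assumes a simple straight-line continuation of the 1900–1930 decline, which comes out close to the figure quoted (any small gap presumably reflects the precise method used in the source).

```python
# Back-of-the-envelope check, assuming the 1900-1930 fall in weekly hours
# in American manufacturing simply continued in a straight line.
hours_1900, hours_1930 = 59.6, 50.6

decline_per_year = (hours_1900 - hours_1930) / (1930 - 1900)   # 0.3 hours a year
hours_2015 = hours_1930 - decline_per_year * (2015 - 1930)

print(f"Straight-line extrapolation to 2015: {hours_2015:.1f} hours per week")
# Roughly 25 hours: close to the figure quoted, and far below the hours
# actually worked, which is the column's point.
```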

As published in City AM on Wednesday 26th August 2015

Image: Mr Robot has some RAM by Chris Isherwood is licensed under CC BY 2.0.


Response to Cecil the Lion’s death is a sad lesson in the irrationality of public opinion

Alas poor Cecil! Close personal friend of mine, sadly dead now.

The catchphrases of the Scottish comedian Bob Doolally capture the outpourings of grief among the Twitterati at the death of the now famous lion. The mourning is mixed with incoherent rage, as long-standing opponents of torture and capital punishment demand that the American dentist who killed the animal have his teeth pulled out without anaesthetic and then be sent to Zimbabwe to be hanged.

Yet the story illustrates two deep features of the current world. We can usefully reflect on the truly appalling outrages which have been inflicted on Zimbabwe by President Robert Mugabe. At the start of his regime, he used Cuban and North Korean troops to murder 20,000 political opponents. One of the most fertile countries in Africa has been reduced to destitution and starvation by racially motivated land grabs. Economic mismanagement, which far surpassed that of the Greeks, led to an inflation rate of one million per cent. But these outcomes scarcely rate a mention, in contrast to the global swamping of social media occasioned by the shooting of a lion.

In cyber society, there is in general only a tenuous connection between the objective content of an incident and the amount of popular attention which it receives. It is not a matter of people gathering all available information and then making a considered, rational choice, as standard economic theory assumes they do. Popularity is self-reinforcing, and in a dramatic way. Network theory is beginning to illuminate why some stories or products spread like wildfire while most receive virtually no attention. But this is due to the subtle mathematical properties of the connections rather than the content or the competing merits of the product.
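
A toy simulation makes the self-reinforcing mechanism vivid. The ‘rich get richer’ rule below is purely illustrative, not any specific model from the network-theory literature: every story starts with the same sliver of visibility, each new share goes to a story with probability proportional to the attention it already has, and the outcome is wildly skewed all the same.

```python
# Toy "rich get richer" simulation, purely illustrative. Every story starts
# with the same small sliver of visibility; each new share is drawn with
# probability proportional to the attention a story has already received.
import numpy as np

rng = np.random.default_rng(1)
n_stories, n_shares = 1000, 20_000

attention = np.full(n_stories, 0.1)        # identical starting visibility
for _ in range(n_shares):
    chosen = rng.choice(n_stories, p=attention / attention.sum())
    attention[chosen] += 1.0               # popularity reinforces itself

attention.sort()
top_1pc = attention[-10:].sum() / attention.sum()
bottom_half = attention[:500].sum() / attention.sum()
print(f"Top 1% of stories take {top_1pc:.0%} of all attention")
print(f"The bottom half take {bottom_half:.0%}")
```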

Economics does much better at understanding the second aspect of the Cecil story. There is a big demand across the world to hunt exotic and dangerous species. The markets for trophies and other by-products of hunting, such as the alleged aphrodisiac of powdered rhinoceros horn, are probably even larger. Unrestricted entry into such markets would lead to the so-called problem of the commons. This arises when the actions of individuals, each making decisions independently and in a rational way, generate an outcome which is bad for the group as a whole. Resources become depleted to the point of extinction.
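
A minimal sketch with invented numbers shows how the logic plays out: each hunter’s take is individually modest, but together the hunters remove more than the stock can regenerate, and the population is driven to zero.

```python
# Minimal sketch of the problem of the commons, with invented numbers.
# Each hunter's take is individually rational and modest; collectively the
# harvest outstrips the population's ability to regenerate.
population = 1000.0          # animals in the reserve
growth_rate = 0.05           # 5% natural regeneration a year
hunters, take_each = 20, 4   # unrestricted access: 80 animals taken a year

for year in range(1, 31):
    population += growth_rate * population   # regeneration
    population -= hunters * take_each        # unrestricted harvest
    if population <= 0:
        print(f"Stock wiped out in year {year}")
        break
else:
    print(f"Population after 30 years: {population:.0f}")

# Capping the total annual take below regeneration (here, about 50 animals
# at the starting stock) keeps the population alive, which is what
# restricting access through permits tries to achieve.
```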

The Nobel Laureate Elinor Ostrom spent much of her career researching how this problem is managed in contexts such as fisheries and farmlands. The existence of a well-defined community, whose members influence each other through their shared values and cultural norms, is a good indicator of success. But the urge to hunt is global, and so we face a market failure.

The regulation of hunting is one of the very few functioning aspects of the Zimbabwean state. It is a way of limiting access to rare species, and the permit fees received from legal hunters provide the resources required to combat poaching. Calls to ban hunting are ill-informed, for this would simply magnify the problem of the commons and lead to a world in which Cecils were extinct.

As published in City AM on Wednesday 5th August 2015

Image: “Cecil” by Daughter#3 is licensed under CC BY 2.0. 


Why cricket is like spam

The holiday season gets into full swing, but a shadow has been cast by the abysmal failure of our boys to get anywhere near the enormous target of 509 which Australia set them to win in the second Test match.  It may seem preposterous even to have thought they would.  But a revolution seems to be taking place in the ability of teams to make large scores in the fourth innings.

S Rajesh, the stats editor of the website ESPNcricinfo, has a fascinating piece on whether batting in the last innings has become easier.  In the 140-year history of Test cricket, teams have scored 350 or more in the final innings on only 49 occasions.  Of these, no fewer than 21 have been in the past ten years.  The chances of winning when faced with such a challenge still remain low.  Only four sides won in the most recent decade, and only nine in total, but the ability to score heavily seems to have leapt up.

Before the Second World War, teams made 350 or more just five times.  Admittedly, one of these was the monumental 654-5 which England made in South Africa in 1939.  The match was timeless, with England being set 696 to win.  But at the end of the tenth day, the match had to be abandoned as a draw so that the team could catch their ship home.  In the five decades from 1945 to 1995, with many more Tests being played, 350 was exceeded only 14 times.

Rajesh offers some explanations for the dramatic rise in large fourth innings totals.  Higher scoring rates, boosted by the techniques of Twenty20 cricket, mean that teams tend to start their final innings earlier in the match, when the pitch has had less chance to deteriorate.  And in general pitch maintenance is better, so surfaces crumble less.

This all sounds plausible and rational.  But the change may not be a permanent one.  The world of spam filtering illustrates why.  The attacking side, the spammers, constantly change their strategies to try and break through, and the defenders also develop their techniques.  At the moment, the defenders are on top, with the US company Symantec claiming that spam rates are now lower than ever.  But we have been here before.  In 2012, the infamous Russian botnet Grum was taken down by spam fighters and spam fell by a half, only to bounce back.  In the same way, there are two sides in a cricket match, and strategies evolve over time.  They just take longer to work out and perfect.  In the inter-war period, massive scores were made very rapidly, as improvements in batting techniques dominated.  The fielding side then gained the upper hand.  Fielders became more athletic and defensive placements got better.  Bowling techniques evolved to contain the batsmen more effectively.

In any evolutionary system in which two adversaries face each other, fluctuations in outcomes will take place.  Spam and cricket are just two examples.  Maybe even England will be able to learn how to score more than 103.

As published in City AM on Wednesday 22nd July 2015

Image: Kevin Pieterson by Nic Redhead licensed under CC BY 2.0


Banks, cancer and Stephen Hawking

Massive fines for banks, gross misbehaviour, huge bonuses for failure, bailouts at vast expense to the taxpayer: it’s little wonder that politicians and pundits can almost invariably win cheap applause by describing the financial system as a cancer on society.

But in a deep way, cancer and the financial system do have much in common. They both exhibit qualities which in the scientific jargon are known as “robust yet fragile”. It is a key concept in the new but rapidly expanding field of complexity science, described by Stephen Hawking as the science of the twenty-first century. Complexity provides the tools which connect many apparently unrelated phenomena. Bright young people in particular need to listen to Hawking’s opinion to equip themselves with the skills which will make them really marketable.

The concept of “robust yet fragile” is relevant to almost any system which evolves over time. Successful systems develop features which enhance their ability to survive. In particular, they need to be able to withstand the continuous shocks and surprises which happen all the time in real life. The subject of last week’s column, Fifa, has just experienced a major shock which may prove terminal for the organisation. But for the most part, unanticipated events are on a smaller scale. Robust systems develop the capacity to absorb these kinds of shocks. It’s pretty obvious, one might think.

The important insight is that it is exactly the ways in which systems evolve to become robust which also makes them fragile. The global financial system during the decades prior to the crisis became increasingly interconnected. A massively complicated network of assets and liabilities developed.

At one level, this was good news. If a particular connection went under and a bank was left with a bad debt, the fact that it now had so many other connections, other contracts, meant it was more able to take the hit. But when confidence started to collapse in 2008, the very fact that financial institutions had become so closely entwined with each other meant that the adverse consequences spread like wildfire. The system was robust to most shocks, but had become fragile. The effects of a single piece of bad news could be transmitted across the dense network very efficiently.
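
A toy model with invented numbers illustrates the mechanism. Each bank below holds the same capital and has lent a fixed amount to a few random counterparties; when a bank fails, its creditors write off what they lent it. A small system-wide loss plus one failure is absorbed, while a larger erosion of capital lets the same interbank links carry the failure across the whole network. It is a sketch of the idea of robust yet fragile, not a model of any real banking system.

```python
# Toy "robust yet fragile" sketch with invented numbers; not a model of any
# real banking system. Banks hold capital and lend to a few random
# counterparties; a failed bank's creditors write off what they lent it.
import numpy as np

rng = np.random.default_rng(2)
n_banks, capital, loan = 50, 10.0, 4.0

# exposures[i, j] = amount bank i has lent to bank j (three lenders each).
exposures = np.zeros((n_banks, n_banks))
for j in range(n_banks):
    lenders = rng.choice([i for i in range(n_banks) if i != j], 3, replace=False)
    exposures[lenders, j] = loan

def failures(common_loss, idiosyncratic_hit=20.0):
    """Count failures after a system-wide loss plus a large hit to bank 0."""
    losses = np.full(n_banks, common_loss)
    losses[0] += idiosyncratic_hit
    failed = np.zeros(n_banks, dtype=bool)
    while True:
        newly = (losses > capital) & ~failed
        if not newly.any():
            return int(failed.sum())
        failed |= newly
        losses += exposures[:, newly].sum(axis=1)   # creditors take the hit

print("Calm times (small common loss):", failures(2.0), "failure(s)")   # absorbed
print("Confidence collapses (large common loss):", failures(7.0), "failures")
```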

Last week a major breakthrough in the treatment of many cancers was announced, and it illustrates the robust yet fragile nature of cancer. Cancer evolves continuously, thereby defending itself against standard attacks such as targeted therapies. It stays one step ahead and makes itself robust to the shocks designed to kill it. But its evolution has made it vulnerable to a new approach, which harnesses the body’s immune system to attack cancerous cells. The ways in which cancer has changed have made it easier for the immune system to recognise the difference between normal and cancer cells. True, it has required some very smart science to take advantage of this. But the robustness which cancer developed to cope with previous shocks has made it fragile to the latest one.

As published in City AM on Wednesday 10th June 2015

Image: Ice Sculpture by William Warby is licensed under CC BY 2.0
