The current crisis dominates everything, from trade to everyday life. But, within a relatively short space of time, it will pass. What next? What will be the “new normal” after coronavirus?
For many decades since the Second World War, a key policy aim across the west has been to reduce barriers to international trade. But it seems likely that the crisis will reverse this longstanding trend. There were already signs of it slowing down.
President Trump is trading accusations with China on who is to blame for the coronavirus pandemic. Of course, this mood can swing.
But an important new phrase in US government circles is “decoupling”. American supply chains have become increasingly dependent on China in the past 20 years or so. The talk is of breaking this dependence, probably by using new trade barriers.
Within the EU, national governments have reasserted domestic sovereignty in a dramatic way. The authority of the European Commission has been reduced, and it will not be easy to restore it.
The instinct of economists is to recoil in horror whenever they are confronted with the idea of barriers to trade. These are regarded as being unequivocally a Bad Thing. But their own discipline shows that matters are not necessarily so clear-cut.
Over 60 years ago, the theoretical journal the Review of Economic Studies — then as now a coveted outlet for academic economists — published a paper entitled “The general theory of second best”.
Richard Lipsey, one of the authors, went on to write a best-selling textbook. The other, the late Kelvin Lancaster, was a highly original thinker who in the opinion of many should have been awarded the Nobel Prize.
The paper is set in the highly abstract context of what economists call general equilibrium. Everyone behaves exactly in accord with economic theory. Supply and demand balance in every market, so there is no unemployment, for example. It represents the theoretical ideal of the efficient allocation of resources.
If there were only one barrier to such a perfect state of affairs, getting rid of it would lead to a better outcome. Lipsey and Lancaster asked the simple question: if there were more than one, what can we say if just one of these is eliminated?
Their answer was quite devastating — so much so that economists who encounter this famous article as students look it firmly in the eye and then try to forget it.
They showed that there was no theoretical presumption that the economy would be more efficient if an imperfection were removed but others still remained.
In the real world, there are of course many deviations from this abstract, perfect world. This means that there is no presumption in economic theory that bringing in some restrictions on trade will make things worse. It is an empirical and not a theoretical issue.
Just like the UK leaving the EU, what really matters is the domestic response to changes in the external environment. A bit more protectionism across the globe could stimulate a new wave of innovation in the west, as we look to rely on ourselves rather than China.
As published in City AM Wednesday 25 March 2020
Image: Cargo ship via Pxfuel is licensed for free use.
John Maynard Keynes could certainly craft a neat phrase.
In the Second World War, he wrote in his pamphlet How to Pay for the War: “It is only in a free community that the task of government is complicated by the cause of social justice.”
The impact of the coronavirus pandemic is similar to a war. Governments have to spend more on some stuff (bombers or ventilators) and restrict access to resources for other activities (in WW2, petrol was rationed, now sports events restricted).
In the current crisis, many otherwise viable companies will go to the wall as demand for their products and services drops. Already, the airlines are clamouring for a huge bailout.
Rishi Sunak’s loan scheme is a very good start, but it does not go far enough.
Looming over all of this: how should all the extra spending, extending even further than the loan scheme, be paid for?
Another great British economist, David Ricardo, was also fascinated by the question of how to pay for a major war. In his case, it was the Napoleonic wars in the early nineteenth century.
The government could either raise taxes or issue bonds to cover the increase in spending.
Ricardo argued that the effect on the economy would be the same regardless of which method was used. If increased government spending was financed by taxation, total demand in the economy would be unaffected. Military spending would rise, but private spending would fall.
According to Ricardo, the issue of government bonds would also have no effect. The bonds give rise to a stream of interest payments in the future and at some stage have to be repaid. So taxes in the future would be higher. A rational agent would anticipate these higher taxes. They would increase savings now in order to be able to meet them.
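The logic can be made concrete with a little discounting arithmetic. The figures in this sketch — face value, interest rate, maturity — are purely illustrative assumptions; the point is that the present value of the future taxes needed to service and repay the bond equals the bond’s value today, which is exactly what the rational saver puts aside.

```python
def pv_of_future_taxes(face: float, rate: float, years: int) -> float:
    """Present value of the interest payments plus the final repayment,
    discounted at the same rate the bond pays."""
    interest = sum(face * rate / (1 + rate) ** t for t in range(1, years + 1))
    repayment = face / (1 + rate) ** years
    return interest + repayment

# Illustrative figures: a 100-unit bond at 5 per cent over 30 years.
pv = pv_of_future_taxes(face=100.0, rate=0.05, years=30)
print(round(pv, 6))  # 100.0: extra saving today exactly offsets the bond issue
```

Whatever rate or maturity is chosen, the answer is the bond’s face value — which is why, on Ricardo’s argument, debt finance and tax finance come to the same thing.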
This concept, known as Ricardian equivalence, is hotly contested in macroeconomics even today. If it is true, so-called Keynesian policies for more public spending and bigger deficits simply do not work.
In the financial crisis of the late 2000s, the public sector deficit rose. Both the household and the corporate sector increased their savings, rather than running them down to maintain spending levels. So Ricardian equivalence is not as far-fetched as it may seem.
Traditionally, wars have mainly been paid for by the government issuing debt. In the Napoleonic wars, Bank of England data shows that government debt as a percentage of GDP rose from 100 to 150. In the First World War, it went from 20 to 110, and in the Second, from 130 to 250.
A similar massive rise now might simply be offset by an increase in private sector savings. Demand would fall even further than it already is.
Sunak’s proposal is to make loans to any business that wants them, large or small. But the repayment period needs to be longer, say 10 years. The company could pay these back as and when it chose. Otherwise, the outstanding loans would be converted into equity in the company.
The loans would therefore either be repaid or backed by an asset. The principles of sound finance would be maintained, and businesses would survive.
As published in City AM Wednesday 18th March 2020
Image: Rishi Sunak via Flickr is licensed for use CC BY-ND 2.0
The various pronouncements on coronavirus are a source of puzzlement to many.
On the one hand there are lurid predictions of millions of cases and hundreds of thousands of deaths. On the other, while the actual numbers are growing, they seem tiny so far compared to the scale of the predictions.
Almost 100 years ago, two Scots, Anderson McKendrick and William Kermack, developed an apparently simple mathematical model to explain and predict the spread of viruses. This abstract model remains the basis of our modern understanding. It gives insights not just into the spread of diseases, but how things like fake news disseminate on the internet.
These scientists proposed that people at any point in time are in one of three conceptual states.
The first defines those who are susceptible to any particular virus. For example, a certain type of person may be susceptible to rumours that Elvis Presley is alive. It is not clear yet who is susceptible to Covid-19. It seems to be affecting most demographic groups, but the World Health Organisation suggested last week that children might not be susceptible, for example.
The next category is those who are infected, which is straightforward enough. The final one is “recovered”. This could mean genuinely recovered, or actually dead — at any rate, no longer susceptible.
Kermack and McKendrick set up three non-linear differential equations to describe how a virus might spread. Their apparent simplicity disguises a fiendish complexity.
From the names of the categories, it is known as the SIR model — susceptible, infected, recovered.
A major uncertainty is whether to use this model or its SIS variant. Here, the final “S” also means susceptible. The SIS model means that people can get re-infected. The common cold is a good example.
The key part of the system is determining how many susceptibles any given infected person passes the disease onto before he or she recovers. In turn, this depends on how much the susceptibles and infected intermingle (hence the drastic quarantines in China and Italy), the probability of catching the virus from a single contact, and the length of time someone is infected.
Basically, a virus will spread if a sufferer infects on average more than one susceptible. The current number for Covid-19 seems to be between two and three.
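A minimal numerical sketch of the three equations, using simple Euler steps, shows the mechanics. The parameter values here are illustrative assumptions, not fitted estimates: a contact rate and recovery rate chosen so that each infected person passes the virus to 2.5 others on average, within the two-to-three range just mentioned.

```python
def simulate_sir(n, i0, beta, gamma, days, dt=0.1):
    """Euler-step the SIR equations: dS/dt = -beta*S*I/n,
    dI/dt = beta*S*I/n - gamma*I, dR/dt = gamma*I."""
    s, i, r = n - i0, float(i0), 0.0
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / n * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

# Illustrative parameters: beta/gamma = 0.5/0.2 = 2.5 in a city of a million.
s, i, r = simulate_sir(n=1_000_000, i0=1, beta=0.5, gamma=0.2, days=365)
print(f"never infected: {s / 1_000_000:.0%}")
```

With these numbers, roughly nine residents in ten eventually catch the virus. The epidemic halts not through any outside intervention but because the pool of susceptibles runs dry — the non-linearity Kermack and McKendrick built in.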
Typically, solutions of the model start with a very small number of cases relative to the size of the population. Then, very quickly, these accelerate dramatically.
Imagine a city of one million. People are only infectious for one day and infect two susceptibles. Someone catches the disease. There are only 128 cases at the end of the first week. But in less than three weeks, everyone will have had it.
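That arithmetic can be checked directly. This deliberately simplified sketch just doubles daily new cases, ignoring the shrinking pool of susceptibles that the full model tracks (and which is what eventually stops the spread in reality).

```python
population = 1_000_000
new_cases, day = 1, 0
while new_cases < population:
    day += 1
    new_cases *= 2           # each case infects two others, then recovers
    if day == 7:
        print(new_cases)     # 128 cases at the end of the first week
print(day)                   # 20: the whole city reached in under three weeks
```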
Modern versions of the model look more closely at how people intermingle in reality, and use big data to map infection patterns. This is the basis for the search for so-called “super spreaders”.
In practice, predicting the course of any particular virus is a challenge. My sympathies lie with those who have this task. But a 100-year-old mathematical model tells us that the very large numbers we read about could easily become reality.
As published in City AM Wednesday 11th March 2020
Image: Monitoring Passengers by China News Service via Wikimedia is licensed for use CC BY 3.0
The reverberations around the resignation of Sir Philip Rutnam, the top civil servant at the Home Office, continue.
Priti Patel, the home secretary, is receiving a barrage of abuse.
Labour’s John McDonnell has pronounced that he cannot see how Patel could carry on. He raised the possibility that she might be in some way “suspended”.
It seems to have slipped the shadow chancellor’s mind that he himself was keen to carry out a purge of economists in the Treasury and Bank of England if Labour had won the election. The officials who remained would have required “re-educating”.
But right now it doesn’t really matter what Labour thinks. The salient point about the criticism of Patel is that it is coming from the serried ranks of the metropolitan liberal elite. The Guardian newspaper has been in a total lather. The BBC’s coverage has hardly been impartial.
This group see Rutnam as one of their own: a professional expert, conscientiously going about his business. Naturally, they regard his actions as sound, carried out in the interests of the nation as a whole.
An important part of economic theory takes a completely different view of the motivations of bureaucrats. James Buchanan was awarded the Nobel Prize in 1986 for his work in developing what is called “public choice theory”.
Public choice economics rejects the idea that bureaucrats act in a disinterested, objective way. They are no less selfish than the rest of us. Their primary motivation is not to serve the public, it is to further the interests both of themselves as individuals and of the bureaucracy as a whole.
The Home Office itself provides many examples which support this view. When tasked by ministers with deporting those without correct documentation, the bureaucrats did not try to track down, say, Albanian drug dealers. Instead, they minimised effort to themselves, identifying people who had lived here for decades but whose paperwork was not quite in order. The result was the Windrush scandal.
Some 20 years ago, I was involved in a project on crime for Charles Clarke when he was home secretary under Tony Blair. I discovered an influential group in the Home Office who believed that the number of criminal offences actually carried out was more or less constant from year to year.
It may have appeared from the data that there had been a huge increase in crime since the Second World War. On the contrary (according to these officials), this merely reflected changes in the propensity to report crimes. The actual level of crime, they purported, was the same in 2000 as it had been in 1950.
I was impressed by the brilliance of this hypothesis. It meant that no bureaucrat could ever be criticised for failing to control crime.
Of course, the view that people always act purely in their own self-interest is rarely completely true. There will be a mix of motivations at play. But in clashes between politicians and the bureaucracy, it is essential for democracy that the former win.
As published in City AM Wednesday 4th March 2020
Image: Home Office by Steph Gray via Wikimedia is licensed for use CC BY-SA 2.0
At first sight, long-term swings in individual seats in Australian elections are a definite niche interest, one for the real trainspotter.
But during a visit to Sydney University’s Complex Systems Institute, I noticed a fascinating piece in The Australian newspaper.
The Australian Labor Party had a good result in the 2007 federal elections, and a relatively poor one in those of 2019. Nationally, there was a swing of seven per cent from Labor to the centre-right Coalition.
In eight constituencies in Queensland — equivalent to some 30 seats in the House of Commons — the average swing away from Labor was over 16 per cent. All were held by Labor in 2007. All were won by the Coalition in 2019.
They had one key thing in common: in each constituency, coal mining or commodity extraction was an important part of the local economy.
We see exactly the same phenomenon across the west as a whole. Substantial groups of voters are very reluctant to pay the price now for policies which might yield benefits in terms of the climate a decade or more into the future.
Two decades ago, long before climate change became a fashionable topic, lorry drivers brought the UK to a virtual standstill in a protest against rising fuel prices. Much more recently, President Emmanuel Macron saw the streets of French cities in flames. The initial trigger for the so-called “gilets jaunes” movement was also a proposed fuel tax increase.
So how can changes be made on a sufficient scale to address climate change in the light of this lack of democratic consent?
The liberal left in various countries sets great store by so-called citizens’ assemblies. A small group of citizens, reflecting the socio-demographic characteristics of the population, is selected at random to address a particular policy challenge. One, set up by parliament itself, has been meeting in Britain on the topic of climate change. The 110 assembly members are encouraged to consider the topic in depth.
The liberal hope is that, guided by experts, ordinary people will come up with policy recommendations congenial to them. But this is only half the story. To be more convincing, the assembly members need to be made to live for a year or so experiencing the consequences of the policies they devise.
Anyone can advocate, say, an immediate ban on petrol and diesel vehicles in the abstract. But if you have to give up your own car here and now, you may come to an entirely different conclusion about what should be done.
But there is a silver lining. In Australia this year, solar energy costs are falling below those of coal and gas for the first time. A decade ago, solar was over five times more expensive. As a result, households are installing solar panels in huge numbers. In the deserts, companies are building massive solar farms.
Hair shirts imposed on electorates by central planners will not work, and will instead spark democratic discontent. Ingenuity and innovation create the opportunity for a solution arising out of free choice.
As published in City AM Wednesday 26th February 2020
Image: Australia protest via Wikimedia by Takver licensed for use CC BY-SA 2.0
In the days of the old Soviet Union, so-called Kremlinologists would pore over every utterance of the Politburo, every sentence in Pravda, to try to work out what was really going on.
Sajid Javid’s defenestration from the Treasury has led to an upsurge in similar types of intellectual effort here. What was it really all about?
The Treasury, as the guardian of the public finances, has had a conservative line on spending since time immemorial. Keynes railed against it way back in the 1930s.
But the more orthodox thinkers within the Treasury suffered a substantial defeat last September, when Javid himself introduced the autumn Spending Review. The increase planned in 2020/21 for what the Treasury calls “day-to-day departmental spending” (which covers the running costs of public services) was the highest for 15 years, at 4.1 per cent in real terms.
True, because the squeeze on benefits was maintained, the planned increase in total public spending was only 2.0 per cent after inflation. But even this meant that public spending was envisaged to grow faster than the economy as a whole.
If that was a win for the government, the Treasury then won a big victory by slipping its chosen man in as governor of the Bank of England.
Despite the smokescreen of alternative names put out during the long process of selection, it is unlikely that the Treasury ever had any intention of allowing anyone other than Andrew Bailey, a career public servant, into the Threadneedle Street job. More innovative thinkers such as Andy Haldane, chief economist at the Bank, and Gerard Lyons, a distinguished Brexiteer economist, lost out.
But this particular game seems to be the best of three — and it looks like the Treasury will lose.
Political economy demands that spending in the newly blue north of England not only be increased, but be seen to increase. Javid subsequently proved somewhat reluctant to open the spending taps too far, and has now been replaced by Rishi Sunak, a close ally of the Prime Minister, with his team expected to work more closely with Number 10.
In principle, a substantial relaxation of the controls on infrastructure spending seems justified. Interest rates are now so low that the British government can borrow for 20 or 30 years at a one per cent rate.
Even taking a reasonably pessimistic view, this is lower than the sustainable annual growth rate of the real economy in the longer term. So extra spending can indeed be paid for by the proceeds of growth.
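A hypothetical sketch of the debt arithmetic makes the point. With the primary budget balanced, existing debt compounds at the interest rate while GDP compounds at the growth rate, so the ratio between them shrinks whenever growth exceeds the borrowing cost. The rate and growth figures used here are illustrative assumptions.

```python
r = 0.01      # long-term government borrowing cost, per the gilt rates above
g = 0.035     # assumed sustainable growth rate of the economy
ratio = 0.85  # illustrative starting debt-to-GDP ratio

for year in range(30):
    ratio *= (1 + r) / (1 + g)  # debt grows at r, GDP at g

print(f"debt/GDP after 30 years: {ratio:.0%}")
```

Under these assumptions the ratio drifts down by roughly half over three decades — the “proceeds of growth” doing the repaying. The conclusion reverses, of course, if market rates ever climb above the growth rate.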
Moreover, after sharp rises during the financial crisis of the late 2000s, the ratio of public sector debt relative to GDP was stabilised and has now been flat at some 85 per cent for the past six years. True, there must be some risk that the markets will eventually lose confidence and interest rates rise as a result. But here, it is perception and narrative which matters at least as much as objective economic statistics.
Boris Johnson’s election victory not only moved the nation on from Brexit wrangling, but inflicted a punishing defeat on the forces of socialism personified by Jeremy Corbyn. In image terms in the markets, the government is riding high.
A big increase in spending will be greeted with equanimity. The Treasury has lost this one.
As published in City AM Wednesday 19th February 2020
Image: HM Treasury via Wikimedia licensed for use CC BY-SA 2.0
Last week, the entire world witnessed the shambles of the vote counting in the Iowa Democratic caucus.
It should have been straightforward — but adding all the votes up in a consistent way took a whole week.
The list of errors is as long as your arm. In some precincts, for example, the total number of votes reported exceeded the number of eligible voters. But the main source of the problem seems to have been a mobile phone app that the Iowa Democratic Party used to collect results from caucus sites.
A system was in place which had stood the test of time through many election campaigns. But it was old-fashioned. It needed to be “modernised”. Hence the app.
Many layers of modern management share this obsession with technology for technology’s sake.
Financial institutions, for example, are fixated on apps: apps to manage day-to-day expenses, apps to help manage investments, apps to boost savings for your old age.
No doubt some of these have their uses. But the idea that apps can make major changes to the behaviour of individuals, and therefore drive productivity and prosperity, is something of a pipe dream.
As a further example of technology’s counter-productive impact, consider an elderly relative of mine who is in a care home. On entry, you used to sign the visitors’ book and enter the time of arrival; on exit, you put the time you were leaving — with a pen. A new computer system has been installed. It takes several times longer to enter these details. As far as I can judge, virtually no visitors use it.
Nobel laureate Bob Solow famously pronounced 30 years ago that “you can see the computer age everywhere but in the productivity statistics”.
In the 1980s, the decade to which Solow was basically referring, personal computers and fax machines were the cutting edge of new technology. It seems like the Stone Age compared to the technology available to us now. Yet Solow’s problem remains.
Productivity growth was very low during the most recent decade, despite the massive advances made in technology. There are clearly many reasons for this. But one of them is, quite simply, that technology is often being introduced in situations where it is quite unnecessary. As a result, people become less rather than more productive.
More generally, new technology is proliferating in areas where the potential productivity gains are not that high. For example, it is convenient when buying a round of drinks to be able to tap your card rather than delve into your pockets for loose change, but it is unlikely that this innovation enables more drinks to be sold in any given pub or bar.
Similarly, self-service checkouts in supermarkets help avoid standing in long queues, but these rely on customers being willing to supply their own labour for free, rather than requiring paid staff to scan the goods for them.
The Iowa app incident is a source of amusement, but it may be telling us something more profound about why productivity growth remains low. New technology is being applied when it is simply not needed.
As published in City AM Wednesday 12th February 2020
Image: Iowa State Line via Flickr by Tony Webster licensed for use CC BY 2.0
Universities and their students are seldom out of the news. Ever since Tony Blair pledged to send 50 per cent of 18–21 year olds to university, they have been a persistent topic in political economy.
University towns now notoriously favour Labour at the ballot box, often an island of red in a surrounding sea of blue. One of the few rational policies of Jeremy Corbyn in the last election was the commitment to write off student debt. It was an excellent way for him to gather votes.
A key argument put forward for increasing student numbers was the existence of the “graduate premium”. Over their working lives, graduates earned more than non-graduates, so the expansion of universities would be positive for both individuals and the economy.
Quite a few commentators at the time argued that this rise in graduate supply was unlikely to be met by a corresponding surge in demand. The graduate premium would therefore not persist, certainly not for the lower ranked universities. Why pay extra for something which is in excess supply?
This is exactly how it has turned out. Many graduates end up in mundane, low-paying jobs. The Office for National Statistics shows that 31 per cent of graduates have more education than is required for the work they are doing.
And what about the 50 per cent of the age group who do not go to university? It is ironic that the left-wing parties shed tears for indebted university students, who in general have more privileged backgrounds. They have little to say about the rest.
All school leavers at 16 must now stay in some form of education until 18. Most attend a further education (FE) college, often combining this with a part-time job.
A lot is heard not just about universities but about the impact of “austerity” on schools. But within the education sector, it is the FE colleges which have experienced the greatest cutbacks since 2010.
Here is a great opportunity for the government to both increase the level of human capital in the economy and be seen to be delivering for the “left behind”. There are already rumours that the chancellor is planning a big increase in spending on FE in the March Budget.
Investment in university students has gone well past the point of diminishing returns. In contrast, the neglect of the FE sector offers the chance of getting a real return on increased spending.
The obvious beneficiaries will be the young people who do not go to university. With extra skills, they can earn an “FE premium”. It may be modest, but being able to earn even £10 an hour instead of the minimum wage makes a big difference to the individual concerned.
There could also be major benefits to the economy as a whole. The UK is notorious for the so-called “long tail” of productivity. Many SMEs have low productivity levels, so increasing the quality of the labour available gives them the chance to address this problem.
Big spending increases on FE colleges, and less attention to universities, are win-win for the government.
As published in City AM Wednesday 5th February 2020
Image: Graduates via Flickr by Sakeeb Sabakka licensed for use CC BY 2.0
It is a truth which has rapidly become universally acknowledged (to borrow Jane Austen’s famous phrase) that the government must deliver for its new supporters in the regions.
This is a massive challenge. The gap in income per head, for example, between London and other areas of the country is obviously large. But the firm trend has been for this difference to widen, rather than narrow.
Between 1997 and 2017, income per head, after allowing for inflation, rose by around 17 per cent in both the north east and the north west — just under one per cent a year. In Wales, another area where the Conservatives made big gains, the overall increase was a mere 11 per cent. In contrast, in London it rose by 42 per cent over these two decades. In inner London, the increase was no less than 56 per cent.
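The per-year figures follow from simple compounding, which is worth making explicit:

```python
def annual_rate(total_growth: float, years: int) -> float:
    """Convert cumulative growth over a period into a compound annual rate."""
    return (1 + total_growth) ** (1 / years) - 1

print(f"north: {annual_rate(0.17, 20):.2%} a year")   # just under one per cent
print(f"London: {annual_rate(0.42, 20):.2%} a year")
```

London’s annual growth rate comes out at more than double that of the north east and north west — a gap that compounds year after year.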
But that’s only half the picture. To fully understand what is going on, we need to look within the regions themselves. Manchester provides a perfect illustration.
In the mid-1990s, within half a mile of the city’s main rail stations was a bomb site. Not a site created by a contemporary IRA outrage, but by the Germans in the Second World War. In the subsequent 50 years, no one had thought it worthwhile to develop a piece of land in the centre of a major English city.
How times have changed. The total resident population of the city centre is now 80,000. Manchester has been totally transformed. The skyline has altered just as dramatically as that of central London.
The economic structure of central Manchester has come to resemble those of the inner London boroughs.
The Office for National Statistics provides detailed data on the numbers employed in each industry for every UK local authority. Turning these into percentages, I used some fairly straightforward maths to work out which local authorities have an industrial structure most similar to that of Manchester.
The answer is urban areas like Camden and Islington in London, and other major regional cities such as Bristol and Leeds. None of the nine other boroughs which make up the Greater Manchester region look remotely like the city centre itself.
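The “fairly straightforward maths” can be sketched as follows: each local authority becomes a vector of employment shares by industry, and similarity is the distance between vectors. The shares below are invented purely to illustrate the method — not real ONS figures, which cover the full industry breakdown.

```python
import math

# Invented employment shares (finance/professional, retail, manufacturing),
# purely for illustration -- not real ONS data.
shares = {
    "Manchester": (0.45, 0.30, 0.25),
    "Camden":     (0.50, 0.32, 0.18),
    "Oldham":     (0.15, 0.35, 0.50),
}

def distance(a, b):
    """Euclidean distance between two employment-share vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

target = shares["Manchester"]
ranked = sorted((distance(v, target), name)
                for name, v in shares.items() if name != "Manchester")
print(ranked[0][1])  # Camden: the closest match under these invented shares
```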
The same is true of other English cities such as Newcastle and Leeds. The types of jobs on offer are quite different from those in the surrounding hinterlands.
The cities voted for Labour with massive majorities. It was in their satellite areas where the Conservatives triumphed.
The story looks the same at whichever level of geographic aggregation we choose. Comparing the regions of the UK, London is much richer than the rest. Within individual regions, the main city is much richer than the rest.
It is easy to see why this happens. Once an area starts to become more attractive for business, other firms increasingly see it as a place to locate. Skilled people want to both work and live there. A virtuous circle is created, and the area pulls away from its surroundings.
If the government really wants to level up the country, it will need to be really imaginative to avoid falling into this trap.
As published in City AM Wednesday 29th January 2020
Image: Manchester via Flickr by Zuzanna Neziri licensed for use CC BY 2.0
Last week, health secretary Matt Hancock signalled an important change of strategy.
Accident and Emergency Departments have a target that 95 per cent of patients should be admitted, transferred or discharged within four hours. Hancock suggested that the target will be scrapped. Instead, wait times will be determined by clinical need.
Cue predictable hyperbolic outrage. The president of the Royal College of Emergency Medicine, for example, claimed that this change would have a “near-catastrophic impact on patient safety”.
The NHS is missing the target by a long chalk. In December, the proportion of patients dealt with within four hours slipped to under 70 per cent.
A key reason seems to be increased demand for A&E services. Since the Conservatives came to power in 2010, admissions have increased by almost 25 per cent.
It is inherently implausible to imagine that cases of genuine emergencies have risen by this amount. Road casualties, for example, far from increasing, have actually fallen by 24 per cent since 2010.
There is much anecdotal evidence to suggest that people are bypassing GP surgeries and turning up at A&E with trivial complaints. Perhaps GPs are so oversubscribed that people who cannot get appointments go to hospital instead, or maybe limited out-of-hours care means that patients feel they have little choice if they fall ill at weekends.
But regardless, the lengthening waits indicate excess demand for A&E care. Some form of rationing is necessary to allocate resources and to decide who gets treated.
There are two ways to ration. One is by price — whoever is willing to pay the most gets dealt with first. The other is by queue.
Even the most hardline free marketeer would surely balk at the idea of making people involved in genuine accidents wave their credit cards. So queue it has to be. And in such circumstances, it is entirely appropriate that decisions on who to treat first should be made on clinical grounds rather than a purely arbitrary target on the length of wait.
This controversy demonstrates the wider problem with setting targets: sooner or later (and usually sooner), people work out how to game them.
In A&E departments, once a patient has waited more than four hours, they have zero priority. The hospital incurs no further penalty if the wait is 14 hours rather than four hours and a single minute.
We see this in other sectors too. Schools can, for example, meet exam targets by getting rid of weaker students — hardly what the target was designed to achieve.
And the Windrush scandal had its origins in the Home Office targets for the numbers to be deported. Officials could have tried to track down members of eastern European criminal gangs. Instead, they focused on the seemingly easier task of deporting elderly people who had lived in Britain for decades. They worked out how to meet the targets by minimising their effort.
Examples of gaming the system proliferate. Hancock is to be applauded for taking the first step to dismantle the culture of bureaucratic, counter-productive targets.