Pizza Hut is the latest addition to the list of companies grovelling in response to criticism on social media.
The restaurant chain tweeted an apology for running a promotion in the Sun newspaper.
A few weeks ago, Paperchase said that it would not place any more marketing campaigns with the Daily Mail after receiving “hundreds” of complaints.
In the public sphere, last year Greater Manchester Police staged a simulated terror attack in the massive Trafford Park retail complex. The carnage began, realistically, with the cry “Allahu Akbar”. Following a Twitter storm, the police felt forced to apologise.
Boris Johnson, in his inimitable style, has condemned Pizza Hut and Paperchase for being “cowardly”. The campaign against them was run by a small group of hard-left activists calling themselves Stop Funding Hate.
But examples such as these raise a more important question. Which century is British management living in?
After being attacked by critics on social media, many outfits respond with blind panic. A famous Monty Python sketch depicts the novel Wuthering Heights, not in words but in semaphore, a nineteenth-century technology. Many senior managers seem to remain stuck at this level of communications technology.
Scientific knowledge of how things spread on social media such as Twitter has grown enormously in the last few years. Yet swathes of top management appear to be completely unaware of this work.
A high-powered study published last year by the physicists Guido Caldarelli and Gene Stanley, editor of the top statistical physics journal Physica A, confirmed that social media users typically form communities of interest which foster confirmation bias, segregation, and polarisation.
In other words, in general people on social media are preaching to the already converted.
With Rickard Nyman, a computer science colleague at UCL, I conducted a real-time analysis of the tweets during the Brexit campaign. Modern algorithms reveal as clear as day that there were two communities, with little connection to each other. One group was talking about what they would see as the “grown-up” themes of employment, the economy, trade and such like. The other, in essence, just didn’t like foreigners all that much.
The key moment was when, with just over two weeks to go, immigration began to get traction as a theme amongst the “Remain” Twitter community. Otherwise, the two groups were just reinforcing existing opinions and prejudices.
More is known. A significant proportion of tweets do not get retweeted at all. And it is the act of retweeting which shows that the recipient is paying attention.
Simply being a follower and reading a tweet involves effectively zero effort. The number of followers is a very weak indicator of a person’s influence.
Most tweets which express strong emotions essentially just fade away. It is the more balanced ones which have a greater chance of getting traction across the network.
The most depressing thing about the reactions of companies and public bodies to social media attacks is not, as Boris would have it, their cowardice. It is that they seem to show very little understanding of modern technology.
As published in City AM Wednesday 13th December 2017
Image: Pizza Hut by Stephen McKay is licensed under CC BY 2.0
The American economy continues to power ahead. The widely respected and independent Congressional Budget Office (CBO) reckons that the actual level of GDP in the US in 2017 is finally back at the level of potential output.
The potential level of GDP is the amount of output which would be produced if there were no spare capacity in the economy. In a service and internet-oriented economy, any estimates of it are fraught with difficulties.
The maximum output of a car plant or steel mill is reasonably straightforward to work out, at least in the short term. But it is less obvious what the constraints are on any web-related business.
Still, the concept of potential output is taken seriously by policy-makers. And the CBO does a better job than most at guessing what it is.
On their figures, the last time actual and potential GDP were in balance was in the year immediately prior to the crisis, 2007, which at least makes sense.
In 2009, at the depth of the recession, the CBO calculates that the gap between the two was six per cent. That may not sound a lot, but in money terms it represents more than one trillion dollars.
American GDP is now almost 15 per cent more than it was in 2007, and 20 per cent more than in 2009.
Along with this, employment has surged, with 17.2m net new jobs being created from the low point of December 2009. As in the UK, employment is at record highs.
The increase in employment is entirely due to the private sector, where it has grown by 17.3m.
In contrast, the numbers employed by the government, whether federal or state, have been cut by 100,000.
The same applies on the output side. Again, it is the private sector which is driving the recovery.
Compared to the bottom of the recession in 2009, and after stripping out inflation, public sector spending is down by $200bn.
In contrast, private sector investment has risen more than 10 times this amount – an increase of $2.1 trillion.
So, despite strict restraints on the public sector, the American economy has recovered well from the crisis – indeed, better than the best performing main European economies, Germany and the UK.
The evidence has been there all along, as soon as the US began to pull out of the recession in the early part of this decade. It is evidence which seems to be studiously ignored by the strident voices in British academic circles calling for an end to “austerity”.
Of course, there have been tax cuts, and these stimulate the private sector. But the risk over the longer term is that growth will not be rapid enough to bring in enough revenue to curb the growth in public sector debt.
Indeed, the CBO sees the potential rise in this debt as an important threat to the long-term growth of America. Higher public borrowing, in its view, reduces the private sector investment which is needed for growth.
As published in City AM Wednesday 6th December 2017
There has been a great deal of crowing in metropolitan liberal circles over the report of the Office for Budget Responsibility (OBR), published with the Budget last week.
The OBR revised downwards its projections for GDP growth for each of the next five years. Annual average growth to 2022 is predicted to be just 1.4 per cent a year.
The OBR believes that the UK is experiencing a “negative supply shock”.
But forecasts are merely forecasts. They do not constitute scientific evidence at all. This is especially true of economic predictions.
One section of the OBR’s report which relates to facts rather than views about the future has been seized on. This is that growth in the euro area during 2017 has been both stronger than it was in 2016, and stronger than in the UK. This is represented as showing that the EU is dynamic, and the UK is fading away.
But the experience of just a few months’ data – we only have official data to, at the very latest, the end of September – needs to be put into context.
Since 2007, the year immediately before the financial crisis, GDP in the UK has grown by just over 10 per cent.
This does indeed represent a decade of growth which, by historical standards, is low.
But the figure is very similar in Germany. In France, output is only around six per cent higher than it was 10 years ago. In Spain, GDP has risen by five per cent.
In Italy, however, the economy has shrunk by some five per cent since 2007. The Italians have had a decade not just of low growth, but of negative growth. They have gone backwards.
Despite over 40 years of EU membership, the UK economy remains far more synchronised with the US in terms of the year-on-year fluctuations of the business cycle.
So over this period, we see some years when economies in the EU have grown faster than the UK, and some years when they have grown more slowly. This is precisely what we would expect when the cycles are not coordinated.
The OBR itself is fully aware of the huge potential for error in economic forecasts.
Indeed, the report illustrates the uncertainty around its five-year projection of 1.4 per cent annual GDP growth in a so-called “fan chart”. This shows the potential range around the prediction, based on past errors made in official forecasts.
At worst, growth could be negative, with an annual average fall of one per cent. But at best, we could have a sustained boom with growth of over four per cent a year.
Based on how wrong past forecasts have been, the next five years could see a cumulative fall in GDP of over five per cent, or a cumulative rise of over 20 per cent.
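The arithmetic behind these cumulative figures is simple compounding of the annual growth rates at the edges of the fan chart. A minimal sketch, using only the annual rates quoted above:

```python
# Compound the annual growth rates from the OBR fan chart over five years.
# A back-of-envelope check of the cumulative figures quoted in the article.

def cumulative_growth(annual_pct: float, years: int = 5) -> float:
    """Compound an annual percentage growth rate over several years."""
    return ((1 + annual_pct / 100) ** years - 1) * 100

print(f"Central 1.4% a year: {cumulative_growth(1.4):+.1f}% over five years")
print(f"Worst case, -1% a year: {cumulative_growth(-1.0):+.1f}%")
print(f"Best case, +4% a year: {cumulative_growth(4.0):+.1f}%")
```

At four per cent a year, compounding turns the annual rate into a cumulative rise of well over 20 per cent, which is why the range around the central projection is so wide.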
The OBR’s forecasts should be taken not just with a pinch of salt, but with the contents of an entire mine.
As published in City AM Wednesday 29th November 2017
Are members of the Labour Party frontbench experts in doublethink? The concept was invented by George Orwell for his novel 1984, written in the 1940s as a critique of the Soviet Union.
Masters of doublethink can hold, for purposes of political expediency, two opposing opinions at the same time, one of which might be complete nonsense.
The Leader himself set a good example during the general election campaign when he promised to abolish all outstanding student debt. Jeremy Corbyn rather backtracked on this after the votes had been cast, when it was pointed out to him that this would cost around £100 billion – over £1,500 for every man, woman and child in the UK.
His close ally, the Shadow Chancellor, followed this up on Sunday. Asked about the cost of Labour’s re-nationalisation plans, John McDonnell said that “you don’t need a number because you swap shares for government bonds”.
Independent experts put a provisional costing of around £500 billion on McDonnell’s plans. This amounts to over 20 per cent of GDP.
Imagine you want to buy a house for £10 million but have no savings. And imagine that you somehow persuade someone to lend you the money. True, you have acquired an asset worth £10 million and have a debt of the same size. Your net wealth position is unchanged.
But you face the problem of paying the interest on the loan, the terms of which may be very onerous, reflecting your credit risk.
McDonnell argues that nationalised industries will make a profit, which will take care of the interest payments. Stretching credibility even further, Labour argues that because the interest on government bonds is currently only just over 1 per cent, the payments would not amount to much.
Yet it is obvious that the markets might want a much higher rate of interest to finance the plans of a Chancellor who wanted to add £500 billion to public debt.
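The stakes in that interest-rate argument can be sketched in one line of arithmetic. The £500 billion figure and the one per cent gilt yield come from the article; the higher rates are purely illustrative of what a nervous market might demand:

```python
# Rough annual interest bill on £500bn of extra government bonds at
# various yields. Only the 1 per cent figure comes from the article;
# the higher rates are illustrative assumptions.

debt_bn = 500  # £bn of new government bonds issued to fund nationalisation

for rate_pct in (1.0, 2.5, 5.0):
    interest_bn = debt_bn * rate_pct / 100
    print(f"At {rate_pct}%: roughly £{interest_bn:.0f}bn a year in interest")
```

Even a modest rise in the rate demanded by the markets multiplies the annual bill accordingly.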
Emily Thornberry, the Shadow Foreign Secretary, has also got in on the doublethink act. Challenged on TV to name any country where Labour’s policies of financing spending by issuing debt had worked, she finally came up with Germany and Sweden.
The Bank for International Settlements compiles data on the ratio of government debt to GDP. There are several ways to do this, but on its preferred approach the ratio in Germany is currently 73 per cent and in Sweden 44 per cent. In the UK it is already 116 per cent.
Much more plausible comparators are Italy and Greece, where the ratios are 150 and 173 per cent, figures which McDonnell would reach easily. In Italy, GDP is still 5 per cent below its peak level in 2007, a whole decade ago. And in Greece it is 26 per cent lower.
Are the top Corbynites cynical exponents of doublethink? Less charitable people might say they are just plain dim. As so often in economics, the evidence so far does not enable us to decide between the two hypotheses. But, either way, they are bad news.
As published in City AM Wednesday 22nd November 2017
Do Tube strikes make Londoners better off?
At first sight, the question is simply absurd. The answer is surely “no”.
But a paper in the Quarterly Journal of Economics comes to the opposite conclusion. Cambridge economist Shaun Larcom and his colleagues analysed the two-day strike of February 2014.
They obtained detailed travel information on nearly 100,000 commuters for days before, during, and after the strike.
A key feature of the strike is that nearly half the stations remained open. So most commuters could experiment with routes different to the ones they normally use.
The project may seem barking mad. But it investigates an important issue in economic theory.
Richard Thaler’s recent Nobel Prize for behavioural economics received a lot of publicity. Behavioural economics looks for examples of people making decisions in ways which deviate from those predicted by the rational choice model of economics.
A criticism from the mainstream is that deviations might indeed be observed at a point in time. But over time, they will disappear as people learn to be rational and make the best decision.
The Tube network remains the same for long periods of time. Commuters have many opportunities to learn about it. So almost all of them should use the quickest possible route to work. If someone has just moved jobs or homes, there may be a short period of adjustment. But everyone else ought to have learned the best way to travel.
Yet Larcom and his colleagues find that a significant fraction of London commuters fail to find their optimal routes. They come to this conclusion by comparing the journeys of the people in their data set before and after the strike.
Of course, for many journeys the best route is trivially easy to discover. If you live in Richmond and work in Hammersmith, there is only the District Line. Other journeys have more options. Larcom notes that there are 13 potential ways to travel between Waterloo and King’s Cross.
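The commuter’s problem is, formally, a shortest-path problem, the kind a routing algorithm solves instantly. A toy sketch using Dijkstra’s algorithm on an invented miniature network (the stations are real names, but the journey times and connections are made up for illustration):

```python
import heapq

# A toy version of the commuter's problem: find the quickest route on a
# small network. Journey times and connections are invented for the
# example; the real Tube network is far larger.

def quickest_route(graph, start, end):
    """Dijkstra's algorithm: returns (total_minutes, route) or None."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        minutes, station, route = heapq.heappop(queue)
        if station == end:
            return minutes, route
        if station in seen:
            continue
        seen.add(station)
        for neighbour, cost in graph.get(station, {}).items():
            if neighbour not in seen:
                heapq.heappush(queue, (minutes + cost, neighbour, route + [neighbour]))
    return None

network = {
    "Waterloo": {"Bank": 5, "Westminster": 2},
    "Westminster": {"Waterloo": 2, "Kings Cross": 8},
    "Bank": {"Waterloo": 5, "Kings Cross": 6},
    "Kings Cross": {"Westminster": 8, "Bank": 6},
}

print(quickest_route(network, "Waterloo", "Kings Cross"))
```

The interest of the study is precisely that human commuters, unlike the algorithm, often settle into a route which is not the quickest one available.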
The authors point out that many decisions faced by consumers are more complex and less repetitive than the commuter problem they analyse. So, in an excellent example of jargon, they state that “our estimate of suboptimal habits may be a lower bound to the problem in other contexts”.
In other words, systematic and persistent deviations from rational choice are an important feature of the real world.
Economists of course like to value everything, and there is a standard way of valuing time. The academics estimate that the time gains subsequently achieved by those who switched routes outweighed the time losses incurred by everyone else during the strike. So Londoners were better off as a result of the strike.
Bizarre though it may seem, the article is a good example of how economics is becoming much more empirical when thinking about individuals’ behaviour and less reliant on pure theory.
As published in City AM Wednesday 15th November 2017
Image: Underground by Elliott Brown is licensed under CC BY 2.0
The so-called “productivity puzzle” just does not go away.
The October employment figures released by the Office for National Statistics (ONS) bring it into focus.
The number of people in work rose to a new record high of 32.1m, with an increase of around one per cent compared to a year ago.
Total output, measured by GDP, continues to rise, but modestly. We do not yet have official estimates for the year to October, but GDP seems to be up by some 1.5 per cent.
Productivity is defined as output per worker, so it is only around 0.5 per cent higher than a year ago. No scientific consensus has yet emerged to explain why productivity growth continues to be so low.
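The arithmetic is a straightforward approximation: for small rates, the growth of output per worker is roughly output growth minus employment growth. A quick check, using the article’s figures:

```python
# Productivity is output per worker, so its growth rate is approximately
# GDP growth minus employment growth. Figures are the article's estimates.

gdp_growth_pct = 1.5         # annual GDP growth
employment_growth_pct = 1.0  # annual employment growth

approx = gdp_growth_pct - employment_growth_pct
exact = ((1 + gdp_growth_pct / 100) / (1 + employment_growth_pct / 100) - 1) * 100

print(f"Approximate productivity growth: {approx:.2f}%")
print(f"Exact productivity growth: {exact:.2f}%")
```

At growth rates this low, the approximation and the exact ratio agree to within a hundredth of a percentage point.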
But there is increasing evidence that the rate of growth of output is being systematically underestimated.
The economy cannot be put in a set of scales and measured. Its size has to be estimated, and the ONS uses a wide variety of methods to do this.
The fundamental problem is that the foundations for estimating GDP were built in the 1930s and 1940s, when the economy was dominated far more by manufacturing. Measuring how many things have been produced is inherently easier than measuring services.
The ONS does not stand still, and tries to take account of the massive changes in the economy which have taken place. But the rise of the internet economy brings entirely new problems to solve.
A key one is what the futurologist Alvin Toffler many years ago called the “prosumer” sector.
Traditionally, products are developed and sold by companies, and consumed by, well, consumers.
In the prosumer sector, consumers themselves participate in the production and development of products and services.
A good example is the statistical package R. This is open source, and freely and readily downloadable by anyone.
In recent years, R has become the package of choice for young scientists in a wide range of disciplines around the world. They both use it, and contribute to its development by uploading their own algorithms.
A huge range of routines can be downloaded. Its graphics features are outstanding. Software built on it is appearing which has the potential to take on established commercial products such as Word and PowerPoint.
It has become a very valuable tool for scientific research, using “valuable” in the everyday sense of the word. But it is run by a small not-for-profit foundation, so in ONS terms its value is close to zero.
The problem is that R is what economic theory describes as a “public good”.
This jargon phrase applies to anything which anyone can consume, and where one person’s use does not reduce what is left for everyone else. No matter how many people use R, it is always available for the next person.
For most goods and services, this is just not true. When I put my swimming towel on the pool lounger, it is no longer available to you.
The prosumer sector creates a lot of output. But economics has not yet solved the question of how to value public goods.
As published in City AM Wednesday 8th November 2017
Image: Vintage Scales by Public Domain Pictures is licensed under CC0
Karl Marx famously wrote: “History repeats itself, first as tragedy, second as farce”.
The phrase might well have been coined with Catalonia in mind.
Generalissimo Franco began a military coup against the elected Spanish government in the Canary Islands in 1936. The battle spread across Spain, and Catalonia was the last redoubt of the Republic to fall, in 1939. Franco took brutal revenge. Tens of thousands were imprisoned or executed, many of these within living memory. The Catalan language was banned.
Now the Catalans have proclaimed independence and Spain has imposed direct rule.
We do not of course know how events will pan out this time around. Things may turn serious. Yet there is certainly a slapstick element to having two different sets of police on the streets, and two different groups of civil servants, each taking different sets of orders.
In both the late 1930s and now, economics has a potentially decisive role in the eventual outcome.
There are many reasons for Franco’s victory. An important one is that the Republican side could just not obtain enough modern armaments. Catalonia even then was the richest part of Spain, but the arms the Catalans needed were made abroad, and, as the civil war progressed, increasingly they could not afford them.
A leading element in the Catalan government was the Workers’ Party of Marxist Unification (POUM in Spanish). POUM was inspired by Leon Trotsky, in much the same way as Jeremy Corbyn and his close acolytes appear to be today.
Sympathy for the historical role of POUM goes a long way to explaining why Corbynistas are enthusiastic supporters of the contemporary Catalans.
But POUM made a catastrophic mistake: initiating a policy of expropriating private property. One effect was a major loss of confidence, and the collapse of the Republican peseta on the foreign exchanges, meaning that all imports, not just weapons, became punitively expensive.
Another generalissimo who was a political contemporary of Franco, one Joseph Stalin, described Trotskyists as a “gang of wreckers and diversionists”. In this, at least, he was surely correct.
This time, the Catalans are desperately trying to create a separate currency, using technology based on digital tokens. Their government is considering an e-residency programme such as the one in Estonia. This provides a way to operate a location-independent business online.
More traditional businesses have already voted with their feet. Almost 1,700 companies, including two big banks (Sabadell and CaixaBank), have switched their headquarters to other parts of Spain since the crisis escalated at the start of October.
The EU has made it clear that an independent Catalonia would not be a member of either the EU or the Eurozone. The latter would probably be a decided advantage, but effective expulsion from the EU could cause serious short term dislocations.
It is not just loyalty to Spain which is leading a lot of Catalans to demonstrate against independence. Whatever the long term outlook, the immediate economic costs would be substantial.
As published in City AM Wednesday 1st November 2017
Image: Demonstration by Màrius Montón via Wikimedia Commons is licensed under CC BY 4.0
Mark Carney, governor of the Bank of England, is getting his retaliation in early.
Faced yet again with the Bank failing to deliver its designated target of a two per cent inflation rate, in a speech last week he suggested that his remit was broader.
“We face a tradeoff between having inflation above target and the need to support, or the desirability of supporting, jobs and activity”, the governor stated.
In other words, he claimed that the Monetary Policy Committee (MPC) of the Bank should be concerned not just with inflation, but with what economists describe as the “real” economy, output and jobs.
The Federal Reserve in the US is explicitly mandated to take account of both inflation and the real economy when it sets interest rates. This is definitely not the case with the Bank of England. When Gordon Brown made it independent in 1997, its remit was unequivocal. It was to ensure that inflation was two per cent a year.
This time round, inflation is above the Bank’s target. The current level of some three per cent may even rise in the short term because the weakness of sterling is pushing up the cost of imports.
But in recent years, inflation has been below the two per cent desired rate, even falling to zero in 2015.
All this time, Bank rate has been essentially flat. The MPC cut it to just 0.5 per cent in March 2009, where it remained until the reduction to 0.25 per cent in August 2016.
To put this into perspective, when the rate fell to 1.5 per cent in January 2009, this was the first time it had been below two per cent since the Bank was created in 1694, well over 300 years ago.
So here is a puzzle for mainstream macroeconomists, whether in central banks or universities. Central banks are meant in theory to be able to control inflation by setting short term interest rates. Inflation has been low since 2009. But at the same time, the Bank rate has been at all-time record lows.
Perhaps more pertinently, inflation has fluctuated from year to year, even though interest rates have to all intents and purposes not changed. It was 4.5 per cent in 2011, and 0.7 per cent in 2016.
In short, inflation seems to lead a life of its own, independently of what the experts on the MPC either say or do.
Inflation really is a naughty boy all round. A central concept in orthodox economic thinking, encapsulated in the quote from Carney above, is that there is a tradeoff between inflation and jobs and output. The faster the economy grows and unemployment falls, the higher inflation will be.
But starting in the early 1990s, for around 15 years across the entire Western world, both inflation and unemployment experienced prolonged falls.
The idea that a central bank can control inflation by adjusting interest rates is shown by the evidence to be absurd.
It is yet another example of the limits to knowledge in orthodox macroeconomics.
As published in City AM Wednesday 25th October 2017
Image: Mark Carney by Bank of England is licensed under CC BY 2.0
The Competition and Markets Authority (CMA) published a report about price comparison sites at the end of last month. They seem simple enough, but these straightforward sites raise interesting issues for economics.
Overall, the CMA was pretty positive about the DCTs – digital comparison tools, to give them their Sunday best name. The conclusion was that “they make it easier for people to shop around, and improve competition – which is a spur to lower prices, higher quality, innovation and efficiency”.
DCTs offer two main benefits. First, they save time and effort for people by making searches and comparisons easier. Second, they make suppliers compete harder to provide lower prices and better choices to consumers. In short, they bring the real world closer to the perfectly informed consumers and perfectly competing firms in the ideal world of economic theory.
But even in this market, there is an issue which goes right to the heart of much of the information which can be accessed through the internet: how do we know whether we can trust it?
The main problem is that the comparison sites typically provide their services free of charge to consumers. They make money by charging a commission to suppliers.
This creates an incentive for a DCT to promote those suppliers which pay it the most commission. An effective way of doing this on the internet is by the order in which the information on the various suppliers is presented.
It is not that DCTs deliberately provide misleading information, or even that a site leaves off a major supplier which does not pay the particular website enough. But they can put those that pay the most at the top of the list.
Notoriously with Google searches, people rarely click through to anything which is not in the top three results of the search.
Allegedly, 60 per cent of the time, only the site which comes at the very top of the results is accessed.
Obviously on a DCT, consumers are likely to look at more. That is the whole point of using the site. But although the CMA does not provide hard data on this, it expresses a clear concern about the ways in which the sites rank the suppliers.
How the DCTs themselves set their prices raises a more general question for economics. The basic rule, described in the textbooks since time immemorial, is to set price equal to marginal cost – in other words, at the point where the revenue from one extra sale equals the cost of producing that extra item.
The standard assumption made in teaching generations of students in their introductory economics courses is that as the level of output increases, marginal cost at first falls but eventually rises.
But on the internet, once the site is set up, the cost of dealing with an extra user is effectively zero. The time-hallowed formula of economics is a recipe for bankruptcy.
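The point can be made with a one-line profit calculation. With a large fixed cost and a marginal cost of zero, pricing at marginal cost locks in a loss equal to the fixed cost, however many users arrive. The numbers here are purely illustrative:

```python
# With a large fixed cost and near-zero marginal cost, pricing at marginal
# cost guarantees a loss equal to the fixed cost, however many users the
# site attracts. All numbers are illustrative.

fixed_cost = 1_000_000   # cost of building and running the site
marginal_cost = 0.0      # cost of serving one extra user

def profit(price: float, users: int) -> float:
    return users * (price - marginal_cost) - fixed_cost

print(profit(price=marginal_cost, users=10_000_000))  # price = MC: loses the fixed cost
print(profit(price=0.50, users=10_000_000))           # a markup is needed to cover it
```

Hence the textbook rule breaks down: some markup over marginal cost, or an entirely different revenue model such as commission, is required.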
The internet is forcing companies to innovate in their pricing strategies. And it is forcing economists to rethink some of their theories.
As published in City AM Wednesday 18th October 2017
Behavioural economics has received the ultimate accolade.
Richard Thaler of the University of Chicago Business School has been awarded the Nobel Prize in economics for his work in this area.
Economics over the past 20 to 30 years has become far more empirical. Leading academic journals do still carry purely theoretical articles, but far fewer than they once did.
This shift towards the empirical takes two forms. Major advances have taken place in the heavy duty statistical theory of analysing large scale databases containing information on individuals and their decisions. This was recognised when James Heckman and Daniel McFadden were awarded the Nobel Prize in 2000.
Behavioural economics is much less technical. In any given situation, the decision which a purely rational person would take is identified. We then look at how people actually behave, and see if there are any deviations from the rational way of doing things.
Perhaps the main finding of behavioural economics is so-called prospect theory, first set out nearly 40 years ago by Daniel Kahneman. In essence, prospect theory says that people dislike making losses more than they like making gains of the same amount.
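The asymmetry can be sketched with the prospect-theory value function. The curvature and loss-aversion parameters below are the estimates Tversky and Kahneman published in 1992, used here only to illustrate the shape of the idea:

```python
# A sketch of the prospect-theory value function: losses loom larger than
# gains of the same size. Parameters are the Tversky-Kahneman (1992)
# estimates, used purely for illustration.

def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain or loss of size x."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

print(value(100), value(-100))
```

On these estimates, a £100 loss feels roughly two and a quarter times as large as a £100 gain, which is the sense in which people dislike losses more than they like equivalent gains.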
Another important discovery is that, when weighing up how to value future costs and benefits, people often place much more weight on the present and very immediate future than standard economic theory assumes. Last month I wrote about how this helps to explain the reluctance of electorates to deal with climate change.
These two results are backed by large amounts of evidence obtained in a range of different contexts. So now they are being integrated into economic theory.
But many economists are altogether less sure about much of the rest of behavioural economics. One of the issues is that it often gives the impression of being rather ad hoc. No reason is given as to why people in one situation appear to behave rationally, but in another they do not. Very few guidelines have emerged as to when we can expect to see deviations from rationality.
Another issue is that many economists are prepared to accept that non-rational behaviour might be observed at a point in time. But in a reasonably stable situation, people will learn over time to be rational.
Behavioural economics is not just about advancing knowledge on the workings of the economy. Policy-makers have become interested.
Cass Sunstein, Thaler’s colleague, served in the Obama administration as head of the Office of Information and Regulatory Affairs. David Cameron set up the so-called “Nudge Unit” in his government based on Thaler’s ideas. Thaler claimed 10 years ago that a “nudge” could lead to “better investments for everyone, more savings for retirement, less obesity, more charitable giving, a cleaner planet, and an improved educational system”. In his 2015 book Misbehaving, he backed off the extravagance of these claims.
Still, whatever the doubts and qualifications, behavioural economics has made a big impact. An economist can no longer be said to have a good training if he or she is not familiar with its main themes.