The appearance of Liverpool and Spurs in the Champions League final and Arsenal and Chelsea in the Europa League one has generated massive interest.
But the official ticket prices for the games are surprisingly reasonable.
Liverpool and Spurs have been offered 16,613 tickets each. Five per cent of these are expensive, at £513 each. A further 21 per cent are available at £385. But the bulk – 54 per cent – cost only £154, and there are even 20 per cent which can be bought for just £60.
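The arithmetic behind those price bands can be checked in a few lines. The weighted-average price and the implied gate revenue below are back-of-envelope illustrations derived from the figures quoted above, not Uefa numbers.

```python
# Official Champions League final allocation per club, as quoted above.
allocation = 16613
bands = [  # (share of tickets, price in pounds)
    (0.05, 513),
    (0.21, 385),
    (0.54, 154),
    (0.20, 60),
]

# Weighted-average official price across the four bands.
avg_price = sum(share * price for share, price in bands)

# Implied official revenue from one club's allocation (illustrative only).
revenue = allocation * avg_price

print(round(avg_price, 2))  # 201.66
print(round(revenue))       # 3350178
```

At roughly £202 a ticket on average, the official allocation is modest money by the standards of a Champions League final, which is precisely the point of the article.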
These compare favourably with other major cultural events, such as a performance at Covent Garden with top opera stars.
Uefa organises the competitions and sets the prices of the tickets. The demand is of course very much greater than the supply. This will be reflected in the prices charged on the unofficial market in tickets.
Why does Uefa not bag this revenue for itself? Even if it doubled the official prices, the events would still sell out.
The NFL in America follows a similar policy for the Superbowl.
In his book Misbehaving, the behavioural economist and Nobel laureate Richard Thaler quotes a senior NFL executive.
The NFL “takes a long-term strategic view” towards ticket pricing at the Super Bowl, keeping them reasonable despite huge demand in order to foster its “ongoing relationship with fans and business associates”. The point is that both Uefa and the NFL have repeated dealings with clubs and fans. They judge that it would be counterproductive in the longer term to exploit their monopoly of major events.
In contrast, the hotels which the English soccer fans are now desperately seeking to book will probably never see the individuals who stay there again. It’s a one-off transaction, and so they are free to raise their prices so as to maximise their immediate profits.
Air fares are also going through the roof, particularly for the exotic location of Baku where Chelsea and Arsenal will play. There are far fewer travel options than there are to Madrid, where the Champions League game will be held.
Although the algorithms used by airlines will set these sky-high prices, some of these companies will be used repeatedly by quite a number of fans. They therefore do run the risk of creating a bad image which damages their business in the longer term.
The fact that the two finals are an all-English affair is raising concerns in other major European soccer nations. The standard of play in Spain’s La Liga or Italy’s Serie A is certainly comparable to that in the Premier League.
But the Premier League dominates in terms of the monies it receives from television rights – more than twice La Liga, for example.
This means more money for clubs, which can then buy more top players. This phenomenon is observed throughout modern popular culture. Success itself breeds success, and unto him that hath, more shall be given.
It is a totally different world to when the maximum wage for players was fixed at £20 a week, and they wore Brylcreem and smoked Woodbines. But economics is always present.
As published in City AM Wednesday 15th May 2019
If there were a betting market in future winners of the Nobel prize in economics, MIT’s Daron Acemoglu would be at pretty short odds. His highly innovative work has already won him a string of prizes.
So his research is always worth following – especially when he challenges the conventional wisdom, as in his paper in the latest issue of the Journal of Economic Perspectives.
Economists are usually optimistic about the impact of new technology.
An innovation itself destroys jobs at first – the Luddite riots in the early nineteenth century, for example, were a direct response to the displacement of skilled handloom weavers by the new machinery in textile factories.
But this, along with all subsequent waves of innovation, enabled goods and services to be produced more cheaply. As a result, the spending power of everybody else in the economy increased, and new jobs were created.
Mass production in factories during the industrial revolution was of course a phenomenon without precedent in the history of the world. Other completely revolutionary technologies followed, such as the railways and electricity.
The rapid advance of robots and artificial intelligence seems to be the latest example of a transformative new technology.
Acemoglu argues that it is not these “brilliant” (as he puts it) technologies which threaten jobs and wages. These enable things to be produced much more cheaply than before, substantially boosting real incomes elsewhere in the economy. Then new kinds of goods and services can be created as a result of the increase in spending power.
Rather, the risk to overall employment and living standards comes from the introduction of “so-so technologies”, which generate only small productivity improvements.
Examples of so-so technologies include automated customer service, which has displaced human service representatives. It is, however, generally deemed to be low-quality, and thus unlikely to have led to large productivity gains.
The cost of your bank charges or your supermarket shop has hardly been reduced by the introduction of automated answering systems or self-service check-outs. But jobs have been lost as a result.
Acemoglu suggests a key reason why modern economies have, as he puts it in the jargon, “moved along this [particular] innovation possibilities frontier”. In the US and also here in the UK, the tax system has evolved in ways which subsidise the use of equipment and penalise the use of labour through payroll taxes such as our employers’ national insurance contributions.
Interestingly, he also points the finger at the big tech companies. Their business model is based on automation and small workforces.
The impact of innovative technology which destroys particular jobs needs to be counterbalanced by innovation elsewhere, which creates new tasks, new jobs which no one had previously thought of. We have had some, such as software and app development and database design, but nowhere near enough.
Governments need to rethink the tax system as it applies to investment and employment. And they need to rebuild support for long-term innovation, which gives more scope to invent completely new jobs.
As published in City AM Wednesday 8th May 2019
The Extinction Rebellion protesters on the streets of London seemed to consist of two disparate interest groups: pensioners and the young. Their shared connection is that most of them – certainly in the former group – seemed to be affluent.
An identical alliance was observed a few months ago in the rather unlikely setting of the borough of Richmond upon Thames.
The Liberal Democrat council wanted to introduce a 20 mile-an-hour speed limit on every single road in the borough, except for two major trunk roads. But they chose to hold a referendum on the matter before making a decision.
Just as with Brexit, this plan rather backfired on them. The proposal was defeated. In an uncanny replay of the EU vote itself, the margin was narrow at 49 to 51 per cent.
Echoing the national Lib Dem attitude to the Brexit vote, the local councillors announced that they would ignore the result of a vote that they themselves had instigated.
Shamelessly, given the way that they have vilified older voters over Brexit, the Lib Dems cited as a reason for bringing in the speed limits that 60 per cent of elderly respondents were in favour. Their second reason was that a similar percentage of young people also voted for their proposition.
At one level, this is just an amusing anecdote revealing the true nature of the supposedly cuddly Lib Dems. Even by the standard of modern politicians, they are wholly duplicitous. Nick Clegg set the tone in the 2010 General Election, when he promised not to put up tuition fees, and then promptly voted to treble them when he became deputy prime minister.
Yet the episode in Richmond upon Thames does suggest that this strange political alliance between affluent pensioners and young people is both widespread and deep seated. A sizeable proportion of this new grouping appears to believe that all their problems are caused by capitalism. But if it were not for capitalism, very few of them would exist in a way which allowed them to carry out political protests.
In the case of the pensioners, this is quite literally true. Without the prosperity generated by capitalism, most of them would now be dead. After all, life expectancy for men is now 79 years and for women 83, and over the past 100 years, it has increased by nearly three years a decade. So in 1919, life expectancy was in the low to mid-50s. If life expectancy had not increased, then the ranks of Extinction Rebellion would have been noticeably thinner.
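The back-of-envelope calculation in that paragraph is easy to verify. The 2.9-years-per-decade figure below is an assumed reading of “nearly three years a decade”.

```python
# Check of the claim above: current life expectancy, minus roughly
# three years per decade over the past century, lands in the low to
# mid-50s for 1919.
current = {"men": 79, "women": 83}
gain_per_decade = 2.9  # assumed reading of "nearly three years a decade"
decades = 10           # the past 100 years

implied_1919 = {sex: round(years - gain_per_decade * decades, 1)
                for sex, years in current.items()}

print(implied_1919)  # {'men': 50.0, 'women': 54.0}
```

The implied 1919 figures of roughly 50 for men and 54 for women match the “low to mid-50s” claim in the text.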
The Nobel prize-winning economic historian Robert Fogel argues that much of the improvement to life expectancy was due to the increases in calorific intake and better nutrition which economic growth made possible.
As for the young, it was only in 1918 that the school leaving age was raised from 12 to 14. Apart from a really tiny minority of the very privileged, they would all have been at work rather than at “uni”.
Capitalism generates economic growth by encouraging innovation. And it will be innovation that will solve the climate problem, rather than wearing a hair shirt.
As published in City AM Wednesday 1st May 2019
The internet has led to a massive increase in the amount of information available.
Often, this is a good thing. For example, shopping around to find the cheapest price for something has become far easier.
But it can have its downsides. A report last week from the consumer magazine Which? highlighted one such disadvantage. An investigation claimed that the review system on parts of Amazon was being undermined by fake five-star reviews.
The magazine analysed the listings of hundreds of popular tech products in 14 online categories, such as headphones and smartwatches.
Researchers sorted the headphone reviews, for example, by the average scores of the brands. The first page of results – those with the highest scores – consisted almost entirely of little-known brands, with nearly 90 per cent of the reviews from unverified buyers.
In other words, there was no evidence that the reviewer had ever bought the item in the first place.
Companies like Amazon are well aware of these potential problems. They take steps to try to guard against them. A flurry of very good posts for a less well-known brand is one of the classic footprints which enable fake reviews to be identified.
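A much simplified sketch of that kind of “footprint” check is shown below. The threshold, the data, and the function itself are invented for illustration – nothing here describes Amazon’s actual detection methods.

```python
# Toy version of the "flurry of glowing unverified posts" footprint:
# flag a product whose reviews are dominated by five-star ratings
# from unverified buyers. Threshold is illustrative only.

def looks_suspicious(reviews, threshold=0.8):
    """reviews: list of (stars, verified) tuples."""
    if not reviews:
        return False
    unverified_five_star = sum(
        1 for stars, verified in reviews if stars == 5 and not verified
    )
    return unverified_five_star / len(reviews) >= threshold

# A flurry of unverified five-star posts trips the check...
flurry = [(5, False)] * 9 + [(4, True)]
print(looks_suspicious(flurry))   # True

# ...while a mixed, mostly verified history does not.
organic = [(5, True), (4, True), (3, False), (5, True), (2, True)]
print(looks_suspicious(organic))  # False
```

Real systems weigh many more signals – timing, reviewer history, language – but the basic logic of spotting an anomalous concentration is the same.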
But Which? suggested that the volume and variety of fake reviews is now so large that these defences are being overwhelmed.
A similar problem arose almost from the very start of email, when spam first appeared. Ever since then, a complicated evolutionary game has been played between the spammers and the spam filters.
It is a game because spam wins if it gets through, and the filters win if it does not. It is evolutionary because both sides are constantly adjusting their strategies. The filters seem gradually to be getting the better of it, though I am currently being plagued by emails from China offering to sell me plastic moulds.
The fake review – and more generally the fake news – problem has not been an issue for quite as long, but concern over it is growing.
The instinct of many people is to reach for the law, and in particular to regulate. Set up a body, staff it with bureaucrats who of course have the public interest at heart, and the problem will be solved, goes the logic. The European Commission is a strong proponent of this approach.
But there are already some good illustrations of the private sector reducing what economists describe as “reputation systems failures”.
For example, a 2017 paper by Andrey Fradkin and colleagues at the MIT Sloan School of Management analysed experiments by Airbnb.
A particularly successful one appears to be that of the simultaneous review: both the buyer and seller post their reviews, and only then are they allowed to see what was written about them.
Not all consumers give feedback. Many who have a bad experience do not bother to rate the seller or product – they just stop buying from the platform. Platform providers therefore have a strong incentive to verify posts and encourage real reviews, perhaps using monetary payments to reduce selection bias.
Just as we didn’t need to regulate against spam, given time, markets will find solutions to what is currently a pressing problem.
As published in City AM Wednesday 24th April 2019
Image: Online shopping by Maxpixel is licensed under CC0 1.0
The International Monetary Fund (IMF) is up to its usual tricks. Last week, it predicted a two-year recession in the UK in the event of a no-deal Brexit.
Even in the main forecast, involving a mild Brexit, GDP was projected to grow by only 1.2 per cent this year and 1.4 per cent in 2020.
These are very gloomy numbers. If they were correct, it would be the weakest period of growth since the financial crisis itself in 2008 and 2009.
The IMF has form on this matter. Six years ago, in the spring of 2013, mainstream economists were full of doubt that the government’s policy of austerity would work. In January that year, the IMF projected only one per cent growth, which in April it slashed to just 0.6 per cent.
In fact, economic growth accelerated from 1.4 per cent in 2012 to 2.0 per cent in 2013 and 3.0 per cent in 2014. In line with the thinking of Project Fear, in the middle of June 2016 the IMF predicted an immediate recession if the UK voted to leave.
Exactly the opposite happened. The economy continued to grow, and unemployment to decline. To be fair, this time around there does seem to be evidence of a slow-down. The Office for National Statistics (ONS) suggests only modest growth at an annual rate of around one per cent in the last three months of last year.
A recent Deloitte survey of chief financial officers found only 13 per cent of them more optimistic about prospects than they were three months ago. Does the online world tell us anything different?
The ONS is making progress here. The agency is starting to use so-called big data to try to get faster and more accurate fixes on what is happening to economic activity. Online information such as value added tax returns and road traffic is being analysed.
Given that this is the first time it has ventured into this field, the ONS is understandably cautious about its initial estimates. It says, rather cryptically, that the indicators it is using are broadly in line with their long-term averages and paint a mixed picture.
Since 2016, with my UCL colleague Rickard Nyman, I have been monitoring on a daily basis how people in London are feeling.
The conventional measurement of wellbeing is based on responses to surveys. In contrast, the Feel Good Factor (FGF) extracts the sentiments which people reveal, knowingly or unknowingly, in their online posts, using advanced machine learning algorithms.
There were big drops immediately after Brexit and after Donald Trump’s election. But the FGF recovered in a matter of days.
Averaging the data over each quarter, optimism peaked at the start of 2017. By early 2018, a sharp drop took place, but sentiment was still around its 2016-19 average. During early 2019, the FGF is down again, but only slightly, and the past few weeks show no change compared to the same period last year.
Uncertainty over Brexit does seem to be having a negative impact on sentiment in the short term. But the overall trend offers some sunny perspective on the IMF’s dismal economic forecasts.
As published in City AM Wednesday 17th April 2019
Image: British Weather by Wikimedia is licensed under CC BY 2.0
As the Brexit process unfolds, the possibility of a Corbyn government has become much more tangible. Last month, John McDonnell, the shadow chancellor, wrote to the Treasury to say that in power he would require them to “widen the range of economic theories and approaches in which its officials and those in the rest of the government are trained”.
In principle, this would be a good thing. Machine learning algorithms, for example, have been shown beyond doubt to be more powerful than the traditional economists’ tool of econometrics for analysing data. Standard economic theory is not as good as cultural evolution theory at understanding how search engines, reputation systems, and social media affect the decisions we make and the news we read.
Somehow, however, one feels that this is not the retraining which McDonnell has in mind.
The fashionable idea among left-wing economists is something called “Modern Monetary Theory” (MMT). In the US, the rising Democrat star Alexandria Ocasio-Cortez appears to be keen on it, believing that it could finance her Green New Deal as well as an immense raft of social programmes and welfare benefits.
A key part of MMT asserts that governments which control their own currency can finance any level of spending simply by printing more money. Countries in the Eurozone, for example, cannot do this, because the European Central Bank controls how much money can be created – but the UK can.
A sharp increase in public largesse almost always creates an increase in the public sector deficit, which is the difference between spending and the income that the government gets from taxes.
The conventional way of financing the deficit is by issuing bonds. These both create a stream of interest payments to the lenders and, at some point – depending on the date of maturity – have to be repaid.
MMT asserts that printing money instead removes these constraints. Money created by the government never needs to be repaid. For example, £10 notes carry the phrase “I promise to pay the bearer on demand the sum of 10 pounds”. If you take it to the cashier’s desk in the Bank of England, they will do just that – they will give you another £10 note instead.
Also, money carries no interest. In technical terms, we might think of money as a “zero coupon perpetual bond”, although I have never seen MMT theorists refer to it in this way. In one way, MMT is completely true. Countries like the US and the UK can finance government deficits by printing money rather than issuing bonds. Indeed, there are genuine arguments to be had about the appropriate mix of the two.
Where the theory falls down, however, is in not recognising the adverse consequences of creating too much money. The economic history of the world is replete with examples of how this simply creates inflation, from Roman emperors to the latest example, Venezuela.
MMT ought to be renamed the Magic Money Tree. The Bank of England is running a competition for whose face should be on the £50 note. For MMT theorists, the answer is obvious: old magic grandad himself, Jeremy Corbyn.
As published in City AM Wednesday 10th April 2019
Image: Jeremy Corbyn by Chris McAndrew via Wikimedia is licensed under CC BY 3.0
Social media influencer Yovana Mendoza provided an amusing diversion from Brexit last week.
The 20-something vlogger built a very lucrative personal brand around veganism. She amassed over 3m followers on YouTube and Instagram by advocating a raw vegan diet and 25-day water fasts.
All seemed to be going well until a competitor observed and filmed Mendoza eating seafood. The vlogger’s embarrassment was compounded by the fact that she tried to hide the fish.
Unsurprisingly, the ensuing video went viral.
Mendoza performed what has become the ritual apology on online media: “It was the worst day of my life. I felt like someone had died,” she posted. “Someone did”, was one of the less abusive responses, “the fish”.
The final act in the saga was her own video in which she confessed to eating fish “for health reasons”.
The episode illustrates some fundamental features of the online world. It shows how the popularity of products or ideas need not necessarily be connected to the inherent merits of the offer.
There has always been a strong tendency in popular culture for success to breed success. Things become desirable, not necessarily on account of their inherent qualities, but simply because they are already popular.
The internet compounds these tendencies. The inherent attributes of products become outweighed by the effect of social influence on the choices that people make.
In more conventional markets, where social influence is weak, economic theory has a good understanding of how consumers behave. In this type of market, consumers gather information on the attributes of the alternatives, such as price and quality, choosing products based on their individual preferences and affordability.
In recent decades, the theory has been expanded to incorporate situations in which consumers don’t have all the information to hand.
But it is still essentially based on the idea that people compare what a product offers with what they want.
This differs in the online world, because what people want is altered by observing what other people want.
For example, Mendoza sought to convince her 3m followers that raw veganism and extreme water fasts were part of a healthy lifestyle, despite not following her own advice.
There is no suggestion that Mendoza’s work is fake. But the highly emotional content that she regularly published gave it a better chance of being noticed and spread by social influence.
People are learning this fast, and the use of “clickbait” is spreading rapidly.
The largest ever study on fake news was published just over a year ago in the journal Science, and it concluded that fake news and rumours tended to spread much faster and reach more people than accurate stories.
A key reason is that fake news typically carries a much higher level of emotion in its overall content. The question is whether we will all learn to see through this and start behaving rationally again.
But the furore around Brexit suggests that we have some way to go.
As published in City AM Wednesday 3rd April 2019
Image: Clickbait by Pete Unseth via Wikimedia is licensed under CC BY-SA 3.0
Should pure blue sky research be funded?
Certainly, the answer from government-backed research councils seems to be “no”. The emphasis is increasingly on research which has immediate practical applications.
Yet seemingly esoteric research can shed light in quite unexpected areas. For example, a PhD thesis written by a then obscure research student 70 years ago helps us understand the difficulties encountered today in resolving the current Brexit problem with a series of votes.
The number of alternatives suggested as the outcome of the Brexit process has been bewildering.
During the past week alone there have been: Theresa May’s deal; her deal plus a customs union; her deal plus a customs union and the Single Market; a Canada-style free trade agreement; another referendum; revoking Article 50 and cancelling Brexit; and leaving without a deal at all.
Little wonder that MPs have struggled to produce an overall majority in favour of any particular option.
So now we come to the idea of so-called “indicative votes”. MPs are due to vote on each of a large range of options to see which, if any, command a majority.
A variant would be to get MPs – or the electorate as a whole if there were another referendum – to rank the alternatives explicitly in order of preference. When we elect the London mayor, we have to express our preferences rather than just cast one vote, as we do in a General Election.
All of these approaches seem plausible. They share the same basic idea: test the options with a voting system based in some way on preferences among the alternatives, and see which comes out top.
It seems common sense. But unfortunately, as is often the case, common sense is not a very good thing to rely on.
The PhD thesis mentioned above was written by Kenneth Arrow, who went on to win the Nobel Prize in economics. He demonstrated the inherent problems of preference-based voting systems.
Arrow, who died in early 2017 at the age of 95, is virtually unknown to the general public. He spent his life in the sheltered groves of American Ivy League universities. But he made some of the most profound contributions to economic theory in the whole of the second half of the 20th century.
One of these was his so-called Impossibility Theorem. He proved that, whenever voters have three or more alternatives, no system of ranked voting can convert the ranked preferences of individuals into a set of preferences at the aggregate level which is guaranteed to be consistent.
Arrow’s result applies not just to a given practical example, but to all systems of this kind. Paradoxes abound. For example, even if all voters prefer X to Y, it is entirely possible that at the group level the result may not reflect this.
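The best-known illustration is the Condorcet cycle, in which majority rule goes round in circles. A few lines of code make the paradox concrete; the three ballots below are the standard textbook example, not a real vote.

```python
from itertools import combinations

# The classic Condorcet cycle: three voters with ranked preferences.
# Each row is one voter's ranking, best first.
ballots = [
    ["X", "Y", "Z"],
    ["Y", "Z", "X"],
    ["Z", "X", "Y"],
]

def majority_prefers(a, b):
    """True if a majority of ballots rank a above b."""
    wins = sum(1 for ballot in ballots if ballot.index(a) < ballot.index(b))
    return wins > len(ballots) / 2

for a, b in combinations(["X", "Y", "Z"], 2):
    winner, loser = (a, b) if majority_prefers(a, b) else (b, a)
    print(f"a majority prefers {winner} to {loser}")

# X beats Y, Y beats Z, yet Z beats X: the individual rankings are
# perfectly consistent, but "the majority view" is not a ranking at all.
```

Replace X, Y and Z with any three Brexit options and the relevance to indicative votes is immediate.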
When once asked about the practical implications, Arrow himself said: “Most systems are not going to work badly all of the time. All I proved is that all can work badly at times.”
Brexit is an excellent example not just of this, but of the value of high quality, blue sky research.
As published in City AM Wednesday 27th March 2019
Image: Brexit via Pixabay
HMRC’s programme to make tax digital continues to roll out.
Anyone with a small business will know about the imminent deadline of 1 April, when VAT returns become digital.
Quick to seize an opportunity, several companies have developed software to ease the task.
The digitisation of tax raises the wider issue of whether technology will help the Office for National Statistics (ONS) put together faster and more reliable measures of the state of the economy. The VAT returns potentially give the ONS real-time information.
The current methods of constructing the national accounts – the picture of how the economy is doing – remain rooted in the twentieth century. Ron Jarmin, the deputy director of the US Census Bureau, writes in the latest issue of the top-ranked Journal of Economic Perspectives that: “current measurement programs are not keeping pace with the changing economy, and current methods for collecting and disseminating statistical information are not sustainable”.
For example, national accounting bodies such as the ONS and the Bureau of Economic Analysis in America still rely heavily on sample surveys for their information.
Jarmin points out that surveys are encountering increasing problems. Response rates by both households and companies have declined substantially, increasing costs and threatening quality.
The intellectual conservatism of outfits such as the ONS is illustrated by measurements of well-being, or happiness. Hailed as an innovation when David Cameron instructed the ONS to produce this in 2010, it is based purely on old-fashioned survey questionnaires.
Economists in general are traditionally sceptical of survey-based approaches. The respondents, in the jargon of economic theory, simply state their preferences when answering a series of questions.
Economists place much greater weight on preferences which are revealed by the actions which people take. In the 1980s in Britain, survey after survey showed a stated preference for higher taxes and more public spending. Yet in their actions at the ballot box, people kept electing Margaret Thatcher. They revealed a preference for the exact opposite.
The online world is replete with revealed emotions. Indeed, the entirely new language of emojis has evolved to allow people to do this.
Modern machine learning techniques can readily translate the text of tweets and blogs into a scientifically-based measure of wellbeing. And they can do so much faster and more reliably than the survey methods used by the ONS.
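A deliberately crude sketch of the idea is given below. The real techniques use trained machine learning models over large volumes of text; this toy version merely scores words against a tiny valence lexicon invented for illustration.

```python
# Toy sentiment scorer: counts words against a small, invented
# valence lexicon. Real systems use trained models, not word lists.

LEXICON = {  # word -> valence score (assumed values, for illustration)
    "happy": 1, "great": 1, "optimistic": 1, "love": 1,
    "sad": -1, "worried": -1, "awful": -1, "fear": -1,
}

def sentiment(text):
    """Average valence of the recognised words; 0.0 if none match."""
    words = text.lower().split()
    scores = [LEXICON[w] for w in words if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(sentiment("great day feeling happy and optimistic"))  # 1.0
print(sentiment("worried and sad about the awful news"))    # -1.0
```

Averaged over millions of posts a day, even simple scores of this kind become a usable index – and, unlike a survey, they arrive in real time.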
Jarmin urges governmental statistical agencies to rely much more on digital information in general. He argues that material “generated from transactions, online interactions, sensors, the internet of things, and many other sources can be used to capture various aspects of economic activity”.
He notes the massive increase recently in the number of economists working for tech companies in the US. Here, innovative methods of data collection and analysis are the norm.
Statistical agencies such as the ONS need to show the same energy and move much more rapidly into the twenty-first century.
As published in City AM Wednesday 20th March 2019
Knife crime continues to dominate the headlines.
What can be done about it? Economics does not pretend to provide all the answers. But, perhaps surprising to some, it has a lot of useful insights to offer on crime.
Gary Becker was a professor of both economics and sociology at Chicago. One day he was pushed for time, and weighed up whether to park in an inconvenient garage, or on the street right next to where he was going, risking a fine.
This sparked his interest more generally into the costs and benefits of crime to the criminal. The result was a paper in the Journal of Political Economy in 1968 which eventually won him the Nobel Prize in 1992.
Becker’s theoretical work has generated a large number of scientific empirical investigations into crime by economists. Would capital punishment, for example, be a sufficient deterrent to reduce the number of knife murders?
If execution reduced the number of murders, it would be morally wrong not to implement it. Sparing the life of the criminal would come at the cost of future innocent victims of other murderers.
Economists have carried out many statistically-based investigations into this topic, mainly using data in the US. From a scientific perspective, it is an excellent source, as there is a lot of natural variability in the data. Some states have capital punishment, others do not, and the number of executions differs a lot across the former.
On balance, although there is some evidence of a deterrent effect, it is not sufficiently strong to be conclusive. This is particularly the case given the rise in recent decades of gang culture.
Young gang members face a non-trivial probability of death in any given year from both other gangs and members of their own. So capital punishment would just be a marginal increase in this probability.
Steve Levitt, also of the University of Chicago, shot to fame in 2005 with the best-selling book Freakonomics which he co-authored. Given that a great deal of crime is committed by relatively unskilled men from single-parent families, Levitt showed that increases in abortion rates reduced the “supply” of such people, and so cut the crime rate.
More pertinently in the current context of knife crime, Levitt describes the evidence on crime and police numbers as being “persuasive”. In urban environments, hiring an extra police officer generates benefits in reduced crime which exceed the cost of employing them.
The “cost” to the criminal is the potential punishment, such as a prison sentence. But this consists of two elements: the expected length of sentence if convicted, and the probability of being caught in the first place. If the latter is low, even much stiffer sentences will have only a small deterrence effect.
The evidence suggests that an increase in the probability of being arrested does reduce crime. Longer sentences can also work, providing that there is sufficient chance of being caught.
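The deterrence logic above is just expected-value arithmetic. The probabilities and sentence lengths below are purely illustrative, not estimates from the literature.

```python
# Expected punishment = probability of being caught x length of sentence.

def expected_sentence(p_caught, sentence_years):
    """Expected punishment, in years, faced by a would-be offender."""
    return p_caught * sentence_years

# Doubling the sentence when detection is rare barely moves the
# expected cost...
print(expected_sentence(0.05, 5))   # 0.25
print(expected_sentence(0.05, 10))  # 0.5

# ...while raising the chance of being caught from 5 to 25 per cent
# quintuples it at the original sentence length.
print(expected_sentence(0.25, 5))   # 1.25
```

This is why the economic evidence puts so much weight on the probability of arrest rather than on sentence length alone.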
The evidence from economics offers little comfort to the Prime Minister in her claim that police numbers play no role in the current spate of knife crimes in the UK.
As published in City AM Wednesday 13th March 2019