Comparison sites are forcing businesses and economists to rethink price theories

The Competition and Markets Authority (CMA) published a report on price comparison sites at the end of last month. These sites seem simple enough, but they raise interesting issues for economics.

Overall, the CMA was pretty positive about the DCTs – digital comparison tools, to give them their Sunday best name. The conclusion was that “they make it easier for people to shop around, and improve competition – which is a spur to lower prices, higher quality, innovation and efficiency”.

DCTs offer two main benefits. First, they save time and effort for people by making searches and comparisons easier. Second, they make suppliers compete harder to provide lower prices and better choices to consumers. In short, they bring the real world closer to the perfectly informed consumers and perfectly competing firms in the ideal world of economic theory.

But even in this market, there is an issue which goes right to the heart of much of the information which can be accessed through the internet: how do we know whether we can trust it?

The main problem is that the comparison sites typically provide their services free of charge to consumers. They make money by charging a commission to suppliers.

This creates an incentive for a DCT to promote those suppliers which pay it the most commission. An effective way of doing this on the internet is by the order in which the information on the various suppliers is presented.

It is not that DCTs deliberately provide misleading information, or even that a site leaves off a major supplier which does not pay the particular website enough. But they can put those that pay the most at the top of the list.
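
To make the concern concrete, here is a minimal sketch (in Python, with invented suppliers and figures) of the two ranking rules at stake: ordering quotes by price, which serves the consumer, versus ordering them by commission, which serves the site.

```python
# Hypothetical illustration only: three invented supplier quotes.
quotes = [
    {"supplier": "A", "price": 310.0, "commission": 40.0},
    {"supplier": "B", "price": 300.0, "commission": 25.0},
    {"supplier": "C", "price": 305.0, "commission": 55.0},
]

# Ranking that serves the consumer: cheapest deal first.
by_price = sorted(quotes, key=lambda q: q["price"])

# Ranking that serves the site: highest commission first.
by_commission = sorted(quotes, key=lambda q: -q["commission"])

print([q["supplier"] for q in by_price])       # ['B', 'C', 'A']
print([q["supplier"] for q in by_commission])  # ['C', 'A', 'B']
```

The information displayed is identical in both cases; only the order changes. Combined with the tendency to click only on the top few results, that ordering is exactly where the commercial incentive bites.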

Notoriously, with Google searches people rarely click through to anything which is not in the top three results.

Allegedly, 60 per cent of the time only the site which comes at the very top of the list is accessed.

Obviously, on a DCT consumers are likely to look at more than just the first result. That is the whole point of using the site. But although the CMA does not provide hard data on this, it expresses a clear concern about the ways in which the sites rank the suppliers.

How the DCTs themselves set their prices raises a more general question for economics. The basic rule, described in the textbooks since time immemorial, is to set price equal to marginal cost – in other words, at the point where the revenue from one extra sale equals the cost of producing that extra item.

The standard assumption, taught to generations of students in their introductory economics courses, is that as the level of output increases, marginal cost at first falls but eventually rises.

But on the internet, once the site is set up, the cost of dealing with an extra user is effectively zero. The time-hallowed formula of economics is a recipe for bankruptcy.
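
A back-of-the-envelope sketch (all numbers invented) shows why. With a fixed cost of building and running the site, and a marginal cost per user of zero, the textbook rule of pricing at marginal cost yields zero revenue and a loss equal to the entire fixed cost:

```python
# Hypothetical figures for a comparison site.
fixed_cost = 500_000.0   # annual cost of building and running the site
marginal_cost = 0.0      # serving one extra user costs effectively nothing
users = 10_000_000

# Textbook rule: set price equal to marginal cost.
price = marginal_cost

revenue = price * users
profit = revenue - fixed_cost - marginal_cost * users

print(f"revenue = {revenue:,.0f}, profit = {profit:,.0f}")
# revenue = 0, profit = -500,000 -- the recipe for bankruptcy
```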

The internet is forcing companies to innovate in their pricing strategies. And it is forcing economists to rethink some of their theories.

As published in City AM Wednesday 18th October 2017

Image: Man and Laptop by Pexels is licensed under CC0

Beware the dysfunctional consequences of imposing misguided incentive systems

Following the disclosure of salaries at the BBC, it has hardly seemed possible to open a newspaper or switch on the television without being bombarded by stories about pay.

By pure coincidence, an academic paper entitled “Pay for Performance and Beyond” has just appeared. So what, you might ask? Except that it is one of the 2016 Nobel Prize lectures, by Bengt Holmstrom, a professor at MIT.

Holmstrom’s work began in the 1970s on the so-called principal-agent problem. This is of great practical importance. For example, how should the owners of companies (the “principals”, in economic jargon) design contracts so that the interests of the directors (the “agents”) are aligned as closely as possible with the interests of the shareholders?

Many aspects of economics have a lot of influence on policy making. But this is not yet one of them. We have only to think of the behaviour of many bankers in the run-up to the financial crisis. Stupendous bonuses were paid out to employees, and, in examples such as Lehman Brothers, the owners lost almost everything.

It is not just at the top levels that scandals occur. Towards the end of last year, Wells Fargo had to pay $185m in penalties. Holmstrom cites this prominently in his lecture. The performance of branch managers was monitored daily, and managers discovered that one way of doing well was to open shell accounts for existing customers. These were accounts which the customers themselves did not know about, but they counted towards the managers’ bonuses.

A culture of pressure to perform against measured criteria can lead to problems even when the organisations involved are not strongly driven by money.

The education system in the UK has many examples. But the one given by Holmstrom is even more dramatic. The No Child Left Behind Act of 2001 in the US was very well intentioned. But the test-based incentives eventually led, around a decade later, to teachers in Atlanta being convicted of racketeering and serving jail sentences for fixing exam results.

Holmstrom is in many ways a very conventional economist – his Nobel lecture rapidly becomes full of dense mathematics. He believes that, given the right information and incentives, people will make rational decisions.

This is why his conclusion is so startling.

He writes: “one of the main lessons from working on incentive problems for 25 years is that, within firms, high-powered financial incentives can be very dysfunctional and attempts to bring the market inside the firm are generally misguided”.

The whole trend in recent years has been to bring even more market-type systems inside companies, from bonuses for meeting potentially counter-productive targets, to devolving budget authority away from the discretion of managers and handing it to specialised departments.

Holmstrom’s conclusion implies the need for a pretty radical rethink of the way incentives are structured, in both the public and private sectors.

As published in City AM Wednesday 26th July 2017

Image: Lehman Brothers Headquarters by Sachab is licensed under CC BY 2.0

Cautious corporates sitting on hoards of cash are to blame for our slow recovery

The slow recovery since the financial crisis remains a dominant issue in both political and economic debate.

The economy has definitely revived since 2009, the trough of the recession, in both Britain and America. The average annual growth in real GDP has been very similar in the two countries, at 2.0 and 2.1 per cent respectively. This is much better than in the Mediterranean economies, where cumulative growth over the 2009-2016 period is still negative. Even so, the Anglo-Saxon countries have not expanded as rapidly as they did in previous recoveries.

A key reason for this is the lack of vision being shown by the corporate sector. True, highly innovative companies like Facebook have emerged over the past decade, and start ups continue to proliferate.

But the longer-standing major firms in both the UK and the US have become real sticks-in-the-mud. Caution, safety-first thinking, and an increasingly stultifying bureaucracy envelop them.

The contrast in the behaviour of the corporate sector in the two major financial crises of the 1930s and late 2000s makes this clear. The US national accounts only have data going back to 1929, the year before the Great Depression took hold. But in that year, the net savings of non-financial companies were 3.5 per cent of GDP.

When the recession struck, firms ran down their accumulated cash. Between 1930 and 1934, their net savings were negative, averaging -2.4 per cent of GDP. That amounts to a shift during the recession from a surplus of $650 billion in 1929 to an annual overspend of $450 billion in today’s prices.
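
As a rough check on those figures, the conversion is straightforward (the GDP number below is an assumption for illustration, roughly the 2017 level of US GDP):

```python
# Rough conversion of net-savings shares of GDP into today's dollars.
# The GDP figure is an assumption for illustration (~2017 US GDP).
gdp_today = 19_000_000_000_000  # ~$19 trillion

surplus_share_1929 = 0.035    # net savings were 3.5% of GDP in 1929
deficit_share_1930s = 0.024   # average overspend of 2.4% of GDP, 1930-34

print(f"1929 surplus: ${surplus_share_1929 * gdp_today / 1e9:,.0f}bn")
print(f"1930-34 overspend: ${deficit_share_1930s * gdp_today / 1e9:,.0f}bn")
# 1929 surplus: $665bn; 1930-34 overspend: $456bn -- close to the
# $650bn and $450bn quoted above.
```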

In the United States, during the decade prior to the crash, 1998-2007, companies on average had net savings of 2.6 per cent of GDP each year. Since 2009, this has averaged 4.0 per cent. So instead of spending their assets, as they did in the 1930s, companies this time round have simply saved more.

To be fair, American firms are gradually moving back towards their savings patterns prior to the crisis. From 5.4 per cent of GDP in 2010, net savings in 2016 were back down to 3.1 per cent. They are gradually getting their confidence back – their “animal spirits”, as Keynes called them.

There are signs of this happening in Britain as well. Between 1998 and 2007, net savings by non-financial companies averaged 1.3 per cent of GDP. From the trough of the recession to now, the annual average has been 2.7 per cent. As in the US, the figure has come down from its 2009-2011 average of 3.8 per cent. But firms remain cautious.

In both the UK and the US, then, companies are sitting on piles of cash and lack the entrepreneurial spirit to spend it. Boards obsess over fashionable concepts such as lean and agile processes and management. At the same time, they set up procurement systems whose box-ticking mentality is more suited to the old Soviet Union.

Capitalism must be seen to be delivering the goods, and many of our major companies are simply not doing this.

As published in City AM Wednesday 12th July 2017

Image: London Construction by Bonny Jodwin is licensed under CC BY 2.0

Corbyn and McDonnell’s delusional tax plan would cut revenue and harm growth

The income tax system in the UK is highly progressive.

Not many people know that, to use a catchphrase attributed, rightly or wrongly, to the great actor Michael Caine.

The top one per cent of earners contribute 27 per cent of all income tax receipts. To put it in context, just 300,000 people pay nearly three times as much in total as the bottom 15m taxpayers. Despite all the political rhetoric about tax avoidance, high earners cough up a very large amount of money to the Exchequer every year.

Under the Labour government of the 1970s, the highest marginal tax rate was no less than 98 per cent. But the top one per cent of earners paid only 11 per cent of all income tax.

Jeremy Corbyn and shadow chancellor John McDonnell pledged in their manifesto to raise around another £15bn a year in tax from this group. In addition, corporation tax on profits would allegedly raise a further £19bn.

The realism of Labour’s costings as a whole was called into serious question at the time by people such as Paul Johnson at the Institute for Fiscal Studies.

A paper published in the latest American Economic Review produces strong evidence that it is purely wishful thinking to imagine that anything like these amounts could be raised. In the modern world, both skilled labour and capital are highly mobile. There would simply be movement out of the UK altogether.

The authors, Enrico Moretti and Daniel Wilson of Berkeley and the San Francisco Federal Reserve Bank, carry out a very detailed statistical analysis of the impact of different state income tax rates in the US on where highly skilled people choose to work.

Personal taxes vary enormously across the American states. In California, for example, the average tax rate on top earners due solely to state rather than federal taxation is eight per cent. In contrast, in Texas (and eight other states) it is zero. Over the period of the study – 1977 to 2010 – rates have also varied substantially within individual states.

Moretti and Wilson compile an impressively detailed set of data on individuals they describe as ‘star scientists’, defined as those scientists who are very prolific in generating patents. They examine the location decisions of some 260,000 individuals during the period they analyse.

Their conclusion is unequivocal: “we uncover large, stable, and precisely estimated effects of personal and corporate taxes on star scientists’ migration patterns”. Essentially, steep taxes drove away high-achievers.

Tax rates are important not just to individuals in choosing where they want to work. The different corporate tax rates levied by individual states affect where companies such as Microsoft and General Electric locate their most productive and innovative researchers.

There are of course many factors which determine where people and firms decide to locate. But the idea that innovative people will simply sit around en masse and wait to be fleeced is pure fantasy. There may be little chance of the current Labour leadership understanding the real world, but the electorate needs to.

As published in City AM Wednesday 5th July 2017

Image: Jeremy Corbyn and John McDonnell by Rwendland is licensed under CC BY-SA 4.0

How to stop tech hubs in urban hotspots from intensifying geographic inequalities

Perhaps George Osborne’s most abiding legacy from his time as chancellor will be the creation of the concept of the Northern Powerhouse. Certainly Manchester, its principal focus, is booming.

The landscape of the centre is being altered dramatically by skyscrapers. Peel Holdings, the huge investment and property outfit, is planning to double the size of the development around Media City in the old docks, where the BBC was relocated. The airport, already the third busiest in the UK, is expanding.

All in all, it seems a triumph for modern capitalism. After decades of relative decline, a city is being transformed by private enterprise. But what is really going on?

In a piece this month in the MIT publication Technology Review, urban guru Richard Florida has picked up on a startling new trend in the location of new technology companies in the US.

In the 1980s, there were essentially no high tech companies in city locations. Instead, we had Intel and Apple in Silicon Valley, Microsoft in the Seattle suburbs, the Route 128 beltway outside Boston, and the corporate campuses of North Carolina’s Research Triangle.

Now, urban centres are rapidly becoming the places which attract technology companies. In 2016, the San Francisco metro area was top of the list for venture capital investment, attracting more than three times the amount of the iconic location of Silicon Valley. Google has taken over the old Port Authority building in Manhattan. Amazon’s headquarters are in downtown Seattle.

The impact of this new, high concentration of tech firms is to intensify geographic inequalities. As Florida puts it: “tech startups helped turn a handful of metro areas into megastars. Now, they’re tearing those cities apart.”

A relatively small number of urban areas in America, and within them a small number of neighbourhoods, are capturing all the benefits.

The same sort of thing seems to be going on in Greater Manchester. A few areas are soaring away and attracting wealth and talent. In 1981, fewer than 600 people lived in what the Council describes as “the heart of Manchester”. Now, over 50,000 do, almost all of them young graduates.

But the more traditional outlying boroughs of the city region, especially to the north and east, are struggling to capture any trickle down from this massive transformation. Indeed, they are at risk of losing out, as their young bright sparks are attracted by the life of the inner metropolis.

Richard Florida does not just identify the problem; he suggests some possible solutions. One of these is a programme of building lots of good housing in the outlying areas, supplemented by a top-class public transport service. This would keep house prices down, and attract some of the people stuck in rabbit warrens in the urban centres.

Manchester already has a modern tram service. But the new Labour mayor, Andy Burnham, is resolutely opposed to building on the green belt just to the north and east of the city. Yet another example of the sanctimonious intentions of the Left serving to intensify, not reduce, inequality.

As published in City AM Thursday 29th June 2017

Image: Media City UK by Magnus D is licensed under CC BY 2.0

Does the productivity gap actually exist?

Whoever wins the election tomorrow will have to grapple with what appears to be a fundamental economic problem. Estimated productivity growth in the UK is virtually at a standstill.

The standard definition of productivity is the average output per employee across the economy as a whole, after adjusting output for inflation – or “real” output, in the jargon of economics.

Output per employee in 2016 was the same as it was almost a decade earlier, in 2007, immediately prior to the financial crisis.

Productivity is not just some abstract concept from economic theory. It has huge practical implications. Ultimately, it determines living standards.

Productivity is real output divided by employment. The Office for National Statistics (ONS) has a pretty accurate idea of how many people are employed in the economy, drawing on data from company tax returns to HMRC.

What about output? The ONS uses a wide range of sources to compile its estimates. But these essentially provide it with information about the total value of what the UK is producing.

The ONS has the key task of breaking this number down into increases in value which are simply due to inflation, and those which represent a rise in real output.

This problem, easy to state, is fiendishly difficult to solve in practice. To take a simple illustrative example, imagine a car firm makes exactly 10,000 vehicles of a particular kind in each of two successive years, and sells them at an identical price. It seems that real output is the same in both years.

But suppose that in the second year, the car is equipped with heated seats. The sale price has not changed. But buyers are getting a better quality model, and some would pay a bit extra for the seats. So the effective price, taking into account all the features, has fallen slightly.
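
A stylised calculation (all numbers invented) shows how a statistician might translate that quality improvement into measured real output:

```python
# Stylised quality adjustment: identical nominal sales in both years,
# but year-two cars come with heated seats that buyers value at,
# say, 1% of the sticker price. All figures are hypothetical.
units = 10_000
sticker_price = 20_000.0
nominal_output = units * sticker_price  # the same in both years

quality_premium = 0.01  # assumed value buyers place on heated seats
effective_price_year2 = sticker_price / (1 + quality_premium)

real_output_year1 = nominal_output / sticker_price          # 10,000 units
real_output_year2 = nominal_output / effective_price_year2  # ~10,100 units

growth = real_output_year2 / real_output_year1 - 1
print(f"measured real output growth: {growth:.1%}")  # 1.0%
```

Identical factories, identical sales revenue, yet measured real output rises by one per cent. Get the quality premium wrong, and measured productivity is wrong with it.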

Assessing the impact of quality changes is the bane of national accounts statisticians’ lives. The car example above is very simple. But how do you assess the quality change when, for example, smartphones were introduced?

The ONS and its equivalents elsewhere, such as the Bureau of Economic Analysis in America, are very much aware of this problem. But even by the early 2000s, leading econometricians such as MIT’s Jerry Hausman were arguing that the internet alone was leading inflation to be overestimated by about 1 per cent a year, and real output growth correspondingly underestimated.
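
A bias of that size matters enormously once it compounds. A rough calculation (assuming, for illustration, that the full 1 per cent applies in each year since 2007):

```python
# If inflation is overstated by ~1% a year, real output growth is
# understated by roughly the same amount. Compounded over 2007-2016:
bias_per_year = 0.01
years = 9  # 2007 to 2016

cumulative = (1 + bias_per_year) ** years - 1
print(f"cumulative understatement of real output: {cumulative:.1%}")
# ~9.4% -- enough to turn a decade of apparently flat productivity
# into respectable growth of around 1% a year.
```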

Martin Feldstein is the latest top economist adding his name to this view. Feldstein is a former chairman of the President’s Council of Economic Advisers, so he is no ivory tower boffin.

In the latest Journal of Economic Perspectives, Feldstein writes:

“I have concluded that the official data understate the changes of real output and productivity. The measurement problem has become increasingly difficult with the rising share of services that has grown from about 50 per cent of private sector GDP in 1950 to about 70 per cent of private GDP now”.

The Bean report into national accounts statistics last year acknowledged these problems. It could well be that there is no productivity standstill at all, merely a failure to measure real output properly.

As published in City AM Wednesday 7th June 2017

Image: Smartphone by JÉSHOOTS is licensed under CC BY 2.0