Our tech advances are difficult for productivity stats to compute

One of the most depressing aspects of the decade of the 2010s, well before Covid-19 struck, was the apparently very slow growth in productivity.

This is not a mere ivory tower issue. It is only through rising productivity that sustained increases in living standards are possible. Productivity is the key measure of the efficiency of the economy.

On an everyday level, the previous decade seems to have witnessed a surge in innovative ways of doing things. Companies like Amazon and Netflix make life easier, more enjoyable. Computing power has made dramatic advances. In the past few years, there have been major new developments in the science of artificial intelligence.

But hardly any of this seems to be reflected in the official statistics. Between 2010 and 2019, these show that productivity in America grew by only 0.6 per cent a year. In the UK, growth was even lower, at just 0.3 per cent a year.

An important paper by Stanford’s Erik Brynjolfsson, in the latest issue of the American Economic Journal, goes a long way to resolving this paradox.

The analysis is based on the concept of what is known in economics as a general purpose technology (GPT).

GPTs are technologies which have a large and pervasive impact on both the economy and society. The steam engine was the first, during the first Industrial Revolution in the late 18th and the first half of the 19th century. Electricity was another, around a hundred years later.

Such technologies are far more efficient than the competitors which they replace. In 1830, for example, the crack London to Edinburgh stagecoach took 39 hours. By the middle of the century, it had been driven out of business by the railways.

They revolutionise many aspects of life. Steam power enabled factories to be built. These in turn led to huge shifts in population from rural to urban areas.

Computers have had a major impact since around 1980. But as economics Nobel Laureate Bob Solow remarked, “one can see the computer age everywhere but in the productivity statistics”.

Brynjolfsson and colleagues argue that GPTs need a lot of complementary investment in order to realise their full impact.

Computers, for example, require firms to develop new business processes, develop the experience of management, retrain workers and the like.

The key point is that many of these investments are intangible and do not appear on the balance sheet. They are particularly difficult for national accounts statisticians to deal with when they estimate the size of an economy.

The authors estimate that productivity levels in the US were 15.7 per cent higher in 2017 than the official numbers suggest. This means that the size of the American economy has potentially been underestimated by some $3 trillion.

A similar exercise has yet to be done for the UK. But we can reasonably expect it would boost the numbers by between £200 billion and £300 billion.
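The arithmetic behind these figures is straightforward if the productivity gap is treated as proportional to output. The GDP figures below are rough assumptions of mine (US GDP was about $19.5 trillion in 2017; UK GDP is about £2.2 trillion), not numbers from the paper:

```python
def understatement(official_gdp, productivity_gap):
    """How much output is understated if the true level exceeds
    the official one by a proportional productivity gap."""
    return official_gdp * productivity_gap

# Assumed, illustrative GDP figures:
us_gdp_2017 = 19.5e12  # roughly $19.5 trillion
uk_gdp = 2.2e12        # roughly £2.2 trillion

print(understatement(us_gdp_2017, 0.157) / 1e12)  # about 3.1 (trillion dollars)
print(understatement(uk_gdp, 0.157) / 1e9)        # about 345 (billion pounds)
```

Applied to UK output, the full 15.7 per cent gap would come out above £300 billion; a £200–300 billion range corresponds to a somewhat smaller proportional gap, of roughly 9 to 14 per cent.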

So we are, in another piece of good news for the New Year, much better off as a nation than the Office for National Statistics says we are.

As published in City AM Wednesday 13th January 2021
Image: Numbers via Pixabay

The government should have been working on multiple tracing apps all along

The NHS contact tracing app has been scrapped in favour of a system developed by Google and Apple.

Although health secretary Matt Hancock has been heavily criticised for this failure, the UK is by no means alone.

For example, Denmark, Germany and Italy each tried to build their own app, based on the same type of centralised system as was attempted in the UK. But they have already ditched their efforts and taken up the decentralised approach of Apple and Google.

Australia is widely perceived as having had a “good” Covid-19 crisis. But the same cannot be said of its tracing app, which seems to have had serious problems working on iPhones at all. The Aussies, too, are now taking the Google/Apple approach.

The simple fact is that most technological innovations fail.

The government can be criticised legitimately for not appreciating this fundamental feature of new technology. But it is a more subtle critique than merely pointing to the failure itself.

Given the importance of the tracing app, it would have been perfectly reasonable for the government to have pursued parallel tracks. At the same time as trying to develop its own NHSX app, it could have been collaborating with Apple and Google too.

Critics might have tried to pan this as an example of waste. But there is rarely such a thing as wasteful competition.

Spending on two completely different approaches at the same time would have been a hedge against the uncertainties which are inherent in the development of new technology.  No matter how smart you are, or how much prior information you gather, you just do not know whether an innovation really will work.
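The logic of such a hedge can be made concrete. If each of several independent development tracks succeeds with probability p, the chance that at least one works is 1 − (1 − p)^n. The probabilities below are purely illustrative, not estimates for the NHSX project:

```python
def p_at_least_one_success(p, tracks):
    """Chance that at least one of `tracks` independent projects
    succeeds, when each succeeds with probability p."""
    return 1 - (1 - p) ** tracks

# Even for a long-shot technology, a second parallel track
# raises the odds of ending up with something that works:
for p in (0.2, 0.5):
    print(p, p_at_least_one_success(p, 1), p_at_least_one_success(p, 2))
```

With p = 0.5, for instance, a second track lifts the chance of finishing with a working app from 50 to 75 per cent.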

The tech companies themselves protect against this uncertainty by holding far more cash than conventional economic theory regards as rational. At the start of the Covid crisis, Apple, Microsoft and Google’s parent company Alphabet between them held over $450bn in cash or marketable securities.

Pharmaceutical companies face a similar challenge. Most new drugs fail. They fail while they are still in the lab, and they fail once they go out for testing to get regulatory approval.

In America, for example, there are three phases to the test process, each more demanding than the last.

The time scales are long. Andrew Lo, an MIT polymath, and his colleagues published a paper last year in the journal Biostatistics. They gathered a sample of over 400,000 clinical trials carried out between 2000 and 2015. Even after all the initial development work in the lab was completed, the typical successful drug took 8.3 years to obtain approval.

This puts into perspective the current frantic efforts to develop treatments and vaccines for Covid-19.

The probability of obtaining regulatory approval varies widely across categories. But overall, when a candidate drug enters phase one trials, its chances of eventual success are less than 10 per cent.
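That overall figure is simply the product of the pass rates at each successive hurdle. The per-phase rates below are hypothetical round numbers chosen for illustration, not estimates from Lo's paper:

```python
def overall_success(phase_pass_rates):
    """Probability that a candidate drug clears every phase,
    assuming the phases are passed independently."""
    result = 1.0
    for rate in phase_pass_rates:
        result *= rate
    return result

# Hypothetical pass rates for phases one, two and three:
print(overall_success([0.6, 0.3, 0.5]))  # roughly 0.09, i.e. under 10 per cent
```

Even generous-looking odds at each individual phase compound into a small chance of eventual approval.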

The government should embrace the idea that money spent on technology or drugs which fail is not money wasted. Indeed, the real mistake is not to risk enough, to stake everything on a single project.

This is the true failure of NHSX.

As published in City AM Wednesday 24th June 2020
Image: Covid tracing app by Gerd Altmann via Pixabay

Let the Iowa fiasco serve as a warning — new technology isn’t always the answer Opinion

Last week, the entire world witnessed the shambles of the vote counting in the Iowa Democratic caucus.

It should have been straightforward — but adding all the votes up in a consistent way took a whole week.

The list of errors is as long as your arm. In some precincts, for example, the total number of votes reported exceeded the number of eligible voters. But the main source of the problem seems to have been a mobile phone app that the Iowa Democratic Party used to collect results from caucus sites.

A system was in place which had stood the test of time through many election campaigns. But it was old-fashioned. It needed to be “modernised”. Hence the app.

Many layers of modern management share this obsession with technology for technology’s sake.

Financial institutions, for example, are fixated on apps: apps to manage day-to-day expenses, apps to help manage investments, apps to boost savings for your old age.

No doubt some of these have their uses. But the idea that apps can make major changes to the behaviour of individuals, and therefore drive productivity and prosperity, is something of a pipe dream.

As a further example of technology’s counter-productive impact, consider an elderly relative of mine who is in a care home. On entry, you used to sign the visitors book and enter the time of arrival; on exit, you put the time you were leaving — with a pen. A new computer system has been installed. It takes several times longer to enter these details. As far as I can judge, virtually no visitors use it.

Nobel laureate Bob Solow famously pronounced 30 years ago that “you can see the computer age everywhere but in the productivity statistics”.

In the 1980s, the decade to which Solow was basically referring, personal computers and fax machines were the cutting edge of new technology. That era seems like the Stone Age compared to the technology available to us now. Yet Solow’s problem remains.

Productivity growth was very low during the most recent decade, despite the massive advances made in technology. There are clearly many reasons for this. But one of them is, quite simply, that technology is often being introduced in situations where it is quite unnecessary. As a result, people become less rather than more productive.

More generally, new technology is proliferating in areas where the potential productivity gains are not that high. For example, it is convenient when buying a round of drinks to be able to tap your card rather than delve into your pockets for loose change, but it is unlikely that this innovation enables more drinks to be sold in any given pub or bar.

Similarly, self-service checkouts in supermarkets help avoid standing in long queues, but these rely on customers being willing to supply their own labour for free, rather than requiring paid staff to scan the goods for them.

The Iowa app incident is a source of amusement, but it may be telling us something more profound about why productivity growth remains low. New technology is being applied when it is simply not needed.

As published in City AM Wednesday 12th February 2020
Image: Iowa State Line via Flickr by Tony Webster licensed for use CC BY 2.0

It’s not cutting-edge AI we should fear, but mediocre automation

If there were a betting market in future winners of the Nobel prize in economics, MIT’s Daron Acemoglu would be at pretty short odds. His highly innovative work has already won him a string of prizes. So his research is always worth following – especially when he challenges the conventional wisdom, as in his paper in the latest issue of the Journal of Economic Perspectives.

Economists are usually optimistic about the impact of new technology. The innovation itself destroys jobs – the Luddite riots in the early nineteenth century, for example, were in direct response to the displacement of skilled handloom weavers by the new machinery in textile factories. But this, along with all subsequent waves of innovation, enabled goods and services to be produced more cheaply. As a result, the spending power of everybody else in the economy increased, and new jobs were created.

Mass production in factories during the industrial revolution was of course a phenomenon without precedent in the history of the world. Other completely revolutionary technologies followed, such as the railways and electricity. The rapid advance of robots and artificial intelligence seems to be the latest example of a transformative new technology.

Acemoglu argues that it is not these “brilliant” (as he puts it) technologies which threaten jobs and wages. These enable things to be produced much more cheaply than before, substantially boosting real incomes elsewhere in the economy. Then new kinds of goods and services can be created as a result of the increase in spending power.

Rather, the risk to overall employment and living standards comes from the introduction of “so-so technologies”, which generate only small productivity improvements. An example is automated customer service, which has displaced human service representatives. It is, however, generally deemed to be low-quality, and thus unlikely to have led to large productivity gains.

The cost of your bank charges or your supermarket shop has not exactly been reduced much by the introduction of automated answering systems or self-service check-outs. But jobs have been lost as a result.

Acemoglu suggests a key reason why modern economies have, as he puts it in the jargon, “moved along this [particular] innovation possibilities frontier”. In the US, and also here in the UK, the tax system has evolved in ways which subsidise the use of equipment and penalise the use of labour through payroll taxes such as our employers’ national insurance contributions.

Interestingly, he also points the finger at the big tech companies. Their business model is based on automation and small workforces.

The impact of innovative technology which destroys particular jobs needs to be counterbalanced by innovation elsewhere, which creates new tasks – new jobs which no one had previously thought of. We have had some, such as software and app development and database design, but nowhere near enough.

Governments need to rethink the tax system as it applies to investment and employment. And they need to rebuild support for long-term innovation, which gives more scope to invent completely new jobs.
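Acemoglu’s distinction can be put into a toy calculation. Suppose automation displaces a given share of the workforce, while the cost saving it delivers raises demand – and hence creates jobs – in proportion. The function and numbers below are my own illustrative simplification, not a model from his paper:

```python
def net_job_effect(displaced_share, cost_reduction, demand_elasticity=1.0):
    """Toy model: jobs created by extra demand for cheaper output,
    minus jobs displaced, both as shares of the original workforce."""
    created = demand_elasticity * cost_reduction
    return created - displaced_share

# "Brilliant" technology: a large cost saving outweighs the displacement
print(net_job_effect(displaced_share=0.2, cost_reduction=0.4))   # positive

# "So-so" technology: similar displacement, but little cost saving
print(net_job_effect(displaced_share=0.2, cost_reduction=0.05))  # negative
```

Crude as it is, the sketch captures the asymmetry: the same displacement can be either benign or harmful depending on how large the accompanying productivity gain is.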
As published in City AM Wednesday 8th May 2019
Image: Self Checkout by Ben Schumin via Wikimedia is licensed under CC-BY 2.0

Artificial intelligence will dominate every aspect of our lives, but it won’t replace us

Guess which of the 964 jobs listed in the widely used Occupational Information Network online database is the least susceptible to replacement by artificial intelligence (AI).

The unsurprising answer is that of “massage therapist”.

This is one of the findings of a paper in the latest issue of the American Economic Review by Erik Brynjolfsson and colleagues at MIT’s Sloan School of Management.

But, while this answer might seem obvious, the study itself is a serious and innovative attempt to analyse the potential impact of AI on occupations across the economy.

A key point is that AI technology itself is going through a period of revolutionary progress.

The success of Google’s DeepMind team in defeating the world champion at the immensely complex game of Go received wide publicity.

Unlike the algorithms which vanquished chess some years previously, the latest AlphaGo programme – improved since its annihilation of the Go champion less than two years ago – does not simply rely on pure computing power to outperform humans. The algorithm starts by knowing absolutely nothing about the game. It becomes stronger by playing against itself and learning as it goes along.

In short, it teaches itself, remembering both its mistakes and its successes. This type of algorithm is very new, and is known as deep learning. The programmes automatically improve their performance at a task through experience.
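The flavour of self-play learning can be captured in miniature. The sketch below is emphatically not AlphaGo’s algorithm – it uses a simple lookup table rather than a deep neural network, and the trivial game of Nim (take one to three stones; whoever takes the last stone wins) rather than Go – but the loop is the same in spirit: the program starts knowing nothing and improves solely by playing against itself, remembering which moves led to wins and which to losses.

```python
import random

def self_play_nim(pile=10, episodes=20000, alpha=0.5, eps=0.1):
    """Learn move values for Nim purely from self-play.
    Q[(stones_left, move)] estimates how good taking `move` stones is."""
    Q = {}
    for _ in range(episodes):
        state, history = pile, []
        while state > 0:
            moves = list(range(1, min(3, state) + 1))
            if random.random() < eps:      # occasionally explore a random move
                move = random.choice(moves)
            else:                          # otherwise play the best known move
                move = max(moves, key=lambda m: Q.get((state, m), 0.0))
            history.append((state, move))
            state -= move
        reward = 1.0                       # whoever took the last stone won
        for s, m in reversed(history):     # update every move played in the game
            old = Q.get((s, m), 0.0)
            Q[(s, m)] = old + alpha * (reward - old)
            reward = -reward               # the two players alternate moves
    return Q

random.seed(0)
Q = self_play_nim()
# Optimal Nim play leaves the opponent a multiple of four stones, so with
# five stones left the learned best move should be to take one:
best_move = max(range(1, 4), key=lambda m: Q.get((5, m), 0.0))
```

After a few thousand games the table rediscovers the well-known winning strategy, despite never being told anything beyond which moves are legal.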

Brynjolfsson and colleagues regard this as so significant that they describe deep learning as a “general purpose technology” (GPT).

GPTs are technologies which become pervasive throughout the economy, improve over time, and generate further innovations which are complementary.

Historically, they are few and far between. Steam and electricity are examples. If they disappeared tomorrow, we would rapidly be driven back to the living standard which existed several centuries ago.

Deep learning will take years – or even several decades – before anything like its full effects are realised. But we will then look back and find that it is just as hard to imagine a world without deep learning as it is a world without electricity.

What will that look like? The authors analyse 2,069 work activities and 18,156 tasks in the 964 occupations. From this, they build “suitability for machine learning” (SML) measures for labour inputs in the US economy. They find that most occupations in most industries have at least some tasks that are SML. Pretty obvious. But few, if any, occupations have all tasks that are SML.
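The construction can be sketched in a few lines of code: score every task on its suitability for machine learning, then look at how the scores are distributed within each occupation. The occupations, task scores and threshold below are invented for illustration – the paper’s actual rubric is far more detailed:

```python
# Hypothetical task-level SML scores (0 = unsuited, 1 = fully suited)
occupations = {
    "massage therapist": [0.10, 0.20, 0.15, 0.30],
    "data entry clerk":  [0.90, 0.85, 0.95, 0.60],
}

SML_THRESHOLD = 0.8  # assumed cut-off for calling a task "SML"

for name, scores in occupations.items():
    has_some_sml = any(s > SML_THRESHOLD for s in scores)
    all_sml = all(s > SML_THRESHOLD for s in scores)
    print(f"{name}: some tasks SML: {has_some_sml}, all tasks SML: {all_sml}")
```

Even the heavily exposed occupation fails the “all tasks” test here, which is the pattern the authors report across nearly every occupation: partial, not full, automatability.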

This latter point certainly is surprising – and from it the MIT team derives a positive message: very few jobs can be fully automated using this new technology.

A fundamental shift is needed in the debate about the effects of AI on work. Instead of the common concerns about the full automation of many jobs and pervasive occupational replacement, we should be thinking about the redesign of jobs and reengineering of business processes.

Economics is often described as the dismal science. But Brynjolfsson’s paper certainly provides very positive food for thought.

As published in City AM Wednesday 6th June 2018

Image: Robots by Kai Schreiber is licensed under CC BY 2.0

Neo-Luddites won’t like it, but the UK must keep on (driverless) truckin’

The announcement that experiments will take place with driverless lorries on UK motorways ought to be a cause for celebration. Once again, human ingenuity is pushing out the frontiers of technology.

But the general reaction in the media has been one of anxiety and concern. Wholly contradictory arguments have been advanced against them.

Driverless cars, it is argued, for example, do not mean that you can summon one to your front door and be taken to and from the pub with impunity. The drink driving laws, the opponents of progress pronounce with confidence, will still apply to the humans being transported. Yet it is also claimed that the concept of legal responsibility for accidents involving driverless cars does not yet exist. Until it is established, they cannot legally be used.

As with the introduction of railways, the law around a revolutionary technology will take some time to evolve. But the idea that a man should walk in front of the train carrying a red flag was soon given short shrift. The new technology was far too convenient to have it impeded in this way.

The opposition to driverless cars and lorries seems almost Luddite in its intensity. People currently employed in and around the activity of driving vehicles will become unemployed. Where will the new jobs come from?

I am writing this in a country house hotel in Aberdeenshire. In the room is a magazine dedicated to weddings. This, a eulogy to expensive popular culture, tells us a great deal about how the labour market evolves.

Many of the activities around modern weddings involved jobs which were either completely non-existent only a few decades ago, or only catered to a tiny number of ultra-rich individuals.

The adverts for venues, for example, usually stress that a dedicated wedding co-ordinator will be assigned to you during the planning stages. And a dedicated wedding events manager will ensure the day itself goes smoothly. Bridalwear experts can be hired to advise on the choice of costumes. People can, and do, pay substantial fees to be told that “if you plan to marry at the height of summer in Spain, a heavy material such as velvet is inadvisable”.

Special courses of dance lessons are available so that the bride and groom can perform a “full-on choreographed, fabulous first dance”. The potential activities around hen and stag events know no bounds. An adventure activity day is offered involving “Segways or zorbing”.

Specific fitness courses are offered to ensure that not only the bride and groom but their entire supporting cast look suitably “toned and sculpted”. Even your faithful pooch can be groomed for the occasion, and look glowing through consuming organic dog food. What a pity there was no advert for vegan canine sustenance.

This is a snapshot of how innovation impacts the economy. Technology enables a product or service to be provided more cheaply and at a higher quality.  Some people directly involved lose their jobs. But everyone else is made better off, and their extra spending creates entirely new types of jobs.

Paul Ormerod 

As published in City AM Wednesday 29th August 2017

Image by Pxhere used under CC0 attribution.