WHY is productivity growth low if information technology is advancing rapidly? Prominent in the 1980s and early 1990s, this question has in recent years again become one of the hottest in economics. Its salience has grown as techies have become convinced that machine learning and artificial intelligence will soon put hordes of workers out of work (among tech moguls, Bill Gates has called for a robot tax to deter automation, and Elon Musk for a universal basic income). Many economists think a surge in productivity that would leave millions on the scrapheap is unlikely to arrive soon, if at all. Yet this year’s meeting of the American Economic Association, which wound up in Philadelphia on January 7th, showed that they are taking the tech believers seriously. A session on weak productivity growth was busy; the many sessions covering the implications of automation were packed out.
Recent history seems to support productivity pessimism. From 1995 to 2004 output per hour worked in America grew at an annual average pace of 2.5%; from 2004 to 2016 the pace was just 1%. Elsewhere in the G7 group of rich countries, the pace has been slower still. An obvious explanation is that the financial crisis of 2007-08 led firms to defer productivity-boosting investment. Not so, say John Fernald, of the Federal Reserve Bank of San Francisco, and co-authors, who estimate that in America the slowdown began in…