

By Edward H. Baker  |  Posted 01-07-2007

A Brief History of Productivity

As it turns out, economists attributed much of the recent slowdown to a general weakening in economic growth: When gross domestic product slows down, as it did in the third quarter of 2006, it's often accompanied by a slowdown in productivity growth, as companies anticipating further gains continue to add labor to the productivity equation without concomitant gains in output. But this time around, the fears of a productivity slowdown seemed to rattle more than a few nerves in the IT community.

The metrics ordinarily used to gauge productivity, such as labor productivity and even multifactor productivity, involve simply adding up all the known inputs and outputs and then doing the math. In the case of labor productivity, that's just output divided by the number of hours worked. In the case of multifactor (or total factor) productivity, that's all the known outputs divided by all the known inputs. Pretty straightforward, and it generally worked, at least as a rough measure during the Industrial Age, when the inputs and outputs were relatively easily tallied.
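The arithmetic really is that simple. A minimal sketch of both ratios, using hypothetical dollar and hour figures chosen purely for illustration:

```python
def labor_productivity(output, hours_worked):
    """Labor productivity: output divided by hours worked."""
    return output / hours_worked

def multifactor_productivity(outputs, inputs):
    """Multifactor (total factor) productivity: all known
    outputs divided by all known inputs, valued in common units."""
    return sum(outputs) / sum(inputs)

# Hypothetical firm: $500,000 of output from 10,000 labor hours.
print(labor_productivity(500_000, 10_000))  # 50.0 dollars of output per hour

# Hypothetical inputs (labor, capital, materials), all in dollars.
print(multifactor_productivity([500_000], [200_000, 150_000, 50_000]))  # 1.25
```

The hard part, as the rest of the article makes clear, is not the division but deciding what counts as an input or an output in the first place.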

But the arrival of the Digital Age has made that tally far harder. In the early days of the information revolution, when IT resided beyond the glass wall, IT's contribution to productivity was relatively easy to assess. Take the cutting of payroll checks, for instance: How many laborious hours spent writing individual checks could you eliminate by feeding payroll data into a computer that automatically calculated the amount of the check, and withholding and Social Security taxes, and then printed it? Machines replaced humans, and it was simple to see the return.

By the late 1980s, however, with the advent of personal computers, networks and back-end systems that touched more parts of the business, the math became more difficult. In 1987, Nobel Prize-winning economist Robert Solow wrote, "You can see the computer age everywhere but in the productivity statistics." The problem, analysts decided, was either a lag in the time it took for companies to benefit from their IT investments, or the difficulty of measuring IT's contribution accurately—or both.

Then came the 1990s, which brought about a transformation in IT and how businesses used it. The Internet, the graphical browser, e-commerce, and enterprise-level software such as ERP and CRM drove massive investments in computers, networking equipment and software. But all that investment only fueled the controversy surrounding just how much IT contributes to productivity gains, and whether it is accurately measured.

In 2001, McKinsey & Co.'s McKinsey Global Institute published a still-controversial report arguing that most of the productivity gains in the 1990s were attributable to just six sectors of the economy: telecom, semiconductors, computer manufacturing, securities, and wholesale and retail distribution. In other sectors productivity was essentially flat or had actually declined.

That led McKinsey to suggest that IT was just one of a number of factors that generated the decade's productivity gains. "The problem," says Diana Farrell, director of the McKinsey Global Institute, "was that all sectors of the economy spent on IT, but because the productivity gains were concentrated in just those six sectors, you clearly can't attribute the growth to IT alone." IT is needed to facilitate productivity gains, she says, but only under the right conditions, which include sufficient competitive intensity and sufficient demand—exactly the conditions faced by those six sectors that contributed so much to the productivity gains of the 1990s.

Erik Brynjolfsson, the George and Sandra Schussel Professor of Management at the MIT Sloan School of Management and director of the MIT Center for Digital Business, offers a different rationale. To explain what happened, Brynjolfsson has developed a concept he calls organizational capital:

"The work I've been doing suggests that most of the benefits from IT come from complementary investments in what we call organizational capital—business processes and other changes in the way companies are organized. For every dollar spent on IT hardware, $10 is spent on this kind of business reorganization.

"What happened in the 1990s was that companies were investing a lot, not only in IT, but also a tremendous amount in organizational capital and new business processes, and those investments tend to take several years to pay off. Go forward a few years to 2001–02, when IT spending dropped, as did investments in organizational capital. The focus was not in adding to organizational capital, but in harvesting what had been done. The way that affects productivity is that the government statistics don't measure organizational capital at all, but they do measure the output that's generated from it. So that is why we had very high productivity, circa 2001–03, as we were reaping the benefits of these investments but not incurring the expense of new investments in organizational capital," he says.

And the recent drop in productivity? Says Brynjolfsson: "If you don't invest in IT or organizational capital, then three, four, five years down the road, you're not going to be in the position to get decent returns. So five years later, 2006, that's more or less exactly what's happened: The investments that weren't made in 2001 are the reason we're not having the comparable level of productivity growth today."

