Expert Voice: Paul Romer on the New Economy

In the late 1990s, the name of Paul Romer was frequently associated with the rise of the “New Economy.” In 1997, Time called him one of America’s 25 most influential people, and Newsweek included him in its list of “100 people for the New Century.” That’s because Romer, who teaches economics at Stanford University’s business school, is the originator of “New Growth Theory,” a key tenet of which holds that long-term economic progress is dependent on advances in technology and the ideas that produce those advances. Economists have long sought a rigorous explanation of the connection between the wealth of nations and the technologies that contribute to that wealth. But it wasn’t until the mid-1980s that Romer and others began to make that connection explicit, overturning the theorists of scarcity from Malthus to Paul Ehrlich and the Club of Rome. Such notions naturally appealed to the exponents of the New Economy, who saw Romer’s work as an aid in justifying their own activities.

The New Economy has been dead for two or three years now. And Romer never bought the notion that the Internet boom proved—or the bust disproved—the importance of New Growth Theory. Romer welcomed the attention that coverage of the boom brought to his theory, but in his work he typically takes the long view of economic progress, measured not in the nanoseconds of day trading but in the rise and fall of whole economies. Yet he has much light to shed on what was temporary and what was permanent about the late 1990s.

Professor Romer recently took some time out of his busy schedule—he’s been spending a lot of it running Aplia Inc., the company he founded in 2000 to make productivity software for professors in the classroom—to chat with Executive Editor Edward H. Baker about why the economy is not in as dire a situation as many make it out to be, and what actually drives productivity and profits in a highly competitive economy. What follows is an edited version of that conversation.

CIO INSIGHT: The Internet boom was advertised as a permanent change—the “New Economy.” Yet it collapsed. What’s your assessment of the late 1990s and the current state of the economy?

ROMER: There’s a pronounced tendency for people to overreact to current events, to think that a temporary deviation in one direction or the other represents some kind of permanent fundamental change in the economy. Every time there’s an increase in growth associated with a boom—the Internet boom, for example—people tend to extrapolate and say, ‘Oh, you know, our long-run prospects have permanently changed.’ Whenever I’m asked about the state of the economy, I like to make the distinction between what economists call potential output and actual output. Potential output—what the economy could produce without generating inflationary pressures—grows at a relatively steady rate over time and gives you some indication of the level of output the economy could track along without causing any disruptions. If you’re below potential, you have unemployment; if you’re above potential, you risk accelerating inflation. But potential is kind of a happy medium along which output can grow.

On top of potential output are superimposed fluctuations that sometimes move us above potential, sometimes below. Those are the recessions and the booms that attract lots of attention. But observers often fail to distinguish potential and actual output. During booms, they think the current fast growth means we’ll have fast growth and high potential output for the indefinite future. Conversely, every time things slow down, they think, ‘Oh, the good times are over; we’re never going to have growth again. It’s all doom and gloom.’
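\
To make that distinction concrete, here is a minimal numerical sketch of the trend-versus-cycle framing Romer describes. The 3 percent trend growth rate, the 2 percent cycle amplitude, and the eight-year cycle are illustrative assumptions for the sketch, not figures from the interview.

```python
# A minimal sketch of the potential-vs-actual distinction Romer describes:
# actual output = potential output (a steady trend) plus a cyclical swing.
# All numbers here are illustrative assumptions, not data from the interview.
import math

POTENTIAL_GROWTH = 0.03   # assumed 3% annual growth in potential output
CYCLE_AMPLITUDE = 0.02    # assumed +/-2% boom/bust swing around potential
CYCLE_PERIOD = 8          # assumed 8-year business cycle

def potential_output(year: int, base: float = 1.0) -> float:
    """Potential output grows at a relatively steady rate."""
    return base * (1 + POTENTIAL_GROWTH) ** year

def actual_output(year: int) -> float:
    """Actual output fluctuates above and below potential."""
    gap = CYCLE_AMPLITUDE * math.sin(2 * math.pi * year / CYCLE_PERIOD)
    return potential_output(year) * (1 + gap)

for year in range(0, 9, 2):
    pot, act = potential_output(year), actual_output(year)
    if act > pot * 1.001:
        state = "boom: inflation risk"
    elif act < pot * 0.999:
        state = "below potential: unemployment"
    else:
        state = "roughly at potential"
    print(f"year {year}: potential {pot:.3f}, actual {act:.3f} ({state})")
```

The point of the sketch is only that the same cyclical swing reads as euphoria on the way up and doom on the way down, while potential keeps growing at roughly the same rate underneath.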

We’ve gone through this kind of cycle again and again. In the 1970s the theory was that limits to growth and resource scarcity meant that standards of living not only weren’t going to grow but were actually going to fall in the near future. Then we had a sharp downturn in the early 1980s, then a recovery, and then a growing sense of confidence in the late 1980s that we’d turned things around in the economy with supply-side economics and the Reagan revolution and so forth.

Then we had a relatively short recession during the early 1990s. Doom and gloom came back; there was all this talk about the jobless recovery—we weren’t generating enough jobs, and unemployment was going to be permanently high. And then just a few years later, unemployment was at the lowest rate in three decades.

So with that as framing, consider the late 1990s. We had very low unemployment, low inflation, high stock prices relative to earnings and strong productivity growth. But instead of thinking, ‘OK, this is a temporary movement that won’t be sustained forever,’ some economists, public commentators, the press, consultants, politicians and the like sold this as a permanent, fundamental change in the whole economy.

So we had a boom that was to a large extent investment-driven. Then we had a sharp falloff in investment by corporations. The Federal Reserve Board has reacted the way it should react in this kind of environment. They were actually raising interest rates toward the end of the boom, and now they’ve dramatically cut them. And the economy is responding the way it usually does: Growth is returning, but it takes a while. Yet every time we get to this situation in the economy, we get all this hand-wringing—’Oh no, nothing’s happening yet. The Fed cut interest rates, and it’s not going to work this time.’ The reality is it just takes time for the economy to recover. If anything, this recession was pretty mild and has turned around relatively quickly.

Are you suggesting that technology—the Internet, massive investment in information technology—had little real effect on the so-called Internet boom?

The way to think about technology is that it’s the thing that drives that steady improvement in potential output, and we should be careful not to think the short-run fluctuations in growth rates are driven by sharp fluctuations in technology. If you think about things like computerization, that’s a technological process that has been unfolding for decades in the economy, and it’s paying off in terms of higher productivity. But there’s reason to think that process wasn’t dramatically different in the late 1990s than in the early 1990s or the late 1980s. So technological change is really important. It’s in some sense the most important economic force, but it probably doesn’t have as much to do with shorter-run fluctuations as many people think.

For instance, I think people probably overestimated the importance of the Internet to the economy as a whole. It got lots of attention, and it probably was critical in creating the climate for this speculative bubble. We’ve seen episodes in the past, such as the rise of the railroads, where exciting new technologies generated a lot of speculation in asset markets.

But if you look at what really drives productivity, it’s much more complicated than just putting Web interfaces on traditional activities. In 2001, McKinsey & Co. published a study that looked at sources of productivity growth in the U.S. economy between 1995 and 2000 and pointed to the remarkable share of productivity growth in retail trade that could be attributed to just one company, Wal-Mart. That growth relied on the effective use of IT, but most of it was not directly Web-based. It involved a lot of careful attention to the blocking and tackling of economic activity—managing the distribution system, making sure the right goods were on the shelf when people wanted them. That kind of activity is what ultimately leads to improvements in standards of living: taking new technologies that provide the ability to track goods and have more centralized inventory management, but linking those with workflow patterns and work practices in ways that give you many small improvements in productivity and cost and additional benefits for consumers.

I always try to get people to think about technology in a broad sense. McKinsey is debunking the view that IT by itself led to systemic productivity growth throughout the whole economy. Instead, we should be thinking about many other kinds of technology. As an example, in stores like Starbucks, it used to be that there were three different sizes of coffee cups with three different sizes of lids. Somebody had to track all that inventory and make sure customers had access to all these lids. And then somebody realized they could redesign the three different size cups so they all used the same lid. That way, you save on worker effort and tracking and costs, and you get some persistent advantages from that change.

Well, that’s an example of a technological change in the sense that I use it: people figuring out a way to redesign things and change work practices to get a small improvement in productivity. Thinking that way makes you recognize that the focus on digital information technology is much too narrow, and there is this whole broad range of different ways to do what we do. Technology interpreted that way is pervasive and fundamental to productivity growth.

So the idea that somehow the Internet arrives and suddenly all by itself leads to a big change in productivity and standards of living is just implausible. One way to think about this is to ask typical consumers to look around and see what they spend money on—housing, food, clothing. Think about their expenditure patterns. Think what would make them better off.

Then ask, well, how much does the Internet directly impact all of the economic activity they engage in? It’s reduced the prices of books somewhat, perhaps. It’s actually gotten a little bit cheaper to refinance your home mortgage, for example. But if you really look at all the things we do, the Internet got too much attention compared with all the other things that matter for improving standards of living.

Furthermore, the Internet’s share in a $10 trillion economy is still pretty small, as valuable as it is. It takes a lot of changes to raise productivity by 1 percent in a $10 trillion economy. That’s $100 billion.
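\
The arithmetic behind that figure is simple enough to check directly:

```python
# Quick check of Romer's arithmetic: a 1 percent productivity gain
# in a $10 trillion economy is worth $100 billion.
gdp = 10e12                # $10 trillion
gain = 0.01 * gdp          # 1 percent of GDP
print(f"${gain / 1e9:.0f} billion")  # -> $100 billion
```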

You have to look at productivity growth in terms of interactions. It’s the combination of something like an IT innovation together with changes in work practices and organizational structure that leads to productivity growth. It’s a naïve view to think that you can just layer on IT and technology and get a big gain from it.

Now a second factor other commentators have pointed to is that even if the Internet creates a lot of value, it’s not clear who’s going to capture that value. Let’s imagine the Internet creates a flow of, say, $50 billion of value. It could be that the vast majority of that value actually flows through to consumers in the form of lower prices and time savings and doesn’t generate all that much profit for the firms that are participating in that activity.

That’s what you would expect if you have a lot of competition on the Internet: The gains will go to the consumers rather than the shareholders or the firms.

And so the lack of profits from all that investment in technology resulted in the relatively sudden downturn in IT investment?

In the late 1990s everybody was rushing to implement technology as fast as they could because they thought, ‘I might be the next eBay,’ or ‘I’ve got to do it before somebody comes in and displaces my whole business.’ Then, people finally realized that all that investment in technology isn’t actually leading to the kind of profits that would justify that level of investment, and so they scaled back investment to a more sustainable level.

If anything, one of the problems with investment—with all kinds, not just IT—is that when you overinvest and you realize you’ve got too much stuff, your spending can change dramatically. If I’ve been buying cereal and I stick it all in the pantry, and then suddenly I realize I’ve got a six months’ supply of cereal, it’s not like my spending on cereal is going to go down 5 percent or 10 percent. I stop buying cereal entirely because I don’t need any more boxes.
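\
Romer’s cereal story is, in effect, a stock-adjustment dynamic: when the stockpile overshoots the desired level, purchases don’t fall a few percent, they stop entirely until consumption works off the excess. Here is a minimal sketch of that dynamic, with made-up numbers that are not from the interview:

```python
# A minimal sketch of the inventory-overhang dynamic in Romer's cereal
# example: once the stockpile exceeds what you want on hand, purchases
# drop to zero until consumption works off the excess. All numbers are
# illustrative assumptions.
DESIRED_STOCK = 2   # boxes you want in the pantry at the end of a period
CONSUMPTION = 1     # boxes eaten per period

stock = 6           # overbought during the "boom": six periods' supply
for period in range(1, 9):
    # Buy only the shortfall needed to end the period at DESIRED_STOCK.
    purchase = max(0, DESIRED_STOCK + CONSUMPTION - stock)
    stock += purchase - CONSUMPTION
    print(f"period {period}: bought {purchase}, stock now {stock}")
# Purchases sit at 0 while the backlog clears, then snap back to a
# steady replacement rate: a sharp falloff, followed by a return to trend.
```

Purchases jump from zero straight back to a sustainable replacement rate, which is the same pattern Romer describes for IT investment below.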

So there was a predictable and not very surprising adjustment from an unusually high level of investment in IT to a lower level while we clean out the backlog. IT investment will return to a more sustainable level, something like what it was in the 1980s and early 1990s, and it will continue to have big cumulative impacts on the economy. It’s another case of fluctuations around the trend.

If a greater and greater portion of the value of new ideas is going to the consumer and not to companies, will that reduce the incentives to create new ideas?

The evidence seems to point in that direction. The very same highly competitive conditions that benefit consumers mean that a new entrant who has a valuable new idea doesn’t actually capture all of the value they create with that new idea. Lots of the value created by the new idea flows through to the consumer. The person who comes up with the new idea cannot patent and control all its benefits. What that means for the economy as a whole is there isn’t as much new idea creation as would be ideal. The incentives for creating new ideas aren’t as big as they should be.

Now there are two ways you could respond to that. One would be to try to make intellectual property rights much stronger, by strengthening patents and legal protection. But that can pose a lot of risk to the continued process of innovation. We might end up with a system that gives a lot of patent protection revenue to people and corporations right now, but makes it much harder for somebody new to come along with a new idea.

So for decades, many economists have been hesitant to rely exclusively on property rights and proprietary control to create additional incentives for ideas. What I’ve suggested as an alternative is this: If you want to get more ideas, one way is to subsidize the activities that lead to the production of those ideas. In particular, subsidize universities as important sources of idea generation, and subsidize the training of the people who go through those universities and then enter the economy and come up with ideas like cross-docking at Wal-Mart.

So other economists and I have been arguing for a long time that the government has an important role in encouraging the creation of new ideas, and letting them get fed out into a market system where people can capture profits from innovating. Those profits are important, but they will never be big enough by themselves to encourage the amount of idea creation that would be ideal for the economy. The market is a wonderfully powerful engine for economic growth, but it runs much faster when the government turbo-charges it with strong financial and institutional support for education, science, and the free dissemination of ideas.

Paul Romer is the STANCO 25 Professor of Economics at the Graduate School of Business at Stanford University and a Senior Fellow at the Hoover Institution. The founder and chairman of Aplia Inc., a San Carlos, Calif., software firm that designs and markets productivity software for the classroom, Romer received his Ph.D. in economics from the University of Chicago in 1983.
