Analysis: The Simplicity Paradox

By Edward H. Baker  |  Posted 04-01-2003

The concept is compelling: An IT architecture designed to free companies from the complexity and expense of islands of IT infrastructure. The ability to provide computing capacity to business units as they need it. Pay-as-you-go outsourcing models that free corporations from the need to buy, operate and maintain large, unwieldy, redundant IT operations. Simplified platforms and standards on which every business unit can depend. CIOs unleashed from the drudgery of micromanaging operations and maintaining old technologies, and freed up to help business units define flexible corporate IT strategies.

Making the vision of utility computing real has the power to revolutionize the business use of IT by automating the management of IT infrastructure, tightly integrating all kinds of business processes, and flexibly managing capacity. The result: savings of up to 50 percent of infrastructure costs, says John Parkinson, chief technologist for the Americas at Cap Gemini Ernst & Young. That's a prospect few CIOs will be able to resist, especially at a time when cost-cutting pressures are high. Meanwhile, the shift to utility computing—whether companies build their own internal IT utilities or farm out their computing needs in an on-demand model—will mean a significant reorganization of how the IT industry produces and sells products to its corporate clients.

The dream of rapidly and flexibly delivering computing power, applications and storage dates back to the time-sharing of the 1960s and passes through the mid-1990s promise of distributed computing. But all of a sudden, the vision has begun to come together. In the past year, companies as disparate as IBM Corp., Hewlett-Packard Co., Sun Microsystems Inc., EDS Corp. and Microsoft Corp. have announced products and services designed to support the effort, and a host of smaller software companies are developing the middleware to manage and automate the systems.

Says Parkinson: "In ten years, of the 10,000 largest companies in the U.S., only about 200 will be big enough to be their own utility, and only about 100 of those will choose to do it themselves. The driving economics of talent scarcity, user expectations and rate of change will force the rest into a utility provisioning strategy."

None of this is lost on CIOs. CIO Insight recently asked more than 400 IT executives whether they thought a utility computing model would be beneficial to their companies. Despite the novelty of the technology, 44 percent said yes, and that number jumped to 54 percent among companies with more than 1,000 employees. But completing the vision—whether it takes three years or ten years—isn't just a matter of technology. The advent of utility computing poses several very thorny non-technical paradoxes—the prospect of more business process reengineering, a new financing model, and a shift in the role of the CIO—that CIOs must resolve if they are to meet the future head on.

The Business Process Paradox

There's little point in developing and implementing a utility computing model if the move can't help reduce complexity. Just about every corporation is saddled with layer upon layer of needless, redundant IT systems that don't communicate with each other. And those layers of technology are inevitably mirrored in costly layers of needlessly complex business processes. Utility computing holds the promise of doing away with all those old, separated systems, replacing them with a streamlined infrastructure that every business unit can access for its computing needs.

The paradox, however, lies in the potential for utility computing to set off another round of wrenching business process re-engineering. The drive to consolidate servers and standardize platforms and applications will become a powerful force for simplicity and savings. But that simplicity doesn't come easy. Will the standardization required on the IT level force business units to rethink all their business processes to conform?

Forcing separate business units, with their own cultures, agendas, and ingrained business and IT process habits, to standardize on, say, a financial system or a new server platform is never easy, say many experts—no matter how attractive the benefits. Dev Mukherjee, vice president of e-business on demand strategy at IBM, concedes that reaping the full benefits of utility computing is a symbiotic process: "You can't do the process transformation unless you do the IT transformation," he notes. "And you don't really get the full value out of the IT transformation unless you transform your business processes alongside them."

Corporations beginning to head down the road toward a utility computing infrastructure already have had to come to terms with the need to revamp their business processes. Steve Karl, senior vice president of technology operations at American Express, which last year signed a contract with IBM Global Services worth more than $4 billion over seven years, champions the virtues of standardization. But he concedes that standardization can have a negative effect on business process innovation. "Innovation is important to us, and we're aware that standardization can kill innovation," Karl says. "We need to support innovation, yet even if it takes more time, it's an absolute requirement to be really clear what we're trying to achieve on a business objective level."

Getting the full value of utility computing will always depend on whether CIOs can get cooperation from the business units. Will they be willing to develop business processes that can share corporate IT resources, and then actually share them? "A big impediment to this whole concept is the idea of sharing among business units," says Gartner Inc. analyst Donna Scott. "Each business unit has spent more money than they need to on their infrastructure, but that doesn't mean they necessarily want to give it up. You've got to change that orientation, and that will take time."

The only way to do that, says AmEx's Karl, is to work from a highly centralized IT structure: "Some IT structures lend themselves to standardization and utility computing and some do not. In companies where IT is very decentralized, with a corporate CIO and line of business CIOs, frankly, I think achieving utility computing would be very difficult."

Even as they concede the pitfalls, many experts believe that a utility computing infrastructure provides an ideal environment for re-engineering processes—indeed, that it will encourage the process. "In consolidating your servers," says IBM's Mukherjee, "you standardize and you simplify. The result: You make your IT infrastructure more efficient, more flexible and more resilient. And that's an ideal environment for business transformation." Gartner's Scott agrees: "The idea is that as business processes are reengineered, the IT infrastructure can provide what they need." The more standardized your IT, the more agile your business processes. But you can't standardize everything, Scott concedes: "There will still be customized infrastructure within each business unit, and of course, every enterprise will have to decide where it makes sense to standardize and where it makes sense to customize."

And some contrarians believe there's no paradox at all. "The goal is to transform the corporation from the inside out," says Vernon Turner, an analyst at IDC. "You don't want to try to change the behavior of the business units. You want to change the behavior of the data center." Indeed, Turner warns, if another round of business reengineering is required, "then utility computing will never be implemented."

The IT Financing Paradox

Companies have traditionally purchased the equipment and software for their IT infrastructures and then capitalized the cost, turning to a variety of entities—from vendor financing to third-party commercial equipment financing companies—to finance those purchases. But what happens in a world in which much of that infrastructure is taken over by utility service providers, with the computing capacity sold back to corporations on an on-demand basis, in which there are no assets to back up financing deals and the costs that need to be financed are variable?

Moving to a utility model will save so much money that the costs can be paid for out of operating cash flow, say many experts. And even if corporations need to finance those costs, finding financing should be straightforward. Says Charles Rutstein, research director at Forrester Research Inc.: "The captive financing guys don't much care. 'You want to buy a server? Here are the financing terms. You want to buy on-demand services? Same financing terms.' This is a fairly routine decision."

Others, however, insist that it's not so simple. Says CGE&Y's Parkinson: "CIOs are saying to themselves: 'I'd really like to take my computing platform costs off my balance sheet and put it on my operating statement so I only pay for what I need rather than what I have to buy.' " As he notes, that's a "hugely attractive deal." But its very attractiveness holds some hidden traps. Consider just the bottom 500 companies in the Fortune 1000: They spend an average of $57 million on IT annually, according to CGE&Y's calculations, of which capital costs associated with IT account for 20 percent. That leaves about $46 million in annual IT operating costs. An outsourced utility computing effort will reap $6 million in savings in the first year, increasing to $15 million in the sixth. Not bad. That means the average company in the sector will spend some $237 million in operating costs over the first five years of an on-demand program—about $120 billion for the whole sector. By the sixth year, that works out to some $15 billion annually—and that's just for these 500 companies. Says Parkinson, "You've still got to have a source of financing for those operating costs and you've got to find someone who's willing to take the financing risk."
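Parkinson's figures can be reproduced with a short back-of-the-envelope sketch. The linear ramp in savings from $6 million (year one) to $15 million (year six), and the netting of savings against gross spend over the first five years, are assumptions made here to match the quoted numbers; they are not from CGE&Y's published model.

```python
# Sketch of the CGE&Y arithmetic quoted above. All amounts are in
# millions of dollars unless noted. The linear savings ramp from
# $6M (year 1) to $15M (year 6) is an assumption.
def cgey_figures(companies=500, annual_it_spend=57.0, capital_share=0.20,
                 savings_y1=6.0, savings_y6=15.0):
    step = (savings_y6 - savings_y1) / 5                  # assumed linear ramp
    savings = [savings_y1 + step * y for y in range(6)]   # years 1..6
    # Five-year spend per company, net of cumulative savings
    five_year_spend = 5 * annual_it_spend - sum(savings[:5])   # ~$237M
    sector_five_year = five_year_spend * companies / 1000      # ~$120B ($B)
    # Sixth-year sector run rate, computed on the operating-cost base
    op_base = annual_it_spend * (1 - capital_share)            # ~$46M
    sector_year6 = (op_base - savings[5]) * companies / 1000   # ~$15B ($B)
    return five_year_spend, sector_five_year, sector_year6
```

Under these assumptions the per-company five-year figure comes out at $237 million, the sector total at roughly $118.5 billion (the article's "about $120 billion"), and the sixth-year sector run rate at about $15.3 billion.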

That's a large amount for vendors to finance alone, and the equipment financing industry is not yet structured to accommodate much of it at all. "The product risk is different in a usage-based program. Without knowing how much or how long the customer will use the service, it's very difficult to come up with a schedule of lease payments to recover costs. And that's not a risk finance companies have traditionally taken," says Greg Goldstein, head of portfolio management at CIT Group Inc., a commercial and consumer finance company. "And I can't foresee that happening. So if there's three parties to a transaction—the customer, the vendor and the finance company—who's going to accept that risk? Probably the vendors. A further challenge is that vendors may not be willing to accept the additional credit risk in financing the ongoing expenses of less creditworthy companies."

Equally problematic from a cost perspective is how to price the software needed. Software companies have traditionally priced their wares on a per-seat or per-box basis. Variably pricing software as a function of demand or utilization isn't quite as easy. Says Leon Billis, president and CEO of AXA Technology Services: "In some cases you're finding software companies saying, 'Okay, I see the trend, and I'm willing to repackage my pricing for you.' Others are saying, 'No, you're going to pay for the size of the box that license sits on, regardless of consumption.'" About 10 percent of AXA Tech's total annual IT spending of about $1 billion goes to OEM software (not including desktop software), so gaining cost variability on software would mean significant savings.

Software companies will not be able to hold the line against increasing demand for variable pricing, say most observers. "Software vendors understand that the world is changing," says Forrester's Rutstein. "But in terms of what to do about it, they're not so sure. Is it a perpetual license? Is it a subscription? Do I create some elasticity? Do I do it by user, by CPU, by capacity? We're going to go through a period where it's a very ad hoc negotiated settlement, case by case."
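The gap between the two pricing philosophies Billis and Rutstein describe can be made concrete with a hypothetical comparison. All prices and utilization figures below are illustrative assumptions, not vendor numbers.

```python
# Hypothetical contrast between the two software-pricing models in the
# text: a flat license priced on the size of the box, vs. usage-based
# pricing on actual consumption. All rates are illustrative assumptions.
def per_box_cost(box_capacity_cpus, price_per_cpu=1000.0):
    """Flat license: pay for the capacity of the box, regardless of use."""
    return box_capacity_cpus * price_per_cpu

def usage_cost(cpu_hours_used, rate_per_cpu_hour=0.25):
    """Usage-based license: pay only for what is actually consumed."""
    return cpu_hours_used * rate_per_cpu_hour

# A 32-CPU box running at 20% average utilization for one year (8,760 hours):
flat = per_box_cost(32)
metered = usage_cost(32 * 0.20 * 8760)
```

At low utilization the metered model is far cheaper, which is exactly why buyers push for it and why some vendors resist repackaging their pricing.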

The Paradox of the CIO

How do CIOs fit into a utility computing model? Will they be relegated to the role of managers of utility services or of the on-demand contract? Or will utility computing free them from the mundane chores of IT management to become full strategic partners of their business-side colleagues? The latter, say many experts. And there's an added benefit, claims IBM's Mukherjee: "With the savings from a utility model, IT can actually be the funding source for business transformation. If you can turn your budget into real value-added, you have a greater chance of being a hero."

No one is suggesting that moving to a utility computing model will be easy, so a further role of the CIO is to make it work. "The CIO has to have a vision," says Gartner's Scott. "There are so many impediments to making this happen. And it's not just technology. How do you charge back? How do you share inside the organization? How do you reassure business units they'll still get the resources they need? This is an evangelistic kind of sell for the CIO, and he's extremely important to the process."

The role of the CIO as evangelical hero is an attractive one. But heading down this visionary path will require many cultural changes on the part of IT and the CIO. Consider the still-prevalent notion of the CIO as technology guru. Says Gartner analyst Ben Pring: "Moving to utility computing is not really about the actual management challenge. It's really more about the philosophy of what you're trying to do. It's whether you as a technology person, who has existed in the mystique of technology, really want to simplify what you're doing. A lot of people, whose careers and personal reputations rest on being the witch doctor, don't really want to. But if they don't, it's going to be wrested away from them."

CIOs have a lot to gain in the shift to utility computing, but they also have a lot to lose. CIOs who feel that their power base depends on the size of their budget and on their role as technology guru face the possibility of being outflanked. That's because the benefits—even if they're still largely theoretical—are just too attractive. "A year from now," says CGE&Y's Parkinson, "you'll be reading articles about companies that have already gained significant savings—and competitive advantage—by moving to this model."

How should CIOs prepare to make that move? "Get real," says Parkinson. "Look very hard at the assets you manage on behalf of the corporation and simplify and standardize the heck out of them, because you can't justify having the range of ways to do the same thing you have today." You can capture significant benefits through server consolidation and standardization even before a true utility model is ready for use.

In moving to utility computing, Gartner's Pring advises taking a portfolio approach to IT assets: "CIOs have got to categorize their portfolio of applications according to their strategic value. Applications that aren't truly differentiating for the business should be provided as a utility." Once that process is completed, says Pring, CIOs can concentrate on looking for new applications that will truly give their companies a strategic business advantage.

A further concern will be the ability to equitably and accurately apportion IT costs among the business units, says IDC's Turner. "Internal service-level agreements are going to be the big thing to make this work. CIOs who come up with a good billing mechanism will have an advantage because that will make this much easier and much more acceptable to the business units."
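The billing mechanism Turner describes could be as simple as apportioning the shared utility's cost in proportion to metered consumption. The sketch below is a minimal illustration; the business-unit names and figures are hypothetical.

```python
# Minimal sketch of a usage-based internal chargeback: shared utility
# costs are apportioned to business units in proportion to metered
# consumption. Unit names and amounts are hypothetical.
def charge_back(total_cost, usage_by_unit):
    total_usage = sum(usage_by_unit.values())
    return {unit: total_cost * used / total_usage
            for unit, used in usage_by_unit.items()}

# A $1M monthly pool split across three units by metered CPU-hours:
bills = charge_back(1_000_000, {"cards": 500, "travel": 300, "insurance": 200})
```

Each unit's bill tracks its share of measured usage—the "cards" unit, with half the consumption, pays half the pool—which is the transparency that makes sharing acceptable to the business units.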

The bottom line: Utility computing ultimately will bring IT and the business side even closer together, but that will demand a new level of cooperation between the two. Can they work together to develop new, streamlined business processes based on simplified, standardized IT architectures? Can they work out adequate structures for sharing, and paying for, computing services? Can they leverage the money saved to create new, strategically beneficial technologies? If so, the lovely vision of utility computing will finally become a reality.

What's in a Name?

Vendors, consultants and analysts are already trying to brand their version of utility computing.

Adaptive Enterprise – Cap Gemini Ernst & Young

Dynamic Systems Initiative – Microsoft Corp.

e-Business on Demand – IBM Corp.

N1 – Sun Microsystems Inc.

On-Demand Computing – EDS Corp.

Organic IT – Forrester Research Inc.

Policy-Based Computing – Gartner Inc.

Real-Time Enterprise – Gartner Inc.

Utility Data Center (UDC) – Hewlett-Packard Co.

Resources

PAPERS

"The Adaptive Imperative"
Perspectives on Business Innovation
Cap Gemini Ernst & Young, March 2003

"Organic IT"
By Frank D. Gillett et al.
TechStrategy Report
Forrester Research Inc., April 2002

"Innovative Technologies: What's the Impact for IT Services Providers and End-User Organizations?"
By Alan MacNeela and Peter Redshaw
Gartner Inc., November 2002