Pay As You Go

It's just a small leap from hosted, subscription-based applications to software priced and delivered as a utility—that is, software distributed via the Web, though not through an ASP, and charged by usage rather than by preset fees under a contractual arrangement. And while this model remains one of the smallest of the many new alternatives to purchasing software outright, it is also one of the most intriguing. The value of utility computing, proponents say, shows most clearly at companies with highly variable but temporary demands on technology. At retailers, for example, the six weeks at the end of the year account for a large share of annual business. During that stretch, inventory and supply-chain networks are stressed to their limits, and it's impossible for these companies to know beforehand exactly how much additional computing capacity they will need.
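The economics behind that argument can be sketched as simple arithmetic. The sketch below is illustrative only—the unit counts, rates, and function names are hypothetical, not drawn from any vendor's actual pricing—but it shows why paying per week of drawn capacity can undercut buying enough hardware and software for the peak and letting it sit idle the rest of the year.

```python
# Hypothetical comparison of two pricing models for a retailer whose
# computing demand spikes during a six-week year-end peak.

PEAK_WEEKS = 6
OFF_WEEKS = 46  # remainder of the year at baseline load


def owned_cost(peak_units, unit_purchase_price):
    # Buying capacity up front: you must size for the peak,
    # so you pay for peak-level capacity all year.
    return peak_units * unit_purchase_price


def metered_cost(base_units, peak_units, unit_weekly_rate):
    # Utility model: pay only for capacity actually drawn each week —
    # baseline load most of the year, full peak load for six weeks.
    return (OFF_WEEKS * base_units + PEAK_WEEKS * peak_units) * unit_weekly_rate


if __name__ == "__main__":
    # Illustrative numbers: 100 units needed at peak, 20 at baseline.
    print(owned_cost(peak_units=100, unit_purchase_price=520))
    print(metered_cost(base_units=20, peak_units=100, unit_weekly_rate=10))
```

Under these assumed numbers the metered total comes in well below the owned-capacity total; the crossover point depends entirely on how large and how brief the peak is, which is exactly the uncertainty the article describes.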

"That kind of company should not purchase additional computing capabilities up front for their peak periods. Instead, they should literally draw them in and pay for them as they're needed," says Jeff Smith, vice president of the On Demand Project Office for IBM Tivoli software. IBM, along with HP and Sun Microsystems Inc., is among the biggest supporters of utility computing.

Despite the heavyweight backing, however, adoption of utility computing has been slow, and it represents only about 5 percent to 10 percent of software sales, according to analysts. The reasons are mostly budgetary. Customers worry that it will be impossible to project full-year utility-computing costs because technology needs can be so unpredictable. The result: CIOs could face a paralyzing shortfall in computing resources if no money were earmarked for unexpected needs. In addition, some technology chiefs fear that utility arrangements could stifle innovation, particularly if corporate bean counters closely monitor minute-by-minute software use and require justification for every additional expense.

"We're not interested in utility computing in our office," says JPL's White. "It's too hard for us to predict how much software time we need, and I don't want to spend an enormous amount of energy coming up with explanations every time we turn on a computer, or every time someone decides to be creative and uses more technology to explore a new design or a new way of doing something."

Given these doubts, the most prevalent utility-computing application in use so far appears to be process integration—programs that optimize disparate hardware and software resources in a computing network in order to streamline the network and make certain it is being used efficiently. A key portion of IBM's On Demand initiative is focused on this problem—an effort that has grown out of the need to rationalize the huge expenditures on IT that took place in the 1990s, and an attempt to gain a true return on investment from prior software purchases. To the degree that customers embrace this utility-computing application, it is because network optimization is not a core business operation in the way an ERP system is. Thus, it is easily delivered as a Web-distributed overlay to existing technology. If usage exceeds the available budget, it can be turned off without compromising essential corporate systems.

This article was originally published on 02-01-2004
eWeek
