The Cost Of Bad IT Economics
Benchmarking guru Howard Rubin says companies get it wrong when valuing their IT. Here's how they must change.
In times of economic crisis, companies are quick to slash IT costs. With fears of a recession looming over the corporate world, CIOs are facing the axe once again. But no matter how many times CIOs proclaim their strategic value--or the strategic value of their departments--businesses too often view IT as a cost, plain and simple.
IT executives have been railing against that perception for years, but they're getting little traction. Howard Rubin says it's about time that changed.
An IT benchmarking pioneer and self-proclaimed "metrics arms dealer," Rubin is a former fellow with corporate strategy consultancy Nolan, Norton & Co., where he developed metrics for application portfolios and helped create the balanced business scorecard. He also serves as a senior adviser to Gartner and a research affiliate of the Center for Information Systems Research at the MIT Sloan School of Management. A former executive vice president of META Group and professor emeritus of computer science at Hunter College of the City University of New York, Rubin has been an IT adviser to the U.S., Canadian, Indian and other governments.
Rubin says businesses see better results when they make sound IT investments during economic downturns. When they don't, they run the risk of falling behind competitors. And the cost of catching up, he says, can be insurmountable.
But CIOs are as guilty as their bosses in misinterpreting the true value of IT investments. To turn the tables, Rubin says, IT executives must recast the conventional language around IT expenditures and strategy.
Rubin recently spoke with CIO Insight online editor Brian P. Watson about what companies do wrong in valuing their IT investments, and what CIOs must do to change that perception. What follows is an edited transcript of their discussion.
CIO INSIGHT: What's the biggest problem today with the way companies view their IT investments?
Rubin: It's been a long-term problem: Most companies view their IT investment as a cost and therefore they try to minimize the cost.
Corporate leaders always ask me how they can use benchmarks to compare themselves with their peers, and how they stack up against best in class. Best in class, in conventional wisdom, is typically whoever is doing IT for the least amount of dollars relative to revenue or operating expense. Best in class is sometimes synonymous with having the lowest IT costs. That's a big problem, because the lowest-cost company may not be the best user of IT. When people talk about understanding how to calibrate themselves, they need to redefine best in class in terms of the aspirations of the business. Best in class could be: What's the spending pattern of the companies that have been most profitable in our industry over the last three years? Or, what are the spending patterns of the companies that have increased their earnings per share over the last three years?
Companies treat IT like a cost and look for outside validation to drive costs down. They really should be looking at the outcome. What should I do with my IT strategy, my IT investment; am I maximizing return? That's better from a business perspective.
Companies understand that IT produces value. But they view IT as a cost because it's so much harder to figure out where the value shows up in terms they can get their hands on. They gravitate to what's tangible, and what's tangible is the budget item.
When you look at annual reports, go to "operating expense." First you find "compensation," and then you find "communications and technology." IT, also through accounting principles, just makes itself look like a cost. The businesses themselves aren't necessarily cost-focused: IT shows up as a cost and finding the value is much more elusive; it's a self-feeding system.