If venturing into cloud computing today can be compared with Lewis and Clark’s epic trek across the United States, then Douglas Menefee may one day be considered an information technology trailblazer.
Three years ago, when Menefee assumed the CIO post at The Schumacher Group, a $300 million-a-year company that manages hospital emergency departments and physicians, he immediately noted the amount of time his staff spent writing application code and maintaining the infrastructure to support it. He knew there had to be a better way.
“I walked into the CEO’s office,” recalls Menefee, “and said, ‘Do you want to be a software development company, or do you want to be a medical management company supported by software?’”
That question jump-started the company’s journey into the IT phenomenon known as cloud computing. Three years after signing on as a subscriber to Salesforce.com’s flagship on-demand customer relationship management (CRM) application, Menefee estimates half of the business processes his IT staff supports now happen in the cloud.
In the future, it won’t matter where your software lives. “As a matter of fact, if you have it on premise, you’re short-changing yourself, because you’re throwing half of your resources at supporting that server infrastructure,” Menefee says.
Most CIOs continue to depend on those server infrastructures for a simple reason: They’re not convinced cloud computing is ready for prime time. If the ramblings of cloud computing user groups are to be believed, the real question isn’t whether the technology is reliable enough to pass muster with big corporate IT shops. The crucial question is whether it even matters.
For those weighing such concerns–and after the high-profile outages that hit Amazon.com’s Simple Storage Service (S3) and Apple’s first cloud computing effort, MobileMe, last month, many are–consider this posting of a Google user group member: “Anyone who thinks that the cloud, or even their own data center or infrastructure, should provide the reliability has it backward. If you want real reliability, write more resilient applications.”
Think of it as a Clintonesque take on the digital age: It’s not the infrastructure that matters, it’s the applications, stupid.
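What might “more resilient applications” look like in practice? One common pattern is wrapping cloud calls in retries with exponential backoff, so a brief provider hiccup never reaches the end user. The sketch below is a minimal illustration, not any vendor’s API: `fetch_from_cloud` is a hypothetical stand-in for whatever storage or web service call an application depends on.

```python
import random
import time

def fetch_from_cloud(key: str) -> bytes:
    """Hypothetical remote call; stands in for any cloud storage or API request."""
    raise ConnectionError("simulated transient cloud outage")

def fetch_with_retries(key: str, attempts: int = 5, base_delay: float = 0.5) -> bytes:
    """Retry a flaky remote call with exponential backoff plus jitter,
    so short outages don't surface as application failures."""
    for attempt in range(attempts):
        try:
            return fetch_from_cloud(key)
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries; let the caller fall back or alert
            # back off 0.5s, 1s, 2s, ... with jitter to avoid retry storms
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.25))

if __name__ == "__main__":
    try:
        fetch_with_retries("reports/2008-07.csv")
    except ConnectionError:
        print("Cloud service still unavailable; serving the cached copy instead.")
```

The point of the pattern is the one the user group poster makes: the application, not the provider, decides how an outage is experienced.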
Regardless of one’s views on the readiness of cloud computing to meet corporate IT needs, it’s a development that cannot be ignored. Like it or not, the idea of renting applications, development platforms, processing power, storage or any other cloud-enabled services has emerged as a replay of the Internet’s rise as a business tool: It’s a potentially game-changing technology that’s expected to reshape IT over the next decade.
But IT executives are wary of cloud computing for reasons that go beyond the perception of unreliability. They’re fearful that their data won’t be safe in the hands of cloud providers; they’re convinced they won’t be able to manage cloud resources effectively; they’re suspicious of providers that won’t share details of the infrastructures supporting their cloud environments; and they’re worried that the technology could threaten their own data centers, or even their staffs. Collectively, these fears are helping to hold back a cloud computing market that Merrill Lynch estimates could be worth $95 billion in five years.
More importantly, analysts say that if CIOs let these fears paralyze them, even temporarily, odds are they’re just delaying the inevitable. In the meantime, business executives, salespeople and even rogue IT workers are putting cloud service subscriptions on their corporate credit cards and charging them to expense accounts or departmental budgets.
“If you’re a large enterprise, somebody in your organization is using cloud computing, but they’re not telling you,” says James Staten, principal analyst at IT adviser Forrester Research. “So there’s a good chance that in the next five years, you’re going to inherit things that were born in the cloud anyway, and now you’ll have to manage them.”