Jumping on the Cloud Bandwagon

By Tony Kontzer

Cloud Computing: Anything as a Service

If venturing into cloud computing today can be compared with Lewis and Clark's epic trek across the United States, then Douglas Menefee may one day be considered an information technology trailblazer.

Three years ago, when Menefee assumed the CIO post at The Schumacher Group, a $300 million-a-year company that manages hospital emergency departments and physicians, he immediately noted the amount of time his staff spent writing application code and maintaining the infrastructure to support it. He knew there had to be a better way.

"I walked into the CEO's office," recalls Menefee, "and said, 'Do you want to be a software development company, or do you want to be a medical management company supported by software?'"

That question jump-started the company's journey into the IT phenomenon known as cloud computing. Three years after signing on as a subscriber to Salesforce.com's flagship on-demand customer relationship management (CRM) application, Menefee estimates half of the business processes his IT staff supports now happen in the cloud.

In the future, it won't matter where your software lives. "As a matter of fact, if you have it on premise, you're short-changing yourself, because you're throwing half of your resources at supporting that server infrastructure," Menefee says.

Most CIOs continue to depend on those server infrastructures for a simple reason: They're not convinced cloud computing is ready for prime time. If the ramblings of cloud computing user groups are to be believed, the real question isn't whether the technology is reliable enough to pass muster with big corporate IT shops. The crucial question is whether it even matters.

For those weighing such concerns--and after the high-profile outages that hit Amazon.com's Simple Storage Service (S3) and Apple's first cloud computing effort, MobileMe, last month, many are--consider this posting by a Google user group member: "Anyone who thinks that the cloud, or even their own data center or infrastructure, should provide the reliability has it backward. If you want real reliability, write more resilient applications."

Think of it as a Clintonesque take on the digital age: It's not the infrastructure that matters, it's the applications, stupid.

Regardless of one's views on the readiness of cloud computing to meet corporate IT needs, it's a development that cannot be ignored. Like it or not, the idea of renting applications, development platforms, processing power, storage or any other cloud-enabled services has emerged as a replay of the Internet's rise as a business tool: It's a potentially game-changing technology that's expected to reshape IT over the next decade.

But IT executives are wary of cloud computing for reasons that go beyond the perception of unreliability. They're fearful that their data won't be safe in the hands of cloud providers; they're convinced they won't be able to manage cloud resources effectively; they're suspicious of providers that won't share details of the infrastructures supporting their cloud environments; and they're worried that the technology could threaten their own data centers, or even their staffs. Collectively, these fears are helping to hold back a cloud computing market that Merrill Lynch estimates could be worth $95 billion in five years.

More importantly, analysts say that if CIOs let these fears paralyze them, even temporarily, odds are they're just delaying the inevitable. In the meantime, business executives, salespeople and even rogue IT workers are using their corporate credit cards to tap their expense accounts or departmental budgets to pay for subscriptions to cloud services.

"If you're a large enterprise, somebody in your organization is using cloud computing, but they're not telling you," says James Staten, principal analyst at IT adviser Forrester Research. "So there's a good chance that in the next five years, you're going to inherit things that were born in the cloud anyway, and now you'll have to manage them."

Jumping on the Cloud Bandwagon

Despite the hesitancy of many CIOs, the growing ranks of cloud providers are aggressively positioning themselves to take advantage of the anticipated explosion in cloud provisioning. Amazon, Google (with its Google Apps and Google App Engine) and Salesforce.com have the most mature offerings for IT departments and seem to add new functions daily.

IBM, which last year joined Google in a cloud computing research effort, is aggressively marketing its Blue Cloud architecture to support private clouds. Not to be outdone, Hewlett-Packard in July partnered with Intel and Yahoo to form a vast cloud computing testbed that sounds very similar to the IBM-Google effort.

Dell sells an assortment of cloud computing hardware targeted at telecom firms, cable companies and Internet service providers. And EMC joined the cloud storage marketplace late last year with its $76 million acquisition of startup Mozy. Plus, dozens of smaller firms--among them Joyent, RightScale and Terremark--are also hoping to get a piece of the pie.

Whether IT executives decide to enter the cloud now, during its infancy, depends largely on their tolerance for risk. Lingering questions about reliability, security and overall performance hang over cloud computing providers, while skittish CIOs (and they're in the majority) sit on the sidelines, waiting for the technology to mature before putting even the most non-essential applications on someone else's servers.

But make no mistake, a growing number of large companies are accepting the risk and jumping into the cloud with both feet--or, at least, are allowing isolated teams working on one-off projects to peek into the cloud and catch a glimpse of its value. Whether they're tapping the capabilities of software as a service (SaaS), platform as a service (PaaS), infrastructure as a service (IaaS) or any of the countless other XaaS alternatives flooding the market, businesses find all manner of benefits in the cloud.

The Schumacher Group is on the short list of companies jumping in. However, in a decidedly nontechnological and macabre twist of fate, it was Mother Nature that provided Menefee with the motivation he needed, in the form of Hurricane Katrina, which pummeled the Gulf Coast shortly after he took over as Schumacher's CIO.

Though the company's main data center, housed in its Lafayette, La., headquarters, was spared from the devastation surrounding it, the storm provided a much-needed wake-up call. At the time, Menefee was in the midst of selecting a new CRM system, but, after Katrina hit, subscribing to an application residing on someone else's servers--in this case, Salesforce.com's--looked awfully good. So a new approach to IT was adopted at Schumacher.

About 18 months later, the company's reliance on cloud computing shifted into higher gear when Salesforce.com introduced its Force.com application development platform. Backed by the vendor's servers, database and development tools, Menefee's staff started rolling out business processes in the cloud at a dizzying pace.

They implemented physician recruitment, contract management, insurance carrier applications and operations workflow in Salesforce.com. Then the staff integrated these applications with Peake Software Labs' Tangier emergency physician scheduling application, another on-demand software program.

And then came the pièce de résistance: Menefee and his crew built a hurricane-tracking mashup on top of Salesforce.com, using APIs from Google, Salesforce.com and Tangier to combine data on weather, airport and road conditions from multiple sites. The result is a powerful tool that helps Schumacher more effectively deploy physicians where they're needed during a storm, such as the recent Hurricane Dolly.
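For readers curious what such a mashup looks like under the hood, the Python sketch below is purely illustrative: the cities, feeds and fields are invented stand-ins, not Schumacher's actual data or the real Google, Salesforce.com or Tangier APIs. It shows only the core pattern, joining records from several sources on a shared key (here, a city):

```python
# Hypothetical mashup sketch: three "services" are stubbed as dictionaries
# keyed by city; a real version would call each provider's API instead.
weather = {"Lafayette": "tropical storm", "Houston": "clear"}
roads = {"Lafayette": "I-10 closed", "Houston": "open"}
schedule = {"Lafayette": ["Dr. Smith"], "Houston": ["Dr. Jones", "Dr. Lee"]}

def mashup(cities):
    """Join the three feeds into one combined record per city."""
    return [
        {
            "city": c,
            "weather": weather.get(c, "unknown"),
            "roads": roads.get(c, "unknown"),
            "physicians": schedule.get(c, []),
        }
        for c in cities
    ]

report = mashup(["Lafayette", "Houston"])
for row in report:
    print(row["city"], "->", row["weather"], "/", row["roads"])
```

The value of the pattern is that each feed stays in its owner's cloud; the mashup layer only stitches responses together at query time.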

Despite the many cloud computing successes under his belt, Menefee admits the transition wasn't painless. Notably, in the first 18 months after the company started its foray into the cloud, 85 percent of the IT staff left, forcing Menefee to build a cloud-friendly crew.

"Some didn't like me, some didn't like my approach and some didn't like the transition to on-demand," he says. "We had a lot of folks who wanted to build software." Since Menefee replaced all the departed employees, his staff has experienced zero turnover.

Menefee isn't ready to put the most mission-critical systems in the cloud just yet, but he's not ruling it out--and that puts him in an even smaller group of risk-embracing CIOs. "I've not put out a mandate that we're going to shift to the cloud, but I feel that as we grow as a company, we're going to continue to put more processes there," he says, citing billing and document imaging as potential candidates.

Taking It Slow

Still, for every cloud-loving IT organization, there are several others that prefer to experiment with the cloud on a one-off basis before making larger commitments. Take The New York Times. Its first venture into the cloud came last year, with an effort to convert the digitized content of all its issues from 1851 to 1922 into a Web-friendly format. In the end, the conversion job grew into a full-blown cloud computing experiment that produced a compelling public domain archival resource.

Derek Gottfrid, the senior software architect behind the TimesMachine, knew that if his team sought to acquire the servers needed to process some 4 terabytes of data, the project probably never would have gotten off the ground. Instead, he got approval to use his corporate credit card to expense the required processing and storage capacity from Amazon Web Services' S3 and Elastic Compute Cloud (EC2). With only some tweaks to The New York Times' firewall, the data was uploaded into S3, converted into manageable image and JavaScript files using EC2, and then dressed up by the paper's design team.
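The heavy lifting in a job like this is embarrassingly parallel: each scanned page can be converted independently, so the batch simply gets divided among worker instances. The Python sketch below illustrates that fan-out step only; the file names, counts and worker pool are hypothetical, not the Times' actual code:

```python
# Hypothetical fan-out sketch: divide a large batch of scanned pages
# among a pool of EC2 worker instances. The real TimesMachine job
# processed roughly 4 TB of scans; the key names here are invented.
def partition(keys, n_workers):
    """Round-robin the object keys into one work list per worker."""
    buckets = [[] for _ in range(n_workers)]
    for i, key in enumerate(keys):
        buckets[i % n_workers].append(key)
    return buckets

scans = [f"scans/1851-1922/page-{i:07d}.tif" for i in range(100)]
work = partition(scans, n_workers=4)
print([len(b) for b in work])  # each worker gets an even share
```

Each worker then pulls its list from S3, converts its pages, and writes the results back, so adding capacity is just a matter of renting more instances for a few hours.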

Along the way, Gottfrid and his team realized that they'd done something that might be useful to the paper's readers, so they decided to leave the data in S3 and create a public-facing Web interface into the files, with Amazon serving them up from the cloud. The result is the impressive application TimesMachine, which lets users view scanned images of 150-year-old newspapers by hovering their mouse over a particular region of the image and then zooming in on that article, photo or ad.

Gottfrid says the IT team was more relieved than threatened by the use of the cloud. "They were largely indifferent," he says. "This was a project that they were glad didn't come to them because of the hardware requirements. We weren't taking an established product and moving it to cloud computing, so we weren't taking anything away from them."

It doesn't hurt that the cost of using Amazon's services has been negligible: The entire effort has cost the newspaper about $1,500 thus far, and the small ongoing monthly fee fluctuates depending on the level of traffic the TimesMachine attracts.

While the company's IT leadership is likely to be among those who won't bet heavily on cloud computing until the technology is more mature, Gottfrid is optimistic that the success of the effort will open the door to additional uses of cloud computing for one-off projects. "All this stuff is so new, so we're not going to put anything mission-critical in a cloud that hasn't been tested over time," he says. "But it's definitely something that's on our radar as we evaluate things."

Rescuing a Stranded Project

A similar scenario unfolded at NASDAQ OMX, which provides technology and services to some 60 exchanges around the world. An R&D team there had been batting about an idea for a desktop stock research tool, but the project had remained stranded on the to-do list, largely because of cost.

"It would have been a major investment if we had built it using traditional technology," says Claude Courbois, associate vice president of data product research and development. That investment would have included hiring a database administrator, maintaining a boatload of code, and purchasing additional storage and Web servers.

Then, about a year ago, after Courbois and his colleagues held separate meetings with representatives of Amazon and Adobe Systems, they had a breakthrough realization: By using Amazon's S3 service to store some 4 terabytes of data, and combining that with Adobe's AIR (Adobe Integrated Runtime) application, they could develop the product affordably and rely on AIR to handle the last bits of processing on users' desktops.

So the project team used some internal applications to prepare the stock data, built a desktop interface, and started uploading the most recent month's worth of data to S3. And they used Courbois' corporate credit card to pay for it.

Soon after, NASDAQ OMX started selling the downloadable product, called Market Replay, to brokers and financial Web sites. The product automatically downloads AIR if it's not already installed on the user's PC. Then, any time a user wants to verify pricing information, a query covering any 10-minute period in the past 30 days for a listing on NASDAQ, the New York Stock Exchange or the American Stock Exchange returns a dizzying amount of detail on the events surrounding that stock's price fluctuations during that period. The only continuing interaction between NASDAQ OMX and Market Replay is the persistent uploading of some 300,000 files (about 30 gigabytes) into S3 each day, with data available as recently as 30 minutes after the fact.
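A rolling window like Market Replay's 30-day archive is commonly handled by partitioning object keys by date and dropping prefixes that age out. The Python sketch below is a hypothetical illustration of that layout, not NASDAQ OMX's actual implementation:

```python
# Hypothetical rolling-window S3 layout: each day's files land under a
# date prefix, and prefixes older than the retention window are dropped.
from datetime import date, timedelta

RETENTION_DAYS = 30

def prefixes_to_keep(today):
    """Date prefixes inside the retention window, newest first."""
    return [
        (today - timedelta(days=d)).isoformat()
        for d in range(RETENTION_DAYS)
    ]

def prune(existing_prefixes, today):
    """Return the prefixes that have aged out and should be deleted."""
    keep = set(prefixes_to_keep(today))
    return sorted(p for p in existing_prefixes if p not in keep)

today = date(2008, 8, 5)
keep = prefixes_to_keep(today)
stale = prune(keep + ["2008-06-01"], today)
print(len(keep), stale)  # 30 prefixes retained; the old one is flagged
```

The scheme keeps the storage footprint flat: roughly 30 days times 30 gigabytes, no matter how long the service runs.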

Courbois wouldn't share the exact cost of the effort, nor would he reveal the product's pricing, but he did say the initial upload cost less than $1,000, and ongoing monthly costs for the storage and data transfers are in that range, as well. At that price, Market Replay won't have to generate a lot of revenue to justify the expense.
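Those figures pass a quick sanity check. The arithmetic below uses assumed unit prices in the neighborhood of Amazon's 2008-era published S3 rates (roughly $0.15 per gigabyte-month of storage and $0.10 per gigabyte transferred in); it is a back-of-envelope estimate, not NASDAQ OMX's actual bill:

```python
# Back-of-envelope S3 cost check. The unit prices are assumptions based
# on Amazon's 2008-era published rates, not NASDAQ OMX's contract.
STORAGE_PER_GB_MONTH = 0.15   # USD per GB-month, assumed
TRANSFER_IN_PER_GB = 0.10     # USD per GB uploaded, assumed

daily_upload_gb = 30          # ~300,000 files/day, per the article
retention_days = 30

stored_gb = daily_upload_gb * retention_days           # steady-state footprint
monthly_transfer = daily_upload_gb * 30 * TRANSFER_IN_PER_GB
monthly_storage = stored_gb * STORAGE_PER_GB_MONTH
monthly_total = monthly_transfer + monthly_storage
print(f"~${monthly_total:.0f}/month")  # well under $1,000
```

At those assumed rates the monthly bill lands in the low hundreds of dollars, consistent with the sub-$1,000 range Courbois describes.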

What really excites Courbois about his team's foray into the cloud is that it expands their ability to be entrepreneurial. "I've got 20 ideas I'm working on right now, and if IT has time for five, and if I can get a sixth done because I use cloud computing, that's fantastic," he says. "We were able to create something we think is groundbreaking in our industry, and we didn't have to buy hundreds of thousands of dollars worth of servers to deliver this product to our customers."

And Courbois isn't done. His team plans to develop versions of Market Replay for NASDAQ OMX's European customers, as well as one that centers on options, which NASDAQ began trading earlier this year. He's also exploring ways to take advantage of the cloud processing power of Amazon's EC2 service.

This model, in which cloud computing is used to support an application that IT doesn't have the staff or computing resources to support, could dominate the market during its infancy. But it won't be long before more sophisticated uses become common. "Where it becomes even more interesting is when you have users contributing to the functionality of the application using Web 2.0 technology," says Chris Howard, vice president and service director for IT advisory firm Burton Group.

In fact, Web 2.0 technology played a critical role in a recent, and ambitious, cloud-enabled project at Sogeti Group, a $1.5 billion-a-year IT services firm owned by French IT consulting giant Capgemini. The effort started last November when, during a meeting with an IBM executive, Sogeti CTO Michiel Boreel expressed his desire to "democratize innovation" at the company.

It was suggested that Boreel adapt an IBM concept known as an "idea gem," in which Web 2.0 tools are used to create a temporary real-time collaboration environment. With 21,000 employees, Sogeti faced significant technological hurdles in making that happen. Enabling such an environment to be accessed by thousands of employees simultaneously would require lightning-quick servers, a fat pipe delivering lots of Net bandwidth, and a robust and stable application.

Since Boreel's main concern was speed, he was reluctant to go through Sogeti's bureaucratic IT channels to get the project off the ground. So when IBM offered to fast-track the project's development by building the needed application on its Lotus Connections social networking platform and hosting it from its cloud computing facility in Dublin, Boreel leapt at the opportunity.

That's when he approached Sogeti's CIO to let him know of his plans, saying that IT had enough on its plate, and he wanted the project--which he had re-dubbed an "idea storm"--to be ready fast. This didn't go over well with an IT department accustomed to supporting massive projects.

As Boreel recalls, "Their initial reaction was, 'Why are you trying to do our job? You tell us what you want to accomplish, and then we'll figure out how to do it.'"

Boreel's reaction? "I didn't want to spend the next half year going back and forth with the hows and the whats," he says. "This wasn't about technology. It was about communication and collaboration."

He assuaged the CIO by pointing out that experimenting with an IBM cloud would provide Sogeti with a roadmap for how its own infrastructure should be defined in a cloud computing-dominated world. He had the IT executives talk with IBM about security and performance concerns, and they agreed to let the effort proceed, with IBM at the helm.

Boreel says the episode reminded him of the mid-1990s, when the first commercial uses of the Internet started popping up, and many CIOs reacted by banning it from their networks because of security concerns. He recalls that people simply brought their own modems to work to connect their computers and phones. As a result, companies suddenly faced thousands of security threats instead of just one.

Boreel's instincts proved spot on. The idea storm was ready in a few months, and last April, the 72-hour collaboration went off without a hitch, despite the fact that one-fourth of the company's employees--more than double the number Boreel hoped for--logged on to collectively create 2,000 ideas.

Subsequently, the 60 most viable of those ideas were clustered into six categories. The company is currently in the process of turning those ideas into concrete projects.

When asked what would have happened if he had agreed to let IT support the project rather than placing it in IBM's cloud, Boreel replied, "I'm sure that at this time, we'd still be discussing how we were going to do this."

Instead, the success of the effort has sold Boreel on cloud computing as the right technology to support future idea storms, which he says will be expanded to include customers. He also sees opportunities to place other collaborative applications--e-mail, agendas and scheduling, for instance--in a cloud computing environment.

And for those CIOs who fear cloud computing for its perceived lack of sufficient security and reliability, Boreel has some succinct advice: Get over it, quick. "Cloud computing is much more of an opportunity than it is a threat," he says. "But when you ignore an opportunity long enough, it becomes a threat."

Spoken like a true trailblazer.

Key Questions to Ask Before Leaping into Cloud Computing

Before leaping into cloud computing, ask these questions of people inside and outside your company:

Ask your CEO:
Can our corporate culture accept the risk that comes with a new approach to computing?

Ask your R&D teams:
Are there entrepreneurial projects you'd tackle if you had access to the necessary computing resources?

Ask your CSO:
What are your greatest security concerns related to putting our data in the cloud?

Ask your potential vendors:
How would you secure our data?
Do you offer service-level agreements that guarantee performance and availability?

Ask your staff and yourself:
Do the potential benefits of providing cloud services to our internal customer base justify the risks?
Will we be able to enforce restrictions on appropriate uses of the cloud?

This article was originally published on 08-05-2008