
10 Hidden Costs in the Public Cloud

Written By
Mark Tonsetic
Jun 7, 2011

The public cloud is emerging as a disruptive innovation for IT — and not just for small and medium-size businesses. At this point, most large enterprises are experimenting with the public cloud as a development-and-test resource, or for production applications with low security, privacy, and service-level requirements. The conventional wisdom is that the public cloud may remain a niche interest for large companies, given their extensive legacy investments and the mission-critical nature of their systems. Nevertheless, a number of these companies see big potential in the public cloud; they feel an acute need to choose between proactively working with it and getting left behind.

At the Corporate Executive Board, we have spoken with many of the early enterprise adopters of public cloud services from vendors such as Amazon and Rackspace. Naturally, these enterprises have scrutinized the cost-effectiveness of the applications they are considering for the public cloud. Below is our consolidated feedback from these early adopters on the 10 hidden costs in the public cloud. We’ve broken these cost areas into four broad categories to watch:

  • One-Time Migration Costs
  • Billing Model Limitations 
  • Retained Management Costs
  • Risk Premium

One-Time Migration Costs

These are the costs associated with migrating existing applications from traditional, physical infrastructure to the public cloud, including application retrofitting time, server migration time, and the potential impact on depreciation write-offs.

In this category, there are two potential costs to watch:

  1. Application retrofitting. The majority of a typical company’s in-house application portfolio is not yet cloud-ready. Applications that are already suitable for virtual machines, or that were developed according to platform standards, can be ported easily, but most would require considerable retrofitting or recoding to become compatible. This is particularly true of legacy applications. Organizations need to evaluate the cost-effectiveness of porting these applications versus leaving them as-is, or even decommissioning them entirely in favor of new applications. Promoting platform standards and building the business case for a technology refresh have been perennial challenges for application teams; this remains true when considering the public cloud.
  2. Depreciation write-offs. Companies that choose to accelerate an application or infrastructure refresh in order to jump-start public cloud migration face the possibility of no longer being able to write off the remaining depreciation on existing hardware. That explains why many companies plan to evaluate the cloud at existing refresh points. The sketch after this list illustrates the write-off at stake.
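
The write-off at stake in an accelerated refresh is easy to estimate. Below is a minimal sketch, assuming straight-line depreciation and entirely hypothetical figures for purchase price, useful life, and time in service; none of these numbers come from the adopters we spoke with.

```python
# Back-of-envelope estimate of the depreciation remaining on the books when
# hardware is retired early to jump-start a public cloud migration.
# Assumes straight-line depreciation; all figures are hypothetical.

def remaining_book_value(purchase_price, useful_life_years, years_in_service):
    """Value not yet depreciated, i.e., the write-off schedule given up at early retirement."""
    annual_depreciation = purchase_price / useful_life_years
    years_left = max(useful_life_years - years_in_service, 0)
    return annual_depreciation * years_left

# Example: servers bought for $500,000, depreciated over 5 years, replaced after 2.
forfeited = remaining_book_value(purchase_price=500_000,
                                 useful_life_years=5,
                                 years_in_service=2)
print(f"Book value remaining at early retirement: ${forfeited:,.0f}")  # $300,000
```

In practice that remaining value is typically taken as a one-time charge when the hardware is retired rather than spread over the years it would otherwise have covered, which is one more reason refresh points are the natural moment to evaluate the cloud.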

Billing Model Limitations

There are three features of the current public cloud billing model that may be a poor fit for your enterprise applications.

  1. Elasticity premium. One of the most heralded features of the public cloud is its pay-as-you-go billing, which lets companies absorb increased application load at peak usage times. Since prices are set accordingly, this can come at a premium for applications that sit in the public cloud full-time and don’t peak far above their routine base. For example, Amazon’s Large On-Demand Windows Instance costs 48 cents per hour, while its comparable Reserved Instance is just 20 cents. Choosing the right type of cloud instance for each application is important; applications with steady or predictable workloads will not be cost-effective in the on-demand model (see the back-of-envelope sketch after this list).
  2. Toll charges. Inbound and outbound data transfer charges in the public cloud are a significant factor to keep in mind, especially for applications that move large volumes of data. For instance, Amazon charges 10 cents per GB transferred in and between 8 cents and 15 cents per GB transferred out. The additional latency introduced by large-scale data transfer requests on the cloud servers is also a cause for concern.
  3. Storage costs. A virtual, multitenant server architecture introduces new storage costs and complexity, creating the need for optimization tools such as storage virtualization, thin provisioning and data de-duplication. These are tools with which most companies are just beginning to become familiar.
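
Using the per-unit prices cited above, a quick back-of-envelope calculation shows how the elasticity premium and toll charges add up for an application that runs in the cloud around the clock. This is a minimal sketch: the monthly transfer volumes are hypothetical, outbound transfer is billed at the midpoint of the quoted range, and any upfront fee to reserve an instance, as well as storage charges, is left out.

```python
# Back-of-envelope monthly cost for one instance running 24x7, using the
# per-hour and per-GB prices cited above. Transfer volumes are hypothetical;
# any upfront reservation fee and storage charges are not included.

HOURS_PER_MONTH = 730

on_demand_rate = 0.48   # $/hour, Large On-Demand Windows Instance
reserved_rate = 0.20    # $/hour, comparable Reserved Instance
inbound_rate = 0.10     # $/GB transferred in
outbound_rate = 0.115   # $/GB transferred out (midpoint of the 8-15 cent range)

gb_in, gb_out = 200, 500  # hypothetical monthly transfer volumes

transfer_cost = gb_in * inbound_rate + gb_out * outbound_rate
on_demand_total = on_demand_rate * HOURS_PER_MONTH + transfer_cost
reserved_total = reserved_rate * HOURS_PER_MONTH + transfer_cost

print(f"Data transfer tolls:  ${transfer_cost:8.2f}")                     # $77.50
print(f"On-demand, 24x7:      ${on_demand_total:8.2f}")                   # $427.90
print(f"Reserved, 24x7:       ${reserved_total:8.2f}")                    # $223.50
print(f"Elasticity premium:   ${on_demand_total - reserved_total:8.2f}")  # $204.40
```

For a steady, always-on workload, the on-demand rate nearly doubles the compute portion of the bill, while the transfer tolls apply either way; the arithmetic is simple but easy to overlook when unit prices are quoted in cents.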
