By Michael Vizard
One of the tools commonly used to fight a major fire is a controlled explosion that extinguishes the blaze by consuming all of the available oxygen. In recent years, one of the fires afflicting enterprise IT has been the proliferation of shadow IT systems. With the explosion of cloud computing, however, the means to put out many of the fires those shadow systems have started may finally be at hand.
A case in point is the storage vendor NetApp, which is rolling out its own virtual private cloud, called Ncloud, as part of an effort to consolidate as many of its shadow IT systems as possible.
According to NetApp CIO Cynthia Stoddard, the challenge is for IT to develop a governance system that meets compliance requirements but doesn’t impose too many restrictions on the business. “The governance mechanisms have to be light enough to not get in the way of the business processes,” says Stoddard.
As part of its campaign to persuade NetApp business leaders to buy into Ncloud, Stoddard says the IT organization embarked on a campaign of IT transparency. “You have to be willing to open up and be transparent about the real state of IT,” says Stoddard. “That can be a difficult conversation.”
The IT transparency approach pays off because it engenders trust, says Stoddard. Rather than mandating that all systems be run by IT, Stoddard says the goal is to persuade business leaders to shift applications to Ncloud so they can free up budget dollars and allocate them elsewhere.
In fact, it’s the budget issue that is making shadow IT a board-level topic. Organizations are discovering that not only do they have lots of duplicate data, but many of their shadow IT systems are also redundant.
NetApp’s approach is to combine internal data centers and cloud computing resources into an agile IT environment that can scale dynamically to meet business requirements. The company, for example, just built a state-of-the-art data center in Hillsboro, Ore. At the same time, Stoddard believes cloud bursting will play a significant role in helping NetApp control its future IT costs by allowing the company to “right-size” IT investments according to the nature of the application workload involved.
“We call that leading with architecture,” says Stoddard. “In the future our internal data centers will be smaller and denser, which means from a resource perspective they will be more precious.”
As such, Stoddard says business units will have to prove that their applications are critical enough to run in NetApp’s internal data centers rather than being served by external cloud resources managed by NetApp’s IT department.
As part of that effort, NetApp is investing in in-memory computing systems such as the SAP HANA (High-Performance Analytic Appliance) platform. The goal, says Stoddard, is to gather enough information about the performance and attributes of NetApp systems in the field to better predict when customers might experience a service or support issue. Similarly, that information may also be used to identify changes in application usage that might require customers to upgrade to products specifically designed for that use case.
Much of that capability is enabled by the rise of less expensive flash memory, which makes it possible to run many of these applications in real time, while technologies such as Hadoop make it inexpensive to store massive amounts of data.
“I like to think of it as a world without batch processing, a world without having to script ETL processes,” says Stoddard. “We want to put our business processes on steroids.”
All these efforts essentially mean that NetApp is reinventing the way its IT organization operates in order to make the business, as a whole, more agile, says Stoddard.
“We’re really retooling the IT factory,” says Stoddard. “As part of that I really see our role evolving into that of being an IT broker.”