By CIOinsight | Posted 10-01-2004
Storage Management: Closet Space
At the Fred Hutchinson Cancer Research Center in downtown Seattle, storage management is a necessity.
The nonprofit institution, which pioneered bone-marrow transplants and is a leader in gene therapy, is working around the clock to find a cure for cancer and related diseases, and it currently generates more than 5 terabytes of data annually in research alone. That doesn't include e-mail or Web servers, which have their own dedicated systems.
"We have about a 100 percent storage growth rate year over year," says Tim Hunt, the center's Research Computing Support Manager, who oversees the storage infrastructure.
Hunt and the Research Computing Support group are tasked with making sure all that data (more than 12 million files and counting) is not only stored properly, but can be retrieved at a moment's notice. "It's an interesting challenge," he says.
Hunt is not alone.
No one in the IT business can deny that storage is, quite literally, a growing issue. According to Horison Information Strategies, a consulting firm that researches the storage market, the amount of corporate data is increasing at a rate of 50 percent to 70 percent every year.
The average size of corporate storage per employee has risen from less than 100 megabytes in 1993 to more than 3 gigabytes in 2003, according to META Group Inc. Furthermore, a 5,000-user organization has at least 15 terabytes of data to manage, a number that will jump to 80 terabytes by 2008.
Unfortunately, though the cost of storage hardware has plummeted (a megabyte that cost $50 ten years ago sells for pennies today), more data still means more servers and more capacity to manage, and that creates greater complexity in the storage infrastructure.
Compounding that problem, companies generally are not hiring more storage staffers, which means your IT department needs to get efficient, fast.
Over the past year, the trend has been to move from direct- and network-attached storage to storage area networks, which centralize data and connect different kinds of hardware.
Creating storage networks allows disparate business units to share vital data, which is especially important for companies that have grown quickly through acquisitions and mergers.
As a result, managing the storage infrastructure has become a major effort.
"Right now, you typically have to launch a different element manager for every type of vendor in your network," says Stephanie Balaouras, a senior analyst at Yankee Group. "That's not only a time constraint, it also means that you need expertise for each vendor's equipment and software."
The next step in storage is managing the storage infrastructure enterprisewide from one console, creating a storage management cockpit. By centralizing storage resources and getting a better handle on the topology of the storage infrastructure across the enterprise, CIOs are finding greater efficiencies and cutting costs.
"The challenge is making hardware a true commodity such that you can take [software from] EMC, or Veritas, or Microsoft, or whomever and manage them all with one system," says Rob Schafer, a program director at META Group.
The benefits of a centralized storage management system are numerous. First, the system can greatly reduce backup and recovery times (one of the most time-consuming tasks in the data center), in some cases from days to hours.
Deeper insight into your storage systems also means increased efficiencies from optimizing storage capacity, and that translates into savings. Couple that with some information lifecycle management (ILM) policies (a must) and add storage resource management (SRM) software, and you'll get even greater savings from being able to place the most valuable data on the most valuable storage device and offloading older data to less expensive options, such as tape drives.
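The savings logic behind tiering is simple arithmetic. Here is a rough sketch in Python; the tier prices and data volumes are purely illustrative assumptions, not figures from any vendor or organization in this story:

```python
# Illustrative tiering-savings estimate. Tier prices and data volumes are
# assumptions for the sketch, not vendor figures.
TIER_COST_PER_GB = {"high": 0.50, "mid": 0.15, "tape": 0.02}  # $/GB per month

def monthly_cost(allocation_gb):
    """Sum the monthly cost of data spread across storage tiers."""
    return sum(TIER_COST_PER_GB[tier] * gb for tier, gb in allocation_gb.items())

# Before ILM: 10 TB sitting entirely on top-tier disk.
before = monthly_cost({"high": 10_000})
# After ILM: 2 TB stays hot, 3 TB drops to midrange disk, 5 TB goes to tape.
after = monthly_cost({"high": 2_000, "mid": 3_000, "tape": 5_000})
print(f"${before:,.0f}/month before, ${after:,.0f}/month after")
```

Even with invented numbers, the shape of the result holds: the bulk of the savings comes from getting cold data off the most expensive tier.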
All of this can help your company improve business intelligence by making key data more accessible, and in turn, sharing it across business units in a timely manner.
In addition, centralized storage management helps you comply with federal regulations such as Sarbanes-Oxley and the Health Insurance Portability and Accountability Act.
But, as always, it is not easy. Though new standards have been put in place to make future products work together more seamlessly, integration is still a problem today. And the biggest headache is developing data retention policies and procedures around information lifecycle management.
Storage isn't the most exciting function of the IT department, but it is one of the most critical.
"I don't think once you get out of the IT structure that anybody has a clue how critical storage is [to the business]," says Forrester Research Inc. Principal Analyst Bob Zimmerman.
Analysts and CIOs agree that one of the major hurdles in storage management is deciding how to put the right data on the right type of storage at the right time, a concept often referred to as "tiering."
In other words, which information is so critical that it needs to be on your most accessible systems? And how do you value the depreciation of data? By determining where your most important data should be stored, companies can improve data analysis by speeding access to it.
"At the end of the day, data is the lifeblood of the business," says META's Schafer. "The question isn't, do I have the data I require? It's, can I get it, assemble it, and manipulate it so my executives can make some timely decisions? All that depends on effective storage management."
To figure out how best to tier your data, sit down with your business managers and assess the type of data you have, how often it is accessed and how long it needs to be kept.
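The outcome of that assessment is usually a small set of rules. A minimal sketch of such a policy follows; the thresholds are assumptions a team would tune with its business managers, not industry standards:

```python
# A minimal sketch of a tiering policy: access recency and retention
# requirements map to a storage tier. The thresholds here are assumed,
# not industry rules.
def assign_tier(days_since_access, retention_years, regulated=False):
    """Pick a storage tier for a dataset."""
    if days_since_access <= 30:
        return "tier-1"   # hot data: fastest, costliest disk
    if days_since_access <= 365:
        return "tier-2"   # warm data: midrange disk
    # Cold data: regulated or retained records go to cheap archive storage.
    return "archive" if regulated or retention_years > 0 else "delete"
```

A patient chart untouched for two years but under a 25-year retention rule, for example, would land in the archive tier rather than be deleted.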
Bob Massengill, manager of technical services at the Wake Forest University Baptist Medical Center in North Carolina, says he didn't do an assessment when he began rolling out his storage strategy in 2000 because there weren't as many options as there are today, and he regrets it.
"We went into this so fast that we probably did not review our data like we should have," he says.
HIPAA requires the medical center to keep all its patient records for at least seven years, and longer if the patient is a minora newborn's records, for example, must be stored for 25 years. "There's a lot of data that will probably never be used again, but we're required to keep it, and it's all on high-priced storage. With ILM, we can reduce that cost." Massengill is working to develop a more formal ILM strategy within the next eight to ten months.
SRM software can look into your systems and report not only what hardware you have, but what kind of data is on them, how often those files are accessed and how much capacity is left on the system. Analysts say customers are often surprised by the results.
SRM software traditionally assessed your systems and generated reports, but newer tools can take action on their discoveries, actually moving data from high-tier storage to low, depending on the policies you set.
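In spirit, that "take action" step boils down to a scan-and-move loop. The sketch below walks a volume and relocates files idle past a policy window; the paths and 180-day threshold are hypothetical, and real SRM products add safeguards this toy omits:

```python
# Toy version of an SRM policy action: walk a volume, find files idle
# past a threshold, and relocate them to cheaper storage. The 180-day
# default is an assumption, not a product setting.
import os
import shutil
import time

def demote_stale_files(root, archive_root, max_idle_days=180, dry_run=True):
    """Move files not accessed in `max_idle_days` under `archive_root`."""
    cutoff = time.time() - max_idle_days * 86400
    moved = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            src = os.path.join(dirpath, name)
            if os.stat(src).st_atime < cutoff:
                dst = os.path.join(archive_root, os.path.relpath(src, root))
                if not dry_run:
                    os.makedirs(os.path.dirname(dst), exist_ok=True)
                    shutil.move(src, dst)
                moved.append(src)
    return moved
```

Running with `dry_run=True` first mirrors what the reporting-only generation of SRM tools did: it tells you what would move before anything actually does.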
Centralized storage management also reduces the time it takes to back up your company's systems.
Massengill, for example, says the medical center's departments (more than 380 of them) used to be responsible for their own backups.
"It seemed like every other day we'd get a call about a backup that hadn't been performed and a file that had been lost."
A rollout of StorageTek's Silo and Virtual Storage Machine allowed the medical center to begin centralizing the process and shortened backup times by three hours simply by enabling the staff to do the backup from one location. It also eliminated the need for some part-time staffers as well as an estimated $150,000 mainframe upgrade.
Hunt, of the Fred Hutchinson Cancer Research Center, adds that by doing incremental backups through its IBM Tivoli software, his organization has saved $50,000 a year in tape media alone. The center now backs up only the data that changes from day to day rather than the entire server every day.
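The idea behind incremental backup is easy to show. This sketch is a deliberate simplification (products such as Tivoli also track deletions, versions, and metadata); it copies only files modified since the previous run:

```python
# Simplified incremental backup: copy only files modified since the last
# run, rather than the entire tree every night. Real backup software
# also handles deletions, versioning, and catalogs.
import os
import shutil

def incremental_backup(source, dest, last_backup_time):
    """Copy files under `source` changed after `last_backup_time` (epoch secs)."""
    copied = []
    for dirpath, _dirs, files in os.walk(source):
        for name in files:
            src = os.path.join(dirpath, name)
            if os.stat(src).st_mtime > last_backup_time:
                dst = os.path.join(dest, os.path.relpath(src, source))
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)
                copied.append(src)
    return copied
```

On a server where only a fraction of files change daily, the copied set (and the tape it consumes) shrinks accordingly, which is the source of the savings Hunt describes.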
And don't forget to educate employees about how they should store their data.
Duplicate files are a major source of data-center pain, says Schafer. "In most Fortune 2000 companies, any important bit of data is eventually replicated 15 times. That's huge. And it's only going to get worse when you talk about what's coming down the road with all the unstructured data out there."
Schafer adds that encouraging employees to be more responsible about the number of files they duplicate can cut storage costs by as much as 30 percent.
The Canadian Museum of Civilization Corp., which manages the Canadian Museum of Civilization and the Canadian War Museum as well as a virtual museum on the Web, is in the process of digitizing its archive of almost 4 million artifacts. Chief Information Technology Officer Gordon Butler says one of the biggest challenges to his storage management initiative was getting his company's 400 employees to overcome the perception that storage is free. To do that, the museum, which outsources nearly all of its IT functions to Computer Associates International Inc., offers training sessions and also caps certain servers and networks, encouraging employees to save files to the central storage network.
But the true business goal is to turn storage into a product that is bought and sold within the enterprise. By pooling all the storage disks and networks, the IT department can begin holding business units accountable for the capacity they use.
Making storage a part of each department's budget creates an incentive to keep costs down in the data center. "Folks always want the fastest storage and the quickest recovery, but that always comes at a higher price," says Carolyn DiCenzo, a vice president with Gartner Research. "If a department wants everything online and recoverable, it will cost them X. But if they're willing to archive old data and put older data on lower cost storage, the IT shop can do it for a lower price. So storage as a service allows companies to better manage their storage from a price performance perspective." Of course, IT departments need to be careful not to design storage restrictions so rigidly that it hampers the ability of business units to store and share data effectively. Roughly 40 percent of large-cap companies have already put such plans into place, she adds.
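The chargeback itself is straightforward once usage is metered per tier. A sketch with invented rates, departments, and usage figures:

```python
# Sketch of storage chargeback: bill each business unit for the capacity
# it consumes at each tier's rate. Rates, departments, and usage are invented.
RATES = {"online": 0.50, "nearline": 0.15, "archive": 0.02}  # $/GB per month

def department_bill(usage_gb):
    """Return each department's monthly charge from its tiered usage."""
    return {
        dept: round(sum(RATES[tier] * gb for tier, gb in tiers.items()), 2)
        for dept, tiers in usage_gb.items()
    }

usage = {
    "claims":    {"online": 500, "archive": 4_000},
    "marketing": {"online": 200, "nearline": 300},
}
print(department_bill(usage))  # {'claims': 330.0, 'marketing': 145.0}
```

Note how the claims department's 4 terabytes of archived data cost less than its half terabyte of online storage, which is exactly the incentive DiCenzo describes.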
Allianz Life Insurance Company of North America, a subsidiary of Munich-based Allianz Group, partnered with consulting company Glasshouse to overhaul its storage infrastructure. Now, storage is offered as a service for four lines of business, says David Kaercher, vice president of core services for Allianz Life. The IT department passes its costs directly to the business, and storage investments and expenses are shared by its customers based on their volume and utilization. This allows both the IT department and the business units to better track resources and expenses.
In the storage market, software is the new name of the game. Yankee estimates the storage management market will grow from $6.2 billion in 2004 to $8.7 billion by 2008. ILM and SRM are the latest acronyms, but analysts warn companies to conduct a thorough evaluation of the products before they purchase them.
"I think the biggest challenge is getting through the vendors' claims," says Yankee's Balaouras. "A lot of vendors claim they can manage competitors' storage equipment, but when you pull back the covers it's really just basic monitoring, not configuration." She advises CIOs to insist on seeing a real-world demonstration of the software's capabilities, "not just PowerPoints," and make sure products conform to Storage Management Initiative Specification (SMI-S) standards.
Kem Hutchinson agrees. The associate director of operations and LAN technology manager of TSYS, a third-party credit card processing company based in Columbus, Ga., says his company went through a six-month testing phase of products before it chose to implement EMC Corp.'s Visual SRM. TSYS employs more than 5,300 people and oversees more than 300 million credit card accounts, but was having difficulty managing the storage capacity of the company's shared drives. "We were in a reactive, not proactive, mode," he says. "We would get to a point where we were at 98 percent capacity and we'd go in and fix them for the time being, and then weeks later we'd be back in the same boat."
As a test, Hutchinson and his team put the SRM tool to work on one of the company's most heavily used public drives. They began the scan on a Friday night, and within 24 hours the software had tagged 540,000 of the disk's 950,000 files as data that could be moved to a cheaper storage device. Hutchinson says the savings generated from that one test alone paid for the cost of the SRM software. "That definitely saved us from having to purchase more capacity," he says.
And although vendors promise that their software works on all types of hardware, don't underestimate the integration issues. If your shop has a number of different or proprietary systems that need to be cobbled together, you could be facing some challenges. Massengill says Wake Forest University Baptist Medical Center still faces issues in getting all its disparate systems to be managed under one software program. "I have not found one product that handles everything," he says.
Finally, make sure you're shopping for the storage management product that fits your company's needs and budget. Large-cap companies have a range of options from a number of vendors, including EMC, Veritas, IBM Corp., Hewlett-Packard Co. and Sun Microsystems Inc. The software you choose should depend on the complexity of your infrastructure and the number of different types of storage devices in your data center. Fortunately, there are plenty of options out there for small and medium-sized companies (fewer than 500 employees), and vendors are creating products that are preprogrammed and easy to use.
Implementation costs vary wildly, depending on the product you choose and the amount of data you need to store. Massengill says his rollout has so far cost the medical center about $3.5 million, but prices have fallen drastically since he began the rollout in 2000.
The SMI-S standard was recently completed and rolled out by the Storage Networking Industry Association. The standard will be included in products that ship in 2005. According to Roger Reich, SMI committee chair of the SNIA, the standard will help CIOs solve many of the integration challenges in the data center. "We're trying to solve the end user's integration nightmare and make the data center easier to manage," he says. Of course, the new standards won't help older, noncompliant equipment, which will have to be converted to meet the new standards.
Meanwhile, analysts say storage virtualization will see greater adoption within the next five years but is still being developed. Virtualization allows companies to pool storage capacity from their various hardware devices, essentially creating one large pool of storage space. If one of your storage drives has only 4 gigabytes of free space left on it, while another has 6 gigabytes, and you have to store a file that's 10 gigabytes, virtualization would allow you to combine the drives' free space as if it were all sitting in one place. "I would say the nirvana is probably another couple of years in terms of complete heterogeneous virtualization," says Yankee's Balaouras. "Vendors can do it today with their own equipment to varying degrees of success." Virtualization will also make migrating data from high tiers of storage to lower, less expensive storage devices a much simpler process, Balaouras adds.
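The drive-pooling idea can be modeled in a few lines. This toy only does the capacity accounting (a real volume manager handles striping, metadata, and failure recovery) and uses a hypothetical two-drive pool:

```python
# Toy model of pooled storage capacity: present several drives' free space
# as one logical pool and split a large write across them. Illustrates the
# accounting only, not a real volume manager.
def allocate(pool, size_gb):
    """Spread `size_gb` across drives with free space; return the placement."""
    if sum(pool.values()) < size_gb:
        raise ValueError("pool has insufficient free capacity")
    placement, remaining = {}, size_gb
    for drive, free in pool.items():
        take = min(free, remaining)
        if take:
            placement[drive] = take
            pool[drive] -= take
            remaining -= take
        if remaining == 0:
            break
    return placement

pool = {"drive_a": 4, "drive_b": 6}   # free gigabytes per drive
print(allocate(pool, 10))  # {'drive_a': 4, 'drive_b': 6}
```

Neither drive alone could hold the 10-gigabyte file, but the pool can, which is the whole appeal of virtualization.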
Although virtualization is not quite ready for prime time, META Group suggests that CIOs ask potential vendors about their plans to develop virtualization tools and to begin thinking about how to best use virtualization strategically. Says Schafer, "Virtualization is the next step in creating a seamless storage utility and eventually moving to centralized storage management."