Technology: Storage

Gary Bolles

Every day at the Aviation Weather Center in Kansas City, Mo., a team of 60 meteorologists sifts through a torrent of weather data from thousands of sources: satellites, radar ground stations, weather balloons, ships, pilots, even offshore buoys. The group’s nine forecast desks run the information through a variety of computers, including one of the world’s fastest supercomputers, generating model after model of dense graphics detailing weather conditions across two-thirds of the globe, from tornadoes in Kansas to floods in central Pakistan. All in all, says Clinton Wallace, an IT specialist at the center, AWC’s two aging Hewlett-Packard K-class servers, which move all that data from one place to another, have to juggle 13 gigabytes of data a day, roughly the equivalent of four and a half million pages of text in the nondigital world.
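
The “pages of text” comparison is easy to sanity-check. Here is a minimal back-of-envelope sketch, assuming roughly 3,000 bytes per plain-text page (an assumption, not a figure from the article):

```python
# Rough check of the pages-of-text comparison.
# Assumption: one plain-text page is about 3,000 bytes
# (roughly 500 words at ~6 characters per word, spaces included).
BYTES_PER_GB = 1_000_000_000
BYTES_PER_PAGE = 3_000  # assumed, not from the article

daily_bytes = 13 * BYTES_PER_GB
pages = daily_bytes / BYTES_PER_PAGE
print(f"{pages / 1e6:.1f} million pages per day")  # about 4.3
```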

When they’re functioning, that is. The servers crash frequently. “We’re just beating them to death,” admits Wallace. “We’re driving them at 100 miles an hour, all day long.” Outages can last up to an hour, leaving forecasters unable to generate advisories for thunderstorms, ice, turbulence and visibility for the thousands of pilots crisscrossing the planet.

Aviation Weather’s data storms aren’t the exception in today’s data-saturated world. With the amount of information warehoused in the digital economy growing at 30 percent a year, according to Gartner Inc., companies and organizations everywhere will need better and cheaper ways to handle the data glut, or risk costly, even life-threatening, information failures.
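
Thirty percent a year compounds quickly. The short sketch below illustrates the math; the starting volume is normalized, and only the growth rate comes from the Gartner estimate:

```python
# How 30 percent annual growth compounds. Only the rate comes from the
# Gartner figure cited above; today's volume is normalized to 1.
GROWTH_RATE = 0.30
volume = 1.0

for year in range(1, 6):
    volume *= 1 + GROWTH_RATE
    print(f"Year {year}: {volume:.1f}x today's data volume")
# Roughly 2.2x after three years and 3.7x after five --
# the volume doubles about every two and a half years.
```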

Companies “should have secure access to information at any time, over any distance, no matter where the information is kept, no matter the type of computing platform—all at the fastest possible speed,” says Dan Tanner, a senior storage analyst for the Aberdeen Group, a research firm. “That’s the Holy Grail. That should be the strategic aim [of IT]. If you can do that, your business will run smoothly.”

Storing and accessing critical business data wasn’t always such a worry. In the days of mainframe computers, storage was centralized, and users always knew where it was—on the big iron—even if they couldn’t always get at it. But with the advent of cheaper and more accessible networked computers in the 1980s and 1990s, IT departments began phasing out their companies’ more reliable, enterprise-class storage systems. Cheap and accessible networked devices also let companies put computing and storage horsepower where they thought it belonged—in a workgroup, for instance, or a hosting company’s data center. But the downside is complexity. “When you vastly increase storage, and you vastly complicate the network,” says Tanner, “managing the storage and movement of data becomes steeply more difficult.”

Storage Everywhere

[Infographic: Strategic Storage]
Corporate strategy should drive every major IT architecture decision. The infographic suggests a way to think about how a particular storage design affects strategy in a company where nimbleness and employee productivity are essential to delivering value to customers. To ensure rapid access to data, IT must build a networking infrastructure efficient enough to deliver the necessary speed. The data must also be widely available, which typically means a strategy that blends centralization, reliability and scalability. These factors, in turn, are supported by a growing trend toward standardization, which lets IT “virtualize” storage hardware, making it possible to save and access data wherever it is needed across the networking infrastructure.

Part of the problem stems from the fact that distributed computing merged all the major pieces of the computing puzzle into one box, bundling storage with everything else, from applications to operating systems to processing. Those pieces aren’t easily uncoupled, which makes it difficult to isolate storage and maximize its effectiveness. Says Tanner: “Now the data center, instead of a mainframe, might have a bunch of open systems. But these computers won’t share the information very well if they all have their own storage.”
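
The virtualization idea from the infographic is one answer to that sharing problem: put a thin layer between the logical names applications use and the physical devices that hold the data. The sketch below is purely illustrative; the class and device names are hypothetical, not any vendor’s product or API.

```python
# Minimal sketch of storage virtualization: applications read and write by
# logical volume and key, and a mapping layer decides which physical device
# actually holds the data. All names here are hypothetical illustrations.

class VirtualStoragePool:
    def __init__(self):
        self._devices = {}    # device name -> {(volume, key): data}
        self._placement = {}  # (volume, key) -> device name

    def add_device(self, name: str) -> None:
        self._devices[name] = {}

    def write(self, volume: str, key: str, data: bytes) -> None:
        # Naive placement policy: put each item on the least-loaded device.
        device = min(self._devices, key=lambda d: len(self._devices[d]))
        self._devices[device][(volume, key)] = data
        self._placement[(volume, key)] = device

    def read(self, volume: str, key: str) -> bytes:
        # Callers never need to know which box actually holds the data.
        device = self._placement[(volume, key)]
        return self._devices[device][(volume, key)]


pool = VirtualStoragePool()
pool.add_device("array_a")
pool.add_device("array_b")
pool.write("forecasts", "2002-06-01", b"convective outlook ...")
print(pool.read("forecasts", "2002-06-01"))
```

The point of the indirection is that placement lives in one layer: storage can be added, moved or centralized without touching the applications that use it.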

And that complexity can be costly. For IT, managing distributed storage means heavy spending on hardware and staff time. According to Mike Kahn, chairman and cofounder of the Clipper Group, a Wellesley, Mass.-based consulting firm, the cost of labor can be as high as seven to eight times the cost of the hardware itself. That’s why most IT shops, especially amid the current economic downturn, are trying to do more with less. The challenge, says Kahn, is: “How can I manage two to four times more storage, and do it better?”
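
Kahn’s ratio puts the cost picture in perspective. A quick calculation, using a hypothetical hardware budget (only the seven-to-eight-times ratio comes from Kahn):

```python
# Where the money goes under Kahn's estimate. The hardware figure is a
# made-up example; only the 7x-8x labor-to-hardware ratio comes from the text.
hardware_cost = 100_000   # hypothetical annual hardware spend
labor_ratio = 7.5         # midpoint of the quoted range

labor_cost = hardware_cost * labor_ratio
total_cost = hardware_cost + labor_cost
print(f"Labor share of total storage cost: {labor_cost / total_cost:.0%}")  # ~88%
```

In other words, managing two to four times more storage with the same staff is less a question of cheaper disks than of shrinking the labor that surrounds them.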

Also contributing to the storage problem: few business executives have any interest in the technology their company uses to store data, and fewer still understand the link between a company’s storage strategy and its ability to use data in real time to cut costs, boost profits or predict hurricanes. “The most critical thing we need for forecast operations is data,” says the Aviation Weather Center’s Wallace, and that means “having your data available when people need it.”

But the trade-off between cost and complexity doesn’t have to be a bad one. In an effort to keep costs down while achieving the reliability of the mainframes of old, many companies are rethinking their digital storage networks.