Oakwood Worldwide Inc., a Los Angeles firm providing temporary housing for relocated employees, once had a system that fit the company's needs like a glove. Two thousand employees in 200 offices around the country relied on a custom HP 3000-based system to match customers with vacancies. Have a client who needs to find a three-bedroom, non-smoking apartment in Sacramento that's near a park? Oakwood's system could help its staff find several to choose from. And it could do it reliably in just a few seconds, an important consideration, since it only took eight days of vacancy to wipe out the profits from a two-month rental.
Then those needs changed. Corporate clients wanted to tap into the database so they could stay on top of their relocation spending. Oakwood wanted to analyze all this sales and real estate data so it could uncover opportunities. Could they, for example, boost profits if they offered units with Asian-style kitchens in Washington, D.C.? If Oakwood's staff could analyze that data, they could figure out which units and features appealed to different demographic groups. Then salespeople could target Japanese or Korean companies, for example, and the property managers could make modifications. New data analysis tools could do all that, if those tools could utilize the data in the company's computer. But there was the rub, says CIO Ric Villarreal. "I cannot apply any new analytical tools to the data because I don't have access to [the database in] real time," he says. And that would require modifying five million lines of COBOL.
Building a flexible architecture that can integrate existing corporate data and applications while accommodating new business needs and systems is an enormous challenge. Yet companies are finding it is a business necessity and that they can overcome the obstacles incrementally by completely changing applications and databases when they must, and building bridges between legacy architectures and new needs and software when they can.
Companies seldom have complete data architectures, which comprise the structure of all corporate data, the relationships among the data, and the implementation of data structures in applications and databases, says Paul Harmon, senior consultant for the distributed computing architecture advisory service of the Cutter Consortium, an IT advisory firm in Arlington, Mass. Mergers and acquisitions force the assimilation of dissimilar data architectures, and short deadlines to show financial results often push managers to cobble together the previously separate systems. "About the time they figure out a nice, clean, logical way to merge the databases is about the time management buys another company," Harmon says.
Some organizations have enough operational and organizational stability to sustain a single architecture. The University of Miami has used the same integrated database management system, Computer Associates' IDMS, for 20 years, almost as long as the tenure of Dean, Vice President and CIO M. Lewis Temares. IDMS centrally controls and enforces a data model used by the university's applications, which track students, manage admissions, determine financial aid, create class schedules and support fundraising.
Still, the data architecture has shortcomings. "You have to practically be a programmer to get [information] out," Temares says. Two decades ago, there were no graphical interfaces or high-level tools for university administrators to extract data and create reports. Now, the IT department is providing Web portal front-ends to give administrators easy access to all the information they need to sort through student demographics for marketing purposes, or look at course preferences to plan next year's teaching assignments. Temares is already planning a transition to another database, probably IBM's DB2, at an expected price of $2 million to $3 million.
Creating an organizationwide data architecture is an expensive proposition. A few years ago, the Jersey City, N.J.-based Pershing division of Donaldson, Lufkin & Jenrette Securities Corporation, a Credit Suisse First Boston company, decided it needed a more rational data architecture. Services demanded by consumers, such as consolidated statements, as well as new SEC requirements led Pershing to consider redesigning the structure of its data on customers, accounts and securities trades. But there was a problem: The new architecture would cost tens of millions of dollars, and the legacy systems it touched were too important to tamper with lightly.
"We would have to modify every legacy system to use the new data [architecture]," explains director of database technology Albert DiGiovanni. "And the legacy stuff is important; it continually runs the firm." Many of the original business systems run on a mainframe and were built with COBOL, CICS and VSAM. So the company is first mapping out the legacy data, defining additional data required by business needs and storing the extra information separately in DB2 databases. The approach allows Pershing to evolve existing systems while making them compatible with new applications and tools.
"You won't be able to build new all the time," notes DiGiovanni, "so you have to find out how to make the connection [between old and new systems] and then leverage the new data structure."
Managing data architecture changes is challenging enough when planned, but can be harder if end users unexpectedly force modifications. Restaurant chain Red Robin Gourmet Burgers of Greenwood Village, Colo., tries to keep a stable data architecture so data is entered only once and then propagated throughout the system. The group within Red Robin that's responsible for building new restaurants wrote its own Microsoft Access management tool. Now it wants the software integrated with the company's financial application, and Red Robin CIO Howard Jenkins wants to proceed carefully to ensure that the data models are compatible. "Clearly we have to integrate [the tool] or replace it with something easier to integrate. We don't know which yet."
No, it isn't easy to create and stick to a good data architecture. Still, CIOs keep chipping away at the problem, doing what they feel they can.
And effective efforts can pay off in the future. The inflexible architecture at Oakwood is on the way out. A new architecture, supported by an Oracle database and running third-party applications, will for the first time let employees use tools that analyze data from their operations and uncover new ways of increasing profits.
"If I do it right, the architecture makes the processes easy to program or develop, gives the data integrity, and lets me go after the data in lots of different ways," Villarreal says. At least, it will for now.

Erik Sherman is a freelance writer in Marshfield, Mass.