Analysis: Complexity
By Terry Kirkpatrick | Posted 02-01-2003
In 1999, Don Morrison, Novell Inc.'s director of information support services, was looking to simplify the hiring of new employees. Digging into the process, he discovered 170 different applications throughout the company that its 6,000 employees might need access to in doing their jobs, things like e-mail, benefits accounts, customer data and expense account forms. The hiring process alone involved 19 different applications that generated more than 75 different data flows and integration points. Many of these applications belonged to departments that didn't care to give up control of certain types of data, such as employee titles, salary information or e-mail addresses. Had Morrison not found a way to allow departments to own and control data they considered critical, he says, "the project would most likely have ended prematurely because of the political nightmare it presented."
[Figure: Novell's IT architecture used to be made up of more than 190 applications. Human resources alone included more than 100 interfaces. The orange squares represent Novell's three primary enterprise applications: PeopleSoft, Phoenix and Oracle. The green squares symbolize the various single-purpose applications, and the red squares show the outside vendors to which Novell's systems are connected.]
The solution was to use XML to integrate applications and IT services into a single workflow process that tracked employees from hiring to firing, providing access to appropriate managers and synchronizing changes automatically. When an employee leaves, for example, access to networks, phones and buildings is automatically shut off. Novell says it saves more than $475,000 every quarter in employee productivity, new-hire setups, help-desk work and other costs, for a return on investment of 323 percent.
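The integration pattern Novell describes, representing employee lifecycle events as XML and fanning them out to the systems each department still owns, can be sketched roughly as follows. The event schema and handler names here are illustrative assumptions, not Novell's actual design:

```python
import xml.etree.ElementTree as ET

# Hypothetical lifecycle event; Novell's real schema is not public.
EVENT = """
<employeeEvent type="termination">
  <employeeId>E1042</employeeId>
  <effectiveDate>2003-01-31</effectiveDate>
</employeeEvent>
"""

# Each downstream system registers a handler keyed by event type, so
# departments keep ownership of their own data and actions while the
# workflow still fires every step automatically.
HANDLERS = {
    "termination": [
        lambda emp: f"network: disabled login for {emp}",
        lambda emp: f"telecom: deactivated phone for {emp}",
        lambda emp: f"facilities: revoked badge for {emp}",
    ],
}

def dispatch(xml_text):
    """Parse one event and fan it out to every registered handler."""
    event = ET.fromstring(xml_text)
    emp = event.findtext("employeeId")
    return [handler(emp) for handler in HANDLERS[event.get("type")]]

results = dispatch(EVENT)
```

The design point is that no department surrenders its data to a central system; each owns a handler, and the shared XML event is the only thing that crosses the boundary.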
Morrison and his colleagues had come face-to-face with complexity, the devil of CIOs in every company, large and small. "Anybody who has more than three employees deals with complexity every day," says Andrew Rowsell-Jones, a research director in Gartner's Executive Programs. Too many CIOs face daunting levels of unnecessary complexity, which detracts from the efficiency of their systems and business processes, adds millions to annual costs and drastically reduces their ability to respond flexibly to new technologies, business processes and markets. In a January CIO Insight survey, 42 percent of the almost 500 IT executives polled said their systems were more complex than necessary, a number that rose to 54 percent among those at companies with more than 1,000 employees. And maintaining and managing that excess complexity cost them an average of 29 percent of their IT budgets.
The good side of complexity is what it lets companies do: get new employees up to speed, for example, or quickly offer customers a new online service. Rowsell-Jones cites Citigroup's proprietary network for clearing international settlements. "That's an example of using a phenomenally complex network of relationships among their own subsidiaries and other banks. They use that complex network to compete, to do things more cheaply than other financial institutions that don't have that scale and complexity," he says.
The dark side for most companies, however, is a costly, confusing spaghetti bowl of systems. Forrester Research Inc. has put some numbers on the problem: a server utilization rate of only 60 percent, meaning $20 billion in new servers was wasted last year, while the largest 3,500 firms will spend an average of $6.4 million this year on systems integration. And CIOs are devoting up to 70 percent of their budgets simply to keep their beasts at bay, money that can't go to new business opportunities. McKinsey & Co. estimates that IT infrastructure and application costs are 20 percent higher than necessary because of the confusing variety of technologies, such as database and desktop operating systems, in use.
A large part of that cost is people. Alan Ganek, head of IBM's autonomic computing division, which develops technology to automate IT operations, estimates that in the past 10 years the cost ratio of software and hardware to the people required to maintain it all has reversed: 80 percent of the money previously went to technology, while 60 to 75 percent now goes to people.
Yet the benefits of cutting complexity can go far beyond saving money. "Everything is much easier across the whole value chain, from training the users to maintaining the systems to being able to scale up the systems, even negotiating vendor licenses," says Howard Lapsley, a partner at Mercer Management Consulting.
Toward a Definition
What is complexity? It has many strands, some technical, some organizational. On one level it is the sheer number and variety of devices and people using them. Consider the financial services industry: In 1996, a major bank's 30,000 ATMs might have been considered a really large online environment, says IBM's Ganek. "Six years later, big banks have Internet services accessible by anybody in the world," he says. "They have 10 million, 20 million, 100 million users. At first, ATMs all looked alike and had a limited set of functions. Today these banks' networks are accessed by notebook computers, desktops, eight different versions of Windows, Macs, PDAs, smart cards. That's a much more complex environment to support. And the scale: We've got customers that have 30,000 servers, more servers today than they had PCs six years ago."
Yet it is more than the number of boxes that proliferated as IT moved out of the glass house. "The problem is the number of point-to-point connections between systems," says André LeClerc, a software developer and IT architecture consultant at the Cutter Consortium. "How many times does a large enterprise have to interface with its legacy systems? That's the measure of complexity. Take an order-processing system. How many groups in the company have to register orders? How many systems do they go through to process an order? And then, when you expose that to the Internet, how many touch points are there from customer to information? There's been logarithmic growth in these connections."
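Whatever the label, the arithmetic behind LeClerc's measure is stark: with full point-to-point integration, the number of interfaces grows quadratically with the number of systems, while a shared integration layer keeps it linear. A back-of-the-envelope sketch:

```python
def point_to_point(n):
    """Every system talks directly to every other: n*(n-1)/2 links."""
    return n * (n - 1) // 2

def hub_and_spoke(n):
    """Every system talks only to a shared integration layer: n links."""
    return n

# At roughly Novell's scale of 190 applications:
p2p = point_to_point(190)   # 17,955 potential interfaces
hub = hub_and_spoke(190)    # 190 interfaces
```

No enterprise wires up every possible pair, but the worst case explains why touch points multiply so quickly once legacy systems are exposed to new channels.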
Complexity lurks in the layers of heterogeneous platforms and applications that came with client/server networks and Internet computing. "You have the network layer, which involves your WAN, LAN and PCs," says Dick LeFave, CIO of Nextel Communications Inc., the wireless phone company. "You've got the middleware layer, where you're handling a variety of protocols. You've got the application layer and the database layer." All these layers must work with each other seamlessly to enable seemingly routine processes. "We're talking about technology layers that are five to 10 times more complex than they were 10 years ago."
Moreover, islands of expertise have grown up in IT shops around each of those layers. Back in 1997, financial services giant Wachovia Corp., with $5.3 billion in revenues at the time, had a problem with its new browser application for corporate customer access. Chris Edden, then a senior vice president and group manager of commercial services in the IT department, quickly put together a team of experts to fix it. Edden gathered his troops in a windowless room in the bank's operations center in Winston-Salem, N.C. Among the dozen people were experts in applications, telephony, security, mainframes, desktops, middleware, databases and computer operations. After three hours and as many pots of coffee, they isolated the problem: a piece of transaction software had failed. It took just 15 minutes to fix.
What the incident highlighted for Edden is that each IT discipline tries to maximize its contribution by offering the most sophisticated solution, regardless of the impact on everything else. The CIO has to see the big picture and manage all these disciplines as a unified service. "The trick for the CIO is, how do you take these centers of excellence and associate them to create a solution?" Edden says. "It wasn't always an issue, but the organization is a much more complex system now."
Perhaps the most useful definition comes from Bob Reinhold, vice president and head of Americas Technology Consulting Services for Cap Gemini Ernst & Young: "I see complexity as the non-value-added redundancy across all the dimensions of your applications or infrastructure or data store. Having multiple ways of delivering a service without a good reason for them introduces a complexity that doesn't add value to your organization." Reinhold is careful to distinguish between good and bad redundancy; redundancy might assure a zero-tolerance transaction system, for instance, or provide necessary business continuity. As for the rest, "the fact that you have multiple hammers doesn't necessarily mean you're redundant. And the fact that you've got 12 hammers doesn't mean you are overly complex. But if you have seven of the same kind, you might be overly complex." And each of those hammers costs money and time to maintain.
Too Many Boxes
At some point, the day of reckoning must come. The "buy-and-hire" strategy that characterized the go-go 1990s, patching together more and more boxes and software, administered by more and more people, for more and more money, eventually explodes. And the trigger point can vary.
For some, the trigger is a missed opportunity when market conditions change. Sunil Subbakrishna, an independent IT strategy consultant, cites MCI's Friends and Family billing program. "The other long-distance companies, which didn't have that capability in their billing systems, took up to two years before they could offer that feature," he says.
It's a question of agility: Business success today increasingly means connecting diverse constituents (customers, suppliers, partners) in real time, but "system complexity is a drag chain that prevents IT from responding to these new constituencies at a time when competitive pressures are forcing enterprises to respond to them," Gartner's Rowsell-Jones says.
But for most, it's the budget. Says John Kaltenmark, chief architect and a partner at Accenture Ltd. in Chicago: "One key indicator that we tend to look at is the amount of the IT budget that goes to maintenance and support costs. In the last several years, we've seen that creeping up in some organizations to 70, 80, almost 90 percent. If you're spending much more than 40 or 50 percent, whether it's people costs, or assets or tools, or development to keep the systems running, that's an indicator that it's probably time to step back and look at rationalizing your systems environment."
What To Do?
Complexity is inevitable, but the goal is a cost-efficient and easily managed system. The frequently cited analogy is the modern car, which is far more complicated than a Model A, but also far easier to drive.
In the short term, CIOs can take advantage of new technologies that help manage pieces of their complexity by automatically monitoring and configuring systems, and thus freeing up people. A number of companies have sprung up recently to offer just that. Out of his war room experience at Wachovia, for example, Edden launched a team that designed software to monitor the systems and give business executives a total view, via the intranet, of a technology service such as the corporate banking application. It observes applications as they operate and isolates problems so system administrators can fix them quickly. The software spread around Wachovia by "sneaker net," and in 1999 Edden cofounded a subsidiary, Silas Technologies Inc., to sell the software to other companies.
"Our solution allows senior management to observe real-time the health of both IT and operational investments," Edden says. "It's not a toolset for the network administrator. It reports on the health of complex architectures by exercising those services and subservices. Home banking, for example, might be periodically tested for availability for the customer, but it will look into subservices like the mainframe, database or Web server. Not only does a manager know whether he has a problem or not but also where the problem most likely is."
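What Edden describes, exercising services and their subservices, is essentially synthetic monitoring: periodically probe each layer a transaction depends on and roll the results up into a single service-level view. A minimal sketch, with hypothetical subservice checks standing in for the real product's probes:

```python
from datetime import datetime, timezone

# Hypothetical probes for the subservices behind "home banking."
# Each returns True when that layer answers a test request correctly.
def check_web_server():
    return True    # e.g., fetch the login page over HTTP

def check_database():
    return True    # e.g., run a trivial read-only query

def check_mainframe():
    return False   # e.g., submit a test transaction; here it fails

SUBSERVICES = {
    "web server": check_web_server,
    "database": check_database,
    "mainframe": check_mainframe,
}

def service_health(name, probes):
    """Exercise every subservice and localize which layer is failing."""
    results = {sub: probe() for sub, probe in probes.items()}
    return {
        "service": name,
        "checked_at": datetime.now(timezone.utc).isoformat(),
        "healthy": all(results.values()),
        "failing": [sub for sub, ok in results.items() if not ok],
    }

report = service_health("home banking", SUBSERVICES)
# report["failing"] tells a manager not just that there is a problem,
# but where it most likely is.
```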
But others warn that many of the vendor offerings allegedly designed to help solve the complexity problem do little more than add yet another application onto companies' already overburdened architectures. Such technologies "provide some management monitoring of a complex environment, but that doesn't in and of itself reduce complexity. It just gives you a better idea of what's going on," says Gartner research director Debra Curtis. "Hopefully, such tools don't compound the complexity. In reality, you have to introduce structure into the environment, eliminate complexity, reduce the vendor choices."
To get at the root of the problem, companies need a complete picture of their IT architectures. Knowing how much money they're wasting with current systems is the first step toward cleaning them up through rationalization or integration. Or they could spring for a whole new architecture, but who can afford it? "No IT department has the time or budget to do that kind of wholesale forklift change to a structured environment," says Curtis. That's the great dilemma facing most CIOs dealing with complexity.
But cleaning things up requires discipline. Says Mercer's Lapsley: "You have to get under the hood. First, you have to make sure you have a very good process in place to control and prioritize your discretionary spending. Then you have to look at the infrastructure spending that is supporting not only the discretionary projects now and in the future, but also projects that are already in place. You have to ask, 'How many platforms are we supporting? What's the driving business need for them? What are the costs to rationalize them? What is the game plan for doing so?'"
CGE&Y's Reinhold prefers to attack the problem by rationalizing all the many technology standards large companies tend to acquire over the years. But you can't simply assign standards and then replace everything that isn't compatible. "Instead, most companies should look at the total complexity in their environment and say, 'OK, out of all the standards I have, I am picking these to be my standards going forward. And every opportunity I get, every natural event that causes me to evolve, I am going to evolve to the same standard. And over time, I will dramatically reduce my complexity.'"
Ultimately, reducing IT complexity will mean examining the complexity of the business processes IT supports. "If you're able to sort out the business processes, there's obviously still a big technical challenge, but it's much easier to proceed," Subbakrishna says. "The difficulty is when you can't sort out the business processes; you basically have to build a system that's capable of doing everything, and that creates the complexity."
In fact, the first piece of advice IBM CIO Phil Thompson gives to customers isn't about technology: "First of all, if you don't have a formal system of governance, get one. And have it very clear what role you're going to play in the process and what role all the other participants are going to play, including the senior leadership of the company from the chairman on down. Second, as part of that governance, you need to put in place an enterprise architecture board that helps you understand your architecture, starting at the layer of process and working all the way through applications, data and then machines. They also have to drive compliance across the company so that no renegades are out doing things that are not compliant."
In other words, complexity must be fought not just in the technical trenches but at the highest levels of the corporation. It's a question of aligning the culture around simplicity.
A Simpler Future
In the long run, CIOs can anticipate advances in systems that are expected to radically change IT operations. Forrester Research says technologies coming during the next decade will lead to what it calls "organic IT," architectures built of cheap, redundant components that automatically share and manage resources. These technologies, says Frank Gillett, a principal analyst at Forrester, will solve the two fundamental problems of complexity: applications that can't talk to each other, and hardware that, bought for peak demand, is typically used no more than 20 percent of the time. In this vision, Web services and portals will let applications talk to each other relatively cheaply, while new software will allow servers to be quickly reconfigured for other tasks, increasing their capacity. Gillett has built financial models that point to potential savings of 80 percent on additional peak server capacity and 35 percent on storage, with total infrastructure cost reductions of 52 percent.
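Gillett's exact percentages come from his full financial models, but the shape of the argument is simple arithmetic: if pooled, reconfigurable servers raise effective utilization, far less peak capacity has to be bought. The 20 percent figure below is the one cited in the text; the 60 percent pooled utilization is an assumed example, not Forrester's number:

```python
def capacity_to_buy(peak_demand, utilization):
    """Capacity you must purchase to cover a given peak at a given utilization."""
    return peak_demand / utilization

# From the article: dedicated boxes typically run at 20% utilization.
dedicated = capacity_to_buy(peak_demand=100, utilization=0.20)  # 500 units
# Assumption: a shared, reconfigurable pool might sustain 60%.
pooled = capacity_to_buy(peak_demand=100, utilization=0.60)     # ~167 units

saving = 1 - pooled / dedicated  # roughly two-thirds less peak capacity
```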
Meanwhile, a number of IT vendors, CGE&Y, EDS Corp. and IBM among them, are putting in place the beginnings of a utility computing model. The goal: to provide corporations with all or part of their computing needs on an as-needed basis, a kind of "leave the computing to us" pitch. IBM's well-publicized effort, called e-Business on Demand, aims to use the power of grid computing, controlled by autonomic, self-healing systems, to provide low-cost, flexible computing to businesses.
IBM's autonomic initiative is an effort, Ganek says, "to make technology take care of things technology can take care of, freeing people to do the things they do best, which is thinking about how they can use technology to solve business problems. Going out 10 years, we can expect to have a far more cohesive behavior of systems where components interact and make constant adjustments. The CIO's life is going to be different in that the CIO will be much more of a partner in the business.
"But this is not an overnight thing," Ganek says. "This is a grand challenge in computing, as hard a problem as exists."
TERRY A. KIRKPATRICK is a contributing editor for CIO Insight.
Simplicity: The New Competitive Advantage, by Bill Jensen, Perseus Publishing, 2000
"Fighting Complexity in IT," by Frank Mattern, Stephan Schönwälder and Wolfram Stein, The McKinsey Quarterly, No. 1, 2003
By Frank E. Gillett, et al., Forrester Research Inc., April 2002
IBM Corp., 2001