The Risk of Inaction
By CIOinsight | Posted 09-23-2002
Computer Security in Flux
Dr. Peter G. Neumann, principal scientist at SRI International's elite Computer Science Laboratory, is a world-renowned expert on computer security, privacy and systems. Last year he was awarded a contract by the Defense Advanced Research Projects Agency to help ease extensive security problems in the nation's data infrastructure. He was given the National Institute of Standards and Technology and the National Security Agency's 2002 Computer System Security Award this past June. His book, Computer-Related Risks, is in its fifth printing.
CIO Insight: What impact has Sept. 11 had on the way organizations look at security?
Peter G. Neumann: I think there's a great deal of confusion as to what to do. When people don't really understand the detailed risks they're facing, they tend to dither. In this particular case we've been attempting to throw some high-tech solutions at problems that don't easily respond to high-tech solutions. There are a great many things that technology can do, but the reliance on technology as a solution to non-technological problems is inherently risky. So I think in the absence of specific understanding of what might work and what doesn't work, there are a lot of organizations with their heads in the sand.
In the Clinton administration, the President's Commission on Critical Infrastructure Protection came to the conclusion that pretty much all of our critical infrastructures are at risk. When we extend that to corporate America, the same conclusion can be drawn: For anybody who has extensive pieces of their business accessible from the Internet or accessible by remote telephone dial-up or accessible by wireless modems that are not adequately protected, their entire enterprise is at risk. The challenge here is to recognize what the risks are and to act accordingly.
CIO Insight: What impact does that have on corporate strategy?
Neumann: The question is, where does one most effectively put resources? And the resources are people and money. There are a lot of organizations that are saying, oh, don't worry, we've got everything under control; we have all of the risks covered, and we've protected ourselves. Anybody who says that is either incredibly naive or is hoping that their own spin is going to protect them. This is sort of security by obscurity, with heads in the sand and pretending that everything is now OK.
From what we've seen in computer security, everything is vulnerable, and the extent to which it's vulnerable may vary a little bit from one organization to another, but basically all of the mass-market computer software that's out there is riddled with security vulnerabilities. So the question is: How are you using it and how are you interfacing with the rest of the world?
The Risk of Inaction
CIO Insight: A smart company at this point would be thinking differently about risk, right?
Neumann: The first step is to understand what the vulnerabilities are and what your threats are. It's classical risk analysis. Except that it's a seat-of-the-pants analysis; it's not trying to understand things in tremendous depth. Because as soon as you start trying to put numbers onto potential losses, you're starting to make all sorts of assumptions, and it is exceedingly unwise to do detailed risk management based on numbers that aren't right. The better strategy is to eyeball the problem and see where your risks are, and try to figure out what you can do about it.
The question always arises, what is the risk of not doing something? What does it do to our business? In the past people have said, gee, we've never had an attack or we've never had a break-in or we've never had an enormous financial loss that we couldn't cover. Therefore, why do we need to do anything about it?
CIO Insight: Has that thinking changed?
Neumann: I think so, yes. I don't think it's changed enough, but I think we're at least beginning to get a little more reality into the thinking.
CIO Insight: Why hasn't it changed enough? Why are heads still in the sand?
Neumann: It's a standard problem. Unless you yourself have been burned, you don't tend to see a need to react. Most people tend to look at things from their own personal point of view without understanding the broader implications of the things that happen around us.
The typical CIO is going to say, "I realize everybody around me is having problems, but I've only had a few minor break-ins, and we're already doing a lot to protect ourselves." But once he has a major break-in, the CIO suddenly has to say, "My gosh, I really have to do more." When you realize how vulnerable most corporations are, and certainly how vulnerable most individuals are without realizing it, it is clear that the situation we're in is grossly inadequate.
Now, part of the problem is that the computer systems themselves do not have adequate security. Another problem is that the system administration of those systems is tremendously complicated, and it's difficult to keep a set of highly trained system administrators within your corporation. And, third, the user community is fairly naive. That exacerbates the problem as well, because you have people with trivial passwords and in some cases blank passwords. I think the challenge to CIOs is that they really need to assess what the vulnerabilities are, to recognize that there are threats all over, and to analyze what risks would, in fact, cause tremendous difficulties for their companies.
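The naive-user problem Neumann points to — trivial and blank passwords — is the kind of thing a hygiene audit can catch mechanically. A minimal sketch follows; the wordlist, the eight-character rule, and the function name are illustrative assumptions, not anything from the interview:

```python
# Minimal password-triviality check (illustrative rules, not a standard).
# A real audit would use a large breached-password list, not this tiny set.
COMMON = {"password", "123456", "letmein", "admin", "qwerty"}

def is_trivial(password: str, username: str = "") -> bool:
    """Flag blank, very short, common-wordlist, or username-derived passwords."""
    p = password.strip().lower()
    return (not p                                   # blank password
            or len(p) < 8                           # too short
            or p in COMMON                          # on the common list
            or (bool(username) and username.lower() in p))  # contains the username
```

Run against an account database, a check like this surfaces exactly the accounts Neumann describes: the ones that make every other defense irrelevant.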
Assessing the Threat
CIO Insight: How likely is it now that there will be some kind of organized effort by malicious groups to hurt this country or parts of our infrastructure?
Neumann: It's very dangerous to say this is our most important threat. Some massive attack on our computer communication infrastructures could easily occur in combination with attacks on various other things. I think if you realize that essentially all of the critical infrastructures are tremendously dependent on computers and communications and electrical power, the fact that all of these things are so closely coupled suggests that an attack could be multithreaded, something that happens in many places simultaneously, thereby paralyzing your ability to respond.
CIO Insight: Does scenario planning become a more important tool?
Neumann: The problem is that if you have a lot of scenario planning, a lot of these scenarios seem very far-fetched until they've happened. And the fact that there has been scenario planning that anticipated many of the things that did happen merely suggests that they were incredible at the time, and suddenly they become credible.
The assumption prior to Sept. 11 was that hijackers would always want to land the plane safely. That was an assumption that was embedded into everything that was done. Assumptions suddenly changed dramatically. Now the trouble with all the scenario planning is that you make a lot of assumptions. If your assumptions are wrong, your analysis is completely wrong.
Correcting Business Assumptions
CIO Insight: What are some of the business assumptions that were wrong?
Neumann: We tend to believe that our electrical power and our telephone systems will work. We have come to believe that our telephone systems work even when the electrical power goes down. Of course, we had the case in New York City where they were running on backup batteries and didn't realize it and shut down the telephone system.
CIO Insight: Are we getting any better in formulating assumptions?
Neumann: Most people don't think out of the box enough. They are thinking relative to what their experience has been in the past. One of the problems I have continually faced over the years in dealing with risks to computer-related systems is that we see the same kind of problem happening over and over again. In 1980 it was the ARPAnet collapse, resulting from a single point failure in one node bringing down the entire network.
In 1990 we had the AT&T long-distance collapse, where a single point failure brought down the entire network for half a day. Now in some sense we don't design our infrastructures and our systems to withstand things as simple as a single point failure. The standard answer is, this can never happen. That was said prior to the collapse in 1980, and it was said prior to the AT&T collapse in 1990: there's no way in which a single point failure can bring down the entire network, and yet that's exactly what happened.
So the idea that we can design perfect systems is one of the fundamental fallacies. No system is perfect. Every system has the ability to collapse on its own, irrespective of being misused by insiders or penetrators. And then it's vulnerable to misuse. The idea that we have a foolproof solution to this problem is very naive.
CIO Insight: What can we do to counter the head-in-the-sand attitude?
Neumann: Now you're getting into a very sticky wicket. The critical infrastructures are mostly privatized, if not completely privatized, and the government has very little ability to jawbone the critical infrastructure providers into doing things significantly more robustly. If you look at electrical power, 10 years ago there was a tremendous amount of surplus power. Now there's almost no surplus power because from a competitive point of view you can get it from elsewhere, so why should you overproduce? This means when crisis hits, there's no surplus.
You're getting into the whole issue of privatization versus nationalization. The government needs to have controls, some regulations on the sensible management of the critical infrastructures. You can't deregulate everything to the point that everything is free enterprise. And that's a sticky wicket in our current setting.
CIO Insight: If I'm a technologist working for a company, how do I respond to this? I've got a limited budget, I've got a bunch of managers with their heads in the sand. I'm not kidding myself about the extent of the complexity. As a responsible person, what should I be thinking about?
Neumann: Well, you need an enlightened CEO to start with, one who is not so greedy that his only real motivation is putting more money in his pocket. You need somebody who's enlightened enough to realize that if you don't do some of the protective things, the entire company goes down in flames. And the CIO needs to have real access to the CEO, not just in name: you're the CIO, you can do anything as long as it doesn't increase the budget.
The CEO has to be responsible to the purpose of the company, not just to the stockholders, and not just to his own pocket. There are a lot of things that need to be adjusted here, but one of the most fundamental ones is that the CIO has to have a great deal of vision as to what the risks are, and especially the risks of not doing things. And the CEO has to be responsive to that.
CIO Insight: Having seen these vulnerabilities for so long, and having watched people continuously put their heads in the sand, do you see ways to motivate people to do things differently?
Neumann: The biggest incentive is being burned. Once you've been burned, you at least respond to the thing that caused you to get burned in that particular way. It doesn't mean that you look at the broader picture and think about how else could you be burned. But the effect of having been burned does dramatically increase your awareness in one respect, namely, the particular way in which you were burned is something you think you'd better prevent in the future.
And I think the risk there is in trying to find simplistic solutions such as banning plastic knives on airplanes or very high-tech solutions such as the face-recognition stuff, which doesn't work very well, with a database of something like a couple of dozen terrorists and a false positive rate of 90 percent or something like that, whatever the rates are.
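Neumann's skepticism about face recognition is at bottom a base-rate argument, and it can be made concrete with back-of-the-envelope arithmetic. The numbers below (airport throughput, detection rate, a false positive rate far better than the 90 percent he cites) are illustrative assumptions, not figures from the interview:

```python
# Base-rate arithmetic for a face-recognition watchlist.
# All numbers are assumptions for the sake of the example.

watchlist_size = 24            # "a couple of dozen terrorists"
travelers_per_day = 100_000    # hypothetical airport throughput
true_positive_rate = 0.90      # system flags 90% of actual matches
false_positive_rate = 0.01     # even 1% is far more generous than the rates Neumann cites

# Suppose at most one watchlisted person passes through per day.
expected_true_alarms = 1 * true_positive_rate
expected_false_alarms = (travelers_per_day - 1) * false_positive_rate

# Fraction of alarms that point at a real match.
precision = expected_true_alarms / (expected_true_alarms + expected_false_alarms)
print(f"False alarms per day: {expected_false_alarms:.0f}")
print(f"Chance a given alarm is real: {precision:.4%}")
```

Even with these charitable assumptions, roughly a thousand innocent travelers get flagged for every genuine match, which is the practical sense in which the technology "doesn't work very well."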
CIO Insight: A number of CIOs have said that one of the lessons is to back up your backup systems, and some are talking about distributed work forces. Others are talking about different ways of managing in a crisis. And others are talking about an array of wireless technologies.
Neumann: All of the things you mentioned are useful if they're done intelligently, as parts of the approach. But there is no easy answer, and you can't just say, we've done this one thing and that's going to protect us now. The real question is: Do you understand what's going on? You do that by going through different kinds of scenarios or just reading the experiences of others. At least it gives you an idea of what other people have experienced. But that's not enough, because there are all kinds of failure modes and attack modes that we know about in the research community that haven't occurred in the wild yet. The virus thing is an example. In concept there are all kinds of viruses that we haven't seen yet, but potentially there are huge opportunities for Trojan horses and worms.
CIO Insight: So it is just a matter of time before a terrorist group uses some of those against us?
Neumann: I would say in combination with other things those techniques could be effective. Now you mentioned a backup. One of the risks is in believing that your backup system is robust.
Suppose a terrorist organization or a computer criminal or whatever decides what he's going to do to you is contaminate your backup for maybe six months or a year before the attack takes place. All your backup facilities are going to fail when you try to resort to them. Now this is a serious problem, and it raises the question of how organized and how carefully planned an attack might be.
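One partial defense against the contaminated-backup scenario Neumann describes is to record cryptographic digests of each backup at creation time and keep the manifest somewhere the attacker can't reach (offline, ideally). A minimal sketch, assuming backups are ordinary files on disk; the manifest format and function names are my own:

```python
import hashlib
import json
from pathlib import Path

def digest_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_manifest(backup_dir: Path, manifest_path: Path) -> None:
    """Digest every file under backup_dir; store the manifest apart from the backups."""
    manifest = {str(p): digest_file(p)
                for p in sorted(backup_dir.rglob("*")) if p.is_file()}
    manifest_path.write_text(json.dumps(manifest, indent=2))

def verify_manifest(manifest_path: Path) -> list:
    """Return the paths whose current digest no longer matches the manifest."""
    manifest = json.loads(manifest_path.read_text())
    return [p for p, d in manifest.items()
            if not Path(p).is_file() or digest_file(Path(p)) != d]
```

This only detects tampering, it doesn't prevent it, and it assumes the manifest itself stays uncontaminated — which is exactly why it has to live offline.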
If you start thinking about scenarios of coordinated, carefully planned, long-term attacks, it changes the picture. The problem here is that the defenders are at a serious disadvantage when it comes to malicious attacks, particularly with respect to computer systems, because there are so many vulnerabilities.
There are, though, still a lot of heads in the sand. When you look at the big picture, you want systems that are secure, reliable, survivable in the face of all kinds of adversities, malicious and accidental and environmental and whatever, including floods and earthquakes and so on. It's a very difficult problem to get your hands around. And so the tendency is to put your efforts into the things you understand best and ignore everything else and just hope that everything else will be irrelevant. You've got to think out of the box in this business.