Expert Voice: Peter G. Neumann on Securing our Data Grid
By Peter Neumann | Posted 09-01-2001
Peter G. Neumann isn't your typical computer nerd. For starters, the principal scientist at SRI International's elite Computer Science Laboratory plays the bassoon, piano, French horn and a variety of other instruments. He writes poetry and music, and he's good enough at softball to have started a triple play at third base that involved movie director Spike Lee three summers ago in a pickup game on Martha's Vineyard. He's also an inveterate punster, billing himself on his Web site as an "eclectical engineer," a reference to his diverse interests and talents.
The 69-year-old Neumann is a world-renowned expert on computer security, privacy and systems who has been on various computing and Internet advisory groups, including one in the mid-1990s to advise the commissioner on service problems at the Internal Revenue Service. In July, he was awarded a contract by the Defense Advanced Research Projects Agency to help ease extensive security problems in the nation's increasingly complex, hacker-prone, bug-ridden data infrastructure.
But perhaps Neumann's most visible role beyond the rarified air of computing is as moderator of a popular Internet newsgroup he has run for the past 17 years called "The Risks Forum." His "Illustrative Risks to the Public in the Use of Computer Systems and Related Technologies" cites thousands of incidents of flawed information technology and systems, from software-triggered glitches in heart pacemakers to the recent Code Red worm, versions I and II.
What worries Neumann the most, however, isn't the smart hacker who can penetrate and plant "trapdoors" nearly anywhere, then return undetected at any time, as with Code Red. "That stuff is trivial," he says. What really worries him, Neumann told Executive Editor Marcia Stepanek during a recent interview in his paper-cluttered Silicon Valley office, is "the big one that could have a major effect on everyone's lives."
CIO INSIGHT: You once referred to yourself as a Cassandra, a prophet who warns about the risks to the public in the use of digital technology, only to find your warnings fall, mostly, on deaf ears. How bad is the problem?
NEUMANN: Essentially, every computer system is breakable. And that brings us to the bottom line, which is that our information infrastructure is flaky and weak, and we've got to do something dramatic to improve it. It will require intelligence on the part of developers, CIOs and customers. If none of them are demanding better quality and security, we'll never get it.
True, most attacks so far have not been devastating. But the potential is there for massive disruption of the U.S. infrastructure and the economy. The National Security Agency, Central Intelligence Agency, Federal Bureau of Investigation, U.S. Department of Justice, Air Force and many corporations have had their Web sites broken into. Is this serious? Yes, because it's symptomatic of the enormous number of vulnerabilities that exist in these systems. If all of the examples of break-ins today have been merely ankle-biting, then you have to think about the bigger picture of, say, corporate espionage, terrorism, foreign nations bringing down infrastructures in other nations.
What is chiefly responsible for the increasing vulnerability of our information infrastructure?
Stupidity. In the past four or five years, I think, more than half of the security vulnerabilities reported by Carnegie Mellon University's Computer Emergency Response Team have been due to buffer overflows: missing bounds checks and other related flaws. We've known about them for 40-some years, and yet they persist. My point is that these events are symptomatic of the fact that the state of the art of software engineering is abysmal. Now, you'd like to think that when you're dealing with life-critical systems, mission-critical systems, military-critical systems, money-critical systems, that the software engineering would be dramatically better. It's not. There's an enormous amount of stuff that keeps happening over and over again, the same kinds of failures. We never learn.
People still install systems with the root password as delivered on the installation disk, for example. You'd be astounded at how many systems have this problem. Maintenance people use the same password on every system that they're maintaining all over the world for their own convenience. And the average CIO is told by vendors that everything is perfectly secure, which it isn't, and yet the CIO believes them. He wants to believe them because if he doesn't, he's got a problem.
The biggest problem today is that all of our infrastructure systems are increasingly dependent on one another. Transportation needs telecommunications, telecommunications need power generation and distribution, power needs energy sources, energy distribution needs telecommunications and transportation. All of these infrastructures need computers, and indeed, networked computers. It all runs now, increasingly, over some form of the Net.
We're dealing with problems where the entire enterprise, the entire corporation, the entire industry, the entire government, must be considered as a system. And you have to do some intelligent analysis of the risks in that system. And what we always discover is that there are enormous risks.
Is it possible to improve things without government regulation?
Yes and no. Regulation can work in many different dimensions. But regulation or oversight of some sort is probably necessary when dealing with life-critical systems. Regulation can discourage innovation, as it has to some extent in cryptography and security. Regulation can stimulate system interoperability, but it can also hinder it. Unfortunately, governments are not very good at long-term thinking, and more of that is needed. It also takes a lot of altruism on the part of system developers who need to build things that are really robust, rather than just good enough to be marketable. We might need some software lemon laws. Perhaps the best example of what can be done without government regulation (and to some extent in spite of government regulation) is the highly collaborative open-source and free software movement, which is heading in a very important direction.
You're a well-known proponent of the open-source movement. Is that a solution here?
Open source by itself is not a panacea because it still requires all the discipline in software development, maintenance, administration, patching and so forth. But it gives you the opportunity to fix things. If you have any skills in-house, you can repair things that a vendor of proprietary software won't let you repair. If you have some intelligent architecture, and operation and system administration, and intelligent CIOs, there are opportunities to create systems that are substantially better than your typical off-the-shelf stuff.
If you look at the networking server marketplace, the open-source community, Linux, UNIX, whatever, is now in excess of half of the market, which is an interesting turn of events. That's certainly going to get the commercial vendors much more up in arms as to what they can do to combat it. But, unfortunately, the knee-jerk reaction is, "Well, we need proprietary stuff because that way there's somebody behind it." The real challenge for the open-source community is providing support, maintenance, integrity, assurance, security and reliability that you can't get out of some of the proprietary commercial stuff.
What can a CIO do in this instance?
In many procurements, there's a security requirement, and the vendors win the bids based on their proposals, and then they say: "Oh, by the way, we don't think we can meet the security requirement. We have to back off on it a little bit." And the organization that made this procurement says: "Well, we desperately need the system, so we'll have to back off on the security part." This has happened over and over again. After it's developed or sometimes in operation, they test it and say: "Well, gee, this is not doing what it was supposed to." And folks knew this all along.
How about contracts with real penalties in them or how about telling a vendor that they don't get paid unless this thing really works? Now, the problem there is that the requirements for those tests are usually inadequate. So the vendor builds the system to minimally meet the requirements but not to do what it ought to do. I've had several cases of that, cases where it was obvious from the beginning that the system wouldn't do what it needed to do, but it would meet specs. And there are a couple of notorious cases where the vendor knew all along that there was no way this was going to work, but the letter of the contract was such that they would win big on lawsuits and incremental contracts.
But on balance, the laws never quite make the difference in what is needed. The legal expenses of litigating are horrendous. It can drag on for years and years. The insurance companies do have a role. Organizations that impose standards have a role, but the standards tend to be minimal. And unenforceable.
You've been named to help lead a project sponsored by DARPA. What's the idea behind the effort?
People in government who get it recognize that they're not getting what they need out of mass-market operating system developments. So, they've issued 10 contracts that relate to the development of secure, open-source operating systems. This is an opportunity for quite a few folks to take advantage of the research that we have known and loved for the past many years, and take the concept of the open-source movement and develop systems that are substantially better, ideally, than what can be bought or rented or leased out of the proprietary software development community.
This is a huge challenge, and it's one that I think DARPA has addressed primarily because it cannot get the very high integrity, high reliability and high security that they really need for their mission-critical applications. And this will also, hopefully, pressure mass-market proprietary software developers to do a better job.
Do you see a solution on the horizon?
What is most frustrating is when you are trying to tell somebody they have a problem, and they insist that there's no problem. I refer back to Cassandra, to whom nobody wanted to listen, but she was usually right. You never know where it's going to hit you, but you have to assume that it will. If you accept from the beginning that you're vulnerable, then you have to think about things differently. I'm suggesting that more people have to start thinking differently or we'll never crack the security problem. The Internet is waiting for its Chernobyl, and I think we won't have to wait much longer. We're already very close to the edge.
In November 1952, when Peter G. Neumann was a junior at Harvard University, Albert Einstein paid a visit to the family's apartment in New York City's Greenwich Village. Neumann's mother, a noted mosaic artist, had eight years earlier rendered Einstein's visage in Italian tiles, a work that now hangs in the reference room of the main library at Boston University. Neumann's two-hour conversation with Einstein left an indelible mark on the young Neumann. "We talked about music and art, and also about complexity," Neumann recalls. "It had a huge impact on my subsequent approach to computer systems and my life." What stuck, in particular, was Einstein's playful admonition that "everything should be made as simple as possible, but no simpler."
In this month's issue, Neumann, 69, the principal scientist at SRI International's Computer Science Laboratory, evokes that lesson in a passionate discussion of how the world's information infrastructure is becoming increasingly complex and fragile, a catastrophe waiting to happen. "The quality of systems and of software is horrendous and the state of engineering is abysmal," says Neumann.
In July, Neumann was named by the Defense Advanced Research Projects Agency to head up a new global project aimed at building a safer, more secure data infrastructure. But Neumann, a passionate voice in the open-source software movement, is also a self-described Cassandra, fairly certain of the inevitable problems that will continue to occur as a result of increasingly complex systems. "The Internet is waiting for its Chernobyl," he says, "and I don't think we'll be waiting much longer."