Heads in the Sand
Meanwhile, more than half of the respondents said their IT departments had not performed a formal risk assessment to gauge their organization's level of security risk. Says AT&T Labs' Bellovin: "The fact that viruses and bugs are showing up inside firewalls, which so many CIOs still believe will protect them, is a real warning sign." Adds Schneier: "Most CIOs still have their heads in the sand."
CERT's Spafford says the problem starts with programmers who don't understand security issues, and system administrators and CIOs who are either insufficiently concerned about security or going to the wrong sources for advice and protection. "There are a lot of solutions out there that aren't too far removed from snake oil," he says.
In addition, too many CIOs think most security threats are overstated, whether by government officials, in the media, or by vendors and consultants with something to sell. "All this talk about the sky falling or our lack of security on the public Internet destroying us is nonsense," says Craig Miller, CTO of Johannesburg, South Africa-based Dimension Data p.l.c., an Internet services company. "The fact is we now have the knowledge to engineer around it. Provided everybody is diligent, you can keep things working."
But there's the rub: Not all companies are diligent. The threats of "doomsday" worms, breakdowns and break-ins have become all too real, and companies can no longer afford to rely on one-time technological fixes or vendor assurances that aren't backed up by at least minimal service guarantees. The real danger is complacency, and that's especially bad news if the lines of code guarding the gates are riddled with defects.
Research conducted in 2000 by Watts Humphrey, a fellow at the Software Engineering Institute and formerly IBM's director of programming quality and process, showed that programmers inject an average of 100 defects into every 1,000 lines of code they write. Most programmers write code as fast as they can, compile it and worry about quality only later, once the product is in testing. "The fundamental problem," says Humphrey, "is that many defective programs work. Nobody catches the flaws. Nobody looks for them. So it's like a minefield that explodes on you later. More than 90 percent of all break-ins are taking advantage of defects in software. That's all Code Red is-something that takes advantage of a very common defect."
To be sure, the modern world would be unimaginable without software and the Net. But the challenges of the new millennium, say experts, will be to exterminate the most pernicious bugs, bolster security and bring software quality and security to the same levels of safety and reliability that we already expect from our cars, refrigerators and televisions.
Unfortunately, software is exempt from most lemon laws and product liability suits, and software engineering remains a fairly primitive activity. The vast majority of computer code is still handcrafted from raw programming languages by artisans using techniques they neither measure nor can repeat consistently. "It's like musket-making before Eli Whitney," says Brad Cox, a professor and computer and distance-learning expert. "If we're ever going to lick this security crisis, we're going to have to stop this hand-to-mouth, every-programmer-builds-everything-from-the-ground-up, preindustrial approach."
The inconsistency of software development and the growing complexity of systems of every kind can turn even the simplest software programs, over time, into Russian roulette. Consider Microsoft software. More features mean more lines of computer code and, thus, more risk. Windows 3.1, released in 1992, had some 3 million lines of computer code. Windows NT 4.0 had 16 million lines of code. Windows 2000 has about 42 million lines. Security experts say that any program with one bug in 10,000 lines of code is unusually well-written. The new Star Wars initiative would require 60 million lines of code and be expected to work right the first time it's deployed. "The chance of 60 million lines of code working correctly the first time is so small as to be less than the molecular density of an atom," says Peter G. Neumann, a security expert and the principal scientist at SRI International's Computer Science Laboratory.

In 1998, an AT&T frame-relay network collapsed, taking down long-distance data networks worldwide for up to 26 hours in some locations and costing companies billions of dollars. The problem, according to Neumann: a four-line patch of code that wasn't configured correctly. "There were no hackers involved in that one, just sloppy code," he says.
Indeed, rather than boosting software quality, we mostly issue patches, more and more every year. Microsoft has issued more than 40 patches this year for its most common platforms: Windows, Internet Explorer and Outlook. More than 70,000 known virus attacks exploit established vulnerabilities in Microsoft software, says CERT's Spafford, but "how many people and companies can install a new patch every five days?"
Even the experts can get foiled. At a gathering of 2,500 security specialists at an Internet Engineering Task Force meeting in London in July, more than a dozen laptops were infected with Code Red; in some cases, the infection spread to attendees' companies as e-mail was sent to and from the conference. "We never learn," Neumann says. "Twenty years ago we knew about buffer overflow problems," the type of crude software bug that enabled this summer's Code Red I and Code Red II worms. "That we still have such problems is reflective of the abysmal state of engineering."
Adds Paul Strassmann, a former CIO for Xerox and for the Defense Department who now heads a private consulting firm: "Software easily rates among the most poorly constructed, unreliable and least maintainable technological artifacts invented by man." And costly. All told, says The Standish Group, bad software cost U.S. businesses $100 billion in lost productivity last year.
None of this needs to be as bad as it is, according to those who scrutinize the way software gets created and developed. Humphrey, author of an upcoming book aimed at CIOs, titled Winning With Software: An Executive Strategy, coaches engineers at big companies such as Honeywell and Xerox to think differently and to demand software quality from vendors. "It's the only way the fixes will get made, up front, and companies and industries can become less vulnerable to cyberattacks," Humphrey says. "As long as customers don't demand quality, they won't get it, and they'll be vulnerable."
This article was originally published on 09-01-2001