Analysis: Data Infrastructure
By Keith Epstein | Posted 09-01-2001
It happened over a winter weekend in 2000, in a small bedroom in rural Wales littered with cigarette butts, New Age books and empty soda cans. There, using a no-name computer cobbled together from spare parts, 18-year-old Raphael Gray sat down at his keyboard and, fueled by what he called "sheer adrenaline," proceeded over the next 48 hours to click and pry his way through state-of-the-art firewalls located on corporate servers and databanks halfway round the world.
Before he finished, Gray, a.k.a. "Curador," had lifted some 26,000 credit card numbers from corporate Web sites hosted by companies in five countries-all by exploiting years-old software bugs that most teen hackers learn about before they can drive. "My initial reaction was that this couldn't be happening," recalls Chris Keller, founder of Buffalo, N.Y.-based SalesGate, Inc., one of the companies Curador hacked that weekend. "We had things set up and things in place so that kind of thing would never happen."
But it does-much more than many CIOs may realize. It's not news that hackers are often way ahead at exploiting the ever-growing number of bugs in complex software programs that can turn even state-of-the-art firewalls into the cyber equivalents of Swiss cheese. As you read this, there are at least 50,000 computer viruses crawling along the branches of the Web searching for glitches to exploit, and hundreds more are created each day-with graffiti-like names such as Chernobyl, SirCam and LoveLetter-and, this just in, a new wireless cracking tool called AirSnort, released on the Internet in late August. The program lets "whackers"-wireless hackers-snatch sensitive data as it is transmitted through the air.
But the Raphael Grays and AirSnort programs of the world are not what worry the nation's top information scientists and security experts the most. "All that stuff is child's play," says Eugene Spafford, co-founder of Carnegie-Mellon University's national Computer Emergency Response Team and the director of the Center for Education and Research in Information Assurance and Security at Purdue University. Every other kid, he says, can hack into a computer network, download ready-made viruses off the Net or flood servers to a standstill with a barrage of fake e-mail.
What really worries security experts like Spafford is something far more threatening: Everyone from teenagers to terrorists and hostile governments now has the ability to blast away at the very foundations of the nation's fragile digital grid by crashing satellite systems, unplugging the Federal Reserve System from Wall Street, even taking down the phone system and disrupting the movements of the stock exchange. Indeed, some of these things are already happening, the result of either hacking or random software bugs that bite without warning. On June 28 and 29, Nasdaq trading was disrupted for nearly two hours when software bugs surfaced during routine testing of the exchange's systems. During the peak of California's electricity crisis, prankster hackers cracked into a segment of the electrical grid and left messages that made it clear that with a little more sophistication, they could cripple power plants, water systems and hospitals. Perhaps most frightening is an operation the Pentagon calls Moonlight Maze. In 1998, the Defense Department discovered that professional hackers using Internet connections based in Russia had been stealing secrets from the DOD and its top research labs. The hackers remain at large and their efforts are continuing, unabated.
Much of the problem, experts say, is bad software and the realization that data systems-much like the physical infrastructure of roads, bridges and electrical power grids-weren't built very well in the first place. "The reason hackers can feast is because software bugs give them entry, and patches give them entry again," says Spafford. For example, of 1,200 Department of Commerce workstations scanned recently for problems, fully 30 percent had Category Red vulnerabilities, meaning they could be hacked by any grade-schooler with a penchant for trouble.
With thousands of computers now guiding ambulances, bank accounts, police dispatch units, fire brigades, transportation switches, overnight-mail dispatching schedules, nuclear power plant fail-safe devices and telecommunications grids, government and corporate interests are becoming intertwined as never before-and equally susceptible to politically motivated attacks and error-spurred outages. In short, whatever threatens our vast data grid, what's bad for the Pentagon and the Department of Energy is very bad for General Motors and the rest of corporate America. "We're all connected now by information networks in ways we've never been before," Senator Robert Bennett (R-Utah) told a Senate panel on security in July. John Tritak, director of the Department of Commerce's Critical Infrastructure Assurance Office, describes the increasingly interconnected world as "a growing web of dependencies in a veritable digital nervous system," noting that "what happens in one sector could very well have serious impact on another." Nobody is immune from the risks, and there is no such thing as safe networking. "There is no Internet pixie dust. People who think there is absolute security protection are deluding themselves," says Steve Bellovin, a security expert at AT&T Labs.
The Internet boosts the vulnerability stakes. "Every year, there's new research, good technology and good products, yet every year the situation gets worse," says Bruce Schneier, a cryptographer and author of Secrets and Lies: Digital Security in a Networked World. Schneier says that as technology evolves into more sophisticated uses and flavors, it also becomes more complex-too complex to be secure. AT&T's Bellovin takes it even further, suggesting that we're only a few doomsday clicks away from the ability to bring down the entire Internet-and therefore, pretty much everything with it. "Systems and software complexity is the enemy," Schneier says. "The Internet is the most complex machine mankind has ever built. Every year it will get more and more complex and less and less secure."
But don't look to Uncle Sam for any quick fixes. So far, the government's efforts to fight back have been largely rhetorical and, say congressional auditors, frustrated by political infighting and ineptitude. According to a report issued in April by the General Accounting Office, the congressional watchdog agency, the three-year-old National Infrastructure Protection Center-created by the Clinton White House to mobilize government and the private sector to build a safer information grid-is being hampered by lack of expert staff, outdated computer equipment, bureaucratic snafus and chronic underfunding. "The NIPC has developed only limited capabilities for strategic analysis of threat and vulnerability and often is not able to provide timely information on changes in threat conditions or warnings of imminent attacks," said the report. Further, the GAO said, "the NIPC does not yet have adequate staff and technical expertise." Example: the position of chief of the analysis and warning section, which was to have been filled immediately by the Central Intelligence Agency, went vacant for nearly 18 months, and the agency continues to operate with only 13 of the 24 analysts that NIPC officials estimate are needed to develop even minimally sophisticated threat-analysis abilities. "As a result, there are no specific priorities, milestones or program performance measures to guide NIPC actions or to provide a basis for evaluating its progress," the GAO report said.
And don't look to private industry for help, either. GAO investigators also cited a distressing lack of industry cooperation with government emergency response teams. "Establishing the trusted relationships and information-sharing protocols necessary to support such coordination have met with mixed success," the report said. For example, while the Federal Bureau of Investigation has identified more than 5,000 "key assets" in the nation's critical infrastructures-from leading research labs and universities to hospitals, telecommunications lines, rail and shipping routes, air and data traffic, fiber-optic cables, chemical storage sites and medical supplies-agents "have not yet been successful in obtaining the agreement of the industry sectors responsible for these assets" to get a precise idea of where the grid is most vulnerable. In some instances, agents seeking to compile such data say they've had to rely on the Yellow Pages to get certain information about the companies. Further, only one industry, the electric utilities sector, is communicating regularly about potential intrusions and vulnerabilities with NIPC investigators. Everyone else-from telecommunications to manufacturing-is mostly holding back.
The private sector has its own problems to wrestle with. Many corporations are relying too heavily on technology to ease the security threat, experts say. And that is creating a Maginot Line mentality among many CIOs, lulling them into believing that their computers and systems are safe. In a February 2001 poll of 1,400 CIOs across the U.S. conducted by Menlo Park, Calif.-based RHI Consulting, 91 percent said their systems are secure from error-prone collapses and cyberattacks. In a June survey by CIO Insight, a majority of the 556 CIOs and senior IT executives polled rated security as a big issue for companies of all sizes, but said their non-IT colleagues simply didn't share their concern.
Heads in the Sand
Meanwhile, more than half of the respondents said their IT departments had not performed a formal risk assessment to check their organization's level of security risk. Says AT&T Labs' Bellovin: "The fact that viruses and bugs are showing up inside firewalls-which so many CIOs still believe will protect them-is a real warning sign." Adds Schneier: "Most CIOs still have their heads in the sand."
CERT's Spafford says the problem starts with programmers who don't understand security issues, and system administrators and CIOs who are either insufficiently concerned about security or going to the wrong sources for advice and protection. "There are a lot of solutions out there that aren't too far removed from snake oil," he says.
In addition, too many CIOs think most security threats are overstated-by government officials, in the media, and by vendors and consultants with something to sell. "All this talk about the sky falling or our lack of security on the public Internet destroying us is nonsense," says Craig Miller, a CTO for Johannesburg, South Africa-based Dimension Data p.l.c., an Internet services company. "The fact is we now have the knowledge to engineer around it. Provided everybody is diligent, you can keep things working."
But there's the rub: Not all companies are diligent. The threats of "doomsday" worms, breakdowns and break-ins have become all too real, and companies can no longer afford to rely on one-time technological fixes or vendor assurances that aren't backed up by at least minimal service guarantees. The real danger is complacency, and that's especially bad news if the lines of code guarding the gates are riddled with defects.
Research conducted in 2000 by Watts Humphrey, a fellow at the Software Engineering Institute and formerly IBM's director of programming quality and process, showed that programmers inject an average of 100 defects into every 1,000 lines of code they write. Most programmers write code as fast as they can, compile it and worry about quality only later, once the product is in tests. "The fundamental problem," says Humphrey, "is that many defective programs work. Nobody catches the flaws. Nobody looks for them. So it's like a minefield that explodes on you later. More than 90 percent of all break-ins are taking advantage of defects in software. That's all Code Red is-something that takes advantage of a very common defect."
To be sure, the modern world would be unimaginable without software and the Net. But the challenges of the new millennium, say experts, will be to exterminate the most pernicious bugs, bolster security and bring software quality and security to the same levels of safety and reliability that we already expect from our cars, refrigerators and televisions.
Unfortunately, software is exempt from most lemon laws and product liability suits, and software engineering remains a fairly primitive activity. The vast majority of computer code is still handcrafted from raw programming languages by artisans using techniques they neither measure nor can repeat consistently. "It's like musket-making before Eli Whitney," says Brad Cox, a professor and computer and distance-learning expert. "If we're ever going to lick this security crisis, we're going to have to stop this hand-to-mouth, every-programmer-builds-everything-from-the-ground-up, preindustrial approach."
The inconsistency of software development and the growing complexity of systems of every kind can turn even the simplest software programs, over time, into Russian roulette. Consider Microsoft software. More features mean more lines of computer code, and thus, more risk. Windows 3.1, released in 1992, had some 3 million lines of computer code. Windows NT 4.0 had 16 million lines of code. Windows 2000 has about 42 million lines. Security experts say that any program with one bug in 10,000 lines of code is unusually well-written. The new Star Wars initiative would require 60 million lines of code and be expected to work right the first time it's deployed. "The chance of 60 million lines of code working correctly the first time is so small as to be less than the molecular density of an atom," says Peter G. Neumann, a security expert and the principal scientist at SRI International's Computer Science Laboratory. In 1998, an AT&T frame-relay network collapsed, taking down long-distance data networks worldwide for up to 26 hours in some locations, causing companies to lose billions of dollars. The problem, according to Neumann: a four-line patch of code that wasn't configured correctly. "There were no hackers involved in that one, just sloppy code," he says.
Indeed, rather than boosting software quality, we mostly issue patches-more and more every year. More than 40 patches have been issued by Microsoft this year for its most common platforms, Windows, Internet Explorer and Outlook. More than 70,000 known viruses exploit established vulnerabilities in Microsoft software, says CERT's Spafford, but "how many people and companies can install a new patch every five days?"
Even the experts can get foiled. At a gathering of 2,500 security specialists at an Internet Engineering Task Force meeting in London in July, more than a dozen laptops were infected with Code Red; in some cases, the infection spread to the companies as e-mail was sent to and from the conference. "We never learn," Neumann says. "Twenty years ago we knew about buffer overflow problems"-the type of crude software bugs that enabled this summer's Code Red I and Code Red II software worms. "That we still have such problems is reflective of the abysmal state of engineering."
Adds Paul Strassmann, a former CIO for Xerox and for the Defense Department who now heads a private consulting firm: "Software easily rates among the most poorly constructed, unreliable and least maintainable technological artifacts invented by man." And costly. All told, says The Standish Group, bad software cost U.S. businesses $100 billion in lost productivity last year.
None of this needs to be as bad as it is, according to those who scrutinize the way software gets created and developed. Humphrey, author of an upcoming book aimed at CIOs, titled Winning With Software: An Executive Strategy, coaches engineers at big companies such as Honeywell and Xerox to think differently, and to demand software quality from vendors. "It's the only way the fixes will get made-up front-and companies and industries can become less vulnerable to cyberattacks," Humphrey says. "As long as customers don't demand quality, they won't get it, and they'll be vulnerable."
Patching and Praying
Before they buy and install new software, or agree to modifications, Humphrey suggests CIOs and system administrators ask vendors: Are your people trained to do quality work? Do you measure and manage quality? What quality measurements do you seek? How do you track performance against quality goals? What do you do if they're not met? What will you do if defects cause problems? Most CIOs, Humphrey says, "have no idea. Most of them don't even know it but they're suffering from software quality issues all the time."
The consequences can be severe. Software glitches are already "causing chaos and costing vast sums," he says. "You could go out of business over this. You could lose money, lose your competitive edge. If you don't pay attention to this, a lot of things could happen. Most companies today are in the software business, whether they know it or not-and to the extent they're dependent on the Internet, almost everyone is vulnerable."
Spafford agrees that CIOs need to do a better job boosting quality as a strategy to improve security. He also says he believes things are worse now than they have to be because most CIOs are "ignorant of the alternatives, the risk, and the bottom-line costs" of depending on inadequate software, systems and safeguards. "They aren't setting their priorities appropriately," he says. CIOs, he says, should be making purchases with "the whole picture in mind"-the stability, vulnerability and quality of the next generation of software under consideration.
But to secure systems correctly means having to spend more money, and often, it's not in a CIO's best political interest to do that. Meanwhile, convincing higher-ups that security is no longer just IT's responsibility can be difficult. More and more companies are shifting the security burden to specialists. International Data Corp. projects that aggregate global spending on IT security, already about $14 billion, will grow by nearly 37 percent annually, reaching $20 billion by 2004. In the U.S., the $3.4 billion e-security market is expected to exceed $8 billion over the next three years. Still, companies that outsource their security may be risking too much. "Why would anyone want to outsource the protection of their crown jewels?" says SRI's Neumann. "Whether it's outsourcing your code or your system administration or your management or your analysis, there are huge security vulnerabilities."
One way to get around the complexity-breeds-bugs problem, some security experts say, is the open-source movement, which draws programmers together from around the globe to continuously develop and debug major programs. The Net provides a platform for such collaboration and an instant feedback channel when things go wrong. But open-source server software like Linux, an increasingly popular alternative to closed-source Windows NT, for example, is no panacea. "It still requires all the discipline in software development and maintenance and administration and patching and all that stuff," Neumann says. "But it gives you a greater opportunity to fix things." In Neumann's view, it's better than proprietary code, which cannot be reviewed by a community.
Microsoft, for its part, insists that its products are kept "security strong" by patches and new versions of programs that correct old problems. And in fairness to Microsoft, experts say, quality comes at the cost of convenience to customers. Fixing all bugs before release would not only be impractical business-wise, but nearly impossible. "Tell someone at Microsoft to delay releasing a new product so that all the bugs can be worked out, and they'd promptly show you the door," Schneier acknowledges.
Schneier suggests that a good insurance policy might well be the ultimate way to force fast improvements in software quality. "What will happen when the CFO looks at his premium and realizes that it will go down 50 percent if he gets rid of all his insecure Windows operating systems and replaces them with a secure version of Linux?" Schneier asked members of the Senate Commerce Subcommittee on Science, Technology and Space during a July hearing on infrastructure vulnerabilities in Washington. "The choice of which operating system to use will no longer be 100 percent technical. Microsoft and other companies with shoddy security will start losing sales because companies didn't want to pay the insurance premiums."
For their part, insurance companies are starting to step up to the plate-but so far, few businesses are buying. In 2000, Marsh, Inc. and Lloyd's of London began offering e-business protection policies in conjunction with American International Group Inc., Chubb Corp. and Zurich North America Surety & Financial Enterprises. The policies cover privacy, content and software code infringement, attacks, viruses, programming errors, theft of information and fraud. Lloyd's said in May that its business for its e-Comprehensive policy has almost doubled in the past year. Still, a July 2001 survey by the Human Resource Institute and Eckerd College for the industry shows that less than 24 percent of businesses polled have business-interruption insurance, and less than 13 percent of those have virus transmission coverage.
Even so, there's still no substitute for better in-house security strategies and better systems management. Schneier, who developed the widely-known Twofish information-scrambling security code, personifies that thinking. He started out believing technology could trump humans in the fight for better security, but decided that mathematical algorithms were no match for the chaos of human error. So he founded his own security consultancy, Counterpane Internet Security, to fight the problem on a broader front. "Concentrate on human solutions and you'll get better results from your antivirus efforts," says Schneier, now Counterpane's CTO.
But don't tell hackers like "Count Zero," who developed the "Back Orifice" Trojan horse program, which allows hackers to spy on any computer running Windows 95 or later. In the coming age of information appliances, he told PBS Frontline in a documentary on hackers in February, "everything becomes computerized. Your refrigerator will tell your watch that you need milk, and your watch will then speak to you and say, 'Hey, why don't you go and pick up some milk?' All of this will be part of a global conversation that happens in this digital world. That's the main reason I'm convinced that this Internet world is just going to come crashing down."
Bravado? Maybe not. Says Spafford: "We're heading into a future where every 30 minutes there's a new computer virus, where there will be dozens of networkwide denial of service attacks every day, and where your personal data isn't safe because the system gets broken into. It's not just business that's in for it, but government agencies and utilities. We're building up an infrastructure on an incredibly unstable foundation. And so far, we're lucky none of these people playing around are serious anarchists or criminals. But how long do you think that's going to last?"
Keith Epstein is a freelance writer based in Fairfax, Va. CIO Insight researcher Barbra Kiss contributed to this report. Comments on this article can be sent to firstname.lastname@example.org.
What Can Go Wrong
SOURCE: INTERNET RISKS FORUM DIGEST / PETER G. NEUMANN / COMPUTER SCIENCE LAB / SRI INTERNATIONAL
In 1998, AT&T suffers a massive, software-triggered collapse of its frame-relay network. For up to 26 hours in some locations, hundreds of multinational companies can't send data between offices, losing billions of dollars.
Software coding errors in Therac-25 radiation therapy machines allow massive overdoses of radiation to be administered to cancer patients between June 1985 and January 1987, causing four deaths and one injury.
A crippling, deeply embedded software glitch in the system used by CTB/McGraw-Hill to grade standardized tests in 1999 makes scores lower than they actually are, mistakenly sending nearly 9,000 students from six states packing off to summer school.
Buggy software and failed efforts in 1996 to merge IT systems after Union Pacific Railroad acquired Southern Pacific triggered a 40-day freight-train gridlock throughout the Southwest. UP's cost: $633 million. Clients' cost: about $2 billion in lost sales.
The glitch-riddled, $191 million baggage-delivery system at Denver International Airport-100 networked computers, 5,000 electric eyes, 400 radio-frequency receivers and 56 barcode scanners-shreds and loses luggage, forcing a 16-month airport opening delay.
Because a computer processor crashes a dozen times on May 17, 1999, air traffic controllers in Philadelphia temporarily lose track of planes and erase some data on each flight. Similar glitches occur again three nights later.
E*Trade, Charles Schwab and Ameritrade suffer consecutive days of computer crashes in February 1999, caused by software bugs and a deluge of trading that exceeds system capabilities. Many E*Trade and Schwab customers lose thousands of dollars in mid-trade outages.
Empire Blue Cross is forced to write off $50 million in uncollected insurance claims due to faulty software dating back 10 years, which, among other problems, cannot compute numbers greater than 100,000.
In May 2000, the ILOVEYOU virus, written by a teenager in the Philippines, clones itself worldwide and strikes 45 million computers in 20 countries, wiping out eight kinds of files at hundreds of companies, including AT&T, Microsoft, Merrill Lynch and Ford Motor Co.
The First National Bank of Chicago wrongly deposits $763.9 billion into customer checking and savings accounts in 1996 due to a glitchy software upgrade.
General Motors Corp. recalls 292,860 Buicks, Oldsmobiles and Pontiacs in 1996 because of engine fire problems potentially triggered by bugs and coding errors in the models' Powertrain Control Module.
In September 1995 a software snag in Bell Atlantic's switches sends emergency 911 calls in Richmond, Va., to a customer named Rosa Dickson. For a frantic half hour, Dickson fields the urgent calls herself and passes messages to police.
During the Gulf War, a software glitch is partly to blame for throwing off a Patriot missile's timing by one-third of a second-enough to miss an Iraqi Scud missile that on Feb. 25, 1991, killed 28 soldiers and wounded 98 in Saudi Arabia.
In 1996, MCI refunds $44 million to customers who were charged an extra minute for collect calls during the previous three years due to a software problem.
Los Angeles County pension fund contributions over 20 years fall short by $1.2 billion, due to decades-old computer "calculation errors" that went undetected for years.
In 1990, AT&T long-distance lines are knocked out for 11 hours across the United States due to a minor programming error in switching software.
A Samsonite Corp. software system upgrade causes factory forklifts to run amok and computers to shut down, freezing deliveries of back-to-school orders for three weeks and hampering operations for months. Loss: $4 million in profits and $10 million in sales.
In January 1999, authorities discover that 600,000 cubic yards of silt had been dumped in the wrong spot off the Los Angeles coast, thanks to bad data fed to the Global Positioning System receiver used to locate the site.
Over 17 days during California's rolling blackouts in early 2001, hackers break into computers belonging to the Independent System Operator, which runs the state's electricity transmission grid.
In April 1999, the Chernobyl computer virus-timed to strike 13 years to the day after the Chernobyl nuclear disaster in Ukraine-wipes out hundreds of thousands of hard drives and crashes systems around the world, causing companies to lose billions of dollars.
In May 1999, a new air traffic control system breaks down, delaying 362 flights at the New York-JFK, Newark, Philadelphia and Washington, D.C. airports. Problems occur when new computer screens are hooked up to the FAA's mainframe.
Hershey Foods' $112 million ERP system collapses in July 1999 during the peak of the back-to-school and Halloween candy-buying season. Third-quarter profits drop by 19 percent.
A 16-year-old San Fernando Valley boy using his own personal Linux system in January 2000 hacks into Pacific Bell's Internet server and lifts 200,000 customer passwords; some 63,000 subscribers whose passwords he decrypted had to come up with new ones.
A failed software upgrade forces the New York Stock Exchange to completely shut down for nearly 90 minutes on June 8, 2001, making it impossible to calculate market indexes like the Dow and S&P 500.
The Melissa virus penetrates more than 1 million computers and causes an estimated $300 million in damage when it sweeps around the world in March 1999, paralyzing e-mail systems.
A pump-station software bug triggers a power outage that causes 5.4 million gallons of raw sewage to spill into the Willamette River in downtown Portland, Ore. in 1988.
Code Red's High-Speed Spread
On July 19, 2001, the first Code Red software worm spreads through certain Microsoft-engineered systems world-wide in less than 24 hours, turning some 341,015 business servers into electronic weapons aimed at Internet traffic and the White House Web site.
SOURCE: JEFF BROWN/DAVID MOORE; COOPERATIVE ASSOCIATION FOR INTERNET DATA ANALYSIS/UC REGENTS, 2001. MAPS RE-CREATED BY JACK HARRIS.
[Charts: security breaches reported, 1998 vs. 2000; vulnerabilities reported, 1995 vs. first half of 2001; security alerts published, 1998 vs. first half of 2001]
Source: Carnegie-Mellon University Computer Emergency Response Team Coordination Center (CERT/CC) statistics
A February 2001 poll of 1,400 CIOs across the nation reveals that 91% have confidence in their network security, despite estimates that billions of dollars are lost every year to cyber break-ins.
The survey, conducted by RHI Consulting, raises eyebrows among security experts who say it's generally in a CIO's best interest to keep quiet when security breaches occur.
The CERT Guide to System and Network Security Practices, by Julia H. Allen (Addison-Wesley, 2001)
Secrets and Lies: Digital Security in a Networked World, by Bruce Schneier (John Wiley & Sons, 2000)
The Mythical Man-Month: Essays on Software Engineering, by Frederick P. Brooks (Addison-Wesley, 1995)
Computer Related Risks, by Peter G. Neumann (Addison-Wesley, 1995)
Online compendium of computer-technology mishaps and disasters
Carnegie-Mellon University's national Computer Emergency Response Team
The National Infrastructure Protection Center