If you compare LEO with an off-the-shelf desktop PC of 2001, you see that 50 years of technological progress have increased memory capacity almost a million-fold and processor speed by a factor of roughly a thousand. Meanwhile, prices have fallen by another factor of a thousand. Although these measures of computing efficiency are rather crude, when you combine them in the obvious way, they suggest that one dollar's worth of computing today is equal to nearly a trillion dollars' worth in LEO's era. This represents an extraordinary record of sustained exponential growth in computer power per unit cost. But while toasting our good fortune, we might pause to ask why business productivity has not also increased a trillion-fold.
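The "obvious way" of combining those factors is simple multiplication. A back-of-the-envelope sketch, using only the round figures quoted above (the exact numbers are illustrative, not measured):

```python
# Rough factors of improvement quoted in the text, 1951 -> 2001.
memory_gain = 1_000_000  # memory capacity: almost a million-fold
speed_gain = 1_000       # processor speed: roughly a thousand-fold
price_drop = 1_000       # price: fallen by a factor of a thousand

# Combining them multiplicatively gives the gain in
# computing delivered per dollar.
value_per_dollar = memory_gain * speed_gain * price_drop
print(f"{value_per_dollar:.0e}")  # 1e+12 -- about a trillion
```

Whether memory and speed should really be multiplied together is debatable, which is why the text calls these measures of computing efficiency crude; but the product does land at the trillion-fold figure cited.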
The architects of the LEO system already had a glimmer of an answer to this question. They recognized that computing horsepower was of little use if it could not be effectively harnessed to the problem at hand. Caminer writes, "It was plain that applications for the computer had to be defined in such a way that an item of data, once taken into the system, must automatically be used and reused for every purpose in which it played a part."
John Aris, another LEO veteran, echoes that theme. "Getting anything into a computer is a major effort, so make the most of it," he says. "Understand the system in its entirety. Plan it as a whole. Rethink, rather than automate, what is there. Maximize the savings in clerical effort, stock holdings, response time to customers or whatever. Make the data sweat by producing management information as well as transactions. Don't leave what could be done effectively by computer to be done by hand."
In retrospect, all these entreaties to system integration and business re-engineering suggest a hypothesis: that the factor limiting computer efficiency was never memory capacity or processor speed, but rather the awkward interface between the computer and the outside world. In the 1950s, and for some decades afterward, the way to alleviate this congestion at the interface was to speed input and output. When a computer was viewed as a tool for converting a deck of punch cards into a heap of printouts, it made sense to invest first in faster keypunches and line printers.
Only in recent years has another strategy become possible: not to improve the interface but to eliminate it, by moving the whole core of the business permanently into the computer. Input and output cease to be a serious constraint when everything of importance lives and dies inside the machine. With ubiquitous networking, with industry standards for data interchange and electronic commerce, and with the attitude that digital information is primary and archival, the merits of this idea can finally be put to the test.
Brian Hayes writes about science, technology and mathematics for American Scientist magazine and other publications. Comments on this story can be sent to email@example.com.