Alan Kay is not a fan of the personal computer, though he did as much as anyone to create it. A winner of the Turing Award, computer scientist Kay led the Xerox PARC group that pioneered object-oriented programming and the graphical user interface, and he contributed to early work in 3D computer graphics and to the ARPA research community that built the ARPANET, the predecessor of the Internet. After helping to create the Alto, the Xerox PARC PC prototype that inspired the Apple Macintosh, he took on the role of chief scientist at Atari and became a fellow at Apple Computer, Walt Disney Co. and Hewlett-Packard Co.
While most people regard the personal computer as a modern miracle, Kay sees the PC as a chronic underachiever. To him it’s an invention that, like television, has fallen far short of the potential foreseen by its early proponents. Today, at age 66, Kay runs the Viewpoints Research Institute, his own nonprofit research organization in Glendale, Calif. He is busy with several projects involving education and technology, including the “One Laptop per Child” project overseen by MIT’s Nicholas Negroponte, which Kay hopes will one day transform the PC into a machine that not only changes the way we work, communicate and entertain ourselves, but improves how people—especially children—learn and think.
Kay believes the limitations of the PC are due as much to a lack of imagination and curiosity among computer scientists, the unwillingness of users to invest effort in learning to use computers, and the deadening impact of popular culture as to technical constraints. He says the push to make PCs easy to use has also made them less useful; their popularity has stunted their potential. Executive Editor Allan Alter spoke with Kay about the future of the PC. The following is an edited version of their discussion.
CIO Insight: Do you feel PCs and Macs have come close to reaching their potential yet?
Kay: No, I don’t think so. Computers are mostly used for static media, basically text, pictures, movies, music and so forth. The Internet is used as a distribution network, so computers are essentially players for this media. This is incredibly useful, but it tends to overwhelm uses that require a much longer learning curve.
When I started in computing in the early sixties, people realized that while the computer could simulate things we understood very well, one of its greatest uses was simulating things that we didn’t understand as well as we needed to. This has happened in the sciences; physicists, chemists, biologists and other scientists could not do what they’ve been doing if they didn’t have powerful computer simulations to go beyond what classical mathematics could do. But it’s the rare person who quests for knowledge and understanding.
A great thinker in our field is Doug Engelbart, who is mostly remembered for inventing the computer mouse. If you search Google you will find Doug’s Web page, where there are 75 essays about what personal computing should be about. And on one of the early hits you can watch the demo he gave in 1968 to 3,000 people in San Francisco, showing them what the world of the future would be like.
Engelbart, right from his very first proposal to ARPA [Advanced Research Projects Agency], said that when adults accomplish something that’s important, they almost always do it through some sort of group activity. If computing was going to amount to anything, it should be an amplifier of the collective intelligence of groups. But Engelbart pointed out that most organizations don’t really know what they know, and are poor at transmitting new ideas and new plans in a way that’s understandable. Organizations are mostly organized around their current goals. Some organizations have a part that tries to improve the process for attaining current goals. But very few organizations improve the process of figuring out what the goals should be.
Resources: Viewpoints Research Institute | Squeak eToys | Croquet | NSF Reinventing Programming Project | Doug Engelbart
Most of the ideas in that sphere, good ideas that would apply to business, were written down 40 years ago by Engelbart. But in the last few years I've been asking computer scientists and programmers whether they've ever typed E-N-G-E-L-B-A-R-T into Google, and none of them have. I don't think you could find a physicist who has not gone back and tried to find out what Newton actually did. It's unimaginable. Yet the computing profession acts as if there isn't anything to learn from the past, so most people haven't gone back and referenced what Engelbart thought.
The things that are wrong with the Web today are due to this lack of curiosity in the computing profession. And it’s very characteristic of a pop culture. Pop culture lives in the present; it doesn’t really live in the future or want to know about great ideas from the past. I’m saying there’s a lot of useful knowledge and wisdom out there for anybody who is curious, and who takes the time to do something other than just executing on some current plan. Cicero said, “Who knows only his own generation remains always a child.” People who live in the present often wind up exploiting the present to an extent that it starts removing the possibility of having a future.