Book Review: Microchip: An Idea, Its Genesis, and the Revolution It Created
Microchip: An Idea, Its Genesis, and the Revolution It Created
By Jeffrey Zygmont
Perseus Publishing, December 2002
272 pages, $25
It is safe to say that as he wandered around a sweltering Texas Instruments complex in Dallas in the summer of 1958, Jack Kilby wasn't thinking about a future in which his grandchildren would stroll through the mall watching their favorite movie on their own portable DVD players slung around their necks. He was looking for a way to do more interesting work—and move up the corporate ladder.
And there is little doubt that when Robert Noyce got fed up with the bizarre genius of William Shockley, and left Shockley Semiconductor to help found Fairchild Semiconductor in 1957, he wasn't imagining that the invention he was working on would be found everywhere from rocket ships to those annoying musical greeting cards. He was just trying to improve his lot in life.
But without Kilby and Noyce, who are jointly credited with the creation of the integrated circuit—better known as the microchip—what we call the "Information Age" would never have happened.
As business writer Zygmont, author of The VC Way, points out, "The current age is often called the Computer Revolution, but that's just because computers are the most obvious representation of the change. The boxes with keyboards and screens that sit on every desk didn't exist until the new capability called microchips made them possible."
And what made microchips possible was not a government-directed program or a fuzzy corporate mandate to be "more innovative." It was people maneuvering to do what they like to do best—and hoping to be rewarded for it.
Zygmont shines the spotlight on both little-known innovators and the better-known names who created the computer chip and figured out what to do with it. The first part of the book is devoted to relatively unknown heroes such as Kilby, who was awarded the Nobel Prize in Physics two years ago for building the first electronic circuit in which all of the components were fabricated in a single piece of semiconductor material. The author then moves on to the people who put the chip to work—innovators such as Gordon Moore of Intel and An Wang, who founded Wang Laboratories.
While the technology discussion may move too fast for non-techies, Zygmont does an excellent job of putting the early days of the semiconductor industry into perspective.
If that was all he did, the book would be worthwhile as history. But implicit in the tale is perhaps the best way to go about conducting technology research. Again and again in the microchip's history, a huge market need is identified—be it more reliable rockets in the 1960s, faster personal computers in the 1980s or more dependable and efficient cars today. Researchers at various semiconductor companies are then charged with solving the problem. They are given no prescribed course of action; instead, competitive pressure—be it between Texas Instruments and Fairchild in the early 1960s or between Intel and AMD today—is enough to make sure innovation occurs. And giving recognition and rewards makes sure that people work their hardest to achieve corporate goals.
Adam Smith's invisible hand works just as well in technology advancement as it does in economics—something all managers should keep in mind.
Reviewed by Paul B. Brown, the author of 13 business books including the international bestseller Customers for Life. Doubleday just published an updated version of the book, written with Carl Sewell.