Vint Cerf: Keeping the Internet Healthy
By Edward Cone | Posted 09-25-2008
Success is said to have a thousand fathers, and many people share credit for bringing the Internet into existence. Even so, the title "Father of the Internet" fits Vinton G. "Vint" Cerf better than most; he and Robert Kahn designed the TCP/IP protocols that govern data transfer across the Net, along with the Internet's basic architecture. The two men were awarded the Presidential Medal of Freedom, the nation's highest civilian honor, in 2005.
Cerf, who holds a Ph.D. in computer science, worked for many years at the U.S. Department of Defense's Advanced Research Projects Agency, where the Internet was incubated. Starting in the early 1980s, he held senior positions at MCI and the Corporation for National Research Initiatives. He's now vice president and chief Internet evangelist for Google, where he looks for "new enabling technologies and applications on the Internet and other platforms."
A doting parent to the Net, Cerf has served as chairman of the board of the Internet Corporation for Assigned Names and Numbers, founding president and board member of the Internet Society, and visiting scientist at the Jet Propulsion Laboratory. His list of professional associations and fellowships is almost as long as his list of awards, commendations and honorary degrees. He spoke about the future of the Internet, technology policy and competitiveness with senior writer Edward Cone. This is an edited version of their conversation.
CIO Insight: You've mentioned the importance of a national technology policy, but you favor a distributed approach.
Vint Cerf: I worry about the idea of trying to centralize everything. The Washington tactic is, when there's a problem, you appoint a czar, and the czar is responsible. It's like the War on Drugs, or the War on Poverty. But it never quite works; you don't get very good solutions.
That's because the economy is highly distributed, and our entire governmental structure is highly distributed, so what you're looking for is to infuse our very distributed environment with certain postures and principles that will influence people's decisions, whether it's a company CEO or a policy maker somewhere in the governmental structure, whether it's local or state or national.
Drawing on the technical community at different levels of government is what I'm looking for here. We have lost a great deal of that input over the course of the last eight years. I'd like to see the reconstitution of bodies providing technical input to policy makers. That input is really valuable, and not just at the national level, but at the state and maybe even the local level. When you're talking about infrastructure development, broadband access to the Internet, things of that kind, you want some locally sensible decision-making that's driven in part by technology and economics.
If you want to draw attention to the importance of technology in policy making at the national level, perhaps you do need to have a cabinet-level person. I would compare it to the evangelist position I have at Google. I don't make decisions--I don't believe it's appropriate--but I can lobby like crazy in every venue where people will listen. Encouraging people to draw on valuable, distributed resources of information strikes me as the most important outcome.
A National CTO?
The Obama campaign has talked about naming a national chief technology officer.
Cerf: If there were such a position, whether a CIO position or a CTO, as the Obama campaign refers to it, having that position in the cabinet leads to the question, What does that party actually do? Does that party have a budget? Will the organization formed under this position have authority for certain things and, if so, what will they be?
The worst thing is to have a position where all you can do is say "no," because if you say "yes," you can't afford to pay for anything. That's a source of frustration for a number of people in the private sector who serve as chief technology officers: If they don't have budget and staff, it's very hard to make something happen.
Your advocacy for network neutrality carries some weight, given your role in Internet history. What's your thinking on the issue?
Cerf: This is more complicated than it looks. The debate was boiled down to bumper stickers for a while, which was not helpful in terms of understanding what the issues are.
Openness to new applications, openness to devices that are compatible--those things are important to us. At Google, we take the view that the providers of Internet access should not take advantage of their access position to interfere with people offering competitive applications to the applications provided by the underlying transport and access provider. We don't think that's a good thing from the consumer point of view, and certainly not from the innovative point of view.
On the other hand, the opponents of this position argue that the Net neutrality people were saying that every bit has to be treated identically, and that you couldn't charge more for more capacity and so on, and that's not the case. I accept the idea that you do have to manage your network and there may be more demand than there is capacity. What you have to do at that point is somehow share the available capacity in a fair way. It might be that somebody has purchased higher capacity to the Net than you have, and that person should have more access to the Net than you do, even in congested conditions, but on a pro rata basis.
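The pro rata sharing Cerf describes can be sketched in a few lines. This is a minimal illustrative model, not an actual traffic-management implementation; the function name, user names and numbers are all hypothetical.

```python
def pro_rata_shares(capacity, purchased):
    """Split a congested link's capacity among users in proportion
    to the bandwidth tier each one purchased (illustrative model)."""
    total = sum(purchased.values())
    return {user: capacity * tier / total for user, tier in purchased.items()}

# Two users bought 10 Mbps and 30 Mbps tiers, but the congested link
# can carry only 20 Mbps. Both are throttled, pro rata.
shares = pro_rata_shares(20, {"alice": 10, "bob": 30})
print(shares)  # {'alice': 5.0, 'bob': 15.0}
```

The customer who bought the bigger tier still gets proportionally more under congestion, which is the fairness criterion Cerf is pointing at.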
Should the network access providers have the responsibility and the ability to defend against denial-of-service attacks or other problems? Absolutely. What about low-latency things like voice? There's no disagreement that you could treat some traffic differently; a bit of added delay is tolerable for things like file transfers or e-mail. This is a fairness issue. It doesn't mean you can't manage your network; it means you have to do it in a fair way.
You've sounded the alarm on another problem: the inaccessibility of data stored in outmoded formats, which is known as "bit rot."
Cerf: Bit rot is a stunningly big problem. It's very real for any company that plans to have any longevity. We're only seeing the beginning elements of it, and we're being inundated with new information. Over time, we will need increasing access to that older information. I'm already experiencing problems, like TIFF images that aren't interpretable, JPEGs that aren't interpretable as I move from one software base to another, and e-mail that isn't readable or has attachments that have gotten lost. Things like that are quite frustrating and critically troublesome.
We need to step back and think about how to combat this tendency to lose information. I suspect that there are some tough intellectual property issues built into this problem. What happens if a piece of software is no longer being supported; do we still have access to it? Under what ground rules and what conditions?
Is it required that it be made available as source code? Do you have to provide it online, in the cloud somehow, so that people have access to the functionality? I don't think there are any rules right now. I suspect that we need to ask what we should do in order to ensure that information that's important to us is accessible.
The historians, of course, are beside themselves because more and more information about our society is in online form. We start to lose track of what people did, and what actually happened, because we can't see it anymore, can't read it. Even though the bits are there, we don't know how to interpret them.
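One modest defense against the bit rot Cerf describes is to identify files by their leading "magic" bytes rather than trusting extensions, so an archive can be audited for formats that are drifting out of support. A minimal sketch; the signature table is deliberately tiny and the `identify` function is hypothetical:

```python
# Well-known file signatures ("magic" bytes) for a few common formats.
MAGIC = {
    b"\x89PNG\r\n\x1a\n": "PNG image",
    b"\xff\xd8\xff": "JPEG image",
    b"II*\x00": "TIFF image (little-endian)",
    b"MM\x00*": "TIFF image (big-endian)",
    b"%PDF": "PDF document",
}

def identify(header: bytes) -> str:
    """Classify a file by its first bytes; unknown formats are
    flagged as candidates for migration to a durable format."""
    for magic, name in MAGIC.items():
        if header.startswith(magic):
            return name
    return "unknown (candidate for migration or emulation)"

print(identify(b"%PDF-1.4 ..."))  # PDF document
```

Real preservation tools keep far larger signature registries, but the principle is the same: know what you have before the software that reads it disappears.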
If there were a chief technology officer in the new administration, that would be one of the areas where I'd encourage serious consideration, because it has the potential to be very damaging.
The Last Mile
What else should IT executives keep an eye on?
Cerf: One of the most important things CIOs should be asking themselves is, Are we ready for IP version 6?
And if we're not, why not, and what can we do to fix that? The reason that's so important is that the Internet cannot continue to grow effectively without the new address space. There are efforts going on to implement that, but it's absolutely critical that our business sector, the private sector, be prepared for operation of both IPv6 and IPv4. The Internet service providers need to start offering that service. Not very many of them are; they're claiming they don't see a market for it. The answer is: We're going to run out of v4 address space somewhere around 2011, and that's not very long from now in terms of preparing a fully operational IPv6 system running concurrently with IPv4. So please pay attention to that.
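The scale gap behind Cerf's warning is easy to see with Python's standard `ipaddress` module. This is just a back-of-the-envelope check; the `socket.has_ipv6` flag only reports whether the local stack was built with IPv6 support, not whether your provider offers it.

```python
import ipaddress
import socket

# The entire IPv4 space vs. the entire IPv6 space.
v4 = ipaddress.ip_network("0.0.0.0/0").num_addresses   # 2**32
v6 = ipaddress.ip_network("::/0").num_addresses        # 2**128

print(f"IPv4 addresses: {v4:,}")           # 4,294,967,296
print(f"IPv6 / IPv4 ratio: {v6 // v4:,}")  # 2**96, roughly 7.9e28

# Quick local readiness check: does this host's stack support IPv6 at all?
print("IPv6 support compiled in:", socket.has_ipv6)
```

Dual-stack operation means running both families side by side, which is why Cerf urges the private sector to prepare well before the v4 pool actually empties.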
We've spoken before about exaggerated claims that the Internet is ready to choke on traffic volume, especially video traffic. Those claims obscure real problems at the edge of the network, the so-called last mile.
Cerf: There is substantial capacity--potential, anyway--in the core of the Net. The edges are at issue, and part of the reason is that there are too few competitors providing service. In the United States, the idea that the Internet is choking at the edge of the Net might have some validity. Our delivery capacities are far less than what other countries and other Internet providers have been able to achieve.
This raises big questions. What kind of network environment, what kind of information environment, are we providing the general population and the business community in the United States? If the answer is a weaker, less-effective one than in other places in the world, is that going to disable us in some way? Is it going to retard our ability to be competitive?
It worries me that we are not showing the kind of capacity and economics that other places are. We have to guard against an argument that says, I can provide these kinds of capacities and capabilities only if you remove from me any responsibility for fairness or any responsibility for openness. That has been a thematic argument that many of the broadband providers have made over the course of the last decade. The consequence now is that we don't have very effective broadband services.
I no longer believe that intermodal competition is going to be a solution to the problem. That had been the thematic argument in the FCC for quite some time, that if they just relax all regulatory strictures, lots of people would jump in to offer broadband service. So we need other kinds of mechanisms for maintaining fair access to these resources, especially for value-added providers. We need to think more about the Internet infrastructure provision, maybe even go so far as to reconsider decisions that segregate basic and enhanced access, and common carriage requirements for people providing basic service.
If the objective is widespread broadband access to Internet service, then it's time we stepped back and asked ourselves, How do we achieve that? Because we don't seem to be moving down a path that's effective.
You helped launch the Internet for Everyone group, which aims to close the digital divide.
Cerf: Having access to information can make you a lot more competitive, and so from a national point of view, having a well-informed population is valuable. Unless the rest of the world is as enabled as we are, it doesn't represent a market for the things we can do with this online environment. So we care a lot about erasing the digital divide for the purely selfish reason of increasing our economic benefit. Of course, we hope that it also increases economic benefit for everyone else.
But I also believe that cooperation and sharing of information is by far the most powerful tool we've got. So, when people speak about competitiveness, I cringe a little bit, because part of the value of the Internet is its openness and ability to share information.
Future of The Web and Society
Should government play a role in building out infrastructure or is that best left to the private sector alone?
Cerf: It sometimes takes steps to illustrate the existence of a market to motivate the business sector. In the late 1980s, I asked the Federal Networking Council for permission to put a commercial electronic mail system up on the Internet. My motivation, in part, was to allow commercial traffic to flow on the government-sponsored backbone as a way of demonstrating to the business sector that there might be a market that [businesses] should invest in.
Getting rid of that barrier created an opportunity for commercial Internet service without having to build the backbone. Once that market was demonstrated, it didn't take long before the government said: Gee, we don't need our government-sponsored backbone anymore, because everybody can buy commercial service.
With Google unveiling its Android operating system to challenge the iPhone, I'm reminded of Jonathan Zittrain's thoughts on "generative" technologies--open platforms that allow people to tinker and innovate--versus closed or tightly controlled platforms like the iPhone. What kind of phone are you carrying, and what does it say about you?
Cerf: I use a RIM BlackBerry. I'm anticipating the use of an iPhone or something like it. What I'm eager for is a phone that runs the Android operating system, because of the openness of the design. It's the evolving flexibility of mobile platforms that's so critical.
One can understand some of the decision-making that went on at Apple when preparing the iPhone. A closed device has the benefit that people can't make changes to it that may cause it to stop working.
The counterpoint is that almost every information technology I can think of, as it becomes more useful and competitors arise, leads to demands from users that interoperability be paramount. In the case of the Internet, the TCP/IP protocols turned out to be demanded by the buyers of new equipment, so that they wouldn't be locked into any particular manufacturer. So standardization has this wonderful benefit of leading to interoperability, and it also creates a platform on top of which new innovations can happen. But there's this tension between differentiation and interworking that repeats itself over and over again as time goes on.
What do you see the Internet--and the society around it--looking like in 20 or 50 years?
Cerf: Looking ahead, we can say several things. There will be substantially more connectivity available. No matter where you are, you will have access to this online facility. That turns out to be very important, because the cloud computing notion has utility only if you can get access to it whenever you need it, in the capacity that you need it. I see a lot of utility in cloud computing, and I anticipate that it will be increasingly available.
Another change I'm pretty sure will happen over the course of the next 20 to 50 years is the way we interact with these online systems, or even with local ones. Today, it's keyboards and mice, but I expect interactions--conversational interactions, gestural interactions--to be normal. I may be personally instrumented in some ways, so that my locale is known, or at least my devices know where I am. That way, my questions can be related to this information, something like, Where is the nearest restaurant?
I expect to see much more interesting interactions, including the possibility of haptic interactions: touch. Not just touch screens, but the ability to remotely interact with things. Little robots, for example, that are instantiations of you and are remotely operated, giving you what is called telepresence. It's a step well beyond the kind of video telepresence we are accustomed to seeing today.
This image of little robots is different from the typical autonomous robot you see in the artificial intelligence world. It could be sitting in a conference room, representing me, not autonomously, but allowing me to be in more than one place at the same time. They could move around, interact with things, talk to people, see like everyone else can.