Monday, February 21, 2011

Internet pioneer wants to take the Net beyond the planet

Vint Cerf takes his title of Chief Internet Evangelist at Google seriously. He is involved in various projects to bring the next version of the Internet to the world--or in some cases, beyond the world and into the solar system. One of his pet projects is an extraterrestrial Internet that uses a protocol other than IP.

Cerf sat down with Network World's Cisco Subnet editor, Julie Bort, at the annual Digital Broadband Migration conference in Boulder, Colo., to discuss the interplanetary Internet, cloud computing standards, the Semantic Web and other topics.

PART 1: Cerf: Future of the Internet does not include an IPv7

Interview: Podcast and transcript

About a year ago, you started talking a lot about a concept called the "Interplanetary Internet," stretching the Internet so that it can reach outer space. What can you share about that project?

It's happening. It's not using the Internet Protocol. It's using the new Bundle Protocol, which was developed as part of the more general concept of delay- and disruption-tolerant networking.

Since 1998 we have recognized that the traditional Internet design implicitly assumed good, relatively low-latency connectivity, whereas in a space environment, at interplanetary distances, speed-of-light delays can run from minutes to days. We need this new Bundle Protocol to overcome the latencies and disconnections that result from celestial motion and from satellites moving in orbit.
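For a sense of scale, here is a back-of-the-envelope sketch (the distances are approximate orbital figures, not from the interview) showing that the one-way light delay to Mars alone ranges from roughly 3 to 22 minutes:

    # Illustrative one-way, speed-of-light delays at interplanetary distances.
    # Distances are approximate and not taken from the interview.
    C_KM_PER_S = 299_792  # speed of light in vacuum, km/s

    distances_km = {
        "Mars at closest approach": 54_600_000,
        "Mars at farthest approach": 401_000_000,
        "Neptune (average)": 4_500_000_000,
    }

    for body, km in distances_km.items():
        delay_min = km / C_KM_PER_S / 60
        print(f"{body}: ~{delay_min:.0f} minutes one way")

    # Mars at closest approach: ~3 minutes one way
    # Mars at farthest approach: ~22 minutes one way
    # Neptune (average): ~250 minutes one way

Round trips double those figures, and planetary rotation and orbital geometry add long windows with no contact at all, which is why a conversational protocol like TCP/IP breaks down at these distances.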

Bundle protocols are running onboard the International Space Station. They are running in a number of locations around the United States, in NASA laboratories and in academia. There is something called the Bundle Bone, an IPv6 backbone that connects a lot of these research activities to one another. There is at least one somewhat experimental Bundle Protocol implementation for the Android operating system, but it is not production quality, so it really needs to be redone/revisited.
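The core idea behind the Bundle Protocol (RFC 5050) is store-and-forward: a node holds a bundle, possibly for hours or days, until a contact with the next hop exists. The sketch below is purely illustrative--it is not the ISS, Android, or NASA implementations mentioned above, and all names, endpoint IDs, and fields are simplified assumptions:

    # Minimal sketch of the store-and-forward idea behind the Bundle Protocol.
    # Illustrative only; names and fields are invented and heavily simplified.
    from dataclasses import dataclass, field
    from time import time

    @dataclass
    class Bundle:
        source: str          # endpoint ID, e.g. "dtn://earth.gs/relay"
        destination: str     # endpoint ID, e.g. "dtn://mars.orbiter/science"
        payload: bytes
        created_at: float = field(default_factory=time)
        lifetime_s: float = 86_400  # keep trying for a day before expiring

    class DTNNode:
        """Holds bundles until a contact with the next hop is available."""
        def __init__(self, name: str):
            self.name = name
            self.queue: list[Bundle] = []

        def receive(self, bundle: Bundle) -> None:
            self.queue.append(bundle)  # store ...

        def forward(self, link_up: bool, next_hop: "DTNNode") -> None:
            if not link_up:
                return  # no contact: keep the bundle and wait
            for b in [b for b in self.queue if time() - b.created_at < b.lifetime_s]:
                next_hop.receive(b)    # ... and forward when a contact exists
                self.queue.remove(b)

    # Usage: a relay holds a bundle through an outage, then forwards it later.
    relay, orbiter = DTNNode("earth-relay"), DTNNode("mars-orbiter")
    relay.receive(Bundle("dtn://earth.gs/relay", "dtn://mars.orbiter/science", b"telemetry"))
    relay.forward(link_up=False, next_hop=orbiter)   # no contact, bundle stays queued
    relay.forward(link_up=True, next_hop=orbiter)    # delivered on the next contact

Contrast this with IP, where an unreachable next hop simply means the packet is dropped.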

There is a spacecraft called EPOXI, formerly the Deep Impact probe (it fired a penetrator into a comet a few years ago in order to expose the interior for spectrographic analysis). The probe is still in orbit around the Sun and visited comet Hartley 2 in November 2010. We uploaded the interplanetary protocols to the probe and ran tests with it at a one-way delay of about 80 seconds.

So in 2011, our initiative is to "space qualify" the protocols in order to standardize them and make them available to all the spacefaring countries. If they choose to adopt them, then potentially every probe launched from that time on will be interoperable from a communications standpoint. But perhaps more important, when the spacecraft have completed their primary missions, if they are still functionally operable--they have power, computer, communications--they can become nodes on an interplanetary backbone. So what can happen over time is that we can literally grow an interplanetary network capable of supporting both human and robotic exploration.

Part of the motivation for all of this is that space exploration until now has been supported by point-to-point radio links. We foresee much more complex missions that need a richer communications environment. We have also found that, because of the delay and disruption tolerance, we can get data back from the scientific missions.

Here on Earth, Google is engaged in numerous projects to speed up the Internet, such as the new protocol called SPDY. Should we be paying attention to SPDY, and is there much support for it?

Yes, you should pay attention. These are efforts by Google to make more efficient implementations of the Internet. A lot of this stuff is available via open source. It doesn't take many people at Google to make something happen; that is what's so cool about Google. You have a little "Sherpa team" that actually does the job.

There used to be a lot of talk about the Semantic Web. Is it still hot--or not?

Well, I don't know if it's still hot. I can tell you that Tim Berners-Lee is still very, very determined. He calls it "deep linking" now, and it has to do with how you identify data on the network so that you can converge or conjoin data from disparate sources and still make sense of it. My impression is that it's a hard slog, and it has been going on for almost a decade now. But Tim has been successful in the past, so I would not rule this out as a potential success, but it's a long haul.
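As a small, hypothetical sketch of what "conjoining data from disparate sources" can look like in practice, the example below uses the rdflib Python library; the URIs and the facts in it are invented for illustration and are not from the interview:

    # Two separately maintained datasets describe the same entity with the
    # same URI, so merging them yields one combined, queryable graph.
    from rdflib import Graph, Literal, Namespace, URIRef

    EX = Namespace("http://example.org/id/")

    # Dataset A: a staff directory knows the person's name.
    directory = Graph()
    directory.add((EX.vcerf, URIRef("http://xmlns.com/foaf/0.1/name"), Literal("Vint Cerf")))

    # Dataset B: a publications database knows a paper by the same person,
    # identified by the same URI.
    publications = Graph()
    publications.add((EX.vcerf, EX.authored, EX.paper42))
    publications.add((EX.paper42, EX.title, Literal("A Protocol for Packet Network Intercommunication")))

    # Because both graphs use the same identifier for the person, a simple
    # merge conjoins the disparate sources.
    merged = directory + publications
    for s, p, o in merged.triples((EX.vcerf, None, None)):
        print(s, p, o)

The hard part Cerf alludes to is not the merge itself but getting independent publishers to agree on, and consistently use, shared identifiers.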

Last year, you were talking a lot about cloud standards, and now it seems that Rackspace's OpenStack has gained a groundswell of support. Can you declare a winner?

I would not declare a winner yet, and it's not because I have a preference for something else. By my count, there must be 25 or 30 different groups that are looking at cloud standards. The real problem is going to be implementation and testing. Until we get some serious experience with clouds interacting with one another in various ways, I don't think we will know what works and what doesn't.

All of these efforts are laudable, but they are going to have to run in the real world before we can declare any winners. There is a real question of which features we're looking for.

Read more in the LAN & WAN section of Network World.

For more information on enterprise networking, go to Network World. Story copyright 2010 Network World Inc. All rights reserved.
