Abstract
In April 1998, then-Vice President Al Gore announced funding to develop a next-generation Internet infrastructure through an effort known as Internet2. "By expanding the boundaries of our scientific understanding," he promised, "we will unleash breakthroughs that will power American industries for years to come and give us all a higher quality of life."
Six years later, that promise hasn't yet been fully realized, but proponents of this high-speed infrastructure say things are coming along as planned. Industrial applications might be a good ways off, but Internet2 is nonetheless on the verge of fulfilling its mission to improve the quality of life, especially through academic research.
To ask how Internet2 helps academic research today "is sort of like asking: How does electricity help the academic research experience?" says Joe St. Sauver, director of user services and network applications at the University of Oregon Computing Center. In short, Internet2 products are now indispensable in the research environment.
More than 200 US universities lead Internet2, a not-for-profit consortium with many projects in the works. For the networking community, the centerpiece is Abilene, a high-performance networking infrastructure available to consortium members. Already very fast, Abilene is being upgraded. Soon it should be able to move data at 10 Gbits per second, four times its previous capacity and roughly 10,000 times faster than most computer networks in use today.
What can you do with that much muscle? Just ask Dale Alverson, medical director of the Center for Telehealth and Cybermedicine Research at the University of New Mexico.
SUCKING UP BANDWIDTH
"Two medical students are traveling together and they come across a car crash," says Alverson, describing the grisly scene and its aftermath. "The victim has been thrown from the vehicle and is lying on the ground in a severe condition. The victim begins to convulse and turn blue. The students try to stabilize the patient. They work on the patient at the crash scene and, despite their best efforts, the patient dies."
The students step back, reflect on these terrible events, and talk about what they could have done differently. Then the clock rolls backward and they get the chance to try it all again.
This is a virtual 3D environment. The students wear head-mounted displays and express their actions through joysticks. They can see one another in this virtual landscape, and their surroundings respond immediately to their choices. Better still, one student is in New Mexico, the other is in Hawaii.
"We suck up a lot of bandwidth," says Alverson, who has such a system in the advanced R&D stage. "It requires incredible high-speed connectivity because if you have multiple participants in these virtual worlds, you need to be able to refresh that environment and react to those participants without there being any sense of time lag."
Alverson also has more mundane uses for Abilene. For example, he uses it for weekly videoconferences between his lab and his collaborators in other states. That's relatively new: It was just in fall 2003 that Internet2 made videoconferencing services available to its participants. The service can facilitate any H.323-based video call, with subscribers reserving ports ahead of time or launching ad hoc meetings once they've paid a subscription fee of about US$2,000 a year.
Such applications are possible not just because of the upgrade to 10 Gbits but also owing to fundamental network improvements.
"At one time, the distributed system community was enabled by the work of Internet2 to assume that the bandwidth available to connect the distributed systems together was of the same order of magnitude as they had within those systems," explains Doug Van Houweling, Internet2's president and CEO. Although that reliable bandwidth was terrific to have, shortcomings elsewhere in the infrastructure limited its usefulness. That is changing.
"Today they can think of it as not only a high performance connection, but also a reliable connection, one with high security, one that can take advantage of advanced protocols like IPv6 [IP Version 6]." As these tools are increasingly built into the network, it becomes significantly easier to make all that fabulous bandwidth available to users.
The attention Internet2 gave IPv6 in particular has been a relief to system administrators. Conventional wisdom has long held that transitioning to IPv6 would be difficult, but Internet2 has made great efforts to ease the pain. For example, the consortium's IPv6 working group has committed to holding one or two seminars a year to address issues such as multicast, multihoming, mobility, and global routability. The group also offers two hands-on workshops each year to instruct participants in constructing an IPv6 network. Regular publications and other forms of support further aid in implementing the new protocol.
If the Internet2 leaders seem to be taking pains to make sure IPv6 rolls out smoothly, it's with good reason. As Abilene users have learned over the years, having a bigger, faster system doesn't necessarily solve all your networking woes.
INS AND OUTS
Micah Beck explores the confluence of storage and networking as associate professor of computer science and director of the Logistical Computing and Internetworking Lab at the University of Tennessee. He's been involved in Internet2 almost from the start and learned one very important thing about high-speed networks: "There are aspects of system design that are not solved just by raw capacity."
Take the question of storage. Even with tons of capacity, it doesn't make sense to go over the network every time you want to access data. Latency and other factors make it impractical. With that in mind, Beck has been developing a distributed storage platform that anyone in the Internet2 community can access. Essentially, he's re-envisioned storage as something that's part of the overall network rather than appended to the network.
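Beck's point about latency can be made concrete with a little arithmetic. The sketch below uses illustrative figures chosen by this editor (a 70 ms wide-area round trip, an 8 ms local disk access), not measurements from Beck's lab, to show why many small remote reads stay slow no matter how fat the pipe is:

```python
# Why raw capacity alone doesn't make remote storage feel local:
# per-request latency dominates small, frequent accesses.
# All figures below are illustrative assumptions.

RTT = 0.070          # assumed 70 ms wide-area round-trip time
LOCAL_ACCESS = 0.008 # assumed 8 ms local disk access time
REQUESTS = 10_000    # an application issuing many small reads

remote_seconds = REQUESTS * RTT
local_seconds = REQUESTS * LOCAL_ACCESS

print(f"remote: {remote_seconds:.0f} s, local: {local_seconds:.0f} s")
# 10,000 small reads cost roughly 700 s over the wide area versus
# roughly 80 s locally, regardless of link bandwidth.
```

Each read pays the full round trip before any data arrives, which is why Beck folds storage into the network itself rather than reaching across it on every access.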
The point here is that a big pipe poses as many questions as it answers. "Access to the high-performance backbone has really challenged us to think about the relevance and importance of various ideas in a world in which bandwidth is plentiful," Beck says. "Just as the Internet opened up questions of telecommunications that the traditional telecommunications industry would not have validated or thought important, [Internet2] changes the nature of what the important questions are and what kind of problems you might pose and solve."
Jill Gemmill encountered a similar phenomenon. As assistant director of academic computing at the University of Alabama at Birmingham, she explores middleware for collaboration tools. What she has found, in part, is that settings at the TCP layer, such as packet size and default router setups, typically have defaults that engineers chose with a T1 line in mind.
To get the maximum performance out of a faster network, she explains, network engineers must reexamine and reset these diverse defaults.
"There was an assumption that simply making the network faster and fatter would make applications improve instantly, and that has turned out to not be the case," she says.
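The mismatch Gemmill describes can be quantified with the bandwidth-delay product: the amount of data a TCP connection must keep in flight to fill a path. The sketch below uses an assumed 70 ms round-trip time (not a figure from the article) to compare a T1 line against a 10 Gbit/s Abilene-class path:

```python
# Bandwidth-delay product (BDP): bytes a sender must buffer to keep a
# path fully utilized. The 70 ms round-trip time is an assumption.

def bdp_bytes(bandwidth_bps: float, rtt_seconds: float) -> float:
    """Return the bandwidth-delay product in bytes."""
    return bandwidth_bps * rtt_seconds / 8

RTT = 0.070  # assumed coast-to-coast round-trip time

t1_bdp = bdp_bytes(1.544e6, RTT)    # T1 line: 1.544 Mbit/s
abilene_bdp = bdp_bytes(10e9, RTT)  # upgraded Abilene: 10 Gbit/s

print(f"T1 BDP:      {t1_bdp / 1024:,.1f} KB")
print(f"10 Gb/s BDP: {abilene_bdp / (1024 * 1024):,.1f} MB")
# A TCP window sized for the T1 case (tens of kilobytes) leaves a
# 10 Gbit/s path almost entirely idle until buffers are retuned.
```

The gap between the two figures is the reason making the pipe "faster and fatter" changes nothing until engineers revisit those T1-era defaults.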
Van Houweling says that such an evolution was to be expected. "Today we understand that taking advantage of the potential of that bandwidth requires that we address issues such as performance, authentication, and how high-end applications work in the networked environment. It is more complicated (surprise!) than we thought it would be in the beginning."
INTERNET2 ISSUES
Even as the networking community explores Abilene's perils and possibilities, some researchers have raised questions about the Internet2 organization itself. They say that although Internet2 generally does a great job of bringing together high-end networking enthusiasts, the organization can sometimes be lax in particular policy areas.
Take, for instance, security. "There is some risk that security measures deployed at individual member sites to counter genuine threats … may interfere with users' ability to innovate," says St. Sauver. At the same time, "a lack of security awareness at other sites, coupled with large network pipes, poses an equally bad alternative."
This means that security needs a high level of sophistication, yet "Internet2 still does not even have a designated security contact for each participant," St. Sauver says.
In fact, Van Houweling acknowledges that security remains a shortcoming. "It turns out that security exposures can happen anywhere between the application, all the way down to a problem with routing packets in the network," he says. "What we have discovered in the last three years is that we really need to think of this whole set of activities as a system" rather than in a layered model.
Others have complained that the high-speed infrastructure lacks sufficient redundancy and that Internet2 planners haven't found an effective way to extend Abilene's capabilities to the smaller liberal arts colleges.
Still, networking experts in the academic research community agree that such concerns are far outweighed by not just the evolving network's power but also the Internet2 organization's value.
"Through the Internet2 meetings and organizations I have had access to the people who are responsible for defining and bringing forward the most advanced networks for the university community," says Beck. "This is a creative stew that you couldn't have in the commodity networking world."
Speaking of the commodity networking world, one might wonder if and when Internet2's many and varied benefits "will unleash breakthroughs that will power American industries," as Al Gore promised. On this score, Internet2 leaders are vague at best.
"We expect that the people who run those networks will take what they have learned working with us and turn those into products and services in the commercial sector," says Van Houweling. But no one is saying when that will happen.