Earlier this summer, HCI Editor-in-Chief Mark Hagland interviewed Gigi Lipori, senior director for operational planning and analysis at University of Florida Health, and Erik Deumens, Ph.D., director of research computing at the University of Florida, regarding the work that they and their colleagues are currently doing around their organization's participation in the creation and forward evolution of Internet2. Lipori, Deumens, and their colleagues have been involved in a very exciting initiative, along with leaders at 26 universities (and, in some cases, the medical centers affiliated with those universities), that is opening up a whole new area of endeavor, as all those involved with Internet2 work on developing new channels for communication and connectivity.
As its website indicates, the collaborative known as Internet2 encompasses more than 240 U.S. universities, 60 leading corporations, 70 government agencies, 38 regional and state education networks, and "more than 100 research and education networking partners representing over 50 countries." The website also notes that "Internet2 is an exceptional community of U.S. and international leaders in research, academia, industry, and government who create and collaborate via innovative technologies. Together, we accelerate research discovery, advance national and global education, and improve the delivery of public services. Our community touches nearly every major innovation that defines our modern digital lives, and we continue to define 'what's next.'"
Meanwhile, Hagland also spoke earlier this summer with Robert Vietzke, vice president, network services, at Internet2, which has offices in Washington, D.C., Ann Arbor, Mich., and Emeryville, Calif. The West Hartford, Conn.-based Vietzke provided him with a broad overview of the strategic goals and objectives of the Internet2 initiative and collaborative. Below are excerpts from that interview.
Tell me about the origins of Internet2 as an initiative.
If you go back to the birth of the commercial internet, when the government agencies, the defense industry, and the NSFNET allowed the research network to spin out and become commercial, well, in this case, the leading research universities, 26 of them, met in a basement room in Chicago and founded Internet2 to make sure there would always be an organization available to build leading-edge Internet technologies that would stay a step ahead of the commercial Internet. And it's not competitive with the commercial Internet; it's actually creating new markets. The first meeting took place back in 1996, before I was personally involved.
What are the key pieces of the organization?
Let me start with people. One of the things we are that’s unique is that we’re a community-based organization. So our staff is very small, but the resources come from the universities. So we have fewer than 100 staff, but there are thousands of people at the leading universities who are absolutely a part of Internet2.
And our folks are struggling with how to move these very large data sets, such as genomic data, around. So it's a current challenge, not necessarily a future one.
Where are you at technologically right now with the initiative?
So, Internet2 has spent a lot of time thinking about what the key aspects are of supporting data-intensive science research, as opposed to supporting millions of consumers in the home. So we've spent a lot of time in the last couple of years thinking about, what is the platform? And what are the key aspects? And the University of Florida is on the leading edge in that regard. They're the first to complete the three components of the innovation platform, as we call it; it's really an enabling platform that allows an institution to participate with others in this.

There are three elements to the platform. The first element is bandwidth abundance; we're talking about 100 gigabits a second. Think about a home user: they're thinking about megabits a second. The small clinical site is using dozens of megabits a second; a hospital is probably in the gigabit range. But if you're moving these genomic data sets, you need a much higher bandwidth. Think about what's happened with our home voice calls, right? Most of us no longer pay per minute. And for the most part, we're uncapped in terms of data use in the way we use data online, as consumers.
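To put those bandwidth figures in perspective, here is a rough back-of-the-envelope calculation of how long a large genomic data set would take to move at each tier Vietzke describes. The 1-terabyte data set size and the specific home and clinical link speeds are illustrative assumptions, not figures from the interview, and the math assumes ideal throughput with no protocol or disk overhead:

```python
# Rough transfer times for a hypothetical 1 TB genomic data set.
# Link speeds below are assumed examples matching the interview's tiers
# ("megabits," "dozens of megabits," "gigabit range," 100 Gbps).
# Ideal throughput only; real-world transfers are slower.

DATASET_BITS = 1e12 * 8  # 1 terabyte expressed in bits

links = {
    "home broadband (~25 Mbps)": 25e6,
    "small clinical site (~50 Mbps)": 50e6,
    "hospital (~1 Gbps)": 1e9,
    "Internet2 platform (100 Gbps)": 100e9,
}

for name, bits_per_second in links.items():
    seconds = DATASET_BITS / bits_per_second
    print(f"{name}: {seconds / 3600:,.2f} hours")
```

At 100 Gbps the transfer finishes in under two minutes; at typical home or clinic speeds the same data set ties up the link for days, which is the gap the "bandwidth abundance" element is meant to close.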
The second element involves being intentional about using data in your network. We call it the "science DMZ" [demilitarized zone], or support for data-intensive science. This is really about security, and the realization that network security is critically important, especially in the healthcare environment, right? But most of the security has been about protecting millions of small flows online. The support for these big data flows is very different. You can't think of everything as one class of science; you've got to think differently.
Erik Deumens was talking about the functional need to not have to inspect every data packet.
I'm really rather agnostic about that issue, but with these big data flows, you've got to be intentional about the way you architect your security regime: look at your own campus's data flows and security protocols, and make sure the big flows are treated differently from the little data packets.