Earlier this summer, HCI Editor-in-Chief Mark Hagland interviewed Gigi Lipori, senior director for operational planning and analysis at University of Florida Health, and Erik Deumens, Ph.D., director of research computing at the University of Florida, about their organization’s participation in the creation and forward evolution of Internet2. Lipori, Deumens, and their colleagues have been involved in an exciting initiative, along with leaders at 26 universities (and, in some cases, the medical centers affiliated with those universities), that is opening up a whole new area of endeavor, as everyone involved with Internet2 works to develop new channels for communication and connectivity.
As its website indicates, the collaborative known as Internet2 encompasses more than 240 U.S. universities, 60 leading corporations, 70 government agencies, 38 regional and state education networks, and “more than 100 research and education networking partners representing over 50 countries.” The website also notes that “Internet2 is an exceptional community of U.S. and international leaders in research, academia, industry, and government who create and collaborate via innovative technologies. Together, we accelerate research discovery, advance national and global education, and improve the delivery of public services. Our community touches nearly every major innovation that defines our modern digital lives, and we continue to define ‘what’s next.’”
Meanwhile, Hagland also spoke earlier this summer with Robert Vietzke, vice president, network services, at Internet2, which has offices in Washington, D.C., Ann Arbor, Mich., and Emeryville, Calif. The West Hartford, Conn.-based Vietzke provided him with a broad overview of the strategic goals and objectives of the Internet2 initiative and collaborative. Below are excerpts from that interview.
Tell me about the origins of Internet2 as an initiative.
If you go back to the birth of the commercial internet, when the government agencies—the defense industry and the NSFNET—allowed the research network to spin out and become commercial, the leading research universities—26 of them—met in a basement room in Chicago and founded Internet2, to make sure there was always an organization available to build leading-edge Internet technologies that would always be a step ahead of the commercial Internet. And it’s not competitive with the commercial Internet; it’s actually creating new markets. The first meeting took place back in 1996, before I was personally involved.
What are the key pieces of the organization?
Let me start with people. One of the things that makes us unique is that we’re a community-based organization. Our staff is very small, but the resources come from the universities. We have fewer than 100 staff, but there are thousands of people at the leading universities who are absolutely a part of Internet2.
And our folks are struggling with how to move these very large data sets, as with genomics, around. So it’s a current challenge, not necessarily a future one.
Where are you at technologically right now with the initiative?
So, Internet2 has spent a lot of time thinking about what the key aspects are of supporting data-intensive science research, as opposed to supporting millions of consumers in the home. So we’ve spent a lot of time in the last couple of years thinking about, what is the platform? And what are the key aspects? And the University of Florida is on the leading edge in that regard. They’re the first to complete the three components in the innovation platform, as we call it; it’s really an enabling platform that allows an institution to participate with others in this. There are three elements to the platform: the first element is bandwidth abundance; we’re talking about 100 gigabits a second. Think about a home user: they’re thinking about megabits a second. The small clinical site is using dozens of megabits a second; a hospital is probably in the gigabit range. But if you’re moving these genomic data sets, you need a much higher bandwidth. Think about what’s happened with our home voice calls, right? Most of us no longer pay per minute. And for the most part, we’re uncapped in terms of data use in the way we use data online, as consumers.
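To make the bandwidth gap Vietzke describes concrete, here is a minimal sketch comparing idealized transfer times for a large genomic data set at each class of connection he mentions. The 200 GB data-set size is an illustrative assumption (the interview gives no figure), and the home and clinic speeds are representative values, not quotes:

```python
# Rough transfer-time comparison for a large genomic data set.
# The 200 GB size and the specific link speeds are illustrative
# assumptions; only the bandwidth *classes* come from the interview.

DATASET_BYTES = 200 * 10**9  # assumed ~200 GB whole-genome data set

links_bits_per_sec = {
    "home broadband (25 Mb/s)": 25 * 10**6,
    "small clinical site (50 Mb/s)": 50 * 10**6,
    "hospital (1 Gb/s)": 1 * 10**9,
    "Internet2 member (100 Gb/s)": 100 * 10**9,
}

def transfer_seconds(size_bytes, bits_per_sec):
    """Idealized transfer time, ignoring protocol overhead and congestion."""
    return size_bytes * 8 / bits_per_sec

for name, rate in links_bits_per_sec.items():
    t = transfer_seconds(DATASET_BYTES, rate)
    print(f"{name}: {t / 3600:.2f} hours")
```

At 100 Gb/s the same data set that ties up a consumer link for the better part of a day moves in seconds, which is the difference bandwidth abundance is meant to capture.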
The second element involves being intentional about using data in your network. We call it “science DMZ” [demilitarized zone]; or, support for data-intensive science. This is really about security, and the realization that network security is critically important, especially in the healthcare environment, right? But most of the security has been about protecting millions of small flows online. The support for these big data flows is very different. You can’t think of everything as one class of science; you’ve got to think differently.
Erik Deumens was talking about the functional need to not have to inspect every data packet.
I’m really rather agnostic about that issue, but these big data flows—you’ve got to be intentional about the way you architect your security regime, to make sure that you look at your own campus’s data flows and security protocols and make sure the big flows are treated differently from little data packets.
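In practice, treating big flows differently usually means routing trusted, large research transfers around the per-packet inspection applied to ordinary traffic. A toy sketch of that policy decision follows; the subnet, the size threshold, and the path labels are all illustrative assumptions, not Internet2 or campus specifications:

```python
# Toy policy: bypass deep packet inspection only for large flows between
# hosts in a dedicated science-DMZ subnet; everything else goes through
# the full enterprise security stack. All values are illustrative.

from dataclasses import dataclass
from ipaddress import ip_address, ip_network

# Hypothetical subnet reserved for data-transfer nodes in the science DMZ.
SCIENCE_DMZ_NET = ip_network("10.200.0.0/16")
BIG_FLOW_BYTES = 10 * 10**9  # treat flows over ~10 GB as "elephant" flows

@dataclass
class Flow:
    src: str
    dst: str
    expected_bytes: int

def security_path(flow: Flow) -> str:
    """Choose a path: bypass per-packet inspection only for big DMZ flows."""
    in_dmz = (ip_address(flow.src) in SCIENCE_DMZ_NET
              and ip_address(flow.dst) in SCIENCE_DMZ_NET)
    if in_dmz and flow.expected_bytes >= BIG_FLOW_BYTES:
        return "science-dmz path (ACLs and flow monitoring, no per-packet inspection)"
    return "enterprise path (full stateful inspection)"

print(security_path(Flow("10.200.1.5", "10.200.2.9", 500 * 10**9)))
print(security_path(Flow("192.168.1.10", "10.200.2.9", 500 * 10**9)))
```

The point of the design is exactly what Vietzke describes: the big flows are still secured, but with controls (access lists, flow monitoring) that do not require inspecting every packet.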
The third element is software-defined networking, which is a big trend. That means a lot of things to a lot of different people; but what it could be, and where we’re focusing a lot of energy, is… If you think about computing and storage and visualization, and a lot of the work that genomics is involved in, often, the network was a black box. They understood how much data was involved, but in terms of moving the data from one place to another, they didn’t have much knowledge or control; it was a black box. So you want to rethink your network environment, especially when you’re working in the cloud, or when you’re working with some genomics institutes—you’ve got a couple of very large genomics centers inside and outside the country and a couple of large hospital institutes in that area, and coordinating data across those sites is very important.
So we should think about that system-wise, and optimize it together. And the concept of software-defined networking means that the application should be aware of what resources, including the network, are available to finish the job for the user, and the network should be programmable and controllable by the application. This is a little bit farther out there than network abundance or “science DMZ.” But if we’re really going to do widespread genomics worldwide, we’re going to need the network to be more adaptable.
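The idea that the network stops being a black box can be sketched with a toy controller: the application asks what capacity is available and programs a reservation before it moves data. The controller class, link names, and reservation semantics below are invented for illustration and do not correspond to any real SDN API:

```python
# Toy model of software-defined networking as described above: the
# application queries the network's resources and programs the path it
# needs. The controller and its API are illustrative, not a real SDN API.

class ToyController:
    def __init__(self, link_capacity_gbps):
        self.capacity = dict(link_capacity_gbps)  # link name -> free Gb/s

    def available(self, link):
        """Report free capacity, so the network is no longer a black box."""
        return self.capacity.get(link, 0)

    def reserve(self, link, gbps):
        """Program the network: carve out bandwidth for this application."""
        if self.available(link) < gbps:
            return False
        self.capacity[link] -= gbps
        return True

controller = ToyController({"campus-to-internet2": 100})

# The application checks what the network can give it before moving data.
if controller.reserve("campus-to-internet2", 40):
    print("40 Gb/s path reserved; start genomic transfer")
print("remaining:", controller.available("campus-to-internet2"), "Gb/s")
```

The design choice to notice is that the application, not a static configuration, decides how the network is used, which is what makes the network adaptable in the sense Vietzke means.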
On a scale of 1 to 10, where are you now?
I think the capabilities that are available nationally through a network like Internet2 today are probably a 9. We’ve got a 100-gigabit national backbone that’s available to every state and that can quickly go to 88 times that, to 8.8 terabits. And it’s fully configurable and ready to go. We built this in part with the help of a stimulus program under the ARRA [American Recovery and Reinvestment Act], so using the network doesn’t change the cost. So we’re ready to go in terms of supporting these capabilities and use cases.
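The backbone figures decompose as simple arithmetic: 88 times the 100-gigabit base capacity yields the 8.8-terabit ceiling cited above. A quick check:

```python
# Sanity check on the backbone figures cited in the interview:
# a 100 Gb/s backbone that can scale to 88 times that capacity.
base_gbps = 100
scale_factor = 88
total_gbps = base_gbps * scale_factor
total_tbps = total_gbps / 1000
print(f"{total_gbps} Gb/s = {total_tbps} Tb/s")  # prints: 8800 Gb/s = 8.8 Tb/s
```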
So they need to catch up to your capabilities?
Yes, absolutely. So we feel really good about that. And in terms of supporting the applications for genomic science, this thing is ready to go. And when you start thinking about hundreds of thousands of sequencers (personal sequencers), and potentially millions and millions of gene sequences a year that you want to compare and contrast to one another, we’ve got work to do to set that up. But in terms of supporting the current state of the art, we’re in really good shape.
You’ve been collaborating with Brocade, correct?
Yes, the folks at Brocade [the San Jose, Calif.-based Brocade, a network solutions provider] have been part of the solution on a couple of fronts. One is that they get the importance of creating the 100-gig market. So they’ve really invested in the idea of bandwidth abundance, by the way they’ve air-conditioned their products; they’re also heavily invested in working on software-defined networking.
What should CIOs and CMIOs be thinking about, as Internet2 develops and evolves forward?
They may not feel they can relate to all of this, but I actually think that what we do in Internet2 is collaborative community work, and that’s a very common theme in healthcare. When folks go to clinical meetings and talk about rare findings and such, what we do is very much the same. And helping to enable IT workers to think collaboratively could be really helpful in transforming that part of the discipline in the next few years. Plus, this is fun to do!