Changing business needs and hungry applications continue to beleaguer the healthcare network infrastructure. Throbbing with vital bits of data, images, voice and video, the ideal healthcare network is silent, strong and capable of handling whatever users push across its channels. For most users, it is simply taken for granted--until, of course, it's down.
The network not only needs brute strength, it must be well planned and well organized if it is to stand up to the many new and bandwidth-hungry applications on the market. Unfortunately, many networks are faltering under the current load, let alone future loads. Whether the problem is an antiquated infrastructure or information silos that must be pulled together, managing the overhauls and upgrades with an organizational view is a monumental task. "Many organizations are linking islands of networking more through crisis than by good management," says Andrew Rushmere, president of Aviant Technologies of Simi Valley, Calif.
The fundamental changes in the market have made it hard enough for IT to maintain what they have, let alone look at new technology. In addition, there is a lot of confusion in the marketplace about what minimum technology capabilities are necessary. Much of it has been created by our industry--by manufacturers who push technology for technology’s sake, says Rushmere. "Technology should follow the business plan, not determine it," he says, adding, "Many CIOs are caught up in a technological bandwagon that is leading to a dead-end."
From its backbone to its most remote point, the network carries the organization’s most valuable asset--information. Increased information flow naturally adds more burden to the network--just as it does to IT--but the vision and the business plan must come before you expand your network capabilities: What do you want to do with your business and what type of network is necessary to support it? What services do you plan to offer?
Consultants, analysts and vendors alike point to mergers, acquisitions and consolidations as a major source of problems in bringing together data and people. The diversity of technologies and disparate systems in any given organization is a monumental challenge in connectivity--not to mention the extensions necessary to the continuum of hospitals, clinics, physician offices and homes. For years, health networks started and stopped at the physician's desktop, says Gail Gulinson, VP health networking solutions, IBM Global Healthcare Industry in Chicago. That is changing as the network rapidly expands outward to reach health plan members and consumers. Decreasing technology costs plus the increasing numbers of provider and payor networks are fueling the rapid growth of extranets to provide organizations such as the VHA and Blue Cross access to nonpatient-specific information such as best practices and cost benefits.
The problem of connecting many disparate healthcare information systems is real, but what is less often considered, points out Gulinson, is that many of them are not asset-based--they’re contractual relationships. And, even though partners are often in the position of bidding for care against each other, they still must behave as one system. Sharing data is a problem. No one is willing to put all the information or all of the data into the middle of the systems. So, networks with a database of identifiers in the middle are emerging across the private network to resolve the problem.
For the long haul, Ann Thryft of Electronic Engineering Times magazine foresees three technologies vying to be data transmission trunk lines: synchronous optical networks (SONET), asynchronous transfer mode (ATM) and all-optical networks. As LANs and WANs continue to move toward convergence, expectations for voice, video and data transmissions across a common network are revising technology projections. Since ATM is not optimal for LANs, says Thryft in the December 1, 1997 edition, many are looking to Ethernet and Internet Protocol as the future for network foundations.
Del Jenkins also sees a trend in all networking toward Internet Protocol (IP) but sees no silver bullet. "The Internet Protocol Suite (IPS) will support a lot of quality of service over plain IP protocol with less need for ATM," says Jenkins, VP and general manager North America of GTE Data Services, Temple Terrace, Fla. Although some of the ATM and SONET technologies lack widespread implementations and the related proof of service, they are much more sophisticated. "ATM has definite advantages and is not going away," he says, "but it is not an either/or for ATM or SONET. I think you will see a mixture of these protocols." He emphasizes that data and application types should determine the protocol.
The services you plan to offer will determine whether ATM makes sense for your organization, says Mary Verhage, director of healthcare information systems for Boston-based Aberdeen Group, an IT research and consulting firm. If the organization is considering realtime information transfers such as videoconferencing and telemedicine applications, and/or integration of many geographically dispersed points of care, then it's time to consider ATM and more sophisticated networks. ATM also makes sense if the organization is considering replacing voice trunks and doing voice over data and voice over IP, adds Aberdeen's David Dines, manager of network technologies. It is more robust and more scalable, provides needed bandwidth and offers bandwidth guarantees.
ATM has been downplayed and has its detractors, says Aviant’s Rushmere, but it is still the only technology that is deterministic, meaning that it can get the information to a certain point in a determined time frame with guaranteed response. Absolutely essential in a healthcare environment, a deterministic delivery channel delivers voice/video/data in its entirety. Without it, he explains, words might be clipped off, creating a risky environment for transmissions such as physician orders and prescriptions.
But ATM is restricted to the backbone for the time being, says Keith Forrester, Aviant product marketing manager. Although 90 percent of the networks Aviant is building now have some level of ATM in the backbone, none has ATM to the desktop--not yet. It’s still too expensive. But that may change now that all carriers are providing ATM networking in the remote LAN environment, says Rushmere. And the trend is to move ATM to the wide area network within the next eight to 12 months. This is good news for consumers for two reasons. Not only will the capability to support voice and data across the same pipe allow the organization to consolidate services with a single carrier across the enterprise, but increased competition among carriers should result in some very aggressive pricing.
Instead of adding more local area networks to the network, says Rushmere, the challenge in designing networks over the next two to three years is in allocating speed-sensitive lanes for network traffic. The biggest problem in networking technologies now is quality of service (QoS)--how to determine and assign bandwidth needs and priorities.
QoS may be ATM's ace over competing technologies such as Fast Ethernet, but ATM's disadvantages are holding it back. ATM is more expensive, and it is more difficult to install, put into operation and manage. In addition, it requires specific technology expertise of IT staff, says Aberdeen VP Virginia Brooks--not the least of which is rewriting applications to accommodate it.
Speed on the cheap?
So where do the faster ethernet-based technologies fit into the picture? And do they offer a cheaper alternative to ATM? For one, Rushmere says, "Fast Ethernet can't replace ATM. They are two different types of technologies." Ethernet is a pure data delivery channel with some capability to carry voice, but none for video. Interoperability between the two technologies is restricted to data. Fast Ethernet may be the solution to the desktop, but ATM will still be necessary in the backbone.
Maureen Ryan sees a battle brewing. "ATM and Gigabit Ethernet will go head-to-head in a fight over the backbone," says this senior industry marketing manager for Bay Networks USA, Inc. in Santa Clara, Calif. It’s a little early to tell; Gigabit Ethernet is still an emerging standard. It hasn’t been deployed long enough to adequately test its potential for supporting voice and video. Nonetheless, there are those who believe that if you can throw enough bandwidth at Gigabit Ethernet and you don’t have a lot of other applications running, it can fill the bill. If multimedia applications and videoconferencing are not priorities and the IT staff doesn’t know ATM, she suggests that Gigabit Ethernet may be the more appropriate technology.
Although Gigabit Ethernet delivers a bigger data pack for faster data transmission, Rushmere wants to know if you really need a bigger pack. Does the user need a technology solution that can deliver work at 30 to 40 times the capacity needed? "The industry has created a terrible hype for new technology," he says. "The real question is: Do you really need this technology?"
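Rushmere's question about oversized capacity lends itself to back-of-envelope arithmetic. The sketch below compares raw transfer times for a single radiology image at each Ethernet generation's nominal rate; the 10 MB file size is an assumption for illustration, not a figure cited by anyone quoted here, and real throughput is lower once protocol overhead and shared media are accounted for.

```python
# Approximate raw transfer time for one radiology image (assumed 10 MB)
# at each Ethernet generation's nominal rate. Real-world throughput is
# lower due to protocol overhead, congestion and shared media.
IMAGE_MEGABYTES = 10
IMAGE_BITS = IMAGE_MEGABYTES * 8 * 1_000_000  # decimal megabytes to bits

for name, mbps in [("Ethernet", 10), ("Fast Ethernet", 100), ("Gigabit Ethernet", 1000)]:
    seconds = IMAGE_BITS / (mbps * 1_000_000)
    print(f"{name:17} {mbps:5} Mbps -> {seconds:.2f} s")
```

At these rates the image moves in roughly 8 seconds, 0.8 seconds and 0.08 seconds respectively--which is exactly the point of Rushmere's question: whether the last jump buys anything the application actually needs.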
Can’t decide? It may be wise to go back to the business plan: Is the need today or in the future? "If it’s today," says Dines, "it’s necessary to go with ATM." However, for those organizations that can wait, comparable data transmission capabilities over ethernet-type networks will be available in about two years. He expects Resource Reservation Protocol (RSVP) to give QoS levels via ethernet, although it is uncertain whether QoS levels will be guaranteed.
Low comfort levels
In addition to the challenges of tying together diverse technologies and systems into a cohesive work unit, the shortage of trained engineers to build and manage the network is a real problem.
In a recent user survey of regional hospitals, Brooks found that many staff members had on-the-job training, but not much else. These staffers are interested in the technology, she found. They are dedicated and they often are able to squeeze high performance out of their networks--mainly because they have to. But many staffers had little access to objective educational resources, with most of their knowledge coming through vendors and distributor channels.
Meanwhile, Year 2000 (Y2K) threatens at the network level, too, says Dines. Y2K issues are not only diverting funds that might be earmarked for network upgrades, but most organizations have concentrated on finding and resolving Y2K problems at application and server levels. "It is easy to overlook the network," he says, "and it will be a big issue--particularly for those who have not upgraded the network for awhile."
Although the growth of intranets and extranets increases the need for more sophisticated network management tools, security is a real donnybrook. "The security issue in healthcare is not a technology one," says Verhage. Shannah Koss, IBM manager, governmental programs and healthcare, agrees with her, adding that today’s technology solutions offer significant security coverage as long as they are implemented comprehensively. Unfortunately, existing security has been system specific and most large integrated delivery systems have not even begun to tackle security at the enterprise level.
A security program throughout the continuum will be a major investment for most organizations, Koss says. In planning toward security requirements expected to be released later this year by the Department of Health and Human Services, Koss says it may not be necessary to rush through an implementation. The government has announced that it does not plan to specify one technology to meet these security requirements. Therefore, so long as some security is in place, it might make sense to await governmental guidelines before finalizing an enterprisewide policy.
Larry Haggerty, GTE senior product manager for internetworking, agrees that technology isn’t the biggest barrier to data transmissions via public networks. He thinks it’s fear. Consumer education is the only answer here, he says, but adds that most customers must be more than enlightened--they must be overconvinced.
One of the key security technologies that would enable the entire industry to move forward with expanded data transmission across public networks is digital certificates, says GTE's Jenkins. Sometimes called digital IDs, digital certificates authenticate the sender's identity through a third-party license. There is high interest in this international standard suite of protocols, he reports, but it's not moving forward. "If every doctor's office in the country installed a digital certificate on their PCs tonight at a cost of about $2," he says, "healthcare information networks could do a five year leapfrog."
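The third-party authentication Jenkins describes can be sketched in miniature. In the toy model below, a certificate authority signs the binding between a sender's identity and its key, and a recipient trusts the sender only if that signature checks out. This is an illustration only: real certificates use public-key cryptography under standards such as X.509, whereas this sketch substitutes an HMAC as a stand-in signature, and all names and keys are invented.

```python
import hashlib
import hmac

# Toy stand-in for the certificate authority's signing key. Real systems
# use public-key signatures (e.g. RSA under the X.509 standard) so that
# verification does not require sharing a secret.
CA_SECRET = b"trusted-third-party-key"

def issue_certificate(subject: str, public_key: str) -> dict:
    """The CA binds a subject's identity to its key and signs the binding."""
    payload = f"{subject}|{public_key}".encode()
    signature = hmac.new(CA_SECRET, payload, hashlib.sha256).hexdigest()
    return {"subject": subject, "public_key": public_key, "signature": signature}

def verify_certificate(cert: dict) -> bool:
    """A recipient checks the CA's signature before trusting the sender."""
    payload = f"{cert['subject']}|{cert['public_key']}".encode()
    expected = hmac.new(CA_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cert["signature"])

cert = issue_certificate("drsmith-office", "pk-1234")
assert verify_certificate(cert)       # authentic certificate passes
cert["subject"] = "impostor-clinic"
assert not verify_certificate(cert)   # tampered identity is rejected
```

The design point is the one Jenkins makes: neither party has to know the other in advance--each only has to trust the third party that issued the credential.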
Start simple, grow fast
Not long ago, the questions were "Should we or shouldn't we put our money in networks?" Not anymore, says IBM's Gulinson. Now it's, "How do we take these networks and integrate them into our strategic business initiatives?" The big shift in building is to "start simple and grow fast" rather than wait for the perfect network that is all-encompassing, she says. It is not only more realistic to focus on one part of the network before moving on, but it provides a quicker return on investment. Many organizations start in the radiology practice, reports Bay Networks' Ryan. The move from film-based to digitized imagery shows up rapidly as a straightforward cost reduction and is one of the quickest ROIs on the network, she says.
Newer technologies are also bringing opportunities to extend the reach of the healthcare organization, to build on the continuum of care model and to find new business opportunities, many of which are outside the catchment area. Public demands will play an important role in the growth of the networked healthcare organization too, where they will not only drive care models, they'll drive providers' use of IT. Once the public recognizes and is convinced of the benefits, says GTE's Jenkins, it will demand electronic transmission of records and force providers in that direction.
It is a given that tomorrow’s network requirements will be different from today’s. The need for a flexible, scalable pipeline to carry more and different kinds of information will continue to expand. Keep the focus on the organization’s business goals and strategies and an eye on the right technologies, says Aviant’s Rushmere: "Don’t choose the technology and then try to warp the business to meet it. Every organization uses information in a different way; it’s as unique as a thumbprint. You cannot design around someone else’s thumbprint."
Networking Terms Defined
Asynchronous Transfer Mode (ATM)
A very high-speed network technology designed to transmit data and realtime voice and video over LANs and WANs. Defined in the Broadband ISDN (BISDN) standard, ATM was originally to be used with SONET but is now considered a separate technology.
Fast Ethernet
Supporting data transfer rates up to 10 times faster (100 megabits per second) than ethernet, Fast Ethernet is a relatively new networking standard for shared media LANs. Also known as 100BASE-T, Fast Ethernet is based on the older ethernet standard. Fast Ethernet may also refer to a Hewlett-Packard-developed 100BaseVG or 100VG-AnyLAN that supports Token Ring as well as 10BaseT networks.
Gigabit Ethernet
An emerging ethernet technology that raises transmission speed to 1 gigabit per second. Expected to be a major alternative to ATM, particularly if it can effectively support realtime voice and video, Gigabit Ethernet offers a natural upgrade path for most current ethernet installations at lower costs than other, comparable speed technologies. Formal ratification for the IEEE 802.3z Gigabit Ethernet Standard draft is scheduled for summer 1998.
Internet Protocol Suite (IPS)
The dominant open data communications architecture for computer-based data transmissions, the Internet Protocol Suite (IPS) is expected to be a key protocol in the emerging National Information Infrastructure (NII). Internet Protocol version 6 (IPv6), as defined by IETF, replaces IPv4 as the next generation Internet Protocol.
Quality of Service (QoS)
QoS specifies guaranteed throughput levels on the network. Particularly important for voice and video transmissions, QoS ensures that the amount of time it takes for a data packet to travel from its source to the destination does not exceed the specifications required for the application.
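The delay requirement in the definition above can be made concrete with a small check. The sketch below compares per-packet delays against an application's latency budget; the 150 ms figure is a commonly cited one-way target for interactive voice, and the sample delays are invented for illustration.

```python
# Toy check of a QoS latency bound: every packet's end-to-end delay
# must stay within the application's budget. The ~150 ms figure is a
# commonly cited one-way target for interactive voice; the measured
# delays below are invented sample data.
LATENCY_BUDGET_MS = 150

packet_delays_ms = [20, 45, 90, 160, 30]
violations = [d for d in packet_delays_ms if d > LATENCY_BUDGET_MS]

print(f"{len(violations)} of {len(packet_delays_ms)} packets missed the budget")
```

For voice or video, even a few such violations produce the clipped words and dropped frames that make a non-deterministic channel risky for clinical traffic.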
Resource Reservation Protocol (RSVP)
Largely dependent upon the Integrated Services Model (ISM) architecture devised by the Internet Engineering Task Force (IETF), the RSVP communication protocol implements Quality of Service (QoS) over TCP/IP by reserving resources across the network for realtime data transmissions.
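The reservation idea can be illustrated with a minimal admission-control sketch. This is a conceptual model only--not RSVP's actual message exchange--in which a link admits a new realtime flow only if its unreserved capacity can cover the request; all flow names and numbers are invented.

```python
# Conceptual sketch of reservation-style admission control (not the
# actual RSVP message exchange): a link admits a new realtime flow
# only if unreserved capacity can cover its request.
class Link:
    def __init__(self, capacity_mbps: float):
        self.capacity = capacity_mbps
        self.reserved = 0.0

    def reserve(self, flow: str, mbps: float) -> bool:
        """Admit the flow and set bandwidth aside, or reject it."""
        if self.reserved + mbps > self.capacity:
            return False  # reservation denied; flow gets best-effort service only
        self.reserved += mbps
        return True

link = Link(capacity_mbps=100)
print(link.reserve("videoconference", 40))   # True: 40 of 100 Mbps reserved
print(link.reserve("telemedicine", 50))      # True: 90 of 100 Mbps reserved
print(link.reserve("backup-transfer", 20))   # False: only 10 Mbps remains
```

This is also where Dines's caveat shows up: a flow that is denied a reservation still runs, but without any guarantee--which is the difference between offering QoS levels and guaranteeing them.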
Synchronous Optical Network (SONET)
A synchronous fiber optic transmission system for high speed digital traffic. Used by telephone companies and common carriers, SONET carries data at speeds ranging from 51 megabits to multiple gigabits per second. It can carry any type of traffic, including ATM.
CASE IN POINT
Providence Health System
The number one problem in Richard Skinner's march toward integration within his healthcare network is the changing business climate. As CIO and regional director of information services of Providence Health System in Portland, Ore., Skinner expected integration of his systems to be a done deal by now. He also expected the industry to be comprised of a much smaller group of much larger healthcare organizations. "When I look back," he says, "I was starry-eyed with all the integration. We were going to have a master patient index (MPI) and a central repository and so forth. We have it of sorts, but the real benefit of moving down the path toward system integration has been making everybody use the same system. That has driven standardized business practices which we have then been able to leverage in terms of cost and quality."
Providence was in the right place at the right time. Five years ago, Providence was a holding company of five hospitals, some ancillary services, a small, 60,000 member health plan, and a vision--an integrated system vision. Today, Providence has six hospitals, a 1.1 million member health plan, 150 primary care physicians and about 1,500 independent practice associations.
Skinner and his staff started from scratch using a combination of consolidation and standardization. They pulled out systems so as to begin with only one of any kind of system. They also defined a single information services division, a single budget and a set of standards for all Providence-owned facilities. Early on, they thought the only bang for the buck was integration between systems--so they tried it. "What we found out was that the integration was a lot harder than we thought it was going to be," he notes.
"We were so naive," he says as he cites the $8 million the health system put in the capital budget five years ago for an enterprisewide electronic medical record. With this year’s budget, Providence will have spent $82 million to $83 million in that five years. "We didn’t know what we were doing, but by focusing on consolidation, standardization, standard practices and a lot of the nonclinical functions, we are now starting to get the integrated clinical system going--through the MPI. If we had focused on that five years ago, I’d probably be somewhere else."
In Oregon, most of the business is managed care. And since most of the physicians are in the plan, Providence has been able to provide incentives to encourage physicians to do some things differently. "Fortunately, the Oregon market supported the integrated system vision," says Skinner, "and we’re sticking with it." The entire business is now managed as a single company across the entire state and there is only one leader for each department, be it laboratory, medical records, radiology, etc.
However, all business climates are not created equal. As the Providence Health System extended its reach into Washington State, it planned to follow the established Oregon business model. However, with managed care penetration less than half that in Oregon, Providence quickly found resistance: the Washington market wouldn't pay for it. Providence will most likely employ a more localized business approach there with "operational excellence" programs. "We probably won't do the integration kinds of things there for a while," Skinner says; "the market doesn't care and we don't have the tools to work with. We are not the dominant health plan, our hospitals are not as respected nor do they have a distinct place in the market."
Molding and shaping
Another business challenge is developing a process or a set of processes that support an integration model. The business plan throughout all parts of the organization must change when you bring in an integrated model, says Skinner, admitting that leadership and people issues are creating more problems than technology. Skinner has become skeptical of applying information technology after business process reengineering. He notes that in his organization most integration efforts have been driven by an information system conversion. "It doesn’t necessarily have to be that way," he says, adding, "If you can make it work the other way--more power to you, but in our organization, we need the excuse, we need the scapegoat, we need something to motivate people to change." Consolidation announcements had little effect on ways of doing business, but then Providence brought in one information system and forced users to change procedures to accommodate the system. "Nobody liked that and it was pretty uncomfortable for a while," he reports, "but that’s how we got it done."
The biggest strength of software may not be the underlying technology, says Skinner, "but the fact that it forces people to change the way they do business." The reasoning is simple: If the parent company spent a lot of money to install the system, then it’s not acceptable to do business the old way. The move to a centralized business office not only reduced accounts-receivable days, it showed profits on physician practice acquisitions. "Now, I wouldn’t recommend that approach unless you have a bullet-proof vest, but experience has been that it’s expedient and it works," he says.
"They’re painful and they’re tough, tough changes," he adds. You can discuss them in strategic planning meetings and visioning councils and whatever--but when it gets down to the clerk that’s going to do something different--it’s a big problem. Putting in a standardized set of systems may cause some processes to be less efficient. In some instances, the organization may need to consider cost reallocations as full-time employee work-requirements shift.
Getting the right mix of long- and short-term strategies is a constant goal for Skinner. Some strategies may be worthwhile but very difficult, and necessarily--either for technological or organizational reasons--very long range. In the meantime, he counsels, it is important to produce enough value so that when people complain about not meeting certain goals, they also remember what has been accomplished. "It may sound simple," he says, "but it’s survival."
"The big winner in managed care going forward is the organization that can truly provide better customer service," says Skinner. Because of managed care, customer service in the healthcare industry is abysmal. The same rise in consumerism and expectations that has swept across virtually every other industry is coming to healthcare. It will be very difficult to provide that customer experience without some means to tie all the components together. You don't necessarily have to own them, he says, but you must have something that aligns the incentives. Skinner doesn't think managed care as we know it today will accomplish this integration. Rather, a cooperative venture to provide healthcare services when, where and how the customer chooses, will emerge to supplant today's facility-based healthcare. Nonetheless, Skinner says, "It's a sure bet that IT is going to form the foundation to do that, just as it has in other industries."
The sense of cost management throughout the organization is fragile, observes Skinner. Unlike other business models, there are hundreds of lines of business in a hospital. Not only is it more complicated, it’s more information intensive, more political and very territorial. Healthcare has traditionally focused on attaining optimal profitability at the departmental level--and managed to that level. Making the transition from departmental accountability to organizational accountability is not easy--profitability calculations at the departmental level are so embedded and so complicated that sub-optimizing one department’s profitability risks fracturing the whole organizational structure.
The industry has not learned how to be optimally profitable as an organization--and measure and account for the underlying trade-offs. "The trick is in managing the hand-offs," he says. Just increasing business is not enough. Rather, growing the business in terms of quality and an organization view is much more important for the corporate bottom line.
Look at private industry, he says. Companies there can identify expenses: how many units of service they produce, the cost of each of those units, and what mix of units any particular customer--or class of customers--decided to employ. "We can't do that and that's where we'd like to be."
CASE IN POINT
Staten Island University Hospital
When Staten Island University Hospital, a multi-site healthcare system in New York City, planned to grow its patient base, it built its network as a mission-critical component of the business plan. Managed care is driving the healthcare business in New York, where Patrick Carney, VP and CIO, estimates managed care to be a little more than one-third of the Staten Island system’s business. State mandated Medicaid managed care enrollment is expected to dramatically increase enrollment this year. "For IT, managed care is a wake-up call to administration for increased investment requirements," he says. "As an IT professional, it’s the best thing in the world."
Early in the planning, Carney and his team sought to incorporate short- and long-term IT plans into the strategic business plan. "You must build a reliable and scalable network that can support all emerging applications for the integrated delivery network (IDN) toolkit," says Carney, referring to master person index, central data repository/data warehouse, enterprisewide scheduling, imaging, multimedia and point-of-care applications. By identifying each application in terms of business and IT requirements, Carney was able to summarize his "dream list" of network requirements.
For example, from a business perspective, a master person index would serve as a unique identifier across the continuum but it would require high-volume transaction processing, high-speed connectivity to ensure realtime processing and a scalable network architecture. A central data repository to provide a longitudinal view of patients' clinical and financial data, outcomes research, clinical decision support and a common user interface to data originating in many different systems would need ubiquitous access, sophisticated security capabilities and also support for high transaction volume. In the end, Carney had identified a list of requirements that were common to all applications: scalability, ubiquitous access, sophisticated security, bandwidth, reliability and advanced network management, as well as a list that included such variables as speed, connectivity, bandwidth to support wireless, video and voice communication and quality of service.
At Staten Island, the great unknowns of future traffic--when and how much--led Carney to use scalable network technology to install an ATM backbone. Since they built the network to the business plan, he is now seeing that as the business plan progresses, the network is beginning to pay off. And he has learned some valuable lessons:
- The infrastructure is critical to the success of both business and IT plans and is the business enabler.
- The return on investing in the infrastructure is demonstrated in well-implemented applications.
- Never underestimate an application’s appetite for bandwidth.
- Build around the business needs, not around a product set.
Since beginning with imaging functionality in mid-1996, where he reports some teleradiology victories, Staten Island has added connectivity for the data repository. This year is expected to be a banner year as the repository goes live and IT extends functionality to its departments. Next year, the circle will broaden to enterprise-wide benefits.
Now, the network is not the issue for Carney--it’s the integration of the information. And integration--whether it’s total or simply aligning ancillary services with the core acute-care business--is proving the major challenge. "By definition, integration implies compromise," he says. Just getting people to compromise their piece of the organization for the benefit of the whole is a big problem, not to mention the problems associated with lack of standardization and business process changes necessary in the integrated environment.
He has flirted with the idea of installing an end-to-end solution to tie all of the processes together, something on the order of an SAP/healthcare, but thinks it’s still too risky: "It took 15 years for private industry (to get there)--it will take much longer than that for healthcare," he says. "It’s hard enough getting everybody to use the same lab system, much less tying all the processes together from supplier to customer." But he thinks somebody in healthcare will eventually do it… "and when they do, there’s enough inefficiency to be eliminated that the company will be unbeatable in the market."
The process of integration also exposes hospital revenue and cost containment gaps. While acknowledging a major problem with counterproductive reimbursement systems (where the more you spend the more