In the early days of healthcare computing, all software applications ran in a mainframe environment in which dumb terminals were connected directly to mainframes for accessing data, processing patient financial transactions and other needs. But this type of environment offered end users a rather limited interface. Terminal applications restricted users to doing one thing at a time. And these terminals lacked color, earning them the nickname "green screens," thanks to the monitors’ monochrome green glow.
Some computer experts predicted that the rise of client/server spelled the end of the mainframe. But the mainframe is not going to go away anytime soon, thanks to its robustness, scalability, ease of management and ability to be integrated into multi-tiered client/server systems.
Bigger is better
"There is a lot of investment in the System 390 in the healthcare industry today," says Peter McCaffrey, healthcare program manager for the IBM System 390. "The ideal scenario for most of these customers is to leverage that investment as opposed to throwing it away and starting over. Healthcare has been one of the System 390’s fastest growing industries. A lot of this has to do with consolidation."
It used to be that the cost of mainframe computers was significantly higher per unit of processing power because mainframes relied on special semiconductors that were faster, but consumed considerable power and thus required extensive cooling in order to operate. But about five years ago, IBM began using the same complementary metal-oxide semiconductor (CMOS) technology used in PCs, eliminating the need for the extensive cooling and glass houses required in the old days. McCaffrey says that this shift in technology has enabled IBM to drop the price per unit of computing power by 35 percent per year and to shrink mainframes from the size of a building to a box the size of a refrigerator.
But the real strength of the mainframe is its scalability and robustness, which will be needed to meet the transaction requirements of managed care organizations in the future. GartnerGroup, Stamford, Conn., estimates that computer transactions will explode as healthcare enters the next phase of IT systems reengineering. Analysts predict that the number of business transactions for the average 1 million-member system will rise from 124 million to 190 million transactions per year. More significantly, the number of sub-transactions, such as queries to labs, hospitals and medical record systems, will grow from 414 million in 1997 to 1.5 billion in 2001.
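The figures above imply steep compound annual growth. A small sketch, using the standard compound-growth formula and the article's numbers (assuming the business-transaction figures span the same 1997-2001 period as the sub-transaction figures):

```python
# Implied compound annual growth rates for the transaction volumes cited above.
# Figures come from the article; the four-year span is an assumption for the
# business-transaction numbers, which the article does not date explicitly.

def cagr(start, end, years):
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

business = cagr(124e6, 190e6, 4)   # business transactions per year
subtrans = cagr(414e6, 1.5e9, 4)   # sub-transactions (lab/hospital/record queries)

print(f"business transactions: {business:.1%} per year")
print(f"sub-transactions:      {subtrans:.1%} per year")
```

The sub-transaction growth works out to well over a third per year, which is why analysts framed the load in terms of raw transaction-processing capacity.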
While in theory UNIX and PC servers could be clustered together to manage this level of transactions, it would require a large number of independent boxes, which raises management concerns. "If you just look at the surface, the Intel PC environment looks cheaper, but customers are finding you need a lot more of them and the cost of managing them is prohibitive," McCaffrey notes. "There is a business value in consolidating back to a mainframe server. You can incrementally grow the processor and the speed of those processors is doubling every 18 months. In the old days, it used to be cost prohibitive to scale up because if you wanted to upgrade your environment you had to do it in large chunks. With the new technology you can incrementally grow your environment."
The client may be right
The advent of the graphical user interface enabled application developers to create more sophisticated interfaces to their programs. Healthcare software vendors such as Erisco, New York City, began developing two-tiered client/server applications in which a PC client could access data on a central server using these interfaces.
But two-tiered client/server applications ignored the mainframes that were the workhorses of the healthcare industry. This strategy limited the usefulness of client/server because of the limited scalability of servers and the extensive cost involved in migrating mainframe applications written in COBOL to UNIX or Windows NT applications written in C and C++. To address these needs, many developers began creating multi-tiered architectures that use middleware servers for accessing data wherever it resides--whether on mainframes, mini-computers, UNIX boxes or Windows NT servers.
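The essence of the multi-tier approach described above is that the middleware layer hides where each record actually lives. A minimal sketch of the pattern, with all class and source names being hypothetical illustrations rather than any vendor's actual API:

```python
# Sketch of a middleware tier fronting heterogeneous back ends. Clients talk
# only to the middleware; the mainframe vs. UNIX distinction is invisible.

class DataSource:
    """Common interface that every back-end adapter implements."""
    def fetch(self, record_id):
        raise NotImplementedError

class MainframeSource(DataSource):
    def fetch(self, record_id):
        # In practice this would talk to the host, e.g. through a gateway.
        return {"id": record_id, "origin": "mainframe"}

class UnixSource(DataSource):
    def fetch(self, record_id):
        return {"id": record_id, "origin": "unix"}

class Middleware:
    """Routes each client request to whichever source owns the data."""
    def __init__(self):
        self.routes = {}

    def register(self, prefix, source):
        self.routes[prefix] = source

    def fetch(self, record_id):
        prefix = record_id.split("-")[0]
        return self.routes[prefix].fetch(record_id)

mw = Middleware()
mw.register("claims", MainframeSource())   # legacy COBOL system stays in place
mw.register("labs", UnixSource())          # newer departmental system

print(mw.fetch("claims-00123"))  # the client neither knows nor cares where this lives
```

This is why the approach sidesteps the COBOL-to-C migration cost: the legacy system keeps running unchanged behind its adapter.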
Larry Justice, systems network administrator for Sumter Regional Hospital, Americus, Ga., points out that his organization deploys all new applications on client/server architectures because they are easier to manage and change. But the hospital also has a number of legacy applications running on a mainframe that it does not make sense to throw away just yet because of the costs and risks involved in migration.
Greg Cornellier, systems director at Harvard Pilgrim Healthcare, Hooksett, N.H., advocates building a front end using some of the newer languages used in writing client/server applications, rather than rebuilding mainframe applications. "It takes three times longer to code something in COBOL vs. something like Sybase’s PowerBuilder on the front end side," he says. Harvard Pilgrim is New England’s largest non-profit managed care organization with 20,000 physicians and 140 affiliated hospitals.
The most cost-effective client/server strategy involves integrating information from different sources so that it can be used and managed more efficiently and cost effectively. "I have always been a believer that there is room for lots of different technologies," says Tom Borger, CEO of Health Systems Technologies, Seattle, which specializes in multi-tiered client/server systems for managed care. "There is really nothing that answers everyone’s problems all the time. We are an augmentation and not a replacement strategy. We fill in gaps for providers, and [to do so] the ability to interconnect and exchange information is important. Typically that data does not exist in an integrated form. Our approach gives caretakers the ability to use real-time data to make decisions."
One more benefit of migrating to client/servers that operate independently of mainframes is that client/server systems provide an easy solution to the year 2000 problem. "There are less year 2000 issues with client/server because everyone is building new applications with that in mind," notes Debbie Hartman, healthcare technical manager at Sybase, Burlington, Mass. "A lot of the people that built applications on the mainframe are still trying to make sure that their software can handle the year 2000."
Multi-tiered client/server technology is maturing to the point that it can be used to create robust, mission critical applications. "We have been doing two-tiered client/server over the last 10 years, but only over the last four have we figured out how to do it well," Cornellier says. "We are just beginning to understand three-tiered client/server architectures because they are three to five years old. We are starting to see success stories and are seeing how that works over time."
Despite the rise of PC-based client/server applications, the fact remains that there are close to 40 million mainframe and host terminals in managed care organizations and hospitals across the United States that lack the sophisticated interface users desire. To address this need, a number of manufacturers such as Wyse, San Jose, Calif.; NCD, Mountain View, Calif.; and Neoware, King of Prussia, Pa., have developed thin client terminals that can display applications running on a variety of hosts, including mainframes, UNIX, Windows and Web servers.
When Chip Childress, director of IT at the Holston Medical Group, Kingsport, Tenn., began planning a PC expansion in early 1996, he took a long look at using thin client technology because of the reduced deployment cost. At the time, Childress estimates, it would have cost close to $3,000 per machine to meet the client/server software specifications of vendors requiring a Pentium-100 processor and at least 24 megabytes of RAM. A fully-equipped Wyse Winterm device costs about $1,500.
But Childress discovered that the initial deployment savings were only the tip of the iceberg. He estimates that he has saved far more in terms of management time and the increased productivity of workers. Holston now has about 175 Winterms, 25 PCs and 20 mainframe terminals for delivering applications to end users. The low cost of the Winterms has enabled Childress to place one in each examining room so that physicians and nurses do not have to take paper notes and then transcribe them into a computer.
The Winterms can access all Windows applications through a Citrix-enhanced Windows NT server, which allows Childress to centrally manage and deploy all software applications. Whenever Childress needs to make a change, he only has to do it once, instead of updating each machine. The Winterms allow Childress and his staff to remotely monitor users when a problem occurs. In the PC days, Childress would have to travel up to 26 miles to a remote site to solve a problem, delaying the work and costing him valuable time. "Now you don’t have to spend 30 minutes driving out to a site to see that someone is pressing the wrong key," he says.
Holston has a number of "floaters"--people who move from facility to facility. With PCs, they had to get familiar with a different desktop at each location. With the thin client approach, the desktop is stored on the server, so floaters see the same layout regardless of where they log in, reducing training needs and increasing productivity.
As managed care organizations grow, they tend to take on a wide variety of different types of servers and application environments, which in general can all be accessed by a thin client. For example, the Holston Medical Group has 320 employees and physicians spread across 13 facilities in the Kingsport, Tenn., area. It has a wide variety of systems that need to be accessed from any server in the enterprise, including an IBM RS/6000 server, a Digital Alpha server running UNIX, and a number of PC applications. Clerks, physicians, nurses, accountants, lab technicians and others use these systems.
Thinking ’total cost of ownership’
Sumter Regional’s Justice has begun deploying a number of Neoware thin clients throughout his facility to replace traditional mainframe terminals and run Windows applications on a gateway server. "The reason we decided to go with thin clients is that the total cost of ownership on PCs just kills us," he explains. "PCs are a pain setting up and maintaining; hard drives keep crashing; and it is difficult to do backups constantly. Users are always playing with the settings, which cause other problems down the road."
Barnes Jewish Community Hospital, St. Louis, has launched one of the largest deployments of thin clients in a managed care environment to date. Administrators are installing 1,500 NCD Exploras in hospital rooms to allow doctors to access EMTEK, a UNIX-based charting application, via an X Window System interface--a terminal emulation environment that gives UNIX hosts a GUI. "The original plan was to put a small UNIX workstation next to each bed, but that is as problematic as putting a PC everywhere," says Colin Melville, an IS engineer at Barnes. "You have a hard drive to maintain and it requires lots of support. That is a pretty expensive bedside terminal."
As managed care organizations merge or acquire one another in order to achieve economies of scale, the standards-based interfaces of thin clients can make it easier to integrate the information technology assets of these organizations. "The whole BJC Health System is going through a merger," Melville notes. "Deinstalling old systems and merging is going to be a pretty long process."
Another architecture for deploying applications in the managed care environment is through a Web interface built on HTTP and HTML. Virtually all host and server platforms, including IBM’s System 390, UNIX and Windows NT servers, support some form of Web server today.
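The Web-front-end pattern amounts to a thin HTTP layer that renders host data as HTML so any browser can reach it. A minimal sketch, in which the `host_lookup` function is a hypothetical stand-in for a query against a real mainframe or UNIX back end:

```python
# A minimal WSGI application illustrating the web-front-end pattern:
# HTTP request in, host data rendered as HTML out.

from wsgiref.simple_server import make_server  # only needed to actually serve

def host_lookup(member_id):
    # Hypothetical stand-in for a query against the legacy system of record.
    return {"member": member_id, "plan": "HMO"}

def app(environ, start_response):
    """WSGI entry point: the path names the member to look up."""
    member_id = environ.get("PATH_INFO", "/").lstrip("/") or "unknown"
    record = host_lookup(member_id)
    body = (
        "<html><body>"
        f"<p>Member {record['member']}: plan {record['plan']}</p>"
        "</body></html>"
    )
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body.encode("utf-8")]

# To serve for real: make_server("", 8000, app).serve_forever()
```

Because the interface is just HTTP and HTML, the same application reaches browsers on PCs, thin clients or anything else on the network, which is exactly the reach McCaffrey describes below.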
"People need to access applications from many locations and the Web environment enables them to do just that," explains IBM’s McCaffrey. "Now that you have exposed those applications to many people, it is a very unpredictable environment and you need a computing environment that can handle it."
Opening applications to the Internet can create unpredictable results, requiring the need to invest in new equipment and networks to handle the load. "People looking at a thin client, intranet or extranet need to look at their networking infrastructure and need to be prepared for unpredictable accesses," says Sybase’s Hartman. "If a managed care organization is building customer service applications for the Web, they are not necessarily going to know when their peak times are going to be, whereas in a mainframe environment it is much more controlled."
But the beauty of the Internet is that it allows widely distributed applications. For example, the New Zealand Ministry of Health is building a nationwide intranet to exchange information among 50,000 hospitals, general practitioners and labs, using the Internet and Java as a front end.
Today’s computer platforms are not about mainframe vs. client/server, but about coexistence and how IT managers can help these different models work together. "Organizations grow for two reasons," Cornellier says. "One is that they just get old, and the other is through acquisitions. In recognizing that, the coexistence strategy is what you need to come up with. Converting one system to another is really expensive and can be risky to the organization. The strategy of coexistence is really an integration strategy about how you can make those legacy systems last longer."
George Lawton is a technology writer in the San Francisco Bay area.