
Imagine

January 1, 1998
by Charlene Marietti

A robot helps a surgeon drill the precise cavity for a hip replacement… an internal medicine resident practices surgery using a virtual reality headpiece… a busy cardiologist keeps close tabs on her 75-year-old patient at home through a wireless wristwatch that sends vital signs to her palmtop computer. Twenty-first century medicine is closer than you think.

Yet, unlike some of the gadgets portrayed in science fiction, the futuristic ideas in progress today are not simply one mad scientist’s attempt to outdo another: such innovations are necessary if we are to discover best practices and lower the cost of care. As the delivery process expands beyond the hospital to outpatient and home care settings, the technologies must likewise be mobile, flexible and scalable enough to bridge the chasm between the hospital, the clinic and the home.

Consider biosensors--devices that can read and transmit responses to physical or chemical changes. What if a small biosensor device could be implanted in a person with, say, diabetes, to continuously manage this chronic ailment? That is precisely the goal of David Gough, PhD, professor of bioengineering at the University of California, San Diego (UCSD). One of many researchers seeking a means to control blood glucose levels in people with diabetes, Gough is working toward an implantable glucose sensor. Based on electrodes and immobilized enzymes, the implanted sensor would continuously monitor glucose levels--eliminating the need for finger-sticks and blood draws--and maintain a normalized blood glucose level by controlling an insulin pump. With adequate funding, Gough believes, advances in technology could make the device practical within the next few years.
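The control loop Gough describes--sensor reading in, pump command out--can be caricatured in a few lines. This is a hypothetical illustration only: the target level, gain and proportional dosing rule below are invented for the example, not a clinical algorithm.

```python
# Hypothetical sketch of a sensor-driven insulin loop. The target level,
# gain and dosing rule are invented for illustration; they are not clinical.
TARGET_MG_DL = 100.0  # illustrative target blood glucose, mg/dL

def insulin_dose(glucose_mg_dl, gain=0.05):
    """Proportional rule: dose only when the sensor reads above target."""
    error = glucose_mg_dl - TARGET_MG_DL
    return max(0.0, gain * error)

def run_loop(readings):
    """Map a stream of continuous sensor readings to pump commands."""
    return [insulin_dose(r) for r in readings]

doses = run_loop([180.0, 150.0, 100.0, 90.0])  # doses taper off as glucose normalizes
```

The point of the sketch is the architecture, not the arithmetic: the sensor replaces the finger-stick, and the loop replaces the manual injection decision.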

Real-time biosensors are potentially lucrative products for healthcare--and not just for monitoring patients. For diabetes patients, bypassing frequent blood draws and insulin injections would make the disease more manageable, and would help reduce visits to the hospital or clinic. Another possible use is to help detect wound infections through electronic nose, or E-nose, applications.

The transition of services outside hospital walls is no surprise, but few foresee the home as the center of healthcare services as clearly as Kenneth Kaplan, principal research scientist at the Massachusetts Institute of Technology in Cambridge, Mass. "The home will be the cockpit for healthcare delivery," he predicts. Home-focused care will be facilitated by the family of technologies that support telehealth and telemedicine--smart systems, imaging, telecommunications and biosensors among them. "It will require coordination to make it happen," he says, "but it will happen because consumers want it and will demand it."

As an outgrowth of his work with the U.S. Department of Defense and the Defense Advanced Research Projects Agency, Kaplan and others have formed the National Healthcare Project to re-engineer the delivery of healthcare. The cross-disciplinary team, including leaders from private and government sectors, will focus on developing the prototype for a consumer-centric healthcare system. (For more on the National Healthcare Project, see interview on p. 87).

The computer-based surgeon
The home may become the site of primary and secondary care, but access to high-tech facilities and specialists, particularly for surgery, is unlikely to fade. New ways to generate and use electronic data for diagnostic tools and surgical procedures are emerging, promising more precise patient assessment, more accurate diagnoses and better therapeutic interventions. Multidisciplinary technologies such as scientific visualization--which uses multimedia and animation--and robotics promise to refine the art of surgery. Also called visual data analysis and SciVis, scientific visualization presents models of data graphically rather than numerically.

Augmented reality is the term preferred by Henry Fuchs, PhD, to describe his research goal of interdisciplinary scientific visualization that will enable the clinician to "look inside" the patient. Augmented reality is different from virtual reality, explains Fuchs, professor of computer science and radiation oncology at the University of North Carolina at Chapel Hill. In addition to virtual images, augmented reality incorporates actual images of the environment surrounding the user.

Fuchs has spent the last 20 years working on head-mounted displays and medical imaging and is still enthusiastic about the potential of these technologies. His display unit includes a tiny camera that projects a synchronized view of multiple electronic data images onto a small display screen, visible to the surgeon but occupying only part of the field of vision. The surgeon retains a nearly full range of eye movement in his surroundings while viewing 3D images of internal body parts built from patient data. "It will be another 10 to 15 years before the technology is appropriately advanced for widespread adoption," Fuchs says.
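The partial-field overlay Fuchs describes can be sketched as a compositing step: virtual pixels are blended into only a window of the camera image, so the rest of the surgeon’s view passes through untouched. A toy sketch, with all names and the alpha-blend rule chosen purely for illustration:

```python
# Toy augmented-reality compositing: blend virtual pixels into only a
# window of the camera row, leaving the rest of the field of view intact.
def composite(camera_row, overlay_row, start, alpha=0.5):
    """Alpha-blend overlay_row into camera_row beginning at index start."""
    out = list(camera_row)
    for i, v in enumerate(overlay_row):
        out[start + i] = (1 - alpha) * out[start + i] + alpha * v
    return out

# Only the middle two pixels receive the virtual image; the edges stay real.
row = composite([0, 0, 0, 0], [100, 100], start=1)
```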

"Imaging and image guidance will become closely integrated into the way surgery is done," says Ron Kikinis, MD, director of the Surgical Planning Laboratory at Brigham and Women’s Hospital, Boston. As a result of his scientific visualization project at the laboratory, which combines the disciplines of computer science and magnetic resonance imaging radiology, he expects these capabilities to become available to non-research facilities within five years.

Surgeons may be able to better visualize the patient internally in the next few years, but surgical tools and dependence on the individual surgeon’s skills have changed little over the years. Often compared with skilled craftsmen, surgeons must rely on good eye-hand coordination and sterile, but otherwise rather crude tools, particularly for orthopedics. Robotics offers great promise on both fronts.

"The computer revolutionized manufacturing," says Kevin Dowling, PhD candidate, Carnegie Mellon University, Pittsburgh. "Now it will revolutionize surgery by creating tools that will provide better patient outcomes." Robots can provide decision support before surgery, facilitate less-invasive surgery, provide surgical tools customized to the patient and help guide the surgeon’s hand during the procedure--all key factors in shortening post-surgery recovery times for patients.

Surgeon empowerment through robotic tools has been the focus for researchers at the Robotics Institute at Carnegie Mellon, an active center for robot development and the birthplace of the RxOBOT for the pharmacy (See Healthcare Informatics, Nov. 1997, p. 37). The institute’s current project, HipNav, has developed robotic tools for hip surgery at the socket site. The surgeon remains fully in charge of the procedure, stresses senior research engineer Mike Blackwell. "Robotics is not mature enough to allow a robot to fully perform surgery," he says. HipNav is now undergoing clinical trials in the operating room.

A new and controversial use of surgical robots for hip-replacement surgery is the American-grown and European-tested ROBODOC. Originally developed at the University of California, Davis, the computer-controlled, image-directed robot is now developed, manufactured and marketed by Integrated Surgical Systems, Inc. of Sacramento, Calif. It is commercially available in Europe, but not in the U.S.--at least not yet. (The company is awaiting FDA approval.) After the surgeon has exposed the patient’s thigh bone, the robotic device automatically drills into the bone and shapes a cavity for the implant. More than 1,500 patients in Europe have undergone the automated surgical procedure.

Although medical robotic research has primarily focused on orthopedics, robotics may also be used for I.V. medication delivery, noninvasive surgery and endoscopy. Dowling envisions using robotics in assisted care to help the elderly and disabled develop daily life skills and become more independent.

The communications explosion
The means to link together these increasingly sophisticated medical departments, offices, facilities and other points of care is overwhelmingly dependent upon how quickly the telecommunications industry--now in the race to consolidate and ramp up new services--can deliver affordable access to advanced technology. Deregulation of the U.S. telecommunications industry has opened unequaled opportunity to transform the way we will communicate in the near future.

Two developments promise to loosen the current bottleneck at the local level: new wireless applications and telephony services across utility lines. Major telecommunication companies will offer a host of business services for communications throughout and beyond the enterprise, including Internet access. While the technology is now available, the struggle for the telcos will be putting in place the support system required to provide those services, according to Michael Smith, senior analyst at Probe Research, a telecommunications consulting company in Cedar Knolls, N.J.

Take, for example, remote-monitoring digiphones. Now in the first stages of deployment in Japan, these watch-size cellular devices use digital signals to monitor patient locations, register pulse rates and, if necessary, generate real-time emergency data transmissions to a care center. Such devices could be a boon for home healthcare, yet digiphones are not available in this country because the U.S. lacks the telecommunications infrastructure to support the digital technology on which they operate, according to Bill Hartwell, manager of sales and marketing for software developer Etsee Soft, Inc. of Pennington, N.J.
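The alert logic behind such a device is simple to sketch: transmit only the readings that fall outside a normal range. The pulse range and function names below are illustrative assumptions, not the digiphone’s actual behavior.

```python
# Illustrative alert logic for a wrist-worn monitor: readings outside an
# assumed normal pulse range trigger a real-time emergency transmission.
NORMAL_PULSE_BPM = range(50, 121)  # assumed non-emergency range, inclusive of 50-120

def should_transmit(pulse_bpm):
    """True when a reading warrants sending data to the care center."""
    return pulse_bpm not in NORMAL_PULSE_BPM

def emergencies(timestamped_readings):
    """Filter (time, pulse) pairs down to those needing transmission."""
    return [(t, p) for t, p in timestamped_readings if should_transmit(p)]
```

Filtering on the device rather than streaming every reading is what makes a watch-size cellular budget plausible: only exceptions cross the network.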

There is hope. As the wireless voice market becomes saturated and telecommunication providers seek new markets, digiphones and other cutting-edge devices may become available in the United States, according to Scott Midkiff, PhD, researcher and associate professor at Virginia Polytechnic Institute and State University, Blacksburg, Va. But, he warns: "It will take more than an infusion of new hardware. It also will take new software, applications and work processes to fully leverage the benefits of wireless."

Advances in protocols, better understanding of more efficient uses for bandwidth, and development of new bands supporting higher frequency devices will follow market demand. Midkiff predicts decreasing costs and increasing data rates from telecommunication providers--good news for all data seekers; and improved wireless displays--good news for image-intensive healthcare applications.

Home-based care may also benefit from a new technology--announced by Ontario, Canada-based telecommunications giant Nortel and British firm Norweb--that allows electricity companies to run communications networks over standard electrical power lines. This breakthrough promises high-speed, low-cost access to the Internet and other data communication services for nearly all businesses and homes. Coupled with the university-based Internet2, one of several projects developing the infrastructure and applications for the next generation of the Internet, this will provide the sound, voice and data communication channels necessary for sophisticated telehealth and telemedicine applications that can serve patients in any location.

People-smart computers
Bringing technology to the masses must entail a departure from the cumbersome methods of creating and using software that mark the history of computerization. Creating "human-serving technologies" based on how people design, implement and use computer systems is a goal at the Human-Computer Interaction Institute in the School of Computer Science at Carnegie Mellon University. Research there has centered upon making software easier to develop and customize.

One focus is on simplifying scripting and programming languages. A fair amount of research has been completed that analyzes how people learn to program, but the information has never been used to design a new programming language, according to Brad Myers, PhD, a researcher at the institute. The Natural Programming Project correlates the thought paths people use in problem solving with computer system functionality. How people naturally think and talk about the programs they would like to write will be used to develop a language that is more aligned with human thought processes.

There are many domains, including healthcare, where, given the right tools, computer-proficient users could modify their own systems and make them more effective, Myers says. The User Interface Software Project provides tools to simplify the design and implementation of interactive software. He is working on mechanisms that would allow users with few programming skills to customize applications--difficult to do with some of today’s popular languages like Java. Work is under way on alternative tools to customize Web applications that do not require the power of Java.

If computers are going to simulate humans, speech recognition will be an absolute necessity. Although voice recognition technology is maturing rapidly, interactive speech functionality is the ultimate goal, says William Meisel, PhD, president of consulting group TMA Associates, Tarzana, Calif., and editor of the industry newsletter, Speech Recognition Update. And not only must it be interactive, it must be portable. Portable prototypes are just beginning to come to market, but they are cumbersome and expensive. He expects both prices and sizes to shrink in the next few years. Add Microsoft’s recent commitment in the form of a $45 million investment in Belgium-based speech company, Lernout & Hauspie, and speech-enabled Windows may be in the box in the next few years.

However, as MIT’s Kaplan points out, "Managers need to better organize themselves to benefit from emerging healthcare delivery and technology trends." New technologies must be met by an industry ready and able to manage the new industrial processes. Rather than taking the chronic short-term view, healthcare leaders must plan for the next five to 10 years with a long, hard look at the whats, the whos and the hows. What would we like to see happen? Who are the consumers we want to reach? And how can these technologies help us accomplish these goals? Ultimately, new technologies won’t change healthcare--people will.


Emerging Tech Web Sites

American Assoc. for Artificial Intelligence
http://www.aaai.org

Biosensor Research Group
University of California, San Diego
http://www-bioeng.ucsd.edu/research_groups/biosens/public_html/index.html

Etsee Soft, Inc., Pennington, N.J.
http://www.etseesoft.com/

GTE Telemedicine Virtual Tradeshow
http://www.gte.com/Biz/Tradeshow/Telemedi/telemedi.html

Integrated Surgical Systems, Inc., Sacramento, Calif.
http://www.robodoc.com

NSF Science and Technology Center for Computer Graphics and Scientific Visualization
University of North Carolina, Chapel Hill, N.C.
http://www.cs.unc.edu/Research/stc

Telemedicine Research Laboratory
U.S. Army Medical Research and Materiel Command, Frederick, Md.
http://www.matmo.org

The Bradley Department of Electrical and Computer Engineering
Virginia Polytechnic Institute and State University
http://www.ee.vt.edu/ee

The Human Computer Interaction Institute
Carnegie Mellon University, Pittsburgh, Pa.
http://www.cs.cmu.edu/afs/cs.cmu.edu/user/hcii/www/hcii-home.html

The Robotics Institute
Carnegie Mellon University, Pittsburgh, Pa.
http://www.ri.cmu.edu

The Surgical Planning Lab
Brigham and Women’s Hospital, Boston, Mass.
http://splweb.bwh.harvard.edu:8000

Xerox Palo Alto Research Center
Palo Alto, Calif.
http://www.parc.xerox.com/parc-go.html

dpiX, A Xerox New Enterprise Company
Palo Alto, Calif.
http://www.dpix.com


Charlene Marietti is senior technology writer at Healthcare Informatics.


