January 1, 1998
by Charlene Marietti

A robot helps a surgeon drill the precise cavity for a hip replacement… an internal medicine resident practices surgery using a virtual reality headpiece… a busy cardiologist keeps close tabs on her 75-year-old patient at home through a wireless wristwatch that sends vital signs to her palmtop computer. Twenty-first century medicine is closer than you think.

Yet, unlike some of the gadgets portrayed in science fiction, the futuristic ideas in progress today are not simply one mad scientist’s attempts to outdo another: such innovations are necessary if we are to discover best practices and lower the cost of care. As care delivery expands beyond the hospital to outpatient and home settings, the supporting technologies must likewise be mobile, flexible and scalable enough to bridge the chasm between the hospital, the clinic and the home.

Consider biosensors--devices that can read and transmit responses to physical or chemical changes. What if a small biosensor could be implanted in a person with, say, diabetes, to continuously manage this chronic ailment? That is precisely the goal of David Gough, PhD, professor of bioengineering at the University of California, San Diego (UCSD). One of many researchers seeking a means to control blood glucose levels in people with diabetes, Gough is working toward an implantable glucose sensor. Based on electrodes and immobilized enzymes, the implanted sensor would continuously monitor glucose levels--eliminating the need for finger-sticks and blood draws--and maintain a normalized blood glucose level by controlling an insulin pump. With adequate funding, Gough believes technology advances could make the device practically viable within the next few years.
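In engineering terms, the device Gough describes is a closed feedback loop: the sensor reading drives the pump's dosing. The Python sketch below illustrates that loop; the GlucoseSensor and InsulinPump classes, the target value and the proportional gain are all illustrative assumptions, not details of Gough's design.

```python
# Minimal closed-loop glucose control sketch (illustrative only).
# GlucoseSensor and InsulinPump are hypothetical stand-ins for the
# implanted electrode/enzyme sensor and pump described above.
import random
import time

TARGET_MG_DL = 100.0   # normalized blood glucose target (assumed)
GAIN = 0.01            # illustrative proportional gain (units per mg/dL)

class GlucoseSensor:
    """Stand-in for the implanted electrode/enzyme sensor."""
    def read_mg_dl(self) -> float:
        return random.gauss(140, 15)  # simulated glucose reading

class InsulinPump:
    """Stand-in for the implanted insulin pump."""
    def deliver(self, units: float) -> None:
        print(f"delivering {units:.2f} units of insulin")

def control_loop(sensor: GlucoseSensor, pump: InsulinPump, cycles: int = 5) -> None:
    """Read glucose, dose insulin in proportion to the excess, repeat."""
    for _ in range(cycles):
        glucose = sensor.read_mg_dl()
        error = glucose - TARGET_MG_DL
        if error > 0:                    # only dose when above target
            pump.deliver(GAIN * error)   # proportional dosing rule
        time.sleep(1)                    # sampling interval

control_loop(GlucoseSensor(), InsulinPump())
```

A real device would need a far more careful dosing model, but the structure--sense, compare against a target, actuate--is the essence of the continuous management described above.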

Real-time biosensors are potentially lucrative products for healthcare--and not just for monitoring patients. For people with diabetes, bypassing frequent blood draws and insulin injections would make the disease more manageable and would help reduce visits to the hospital or clinic. Another possible use is detecting wound infections through electronic nose, or E-nose, applications.

The transition of services outside hospital walls is no surprise, but few foresee the home as the center of healthcare services as clearly as Kenneth Kaplan, principal research scientist at the Massachusetts Institute of Technology in Cambridge, Mass. "The home will be the cockpit for healthcare delivery," he predicts. Home-focused care will be facilitated by the family of technologies that support telehealth and telemedicine--smart systems, imaging, telecommunications and biosensors among them. "It will require coordination to make it happen," he says, "but it will happen because consumers want it and will demand it."

As an outgrowth of his work with the U.S. Department of Defense and the Defense Advanced Research Projects Agency, Kaplan and others have formed the National Healthcare Project to re-engineer the delivery of healthcare. The cross-disciplinary team, including leaders from private and government sectors, will focus on developing the prototype for a consumer-centric healthcare system. (For more on the National Healthcare Project, see interview on p. 87).

The computer-based surgeon
The home may become the site of primary and secondary care, but access to high-tech facilities and specialists, particularly for surgery, is unlikely to fade. New ways to generate and use electronic data for diagnostic tools and surgical procedures are emerging, promising more precise patient assessment, more accurate diagnoses and better therapeutic interventions. Multidisciplinary technologies such as scientific visualization--which uses multimedia and animation--and robotics promise to refine the art of surgery. Also called visual data analysis or SciVis, scientific visualization provides models of data that are graphical rather than numerical.
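To make the distinction concrete, the short Python sketch below renders a slice of a simulated 3D scan volume as an image rather than as a table of numbers--the essence of the "graphical rather than numerical" approach. The synthetic volume is an illustrative stand-in for real patient imaging data.

```python
# Sketch of scientific visualization: show a slice of a 3D data volume
# as a picture the eye can interpret at a glance, not as raw numbers.
import numpy as np
import matplotlib.pyplot as plt

# Simulated 64x64x64 scan volume: a bright sphere embedded in noise.
z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = (x**2 + y**2 + z**2 < 0.4).astype(float)
volume += 0.1 * np.random.default_rng(0).standard_normal(volume.shape)

# Display a single axial slice as a grayscale image.
plt.imshow(volume[32], cmap="gray")
plt.title("Axial slice of simulated volume")
plt.axis("off")
plt.show()
```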

Augmented reality is the term preferred by Henry Fuchs, PhD, to describe his research goal of interdisciplinary scientific visualization that will enable the clinician to "look inside" the patient. Augmented reality differs from virtual reality, explains Fuchs, professor of computer science and radiation oncology at the University of North Carolina at Chapel Hill. In addition to virtual images, augmented reality incorporates actual images of the environment surrounding the user.

Fuchs has spent the last 20 years working on head-mounted displays and medical imaging, and he remains enthusiastic about the potential of these technologies. His display unit includes a tiny camera that projects a synchronized view of multiple electronic data images onto a small display screen--visible to the surgeon, but occupying only part of the field of vision. The surgeon retains a nearly full range of eye movement in his surroundings while viewing 3D images of internal body parts built from patient data. "It will be another 10 to 15 years before the technology is appropriately advanced for widespread adoption," Fuchs says.
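In software terms, the compositing Fuchs describes amounts to blending a rendered view of internal anatomy into one region of the surgeon's field of view while leaving the rest untouched. The following Python sketch illustrates that idea with synthetic arrays standing in for the camera feed and the patient-data rendering; the function name, dimensions and blend factor are illustrative assumptions, not details of Fuchs's system.

```python
# Rough sketch of augmented-reality compositing: alpha-blend a rendered
# "internal anatomy" image into one corner of the camera frame, leaving
# the rest of the view untouched. Arrays stand in for real video feeds.
import numpy as np

def composite(camera_frame: np.ndarray, rendered: np.ndarray,
              top: int, left: int, alpha: float = 0.6) -> np.ndarray:
    """Blend `rendered` into `camera_frame` at position (top, left)."""
    out = camera_frame.astype(float).copy()
    h, w = rendered.shape[:2]
    region = out[top:top + h, left:left + w]
    out[top:top + h, left:left + w] = alpha * rendered + (1 - alpha) * region
    return out.astype(camera_frame.dtype)

camera = np.full((480, 640, 3), 128, dtype=np.uint8)   # stand-in camera frame
anatomy = np.zeros((120, 160, 3), dtype=np.uint8)      # stand-in 3D rendering
anatomy[..., 0] = 255                                  # tint overlay for visibility

# The overlay occupies only a corner, preserving most of the surgeon's view.
frame = composite(camera, anatomy, top=40, left=440)
print(frame.shape, frame.dtype)
```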

"Imaging and image guidance will become closely integrated into the way surgery is done," says Ron Kikinis, MD, director of the Surgical Planning Laboratory at Brigham and Women’s Hospital, Boston. As a result of his scientific visualization project combining computer science and medical resonance imaging radiology disciplines at the laboratory, he expects these capabilities to become available to non-research facilities within five years.

Surgeons may be able to better visualize the patient internally in the next few years, but surgical tools and dependence on the individual surgeon’s skills have changed little over the years. Often compared with skilled craftsmen, surgeons must rely on good eye-hand coordination and sterile, but otherwise rather crude tools, particularly for orthopedics. Robotics offers great promise on both fronts.