Faced with skyrocketing imaging data storage costs, CIOs at three healthcare facilities found alternatives that have enabled them to slash expenditures, monitor their storage requirements, and provide more cost-effective backup solutions.
Brad Harrison, director of IT at 450-bed Regional Center of Medicine in Memphis, Tenn., was running out of space to store an exploding volume of diagnostic images. “It was almost a daily struggle,” Harrison says. “We would be at capacity, and we wouldn't be able to pull off any images because our system had to have free space to do that. So we had a recurring issue of space.”
Cost was also a problem. Regional Center of Medicine is the principal trauma center for Mississippi and Arkansas, as well as southwestern Tennessee. As a safety net hospital with a high proportion of self- and public-pay patients, it typically operates in the red, losing anywhere from $5 million to $15 million a year.
So Harrison was looking for a way to accommodate growing image storage needs without breaking the bank. “We were paying about $50,000 for 4 terabytes of space, and 4 terabytes would last maybe six months. The amount of money it took to run an onsite and an offsite data center and still have some growth and capacity was astronomical. We were paying in maintenance and circuits about $250,000 to $300,000 a year,” he adds.
Harrison slashed annual image storage expenditures to $85,000 by eliminating the two data centers and reducing the manpower to support them. An IT staff of 20 supports the 1.2-million-square-foot facility and its 2,000-plus computers, in addition to a LAN that connects to multiple other sites. “It was important that we were able to leverage a third party to take a big portion of the workload,” Harrison says.
A single cloud storage option for backup and archiving gave Harrison the chance to pay for storage as he went, instead of incurring upfront capital expenditures for long-term storage. It also freed up storage bandwidth and physical space.
In fact, Harrison is no longer worried about having enough space for diagnostic images. “There will never be an instance where we will have to pull a study out of our image library to free up space. There will never be a time when we will have a storage concern,” he says.
SIMPLE STORAGE MONITORING
Flexibility was a principal concern for Alan Howard, director of IT at Princeton Radiology, a full-service diagnostic imaging practice that handles clinical data for its own 35-physician practice, as well as for six outside imaging centers in central New Jersey.
When Princeton Radiology transitioned from an analog environment to a digital architecture four years ago, Howard wanted to be sure he could accommodate the fluctuating image data storage demands that come with advancements in technology and patient volumes. “Changes in the technology of imaging modalities are making the information more dense and more comprehensive. Multi-slice imagers are taking more slices. Pixel matrices are getting larger. So we expect not only that our patient volume will always be growing, but the amount of data we have to store per patient is also growing all the time,” Howard says.
The trick was to figure out how much space to provide and when to ramp up storage capacity. “We always try to predict how fast imaging data is going to expand, but it always surprises us,” Howard says.
Because Princeton Radiology does not have in-house storage architecture expertise, Howard assumed he would have to bring in outside professional services to help with storage planning. “We were originally at a point where we didn't know what we would need, which would have meant bringing in professional services probably multiple times,” he says.
A storage area network (SAN) gave Princeton Radiology simple tools for monitoring storage capacity and disk utilization in real time so it could track the pace of change in imaging data storage requirements. “We can provision what we think we're going to need ahead of time, and if we're wrong and we run out of space faster, it's a matter of a few extra disks, rather than professional services coming in to make a lot of modifications,” Howard explains.
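The monitoring Howard describes boils down to watching utilization on each volume and flagging the ones approaching a provisioning threshold. As a minimal sketch of that idea in Python using the standard library's `shutil.disk_usage` — the 80 percent threshold and the volume paths are illustrative assumptions, not details from the article:

```python
import shutil

def utilization(path):
    """Return the fraction of a volume's capacity currently in use."""
    usage = shutil.disk_usage(path)
    return usage.used / usage.total

def volumes_needing_disks(paths, threshold=0.80):
    """List volumes whose utilization has crossed the provisioning threshold.

    The 80% default is an assumed rule of thumb, not a figure from the article.
    """
    return [p for p in paths if utilization(p) >= threshold]

if __name__ == "__main__":
    # Paths are placeholders; a real deployment would list the SAN volumes.
    for vol in volumes_needing_disks(["/"]):
        print(f"{vol}: {utilization(vol):.0%} used - consider adding disks")
```

The point of tracking utilization continuously, as Howard notes, is that a wrong forecast costs only “a few extra disks” rather than a consulting engagement.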
Princeton Radiology consequently can buy extra space when the price is right. “If we had to plan maybe two years of growth with our initial purchase, we would have been wrong in the first place, and we would have bought extra technology at a time when it would have been more expensive,” Howard says. “Because the capacity of the physical drives and the cost are going down, the actual cost of a terabyte drive was less than the cost of a 300-gigabyte drive. We are able to buy at the best pricing on demand.”
CUTTING BACKUP COSTS
Six-year-old Kansas Spine Hospital in Wichita had disk space issues almost from day one because its picture archiving and communication system (PACS) application had a relatively small image cache running on spinning disk and a DVD jukebox archive.
So when Mike Knocke joined the 38-bed hospital as CIO three years ago, the first thing he wanted to do was eliminate the DVD jukebox. “Moving to a disk-based backup solution was kind of a Holy Grail,” he recalls. But software-based versions of a spinning-disk backup were expensive. “What you've got is a backup of your data that has to be restored because the data are compressed and in a small footprint. When you start putting in a client with a bunch of drives onsite and think about having a vault offsite, the price quickly goes to six figures,” he says.
Knocke opted for a two-tiered production SAN because the data are replicated in a format that does not require a restore, and he could save about 85 percent of the costs of acquiring a traditional backup solution, cutting his data storage expenditures from $100,000 to $15,000 for a backup server, tape drive, and software.
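A quick back-of-the-envelope check bears out the figure Knocke cites — the dollar amounts below come straight from the article:

```python
# Costs reported in the article: a traditional backup solution (backup
# server, tape drive, and software) versus the SAN-based approach.
traditional_cost = 100_000
san_cost = 15_000

# Fraction saved by replicating on the SAN instead of buying the
# traditional backup stack.
savings_fraction = (traditional_cost - san_cost) / traditional_cost
print(f"Savings: {savings_fraction:.0%}")  # prints "Savings: 85%"
```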
The SAN is scalable, so Knocke has no qualms about adding more storage. He may one day use a higher tier of storage for the fastest, most recent images. But, he says, “The groundwork and foundation are there. I don't have to worry about adding any functionality that I don't already have.”
Karen Sandrick is a Chicago-based freelance writer who writes frequently about diagnostic imaging topics. Healthcare Informatics 2010 August;27(8):16-18