Recently, a client of mine was wrestling with PACS vendor differentiators, and we had a discussion of potential image visualization advantages. It gave me a chance to reflect on over thirty years of imaging technology, and how far we have come. It also reminded me of the reversal in the proverbial “chicken and egg” scenario!
In the early days of imaging technology, there were no video games, high-definition TVs, or similar products upon which to build. Imaging requirements far exceeded anything that was commercially available, and consequently, imaging paced commercial developments. For example, the earliest image displays for Computed Tomography (CT) were 256 by 256 pixels, or a whopping 0.07 megapixels! GE pioneered a 320 by 320 pixel display with the development of its earliest CT scanner, and that was revolutionary. CRT (Cathode Ray Tube) displays were also custom: they needed to be high-quality, non-interlaced grayscale, while commercial TV was typically interlaced color. And image manipulation controls were all custom developed as well, as they initially preceded even the lowly trackball! All of these technologies were custom developments, and their cost reflected it.
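The megapixel figures above are simple arithmetic, and a quick sketch makes the generational gap concrete. (The 256 by 256 and 320 by 320 resolutions are those cited in the text; the 1536 by 2048 figure for a modern "3 megapixel" diagnostic panel is an assumption for illustration.)

```python
def megapixels(width: int, height: int) -> float:
    """Return the total pixel count expressed in megapixels."""
    return width * height / 1_000_000

# Resolutions mentioned in the text, plus an assumed modern 3 MP panel.
early_ct = megapixels(256, 256)      # earliest CT displays
ge_ct    = megapixels(320, 320)      # GE's pioneering CT display
modern   = megapixels(1536, 2048)    # a typical "3 MP" diagnostic panel

print(f"256 x 256   -> {early_ct:.2f} MP")   # 0.07 MP
print(f"320 x 320   -> {ge_ct:.2f} MP")      # 0.10 MP
print(f"1536 x 2048 -> {modern:.2f} MP")     # 3.15 MP
```

Even GE's revolutionary display carried roughly one-thirtieth the pixels of a routine diagnostic panel today.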
Fast forward to today. Now commercial developments seem to be pacing imaging technology, and COTS (commercial off-the-shelf) technology has driven down the cost of imaging equipment. In some respects, though, imaging requirements are still unique: commercial HD TV tops out at 1080 lines (about 2 megapixels), while imaging displays utilize high-end graphic flat panel technology ranging from 3 megapixels up to 9 megapixels. When it comes to image manipulation, however, commercial technology now outpaces imaging, which still hasn't caught up.
Witness the iPhone and other smartphones that have ushered in hand gestures as a means of control. I recently became an adopter with the Palm Pre, and now would have a hard time readapting to an older Palm or BlackBerry format, where one cannot manipulate the image size when viewing a web page or a mail message. The Nintendo Wii ushered in a whole new video gaming experience by bringing natural hand movements into gaming control, simulating motions such as swinging a baseball bat or rolling a bowling ball.
Slowly these technologies are beginning to emerge within imaging. Over the past several years, vendors have shown “works-in-progress” applications of hand motions and large display formats for image manipulation. And researchers are experimenting with gaming controllers to control the display and navigation of images. But to date, there is no commercially available application of the technology, and the 3 megapixel flat panel and mouse continue to dominate.
One area remains open as to who is the chicken and who is the egg! 3D visualization has long been a promise for both imaging and gaming, but to date, it has not become a major factor in either. I am not speaking of the 3D software visualization tools that present 3D representations on 2D displays, but of full-scale 3D visualization. Granted, there are some high-end gaming 3D glasses and holographic 3D projectors, but nothing has caught on in the mainstream.
Personally, I believe this is the game changer for diagnostic imaging displays. Imagine a radiologist having the ability to view the entire body in true 3D, and then being able to zoom in on specific areas and view in any cross sectional plane desired! Instead of taking valuable time to view thousands of cross sectional images and build up the 3D representation in their mind, the diagnostician could begin with the entire visualization, and then concentrate on suspect areas. Such technology could easily make use of the more advanced control technology such as hand motions to provide a more interactive experience.
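The "any cross sectional plane" capability described above is, at its core, slicing a 3D volume along different axes. The following is only a hypothetical toy sketch of that idea (real viewers add oblique planes and interpolation); the function name and the volume dimensions are assumptions, not any vendor's API.

```python
import numpy as np

def cross_section(volume: np.ndarray, plane: str, index: int) -> np.ndarray:
    """Return a 2D slice from a volume ordered (axial, coronal, sagittal)."""
    if plane == "axial":
        return volume[index, :, :]
    if plane == "coronal":
        return volume[:, index, :]
    if plane == "sagittal":
        return volume[:, :, index]
    raise ValueError(f"unknown plane: {plane}")

# Toy stand-in for a CT study: 40 slices, each 64 x 64 pixels,
# with 12-bit intensity values.
study = np.random.default_rng(0).integers(0, 4096, size=(40, 64, 64))

print(cross_section(study, "axial", 20).shape)     # (64, 64)
print(cross_section(study, "sagittal", 10).shape)  # (40, 64)
```

The point is that once the whole study lives in memory as a volume, any viewing plane is just a different indexing of the same data, which is what frees the diagnostician from paging through thousands of individual cross sections.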
So how far off is “Buck Rogers” technology for healthcare? In today’s environment, it is hard to say, but some things are probably a given under current conditions. It is highly unlikely that diagnostic imaging can support custom development for such a limited application; changing healthcare policies just won’t allow it. Therefore, it seems we will be dependent on how fast such visualization technology is adopted for commercial use, and how applicable it is to imaging. Might there be some compromises along the way? Most certainly. For example, ask any old-timer from radiology whether an X-ray developed on a glass plate was superior to wet-chemical-processed film, and they will tell you the glass plate was the gold standard. But volume and throughput demands hastened the acceptance of the 90-second wet chemical processor. So, might a holistic 3D display compromise image quality for visualization productivity? And will this eventually become the norm?
My crystal ball is still a bit hazy, but I venture to say, the future looks promising!