
Brief Thought: What Would You do with a Petaflop? Or a Petabyte?

November 28, 2009
by Marc D. Paradis MS

Some Technologies Coming Down the Pike

In the most recent Top500 list of the world's fastest supercomputers, a Cray XT5 known as Jaguar achieved a sustained 1.75 petaflop/s running the Linpack benchmark, against a theoretical peak of 2.3 petaflop/s. One petaflop/s is one quadrillion floating-point operations per second - that's a million billion calculations every second. While petaflop/s ratings and Linpack are somewhat specific to solving a dense N x N system of linear equations, a common class of engineering problems, it is worth noting that “a single modern PC is now more powerful than a 10-year-old supercomputer”. In fact, most of today's supercomputers are massively parallel assemblies of x86 and x86-64 chips, the same chips that run in most of the world's laptops, desktops and servers. Petaflops on your desktop and teraflops in your smartphone are not far away.
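For a sense of what that rate means, here is a back-of-the-envelope sketch. Linpack's dominant cost for a dense N x N solve is roughly (2/3)N^3 floating-point operations, so at Jaguar's sustained rate the wall-clock time works out as below (a rough estimate that ignores memory and communication overhead):

```python
# Back-of-the-envelope: time for a petaflop-class machine to solve a dense
# N x N linear system by LU factorization, as Linpack does.
# The dominant cost is roughly (2/3) * N**3 floating-point operations.

RATE = 1.75e15  # Jaguar's sustained rate, in floating-point operations per second

def solve_time_seconds(n):
    """Approximate wall-clock seconds, ignoring memory and communication."""
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    return flops / RATE

for n in (10_000, 1_000_000, 5_000_000):
    print(f"N = {n:>9,}: ~{solve_time_seconds(n):,.4f} s")
```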

We're all aware of terabyte data centers these days, but are you aware that many of the largest are now measured in petabytes? These data centers, whether running in support of The Cloud, data archives, games or other applications, are so big and so new that most engineers, administrators and mathematicians don't yet really understand how to work with them. Just for scale, the entire written works of humankind, from the beginning of recorded history, in all languages, are estimated at about 50 petabytes. Furthermore, if you believe in Moore's Law, we are less than 5 years away from flash memory (i.e., thumb drives) in excess of 1 terabyte. Remember, there are no moving parts (other than electrons) in flash memory, and its I/O is several orders of magnitude faster than that of traditional spinning magnetic disks and their read/write heads.
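As a sanity check on that 5-year figure, here is the doubling arithmetic; the 128 GB starting capacity and 18-month doubling period are my own assumptions for illustration, not hard data:

```python
# Rough projection of thumb-drive capacity under a Moore's-Law-style doubling.
# The starting capacity and doubling period are assumptions for illustration.
import math

start_gb = 128         # assumed largest common thumb drive in late 2009
doubling_months = 18   # assumed capacity-doubling period
target_gb = 1024       # 1 terabyte

doublings = math.log2(target_gb / start_gb)
years = doublings * doubling_months / 12
print(f"~{doublings:.0f} doublings, i.e. about {years:.1f} years to reach 1 TB")
```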

Now combine both of the above trends with columnar databases, hot-cold data management strategies and an increasingly wirelessly-wired, interconnected, real-time healthcare environment in which every device spews forth a stream of data and every human action and reaction, of the provider and of the patient alike, is digitally recorded (think portable monitoring of metabolic states as part of intensive disease management).

Do you see the utopian worlds of Gene Roddenberry and Buckminster Fuller, or the dystopian worlds of Aldous Huxley, George Orwell and Nolan and Johnson?

Disclaimer: The opinions expressed herein are my own personal opinions and do not represent my employer's view in any way

Comments

Marc,
Brief Comment.

I don't have a smart answer to your Peta question, but I do have a related thought.

More than 10 years ago, I was at a talk in Boston by Nicholas Negroponte of the MIT Media Lab. He quipped, "I don't know how fast computers will be in the future, but I suspect it will still take them 3 minutes to boot up."

I'd like to have a faster computer, just to get the snap factor back. I find myself running enough stuff concurrently, with funny/slow browser pages, that I can watch the second hand moving and nothing happening on my multi-giga-everything, multi-core PC. I wonder what would happen if we all got the snap factor back in our computers.

I feel your pain, Joe. Zero-boot, always-on computers are coming. The biggest enabler will be solid-state memory, such that the OS never needs to go down.

There also needs to be an architectural shift: from a handful of processors on a chip and an OS that basically does one thing at a time (but does each thing so fast that it _looks_ like it's multitasking) to architectures with tens, hundreds or even thousands of CPUs on a chip (believe it or not, there are even folks working on a supercomputer on a single chip), with OSes built to take advantage of this.

If Adobe or IE or some other app sucks up 99% of my CPU, that's just bad programming and atavistic architecture. OSes of the future will have CPUs dedicated to screen refresh, CPUs dedicated to specific applications and so on, such that even if one app goes haywire with a memory leak, the basic OS and the other apps can be insulated from it.
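You can already get a small taste of this on today's operating systems by pinning a process to particular cores. A minimal sketch (Linux-only, using Python's sched_setaffinity wrapper; the choice of cores 0 and 1 is arbitrary):

```python
# Minimal sketch (Linux-only) of dedicating particular cores to a process,
# in the spirit of "CPUs dedicated to specific applications."
import os

pid = os.getpid()
print("Cores available to this process:", sorted(os.sched_getaffinity(pid)))

# Confine this process to cores 0 and 1; a runaway app pinned elsewhere
# cannot steal them, and vice versa.
os.sched_setaffinity(pid, {0, 1})
print("Now restricted to:", sorted(os.sched_getaffinity(pid)))
```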

Following up on the supercomputer-on-a-chip idea, Intel has just released a 48-core chip that runs Windows and Linux, for research and testing purposes (http://news.cnet.com/8301-30685_3-10407818-264.html?tagleftColpost-1299).

Let me state that again: 48 cores on one chip. And note that they are really at the beginning of this architecture.

Effectively and efficiently using this many cores in a traditional computing environment requires completely re-architecting and re-writing essentially all the applications that you are used to using.
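To make that concrete, here is a toy before-and-after of the kind of rewrite involved: the same CPU-bound work done serially on one core, then spread across every core the OS exposes (a sketch, not a benchmark):

```python
# Toy illustration of the rewrite many-core chips demand: the same work,
# first on a single core, then spread across all available cores.
from multiprocessing import Pool, cpu_count

def heavy(n):
    # Stand-in for a CPU-bound task.
    return sum(i * i for i in range(n))

inputs = [200_000] * 64

def serial():
    return [heavy(n) for n in inputs]          # one core, one task at a time

def parallel():
    with Pool(processes=cpu_count()) as pool:  # every core the OS exposes
        return pool.map(heavy, inputs)

if __name__ == "__main__":
    assert serial() == parallel()
```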

Marc,
Your point about the TI calculator's computational power being more than sufficient to support Apollo is a great perspective. So was your observation that "there should be no such thing as a slow word processor."

Many of us had this fantasy around 2005 that the dual-core CPU becoming standard would make it possible to assure end users a snappy experience. As you outlined, the linear development of software is giving us, only in 2009, operating systems with dedicated sub-systems to exploit the hardware advance. (I'm talking about Snow Leopard and Grand Central Dispatch, GCD, which will allow future programmers to write code that speaks intelligently to multiple cores. 2009. Is that shameful? Appropriate? Predictable?)

I should have added Google, its Android, Motorola's implementation of it, and Palm's Pre to the iPhone story. It's clearly more than Apple who can innovate. This does, however, clearly demonstrate a first-mover advantage (for a large player).

For those familiar with the non-technology component, Apple's move also demonstrates an ecosystem awareness that's very uncommon. For example, consider the 50 thousand apps for the iPhone as of June 2009 (http://brainstormtech.blogs.fortune.cnn.com/2009/06/10/apple-fact-check-50000-iphone-apps/; the controversy is not about the scale of the number).

Pair with that the effective absence of apps that reliably persist on my BlackBerry Storm, or that are readily and effectively available. It's zero. The Twitter client I added a year ago has disappeared in the course of just keeping the darn thing running, using Verizon's help. The voice recorder disappeared when I took the last required upgrade. None of that has happened with my iPod touch, and I have stayed current with those software releases as well. BlackBerry has really struggled just to keep up with the most basic hardware and software on the Storm.

It would appear, to your point, that ecosystem development (which includes your concepts of architecture and the role of bloatware) grows at less-than-linear rates, except among those very few companies that have pulled it together. I haven't looked recently myself, but major industry columnists would suggest that Microsoft's Windows CE platform is perhaps the largest negative exemplar.

Closing on a healthcare-IT-specific note, I continue to be impressed by my personal health record (the PHR I use; I haven't written one and am not likely to!). As I type this, my iPod touch, running the Nike+iPod software, is recording my footsteps. It knows my pace, my distance and my exercise history since I bought the device 3 months ago. The Nike website it syncs with transparently knows my exercise history going back more than a year, from previous compatible iPod devices. And all of this is a by-product of a device that has a different primary role.

Coming full circle to your linear-versus-exponential point: when will I have commercial (meaning widely available, inexpensive, complete and supported) alternatives from the other portable computing platforms? Ones that integrate with free sites like Nike's, track my weight and heart rate as well, and make calorie and protein recording something a reasonable person would want to do? Never, of course :-)

Here's a camera that you wear around your neck and that can take up to 30,000 pictures per day (http://www.newscientist.com/article/dn17992-new-camera-promises-to-captu...); it senses when you change environments or when someone approaches, and takes a time-stamped picture.

This is the kind of real-time monitoring, data collection and data analysis challenge that we need to begin grappling with now. The cloud will enable the rapid capture and storage of this information as well as the ability to monitor, analyze and predict future actions and reactions.
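To put rough numbers on the data such a device generates (the 500 KB average image size below is my own assumption, not a figure from the article):

```python
# Rough storage arithmetic for a wearable camera taking up to 30,000
# pictures per day. The average image size is an assumption.
PHOTOS_PER_DAY = 30_000
AVG_PHOTO_BYTES = 500 * 1024          # assumed ~500 KB per image

per_day_gb = PHOTOS_PER_DAY * AVG_PHOTO_BYTES / 1024**3
per_year_tb = per_day_gb * 365 / 1024
print(f"~{per_day_gb:.1f} GB/day, ~{per_year_tb:.1f} TB/year per wearer")
```

Multiply that by a population of wearers and the petabyte question stops being hypothetical.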

I especially like your last paragraph here, Joe. To my uneducated outsider's eye (no CS major am I), there has always been a Malthusian relationship between hardware's capabilities (which seem to grow exponentially) and software's abilities (which seem to grow linearly).

When you consider that the original computer which put man on the Moon was less powerful than a 1970s Texas Instruments calculator, you begin to realize this. Look also at the discussions in this very thread. The truth of the matter is that with today's chips and CPUs there should be no such thing as a slow computer for word processing, spreadsheets or even web surfing. The disconnect comes largely from terrible architecture and bloatware.

Also, great point about the iPhone; I agree 100%. I fully expect the iPhone to revolutionize the desktop experience and consumer expectations to the same degree as, or perhaps more than, the original Apple and the original Mac did. Even netbooks, while largely dying as a product category in their own right, have inadvertently set new standards for weight, connectivity and battery performance in the commercial laptop market.

One could argue that the computers you're describing are already here in the form of the iPhone and the iPod touch. As you know, non-Apple applications are encouraged to be written according to "design patterns," while Windows/Mac desktop applications are described as "anti-patterns." (The best starting point is here: http://en.wikipedia.org/wiki/Anti-pattern)

Part of the iPhone design pattern is no splash screen, no end-user-visible wake-up, and going completely to sleep when control is handed over to the next application. This is exemplified by good Twitter clients, good email clients and other apps that completely duck out of the way when the user heads somewhere new or dives into a workflow (like an OmniFocus personal task management scenario where your shopping and errand lists know, via the GPS, where you are, and your focus is defined by context, not project, in GTD parlance).
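For the curious, here is a minimal sketch of that location-aware errand list; the task names, coordinates and 3 km radius are all made up for illustration:

```python
# Minimal sketch of a location-aware errand list: show only the tasks whose
# context is within a given radius of the current GPS fix.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

tasks = [  # invented example data
    {"task": "pick up prescription", "context": "pharmacy", "lat": 42.362, "lon": -71.057},
    {"task": "buy groceries",        "context": "market",   "lat": 42.350, "lon": -71.080},
    {"task": "return library book",  "context": "library",  "lat": 42.440, "lon": -71.230},
]

here = (42.361, -71.060)  # current GPS fix (assumed)
nearby = [t for t in tasks
          if haversine_km(here[0], here[1], t["lat"], t["lon"]) < 3.0]
for t in nearby:
    print(t["context"], "->", t["task"])
```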

As you know, the iPhone and iPod touch already use solid-state storage, and people are delighting in the zero-boot, always-on behavior. I tricked out a netbook with a fast SSD (they're not all fast, by the way) and two gigs of memory. For two or so concurrent tasks it's pretty respectable; beyond that, its performance simply degrades massively.

It's the large-screen, cooperative user interface, like Partners' Next Generation EMR, that I'm hopeful might really benefit from your petaflops. Guided results review, guided documentation and guided ordering, with background reasoning including trumping rules, are where petaflops could really shine in healthcare.
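For readers who haven't met the term, "trumping rules" just means higher-priority logic can override a lower-priority suggestion. A minimal, entirely hypothetical sketch (the rules, drugs and thresholds below are invented for illustration, not clinical guidance):

```python
# Minimal sketch of "trumping" rules for guided ordering: rules are checked
# in priority order, and the first applicable rule overrides anything below it.
# All rules and thresholds here are hypothetical.

def check_order(patient, drug):
    rules = [  # (priority, applies?, advice) -- highest priority first
        (1, drug in patient["allergies"], "BLOCK: documented allergy"),
        (2, patient["egfr"] < 30 and drug == "metformin", "BLOCK: renal contraindication"),
        (3, patient["age"] > 65 and drug == "diphenhydramine", "WARN: avoid in elderly"),
    ]
    for _, applies, advice in rules:
        if applies:
            return advice            # higher-priority rule trumps the rest
    return "OK: no rule fired"

patient = {"allergies": {"penicillin"}, "egfr": 25, "age": 70}
print(check_order(patient, "metformin"))   # -> BLOCK: renal contraindication
```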

Pre-ARRA, we were about 15 years away from that being commonplace in HCIT. With ARRA, it's completely unclear whether it will come sooner, because of the investments pouring in, or later, because of the risks of certification, competing agendas under MU, and the anti-innovation impacts that have been raised as concerns. The declining cost of the petaflop is unlikely to move the effective bottleneck to the positive promises of informatics.

I was involved in my son's school's campaign against smoking (called the Green Smoke campaign), and I met another parent who works in supercomputing and was talking about this. It is just amazing how passionately people can talk about their work; I spent the entire day talking to him about it, in between conversations about encouraging people around us to quit smoking and set a good example for our children.
