This morning, SC’05 featured a keynote address from Bill Gates, chairman, chief software architect, and co-founder of Microsoft. Prior to the keynote, we watched a video loop, and we heard from the head of the ACM, the president-elect of the IEEE, and the chairperson of the board for SC’06 in Tampa, Florida. The president-elect gave a good, short talk on a number of topics, including the need to bring more women and minorities into the profession. She said that we needed to do a better job, and that it was just good business sense to be inclusive. She is right.
Then Mr. Gates came on, and talked.
I am not a fan of some of Microsoft’s products, due to the various issues we run into with them. However, I have great respect for anyone who can take a two-person company and, over a 25-year period, turn it into a powerhouse. Anyone who runs a business can tell you how hard it is. Success is never guaranteed. It is elusive. You have to be smart enough to recognize opportunity, bold enough to go after it, and courageous enough to stay the course. In short, it ain’t easy, and it isn’t for the faint of heart.
I respect Mr. Gates, and what he has done and continues to do. This is independent of what I think of his company’s products. They need work. His vision, though, does not.
He spoke at length about the purpose and rationale for high performance computing. His talk was, for the most part, dead on (prior to the examples and demos). Supercomputing is a massively enabling technology.
No, not just clusters: supercomputing isn’t just about the platform the application runs on. We lose sight of this when every supercomputer becomes the hammer to beat on the HPL/Linpack nail. Supercomputing is about time to insight. That had been a mantra when I worked at SGI years ago. It was a cliché, one of those annoying ones you can’t get rid of, because it was right.
Supercomputing isn’t being done for supercomputing’s sake.
Mr. Gates talked about what supercomputing has done, and could do. The video loop I mentioned (which we would love to have, BTW) should be required viewing for anyone questioning the value supercomputing could bring. Mr. Gates’ talk (again, a copy of which we would love to have) should be required listening for those who don’t get it. There are long lists of such folks: people who question the value to society, and therefore the market impact, of such things.
After all, if we have done without it, we can continue to do without it… right? The fallacy in this argument is easy to pick out. The internet didn’t exist until recently, so we could do without it. Penicillin did not exist until recently (within the last 100 years), so we could do without it. And so on.
Does anyone want to argue that the economy and the world are better off without the internet, or penicillin? No?
If you lack the courage to take the risks, you will not reap the rewards. Yes, the internet has lots of garbage on it. This does not seriously diminish its value as a tool of communication, commerce, and growth. Yes, penicillin is not for everyone. This does not diminish its value as an inhibitor of bacterial growth.
You don’t get reward if you don’t take the risk. You shouldn’t demand the benefits of something without paying the costs of getting that thing. Supercomputing is like this. Mr. Gates recognized this. I wish others would as well.
He had a demo run by his product manager for HPC systems. I won’t go into the details of the demo, but one thing of note: he created a workflow that appeared to include a Linux cluster as one of several compute elements. The customers I spoke with afterward were excited by it. It did look interesting, though quite Microsoft-centric. That was to be expected.
But the point being made here was not one of interoperability. That was a side note. The point was that programming supercomputers is hard, and to get the benefit, you need to look at building workflows that people can share. That is, you write your code at a much higher, more abstract level. The example used Matlab: basically a search for cancer biomarkers in proteomic data from a large number of patients. It was quite interesting.
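The appeal of working at that higher level is easy to sketch. Here is a minimal toy illustration (in Python rather than Matlab, with entirely invented data and an invented scoring step — the biomarker search in the demo was far more sophisticated) of a workflow built from small, shareable stages instead of one monolithic low-level program:

```python
# Toy workflow: compose small, reusable steps into a pipeline.
# The "proteomic" data and the score are made up for illustration.

def normalize(samples):
    """Scale each sample's intensities so they sum to 1.0."""
    result = []
    for s in samples:
        total = sum(s) or 1.0
        result.append([x / total for x in s])
    return result

def score(samples, marker_index):
    """Mean normalized intensity at a candidate marker position."""
    return sum(s[marker_index] for s in samples) / len(samples)

def pipeline(samples, marker_index):
    # Each stage could, in principle, be farmed out to a different
    # compute element in a cluster; here they run in sequence.
    return score(normalize(samples), marker_index)

# Toy intensity readings for three "patients".
patients = [[1.0, 3.0, 1.0], [2.0, 6.0, 2.0], [0.5, 1.5, 0.5]]
print(pipeline(patients, 1))  # approximately 0.6
```

The design point is the one from the talk: someone who thinks in terms of `normalize` and `score` never has to touch the plumbing underneath, and the stages can be shared and recombined by others.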
Again, he is right. You will get better results when you think higher up the food chain and work there. You will get faster results when you look at the chain you have and optimize it. You will get more results per unit time when you are inclusive (just as the president-elect indicated) and work with everything, without discrimination. As she said, it is just good business.
Video and pictures from this are on the photo site.
The rest of the day was spent talking to people. Some interesting observations were made which we need to factor into our thinking. It’s sort of like playing “spot the missing thing” in a room full of a million things: it takes a while to see it.