The 60 second guide to big data by GoGrid

The GoGrid folks have put together a nice marketing slide on big data, in the sense that they explain its features without explaining it, or how/where they fit. It's implied that they provide all you need for Big Data, but it's their points along the way that make a great point for the day job and especially our new Fast Path Big Data Appliances.
Our argument has always been that you can’t approach Big Data with last millennium’s architecture. Without a tight coupling between computing and storage, you basically have no hope of accomplishing your analysis in either a reasonable time frame or at a reasonable cost.
Moreover, this data is not analyzed once and then stored. No. It is continuously reread, with new analyses and data sets integrated. You absolutely cannot even begin to approach this with filer head type designs … it just will not work. You need to very closely couple the analysis software, the database software, the computing power, the high performance networking, and the extreme performance storage in one unit, and then replicate that unit, letting the analysis software deal with the inherent parallelism. This is (almost) exactly what Hadoop and other X-reduction (X = {map, …}) techniques do.
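To make the pattern concrete, here is a toy sketch of the map/reduce idea in Python: each "node" runs the map step against its own local data shard (computing moved to the data, not the other way around), and the per-shard results are merged in a reduce step. The shard contents and function names here are illustrative, not any particular framework's API.

```python
# Minimal map/reduce sketch: word counting over data shards.
# In a real Hadoop-style cluster, each map_phase call would run on the
# node that physically holds that shard — the tight compute/storage
# coupling the post argues for.
from collections import Counter


def map_phase(shard: str) -> Counter:
    # Map step: runs where the data lives, producing a partial result.
    return Counter(shard.split())


def reduce_phase(partials: list[Counter]) -> Counter:
    # Reduce step: merge all per-shard partial counts into one result.
    total = Counter()
    for p in partials:
        total += p
    return total


# Two hypothetical shards, as if stored on two different nodes.
shards = ["big data big analysis", "data locality matters"]
partials = [map_phase(s) for s in shards]  # parallel across nodes in practice
result = reduce_phase(partials)
```

The point of the structure is that only the small partial results cross the network; the bulk reads stay local to each replicated unit.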
Start with very high performance density hardware, excellent distributed computing software, very fast networks. Build an analytical engine from that.
We are biased of course, as this is what we do.
This said, the original graphic is here.
Smaller image below; it links to the larger downloaded image, sourced from here.