This article makes the case that it is. As with many articles about X dying, it's worth asking whether the argument makes sense.
Basically, their point boils down to density, resiliency, and related concerns. Specifically, they point out that the fundamental flash design is inherently flawed … it self-destructs after a while … it wears out. So their argument begins: the denser the bits per cell, the fewer write cycles before the cell is unusable. We can work around some of these issues with intelligent write handling, signal processing, etc. But that doesn't change the fundamental physics/mechanics/chemistry of the wear-out process.
The argument continues by pointing out the relationship between increasing bit density and decreasing numbers of program-erase (P-E) cycles. Nothing they say is false or faulty in logic; they are essentially correct. There are severe limits to bit density, and part of the constraint comes from the chemistry/physics of the cell, the hysteresis associated with the P-E cycle, and the accumulating structural damage to the cell.
Again this is correct.
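To make the density/endurance tradeoff concrete, here is a rough back-of-envelope sketch. The P-E cycle counts and the write-amplification factor below are ballpark, illustrative assumptions on my part, not vendor specifications:

```python
# Rough illustration of the density vs. endurance tradeoff.
# P-E cycle figures are ballpark assumptions, not vendor specs.
endurance = {
    "SLC (1 bit/cell)": 100_000,
    "MLC (2 bits/cell)": 3_000,
    "TLC (3 bits/cell)": 1_000,
}

drive_tb = 0.5             # a 512 GB drive, expressed in TB
daily_writes_tb = 0.1      # 100 GB of host writes per day
write_amplification = 2.0  # assumed extra internal writes (GC, wear leveling)

for cell, pe_cycles in endurance.items():
    total_writes_tb = drive_tb * pe_cycles       # total data the cells can absorb
    host_writes_tb = total_writes_tb / write_amplification
    years = host_writes_tb / daily_writes_tb / 365
    print(f"{cell}: ~{years:,.0f} years at 100 GB/day")
```

Under these (made-up but plausible) numbers, each extra bit per cell knocks roughly an order of magnitude off the drive's write lifetime, which is exactly the trend the article leans on.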
So let me take a brief anecdotal side trip.
In 1991, we didn’t have blue LEDs. Couldn’t build them in a way that would be useful. They would tear themselves apart due to threading dislocations within a matter of minutes of lighting up. Basically, the chemistry of the LEDs was such that if you were using them as an LED, chances are you were dumping enough energy into the junction to dislocate atoms and start building these physically large defects.
Fast forward to today. Yeah, more than 20 years later, you cannot escape the blue LEDs. They are everywhere.
The issue came down to how you stabilized the junction, what you built it out of. They still rip themselves to shreds, only it takes much longer now.
What I am basically saying is that I am not completely convinced there are no other options for the flash chips … there may be some unexplored region of a chemical compositional space which enables longer life (more P-E cycles).
The article doesn’t quite go that far though. They assume that Flash is as Flash is, and the only improvements will be material shrinks and bits per cell.
Couple this with the improving capabilities of alternatives (they called out memristors, racetrack memory, phase change, etc.), and they foresee that the IP we've (not us per se, but we as an industry) developed to handle the myriad issues with flash will be less useful in the longer term.
If their supposition is true, then their prediction is correct. I’m not tied into the Flash research community, so I don’t know if people are currently looking at different materials or stoichiometry for such things. I’d imagine the chip folks would be quite interested in this … the ones that build Flash chips.
This said, it's hard to do anything but baseline silicon these days … there is simply too much momentum/investment there. So I don't expect radical changes. Which probably biases my thoughts toward agreeing with the article more than not.
Their conclusion is that the long-term value of a flash company is quite low. Certainly not worth billions.
Given that Instagram went for $1B, I claim that actual real value and what people might pay for something (valuation) are rather decoupled now … so I wouldn't make the broad, sweeping generalization that they made. Aside from this, we've learnt a number of important things.
First, the form of the technology isn't as important as how you present it to end users for their use. That is, PCIe flash vs. SSD flash. Or, rather than flash, let's call it FOO. Presenting FOO in a way a customer understands, and in which they see value and less risk, lowers the barrier to getting them to consider it.
Second, some of the lessons of the interconnect shouldn't be tossed. PCIe makes for a great high-performance connection network. Why not connect machines with it? It's possible to swamp SAS/SATA pretty easily with a fast enough design.
Third, one of the bigger lessons is that RAID silicon has a shelf life … the concept/design of RAID and HBA connections was for the world of the 1990s, when disk performance metrics were very different.
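To illustrate how different those metrics are, here is a sketch with round, assumed figures (not benchmarks) for per-device random IOPS:

```python
# Illustrative, assumed figures; ballpark only, not measured benchmarks.
disk_iops = 200        # roughly, a fast 15K RPM disk
flash_iops = 100_000   # roughly, a single flash device
drives = 8             # devices behind one RAID controller

print(f"Disk array:  ~{disk_iops * drives:,} IOPS")   # what 1990s RAID silicon was sized for
print(f"Flash array: ~{flash_iops * drives:,} IOPS")  # what it now has to absorb
print(f"Per-device gap: ~{flash_iops // disk_iops}x")
```

A controller architected around the first number becomes the bottleneck at the second, which is the sense in which the silicon, not just the firmware, has a shelf life.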
Changing from flash to a different technology doesn't change these lessons. So the companies less affected by changes in the underlying technology, and able to present a FOO++ (the FOO follow-on) without too much pain, won't likely go away.
I am making the case that "it's complicated", not that the author is right or wrong. Just that reality might be a bit more nuanced than they suggest.