On /. there is a link to a story on the imminent death of the silicon semiconductor as a basis for computing.
These predictions have a history of being wrong.
This is not to say that silicon will go on forever. It is an indirect-bandgap semiconductor, which dissipates some energy as phonons (sound/heat waves in the material; think of hitting an iron bar and the tones it makes, that's energy you imparted as sound and heat in the material). We are rapidly approaching the point at which we cannot describe these devices classically, and need to worry about quantum effects (tunneling between structures, etc.). My guess has been that at ~8 nm structures we start seeing significant onset of quantum effects: thermal electrons at 300 K, about room temperature, have a de Broglie wavelength of roughly 7.4 nm, so 8 nm features are on the same scale as the electron wavelength. And with the exponential tails of the wavefunctions, if the lines are close enough, you create a Kronig-Penney potential, which looks like a square wave, and you get conduction and electronic structure between those lines …
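A quick back-of-the-envelope check of that wavelength figure (my own sketch: I take the momentum as p = √(2mkT) with the free-electron mass and standard constants; using silicon's effective mass or a 3kT/2 kinetic energy shifts the number a bit, but not the order of magnitude):

```python
import math

# Thermal de Broglie wavelength of an electron at room temperature,
# lambda = h / sqrt(2 m k T).  Free-electron mass is assumed; the
# effective mass in silicon would change the result somewhat.
h = 6.62607015e-34    # Planck constant, J*s
m = 9.1093837015e-31  # electron rest mass, kg
k = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0             # room temperature, K

lam_nm = h / math.sqrt(2 * m * k * T) * 1e9  # wavelength in nm

print(f"thermal de Broglie wavelength ~ {lam_nm:.1f} nm")
print(f"8 nm feature / wavelength ratio: {8.0 / lam_nm:.2f}")
```

The ratio comes out close to 1, which is the point: once the lithographic feature size is comparable to the electron wavelength, a classical picture of the device stops being adequate.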
The death of silicon has been predicted for a while now. I studied GaAs (gallium arsenide) in grad school; it was supposed to be the next big thing, after superconductors were supposed to be the next big thing to build CPUs out of …
There are real concerns over Si, but I am not convinced that it is dead as a material yet. Other things should prove interesting (nanotubes, etc). But we are a ways away, and hype doesn’t help here.
I also argue that we need to leave the world of fermions and move to a bosonic computational basis (using photons, for example), but few subscribe to this notion.