A post has been making the rounds on Reddit and Facebook, graphically demonstrating the 1000-fold increase in (flash) memory density over the last decade. The question is, will this scale indefinitely, even through 2024? After all, look at how far we’ve come…
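To put that 1000-fold figure in perspective, a quick sketch of the arithmetic (my own back-of-the-envelope math, not from the original post): 1000× over 10 years works out to almost exactly one doubling per year, since 2^10 = 1024.

```python
import math

# 1000x density growth over 10 years: how often did capacity double?
factor = 1000
years = 10

doublings = math.log2(factor)          # number of doublings in the period
doubling_period = years / doublings    # years per doubling

print(round(doublings, 2))             # ~9.97 doublings
print(round(doubling_period, 2))       # ~1.0 years per doubling
```

Which is exactly why extrapolating another decade forward gets you from today's 128GB cards to a hypothetical 128TB device.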
I’ll posit this: a 128TB device 10 years from now might look the same from above, but it’ll likely be quite a bit thicker. Moore’s Law (such as it isn’t) is already breaking down at the transistor level, and as memory cells get smaller they become more prone to “single event upsets” – bit flips caused by secondary particles from cosmic rays (though error-correcting coding helps [PDF]).
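For the curious, here's a toy illustration of how error-correcting coding recovers from a single upset bit – this is a classic Hamming(7,4) code, my own sketch, not the (much stronger BCH/LDPC-style) codes real flash controllers use:

```python
# Toy single-error-correcting code: 4 data bits -> 7-bit codeword.
# A single flipped bit anywhere in the codeword can be located and fixed.

def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]   # bit positions 1..7

def hamming74_correct(c):
    """Return the codeword with any single-bit error corrected."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # parity over positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # parity over positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # parity over positions 4,5,6,7
    err = s1 + 2 * s2 + 4 * s3       # syndrome = 1-based error position
    if err:
        c[err - 1] ^= 1              # flip the bad bit back
    return c

word = hamming74_encode([1, 0, 1, 1])
upset = list(word)
upset[4] ^= 1                        # simulate a cosmic-ray bit flip
assert hamming74_correct(upset) == word
```

The catch, of course, is overhead: those three extra parity bits per four data bits eat into the very density gains we're celebrating, and as cells shrink and multi-bit upsets get likelier, the coding has to get heavier.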
Then there’s the problem of heating – yes, you may be able to make a little block with hundreds of terabytes of memory, but using it takes energy, and write operations in particular are energy-intensive. And the more heat you have in a small space, the greater the chance of random bit flips and other heat-related issues (electromigration, etc.).
Finally, there’s the problem of interfacing with this hypothetical 2.5D chip stack or 3D device… the micro-SD card above uses anywhere from one to four pins total for its data path, and write speeds aren’t scaling dramatically (most of the speedup we see today in newer USB 3.0 devices seems to come from fan-out… several devices being written to / read from in parallel).
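The fan-out point is easy to sketch numerically (all figures below are illustrative guesses, not measurements): if per-die write speed stays flat, striping across more dies multiplies aggregate throughput – right up until the host interface becomes the bottleneck.

```python
# Back-of-the-envelope: striped write throughput across parallel dies,
# capped by the host interface bandwidth. Numbers are illustrative.

def aggregate_write_mb_s(per_die_mb_s, num_dies, interface_cap_mb_s):
    """Total striped write throughput, limited by the host interface."""
    return min(per_die_mb_s * num_dies, interface_cap_mb_s)

# Assume 20 MB/s per die behind a USB 3.0 link with roughly 400 MB/s
# of usable bandwidth after protocol overhead (an estimate):
print(aggregate_write_mb_s(20, 8, 400))    # 160 -> dies are the limit
print(aggregate_write_mb_s(20, 32, 400))   # 400 -> the interface is
```

So parallelism buys you throughput only until the handful of data pins saturates – which is exactly the interfacing wall a monolithic 128TB block would run into.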
That’s not to say there won’t be applications for such a high-density device – particularly in the scientific realm and the social- and mass-media space. But will they scale down to the consumer level, at today’s price points, with the reliability we’ve come to expect? I’m going to say no with a caveat – I think there will be far more interesting technologies and applications in 10 years, technologies that may necessitate new ways of looking at digital (and analog) memory.