Data is exploding at a rate of thirty to forty percent a year, and yet, surprisingly, not fast enough. Surges and extrapolations can map their own limits, and for those who chart this increase, the limits, or buffers, are human.
And yet, researchers say, computers are capable of processing far more information, and in far more sophisticated ways. The problem? There are not enough human programmers devising better methods of crunching the data.
The bottleneck, in other words, is not hardware but software. Data analysts argue that the information could be used far more effectively (and with radical consequences) if more sophisticated algorithmic methods were found.
It is clear that once breakthrough software is achieved and implemented, the data manipulation we see today will look clumsy and primitive by comparison. We are not yet in the age of data customisation, with our own portable menomes.