In this article over at the Register, they report that Google’s Eric Schmidt had his team do some math, and they figured they had indexed about 170 terabytes out of a possible 5,000,000 terabytes currently sitting in the world’s data stores. Most of what’s been indexed is, of course, the “surface web” rather than the “deep web”.

And internet traffic (not storage) continues to double somewhere between every 3 and every 12 months, depending on whom you believe (1, 2).

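Just for fun, here’s a rough back-of-the-envelope sketch (in Python) of what those numbers imply. The 170 TB and 5,000,000 TB figures are from the article above; the assumption that Google’s index could grow at the same doubling pace quoted for traffic is mine, purely for illustration.

```python
import math

# Figures from the Register article: ~170 TB indexed,
# out of an estimated ~5,000,000 TB of data in the world.
indexed_tb = 170
total_tb = 5_000_000

# What sliver of the world's data has actually been indexed?
fraction = indexed_tb / total_tb
print(f"Fraction indexed: {fraction:.4%}")  # ~0.0034%

# How many doublings of the index would it take to cover it all?
doublings = math.log2(total_tb / indexed_tb)
print(f"Doublings needed: {doublings:.1f}")  # ~14.8

# If (big if) the index could double at the pace quoted for traffic
# -- every 3 to 12 months -- the runway looks very roughly like:
for months in (3, 12):
    years = doublings * months / 12
    print(f"One doubling every {months} months: ~{years:.0f} years")
```

That works out to somewhere between roughly 4 and 15 years of catching up, under those (generous) assumptions.
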
Seems like the search companies have some runway ahead of them yet!