Is there a Moore’s law for bandwidth?

We all know about Moore’s Law: in 1965, Gordon Moore (who would later co-found Intel) observed in a paper that the number of transistors that could be placed inexpensively on an integrated circuit had doubled approximately every two years (since 1958), and predicted that this trend would continue for about a decade. Surprisingly, this assertion generally holds true even today, over half a century later – and it’s expected to continue at least until the physical limits of miniaturization are reached (though, with advances in quantum computing, it might continue to hold if we replace the word “transistors” with the more generic term “logic units”).
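As a back-of-the-envelope check, “doubling every two years” is just exponential growth with a two-year period – a quick sketch (the 1971 Intel 4004 starting point is my own illustrative choice, not from the original claim):

```python
# A quantity that doubles every `period` years grows by a factor
# of 2 ** (years / period) after `years` years.
def growth_factor(years, period=2):
    """Growth factor under Moore's-Law-style doubling."""
    return 2 ** (years / period)

# Example: from 1971 (Intel 4004, ~2,300 transistors) to 2021 is
# 50 years, i.e. 25 doublings -- a factor of 2**25, about 33 million.
print(round(growth_factor(50)))  # 33554432
```

The same arithmetic applies to any “law” of this shape, which is what makes the bandwidth question below well-posed: all you need is an estimate of the doubling period.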

However, I’m curious whether there is anything comparable to Moore’s Law for Internet bandwidth – let’s define it as “the speed at which an average person can inexpensively access the Internet at will”. Is there any research showing how that kind of bandwidth has increased over the years?

