When one looks back over the history of science and technology, it is amazing to think how far humanity has come. It is humbling to think how recently many things were invented and how far they have developed since their inception. I never knew a world without calculators (though I have a vague idea of what slide rules are), and the children of today (at least in the West) have never known a world without computers, video games, and mobile phones.
Technological revolutions sometimes occur as the direct result of scientific breakthroughs and sometimes as an ingenious new way to use existing science. Inventions start with a single prototype, but they become revolutionary only when they are made available to the general public. Then, what was originally a bright idea known only to a single man or woman becomes of practical value to others. Once these ideas reach the marketplace as commercial products, they are developed and improved in response to public demand, and this progress continues for as long as the product remains useful.
One area where such development and improvement are particularly apparent is computing, where progress has even been quantified. In 1965 one of Intel's co-founders, Gordon Moore, predicted that the number of transistors in the silicon chips used in computers would double every two years, with a corresponding increase in computing power. (Transistors are the electronic switches that perform logical calculations within computers.) Despite the skeptics, "Moore's Law," as it has become known, has continued to hold true to the present day. Speaking at an international semiconductor conference earlier this year, Moore said that the law should hold for at least another decade, adding, "There is certainly no end to creativity." See, for example, Michael Kanellos, "Moore's Law to rule for another decade," zdnet.com.com/2100-1103-984051.html.
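Stated arithmetically, the doubling rule says that a chip's transistor count grows by a factor of 2 every two years, i.e. by 2^(t/2) over t years. The short Python sketch below illustrates the projection; the 1971 starting point of 2,300 transistors (roughly the count of the Intel 4004) is an illustrative assumption of mine, not a figure taken from the text.

    # Moore's Law as stated above: transistor counts double every two years.
    # The 1971 baseline of 2,300 transistors (roughly the Intel 4004) is an
    # illustrative assumption, not a figure from the text.
    def transistors(year, base_year=1971, base_count=2300):
        """Projected transistor count if the count doubles every two years."""
        return base_count * 2 ** ((year - base_year) / 2)

    for year in (1971, 1981, 1991, 2001):
        print(year, f"{transistors(year):,.0f}")

On this simple model the count grows roughly a thousandfold every twenty years (2^10 = 1,024), which conveys the scale of the "corresponding increase in computing power" that Moore's prediction implies.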