Monday, December 16, 2013

Moore's Law for Dummies



References to Moore’s Law abound when people speak about the Singularity, so a brief explanation of this significant observation about the growing power of computers over the years seems in order.

The Essence of Moore’s Law

In 1965, Gordon E. Moore wrote a paper in which he described a trend in the growth of computing power. That computing power was increasing was obvious to anyone who thought about the subject for any length of time. That there was a pattern in this growth was not so obvious. Moore went on to co-found the chipmaker Intel, which made his observation all the more notable.

Simply put, Moore had studied the data and noticed that the number of transistors that could be fitted onto the integrated circuits on which computers rely tended to double about every two years. This was not something planned but rather the result of intense competition among companies making smaller and smaller transistors so that their chips could deliver more speed and power.
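To make the doubling concrete, here is a minimal Python sketch. The starting figure of roughly 2,300 transistors for Intel's first microprocessor in 1971 is a commonly cited number, used here purely for illustration:

# Project transistor counts at one doubling every two years.
start_year, start_count = 1971, 2300  # Intel 4004 (1971), a commonly cited baseline

for year in range(start_year, 2012, 10):
    doublings = (year - start_year) // 2
    count = start_count * 2 ** doublings
    print(f"{year}: roughly {count:,} transistors per chip")

Twenty doublings later, the sketch lands at a couple of billion transistors per chip by 2011, which is in the right neighborhood for high-end processors of that era.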

What Are Integrated Circuits and Transistors?

An integrated circuit is basically what people mean when they speak of a microchip. It is a set of circuits embedded in one small chip of silicon. The earliest computers did not use these types of circuits: the idea was patented in 1949 by a German engineer but not immediately applied in industry. Over the following decade the advantages of this development became generally known, and practical versions followed.

A transistor is the basic building block of these circuits, made from a semiconductor material such as silicon. Each transistor performs a critical task for a computer, acting as a tiny switch that controls signals and can even amplify them. More transistors mean more capability and speed.
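To see how simple switches add up to capability, here is a toy Python sketch that treats each transistor as an ideal on/off switch (a deliberate simplification; real transistors are analog devices):

# A handful of transistors wired together form a NAND gate,
# the workhorse of digital logic.
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

# Every other logical operation can be built from NAND gates alone.
def not_gate(a):    return nand(a, a)
def and_gate(a, b): return not_gate(nand(a, b))
def or_gate(a, b):  return nand(not_gate(a), not_gate(b))

print(and_gate(True, False))  # False: output is on only when both inputs are on

A chip with billions of transistors is, at bottom, billions of these switches wired into gates; more switches mean more gates, and more gates mean more work done per tick of the clock.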

What Does This Mean in Terms of Technology?

The significance of all this for the average person lies in the rapid advances in technology that seem to make news almost every day. The doubling of computing power approximately every two years has continued almost uninterrupted since the 1960s, enabling machines to do an exponentially increasing amount of work.
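The compounding is the real story. A quick back-of-the-envelope calculation, assuming one clean doubling every two years starting from Moore's 1965 paper:

# Cumulative growth factor from 1965 to 2013 at one doubling every two years
doublings = (2013 - 1965) // 2                       # 24 doublings
print(f"Growth factor: about {2 ** doublings:,}x")   # roughly 16.8 million-fold

Twenty-four doublings sounds modest; a factor of nearly seventeen million does not.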

A great example of the power behind this growth in computing is the story of the Human Genome Project. When the program formally began in 1990, skeptics doubted that those involved could make meaningful progress in counting, mapping and identifying the genes in human DNA. Progress was meager during the first few years. However, the pace picked up along with the increase in computing power; sudden advances came after several years, and the project was declared complete in 2003.

As a result of this increasing capacity and the shrinking of transistors, various advances have been made in science and industry. Anyone paying attention over the past decade will have noticed how personal computers became more and more powerful while the devices themselves grew ever smaller.

A man in his 40s today was born around the time of the Moon landings. He probably grew up without a personal computer in the home and may never have seen one at school. He now owns a smartphone that can do more than the computers that sent men to the Moon and brought them back.

The End of Moore’s Law

This increase in power cannot continue forever along the same lines, though. The walls of the circuits involved cannot shrink without limit: such barriers must be at least a few atoms thick or they lose their ability to control signals effectively. Furthermore, as this “here be dragons” point is approached, the difficulty of maintaining the pace of advance increases as well. Computers will still become faster and more powerful, but they will have to do so by increasing the number of chips rather than simply squeezing more transistors onto each chip.
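To see why atoms set a hard floor, consider a rough calculation. A silicon atom is on the order of 0.2 nanometers across, an approximate figure used here only for scale:

# How many atoms span a circuit feature of a given width?
ATOM_DIAMETER_NM = 0.2  # rough diameter of a silicon atom, for scale only

for feature_nm in (90, 22, 5, 2):
    atoms = feature_nm / ATOM_DIAMETER_NM
    print(f"A {feature_nm} nm feature is only about {atoms:.0f} atoms wide")

Once a barrier is only a handful of atoms wide, there is nothing left to shave off, and electrons begin to leak straight through it.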

Some experts believe that we are already reaching this stage of development. Others, though, see possible ways of outmaneuvering some of these limits and continuing to increase power while reducing size.

No matter where you think this course of progress will end, it is obvious that the future holds immense possibility for continuing advances thanks to the growing power of convenient computing. Indeed, the advances already made have not yet become widespread enough or really sunk into the popular consciousness. Consider 3-D printing, which is only beginning to make headlines. Even if the advances in squeezing transistors onto microchips stopped dead right now, there would still be a wealth of knowledge to be gained from the advanced systems already in place.