Moore's Law is generally taken to mean that the number of transistors on a chip -- and by extension, processing power -- doubles every two years. In reality, Gordon Moore, the Intel co-founder who formulated Moore's Law in 1965, was talking about the economic costs of chip production, not the scientific achievements behind advances in chip design.
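To see what the popular "doubling every two years" reading implies, here is a minimal sketch of the projection. The starting transistor count and years are illustrative assumptions, not figures from this article.

```python
def projected_transistors(start_count, start_year, end_year, doubling_period=2):
    """Project a transistor count forward, assuming it doubles
    once per fixed doubling period (two years, per the popular
    reading of Moore's Law)."""
    elapsed = end_year - start_year
    doublings = elapsed // doubling_period
    return start_count * 2 ** doublings

# A hypothetical chip with 1 billion transistors in 2010 would be
# projected to carry 4 billion by 2014 (two doublings).
print(projected_transistors(1_000_000_000, 2010, 2014))  # 4000000000
```

The exponential nature of this projection is exactly why the economics become the binding constraint: each doubling demands ever-smaller transistors and ever-costlier fabrication.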
Moore believed that the cost of chip production would halve annually for the next 10 years but might not keep falling after that [source: Hickins]. The limit of Moore's Law, then, may be reached economically rather than scientifically.
Several prominent computer experts have contended that Moore's Law cannot last more than two decades [source: IEEE]. Why is Moore's Law doomed? Because chips have become much more expensive to produce as transistors have become smaller.
One analyst has predicted that by 2014, transistors will be 20 nanometers in size but that any further reductions in chip size will be too expensive for mass production [source: Nuttall].
For comparison, as of summer 2009, only Samsung and Intel had invested in making 22-nanometer chips.
The factories that produce these chips cost billions of dollars. Globalfoundries' Fab 2 factory, set to begin production in New York in 2012, will cost $4.2 billion to build. Few companies have those kinds of resources, and Intel has said that a company must have $9 billion in yearly revenue to compete in the cutting-edge chip market [source: Nuttall].
That same analyst believes that companies will try to get the most out of current technologies before investing in new, smaller and more expensive chip designs [source: iSuppli]. So while the end of Moore's Law may limit the rate at which we add transistors to chips, it does not necessarily spell the end of faster, more advanced computers -- other innovations may pick up where shrinking transistors leave off.