Moore's Law

From ScenarioThinking
Revision as of 23:18, 5 March 2007 by Thelles (talk | contribs)


Description

Moore's Law is the empirical observation that the number of transistors on an integrated circuit for minimum component cost doubles roughly every 24 months. It is attributed to Gordon E. Moore (born 1929), a co-founder of Intel, who first described the trend in 1965 and revised the doubling period in 1975.

The most popular formulation states that the number of transistors on integrated circuits doubles every 18 months. By the end of the 1970s, Moore's Law had become known as the limit on the number of transistors on the most complex chips. However, it is also common to cite Moore's Law to refer to the rapidly continuing advance in computing power per unit cost, because the increase in transistor count is also a rough measure of computer processing power.
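The 18-month formulation is ordinary exponential growth: a count that doubles once per fixed period. As an illustrative sketch only (the function name is ours, and the Intel 4004's widely cited figure of roughly 2,300 transistors in 1971 is used here as an assumed starting value):

```python
def projected_transistors(start_count, months_elapsed, doubling_period=18):
    """Exponential growth: the count doubles once per doubling_period months."""
    return start_count * 2 ** (months_elapsed / doubling_period)

# Ten years (120 months) at an 18-month doubling period gives
# 120 / 18 ~ 6.7 doublings, i.e. roughly a 100x increase.
print(round(projected_transistors(2300, 120)))
```

Changing the doubling period to 24 months shows how sensitive long-range projections are to which formulation is used.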

A similar law has held for hard disk storage cost per unit of information. The rate of progression in disk storage over the past decades has actually accelerated more than once, driven by the introduction of error-correcting codes, the magnetoresistive effect, and the giant magnetoresistive effect. The current rate of increase in hard drive capacity is roughly similar to the rate of increase in transistor count. However, recent trends suggest that this rate is slowing and has not been sustained for the last three years. See Hard disk capacity.

Another version states that RAM storage capacity increases at the same rate as processing power.

Although Moore's Law began as an observation and forecast, the more widely it became accepted, the more it served as a goal for an entire industry. It drove both marketing and engineering departments of semiconductor manufacturers to pour enormous energy into achieving the specified increase in processing power, on the presumption that one or more competitors would soon attain it. In this regard, it can be viewed as a self-fulfilling prophecy.

The implications of Moore's Law for computer component suppliers are very significant. A typical major design project (such as an all-new CPU or hard drive) takes between two and five years to reach production-ready status. In consequence, component manufacturers face enormous timescale pressures: just a few weeks of delay in a major project can spell the difference between great success and massive losses, even bankruptcy.

Expressed as "a doubling every 18 months", Moore's Law suggests phenomenal progress. Expressed on a shorter timescale, however, it equates to an average performance improvement across the industry of close to 1% per week. For a manufacturer in the competitive CPU market, a new product that is expected to take three years to develop and arrives just three or four months late is 10 to 15% slower, bulkier, or lower in storage capacity than directly competing products, and is usually unsellable. (If performance instead doubles every 24 months, a three- to four-month delay translates to 8 to 11% less performance.)
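The percentages above follow from straightforward exponential arithmetic. A minimal sketch (the function names are ours, introduced only for illustration):

```python
def weekly_growth(doubling_months, weeks_per_month=52 / 12):
    """Average per-week improvement implied by a given doubling period."""
    weeks = doubling_months * weeks_per_month
    return 2 ** (1 / weeks) - 1

def delay_deficit(delay_months, doubling_months):
    """Fraction by which a product launched delay_months late trails the curve."""
    return 1 - 2 ** (-delay_months / doubling_months)

print(f"{weekly_growth(18):.2%} per week")           # close to 1% per week
print(f"{delay_deficit(3, 18):.0%} to {delay_deficit(4, 18):.0%} behind")  # the 10-15% range
print(f"{delay_deficit(3, 24):.0%} to {delay_deficit(4, 24):.0%} behind")  # the 8-11% range
```

An 18-month doubling period spans about 78 weeks, so the weekly factor is 2^(1/78), just under 1% compound growth per week; the delay figures are simply 1 - 2^(-delay/period).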

Enablers

Inhibitors

Paradigms

Experts

Timing

Web Resources

Wikipedia entry on Moore's Law