The end of Moore’s ‘law’?

Posted in blindspot, digital, emerging issues, technology by thenextwavefutures on 2 August, 2009


Every so often you notice a story that flashes a hazard light about the prevailing wisdom about the way the world works. Thus it was last week, reading Jack Schofield’s Guardian article about the limits of Moore’s ‘law’, the long-standing projection by Intel co-founder Gordon Moore that the number of transistors that can be placed on an integrated circuit would double every two years (effectively doubling performance against price over the same period). It has been suggested before that the microprocessor manufacturing process will reach physical limits; Schofield’s piece focused instead on the rising investment cost of processor manufacturing plants. One likely consequence is the end of the exponential growth in computing performance which has underpinned the speed and scale of the information and communications technology revolution.
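Moore’s projection is easy to sketch numerically. A minimal illustration, taking the Intel 4004’s roughly 2,300 transistors in 1971 as a baseline (the doubling period and baseline are the standard textbook figures, not Schofield’s own numbers):

```python
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    """Project transistor count assuming a doubling every two years,
    starting from the Intel 4004's ~2,300 transistors in 1971."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# 38 years of doubling every two years is 2^19, i.e. about half a million times
print(round(transistors(2009)))  # about 1.2 billion - roughly right for a 2009-era high-end chip
```

The striking thing is how well this held for nearly four decades; the question the post raises is whether it can keep holding.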

While not wanting to write a long post, it’s worth making a few points.

  • The growing cost of the fabrication plants has always been the ‘dark side’ of Moore’s law – the performance-to-price ratio of chips may double every two years, but the cost of building the production plants keeps on growing (if not quite so quickly). The numbers are now very large: Intel is spending $7 billion upgrading its fabrication plants; Global Foundries, an AMD spinout, is spending $4.2 billion building a new fabrication plant in upstate New York, helped along by a $1.2 billion incentive from the state. As well as production costs, design costs are also increasing.
  • The cost grows because the engineering becomes more complex as transistors shrink in size. The scale is staggering: the current generation of chips is built on a 45-nanometre (nm) process – features 45 billionths of a metre across. The next generation will be 32 nm, with 22 nm following along behind it. By way of a yardstick, the Intel 8088 chip, in 1982, was built at 3,000 nm (or 3 microns), while atoms are a few tenths of a nanometre across. As Charlie Stross said in an interesting discussion of this, “it’s hard to see how we can miniaturize our integrated circuits below the 10nm scale”. Since increasing miniaturisation is the basis of current competition in the sector, there will be a big shake-up in the industry as we reach these limits.
  • I’ve written here before on the work of Carlota Perez, whose model of technological innovation, going back to the Industrial Revolution, sees industrial innovation progressing in waves of around 50 years (this is a Schumpeterian model, not a Kondratiev one, by the way). Each wave moves through a cycle: rapid innovation, leading to irrational investment exuberance (with speculative bubbles thrown in), then a shift to slowing rates of growth and more normal rates of return. Her model has the current ICT wave starting in 1971, and the characteristics described above – cost and complexity reducing the rate of return – fit her model pretty well.
  • Of course, this doesn’t mean that there won’t be innovation in the sector. But it does mean that innovation becomes more process-driven, and may require suppliers to think about what customers need and value, rather than simply pushing for ever greater density of transistors.

The final observation here is that maybe the Moore’s law curve wasn’t exponential after all – maybe it was just a sigmoid S-curve with a particularly steep gradient on the way up. Without getting into arguments here about The Singularity (and thereby revealing myself to be a simple-minded type whose brain is not sophisticated enough to understand the implications of exponential growth) it’s worth just mentioning that Theodore Modis has been over this ground in some technical detail. As Modis observes, “Nothing in nature follows a pure exponential. All natural growth follows the logistic [S-curve] function, which indeed can be approximated by an exponential in its early stages”. Equally, systems theorists would observe that in general, in any system, a balancing loop will eventually slow the acceleration created by a reinforcing loop.
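Modis’s point is easy to see numerically: a logistic curve is almost indistinguishable from an exponential in its early stages, and only later bends towards its ceiling. A minimal sketch (the growth rate and ceiling here are arbitrary, chosen purely for illustration):

```python
import math

def exponential(t, r=0.5):
    """Pure exponential growth, starting at 1."""
    return math.exp(r * t)

def logistic(t, L=1000.0, r=0.5):
    """Logistic (S-curve) growth with ceiling L, also starting at 1."""
    return L / (1 + (L - 1) * math.exp(-r * t))

# Early on the two curves track each other closely; later the logistic flattens
for t in [0, 4, 8, 12, 16, 20]:
    print(t, round(exponential(t), 1), round(logistic(t), 1))
```

At t = 4 the two differ by under one per cent; by t = 20 the exponential has left the logistic far behind, which is exactly the divergence the balancing loop produces.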

And that’s what seems to be happening here. The relationship between scale and cost isn’t linear: as feature sizes fall (from 45nm to 22nm and beyond) and density increases, each step gets progressively more expensive. As a result, the costs of fabrication plants and chip design accelerate to the point where they are no longer viable investment propositions. Two connected balancing loops change the underlying dynamics of the computer business – and change our assumptions about the continuing rate of technological change.


The image at the top of the post – showing processing power performance on a logarithmic scale – is from Computer Measurement Group, and is used with thanks.


4 Responses

  1. Dale B. Ritter, B.A. said, on 6 August, 2009 at 11:21 pm

    Talking about the next generation of transistorized circuit components, there is an advancement in nanomaterial mathematical 3D video modeling software which can image the picoyoctometric scale of atoms or waves to build interactive computer screen displays used in nanotechnical design or analysis.
    This model begins with a series differential expansion of the Schrodinger equation for a single atom named psi. The advancing rates of nuclear radiation of gravity and positive charge are taken as transformation of nuclear mass by [ e = m(c^2) ] physics. While the series of rates is confined only by spacetime, and space bonded to the atom by gravity, the psi model pulsates at the frequency [ Nhu =e/h ] by alternate phases of nuclear emission and absorption of forcons with valid joule values as a GT integral psi wavefunction.
    That the RQT (Relative Quantum Topological) atomic model images exact 3D designs for the 5/2 kT J internal heat capacity energy cloud particles is due to the combination of the relativistic Lorenz-Einstein transforms for time, mass, and energy
    with the quantized rules for wave frequency and wavelength.
    The images of the h-bar magnetic energy waveparticle of ~175 picoyoctometers are available online at along with discussions, graphics, essays, and the complete manual for picoyoctometric scale MAVCAM (Molecular or Material Animated Video Computer Assisted Modeling) software build projects.

  2. tele2002 said, on 10 August, 2009 at 1:22 pm

    Great article. We have to remember as well that although chips may get twice as fast, what is processed is also getting twice (if not more) as complex. This in effect doesn’t provide the end users with the apparent speed increase that they are expecting, as other components of a system may not have had the same increase in speed (take a hard drive, for instance: it may have had unbelievable growth in its storage capacity, but its ability to access that data has not had the same growth).
    These variables need to be calculated and communicated to the world in ways that they can actually understand.
    I used an example in my own blog on Moore’s Law – the effect on productivity for the print, pre-media and publishing industry.

  3. […] it down another 10 to 15 years. Nothing touches the economics of it.”  There has been ample speculation and analysis (like this one by author and blogger Andrew Curry) on when Moore’s law will end but another […]

  4. […] Curry, The end of Moore’s ‘law’?. The Next Wave. Aug. 2, […]
