Will Moore's Law slow down?

I spent time last week with a friend who works in the chip industry. One of the things we talked about was heat, and what it's doing to the world of processor design.

For years there have been warnings that heat dissipation was becoming a problem for microprocessors. Five and a half years ago, Intel's CTO said, "we have a huge problem to cool these devices, given normal cooling technologies." The industry made do for another five years, which is why I've been tuning out the latest round of warnings -- I figured it was just more Chicken Little ranting. But my friend is worried that the problem is now approaching a critical point, and could have some important effects on the tech industry.

Look inside a modern PC and it's easy to spot the central processor and the graphics processor – they're the things with the big metal heat sinks on them, probably with fans on top to force air over the metal.

There's a limit on how big those sinks can get. Meanwhile, processor chips continue to get larger, which makes it harder to pull the heat out of them. And the metal lines on them continue to get smaller. Smaller lines are more likely to leak electricity, which heats the chip further. Already the average processor produces more heat per square inch than a steam iron, according to IBM technologist Bernie Meyerson. Extrapolate the trends into the future a bit, and things start to look ominous.
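The steam-iron comparison is easy to sanity-check with rough numbers. Here's a back-of-the-envelope sketch; the wattages and areas below are illustrative assumptions, not figures from the article:

```python
# Rough power-density comparison. All numbers here are assumed,
# ballpark values for illustration only.
def power_density(watts, area_cm2):
    """Watts dissipated per square centimeter of surface."""
    return watts / area_cm2

# A household steam iron: roughly 1,200 W over a ~180 cm^2 soleplate.
iron = power_density(1200, 180)

# A desktop CPU of the era: roughly 100 W concentrated in a ~1.5 cm^2 die.
cpu = power_density(100, 1.5)

print(f"steam iron: {iron:.1f} W/cm^2")
print(f"cpu die:    {cpu:.1f} W/cm^2")
```

With these assumed numbers the die runs at roughly ten times the iron's power density, which is the point Meyerson was making: the heat isn't just large, it's concentrated.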

There are, of course, lots of potential ways to cope with the problem. Companies are talking about cutting the big processors into multiple small ones, which would be easier to cool (because you can put heat sinks around all the edges). Startups are proposing exotic cooling technologies like etched microchannels through which water could be circulated. My friend thinks optical computing technologies may play a role.

The thing all of these approaches have in common is that they're experimental. They might work, but they also might not. The implication is not that Moore's Law is coming to an end, but that it's becoming much less reliable. Rather than a predictable increase in computing performance, we may end up with surges of progress alternating with surprise periods of stagnation.

We may still be able to muddle through the whole situation; this isn't the first time people have predicted an end to Moore's Law. But it's worrisome because over the years, the assumption that Moore's Law will continue has been built into the thinking and structure of the industry. The computer hardware business depends on obsoleting its installed base on a regular basis. If that doesn't continue, sales will slow, which could force cutbacks in the R&D investment that's needed to solve the performance problems.

Instability in Moore's Law would also threaten nerd rapture, the Singularity. Creating transhuman intelligence depends on getting another 15 or 20 years of uninterrupted exponential growth in computer processing power. If that growth slows, the shipment date for our bio-cybernetic brain implants could start to slip seriously. Which would be a bummer – I've been looking forward to getting mine so I could finally understand the last episode of Twin Peaks.
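To see what 15 or 20 more years of uninterrupted exponential growth actually implies, here's a quick compounding sketch. The 18-month doubling period is the classic rule of thumb, not a figure from the post:

```python
# Cumulative speedup if performance keeps doubling on schedule.
# Assumes the common 18-month (1.5-year) doubling rule of thumb;
# the 15- and 20-year horizons come from the post.
def speedup(years, doubling_period_years=1.5):
    return 2 ** (years / doubling_period_years)

for horizon in (15, 20):
    print(f"{horizon} years -> {speedup(horizon):,.0f}x")
```

Fifteen years of on-schedule doubling is a 1,024-fold increase; any multi-year stall pushes that compounding, and the Singularity timetable with it, well into the future.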


Anonymous said...

Moore's Law is an old thought experiment that has no actual connection to physics except within a very narrow frame of technology, so why do we still talk about it as a law (of physics? of marketing b.s.?)?

For example, switching to super-cooled technology, optical technology or what-not would immediately take away the heat problem, and could potentially also increase performance 10-fold or 100-fold overnight.

To develop something like an x86-compatible CPU out of such core technology would take a very long time, but maybe it's also time for a complete shift in chip design and CPU architecture. Windows has locked in how a CPU should work for a very long time now. IBM's Cell architecture shows one way to make a better CPU with simple means, and that's not stretching it far.

It's just that the industry needs to think more outside the box now ... and stop talking about Moore's never-existing Law.

Michael Mace said...

Thanks, and I agree. The thing about most of the experimental approaches to the heat problem is that they might help a lot (the 100-fold increase in speed you mentioned) or they might not help at all. The future of processor performance is becoming a lot less predictable than it was in the past.

>>It's just that the industry needs to think more outside the box now ... and stop talking about Moore's never-existing Law.

Maybe we should rename it "Moore's Expectation." ;-)

Anonymous said...

People are also looking at spintronics.