Why do they have to talk about something when they don’t know anything about it? Just the hype. Just the news they get from CNet/ZD Magazine/w-e. Just the little, tiny bits of information their minds retain.
Case in point:
When most people first start thinking about why parallelism is the next Big Thing, some jump to the conclusion that Moore’s law is over. Let’s clear that up below.
All of you know Roger Moore’s law which boils down to the prediction of
"the number of transistors on a chip will double about every two years"
Historically, those extra transistors came along with clock speed increases, and that is what has tricked most of us into associating Moore’s law with CPU speed.
So, now that chip manufacturers cannot make single CPUs any faster (well, they can, but they can’t cool them down enough to make them useful), they are resorting to chips with multiple cores, in what we are terming the manycore shift. The manycore shift has a profound impact on developers (especially those programming for the desktop client): their software now has to learn how to take advantage of parallelism.
So if you followed the logical flow so far, you’ll conclude that Moore’s law is still alive: we are still getting more silicon, but it no longer translates into linear speed increases; instead it gives us parallel "engines" that your software must learn to utilise.
I am glad we cleared that up :)
Please read the funny stream of comments, and a bit of wrath-unleashing of my own:
I am glad you don’t actually do anything in the hardware industry.
First off, it is Gordon Moore, not Roger Moore. Get your facts right before you post.
Second, holy parallelism has its own nemesis: Amdahl’s Law. You cannot speed up by more than 1/[serial fraction of the work], even with infinite processors.
Third, the problem isn’t even cooling. They can’t make a transistor’s channel any shorter.
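The bound in my second point is easy to see numerically. A minimal sketch (the 10% serial fraction is an arbitrary assumption for illustration):

```python
def amdahl_speedup(serial_fraction, n_processors):
    """Amdahl's Law: overall speedup when only the parallel part scales."""
    parallel_fraction = 1.0 - serial_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_processors)

# A 10% serial portion caps the speedup at 1 / 0.10 = 10x,
# no matter how many processors you throw at it.
for n in (2, 8, 1024, 10**9):
    print(n, round(amdahl_speedup(0.10, n), 3))
```

Notice how quickly the returns diminish: most of the achievable speedup is already gone by a few dozen processors.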
[Update: 30/07/08 As expected, my comment wasn’t approved.]
[Update: 15/07/08 I just found out, he is also a PM at Microsoft. Yikes!]