Originally Posted by Ninjalawyer
I don't really disagree with your post, but Moore's Law is still alive and well. While consumer CPUs aren't seeing the huge bumps, they never really did; it has always been incremental changes in that space. Most consumers simply aren't interested in the top-end of desktop performance, let alone the top-end of what silicon can do generally. Now, if you're doing video editing or serious gaming and willing to buy a $1,000 video card, you'll see that the changes aren't as minor as the experience of the average consumer would suggest.
I'm not quite up to date on truly high-end computing tech (the kind used for scientific research and the like), but even enthusiast gamers are finding they need hardware upgrades less often. A mid-to-high-end CPU even a few generations old still provides enough grunt to run demanding games. GPUs are progressing more steadily, but even there the generational gains haven't been dramatic (though power efficiency has improved quite a bit). Nvidia's $1k Titan (a boutique item geared more towards professional use) has maybe 25% more processing power than a $500 card, which is itself more than enough to provide a good gaming experience for several years (as is a $300 GPU, for that matter).