Now, I always thought AMD was good for the desktop PC as well, just not so good for laptops, especially in recent years. I've noticed AMD laptops always seemed to be the cheaper ones: rather bulky, heavy, hot-running, and short on battery life. AFAIK there are no AMD-based ultrabooks or netbooks; they're all Intel, whether Atom, Core i3, or i5 processors.
I've seen the CPU market go through phases. When I was building one data center, the Pentium II-based Xeon line was the hot server setup. When I was building another, the AMD Opteron was not only a better value, it was the top performer.
At the time when AMD was outperforming Intel in just about every CPU market, Intel took notice and made sure that AMD would never be on top again, no matter what the cost to Intel (or maybe I should say to Intel buyers). Now that ARM technology is the next great thing, Intel is trying every trick in the book to make its chips consume less power and compete with ARM. But the way they're doing it is complicated and clumsy: my Intel laptop is constantly turning things off and throttling the CPU in defiance of my power settings. The Intel Atom (the chip meant to go up against ARM) works the same way.
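If you want to watch that throttling happen, the Linux cpufreq sysfs interface exposes each core's current clock. Here's a minimal sketch, assuming a Linux box where /sys/devices/system/cpu/cpu*/cpufreq is populated (the paths are the standard ones, but not every kernel/driver exposes them):

```python
import glob
import time

# Poll the current clock of each core via the cpufreq sysfs
# interface (Linux-only; assumes a cpufreq driver is loaded).
def read_freqs_mhz():
    freqs = {}
    for path in sorted(glob.glob(
            "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")):
        cpu = path.split("/")[5]               # e.g. "cpu0"
        with open(path) as f:
            freqs[cpu] = int(f.read()) / 1000  # kHz -> MHz
    return freqs

# Watch the clocks for a few seconds; a throttled CPU shows
# frequencies dropping well below the advertised base clock,
# whatever the power plan says.
for _ in range(10):
    print(read_freqs_mhz())
    time.sleep(1)
```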
Intel will probably win in the end, but I prefer the ARM philosophy.
Even in old-school computers, much of the processing burden was offloaded from the CPU: first to a separate math coprocessor, then to "smart" expansion cards like hardware RAID HBAs. Now most things use DMA to bypass the CPU entirely. More and more, I'm finding the graphics card I pick matters more than the CPU, as more and more GPU-enabled software is published.
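To make that concrete, here's a minimal sketch of the kind of offload that software does, using PyTorch as a stand-in for "GPU-enabled software" (my choice of library, not anything from the thread). It times the same large matrix multiply on the CPU and, if one is available, on a CUDA GPU:

```python
import time
import torch

def time_matmul(device, n=4096):
    # Allocate two large matrices directly on the target device.
    a = torch.rand(n, n, device=device)
    b = torch.rand(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()   # make sure setup has finished
    start = time.perf_counter()
    c = torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()   # wait for the GPU to finish
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f}s")
```

On most machines with a discrete GPU, the second number comes out far smaller, which is the whole point of the offload.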
Now that we have more than enough CPU cycles on tap for the most common desktop and server tasks, it's less of an issue. I'm running older AMD CPUs in my desktops and don't feel any need to upgrade them, though I have swapped in more powerful graphics cards. For my servers, it's the big-dollar RAID cards that get me the disk throughput I need, not the CPUs.
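Disk throughput is easy enough to sanity-check from the OS side. Here's a rough sketch (sequential writes only; the file name and sizes are just my placeholders) that writes a big file in large chunks, fsyncs, and divides:

```python
import os
import time

# Rough sequential-write throughput test. This measures the whole
# stack (filesystem, RAID card cache, disks), not the raw drives.
PATH = "throughput_test.bin"        # scratch file, deleted afterwards
CHUNK = b"\0" * (4 * 1024 * 1024)   # 4 MiB per write
TOTAL = 256                         # 1 GiB total

start = time.perf_counter()
with open(PATH, "wb") as f:
    for _ in range(TOTAL):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())            # force data out of the page cache
elapsed = time.perf_counter() - start

os.remove(PATH)
print(f"{(len(CHUNK) * TOTAL) / elapsed / 1e6:.0f} MB/s sequential write")
```

Note that a RAID card with a battery-backed write cache can acknowledge the fsync before the data hits the platters, so the number reflects the card as much as the disks; that's exactly what those big-dollar cards are for.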