Alpha and PowerPC failed simply because there was no compelling application s/w for them (excepting the Mac s/w, which was so well controlled by Apple that they could manage a migration of OS and apps easily). Chicken and egg...nobody was buying the chips due to the lack of apps, and nobody would develop the apps without an installed base of h/w.
While they were good, fast chips, without a business reason to buy them, blap...no market. Some technical guy ran drooling into the manager's office with charts and graphs showing the speeds and feeds, and the manager asked, "Does it run the operating system, office suite and 300 other apps our customers use? No? Ok then, thanks for the idea."
On the server side, different problem. Most servers aren't CPU bound, and even for those that are, Alpha and PowerPC products didn't really offer a hugely different bang for the buck over existing products. Again, no compelling business driver to include a second, very different hardware platform in the data center.
Correct on IBM. Few companies want to become commodity suppliers. Fewer still want to be forced from being a high-revenue product supplier down to commodity status. And nobody wants that to happen when the guy forcing you there isn't making, and in fact can't make, any money by doing it. When you're 10-25x the size of the guy forcing you into commodity status, your manufacturing efficiency is among the best in the industry, and the little guy has no chance of outlasting you or besting your efficiency to the point where he'd eke out even a little profit...then what the hell are we all doing?
Itanium and 64-bit desktops. Itanium was a good idea when it started. A lot of screwing around with engineering and marketing groups unable to agree on how to proceed froze the whole Intel/HP relationship up for long enough that the products weren't that great. Intel stayed in it because they promised HP they would; then HP bailed out. Gee, thanks. It makes great sense to have a different server architecture than desktop, because the apps they run really don't have a lot in common. And x86, frankly, is a piss-poor architecture for servers, with or without 64-bit extensions. The approach x86 took to handle memory extensions past 1MB is awful. x86-64 didn't fix that; it just allowed you to address more memory in an inefficient and hideous manner.
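For readers who never had to live with it, the "awful" 1MB-era scheme being complained about here is x86 real-mode segmentation: a 16-bit segment register and a 16-bit offset are combined into a 20-bit physical address, so the same physical byte can be named by thousands of different segment:offset pairs. A minimal sketch of that calculation (illustrative code, not from the original post; the function name is mine):

```python
def real_mode_physical(segment: int, offset: int) -> int:
    """Real-mode x86: physical address = segment * 16 + offset,
    masked to 20 bits (the address space wraps at 1MB without A20)."""
    return ((segment << 4) + offset) & 0xFFFFF

# Aliasing: many segment:offset pairs point at the same physical byte.
assert real_mode_physical(0xF000, 0x1234) == real_mode_physical(0xF123, 0x0004)

# The 1MB ceiling: even the largest pair wraps back below 1MB.
assert real_mode_physical(0xFFFF, 0xFFFF) == 0x0FFEF
```

Every later extension (the A20 gate, expanded/extended memory, PAE's 36-bit paging bolted onto 32-bit pointers) layered more indirection on top of this rather than replacing it, which is the inefficiency the paragraph above is getting at.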
I'd probably argue that more than 4GB of memory isn't really necessary on a desktop in 98-99% of applications, and probably won't be for 4-5 more years...at least. On the server end, big help.
But revamping x86 for a server platform via x86-64 isn't really a super solution. Itanium was too friggin late and too poor a performer to take up that role, although the idea was good even if the execution wasn't.
Instead, Intel kowtowed to the market's wishes..."Gimme 64 bit! I don't know why I want it, other than that 64 is two times 32! Don't go confusing me with any facts either, my mind is already made up!"
I'm not going to blame reduced profits on the Itanium thing, but it was a factor. Intel cut a lot of the money headed for the server programs and put it into improving manufacturing efficiency and into desktop programs, since desktops were the big revenue producer.
The people managing the server end of things were a little ridiculous too. I once sat in a meeting for almost 8 hours while a very senior guy from the server group argued about what name we should give a program, because he'd proposed the same idea a few years before and been shot down. Although he had been unable to make a compelling business case for it then, he was steadfast that if someone else had successfully proposed "his idea", it should carry "his name". That we'd spent six figures on promotional items bearing the existing program name, all of which would have to be thrown out, was irrelevant. We ended up calling the program "Bob" for two weeks to shut him up. I still have a huge box of polo shirts with the old program's name on them.
When Intel acquired a portion of Compaq's microprocessor division, including part of the original Digital Alpha team, those people immediately became second-class citizens. A lot of good ideas were wasted or delayed, most notably dual CPU cores and some good multiprocessor architecture.
So somewhere between reduced revenues, moronic management, competition that does nothing but take the profit out of the business, vanity, and "not invented here" syndrome, we have 4GHz processors you can fry an egg on and an expanded-memory-architecture processor nobody needs that's wildly inefficient at what it's supposed to do...but really low prices.