Was a key component, back when operating systems and applications used large amounts of assembler. Now it's a commodity component like anything else: you use the CPU that makes the most sense for the hardware you are building, not because it belongs to a particular family; the software that will run on it can always be recompiled for your target.
You say that like it somehow makes an ugly architecture un-ugly. Yes, compilers can make any high-level language essentially usable on any Turing-complete architecture with sufficient memory.
That's beside the point. Kludgey is still kludgey and elegant is still elegant, whether or not most people care about it. Some of us still like to use assembler, if only for hobby purposes. Some of us still care about these things.
Besides, architectures
do still make a difference today, if less so than in the past - or weren't you paying attention when Android x86 became a full-fledged project rather than a simple cross-compile, because the original was heavily optimized for ARM?
It's 2011, not 1992. I used to hate x86 too. That's all it really was, just hate for the sake of it. The truth is that, like it or not, x86 has risen above every reasonable technical criticism that's ever been levelled against it. First it was too slow and would never survive the RISC revolution. Which it did, just fine. Turns out the main architectural features of RISC don't actually require a reduced instruction set to implement. Then it was all "it will never survive the 64-bit revolution". Erm, no: if anything, it's the most popular 64-bit platform in existence, and likewise the most popular multi-core platform.
I don't give a damn about historical turf wars. I've looked at the architecture, both in its original "some day I'll be a real 32-bit chip!" incarnation and in its later forms, and I just plain don't care for it. Too few registers and an almost-but-not-quite orthogonal approach to using them that's never
entirely disappeared, for one thing. (At least they seem to have
finally ditched the last vestiges of memory segmentation in x64.)
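To make that gripe concrete, here's a rough sketch of my own (GCC-style extended inline asm, 32-bit-flavoured x86; the little helper functions are just illustrative names): plenty of instructions are still hard-wired to particular registers, so a variable shift count has to live in CL, and one-operand MUL always reads EAX and dumps its product into EDX:EAX, no matter what the register allocator would rather do.

    /* Rough illustration (GCC extended inline asm on x86): some instructions
       are tied to specific registers regardless of what the compiler prefers. */
    #include <stdio.h>
    #include <stdint.h>

    /* A variable shift count must be in CL - hence the "c" constraint. */
    static uint32_t shl_var(uint32_t x, uint8_t n)
    {
        __asm__("shll %%cl, %0" : "+r"(x) : "c"(n));
        return x;
    }

    /* One-operand MUL reads EAX and writes the product to EDX:EAX;
       you don't get to pick a different destination. */
    static uint64_t mul32(uint32_t a, uint32_t b)
    {
        uint32_t lo, hi;
        __asm__("mull %3" : "=a"(lo), "=d"(hi) : "a"(a), "r"(b));
        return ((uint64_t)hi << 32) | lo;
    }

    int main(void)
    {
        printf("%u\n", shl_var(3, 4));                                  /* 48 */
        printf("%llu\n", (unsigned long long)mul32(100000u, 100000u));  /* 10000000000 */
        return 0;
    }

Compare that with a typical RISC, where any general-purpose register will do for either job.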
It's not the "worst CPU
evar!!!1," but Lord, it's not
good. And the fact that the hurdles you list have been overcome attests to nothing more than the sheer amount of time and money that a whole lot of different parties have poured into making sure it keeps up with newer, better-designed architectures - of course it's kept up; the freaking RCA 1802 could've become a modern desktop workhorse with those kinds of resources.
Which is what the vast majority of consumers want and hence what any business that wants market share will aim for.
Aaand
when did we start judging quality by commercial success? If we judged movies by their box office,
Transformers 2 would be a masterpiece.