Henngh. Some points:
*NIX, as about the fifth comer to the desktop market (after PC, Mac, Amiga, Atari, Acorn...), is facing a serious squeeze on namespace. Simple names like "Media Player" or "CD Burner" are already taken a dozen times over, and rebranding a standalone package (as Linspire has tried) requires constant maintenance of a rebranding patchset. If you look closely, Apple and MS are feeling the pinch as well, which is why you get names like "Front Row" instead of "Presentations."
A *NIX desktop can be reasonably "polished" if you swear to only use GNOME or KDE apps. Maintaining this discipline is rough, of course, particularly if you don't know what you're doing. At least, these days, almost any "desktop" has twice the resources needed to run both at once.
That situation is not much different from an Amiga running both MUI and ReAction apps at once. But the Amiga didn't reach that point until the platform was practically dead anyway, so maybe no one noticed. (In fairness, the control panels/configuration utilities for both libs are probably more easily located.)
The UNIX hierarchy (original flavors or LSB) is slightly weird, but as long as developers stick to it it shouldn't cause problems. Approaches like iTunes (to pick one major example that most of you have probably learned to work with), or for that matter every search engine ever built for the web, show how hierarchy is an obstruction to UI anyway. What UNIX (and most OSes, really) are still missing are UIs that are both reliable and agnostic about specific paths... On any system, the big problem is getting the computer to "Show me what I want."
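The hierarchy-bound vs. path-agnostic distinction can be sketched in a couple of lines of shell (the directory names here are hypothetical, and this assumes nothing beyond a POSIX shell and the standard find(1)):

```shell
# Hypothetical layout for illustration only.
demo=$(mktemp -d)
mkdir -p "$demo/Music/Albums"
touch "$demo/Music/Albums/track01.ogg"

# Hierarchy-bound: the UI (or user) must know the exact path up front.
ls "$demo/Music/Albums/track01.ogg"

# Path-agnostic: ask for what you want by attribute, wherever it lives.
find "$demo" -name '*.ogg'
```

The second query keeps working if the file moves to a different folder; the first breaks, which is roughly the fragility being complained about above.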
...
Linux works (when the planets align, same as any other system), but Linux maybe wasn't the best place to start the desktop UNIX experiment. Linux as a codebase doesn't seem to "learn" well, since Linus continues shuffling isolated projects in and out of the tree. If Linus had decided to be disciplined and say "There ought to be limits to Linux," then certain things would be set in stone by now, but that's not his style... and his style is not necessarily cooperative with anyone's efforts to do anything. :-D
That's not meant to be an insult, but the 'Linux is about what Linus finds interesting' aspect does bleed over quite visibly every now and then. Thankfully, Linus is now interested in not being bothered by people constantly asking him to shuffle isolated projects in and out of the tree, so the situation (in terms of trying to generalize interfaces and stick with them) is incrementally improving. On the one hand,
you should stick with the codebase you're handed, but on the other hand, Linus is only human.
If this adventure had started with a boring, relatively static foundation (my bias is toward the BSDs, say circa FreeBSD 3.x-4.x), the attempts at desktop distributions might have faced much less breakage from underneath. But once Flash and Java in their proprietary forms became things, that route was closed off and Linux became "the codebase we were handed."
Having watched this for 10 years or so now, I'm ambivalent as to whether the openings of Flash and Java and some of the other blobbish formats floating around today will really change anything. A lot can happen in the next ten years, and new beasts will probably emerge by the time FOSS support for the old ones is fully-baked.
...
Anyhow, I think the point I'm meandering towards is this: About all there is to learn from the Amiga has already played out on the UNIX desktop (diverse applications, diverse interfaces, diverse libs), albeit on a much more horrific scale than we ever had to deal with when 20MB was a massive disk. There's an element of tolerance to be learned, to be sure -- a "funny looking" tool may look funny because it's been designed for its purpose -- and humility -- a "Workbench" that does a few things well may be more accessible than the modern approach that mingles all sorts of metaphor-breaking desklets and so on before the user even knows what he's doing... but I think I've been converted: If you find yourself grunting "Why can't this be an Amiga?", you have to step back and consider how the system at hand differs from an Amiga.
It also pays to realize this: The initial whiz-bang of the Amiga was in the hardware, followed by just enough software to realize it. If technologies like SSDs and nonvolatile RAMs are about to solve the shutdown/startup debates, maybe it's time to welcome them, cheer Moore's Law, and get some other work done.
...
I certainly welcome anything Amiga-like (and something Amiga-like is still required for good performance on 300MHz handhelds or Atom-powered machines that run like 300MHz handhelds), but I'd still be happier to see a "heavy" UI on UNIX get fully-baked than yet another XFCE ("It's smaller! It does less! We're really proud it has a calendar! Half the programs you run will pull in all the GNOME or KDE libs anyway!"). If you're still ambivalent now, see how you feel when 3GHz machines with 4GB RAM are propped up for free next to a dumpster.
:roll: