MUI is the de facto standard. It has a standard L&F if you like. If you don't, then MUI programs don't share the same L&F with Gadtools or Intuition BOOPSI classes, but that is up to the user.
MUI never worked for me. It's license-incompatible. To remind you, I wrote freeware. Pay-nothing. Had I used it, users would have had to pay for the interface, though not for the software itself. Pay somebody else for my programs? I beg your pardon.
Anyhow, MUI doesn't solve the problem either. Consider a program that draws on the screen itself, preparing some kind of graphics. For the sake of the argument, consider your average Mandelbrot fractal generator. Now, how do you iconify this? Ok, you can create your *own* off-screen buffer, let the program render into that, and then copy it to the screen whenever the window is shown. But that is actually complete overkill, and it also doubles the required resources. If you have a smart-refresh window, or even a superbitmap, this backing store is already there, transparently provided by the OS. So why not use it for exactly that, namely as backing store while the window is not shown?
Instead you reinvent the wheel and reimplement a 100K framework inside your application.
Never did that. Gadtools worked nicely. Of course, it is fairly simple, but it was good enough for most cases. There was no need to go for the full-fledged 100K of MUI when I had everything I ever needed in ROM.
I recall there are many other occasions where programs tried to close the Workbench screen.
As in? There is actually only a single OS call to do that. It could be called by programs that felt like it - in the beginning to save precious chip RAM - or by games to avoid screen-switching. Or to adjust the screen mode. The first use case became non-crucial once the memory situation improved. The latter is where it makes a difference, but it is typically not annoying since you do not adjust your screen mode every minute.
The purpose of iconification is not so much to get rid of the Workbench, but to get a better overview of what's on the screen and to re-organize the windows. As soon as you have a couple of windows open, it becomes rather annoying to shuffle them around, and iconification is a solution.
Is it really so? It is a little messy, but from a developer's POV not too messy.
Yes, it really is. Its entire design is too tightly coupled to the Amiga chipset (bitplanes, blitting) and hard to abstract from. A lot of code in P96 is there just to emulate the bitmap blitting. Gfx also documented too many of its internals and does not have a clean interface. For example, the View and the ViewPort are entirely "open" structures, which led to the bizarre hack that gfx stores the monitor and viewmode information in the *colormap*, because that one was *not* documented. And it uses hacks like "GfxAssociate" to link externally published structures - which any user can mess with - to its internal administration structures, simply because the documented structures became too small and unsuitable.
The entire monitor database is an extreme hack and has no clearly documented structure. Instead, it depends on a couple of Gfx-internal databases that were never documented.
Instead of using clearly defined creation and destruction functions as Intuition does (OpenWindow/CloseWindow) - nowadays one would call these "constructors" and "destructors" - gfx relies on the user to build such structures "correctly", which is a nightmare for forward compatibility - see above: such structures cannot be extended.
Count, for example, the number of patches that go into gfx to make RTG happen - an entire hack-o-rama necessary for graphics cards, with a lot of emulation overhead due to the poor interface.
Ok, seriously: gfx was slammed together, probably in a rush, without thinking and without any design. It's an excellent example for teaching people how *not* to design a graphics-primitives library.