Bloatware typically refers to useless or seldom-used features. Roj's 60MB mouse driver? Well, the driver is probably just some 100K DirectInput hook that maps each mouse button to a system signal. The rest of it is the installer, fancy uncompressed BMP graphics, miles and miles of XML exported from a lousy PowerPoint clone, the updater that constantly runs in the background (and is almost always a service, not a task), etc. The drivers for my printer are 85MB, but the printer works just fine using the vanilla driver that comes with XP.
Also keep in mind that the Amiga really didn't have a lot of drivers for things. The whole OS was pretty much hard-coded just for the chipset, which of course was its greatest downfall. Take a look at the Linux kernel, and you'll find tons of hacks to make hundreds of devices work. Even the Macintosh, a closed hardware platform, has to support huge numbers of different hardware configurations, and the OS is expected to adapt to each one, not require you to re-install every time you swap out one or two parts. Any OS by itself is usually quite lean.
Now, the window manager is a different story.
Don't even get me going about game consoles. It bothers me when I hear PS3 developers talking about how they "need" Blu-ray. I can't imagine even filling up a DVD given all the work that's been done with procedural synthesis. Aren't the Cell's vector processors designed with synthesis in mind?
Hellcoder: Thanks to Windows, the complexity of an OS has increased a lot. Shadow-fading windows, 3D rotating stuff, compatibility with older OSes. This adds a lot of functionality and is never good for speed.
Shadow effects and the like are easy to do.
The problem is putting things together into monolithic libraries. In any one session, you might be using less than 5% of the software's features, but it all has to be loaded into memory, just in case you might use it.
But, hey, we just swap it out to VM, so why bother worrying about how big a library is? Just include everything and let the hard drive swap its butt off!
It's too difficult from a developer's standpoint to split the libraries into individual tools, and compilers are still too stupid to do it for you. Optimization gets all the attention; packaging gets none. Hell, how many times have you worked on a group project where the dependencies don't even build properly? Nobody takes proper packaging seriously.
Hans: Maybe all developers should be given a machine with restricted resources, and told to make it work usably on that.
That's a law in many good software firms.
Paradox: AmigaOS has the most simplified structure and therefore can be advanced much further than any other OS.
Therefore I agree that AmigaOS could be the most efficient and resource-saving OS ever.
Interesting username. ;-)
The reality is that AmigaOS doesn't do very much. Yes, it's lean on resources, but it doesn't do most of the things you'd expect from a modern OS.
If I wrote a piece of sorting code that was 5 lines long in C, does that make it efficient and fast? Using more RAM can actually have huge long-term benefits. You just have to gauge how many resources your customers will have in real-world situations and not go over those limits.
It's also faster and more reliable to write good, maintainable code that gets the job done, rather than cryptic code that is amazingly efficient. What good is it that you use 30% less RAM, but the code takes ten times longer to develop and debug?
Much of the Amiga's efficiency comes from the hardware and OS being tightly integrated. But you can't get that with generic mass-produced hardware; it has to be proprietary, and no users want that because it's expensive. Remember that the PowerPC Macs cost nearly twice as much as equivalent PC hardware at the time.
I disagree. Even x86 is pretty efficient if you think about it, because hardware engineers cannot be anywhere near as sloppy as software engineers. That's the result of cutthroat competition, limited (or rather, costly) resources, and a deep awareness of time. Hardware engineers always have to know when things happen, while software tends to sit around, wait for other things to get done, and let everything drift out of sync.
To improve responsiveness, maybe software compilers should introduce more controls for monitoring time, the way hardware design tools do. Not that I have any experience with hardware design, of course.