I'm trying to remain objective and not offend anyone:
I could never understand how the Atari people could compare their technology to the Amiga. Jack wanted a computer built from off-the-shelf parts, and the ST fit that requirement. However, without the many custom chips and the advanced OS of the Amiga, the ST line could not compete.
I remember seeing the Amiga Boing! demo running on the Atari, and it took all the machine's power to do it.
However, I wish I could talk with the actual Atari developers, because I have had two questions in my mind all these years that I wish I knew the answer to...
1. Why on Earth would a company that is building a new computer go against the CPU manufacturer (Motorola, obviously) and use instructions that may not be in the next version of the chip? If you don't know what I'm talking about, TOS uses 68000 instructions that Motorola warned would not behave the same in future versions (68010, 68020, 68030, etc.) — the classic example, if I recall correctly, being MOVE from SR, which became a privileged instruction starting with the 68010.
Unless you rewrite or patch TOS (causing compatibility problems), you can't use a newer CPU.
This is why you could not get a MegaMidgetRacer or drop a 68010 into the Atari ST. You could get a sped-up 68000, but that was about it.
As a CS Professor and a developer, I cannot fathom why you would do this. I can't imagine a situation during development that going this route was the best solution.
2. Why would you have an off switch for the blitter? I mean, is there a moment where you go, "Boy, I wish this program would run slower, so let's have the CPU do more work..."?
Then you build an 030-based Atari computer with a new TOS full of nice features and gut the blitter entirely so you can call it your graphics workstation.
I know Commodore made some really dumb mistakes too, so I don't want to sound like a 'fanboy', but these are just things I've wondered about.
Cheers!
-P