MiAmigo: So, the Amiga today would sport not one, but, actually three equally powerful, equally advanced, state of the art Motorola chips, each designed specifically for jobs in sound, and graphics
Refer to Sony's "Grid" architecture. Data processing, graphics, and audio all require much the same instruction set. The reason we have dedicated chips for that work is that they are hard-wired to run the same instructions over and over, without fumbling with complex decoders.
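To put it concretely (a rough sketch in plain C, with all names invented for this example): mixing a voice into an audio buffer and blending a sprite over a framebuffer are, at the instruction level, the same multiply-accumulate loop. A DSP or a blitter simply hard-wires that loop.

    #include <stdint.h>

    /* Mix a voice into an audio buffer at a given volume (0.0 - 1.0).
       Clipping is omitted for brevity. */
    void mix_audio(int16_t *out, const int16_t *voice, int n, float vol)
    {
        for (int i = 0; i < n; i++)
            out[i] += (int16_t)(voice[i] * vol);      /* multiply-accumulate */
    }

    /* Blend a sprite over a framebuffer at a given opacity (0.0 - 1.0). */
    void blend_pixels(uint8_t *out, const uint8_t *sprite, int n, float alpha)
    {
        for (int i = 0; i < n; i++)
            out[i] = (uint8_t)(out[i] * (1.0f - alpha)
                             + sprite[i] * alpha);    /* multiply-accumulate */
    }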
GPUs really are complete computers, but they are still used as "accelerators." Why not have three CPU cores in a single chip, or ten separate coprocessors? It's not the complexity of the calculations you want done; it's just how fast you need them done, at a given cost.
Old arcade machines didn't have dedicated CGI boards. They just had a half-dozen ordinary CPUs: 68K, Hitachi, Texas Instruments, AT&T, and so on. My favorite arcade game, Hard Drivin', had a very demanding graphics and audio system, but it was driven by several standard, off-the-shelf CPUs -- one 68K to do the physics and run the game, and two Texas Instruments processors running in parallel to do the polygon calculations. Sega machines are infamous for using 68K chips for audio mixing and effects instead of dedicated DSPs. You had more programming flexibility that way, instead of being restricted by the limited capabilities of a hard-wired DSP.
Flexibility sometimes wins out over speed. Dedicated sound chips are being phased out in favor of CPU mixing. Audio effects on a Pentium 4 are easier to implement and far more capable than the effects possible with a standard EAX-compliant chip, like the EMU10K found on a typical Sound Blaster Audigy card.
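For instance, a basic echo effect is a few lines of C running on the host CPU, and you can rewrite it whenever you like; on an EAX chip you get whatever the silicon happens to offer. (A hedged sketch with invented names; a real mixer would also handle clipping and interleaved channels.)

    #include <stdint.h>

    /* Feed a delayed copy of the signal back into itself -- the core of a
       software echo. 'delay' is in samples, 'feedback' is 0.0 - 1.0.
       Clipping is omitted for brevity. */
    void echo(int16_t *buf, int n, int delay, float feedback)
    {
        for (int i = delay; i < n; i++)
            buf[i] += (int16_t)(buf[i - delay] * feedback);
    }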
nVidia has removed their much-applauded audio engine from their nForce chipset, and they have publicly stated that hardware-accelerated audio is dead.
I saw a demo where an ATI GPU was performing a realistic fluid simulation -- a smoke effect -- entirely on the GPU itself. A few years ago, only powerful, full-featured CPUs with floating-point support could do stuff like that.
Cymric: System Configuration: Also, the OS would be completely configurable at start up
Is it really necessary to have the same hardware configuration for each of the tasks you specified? Why not have a dedicated OS for each task? Note that many OSes come in single-processor or multiprocessor editions, depending on your hardware. Making one OS that works on both types of architecture, at least at this time, results in a major loss of efficiency for either type of system.
Cymric: Not a bad idea, but AFAIK FPGAs are not fast enough to meet today's demanding specifications
My feelings, too. It's unlikely that "Flash" memory will ever be used like RAM, simply because it is more complicated and will always be at a disadvantage. Simpler is always cheaper and more plentiful than complex.
It would be nice if computers could mix many types of memory and prioritize them appropriately. Who said you had to break things down into RAM and a hard drive? Why is on-die memory only used for state caching? Why can't we have high-speed RAM for calculations, cheaper years-old RAM for scratch work, flash RAM for result storage, and then page things out to the hard drive (with no filesystem) at leisure? The virtual memory and filesystem models used in today's OSes are far too simple.
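Something along these lines is what I have in mind -- purely hypothetical, nothing like it exists in today's OSes, and every name below is invented:

    #include <stddef.h>

    /* Hypothetical memory tiers, fastest and most expensive first. */
    enum mem_tier {
        TIER_FAST_RAM,   /* low-latency RAM for live calculations       */
        TIER_BULK_RAM,   /* cheaper, older RAM for scratch work         */
        TIER_FLASH,      /* non-volatile flash for result storage       */
        TIER_DISK        /* raw disk pages, no filesystem, paged lazily */
    };

    /* The program says how "hot" its data is; the system places the block
       in the cheapest tier that still meets the requirement and migrates
       it between tiers as the priority changes. */
    void *tier_alloc(size_t bytes, enum mem_tier preferred);
    void  tier_set_priority(void *block, enum mem_tier new_tier);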
Bring back the Video Toaster, and make it an integral part of the Amiga's standard hardware.
Many graphics cards have video capture built-in. Making an application that can read standard video streams makes sense, but it's doubtful that every machine needs the hardware.
Then again, sound cards with line input were a rarity a few years ago, and now even budget machines have them. When all TVs go digital in a few years, everything will be on a serial connection, so broadcast channels and video feeds will all be two-way, and you won't even need dedicated video capture hardware anymore. Death of S-Video? Why not?
Bloodline: My Gfx card and sound card are computers in their own right!
True. A typical top-tier GPU has more transistors (by several million) and performs more calculations than the highest-spec Pentium! A graphics card has its own memory and bus architecture, registers, caching... it's really a computer inside your computer -- complete with its own dedicated power feed and regulators. :-)
Rogue: You can't say, oh heck, two hours work gone, but hey, I can reboot in five seconds.
AROS?
Sorry, but it just crashes too damn much.
MiAmigo: What I actually envisioned (back when I wrote the piece, and even now) is nothing less than a solid-state computer.
EROS?
Which, by the way, has been discontinued. That experimental OS didn't get anywhere.
MiAmigo: I want my computer as accessible (no boot-up time) as my microwave, or even my kitchen sink.
Yes, as do many interface designers. However, the only solution to many complex data organization problems is caching indexes, and for that, you need memory storage. Solid-state computers will probably never really exist. We'll have computers that hibernate with a battery feeding the memory, at best.
I remember using a text editor written in AMOS that wrote data to the floppy drive one character at a time. Very reliable, but the responsiveness was calamitous.
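In C terms, it was roughly the difference below (the AMOS original looked nothing like this, of course): every keystroke forced its own trip to the drive, so the editor stalled on each character, whereas the conventional buffered approach writes whole blocks at a time.

    #include <stdio.h>

    /* One drive access per character: very hard to lose data, but the
       editor stalls on every keystroke while the floppy seeks and writes. */
    void save_char_now(FILE *f, char c)
    {
        fputc(c, f);
        fflush(f);               /* push this single character to the disk */
    }

    /* The usual approach: let the C library buffer and write whole blocks. */
    void save_text(FILE *f, const char *text, size_t len)
    {
        fwrite(text, 1, len, f); /* flushed to the disk in large chunks */
    }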
The idea of backing up memory to a storage device is a bit far-fetched, too. Preserving the complete state of every component in the system (such as the register state of your graphics sub-system) is unlikely to work, due to driver issues and non-compliance with standards. I've never owned a computer -- PC, Mac, or Linux -- that would always reliably wake up from sleep mode. You can minimize boot time, but you can't get rid of it unless it's a purpose-built machine that only does a tiny number of things and requires very little storage.
"Always On" also implies that the hardware never changes. That's fine for proprietary systems where all the chips are soldered together, but not realistic for open architecture, like the PC. Otherwise, the machine must verify that the hardware has not changed every time it turns on, and that's where a lot of startup delays occur. There was a time in history where new hardware would cause the machine to lock up and everything was configured with jumpers. I don't think people want to go back to those dark days.
Don't forget the rule of resource, either. If a resource is available, programmers will use it, whether they really need it or not. Efficiency is too much to ask. ;-)
Cymric: ...But nobody is expecting them to run Doom III or Half Life 2 at 1600x1200x32 at 8xFSAA and over 80 Hz refresh rate either. Whether you need or want such insane specs is quite another question. (I would not, my monitor cannot cope, so why bother?)
Elegance directly conflicts with human nature. Elegance is about necessity, but people want things they don't need. Yeah, you don't HAVE to have a 3 GHz machine to do certain things, but people will buy them anyway because they want them. Servers demand elegance and efficiency, and so do dedicated devices like the microcontrollers in your microwave and home thermostat. Such standards really cannot apply to PCs, hand-held phones, and other multifunction devices. Hell, you can take photos with cell phones these days, and next will be movies, GPS, wireless banking, etc.
Kinda scary, really.