Hi,
WOW!!!
What a discussion on which computer is better! This is an Amiga site, of course, and so by that I declare the Amiga the winner. All you Linux and Windose users are LOSERS just by the fact that you are on an Amiga site.
Greetings to you too! You'll have to explain your argument a bit more closely, because maybe I'm too much of a loser to see how it makes sense logically. I'm not here because I think the Amiga is the superior platform, but because I like old computer systems and find the Amiga particularly interesting. The Amiga sure was an impressive innovation, but in a typical home/work environment today it just doesn't cut it for the tasks a user would normally expect a computer to perform.
Let's face it: a computer is only as fast as a user can use it. There is no way any of you can type faster than your machine; therefore, almost 100% of you are using too much horsepower for what you are doing with it.
Are you serious? Then why don't you sit down and decode real time video on paper and just draw it yourself on the screen? Do you decompress zip files by reading the compressed data yourself? Do your drivers print you a message that you have to type in again to pass it to the system? Hell, why not move the laser around over a CD manually if you can do it just as responsively?
A computer is faster at what it does than every potential user, but it doesn't matter because most tasks it performs are tedious and complicated enough for it to be perceived as slow anyway.
Even when you play the most awesome games like Fallout 3, Crysis, Far Cry, or Doom 3, the computers today move much faster than what your senses can see. The only thing you are trying to do is get faster frame rates even though your eyes cannot see them; that is why most TV sets use the 22 fps rate, and anything above that is quite useless as far as the eye can see.
Judging from your totally misinformed post, I take it that you have no knowledge at all about real-time graphics. And if you don't notice the difference between running a game at 60 fps and at 22 fps, you should probably see a doctor too, because it's obvious to anyone under 80. The reason such a low frame rate works for movies (film actually runs at 24 fps) is that cameras don't capture a discrete instant of time on each frame; the shutter stays open for much of the interval between one frame and the next, which introduces a lot of motion blur that conceals the slow frame rate.
Now, achieving a similar effect on a computer costs a lot of time, because in the end it means rendering or extrapolating all the significant "in-between" frames too. I guess you could do it by hand pretty fast, though.
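To make the motion-blur point concrete, here's a toy sketch (not how any real camera or renderer works, and all function names are made up for illustration): a 1-D "dot" moving across ten pixels, rendered once by sampling a single instant, and once by averaging several sub-frame instants the way a camera's open shutter effectively does.

```python
def render_instant(position, width=10):
    """Naive game-style render: the dot lights exactly one pixel."""
    frame = [0.0] * width
    frame[int(position)] = 1.0
    return frame

def render_blurred(start, end, subsamples=8, width=10):
    """Camera-like render: average many instants within one frame's
    exposure, spreading the dot's energy along its motion path."""
    frame = [0.0] * width
    for i in range(subsamples):
        t = start + (end - start) * i / subsamples
        frame[int(t)] += 1.0 / subsamples
    return frame

sharp = render_instant(2)
blurred = render_blurred(2, 6)  # dot moved from pixel 2 to 6 during the exposure
print(sharp)    # all energy in one pixel: looks strobed at a low frame rate
print(blurred)  # energy smeared across the path: motion reads as smooth
```

The blurred frame spreads the same total brightness over every pixel the dot passed through, which is exactly the extra "in-between" work a renderer would have to simulate to get away with a film-like frame rate.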
And no, TV sets aren't usually locked to 22 fps.
So please tone down the arrogance until you actually know what you're ranting about.