A couple of corrections and additions to the following:
Cymric wrote:
1. Measure time with a greater accuracy than 55 ms. Amigas have CIAs which provide microsecond accuracy.
That is true, but only partially. The way the PIT (Programmable Interval Timer, the 8253/8254 chip, which provides 3 timers, channels 0, 1 and 2, but only channel 0 can cause an IRQ) works is that you can set the tick count to 65536 (the maximum) and then wait for it to count down to 0, at which point an IRQ is raised. This yields the ~55 ms timing that people are referring to. Of course, since the PIT ran off a 1.193182 MHz crystal, the actual tick time, and therefore the resolution of the timer, was 0.838 microseconds!! Therefore if you set the count to 1 and waited for an IRQ, you would have almost 1-microsecond accuracy! (Of course code and IRQ latency would make that a bit hard, but you were certainly nowhere near the mythical 55 ms; more like the low 1-4 microseconds depending on the CPU.) This could be, and was, exploited to create a Copper-like system by some of us.
Also keep in mind that any Pentium or newer CPU has the RDTSC instruction, which reads a counter as fine-grained as the clock cycles of the CPU, so one can make tiny measurements. The problem is that it cannot cause an interrupt. I believe the newer APICs do have more accurate timers, but I haven't messed around with that stuff for ages.
Cymric wrote:
2. Generate raster interrupts the way the Copper can.
That's true, because the PC never had a Copper-like chip. On the other hand, those with some ingenuity and coding skills could (and did) devise a Copper-like system, which wasn't as accurate as the Copper (e.g. every 4 pixels), but could act every scan line, and could put vertical raster bars on various PC screens (aka demos and games) for virtually "0 CPU cycles". We used the PIT for this.
EDIT: I found my old code... Now, I'm ashamed of what I wrote on line 3, but at least I had my head together for line 4:
"
; this is my software (IRQ based) COPPER-equivalent (sorta) chip for the
; Inherently Bogus Machine Piece o Crap (aka IBM PC) [v86 mode]
; but I will admit to you that I love the x86 assembler instruction set
; and that my Amiga 3000 will kick its ass at any given moment
"
Cymric wrote:
3. Display an image based on bitplanes. (Then again, the Amiga cannot really display a chunky image without employing advanced Copper trickery, and then at great loss of resolution. The entire concept is alien to the Amiga hardware, is what I'm saying.) This made the Amiga perfect for sideways 2D scrollers, but absolutely not perfect for 3D games.
That is entirely wrong. The PC since the EGA days _DID_ support bitplanes and various operations on bitplanes (XOR, etc.). There were two problems:
1) Limited bitplanes. They were only supported in 16-color mode, which meant 4 bitplanes. Yet we were able to do quite a few cool demos & intros with those, because they made drawing much quicker (usually 2x faster, since it's only 4 bits per pixel vs. 8). This also helped a lot in high-resolution modes like 640x480 and 640x400 (games like that Giger-inspired adventure game, whose title I forget, used this, and, I almost forgot, so did one of the games I worked on).
2) You couldn't scroll the bitplanes individually like Amiga playfields; you could only hardware-scroll all of them at once. Sucked big time... although there were some tricks you could do, and there was also the ability to have a vertical hardware-scrolling split screen.
All the rest holds true as far as I know.