amigaksi wrote:
>by mdwh2 on 2008/9/2 19:00:43
>>amigaksi wrote:
Alex is right in trying to repaint the screen because the ORIGINAL point is showing a screen full of sprites on a system that does not have sprites.
Well... when you say "a screen full" you do mean 8. The Amiga could only display 8... though I know that if you are prepared to limit their vertical motion you can reuse the unused sprite areas... but really that's not very useful. When I was programming Amigas I would not use hardware sprites very often; blitter objects were much more flexible (I would often put my blitter objects on a foreground dual playfield so they behaved more like sprites).
But anyway, sprites were simply a solution to the problem of low RAM bandwidth... as RAM bandwidth increased, they became less useful. The colour depth limit alone makes them impractical for most tasks.
>Right, I understand this -
You don't, because later in your post you state the same thing -- let the hardware do it. You can't let the hardware do it if the argument is about how to render sprites on a system that does not support hardware sprites.
Either you are stupid, or you are doing this on purpose.
Modern gfx hardware can display graphical objects all by itself... These graphical objects are far in advance of anything the Amiga hardware can do. I explained all this in my earlier post, which for some reason you ignored.
>and even if you have to repaint every pixel by CPU, this is easily possible on modern hardware.
That's not the argument either. This is a straw man argument. When you accurately emulate some aspect of a system, you have to meet or exceed its requirements; here I PURPOSELY used the words REAL-TIME sprites, meaning you have to meet the real-time constraints of the original item you are trying to emulate. So back to the point: if the Amiga 1000 OCS can render 30 sprites in around 40 microseconds, you have to do the same on the new system in 40 microseconds or less.
The OCS A1000 can display 8 hardware sprites (at low resolution, 4 colours each) per refresh...
Any modern system can easily exceed that number at full screen resolution and at 24-bit colour depth... even just using the CPU, and it will perform even better if you use the GFX hardware.
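Just to make that concrete: even the dumbest CPU-only approach -- repaint the background, then overdraw each object -- is only a few hundred thousand pixel writes per 640x400 frame. A rough sketch in C (the Sprite struct, FB_W/FB_H and render_frame are names I've made up for illustration, not any real API):

/* Rough CPU-only compositing sketch (not Amiga code, names made up):
   repaint a 640x400 32-bit framebuffer from a background image, then
   overdraw a list of small objects, skipping transparent pixels. */
#include <stdint.h>
#include <string.h>

#define FB_W 640
#define FB_H 400

typedef struct {
    int x, y, w, h;          /* position and size in pixels          */
    const uint32_t *pixels;  /* ARGB data; alpha 0 means transparent */
} Sprite;

static uint32_t framebuffer[FB_W * FB_H];

static void draw_sprite(const Sprite *s)
{
    for (int row = 0; row < s->h; row++) {
        int fy = s->y + row;
        if (fy < 0 || fy >= FB_H) continue;          /* clip top/bottom  */
        for (int col = 0; col < s->w; col++) {
            int fx = s->x + col;
            if (fx < 0 || fx >= FB_W) continue;      /* clip left/right  */
            uint32_t p = s->pixels[row * s->w + col];
            if (p >> 24)                             /* skip transparent */
                framebuffer[fy * FB_W + fx] = p;
        }
    }
}

static void render_frame(const uint32_t *background, const Sprite *sprites, int count)
{
    memcpy(framebuffer, background, sizeof framebuffer);  /* full repaint     */
    for (int i = 0; i < count; i++)
        draw_sprite(&sprites[i]);                         /* then the objects */
}

Even done that naively, a modern CPU gets through a frame like that in well under a millisecond, and that's before you hand any of it to the GPU.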
You are the one shifting the argument... by bringing emulation into this... then you have the overhead of all the system interactions to wait for. But on a modern system emulation is easy; I had A500 emulation running any program I threw at it at full speed in 1999 on an old P233...
IT HAS NOTHING TO DO WITH REFRESH RATE. Imagine a scenario where, besides the 40 microseconds, all the other time is being used to send pulses through the I/O ports, or the Amiga is in a HALTed state and some other machine is controlling a medical heart/lung machine.
You are deflecting... It has everything to do with refresh rate. We are talking about the gfx system, and in such a system the quantum is the refresh rate.
>> Moreover, you missed another point-- that you have to use a standard graphics card/CPU not something that works on maybe your system and you are NO LONGER using a system that does NOT support sprites.
Is this even a valid English sentence?
>What do you mean "works maybe your system"? 3D graphics cards that do texture mapping in hardware have been around for over a decade!
I know cards are around, but we're talking about standards. AGP is the standard since most people nowadays have AGP or better cards. I'll answer this further below.
Even my oldest PC is PCI-E... but why are you talking about a connector interface? Had you said VESA 2.0 (the standard for all gfx cards) I could have given your post more credence...
>How old is your ATI card exactly?
It does not really matter, since it has to work on most PC systems, which would require doing it in software rather than relying on some sort of "sprite" hardware being present.
Then use a surface normal object with the 3D hardware... But the blitter is more than capable of this task, and it is the method I would choose on the Amiga too... hardware sprites are lame for most tasks.
>Software written for graphics cards will work on any make of graphics cards (although there may be some differences, this is in areas that is way beyond what any Amiga chipset ever did) - unlike banging the hardware, which won't work on anything, possibly not even a newer version of that chipset from the same company (consider all the OCS vs ECS vs AGA incompatibilities).
That's wrong. OCS banging works just fine on ECS/AGA as far as I have tried it, and is thus fine for this argument. On the contrary, you can't be sure a graphics card will support certain hardware features you may be relying on.
Modern systems are highly integrated software/hardware combinations. The driver provides an abstraction away from the hardware; from an engineering point of view this is a vastly superior solution.
Once you add a feature to the hardware, where the hardware is exposed to the developer, that feature can never be removed... if you only offer software interfaces, the hardware can be improved and the feature deprecated (for removal in future).
And some software/OS/drivers may shut down certain hardware features without you knowing it.
You what?
And there are more bugs in these software/OS/drivers than in OCS/ECS/AGA compatibility.
Software can be updated, simply and quickly. Hardware is set in stone (or rather silicon)... If I buy a device with a broken driver, a simple software update fixes it... buy a device with broken hardware... and that device is always going to be broken.
So maybe it will work using a device driver and maybe it won't. Hardware banging is allowed for by Commodore themselves in the Hardware Reference Manual, as I already explained.
Yes, Commodore did allow hardware banging... and Apple didn't... which one is still around?
As you improve the hardware, and programmers are used to an exposed hardware interface, you have to keep the old circuits in place... filling the chip up with antiquated functions that steal space from modern features.
>If you're rendering from hardware, the CPU doesn't need to do a thing.
We're not sure if the hardware is present, so we need to take the worst case and use some algorithm like this (after pasting the sprites into the appropriate areas):
On a modern system you don't need to worry whether a hardware feature is present; the driver either uses the hardware or emulates the feature as best it can. That way software always runs as well as it can!
mov  ecx, 640*400/4    ; number of dwords in a 640x400 8-bit screen
cld                    ; copy forwards
mov  edi, VidMem       ; destination: visible video memory
mov  esi, BitMapPtr    ; source: off-screen bitmap with the sprites already drawn in
rep  movsd             ; block copy the whole frame
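For anyone who doesn't read x86, that loop is just a straight block copy of a 640x400 8-bit screen; roughly this in C (vid_mem and bitmap_ptr stand in for whatever VidMem and BitMapPtr point to):

#include <stdint.h>
#include <string.h>

/* Same job as the REP MOVSD loop above: copy a 640x400 8-bit
   back buffer (640*400 = 256,000 bytes) into the visible screen. */
void blit_frame(uint8_t *vid_mem, const uint8_t *bitmap_ptr)
{
    memcpy(vid_mem, bitmap_ptr, 640 * 400);
}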
Don't post random crap in forums, it annoys me.
>You don't need "sprites", because any bog standard (or even several years old) PC will do it in hardware. You don't need latest and greatest - that was just an example of what modern hardware is like today - a 10 year old Voodoo would do it.
It's not a standard, and some AGP cards do not support hardware sprites. Regardless, the argument is about emulating sprites on systems that DO NOT support them in hardware.
Hardware sprites are a solution to a problem that doesn't exist anymore... Sprites were only useful in low memory, low bandwidth systems...
>But even if we restrict ourselves to a CPU solution, I don't see why this is not possible. The obvious example would be a software 3D renderer, which has to redraw the entire screen many times a second. That was being done a decade ago with Quake - now computers are doing things like real time raytracing!
See now why this is called a straw man argument.
The Quake example given was a perfect counter to your argument. This is shown clearly by your inability to refute it.
>> Duh! Perhaps, I should put in Video Toaster in my machine and use that to my advantage as well and some other souped up attachment that only works on my Amiga.
>Okay, fine - and what will it be able to do better, compared with modern hardware?
Another straw man argument. I never said I'm trying to beat out modern hardware;
Yes you did: you said modern hardware can't display 8 low-res, 4-colour graphical objects (i.e. sprites) on screen and move them every screen refresh, but a 25-year-old computer system can. Your premise is wrong.
since you kept picking some 100GB graphics card, I started picking hardware which is nonstandard for Amigas. I purposely picked the OCS Amiga as an example, not even AGA, to stick with the bare standard where you know exactly what is happening in a REAL-TIME setup.
Real-time in computer science simply means achieving a task within a set time constraint. The task must complete by its deadline, or it has failed.
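If it helps, the whole real-time point boils down to a check like this (just a sketch in C, with the 40 microsecond figure from earlier used as the budget):

#include <stdio.h>
#include <time.h>

int main(void)
{
    const double budget_us = 40.0;   /* deadline, from the argument above */
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    /* ... do the rendering work here ... */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double elapsed_us = (t1.tv_sec - t0.tv_sec) * 1e6
                      + (t1.tv_nsec - t0.tv_nsec) / 1e3;

    printf("%s\n", elapsed_us <= budget_us ? "met the deadline" : "missed it, task failed");
    return 0;
}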
Also, please learn to quote... it's not rocket science, yet somehow you don't seem to be able to achieve this simple task.