ronybeck wrote:
And the big boys say NVidia cheats on 3D benchmarks.
hehehe I think NVIDIA termed it a "product enhancement". That is, the NVIDIA drivers detect when 3DMark is running and drop the texture quality slightly (plus other things, I guess) to produce a slightly better result. ATI does this as well; they just don't get caught.
The latest (NVidia) thing is that they rewrote some major aspect of their renderer to produce an acceptable result in the benchmark. Some arm-twisting got Futuremark to agree that it's an "application-specific optimization," but it was a bit more severe than ATI's past "Quack"ery: changing the angle at which the scene was rendered would totally destroy the image, as opposed to things like texture-quality hacks (ATI fessed up to something subtle, too, but subtle enough that it didn't really up their numbers much, either)... Whether you consider that an "acceptable" optimization is up to you. Fact is, they got into a tight spot: they don't agree with the direction some "standards" (DX9, etc., vs. their own Cg) are taking anyway, they had to put a chip out the door on an old process with a fan the size of a kitchen appliance, and they optimized the specific benchmark case to heck to try to save some face.
Since most actual 3D apps let you move the 'camera' (player movement in an FPS, rotation and the like in CAD or data vis), it was considered a pretty severe cheat... er, "artificial optimization," as these things go. Meanwhile, ATI has had driver/design bugs in the past that just totally ruin rendering to begin with (something to do with the sky in some popular demo, around the release of the initial Radeon... and those Rage II+ DVDs I loathe so much can't even get 2D right).
Tons of articles on the subject, if you Google around.
What you have to remember, though, is that AOS4 won't have DirectX. It will use Mesa, a.k.a. OpenGL. Given this, keep in mind that NVIDIA has always been an OpenGL card (the drivers support DirectX too), while ATI is designed as a DirectX card. As such, NVIDIA cards have an edge over ATI in OpenGL performance.
I don't really know. Maybe they do optimize this way, but I assume NVidia is most concerned about pushing their own technologies that bring them revenue and a hope of mindshare/lock-in (Cg, again, being the example I know of off the top of my head). Apple certainly doesn't use DirectX, and ATI has had a few wins there, though that's also political after the whole "NVidia leak" BS. Do benchmarks back this up, and can anyone trust them either way anyway?

It really doesn't matter what card you use. There won't be anything on Amiga that needs the awesome power of a Radeon 9800 for a long time (if ever). So it will just be a budget preference, unless your religion prohibits you from buying NVIDIA ;-)
If we get some ports of "big-name" games, hopefully we'll see it sooner rather than later. However, since we'll be using the relatively agnostic OpenGL, and/or Warp3D, which nobody in the "real world" has heard of anyway, I think we can reduce it to "a fast card is a fast card"; the biggest predictor of performance issues will probably be the openness of the vendors, since most drivers will probably be third-party (SciTech, Hyperion, P96?). Right now, this favors ATI, but for all we know, NVidia will overcompensate in our favor once they realize they're losing friends. ("Hey, who cares if we give the Amiga nuts some specs? Nobody uses those, anyway, and it'll make a good press release." :-D)
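For what it's worth, that's the nice thing about the "agnostic" OpenGL route: the application code doesn't care whose silicon is underneath. Here's a rough sketch (assuming an ordinary Mesa/GLUT setup, nothing Hyperion- or Warp3D-specific) that just prints the vendor/renderer strings of whatever driver is loaded; the same source runs unchanged on an NVidia or an ATI card:

    /* Minimal sketch: query the OpenGL implementation strings.
     * Assumes Mesa plus GLUT; GLUT is only used to get a current
     * GL context so glGetString() returns something useful. */
    #include <GL/glut.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutCreateWindow("vendor check");

        printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));

        return 0;
    }

Swap the card, and only those strings change; the rest of your GL code stays the same, which is really why "a fast card is a fast card" here.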