
Author Topic: Gforce4 support for the AmigaOne  (Read 5774 times)


Offline Floid

  • Hero Member
  • Join Date: Feb 2003
  • Posts: 918
Re: Gforce4 support for the AmigaOne
« on: August 01, 2003, 02:29:16 AM »
@Wain - I haven't been paying attention, but I assume whoever's running them is going to cap it off with "Buy a Mac!," "Buy a PC!," or something else "amusing" at the end.

Anyhow, SNAP is great: 2D support off the bat (well, whenever it appears integrated; I'd hope they get it out for the initial 4.0 pressing, since otherwise it's sort of a moot point... without it, you'll need a fairly cutting-edge ATI card or a rare Voodoo3 just to boot anyway), so there's little need to worry about native 2D drivers just yet, as SciTech know what they're doing. 3D support is technically a separate issue, and can be integrated into Warp3D or whatever the heck we're talking about here when/as it's possible.

Both ATI and NVidia seem to prove the biggest jerks always win.  Lately, ATI tries to do the right thing and fails (Windows drivers, and I shan't forget my Rage II+DVD boxes with the horrible 'black snow' bug that shows up at 800x600 or higher), while NVidia sticks to whatever might make them profitable (Cg and other proprietary initiatives... I still remember the pay drivers from a few years ago; has everyone else forgotten?).  Choose your poison.  I lean towards ATI's "ethics" lately, but fact is, the graphics market would suck even worse if one of them disappeared.

If all you need is 2D, the low-end Matrox line still seems to fit the bill.  What're these fanless cards someone spoke of?
 

Offline Floid

  • Hero Member
  • Join Date: Feb 2003
  • Posts: 918
Re: Gforce4 support for the AmigaOne
« Reply #1 on: August 01, 2003, 05:42:43 AM »
Quote

ronybeck wrote:
Quote
And big boys say NVidia cheats on 3D benchmarks.

hehehe I think NVIDIA termed it a "product enhancement".  That is, the NVIDIA drivers detect when 3DMark is run and drop the texture quality slightly (plus other things, I guess) to produce a slightly better result.  ATI do this as well; they just don't get caught.
The latest (NVidia) thing is that they rewrote some major aspect of their renderer to produce an acceptable result in the benchmark.  Some arm-twisting got Futuremark to agree that it's an "application-specific optimization," but it was a bit more severe than ATI's past "Quack"ery: changing the angle at which the scene was rendered would totally destroy the image, as opposed to things like texture-quality hacks.  (ATI fessed up to something subtle too, but subtle enough that it didn't really up their numbers much either.)  Whether you consider that an "acceptable" optimization is up to you; fact is, they got in a tight spot, don't agree with the direction "standards" (DX9, etc., vs. their own Cg) are taking anyway, had to put a chip out the door on an old process with a fan the size of a kitchen appliance, and optimized the specific benchmark case to heck to try to save some face.

Since most actual 3D apps let you move the 'camera' (player movement in an FPS, rotation/etc in CAD or data vis), it was considered a pretty severe cheat... er, "artificial optimization," as these things go.  Meanwhile, ATI's had driver/design bugs in the past that just totally ruin rendering to begin with...  (Something to do with the sky in some popular demo, around the release of the initial Radeon... and those Rage II+DVDs I loathe so much can't even get 2D right. ;))
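To make the camera-angle point concrete, here's a toy sketch (pure illustration, not any vendor's actual driver code; the 1-D "scene" and static clip value are invented for the example) of why an optimization built around a benchmark's fixed camera path falls apart the moment the viewpoint moves:

```python
# Toy model: triangles are just x-coordinates, and "culling" means
# dropping anything beyond a clip distance.  The "cheat" path clips
# against a plane precomputed for the benchmark's known camera position
# (camera_x == 0) instead of clipping relative to the actual camera.
STATIC_CLIP_X = 100.0  # hypothetical precomputed clip for camera_x == 0

def visible_triangles(triangles, camera_x, cheating):
    """Return the triangles that survive culling."""
    if cheating:
        # "Optimized" path: fixed clip plane, valid only on the
        # benchmark's scripted camera path.
        return [t for t in triangles if t <= STATIC_CLIP_X]
    # Honest path: clip relative to where the camera actually is.
    return [t for t in triangles if t <= camera_x + STATIC_CLIP_X]

scene = [10.0, 90.0, 150.0]

# Along the scripted path (camera at 0) both paths agree...
assert (visible_triangles(scene, 0.0, cheating=True)
        == visible_triangles(scene, 0.0, cheating=False))

# ...but move the camera and the cheat drops geometry it should draw.
assert visible_triangles(scene, 80.0, cheating=False) == [10.0, 90.0, 150.0]
assert visible_triangles(scene, 80.0, cheating=True) == [10.0, 90.0]
```

That's the gist of why reviewers test benchmarks with free-camera patches: a legitimate optimization holds up from any viewpoint, while a path-specific one visibly breaks.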

Tons of articles on the subject, if you Google around.

Quote
What you have to remember, though, is that AOS4 won't have DirectX.  It will use Mesa, aka OpenGL.  NVIDIA has always been an OpenGL card (the drivers support DirectX too), while ATI is designed as a DirectX card.  As such, NVIDIA cards have an edge in OpenGL performance over ATI.
I don't really know.  Maybe they do optimize this way, but I assume NVidia is most concerned about pushing their own technologies that bring them revenue and a hope of mindshare/lock-in (Cg, again, being the example I know of off the top of my head).  Apple certainly doesn't use DirectX, and ATI's had a few wins there, though that's also political after the whole "NVidia leak" BS there.  Do benchmarks back this up, and can anyone trust them either way anyway? ;)

Quote
It really doesn't matter what card you use.  There won't be anything on Amiga that needs the awesome power of the Radeon 9800 for a long time (if ever).  So it will just be a budget preference, unless your religion prohibits you from buying NVIDIA ;-)
If we get some ports of "big-name" games, hopefully we'll see it sooner, rather than later.  However, since we'll be using the relatively agnostic OpenGL, and/or Warp3D which nobody in the "real world" has heard of anyway, I think we can reduce it to "a fast card is a fast card;" the biggest predictor of performance issues will probably be the openness of the vendors, since most drivers will probably be third-party (SciTech, Hyperion, P96?).  Right now, this favors ATI, but for all we know, NVidia will overcompensate in our favor once they realize they're losing friends.  ("Hey, who cares if we give the Amiga nuts some specs?  Nobody uses those, anyway, and it'll make a good press release." :-D)