Amiga.org
Amiga computer related discussion => Amiga Hardware Issues and discussion => Topic started by: Minion on January 27, 2003, 07:21:21 PM
-
Just seen the results for the GeForce FX on Tom's Hardware - it's totally lame considering it has a 500MHz GPU clock with 1GHz DDR-II memory. On average it's a little bit faster than a Radeon 9700 Pro, and in many tests it is slower. Bear in mind that the Radeon uses cheaper DDR memory and a lower-tech 0.15 micron process. Now if only ATI can sort out their drivers.
Whoever posted the GeForce FX WOWOWOWOWOW post needs to learn not to be suckered by marketing speak. (Remember the Pentium 4 specs?)
-
As with the NV25 and R300, the driver support has to mature. Refer to the serious OpenGL driver battles for an indication of the NV30's potential. At the moment, the official release drivers don't even support the NV30+ family (only leaked betas do).
Anyway, nVidia has branched into the chipset and integrated audio markets, btw…
Likewise with the "Pentium 4" (a core change), it takes time to mature.
PS: I recall that nVidia hasn't made the shift to a pure 256-bit bus, unlike ATI, Matrox and 3DLabs. The GeForce FX is one crippled GPU, even with 1GHz DDR technology (due to DDR overheads).
-
On average its a little bit faster than a Radeon 9700 Pro, and in many tests, it is slower
To quote www.tomshardware.com
NVIDIA takes the crown! No question about it - the GeForceFX 5800 Ultra is faster than the competition from ATI's Radeon 9700 PRO in the majority of the benchmarks. However, its lead is only slight, especially compared to the distance that ATI put between its Radeon 9700 PRO and the Ti 4600. Still, when compared to its predecessor, the GeForce4 Ti, the FX represents a giant step forward.
-
I'd hardly call the GeForceFX pre-release boards slow... It does trample the current fastest graphics card on most tests, and comes very close on the few it loses...
However, what DOES concern me is the power consumption and the personal tornado generator that nVidia calls a cooling unit on the thing. THAT, to me, looks like a sign of desperation. Much like the wacky things 3dfx was trying shortly before they went belly-up. (Trying to stuff 4 GPUs on a single consumer-level card, anyone?)
Hopefully nVidia will manage to turn things around, though. For the most part, they've shipped quality products in the past, and have done some wise things with drivers, etc... Of course, if they don't it'll just be ATI's turn to run with the lead for a while...
Nothing much changes except for the names and the locations of the bugs... :-D
-
The GeforceFX may be slower in terms of FPS on 'current' games... but it's feature-packed...and with Cg its features will be easily implemented... its raw polygon push is much more appealing from what I've read... and I'm sure its drivers will be much better.
Its clock will go up 300MHz from what I've read....
I think it's wise not to get sucked in by the FPS of current games... it's about more than just FPS at whatever AA setting...
If you judge a GPU by its FPS and not by its features....then you're missing the point entirely... we're entering a time where FPS matters less and features matter more... and stability/drivers/etc matter more... the race for FPS was over at the GeForce4 ...now it's time to see some features and higher poly counts at the same FPS...
I well remember the P4 specs... first-generation P4s got beat by Athlons... now look where we are a little while later?
To me personally... ATI will never be in the lead until they're able to get decent Windows drivers... I don't consider their current pseudo-decent drivers to be on the level of 3DLabs/nVidia or even SiS's Xabre 400 or Matrox's Parhelia... sure those cards might not all stack up in the hardware department...but as everything including the Amiga proves... hardware is only half the battle...until ATI can get decent drivers out there...I won't buy their products...even if they're 10x the speed...
-
Hopefully nVidia will manage to turn things around, though. For the most part, they've shipped quality products in the past, and have done some wise things with drivers, etc... Of course, if they don't it'll just be ATI's turn to run with the lead for a while...
Unlike 3DFX, there are no problems with nVidia's financial future since they have branched into other market segments.
From the leaked beta 42.70 driver's nv4_disp.inf file. There are other code names too.
NVidia.Nv25.3 = "NVIDIA NV25"
NVidia.Nv25.4 = "NVIDIA GeForce4 Ti 4200"
NVidia.Nv25GL.1 = "NVIDIA Quadro4 900 XGL"
NVidia.Nv25GL.2 = "NVIDIA Quadro4 750 XGL"
NVidia.Nv25GL.4 = "NVIDIA Quadro4 700 XGL"
NVidia.Nv28.1 = "NVIDIA GeForce4 Ti 4800"
NVidia.Nv28.2 = "NVIDIA GeForce4 Ti 4200 with AGP8X"
NVidia.Nv28.3 = "NVIDIA GeForce4 Ti 4800 SE"
NVidia.Nv28GL.1 = "NVIDIA Quadro4 980 XGL"
NVidia.Nv28GL.2 = "NVIDIA Quadro4 780 XGL"
NVidia.Nv30.1 = "NVIDIA NV30"
NVidia.Nv30.2 = "NVIDIA GeForce FX 5800 Ultra"
NVidia.Nv30.3 = "NVIDIA GeForce FX 5800"
NVidia.Nv30GL.1 = "NVIDIA Quadro FX 2000"
NVidia.Nv30GL.2 = "NVIDIA Quadro FX 1000"
NVidia.Nv31.1 = "NVIDIA NV31"
NVidia.Nv31.2 = "NVIDIA NV31 "
NVidia.Nv31GL.1 = "NVIDIA NV31GL"
NVidia.Nv31GL.2 = "NVIDIA NV31GL "
NVidia.Nv34.2 = "NVIDIA NV34"
NVidia.Nv34.3 = "NVIDIA NV34 "
NVidia.Nv34.4 = "NVIDIA NV34 "
NVidia.Nv34GL.3 = "NVIDIA NV34GL"
NVidia.Nv34GL.4 = "NVIDIA NV34GL "
-
It's hard to place the blame. According to Anandtech, overclocking the memory doesn't yield any performance increase, so the problem is either the drivers or the core itself.
ATI switched to a new core design a while ago (which is why the Radeon 9700 doesn't use unified drivers with the 8500), so they've had time to test the hell out of their drivers, which normally has been ATI's weak point. NVidia still has some work to do on their drivers, but by then the brand-new R350 Radeon core will be available. After all, the GeforceFX only comes out on top if you turn on ALL the effects, including those not supported by any current games. The Radeon Pro really does a number on NVidia for current games.
Frankly, I think it's like the old AMD vs Intel battle. Intel has the clock speeds, but AMD is just a more efficient processor. Raw clock speeds can't save the GeForceFX. NVidia will have to do some major work with their drivers just to stay competitive with ATI.
I don't think the Cg effects of the GeForceFX will be embraced so quickly with this kind of performance disappointment. And with that damned leafblower roaring and your first PCI slot blocked, I think the whole GeForceFX release has been a disaster.
Reminds me of the Matrox Parhelia. That card had more memory bandwidth than anything else out at the time, but the card was still not so hot (but it sure was expensive!!!)
-
...there are no problems with nVidia’s financial future since they have branched to other market segments.
Correct me if I'm wrong, but I believe they said the same things about Enron... (Great growth, corporate diversification, etc...) Of course, they were blatantly cooking the books, where nVidia was only investigated, with no charges brought. Take that for what it's worth.
In today's microcomputer market, it only takes one botched product cycle to put you in pretty deep trouble. Personally, I'm a bit worried that through all their other 'market segments' such as mediocre chipsets and XBox alliances, they lost sight of what got them to the top... making a good Graphics Processor.
-
I don't care what manner of PR the FX gets, in my mind and most of the people at Sharkey's (http://www.sharkyforums.com/forumdisplay.php?s=&forumid=14), nVidia are sitting on a lemon. I hope they can make as much revenue from selling lemonade as they did selling video cards last year. lol.
-
Ilwrath wrote:
...there are no problems with nVidia’s financial future since they have branched to other market segments.
Correct me if I'm wrong, but I believe they said the same things about Enron... (Great growth, corporate diversification, etc...) Of course, they were blatantly cooking the books, where nVidia was only investigated, with no charges brought. Take that for what it's worth.
Does Enron produce and own real products (for sale)? Or are they just another middleman (i.e. energy distributor)?
In today's microcomputer market, it only takes one botched product cycle to put you in pretty deep trouble.
Can that be said for the first release of the Radeon?
Personally, I'm a bit worried that through all their other 'market segments' such as mediocre chipsets
NForce 2 is not a mediocre chipset relative to VIA "crap sets". It may be true of the first nForce 1 release, but it is not true of the second release. This pattern is similar to the original release of the GeForce 256. It took the second release (i.e. the GeForce 2 series) to make it a real success (relative to 3DFX).
An Athlon XP 1800+/ASUS nForce 2/512MB DDR SDRAM/GF4 Ti 4200 delivers 254.1 FPS in Quake III (normal settings, timedemo 1, demo001, no sound). That's not bad for a 1.53GHz CPU. 3DMark2001SE @ 10336. (No overclocking)
It should be competitive with a similarly equipped 2GHz Pentium 4.
My older MSI-built VIA KT-class mobo (with similar components to the above) doesn't deliver the same frame rates as the nForce 2 based board, i.e. ~182 FPS in Quake III (default, timedemo 1, demo001, no sound). 3DMark2001SE @ 8780.
(No overclocking)
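A rough back-of-the-envelope comparison of those two sets of numbers (just a quick sketch using the figures quoted above, nothing more):
[code]
# Rough comparison of the quoted nForce 2 vs VIA KT-class results (figures from this post).
nforce2 = {"Quake III (timedemo demo001)": 254.1, "3DMark2001 SE": 10336}
via_kt  = {"Quake III (timedemo demo001)": 182.0, "3DMark2001 SE": 8780}

for test in nforce2:
    gain = (nforce2[test] / via_kt[test] - 1) * 100
    print(f"{test}: nForce 2 board ahead by about {gain:.0f}%")
# Roughly 40% in Quake III and 18% in 3DMark2001 SE on otherwise similar parts.
[/code]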
and XBox alliances, they lost sight of what got them to the top... making a good Graphics Processor.
Focusing on a single product doesn’t guarantee survival (refer to 3DFX ).
-
I well remember P4 specs... first generation P4's got beat by Athlons... now look where we are a little while later?
So you mean that the GFFX has a longer pipeline and will be able to scale to higher clockspeeds? Or are you attributing the P4's current performance to software optimization?
I'd take issue with the P4/software route since many of the P4 optimizations also benefit the Athlon and give it a performance boost as well. Perhaps not as much as the P4, though.
If you take a P4 at 2 GHz and an Athlon at 2 GHz, clock-for-clock, and run the same program, the Athlon would perform better. Sure, software optimization will help the P4 a bit, but... I think the fact that the P4 is something like, what, 600 MHz faster? is more of a factor than anything else.
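Putting rough, made-up numbers on that argument (the per-clock figure below is purely an assumption for illustration, not a benchmark result):
[code]
# Illustrative sketch only: the 1.2x per-clock figure is an assumed number,
# not a measured one; it just shows how extra MHz can outweigh a per-clock deficit.
athlon_work_per_clock = 1.2   # assume the Athlon does ~20% more work per clock
p4_work_per_clock = 1.0

athlon_ghz = 2.0
p4_ghz = 2.6                  # "something like 600 MHz faster"

print(athlon_ghz * athlon_work_per_clock)  # ~2.4 units of work
print(p4_ghz * p4_work_per_clock)          # ~2.6 units of work -> the clock wins here
[/code]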
Since the GFFX isn't exactly positioned as the first "GPU" of a family of chips designed to scale up to 2 GHz, uh, I don't see how the analogy holds any water.
---------
Enron? Diversification? Actual products? Blah blah blah.
http://finance.yahoo.com/q?s=NVDA&d=t
Looks to me like nVidia's motherboard chipset efforts, broad product range (high-end, mid-range, budget, portable), and R&D (creation of Cg and attempting to get it adopted, migration to the .13 micron process) have really paid off.
I mean, a year ago their stock was almost 7 times as valuable. Granted, the stock market is insane, but this means that investors have waning faith in nVidia's ability to provide a competitive product--diversification or not.
And now consumers are also questioning nVidia.
http://www.hardocp.com/article.html?art=NDIxLDY=
I don't know where there was a mention of 300 MHz overclocking or room for improvement. The sample only went up 30 MHz or so. They'd NEED to clock it up another 300 MHz the way the current sample is performing.
This is a "reference" board with premature drivers. I fully expect the card to perform better with the actual retail products and after a few driver revisions. However, this is a really crappy start.
Oh, and whoever linked to Tom's... that site is generally regarded as biased and bowing to nVidia. It might not be the best place to cite praise for the GFFX.
-
Focusing on a single product doesn’t guarantee survival (refer to 3DFX ).
However, there appears to have been a direct correlation between nVidia's delays in their process shrink and their next flagship product and a lack of consumer and investor confidence.
I'm not sure what your example with 3dfx is supposed to show... they had a great product in the Voodoo2. Then they had delays with their follow-up products and the products didn't perform as well as people were expecting.
Are you trying to imply that ATI is going to buy and gut nVidia? That's the only thing I see in a reference to 3dfx (RIP).
-
Can that be said for the first release of the Radeon?
The Radeon has sucked for a while. Only when ATI got threatened by nVidia's growing market presence with OEMs did they decide to try and change their image by initiating their "Catalyst" driver program and rapid betas.
And look, it's mainly on the Windows platform--where they get the most exposure. Still don't have features from older ATI cards working on the Mac. Same problem with nVidia, too.
ATI and the Radeons have done well. The hardware is capable and the drivers have markedly improved. And they also have successfully been playing the PR game. Now it's time for nVidia to turn this disappointing first impression (after tons of delays) around into something positive.
Even if 3 months down the line the drivers are giving a 30% speed improvement, nVidia needs to do something sooner rather than later or they'll fall out of the public's eye. That's just my opinion, though, so who knows. We'll all have to wait and see.
-
A few random thoughts ...
The ATI card actually has a faster memory bus (twice the bus width)...
The drivers will improve a lot; nVidia usually runs a tweaked previous-gen driver when they release new chips ... speedups will come when they put real support for the chip into the driver.
Tom's Hardware was once great (when Tom did most of the reviews himself and took pride in doing them well) but they have degraded too much lately ...i.e. comparing the XBox with the PS2 and saying the PS2 isn't as good because it lacks vertex shaders (the PS2 has really kewl vector units that are more programmable than the GFFX, btw).
Please don't compare the GFFX with the P4; this chip completely blows away the GF4, while the P4 never could really compete with the P3. The P4 was supposed to be faster per clock than the P3, but the chip got too big so they had to cut out a lot of units to make it run. Hope they make it back in the next rev ...
The GFFX has no real advantage over the 9700 feature-wise. But hey, you can use the Ultra as a hairdryer!
Don't buy a card for features you won't need during the lifetime of the card ... Do you know any game that looks a lot better with a GF3 (pixel shaders) than a GF4mx? No, didn't think so ...
/Babbling off
-
Oh yeah, you mean the most recent P4-based MBs tromping DDR400-based systems? That sure happened, but IMO it was short-lived. NForce2 with Double DDR will likely take the crown back soon.
-
However, there appears to have been a direct correlation between nVidia's delays in their process shrink and their next flagship product and a lack of consumer and investor confidence.
They still have a month until the final release. The conclusion was a bit premature. The 42.xx drivers were still in beta stage.
It would be nice IF you could show me a link to non-beta 42.xx drivers. I have 42.70, which is still in beta stage.
I'm not sure what your example with 3dfx is supposed to show... they had a great product in the Voodoo2. Then they had delays with their follow-up products and the products didn't perform as well as people were expecting.
Read the previous post (I'm referring to Ilwrath's post). 3DFX only covered one market segment, i.e. the video card market.
Ilwrath's claim was 1 market segment = success. That was not true for 3DFX.
3DFX didn't have alternative revenue sources outside the video card market.
Are you trying to imply that ATI is going to buy and gut nVidia? That's the only thing I see in a reference to 3dfx (RIP).
Failing to read the previous post will ultimately lead you to the wrong conclusions.
-
Remember when the AMIGA's on-board graphics kicked ALL PC graphics cards' ASSES? :( Commodore was pretty diversified too! I loved 3DFX; my 5500 is still running. (oops) Nvidia drivers always have issues; ATI is getting better. Why can't we get kick-ass AMIGA graphics again? The AMIGA was cheaper pound for pound than a PC.
Commodore made more than PC clones! Tyco is very diversified, Daewoo was very diversified, United tried to diversify (mismanagement and corporate greed can kill ANY company!)
-
ATI and the Radeons have done well. The hardware is capable and the drivers have markedly improved. And they also have successfully been playing the PR game. Now it's time for nVidia to turn this disappointing first impression (after tons of delays) around into something positive.
Then try it on some AGP 8X motherboards.
From Australian PC USER Nov 2002 edition, page 30.
Five Radeon 9700 cards fail on the following motherboards;
1. MSI 648 Max (Sis648 chipset, Pentium 4)
2. Soltek SL-85ERV (VIA p4X400, Pentium 4)
3. ASUS P4S8X(SiS648, Pentium 4)
4. Gigabyte GA-7vAXP(VIA KT400, Athlon XP)
5. VIA P4PB400(VIA P4X400 Pentium 4)
Now the 5 Radeon 9700s cards.
1. ATI Radeon 9700 Pro
2. Gigabyte Maya II Radeon
3. Hercules 3D Prophet Radeon Series 9700 Pro
4. HIS Excalibur Radeon 9700 Pro
5. PowerColor Evil Commando Radeon 9700 Pro
In the Australian PC User Dec 2002 edition, under the title "More Radeon 9700 Woes", page 36:
1. VIA P4PB400 (they managed to get it working with this mobo, but at lower performance compared to 4X mode).
2. "flakey" on Intel's new D845GEBV.
A typical ATI 9700 vendor response = "upgrade your BIOS". Mundane end users ("average punters") shouldn't be the ones handling these issues (i.e. BIOS flashing, etc.).
That should put things into perspective.
PS: Both SiS's Xabre and the NV18 work fine with the above-mentioned 8X AGP equipped motherboards.
-
Commodore was pretty diversified too
Not quite diversified, i.e. their add-ons relied on the success of their main product (i.e. the Amiga).
I don't think their x86 PC lines were competitive enough.
Refer to S3's example as a survivor in the x86 PC market, unlike 3DFX Inc.
I loved 3DFX my 5500 still running ... Nvidia drivers always have issues
I don't think the 3DFX V5500 was running on the nv4 family of drivers (refer to the nv4_disp.inf file).
And I quote from the "nv4_disp.inf" file:
NVidia = "NVIDIA"
NVidia.Nv4 = "NVIDIA RIVA TNT"
NVidia.Nv5 = "NVIDIA RIVA TNT2/TNT2 Pro"
NVidia.Nv0A = "NVIDIA Aladdin TNT2"
NVidia.NvVanta = "NVIDIA Vanta/Vanta LT"
NVidia.NvUltra = "NVIDIA RIVA TNT2 Ultra"
NVidia.Nv5M64 = "NVIDIA RIVA TNT2 Model 64/Model 64 Pro"
NVidia.Nv10 = "NVIDIA GeForce 256"
NVidia.Nv10DDR = "NVIDIA GeForce DDR"
NVidia.Nv10GL = "NVIDIA Quadro"
NVidia.Nv11 = "NVIDIA GeForce2 MX/MX 400"
NVidia.Nv11DDR = "NVIDIA GeForce2 MX 100/200"
NVidia.Nv11GL = "NVIDIA Quadro2 MXR/EX"
NVidia.NvCrush11 = "NVIDIA GeForce2 Integrated GPU"
NVidia.Nv15 = "NVIDIA GeForce2 GTS/GeForce2 Pro"
NVidia.Nv15DDR = "NVIDIA GeForce2 Ti"
NVidia.Nv15BR = "NVIDIA GeForce2 Ultra"
NVidia.Nv15GL = "NVIDIA Quadro2 Pro"
NVidia.Nv17.1 = "NVIDIA GeForce4 MX 460"
NVidia.Nv17.2 = "NVIDIA GeForce4 MX 440"
NVidia.Nv17.3 = "NVIDIA GeForce4 MX 420"
NVidia.Nv17.4 = "NVIDIA GeForce4 MX 440-SE"
NVidia.Nv17GL.1 = "NVIDIA Quadro4 500/550 XGL"
NVidia.Nv17GL.2 = "NVIDIA Quadro4 NVS"
NVidia.Nv18.2 = "NVIDIA GeForce4 MX 440 with AGP8X"
NVidia.Nv18.3 = "NVIDIA GeForce4 MX 440SE with AGP8X"
NVidia.Nv18.4 = "NVIDIA GeForce4 MX 420 with AGP8X"
NVidia.Nv18GL.1 = "NVIDIA Quadro4 580 XGL"
NVidia.Nv18GL.2 = "NVIDIA Quadro4 280 NVS"
NVidia.Nv18GL.3 = "NVIDIA Quadro4 380 XGL"
NVidia.Nv01F0 = "NVIDIA GeForce4 MX Integrated GPU"
NVidia.Nv20 = "NVIDIA GeForce3"
NVidia.Nv20.1 = "NVIDIA GeForce3 Ti 200"
NVidia.Nv20.2 = "NVIDIA GeForce3 Ti 500"
NVidia.Nv20DCC = "NVIDIA Quadro DCC"
NVidia.Nv25.1 = "NVIDIA GeForce4 Ti 4600"
NVidia.Nv25.2 = "NVIDIA GeForce4 Ti 4400"
NVidia.Nv25.3 = "NVIDIA NV25"
NVidia.Nv25.4 = "NVIDIA GeForce4 Ti 4200"
NVidia.Nv25GL.1 = "NVIDIA Quadro4 900 XGL"
NVidia.Nv25GL.2 = "NVIDIA Quadro4 750 XGL"
NVidia.Nv25GL.4 = "NVIDIA Quadro4 700 XGL"
NVidia.Nv28.1 = "NVIDIA GeForce4 Ti 4800"
NVidia.Nv28.2 = "NVIDIA GeForce4 Ti 4200 with AGP8X"
NVidia.Nv28.3 = "NVIDIA GeForce4 Ti 4800 SE"
NVidia.Nv28GL.1 = "NVIDIA Quadro4 980 XGL"
NVidia.Nv28GL.2 = "NVIDIA Quadro4 780 XGL"
nvWin2kDualview = "NVIDIA Dualview"
Can you find 3DFX in there?
-
Now if only ATI can sort out their drivers.
More people living in the past. ATI's current driver sets are rock solid and great performers. Very much on par with NVIDIA.
-
METAL wrote:
Now if only ATI can sort out their drivers.
More people living in the past. ATI's current driver sets are rock solid and great performers. Very much on par with NVIDIA.
I wish I could reply like this…
-
More people living in the past. ATI's current driver sets are rock solid and great performers. Very much on par with NVIDIA.
Display drivers, maybe. The All-In-Wonder bundled multimedia software is buggy as hell. I dare you to prove me wrong.
-
For a GeForce FX Q&A with nVidia Corp, refer to
http://forums.tweaktown.com/showthread.php?s=&threadid=7998
-
And with that damned leafblower roaring and your first PCI slot blocked,
:roflmao: :roflmao: :roflmao: :roflmao: :roflmao:
-
A typical ATI 9700 vendor response = "upgrade your BIOS". Mundane end users ("average punters") shouldn't be the ones handling these issues (i.e. BIOS flashing, etc.).
Yes, maybe typical, but there was a problem with the BIOS on a number of 8X-capable boards that had to be resolved with a BIOS update before using 8X cards. It is a very valid issue, and it should not be assumed that it is the Radeon 9700's fault. Actually, I didn't really attribute it as anyone's fault; it's just a glitch in moving to a new standard.
After seeing the Radeon 9700 Pro in action, I almost had to pick my jaw up off the floor. Very impressive!
And no leaf blower necessary! ;)
-
Reminds me of the Matrox Parhelia. That card had more memory bandwidth than anything else out at the time, but the card was still not so hot (but it sure was expensive!!!)
Actually, GeF FX < R9700 in terms of real bandwidth.
GeF FX = 16GB/s (128-bit bus x 500MHz mclk speed)
R9700 = 19GB/s (256-bit bus x 310MHz mclk speed)
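Those figures are just bus width times the double-pumped memory clock; a quick sanity check of the arithmetic (a minimal sketch, ignoring DDR protocol overheads):
[code]
def ddr_bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    # bytes per transfer * transfers per second (DDR moves data twice per clock)
    return (bus_width_bits / 8) * (mem_clock_mhz * 1e6 * 2) / 1e9

print(ddr_bandwidth_gb_s(128, 500))  # GeForce FX 5800 Ultra: ~16.0 GB/s
print(ddr_bandwidth_gb_s(256, 310))  # Radeon 9700 Pro: ~19.8 GB/s
[/code]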
Potential move for Nvidia (without another massive core change)
1. Move to 256 bit bus.
Potential move for ATI
(without another massive core change)
1. Move to higher clocked memory modules .
2. Move to higher clocked GPU.
-
Herewegoagain wrote:
Yes, maybe typical, but there was a problem with the bios on a number of 8x capable boards that had to be resolved with a bios update before using 8x cards.
It is very valid, and should not be assumed that it is the Radeon 9700's fault. Actually I didn't really attribute it as anyone's fault, just a glitch in moving to a new standard.
The cited motherboards were working with other 8X AGP video cards (i.e. non-Radeon 9700 cards). Thus the issue was on ATI's end.
This is a repeat of
PS; Both SIS's Xabre and NV18 works fine with the above mentioned 8X AGP equiped motherboards.
Both SIS's Xabre and NV18 are 8X AGP capable video cards.
-
The cited motherboards were working with other 8X AGP video cards (non-ATI cards). Thus the issue was on ATI's end.
Ok, but they were based on what? The Geforce4 series? If so, it's possible that potential problems were already worked around (just saying it's possible)... But the question is, would these same boards without the BIOS update have had similar problems using the GeforceFX?? It would be an interesting test to try.
-
Ok, but they were based on what?
I see you don't have a clue about SIS's Xabre and NV18. Both of these products are 8X AGP capable.
They work with the motherboards cited in my post.
The Geforce4 series?
Did you forget SIS's Xabre GPU card?
But the question is, would these same boards without the bios update
IF they are available.
have had similiar problems using the GeforceFX?? It would be an interesting test to try.
That is not the issue since those cited motherboards worked with other 8X AGP capable cards.
We can get to your theoretical scenario when that event actually happens (i.e. the actual product release).
PS: ATI may release fixes or issue a revised R9700 design, which may fix these problems.
Sigh... Why does the average punter have to bear the beta-testing phase?
-
ATI's drivers for gaming are OK... their drivers for application 3D are pathetic...and their drivers for their multimedia cards are pathetic...I won't buy one until they start to really make improvements.... I did just buy a Wildcat 4110 for 150 off eBay though :P ahh the joys of a market economy.
-
I'm sure that the GeForce FX will get a little bit faster with driver tweaks (even my ol' GeForce Ti 4400 got a 5%-10% speed increase with the latest Detonators).
But currently I still think that ATI are in front in terms of power + value, although I haven't seen how much a GeForce FX costs. Maybe when CPUs actually get faster, then we'll see what the true potential of each card is, because I feel that even a 3.06GHz P4 is the bottleneck of the ATI Radeon 9700 Pro and possibly even the GeForce FX.
I remember some time last year that a graphics card company, I think Matrox, cheated with their card drivers. They made their drivers lower the graphics detail in Quake III so that it looked like the card was fast. It was a marketing ploy, because Quake III is used as a benchmark by a lot of magazines that review cards.
-
I remember some time last year that a graphics card company, I think Matrox, cheated with their card drivers. They made their drivers lower their graphics detail in Quake III so that it looked like the card was fast. Was a marketing ploy, because Quake III is used as a benchmark for alot of Magazines that review cards.
ummm that was ATI with their first radeons ;D
But now with the 9700 and its new core they have done a good job and nVidia will have to work hard to beat them :-)
Have you listened to both cards?
GeForce FX Booting (http://www.tomshardware.com/graphic/20030127/images/geforcefx-boot-hq.mp3)
GeForce FX running 3dMark (http://www.tomshardware.com/graphic/20030127/images/geforcefx-3dmark-hq.mp3)
and now the 9700...
Radeon 9700 Booting (http://www.tomshardware.com/graphic/20030127/images/r9700-boot-hq.mp3)
ummm now i've found something better in my Voodoo3, it doesn't have a fan so it's completely quiet. :D
-
mips_proc wrote:
and with Cg its features will be easily implimented... its raw polygon push is much more appealing from what I've read... and I'm sure its drivers will be much better.
With DX9's HLSL, Cg will become pretty obsolete, more so since its support for backends is limited - only two cards I know of implement ARB_fragment_program at all.
Between OGL 2.0 HLSL and the DX9 one, I don't expect Cg to be around for long, more so since Microsoft will really push for their version.
If you judge a GPU by its FPS and not by its features....then your missing the point entirely... we're enter a time where FPS matter less and features matter more... and stability/drivers/etc matter more... the race for FPS was over at the Gforce4 ...now its time to see some features and higher poly counts at the same FPS...
That is, unless you recognize that the featureset of the FX is only unique for now. The others will follow. And, you don't care about that feature set unless it is really used by games, which is absolutely not the case for now.
To me personally... ATI will never be in the lead until their able to get decent windows drivers...
I am sure glad that right now there isn't really any leader at all. It keeps changing back and forth, and new players are also appearing. Right now, quality graphics cards cover a much broader range of price segments, starting from cheap cards like SiS Xabre and Radeon 9000 up to the expensive beasts like the 9700 PRO and GeForceFX. However, this means that games will try to appeal to all of them, and that more or less settles the feature set to the smallest common set.
Fortunately, HLSL's (regardless of which language) are starting to really kick off. I am curious to see what a P-10 can do.
-
I see you don't have a clue about SIS's Xabre and NV18. Both of these products are 8X AGP capable.
These were cross posts without a refresh, but I'm sure you knew this.
That is not the issue since those cited motherboards worked with other 8X AGP capable cards.
Ok, let's take a look and see who is at fault, ATI or the reviewers of these cards (note the Xabre update in the Gigabyte BIOS). These are the BIOS versions that were available at the time you claim these cards were reviewed.
[color=CC0000]MSI 648 MAX
BIOS Type AMI® BIOS File Size 491KB
Version 1.1 Update date 2002-9-27
Update Description -Fixed system unstable when using ATI 9700
-Fixed error message sometimes appear after S3 resume
-Fixed S3 resume failed when USB legacy set as enabled
Special note ,
Download 6585v11.exe [/color]
** Soltek I could only locate the SL-85ERV2 on their site, and they list no bios updates that I could see.
[color=CC0000]Asus P4S8X BIOS 1003A
1003A 2002/08/27
Serial ATA ROM updated 376.bin(016).
Patch ATI AGP 8X.
Support Intel Pentium 4 2.8Ghz CPU.
[/color]
[color=0000CC]Gigabyte GA-7vAXP last 3 bios updates
7vaxp_f7.zip F7
(Oct. 25, 2002) 1. Disable CPU fast command when FSB166Mhz CPU plugged.
2. Modify top performance setting frequency.
3. Fixed SIS Xabre AGP card STR fail on win98\ME.
7vaxp_f6.zip F6
(Oct. 07, 2002) 1. Fixed system hang under win98 when Creative PCI sound card exist
2. Improve CPU to AGP performance for ATI AGP card.
7vaxp_f5.zip F5
(Sep. 05, 2002) 1. Mass production release.
[/color]
[color=CC0000] Via P4PB400
Version (Date) 1.13 (01/03/2003)
File Name (Size) P4PB4113.bin (257KB)
Update Description Support high capacity HDD
Version (Date) v1.10 (10/16/2002)
File Name (Size) P4PB4110.bin (257KB)
Update Description add LAN Boot and Enhance other functions
Version (Date) 1.05 (08/29/2002 (08/29/2002)
File Name (Size) P4PB4105.bin (257KB)
Update Description Enhance Some bootable device
Version (Date) 1.00 (07/22/2002)
File Name (Size) P4PB4100.bin (257KB)
Update Description First Release [/color]
So 3 of the 5 boards you listed already had BIOS updates several months before this site tried to review the card. Soltek doesn't list ANY type of BIOS updates, and Via is extremely vague about what they fixed in the update. Anyone with half a clue about PC products knows to check for BIOS updates and chipset updates when an entirely new product like this hits before they try to use it.
-
Hammer wrote:
As with the NV25 and R300, the driver support has to mature. Refer to the serious OpenGL driver battles for an indication of the NV30's potential. At the moment, the official release drivers don't even support the NV30+ family (only leaked betas do).
I know, but that's not the point. This is the best Nvidia can come up with. The processor has more transistors than the R300, yet it's slower clock for clock than the R300. I suspect this will be the same for DX9+ games as well, driver optimisations or not.
Anyway, nVidia has branched into the chipset and integrated audio markets, btw…
Likewise with the "Pentium 4" (a core change), it takes time to mature.
PS: I recall that nVidia hasn't made the shift to a pure 256-bit bus, unlike ATI, Matrox and 3DLabs. The GeForce FX is one crippled GPU, even with 1GHz DDR technology (due to DDR overheads).
I don't think that the 256-bit bus on the R300 has as much latency as the DDR-II bus on the NV30, but the difference would probably only make less than a 5% performance difference.
-
Ilwrath wrote:
I'd hardly call the GeForceFX pre-release boards slow... It does trample the current fastest graphics card on most tests, and comes very close on the few it loses...
How does a 30% increase represent a trampling when the GFX is ahead, but when the ATI card is 30% ahead, it's 'very close'?
-
mips_proc wrote:
The GeforceFX may be slower in terms of FPS on 'current' games... but its feature packed...and with Cg its features will be easily implimented... its raw polygon push is much more appealing from what I've read... and I'm sure its drivers will be much better.
Its lock will go up 300mhz from what I've read....
The GPU ran at 500MHz in the review.
This FPS talk represents an about face! In a previous thread about upgrading, you were all for 130+ FPS! Make your mind up. (or did you agree with my points there?)
-
Hammer wrote:
Ilwrath wrote:
Personally, I'm a bit worried that through all their other 'market segments' such as mediocre chipsets
NForce 2 is not a mediocre chipset relative to VIA "crap sets". It maybe true with the first nForce 1 release but not true on the second release. This pattern is similar to the original release of Geforce 256. It took the second release(e.g. Geforce 2 series) to make this a real success(relative to 3DFX).
Deffo agree with that - those Nforce2 chipsets are the tits! Most of the computers in my house use Via "crapsets", and the Nforce2 is worth about another 200+ rating on them.
-
(note the Xaber update in the Gigabyte bios).
That has nothing to do with AGP 8X. "STR" refers to the "Suspend-to-RAM" feature.
It doesn’t automatically destabilize the card IF one doesn’t use this feature. One could use "Suspend-to-Disk" instead.
Asus P4S8X BIOS 1003A
ASUS's updates may carry two different update dates, for example.
Via the ASUS auto-update feature:
1001E has update value of "01/13/2003" (For A7N8X)
But from the ASUS web site;
1001E has update value of "2002/12/23"(For A7N8X)
1001G has update value of "2002/12/24 "(For A7N8X)
BIOS 1002 has an update value of 2003/01/30, but it doesn't exist (broken link perhaps) on the FTP server at this time (AUS 31/01/2003).
2. Improve CPU to AGP performance for ATI AGP card.
A performance tweak. This is a common practice.
MSI 648 MAX
Updates on the website and FTP site (or via MSI's auto-update feature) may not be in sync, btw. Note that I do own a late-model MSI VIA KT-class board.
-
I know, but thats not the point. This is the best Nvidia can come up with
I recall, the official launch date was somewhere in late February.
The processor has more transistors than the R300
The NV30 has support for a 128-bit floating-point color feature, which the R300 doesn't support. Such a feature may require more transistors.
I dont think that the 256 bit bus on the R300 has as much latency as the DDR-II bus on the NV30
Well, NV's DDR-II solution only delivers 16GB/s, while ATI's 256-bit solution delivers 19GB/s.
Go figure that out....
-
Deffo agree with that - those Nforce2 chipsets are the tits! Most of the computers in my house use Via "crapsets", and the Nforce2 is worth about another 200+ rating on them.
Where?
-
Hammer wrote:
I know, but thats not the point. This is the best Nvidia can come up with
I recall, the official launch date was somewhere in late February.
20-odd days ain't gonna get a huge performance increase in all likelihood, is it? (And yes, I know it's been done before.)
The processor has more transistors than the R300
The NV30 has support for a 128-bit floating-point color feature, which the R300 doesn't support. Such a feature may require more transistors.
The R300 does support 128-bit FP colour, as FP colour is a prerequisite for DX9. Check Tom's Hardware and you'll see the 128-bit FP support in the R300.
I dont think that the 256 bit bus on the R300 has as much latency as the DDR-II bus on the NV30
Well, NV’s DDR-II solution only delivers 16Gb/s, while ATI’s 256bit solution delivers 19Gb/s.
Go figure that out....
I was pointing out that the Nvidia solution was SLOWER. i.e. less throughput and more latency
Go figure THAT out........
Where
I was agreeing with you that Nforce2 is good - vastly better than VIA's attempts at a chipset mainly due to the fact that Nvidia can design a memory interface.
No need to assume that because I think that the Geforce FX is sh*te that I think everything Nvidia is crap.
I have a GeForce 2 Ultra in one of my computers and a Ti in another. I don't give a rat's ass who makes my hardware as long as it's decent.
-
Deffo agree with that
Is that relevant?
- those Nforce2 chipsets are the tits!
Is that relevant?
Most of the computers in my house use Via "crapsets",
Is that relevant?
I do own an MSI-built VIA KT-class chipset board and have access to ASUS-built VIA KT-class chipset test machines.
and the Nforce2 is worth about another 200+ rating on them.
What do you mean by "Nforce2 is worth about another 200+ rating on them"?
-
Check Toms hardware and you'll see the 128 bit FP support in the R300
Sorry, I should have been referring to "Max pixel shader precision".
Refer to
http://tech-report.com/etc/2002q3/nextgen-gpus/index.x?pg=5
I was pointing out that the Nvidia solution was SLOWER. i.e. less throughput and more latency
Go figure THAT out........
Refer to my "Posted : 2003/1/29 11:45" post for similar statements.
-
20 odd days aint goona get a huge performance increase in all likelyhood, is it? (and yes I know its been done before)
The v42.70 drivers were still in beta form...
-
Hammer wrote:
Is that relevant? (several times)
I do own MSI built VIA KT class chipset and have access to ASUS built to VIA KT class chipset test machines.
Is that relevant? Is that relevant? Is that relevant? Is that relevant?
I now see how all of these arguments erupt on A.org. YOU'RE NOT ALLOWED TO AGREE WITH ANYONE! Or its not relevant or a waste of time.
and the Nforce2 is worth about another 200+ rating on them.
What do you mean by "Nforce2 is worth about another 200+ rating on them"?
OK, I'll reword it. The nForce2 mobos appear to be up to 10% faster in many tests than the VIA-based ones. Therefore, on an XP 2000+, they are worth another 10%, i.e. it then performs like an XP 2200+, or has an extra 200+.
I checked your link - personally I don't care if the R300 ONLY has 96-bit pixel shader precision; that still allows 7.9x10^28 different values. I think that's plenty.
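For what it's worth, that 7.9x10^28 figure is simply 2 to the power of 96; a quick check (treating the 96 bits as one combined value, which is a simplification of the R300's 4 x FP24 channels):
[code]
# 96-bit pixel shader precision on the R300 = 4 colour channels x 24-bit floats.
distinct_values = 2 ** 96
print(f"{distinct_values:.1e}")  # ~7.9e+28, matching the figure above
[/code]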
The point is the GeforceFX is a crock of sh*te.
-
Not sure if this was already posted in this thread, but anyway. Here's an Anandtech view of the Radeon vs GeForce debate:
http://www.anandtech.com/video/showdoc.html?i=1779&p=16 (http://www.anandtech.com/video/showdoc.html?i=1779&p=16)
It would be nice to see a 0.13 micron version of the Radeon 9700 PRO running at 500MHz or faster.... :)
-
OT: Did I log on to [color=990000]Amiga[/color].org?
I will never trust ATI's drivers completely - it makes life difficult running a day-to-day PeeCee...
-
@Hammer
Ilwrath's claim was 1 market segment = success. This case was not true for 3DFX.
No, that wasn't at all my claim. My claim was don't neglect the market you're good in for a chance to try your hand in a market you're not so good in.
Pre-expansion nVidia prided itself on making shipping dates. A Huang (nVidia's CEO) quote -- straight from Wired magazine:
Fact is, Huang knows there's little room for even one mistake in his business, much less the same one twice. It's the nature of the graphics-chip industry: A company rises to leadership only to miss a delivery window and rolls over for an upstart with a better technology. Cirrus Logic, 3dfx Interactive, Tseng Labs, S3, Rendition, Chips and Technologies - they once were all leaders; now they're all gone.
Nvidia has sidestepped the boom-and-bust cycle by hewing to a simple philosophy: Technology matters, but the production calendar rules. "The first breath of success for Nvidia came when we recognized that the PC market has a pulse that's regular and predictable," says chief scientist David Kirk. PC manufacturers ship machines to resellers twice a year - in April and August. That means Nvidia has to have a new chip ready each February and June.
(For the rest of the very good interview, go to
Wired's nVidia interview - July 2002 (http://www.wired.com/wired/archive/10.07/Nvidia.html?pg=1&topic=&topic_set=) - It's worth a read)
Now, for a review of the facts of what's happened:
1) they're blowing ship dates.
They're late on the Feb ship date for the NV30 (Feb 03), and neglected to put anything interesting out for the June 02 ship date.
2) While the NV30 is powerful, it's only a tweak above the 9700 Pro, rather than the generation jump that was originally marketed...
3) Their financial security is highly debatable. While they have diversified, it's cost them money. Money they may not have, or that they SHOULD have applied to shipping the NV30 faster / better.
-
No, that wasn't at all my claim. My claim was don't neglect the market you're good in for a chance to try your hand in a market you're not so good in.
Are you implying the "nForce 2" chipset is not competitive with the VIA KT-class chipsets?
Now, for a review of the facts of what's happened:
1) they're blowing ship dates.
Their chip fab contractor has blown the dates.
From recorded history, fab companies have missed their schedules during the shift to the .13 micron process.
2) While the NV30 is powerful, it's more of only a tweak above the 9700Pro, rather than the generation jump that was originally marketed...
Did they (i.e. nVidia)?
I recall their official comparison was with the GeForce4 Ti 4600. The majority of the so-called hype was magnified by the press.
-
YOU'RE NOT ALLOWED TO AGREE WITH ANYONE!
Your words.
OK i'll reword it. The Nforce2 Mobo's appear to be up to 10% faster in many tests than the VIA based ones. Therefore, on a 2000XP, they are worth another 10% i.e. It then performs like a 2200XP or has an extra 200+
A little bit of effort doesn’t bug anyone.
I checked your link - personally I dont care if the R300 ONLY has 96 bit pixel shader precision, that still allows 7.9x10^28 different positions. I think thats plenty.
The point is the GeforceFX is a crock of sh*te.
Very centric response.
-
It would be nice to see a 0.13 micron version of Radeon9700PRO running at 500Mhz or faster.... :)
Both sides have their own advancement strategy.
Refer to
NV35 plans for post NV30 era (http://www.xbitlabs.com/news/story.html?id=1040641432)
-
Hammer wrote:
YOU'RE NOT ALLOWED TO AGREE WITH ANYONE!
Your words.
You are the one that said pointless when I agreed
A little bit of effort doesn’t bug anyone.
That's why I explained what I meant; otherwise I wouldn't have bothered, would I?
I checked your link - personally I dont care if the R300 ONLY has 96 bit pixel shader precision, that still allows 7.9x10^28 different positions. I think thats plenty.
The point is the GeforceFX is a crock of sh*te.
Very centric response.
So you don't disagree, then?
-
@hammer -
Are you implying “nForce 2” chipset is not competitive with VIA KT class chipset?
No... I'm implying that they gave up the clear-cut lead in the fairly high-margin GPU market to create that chipset. Basically the nForce is a good chipset, but it was a poor business decision, and the fact that it's horribly mis-marketed surely can't help matters.
First off, chipsets don't have the profits of high-end graphics processors. Especially not chipsets that only work with AMD processors, which are currently suffering because very few OEM companies are producing computers supporting them. In fact, about the only way to get a computer with an AMD processor is to purchase the poorest of the HP offerings -- which wouldn't use an nVidia chipset, or go white-box or self-built. White box and self-built machines don't sell in the same quantities (as say Dell) and have even LESS of a profit margin.
So, from that we can assume that nVidia is targetting the white-box and self-built market. That market is mostly budget workstations and gamers. The chipset still doesn't make sense, though. The high-end version with firewire support has a semi-poor built-in sound card that no gamer would want. All versions sport the on-board GeForce4MX, which isn't a good fit for gamers or graphics workstations, either. (Gamers would rather have a GeForce4Ti, workstation users would rather have a Quadro, so they're certified with ACAD, Catia, etc.)
Exactly what market is nVidia trying to hit?? :-?
From recorded history, fab companies has missed their time schedule during their shift to .13 process.
Hmm... Perhaps, but then again, if they were counting on this as the boost, then it wasn't the best of planning, either...
If they weren't as busy with other less profitable markets, they could have had a contingency plan for problems with the new manufacturing process, and been able to push the .13 process back to the next shipping round, while still delivering a quality update in the meantime.
I recall, their official comparison was with the GeForce 4-4600 TI. Majority of the so-called hype was magnified by the press.
Very good point. I don't think nVidia ever did make that comparison, but it was what was expected of the card. Releasing a graphics card that falls far short of expectations is never a good thing, even if those expectations are higher than you meant to set them.
-
No... I'm implying that they gave up the clear-cut lead in the fairly high-margin GPU market to create that chipset. Basically the nForce is a good chipset, but it was a poor business decision, and the fact that it's horribly mis-marketed surely can't help matters.
Mis-marketed? How?
Exactly what market is nVidia trying to hit??
As with Intel's 845 (and E7) series, it should target all price points.
First off, chipsets don't have the profits of high-end graphics processors. Especially not chipsets that only work with AMD processors, which are currently suffering because very few OEM companies are producing computers supporting them.
Based on what?
In fact, about the only way to get a computer with an AMD processor is to purchase the poorest of the HP offerings -- which wouldn't use an nVidia chipset, or go white-box or self-built. White box and self-built machines don't sell in the same quantities (as say Dell) and have even LESS of a profit margin.
Refer to http://www.simhq.com/simhq3/hardware/previews/nForce%202/
-
All versions sport the on-board GeForce4MX, which isn't a good fit for gamers or graphics workstations, either
Not true... not all nForce2 boards have onboard video... most don't... the nForce2 supports AGP 8X... I got an EPoX 8RDA+ without onboard.
I agree entirely about Nvidia though, they're going t*ts up in a big way from the look of it... their chipsets are good but they're AMD-based and no OEMs are going to use them... OEM market penetration of AMD is very lacking right now...all because they didn't cap their cores....
Then you have the issue of their graphics cards sucking in general... their Quadros are about the only thing worth buying right now...that's because their price/performance ratio is hitting a sweet spot right now that ATI can't touch...but probably soon will....
The "LOW END" which many people talk about...the one that makes everyone all their money, is seeing FIERCE competition right now...
Trident is coming out with a new low-end GPU, you've got the ATI low-end offerings...the Nvidia low-end offerings... you've got the Xabre 400 hitting the low end... you still have Intel putting its integrated GPU on boards.... you've got Matrox hitting that somewhat .... it's a hard market right now on the low end... Nvidia could lose that market in a matter of months...if SiS, Trident, or someone else (VIA?) comes up with a much cheaper but competitive low-end solution that doesn't cost much or put out much heat/etc...
I think Nvidia is in sad shape right now... in this market 8 months makes or breaks a company...
The GeForceFX reminds me of the Voodoo5 in more ways than one...and that's NOT a good thing... by and large the Voodoo5 is what sunk 3DFX and it'll sink nVidia too if they can't pull out of this slump.... losing VisionTek is a big deal to them as well.
-
You are the one that said pointless when I agreed
I didn't say it was "pointless". Please find the word "pointless" in my post.
So you dont disagree, then?
Are there any reasons to agree in the first place? The product is not even released in the market place.
I checked your link - personally I dont care if the R300 ONLY has 96 bit pixel shader precision, that still allows 7.9x10^28 different positions. I think thats plenty.
The "first cause" post was in regards to transistor count and the potential reason why relatively large number of transistors was included with GeF FX.
You can’t get something for nothing in regards to hardware features vs transistor count.
The point is the GeforceFX is a crock of sh*te.
You haven't presented any substantial evidence for why that view is valid, except for writing fan fiction.
Refer to http://www.nvnews.net/articles/geforce_fx_commentary/index.shtml for past driver improvements (within the GeF 4 Ti line).
Refer to http://www.penstarsys.com/editor/Today/nvidia3/nvda_tdy_4.htm for more information regarding GeF FX, nForce 2, John Carmack and 'etc'.
-
The high-end version with firewire support
So does the similarly priced ~$150 USD Audigy 2 card...
has a semi-poor built-in sound card that no gamer would want.
It's a similar chip to the one that was included in the Xbox, btw… The "no gamer wants" assertion is completely wrong.
Have you tried running a 32+ channel MIDI file on an emu10k series card yet (HW accelerated, not the bundled SW option)?
nVidia's Sound Storm has access to bandwidth beyond the PCI limit (all CL's DSP cards are limited by that standard). There was also talk about releasing nVidia Sound Storm for PCI-Express enabled slots (PCI-Express enabled slots have the necessary bandwidth to support 200+ HW accelerated channels).
It's good enough to match the Audigy 1 and SBLive 5.1 DE level markets. Sound Storm also includes a Creative Labs-style EAX user-configurable control panel and Dolby Digital encoding (for multi-channel DirectSound3D titles).
The Dolby Digital encoding feature has yet to be included in VIA's and CL's audio add-on card solutions.
Sound Storm is a 24-bit DSP/APU chip (just like CL's emu10k series). But this is dependent on the motherboard vendor's installed CODEC. The CODEC stage can be bypassed via the direct digital S/PDIF out ports.
nVidia's "Sound Storm" is a Microsoft reference for hardware-implemented DirectX 8.x-class audio.
For Sound Storm's Dolby Digital encoding feature refer to
http://www.overclockers.com.au/article.php?id=134416
All versions sport the on-board GeForce4MX,
Completely wrong. I recall the ASUS A7N8X Deluxe (SPP/MCP-T) doesn't include the MX 4x0-level IGP…
which isn't a good fit for gamers or graphics workstations, either
It runs Quake III and related titles well enough compared to the VIA/Savage 2000 solution.
-
the GeForceFX reminds me of the Voodoo5 in more ways then one...and thats NOT a good thing... by in large the Voodoo5 is what sunk 3DFX and it'll sink nvidia to if they cant pull out of this slump.... looseing visiontek is a big-deal to them aswell.
At least nVidia hasn't made one of those massive 4-chips-on-one-AGP-card designs.
The lesser/cheaper GeF FX 5800 sibling (the 9700 non-Pro market segment) doesn't require such cooling add-ons.
Note that ATI has survived being second to nVidia in the past. 3DFX's collapse was a combination of legal battles with nVidia, non-competitive value/middle market segments, and reduced distribution access (i.e. they switched to a single-vendor sourcing model, with 3DFX as the only vendor).
-
the GeForceFX reminds me of the Voodoo5 in more ways then one...and thats NOT a good thing... by in large the Voodoo5 is what sunk 3DFX and it'll sink nvidia to if they cant pull out of this slump.... looseing visiontek is a big-deal to them aswell.
Refer to http://www.tech-report.com/sendto_friend.x/4679/
Gainward to offer quieter GeForce FX?
-
Hammer, argue all you want... the GFFX is relatively slow and relatively noisy... it costs a lot and delivers less than expected.
-
the GFFX is relatively slow
Not true for all of the cases.
and relatively noisey...
Did you miss my previous post?
it costs alot and delivers less then expected.
Are you claiming that one could actually buy the product?
A beta version of "Detonator 42.86 for Windows 2000/XP" should be floating around somewhere on the internet (e.g. Guru3D.com)....
As the beta releases (leaks) indicate, continual product development is currently taking place.
-
Hammer
newegg (http://www.newegg.com)
taking preorders for it for 400 right now...
It is noisy... despite what you can bring up about one company toying with an exotic cooler, the ones that ship in bulk will be noisy...it doesn't perform as expected... nobody thought it would run this slow for how much cooling it's taking.
It's basically an overclocked GPU... it was originally intended to run at a slower clock...which the lower-end bracket of that card will run at.
I'm not saying nVidia is toast...but they better work very hard on their next card...or they're going down...
-
Notice the word “Preorder”. The product is not currently in stock (ETA 2/17/2003) .
-
I'm not saying nvidia is toast...but they better work very hard on their next card...or their going down...
IF history can be used as a guide, it took the NV11 and NV15 to fix most of the problems with the NV10. All three are DirectX 7-class GPUs.
The typical nVidia initial product introduction problems were also mirrored with nForce 1. They were mostly fixed with nForce 2. Both have a DirectX 7-class IGP and the Sound Storm APU.
I don't think they would change this pattern.
For another GeForce FX vs ATI R300 comparison, refer to this (http://www.a1-electronics.co.uk/Graphics_Cards/GeForceFX/GeForceFX_5800Ultra.shtml)
I wonder why nVidia didn't go for a Leadtek-style solid copper solution. Leadtek's cooling solution is to add metal around 80 percent of the card's surface (both sides).
John Carmack's statements regarding DOOM3, R300 and NV30. (http://www.clanbase.com/finger_cache.php?plan=johnc,John+Carmack)
-
(SNIP) has a semi-poor built-in sound card that no gamer would want
Poor sound card? Note that Dolby recommends nForce2 for playing games with Dolby Digital content.
Refer to http://www.dolby.com/games/pc.faq.html
-
Hammer wrote:
You are the one that said pointless when I agreed
I didn't say it was "pointless". Please find the word "pointless" in my post.
Fine, be a nitpicking prick. You said "is that relevant". If it's not relevant then it's pointless. The message was the same.
So you dont disagree, then?
Are there any reasons to agree in the first place? The product is not even released in the market place.
So WHY ARE YOU DEFENDING IT THEN?
You seem to think that I can't state that it seems slow and loud, because it's not released yet. Funny, the Radeon 9700 Pro reference board was identical to most of the ones for sale now, and very similar in performance to the review samples.
I checked your link - personally I dont care if the R300 ONLY has 96 bit pixel shader precision, that still allows 7.9x10^28 different positions. I think thats plenty.
The "first cause" post was in regards to transistor count and the potential reason why relatively large number of transistors was included with GeF FX.
So whoopee doo, the GeForce FX needs more transistors to make its "feature" better on paper than the competition, even though it has no real-world use. It still has more, yet is not as good as the competition.
You can’t get something for nothing in regards to hardware features vs transistor count.
The point is the GeforceFX is a crock of sh*te.
You haven’t made any substantial evidence why that view is valid. Except for writing fan fiction.
Refer to http://www.nvnews.net/articles/geforce_fx_commentary/index.shtml for past driver improvements(within the GeF 4 TI line).
Refer to http://www.penstarsys.com/editor/Today/nvidia3/nvda_tdy_4.htm for more information regarding GeF FX, nForce 2, John Carmack and 'etc'.
Fine. Refer to Anandtech. There's your evidence. I've already read what John Carmack has to say, and it's not a glowing endorsement.
Now I'm sure you'll nitpick through this entire post, as you are clearly an Nvidia fanboy, and can't handle that they, just like everyone else, can make crap hardware every now and again.
Nvidia fanboys are now so common that there's a special word for them:-
NVIDIOT!
Ever notice how you're the only one defending the GeforceFX here. There is a reason for that, you know.
-
Fine be a nitpicking prick. You said "is that relevant" If its not relevant then its pointless. The message was the same.
Concession accepted!
So WHY ARE YOU DEFENDING IT THEN?
Subjective assertion. Where did I say that?
Funny the Radeon 9700Pro reference board was identical to most of the ones for sale now, and very similar in performance to the review samples.
Not with the newer driver releases and BIOS fixes...
It still has more, yet is not as good as the competition.
You recall that both cards have exceeded the DirectX 9 standard.
yet is not as good as the competition.
Are you asserting that statement for all of the cases? Precision is better than generalization.
Refer to Anandtech, There's your evidence.
Any statistical data must have n>30, btw, and driver revision can play a big part. Anandtech currently uses the v42.63 beta driver. The current leaked driver is at v42.86.
Other previews can be gathered from
http://www.extremetech.com/article2/0,3973,846380,00.asp
http://www.maximumpc.com/features/feature_2003-01-03.html
http://computers.cnet.com/hardware/0-1107-8-20824307-1.html?tag=txt
http://www.neoseeker.com/Articles/Hardware/Previews/geforcefx/
I've already read what John Carmack has to say, and it's not a glowing endorsement.
False. He did state that both have their weaknesses, i.e. depending on the code path.
Now I'm sure you'll nitpick through this entire post, as you are clearly an Nvidia fanboy,
Subjective assertion.
and can't handle that they, just like everyone else, can make crap hardware every now and again.
I did recall that nForce 1 and NV10 were relatively flawed products. I wonder who is the fanboy now?
Can’t you read properly?
Nvidia fanboys are now so common that there's a special word for them now:-
NVIDIOT!
Getting personal doesn’t get you anywhere.
Grow up little boy!!!
Ever notice how you're the only one defending the GeforceFX here. There is a reason for that, you know.
Irrelevant to the topic. Try again, Minion.
-
oki.. I shoot...
Even AGA is better than any of these PC gfx boards for sure. The text doesn't flow right, and even when playing emulators like MAME on PC or any other 2D game... it's damn slow compared to what an AGA Amiga can do.
On the 3D side it's another story, but 2D... even this monster won't beat AGA in 2D. Not even the Voodoo3...
Regards,
AmiDelf
-
AmiDelf wrote:
oki.. I shoot...
Even AGA is better than any of these PC gfx boards for sure. The text doesn't flow right, and even when playing emulators like MAME on PC or any other 2D game... it's damn slow compared to what an AGA Amiga can do.
On the 3D side it's another story, but 2D... even this monster won't beat AGA in 2D. Not even the Voodoo3...
Regards,
AmiDelf
nah! i think yer wrong there!
have you ever used a graphics card in an amiga?
-
AmiDelf wrote:
oki.. I shoot...
Even AGA is better than any of these PC gfx boards for sure. The text doesn't flow right
Could you be more specific?
-
Hammer wrote:
So WHY ARE YOU DEFENDING IT THEN?
Subjective assertion. Where did I say that?
As expected, you nitpick. To everyone reading this thread it is obvious that you are defending the NV30.
What you are doing here is like trying to get off for a crime on a technicality, not proving your innocence. Stop nitpicking and get to the point.
Funny, the Radeon 9700 Pro reference board was identical to most of the ones for sale now, and very similar in performance to the review samples.
Not with the newer driver releases and BIOS fixes...
That affected performance how much?
Exactly.
Mainly bug fixes (of which Nvidia's drivers also have some).
It still has more, yet is not as good as the competition.
Recall that both cards have exceeded the DirectX 9 standard.
So they both exceed the DX9 standard. Big deal.
yet is not as good as the competition.
Are you asserting that statement for all of the cases? Precision is better than generalization.
Obviously not, as you may have noticed that I mentioned in my first post that the NV30 was overall faster, just not by as much as it should have been, which was the whole point of this thread.
Precision is better than generalisation, but not when the improvement in precision is pointless. It's just marketing bullsh*t to get the fanboys all excited about something they don't even need to use to its full potential.
Refer to Anandtech. There's your evidence.
Any statistical data should have n>30 btw, and driver revision can play a big part. Anandtech currently uses the v42.63 beta driver. The current leaked driver is at v42.86.
Surprise, surprise, you're nitpicking again, now about statistical data, so here goes:
that n>30 is bullsh*t. Statistical data can be gained from n=1; it's just that as you take more samples it gets more accurate (rough sketch below).
Secondly, do you have any reviews based on these "leaked" drivers? So that statement means something between nothing and f-all.
Thirdly, I am not going to trawl the internet to find 30+ reviews of the NV30, as I have better things to do with my time. By all means do it and "prove" me wrong, but it'll just show what a sad life you lead.
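(A rough sketch of that "more samples just means more accuracy" point, using made-up benchmark numbers purely for illustration - the FPS value and noise level below are assumptions, not measurements from any review:)

    import random
    import statistics

    random.seed(1)
    TRUE_FPS = 100.0   # hypothetical "real" benchmark score
    NOISE = 5.0        # hypothetical run-to-run variation

    def average_of_runs(n):
        # Average n noisy benchmark runs.
        return statistics.mean(random.gauss(TRUE_FPS, NOISE) for _ in range(n))

    for n in (1, 5, 30):
        print(n, round(average_of_runs(n), 1))

    # n=1 already gives an estimate; the error of the average only shrinks
    # roughly as 1/sqrt(n), so n>30 is a rule of thumb, not a hard requirement.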
Other previews can be gathered from:
http://www.extremetech.com/article2/0,3973,846380,00.asp
http://www.maximumpc.com/features/feature_2003-01-03.html
http://computers.cnet.com/hardware/0-1107-8-20824307-1.html?tag=txt
http://www.neoseeker.com/Articles/Hardware/Previews/geforcefx/
I've already read what John Carmack has to say, and it's not a glowing endorsement.
False. He did state that both have their weaknesses, i.e. depending on the code path.
No, it's TRUE. I said it's not exactly a glowing endorsement,
and it isn't. So we go from disagreeing about the relative performance of the NV30 to you just trying to disagree with everything I say. Maybe you should grow up, eh? After all, you're the one trawling the internet continuously for reviews to prove everyone else wrong!
Now I'm sure you'll nitpick through this entire post, as you are clearly an Nvidia fanboy,
Subjective assertion.
Maybe, but you have just nitpicked through that entire post! Call it what you like, it's true.
and can't handle that they, just like everyone else, can make crap hardware every now and again.
I did recall that nForce 1 and NV10 were relatively flawed products. I wonder who is the fanboy now?
Bet you didn't at launch!
Or if you did, maybe you weren't such a fanboy then.
Can’t you read properly?
See my comments on the John Carmack statement. More proof of your hypocrisy.
Nvidia fanboys are now so common that there's a special word for them now:-
NVIDIOT!
Getting personal doesn’t get you anywhere.
Grow up little boy!!!
Nitpicking doesn't get you anywhere. Grow up, pedantic prick!
I find it amusing that you call me a little boy for getting "personal", but that sentence in itself just proves you are a hypocrite! I think hypocrites are idiots, and you like Nvidia and are a hypocrite, so Nvidiot suits you!
Ever notice how you're the only one defending the GeForceFX here? There is a reason for that, you know.
Irrelevant to the topic. Try again, Minion.
Relevant. You are the only one defending it. If it was so good, then don't you think there would be more people defending it? Once again you are nitpicking.
-
mips_proc wrote:
It's basically an overclocked GPU... it was originally intended to run at a slower clock... which the lower-end bracket of that card will run at.
I'm not saying nvidia is toast... but they better work very hard on their next card... or they're going down...
Agree on this.. It's not such a great chip. And apparently there are discussions of cancelling it altogether.. See
this (http://www.theinquirer.net/?article=7615)
-
As expected, you nitpick. To everyone reading this thread it is obvious that you are defending the NV30.
More irrelevant character-based rhetoric.
It's not exactly a glowing endorsement
Who said it was "not exactly a glowing endorsement"?
You are putting words in my mouth where they don't exist. You are assuming too much.
Did you miss the statement regarding the limits being reached for the R9700? The speed is heavily dependent on the code path.
What you are doing here is like trying to get off for a crime on a technicality, not proving your innocence. Stop nitpicking and get to the point.
Irrelevant to the issue. It is you who diverted this thread toward a character-based flame war.
That affected performance how much?
Exactly.
Mainly bug fixes (SNIP)
Refer to ATI's 3-month lead on driver maturity statements. Are you claiming ATI doesn't also increase their driver performance while they fix their bugs?
Look in ATI fan-base forums regarding testing of newer Catalyst drivers and their expected 3DMark2001 SE/Quake III results.
To quote Australian PC User, Dec 2002, page 36:
"This time, we managed to come up with one AGP8X motherboard that would work with a Radeon 9700 - VIA's P4PB400 board - but the performance results were actually below those achieved with the AGP 4X board."
No speed loss was encountered with the SiS Xabre 400 and NV18 while using an AGP 8X motherboard.
Refer to http://www4.tomshardware.com/business/20020925/atimojo-10.html
This is just a cited example for the Catalyst v2.3 drivers.
Note that this is not the latest Catalyst driver.
There are more Catalyst driver comparisons IF you search the web.
Obviously not, as you may have noticed that I mentioned in my first post that the NV30 was overall faster, just not by as much as it should have been, which was the whole point of this thread.
Refer to the thread title, "GeforceFX=surprisingly slow".
Maybe, but you have just nitpicked through that entire post! Call it what you like, it's true.
Concession accepted.
See my comments on the John Carmack statement.
Did you miss the statement regarding the limits being reached for the R9700?
More proof of your hypocrisy.
More irrelevant character-based rhetoric.
Nitpicking doesn't get you anywhere.
Concession accepted.
Grow up, pedantic prick!
That's all you can do?
Or if you did, maybe you weren't such a fanboy then.
Because you systematically failed to read my posts.
Maybe you should grow up, eh? After all, you're the one trawling the internet continuously for reviews to prove everyone else wrong!
Wrong again. More irrelevant character-based rhetoric. A website had already posted the links relevant to GeF FX benchmarks together.
Are you claiming that you are lazy, or can't use a search engine effectively enough?
I find it amusing that you call me a little boy for getting "personal", but that sentence in itself just proves you are a hypocrite!
You’re the first cause, a flame starter; only a hippy wouldn’t expect a return of fire.
hypocrite!
More irrelevant character-based rhetoric.
That doesn’t get you anywhere, Mr Recalcitrant.
and you like Nvidia
More irrelevant character-based rhetoric.
Relevant. You are the only one defending it.
Subjective assertion.
If it was so good, then don't you think there would be more people defending it? Once again you are nitpicking.
More irrelevant character-based rhetoric.
Surprise, surprise, you're nitpicking again, now about statistical data, so here goes:
that n>30 is bullsh*t. Statistical data can be gained from n=1; it's just that as you take more samples it gets more accurate.
Which one is better?
but it'll just show what a sad life you lead.
It just shows that you are lazy and can't use a search engine effectively enough.
Another GeForce FX revision (regarding the noise).
http://www.theinquirer.net/?article=7626
http://www.hardocp.com/
Quote from hardocp.com, Tuesday February 04, 2003
GeForceFX Reborn:
I know what you are thinking, "Already?" We were lucky enough to put our hands on a revamped GeForceFX 5800 Ultra on Monday and I can say that NVIDIA has moved their GFFX flagship in the right direction.
While physically identical in the picture above, except for the coloration of the ducting system, these are two very different GFFX Ultra cards. This new GFFX cooling system does not run in 2D operation, making it quieter than any other 3D cards in this current generation while not being used in a gaming capacity. When the GFFX Ultra is utilized in a 3D application, the fan system spins up and is still about as loud as it was before. NVIDIA reports it to be around 5dBa quieter than the models we saw Web reviews based on last week.
I gamed for around five hours on Monday with the card installed in my own case and I left the side cover off. The case sits at my feet. I found game play in UT2K3, MOHAA, Wolfenstein, and NFSHP2 to be very playable at 12x10 with 4XAA and 8XAF turned on. BF1942 was acting up on my card but after talking to NVIDIA, I am not sure if it is a driver glitch on their end or a system glitch on my end. Still, it is said to be working great at the NVIDIA labs in Austin, TX. I tend to game with the sound turned on, so I did not find the cooling system on the GFFX Ultra to be an issue at all, but we can all argue about that later.
Now that the noise is gone in 2D, and if it ends up on the shelves this way, there are going to be a lot more folks buying the GFFX Ultra and keeping it. Still, if you are used to a very quiet computing environment, the GFFX is most likely not for you.... but then again those games listed above probably are not either.
What did you say about preview releases (engineering releases) being equal to the final release?
Hammer wrote: Are there any reasons to agree in the first place? The product is not even released in the marketplace.
Minion wrote: Funny, the Radeon 9700 Pro reference board was identical to most of the ones for sale now, and very similar in performance to the review samples.
Are you applying ATI’s experience to Nvidia?
The samples are merely engineering releases.
As hardocp.com's info has indicated, Nvidia was still working toward the final release of the GeF FX.
We can’t make an informed judgment before the final release of said card.
-
JoannaK wrote:
mips_proc wrote:
It's basically an overclocked GPU... it was originally intended to run at a slower clock... which the lower-end bracket of that card will run at.
I'm not saying nvidia is toast... but they better work very hard on their next card... or they're going down...
Agree on this.. It's not such a great chip. And apparently there are discussions of cancelling it altogether.. See
this (http://www.theinquirer.net/?article=7615)
Nvidia may bypass the first-generation product release cycle and go straight to the second-generation release cycle. Nvidia may not like “nForce 1” (or GeForce 256 (NV10)) type untidiness.
-
Minion: The point is the GeforceFX is a crock of sh*te.
In summary;
You: already made a hasty judgment.
- "GeforceFX=surprisingly slow"
- "GeforceFX is a crock of sh*te"
Me: wait for the final release, before making a judgment.
I don’t think “sitting on the fence” = defending nVidia in this case. A real Nvidiot would say “nVidia rulez” at every opportunity.
I wonder who is the fanboy now, when the final card was not even released yet.
Minion: 20-odd days ain't gonna get a huge performance increase in all likelihood, is it? (and yes, I know it's been done before)
AND
Minion: I know, but that's not the point. This is the best Nvidia can come up with.
Not in this case, when a revised GeF FX (refer to www.hardocp.com) and newer Detonator drivers exist after the majority (v42.6x) of the sample (engineering release) reviews.
Via guru3D.com
Detonator 42.86 was dated at 2/1/03 6.12/10
Detonator 42.81 was dated at 2/1/03 5.25/10
20 days is quite a lot in nVidia's case.
The asserted claim that “this is the best Nvidia can come up with” is simply false.
We don’t know the inner workings of nVidia's labs.
PS; Note the date on the revised GeF FX.
The processor has more transistors than the R300, yet it's slower clock for clock than the R300.
That kind of argument doesn’t stick with Intel’s Pentium 4 btw. The packaged overall performance is more important.
There are reasons for the increased amount of transistors, which I have given.
But you responded with:
I checked your link - personally I don't care if the R300 ONLY has 96-bit pixel shader precision; that still allows about 7.9x10^28 different values. I think that's plenty.
Sounds like the “640KB is enough for everybody” statement…
I see you haven't taken this into account.
The GeF FX supports the following:
1. 1024 texture address operations per pass
2. 1024 color instructions per pass
They would need extra transistors IF they support those features in hardware.
Can you repeat your rhetoric IF ATI includes their own "128-bit pixel shader precision" and support for "1024 texture address operations per pass"/"1024 color instructions per pass" features in their next Rxx0 release?
I wonder who is the fanboy now.
Refer to John Carmack's related statements regarding "maximum instruction count" and "program limits on the R300".
Trevor Wilkin also echoes similar statements to John Carmack's. (Trevor Wilkin is Lead Programmer for the Microsoft - Salt Lake City group.)
I suspect this will be the same for DX9+ games as well, driver optimisations or not.
Do you have a basis for this?
As for anandtech.com's review...
As www.hardocp.com has indicated, Anandtech just embarrassed itself by presenting the sample GeF FX as the final review. The rush to be the first to review a particular product hardly equals a quality review…
-
Me: wait for the final release, before making a judgment
you mean wait for the final release and rant like a zealot at anyone who's already decided they don't want a dustbuster in their computer?
-
I could dissect your entire post, and nitpick every single point, but due to its excessive size that would take far too long.
Any points I make, you nitpick them to death, and decide they are not relevant. I have spent far too long on this already. My original point "GeforceFX=surprisingly slow" still stands, as I was expecting it to wipe the floor with the Radeon 9700 Pro. It didn't, and to tell the truth I was disappointed. Why? Because I want to get a GF4 Ti4600 or a Radeon 9700. More competition at the top drives the prices down on the lower segments as well.
You have spent an incredibly long time trying to prove to everyone that the GeforceFX is the best thing since sliced bread, yet have convinced no one.
I accept that you think that it is good, but I don't.
Just remember - while you're trawling through your search engines trying to prove me wrong, I am working for a living and having fun. Maybe you have won this argument, but it doesn't make the NV30 any good, and it's like winning the Special Olympics........
Oh and MIPS
For once we agree.
-
mips_proc wrote:
Me: wait for the final release, before making a judgment
you mean wait for the final release
Correct.
and rant like a zealot at anyone who's already decided they don't want a dustbuster in their computer?
Where did I state this?
What did www.hardocp.com say?
-
I could dissect your entire post, and nitpick every single point
Fire away...
Any points I make, you nitpick them to death, and decide they are not relevant
Any character-based assertions are irrelevant to the topic.
My original point "GeforceFX=surprisingly slow" still stands, as I was expecting it to wipe the floor with the Radeon 9700 Pro
IF one had calculated the memory bandwidth before Anandtech's GeF FX reviews, one could see that the GeF FX would not deliver performance at the hyped level of expectation.
One should not fall for the hype, and should remain neutral until they release the _final_ product.
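For reference, here is the kind of back-of-the-envelope bandwidth calculation meant above, using the commonly quoted bus widths and memory clocks for the two boards (assumed figures taken from the published previews, not official spec sheets):

    # Peak theoretical memory bandwidth = bus width (bytes) x effective data rate.
    def peak_bandwidth_gb_s(bus_bits, mem_clock_mhz, ddr=True):
        effective_mhz = mem_clock_mhz * (2 if ddr else 1)
        return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

    # Commonly quoted figures (assumptions): GeF FX 5800 Ultra = 128-bit bus @ 500 MHz DDR-II,
    # Radeon 9700 Pro = 256-bit bus @ ~310 MHz DDR.
    print(peak_bandwidth_gb_s(128, 500))   # ~16.0 GB/s
    print(peak_bandwidth_gb_s(256, 310))   # ~19.8 GB/s

Even before any benchmarks, those raw numbers show why the hyped expectations needed discounting.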
You have spent an incredibly long time trying to prove to everyone that the GeforceFX is the best thing since sliced bread, (SNIP)
Where did I state this? Please be more specific.
Just remember - while you're trawling through your search engines trying to prove me wrong, I am working for a living and having fun
It just took me less than 30 minutes, since I had already bookmarked most of the mainstream PC hardware websites.
IF one works in the IT industry (solution provider), one should keep abreast of future developments. This kind of information gathering is minor.
but it doesn't make the NV30 any good,
Find "good = NV30" in my post.
and its like winning the special olympics........
nVidia's issuing of a revised GeF FX means that they had more problems to fix.
-
You just don't geddit, do you?
So desperate to nitpick that you didn't get the message of my previous post.
I can't be arsed with this waste-of-time argument any more!
-
I guess you can’t take the heat…
Refer to http://www.beyond3d.com/interviews/jcnv30r300/index.php?p=2
For more John Carmack on NV30 vs R300 (06 February 2003). Notice the statements referring to nVidia driver improvements.
-
TWAT!
-
Here’s one example of why not to jump the gun too early…
Refer
http://www.hardocp.com/article.html?art=NDI4LDQ=
http://www.hardocp.com/image.html?image=MTA0NDkyODY4NXltVGQ3TE51aE1fNF80X2wuZ2lm
PS; the Nvidia driver being used is 42.67 (for HardOCP's newer nVidia driver test), not the latest 42.86 leak.
www.anandtech.com used the slower 42.63:
http://www.anandtech.com/video/showdoc.html?i=1779&p=4
As HardOCP's preview has shown, the driver does play a significant role.
Avoiding early judgements avoids making foolish statements.
-
For some reason I actually bothered to read those URLs. They still confirm what I originally said - GFFX=not all that. I didn't say it was slower than the Radeon 9k7 Pro, just not that impressive considering that the GeforceFX STILL isn't available.
This supports my argument - although they use outdated benchmarks, the message is still the same - is it worth bothering?
http://www.gamersdepot.com/ed/geforcefx/001.htm
You still haven't proven that the GeforceFX is lots better than the Rad9k7pro, and somehow, I doubt that you will.
-
@Minion
What do you think of ATI's drivers for the radeon 9700?
(he quietly walks away....)
-
Phoenix wrote:
@Minion
What do you think of ATI's drivers for the radeon 9700?
(he quietly walks away....)
I have no opinion on ATI's drivers for the Radeon 9700, since I have never used them. I have used a few ATI cards in the past, and the drivers were generally crap. I will comment on the 9500/9700 drivers if I get a card, as I don't trust reviewers anyway. My experience of Nvidia drivers has generally been good. I am currently thinking about getting a Radeon 9500 Pro, but will not pay more than £120+VAT for one. It's either going to be that, or a GeForce4 Ti 4200 8x (£99+VAT). I will also have to be convinced that their drivers are much better these days.
-
DIE THREAD DIE!
You're all saying things that have been said and argued about on this thread before! Please, please just let this thread die! There's nothing more to be argued about, and if you think there is, can't you just agree to disagree, because all there can be left is just a plain difference of opinion that you're not going to be able to do anything about!
Even the full-time ATI and NVIDIA zealots probably think this argument is ancient history!
-
For some reason I actually bothered to read those URLs. They still confirm what I originally said - GFFX=not all that.
You are already passing a judgement based on the 42.63 beta driver and non-final hardware (e.g. the noise issue)?
I didn't say it was slower than the Radeon 9k7 Pro, just not that impressive
I recall that the product is not even in its final release form. The level of your asserted "impressive" should be based on math estimates, not just on press hype.
You still haven't proven that the GeforceFX is lots better than the Rad9k7pro,
What’s to prove, when the product was not even in its final form?
Where did I assert that "GeforceFX is lots better than Rad9k7pro"?
"Lots better" is inherently subjective.
and somehow, I doubt that you will.
Try again, since I have not stated or claimed "GeforceFX is lots better than Rad9k7pro".
-
Hammer, you look like a little kid trying to tell his classmates that a snow day is coming when it's June 1st and 90 degrees outside..
get a grip... you can tell everyone they're the fools if/when the GFFX revives.... currently it's a glorified dustbuster with sucky drivers and a 10% overall lead...
-
mips_proc wrote:
Hammer, you look like a little kid trying to tell his classmates that a snow day is coming when it's June 1st and 90 degrees outside..
get a grip... you can tell everyone they're the fools if/when the GFFX revives.... currently it's a glorified dustbuster with sucky drivers and a 10% overall lead...
Amen to that!
I've just realised what Hammer is doing. Since he won't disagree with what I originally stated, he is just nitpicking and trying to win an argument. I don't think the aim is to prove the GeforceFX is great, but more to prove that he is the greatest arguer on this board, or is always right; however, to do that, he has to prove that the GeForce is great.