I well remember P4 specs... first generation P4's got beat by Athlons... now look where we are a little while later?
So you mean that the GFFX has a longer pipeline and will be able to scale to higher clockspeeds? Or are you attributing the P4's current performance to software optimization?
I'd take issue with the P4/software route since many of the P4 optimizations also benefit the Athlon and give it a performance boost as well. Perhaps not as much as the P4, though.
If you take a P4 at 2 GHz and an Athlon at 2 GHz, clock-for-clock, and run the same program, the Athlon would perform better. Sure, software optimization will help the P4 a bit, but... I think the fact that the P4 is clocked something like, what, 600 MHz higher? is more of a factor than anything else.
Since the GFFX isn't exactly positioned as the first "GPU" of a family of chips designed to scale up to 2 GHz, uh, I don't see how the analogy holds any water.
---------
Enron? Diversification? Actual products? Blah blah blah.
http://finance.yahoo.com/q?s=NVDA&d=t
Looks to me like nVidia's motherboard chipset efforts, broad product range (high-end, mid-range, budget, portable), and R&D (creation of Cg and attempting to get it adopted, migration to the .13 micron process) have really paid off.
I mean, a year ago their stock was worth almost 7 times as much. Granted, the stock market is insane, but this suggests that investors have waning faith in nVidia's ability to provide a competitive product--diversification or not.
And now consumers are also questioning nVidia.
http://www.hardocp.com/article.html?art=NDIxLDY=
I don't know where there was a mention of 300 MHz overclocking or room for improvement. The sample only went up 30 MHz or so. They'd NEED to clock it up another 300 MHz the way the current sample is performing.
This is a "reference" board with premature drivers, so I fully expect the card to perform better with the actual retail products and after a few driver revisions. Still, this is a really crappy start.
Oh, and whoever linked to Tom's... that site is generally regarded as biased and bowing to nVidia. It might not be the best place to cite praise for the GFFX.