Another DX9.0 benchmark test...

Ah, good that they finally took the time to test it in full (I already knew the 9800 Pro was 2-3 times faster than the 5900 Ultra in TROAD though).
Interesting to note that even the 9600 Pro heavily outperforms the 5900 Ultra :)
 
How many points do you get with the 9600 Pro?
 
I realize this benchmark is about DX9 compatibility, but there is a reason why sites do not benchmark using full AA/AF.
Nvidia has an AA level (8x) that ATI doesn't have, while ATI has an AF level (16x) that Nvidia doesn't have. You don't know what is causing the performance numbers if the cards aren't running the same settings (resolution, AA/AF levels, color depth, etc.).
Remember, ATI uses 16-bit and 24-bit precision and converts to 32-bit when needed (it does not hold 32-bit quality), while Nvidia uses 16-bit and 32-bit precision. The reference to 16-bit vs 32-bit color depth in the reflections isn't the same thing as precision.
Neither of these generations of cards is DX9 compliant (only compatible). MS changed the DX9 specification after both cards were already designed. Christmas isn't that far off if you want to wait for a DX9 compliant card (maybe even DX9.1), that is, if you care about DX specifications.
 
Actually, they didn't go with full AA/AF because of bugs in Nvidia's drivers; read the B3D thread accompanying the article.
 
Originally posted by Asus
I realize this benchmark is about DX9 compatibility, but there is a reason why sites do not benchmark using full AA/AF.
Nvidia has an AA level (8x) that ATI doesn't have, while ATI has an AF level (16x) that Nvidia doesn't have. You don't know what is causing the performance numbers if the cards aren't running the same settings (resolution, AA/AF levels, color depth, etc.).
Remember, ATI uses 16-bit and 24-bit precision and converts to 32-bit when needed (it does not hold 32-bit quality), while Nvidia uses 16-bit and 32-bit precision. The reference to 16-bit vs 32-bit color depth in the reflections isn't the same thing as precision.
Neither of these generations of cards is DX9 compliant (only compatible). MS changed the DX9 specification after both cards were already designed. Christmas isn't that far off if you want to wait for a DX9 compliant card (maybe even DX9.1), that is, if you care about DX specifications.
Well that's uninformed :)
The culprit here is pixel shaders. Pixel Shader 2.0 (DX9) hardware, to be exact... Nvidia stinks at it.
Concerning color precision, ATI NEVER uses fp16. Why? It can't do it. It does everything in fp24. It can't do fp32 either. Just fp24.
Nvidia, on the other hand, uses FX12 (int12), fp16, and rarely fp32. Mostly it's fp16. Both cards are DX9 compliant as far as I know, but Nvidia is borderline, since DX9 demands at least fp24 for its shaders and Nvidia usually wants to run them in fp16. If you know what exactly makes them only DX compatible, point it out; don't walk around the subject, it's something I would like to know :)
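To put rough numbers on those formats (a quick illustrative sketch, not from the article; the mantissa widths assumed are the commonly cited ones: 10 bits for fp16, 16 bits for ATI's fp24, 23 bits for fp32):

```python
# Illustrative only: relative precision (machine epsilon) of the shader
# formats being argued about. Mantissa widths assumed: fp16 = 10 bits,
# ATI's fp24 = 16 bits, fp32 = 23 bits.
formats = {"fp16": 10, "fp24": 16, "fp32": 23}

for name, mantissa_bits in formats.items():
    eps = 2.0 ** -mantissa_bits  # smallest relative step near 1.0
    print(f"{name}: ~{eps:.1e} relative error per operation")

# fp16: ~9.8e-04 -> error builds up fast over dependent shader ops (banding)
# fp24: ~1.5e-05 -> the DX9 "full precision" minimum for pixel shaders
# fp32: ~1.2e-07 -> highest quality, but the slow path on the FX series
```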
 
...just to add something: the AA quality of ATI cards is superior to the same nominal setting on Nvidia cards. Example: 4xAA (ATI) > 4xAA (Nvidia).
Plus, the fp32 path on Nvidia cards makes the hardware crawl; that's why Nvidia tries, whenever possible, to drop high-precision shader work down to int12 (or fp16).
 
Thanks, nice :) I still don't know which one to buy :p
 
The 9600 doesn't really beat the 5900 by much (the graphs have different scales, if you didn't notice). But I'm pleasantly surprised to see how well it performed, since I plan on getting it before HL2 is released.
 
No, it isn't much, like 50 fps vs 40 fps or 30 fps vs 25 fps, but the thing is, it CAN beat it. It really shouldn't. Personally, I'm very disappointed in Nvidia. When their newest and hottest card can't match up with a budget-rated card in an advanced and taxing DX9 game, it's really bad. There is absolutely no reason whatsoever to buy anything from Nvidia's current lineup; even diehard Nvidia fans realize this (although most won't admit it).
 
I still feel sorry for all the goons that bought an FX card to play HL2... and upcoming DX9 games.

And the people who dropped $500 on a 256MB 5900... have fun with your 10 extra fps in a DX8 game.
 
Originally posted by SidewinderX143
"bugs in nvidia's drivers"

proof?
What part of "read the thread" do you not understand?

"NVIDIA emailed me, asking why AA wasn't tested.

I said (in summary, not ad verbatim) that there's a bug when AA is enabled at certain AA levels+resolutions (very nasty bug) but that this only happens on a 128MB 5900 (works fine on a 256MB 5900). "



http://www.beyond3d.com/forum/viewtopic.php?t=7543&start=40
 
Get a load of this (Gabe's reply - quoting from the General Discussion board):
I have been a long-time NVIDIA card user. Currently I have ATI 9800 Pro's in both my work and home machines.

The DX9 performance described by the Beyond3D article is consistent with what we've been seeing.

Case closed ;)
 
Originally posted by Asus
I realize this benchmark is about DX9 compatibility, but there is a reason why sites do not benchmark using full AA/AF.
Nvidia has an AA level (8x) that ATI doesn't have, while ATI has an AF level (16x) that Nvidia doesn't have. You don't know what is causing the performance numbers if the cards aren't running the same settings (resolution, AA/AF levels, color depth, etc.).
Remember, ATI uses 16-bit and 24-bit precision and converts to 32-bit when needed (it does not hold 32-bit quality), while Nvidia uses 16-bit and 32-bit precision. The reference to 16-bit vs 32-bit color depth in the reflections isn't the same thing as precision.
Neither of these generations of cards is DX9 compliant (only compatible). MS changed the DX9 specification after both cards were already designed. Christmas isn't that far off if you want to wait for a DX9 compliant card (maybe even DX9.1), that is, if you care about DX specifications.
Well, I assume the R360 will be better equipped to handle DX9, but again, you won't see compliant cards for a while longer.

I feel that ATI's 6xAA should be tested against Nvidia's 8xAA; the Radeons are faster and look better. If anything, sites should state that the settings are different at "max quality settings" and show the tests anyhow. It's not a rigorous test per se, but it sure is a decent performance comparison.
 
Originally posted by Pagy
Well, I assume the R360 will be better equipped to handle DX9, but again, you won't see compliant cards for a while longer.

I feel that ATI's 6xAA should be tested against Nvidia's 8xAA; the Radeons are faster and look better. If anything, sites should state that the settings are different at "max quality settings" and show the tests anyhow. It's not a rigorous test per se, but it sure is a decent performance comparison.
R300 is DX9 compliant; no one has proven otherwise yet.

Btw, even at 8x it sucks :) Nvidia's 8x MSAA is arguably worse than 4x SS (actually, it is).
 
So, in summation, how does this apply to the games?

HL2, D3, STALKER, Far Cry, MP2?
 
Originally posted by alco
So, in summation, how does this apply to the games?

HL2, D3, STALKER, Far Cry, MP2?
Overall, it means that with 2x the speed in DX9 games, people using ATI can get a HELL OF A lot better image quality... Both by default (that 2x advantage already comes at higher quality) and because you can enable more and still keep up with Nvidia... High FSAA/AF is out of the question for Nvidia, and they're the ones that really need the taxing FSAA... if you want to use it, that is.

Specifically, it doesn't mean jack shit, because we don't know how the developers built each game. We know a bit: HL2 will be faster on ATI. Doom 3 uses a special path that is faster on Nvidia (BUT if both had used the "standard" OpenGL path, ATI would have been faster; Carmack has said so himself). STALKER is a notorious TWIMTBP game, BUT it was developed on an ATI 9700, since the FX of the time (the early 5800) couldn't handle it :p Far Cry I don't know about... and neither MP2...
 