ATIs better than nVidias for HL2?

EnricoPolatso

Guest
Just noticed in the Valve interview on GameSpy, Doug Lombardi says:

For folks who want the ultimate experience, they'll want the latest ATI card, and the fastest processor available from AMD or Intel

Hmmmmm - not the latest nVidia card? To be honest, I think I'm going to stick with nVidia for my next card because I want FreeBSD support, which only nVidia offers, and I'm sure it'll look amazing on either brand of card.

But I thought that was interesting all the same...
:cool:
 
Yeah, I noticed it too, but you have to take into account that it's been rumored ATI paid Valve to let them use the 9800 Pro in the tech demo and to plug the latest ATI line as better performing. However, I think it's just that ATI currently is superior in HL2. After all, I don't think Valve would sell out like that.

I myself am getting a 5900 Ultra, mainly for Doom III. It doesn't bother me that I'll get a little less FPS in HL2; as long as I can run Doom III fine, I'm OK.
 
I'm sure that the way Source is set up, the ATI card probably just runs better
 
People are saying that the 9800 Pro is better with shaders, but review sites have yet to test that theory. Since Half-Life 2 makes heavy use of shaders, if that's true then ATI would take the crown with HL2.
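For the curious, "makes heavy use of shaders" here basically means pixel shader 2.0 support matters. Here's a rough sketch of how a DX9 game can check for it using the standard D3D9 caps query (SupportsPS20 is just my name for it, nothing HL2-specific; link against d3d9.lib):

Code:
// Rough sketch: query the D3D9 device caps and see whether the card
// reports pixel shader 2.0 support. Error handling kept minimal.
#include <d3d9.h>
#include <cstdio>

bool SupportsPS20()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return false;

    D3DCAPS9 caps;
    bool ok = SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                           D3DDEVTYPE_HAL, &caps))
              && caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);

    d3d->Release();
    return ok;
}

int main()
{
    std::printf("Pixel shader 2.0: %s\n",
                SupportsPS20() ? "supported" : "not supported");
}

Note that both the R3x0 and NV3x will pass a check like this; the argument in this thread is about how fast they actually run PS 2.0 code, which a caps bit won't tell you.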
 
Actually, about Doom III... it was optimized for different cards, so that means it is optimized for both ATI AND Nvidia. They say you go into options and select your chipset and then boom... it's optimized for your card :)
 
Ahh, so the reason Valve is pushing ATI has been revealed:

http://www.halflife2.net/forums/showthread.php?s=&postid=52610#post52610

1) Is this a problem that can be fixed with new drivers, or would we have to buy a whole new card to rectify it? If so, are there any cards on the horizon that would offer it?

Drivers aren't likely to fix the problem, with the exception of the ATI 9500-9800. There's hope there for being able to use FSAA properly. You are out of luck on NVidia unless either NVidia or we come up with some clever way of solving this problem.

2) Is this a problem unique to hardware + Source?

It's a problem for any app that packs small textures into larger textures. The small textures will bleed into each other if you have multisample FSAA enabled. The best thing to do right now is either buy an ATI card in the hope that it will be solved there, or wait until the next generation of cards comes out.
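To picture the bleeding he's describing: when small textures get packed into one big atlas texture, filtering (especially with multisample FSAA) can sample past a sub-texture's edge into its neighbour. The usual workaround, and this is just the standard trick, not Valve's actual fix, is to pull the UVs in by half a texel:

Code:
// Sketch of the common texture-atlas workaround: shrink each sub-texture's
// UV rectangle by half a texel so filtering doesn't sample the neighbouring
// packed texture.
struct UVRect { float u0, v0, u1, v1; };

UVRect AtlasUVs(int x, int y, int w, int h, int atlasW, int atlasH)
{
    const float halfU = 0.5f / atlasW;   // half a texel in U
    const float halfV = 0.5f / atlasH;   // half a texel in V

    UVRect r;
    r.u0 = (float)x / atlasW + halfU;
    r.v0 = (float)y / atlasH + halfV;
    r.u1 = (float)(x + w) / atlasW - halfU;
    r.v1 = (float)(y + h) / atlasH - halfV;
    return r;
}

Half a texel is where the bilinear filter footprint stops crossing into the neighbour; multisampling at polygon edges can still grab neighbouring texels, which is why atlases are usually also padded with duplicated border pixels.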
 
Yeah, I'm getting a 5900 Ultra, 'cuz I like Nvidia support and they have always been pretty solid with the drivers. I also want it for Doom III and Thief III, and there are rumors that Stalker is being optimised for Nvidia's FX line (sorry ATI fans).
 
No card is optimised for one particular game only; it's all PR bullshit. If a card performs better on a certain game, it's because it has better strengths in certain areas that the game makes use of. Don't believe the "optimised only for XXXXXX" bullshit that's being thrown around.
 
Originally posted by Northwood83
No card is optimised for one particular game only; it's all PR bullshit. If a card performs better on a certain game, it's because it has better strengths in certain areas that the game makes use of. Don't believe the "optimised only for XXXXXX" bullshit that's being thrown around.
No, all games mostly try a broad approach, getting it to work on most cards. However, there are KNOWN examples of these optimisations (I remember one game that was patched like a week before going gold; it had been ONLY for Nvidia, but was made to work for ATI, can't remember its name though). The biggest current example is Neverwinter Nights. That is DESIGNED with Nvidia cards, their OpenGL extensions and functionality, in mind. It still does not work fully on ATI cards (it freaks out with 4xAA and up, it literally crawls, the water shader isn't working, and it even shows Nvidia FSAA settings even though I'm on an ATI (w00t! I can enable Quincunx AA!))

Anyway, back on track: the R3x0 chip SMASHES the NV3x chip in DX9 applications. It is a proven fact. The NV3x is an awesome DX8 card, and the NV35 is relatively fast in DX9 compared to its predecessors, but it still does not stand a chance against ATI. That's why their cards are the best. Not because HL2 will be designed with the R3x0 in mind, but because the R3x0 was designed with DX9 in mind, and so is HL2.


and there are rumors that Stalker is being optimised for Nvidia's FX line (sorry ATI fans).
When the FX 5800 came out, people asked the Stalker team how their 9700 would fare against it; it turned out the game was being coded on a 9700 at the time because the 5800 was too slow to play Stalker at a bearable framerate :D
I'm sure the 5900 fares better though :)
 
Originally posted by razorbill
Stalker, Doom 3 and HL2 would not be fine with a 5600? :x
Most likely they will be fine. Nvidia drivers will make sure of it, one way or the other, and you will have a fast game ;)
But as an example, both the 9800 and 5900 run Doom 3 like crap. The 5600 is half as fast as those two cards, meaning it will run twice as crappy. Though since we haven't seen the final paths, it might run well enough.
 
Originally posted by dawdler
Most likely they will be fine. Nvidia drivers will make sure of it, one way or the other, and you will have a fast game ;)
But as an example, both the 9800 and 5900 run Doom 3 like crap. The 5600 is half as fast as those two cards, meaning it will run twice as crappy. Though since we haven't seen the final paths, it might run well enough.


What are you babbling about?! HL2 will work great even with a GF2.
 
Originally posted by Romano_Cule
What are you babbling about?! HL2 will work great even with a GF2.

It may work, but it will not look good. You're talking about Half-Life 2, right?
 
Hmmm, didn't you guys know that Doom III was optimized for both ATI AND Nvidia? In a John Carmack .plan file (sorry, can't find the link) he said that they had finished working on the ATI optimizations and were trying to fix the FSAA bug with Nvidia. He said that you could go into options and select your chipset and BOOM, it is now optimized for it with all extensions, shaders, etc.
 
Originally posted by Iced_Eagle
Hmmm, didn't you guys know that Doom III was optimized for both ATI AND Nvidia? In a John Carmack .plan file (sorry, can't find the link) he said that they had finished working on the ATI optimizations and were trying to fix the FSAA bug with Nvidia. He said that you could go into options and select your chipset and BOOM, it is now optimized for it with all extensions, shaders, etc.
Not technically correct. It's actually that ATI uses the "standard" ARB2/ARB path with little chipset-specific optimisation, while Nvidia uses a specific, heavily optimized NV path to achieve its high scores.
At any rate, we already know that all Nvidia generations run OpenGL better than ATI.
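For anyone wondering what these "paths" mean in practice: the engine looks at which extensions the driver exposes and picks a code path accordingly. Here's a very rough sketch of the idea (my own simplification, not id's actual code; PickRenderPath is just an illustrative name):

Code:
// Very rough sketch of vendor render path selection, Doom 3 style:
// query the driver's extension string and fall back from the
// vendor-specific path to the generic ARB one.
// Assumes a current GL context.
#include <GL/gl.h>
#include <cstring>

enum RenderPath { PATH_ARB, PATH_ARB2, PATH_NV30, PATH_R200 };

static bool HasExtension(const char* name)
{
    const char* exts = (const char*)glGetString(GL_EXTENSIONS);
    return exts && std::strstr(exts, name) != nullptr;
}

RenderPath PickRenderPath()
{
    // NV30 path: Nvidia's proprietary fragment program extension.
    if (HasExtension("GL_NV_fragment_program"))
        return PATH_NV30;

    // ARB2 path: the standard ARB fragment program extension
    // (what the R3x0 cards run in Doom 3).
    if (HasExtension("GL_ARB_fragment_program"))
        return PATH_ARB2;

    // R200 path: ATI's older fragment shader extension.
    if (HasExtension("GL_ATI_fragment_shader"))
        return PATH_R200;

    return PATH_ARB; // lowest common denominator
}

The strstr check is the quick-and-dirty version (it can false-positive on extension names that are prefixes of longer ones), but it shows the shape of it.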

And on the "HL2 will run great on a Geforce 2" thing: there appears to be a common interpretation that what Valve said was "this game runs just as well on a 700/Geforce 2 as on a 3.0/9800 Pro", but there is an OBVIOUS misinterpretation in that ;)
It will without a doubt run horribly on a Geforce 2 compared to a 9500/9700/9800. It will run, yes, but you won't be getting the experience that is HL2 anymore.

It's like taking a 5900 Ultra, underclocking the core to 100 MHz, and going "w00t! I have an FX 5900 Ultra!!!!!!!!!! I'm the 1337est in the world!!!!!!!!"
 
I have the FX 5900 Ultra :) and I have played the DOOM III demo, and I get over 70 fps and it looks fine.
Also, there are some parts of Unreal 2 and UT2003 that look better and run better on an Nvidia card!
 
Doom III demo? There isn't one.

Some parts? UT2003 can't run with trilinear filtering on Nvidia cards (the driver forces bilinear even if you set it to trilinear, to gain speed) and has HORRIBLE mipmap banding compared to the ATI cards. (So much for better AF; you aren't allowed to use it anyway ;))
Did you play Unreal 2 with 4xFSAA and 16xAF?
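For reference, this is roughly all an app can even ask the driver for; the "optimisation" being complained about is the driver quietly downgrading the request underneath. A minimal OpenGL sketch, assuming a bound texture, a current GL context and the EXT_texture_filter_anisotropic extension (SetTrilinearAniso is just my name for it):

Code:
// Minimal sketch of requesting trilinear + anisotropic filtering on the
// currently bound texture. Enum values are from the
// EXT_texture_filter_anisotropic spec.
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

void SetTrilinearAniso(float wantedAniso)
{
    // Trilinear: linear filtering within a mip level plus linear
    // blending between mip levels (this is what hides mipmap banding).
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                    GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Clamp the requested anisotropy to what the card reports.
    float maxAniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);
    if (wantedAniso > maxAniso)
        wantedAniso = maxAniso;
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    wantedAniso);
}

If the driver decides to hand back bilinear anyway, nothing in the API tells you; that's the whole complaint.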
 
I play with 4xAA and 4xAF when I'm on my ATI machine. thenerdguy must be thinking about the leaked alpha.
 
I play with 16x AF on my mini beast, no AA, because it's only a 9500 that grew up into a 9700. *gives video card dirty looks*
 