Half-Life 2 DirectX 9 Performance

The difference between the 9800 Pro 128MB and 9800 Pro 256MB is negligible in most current games without AA and AF turned on. But with 6x AA and 16x AF (I think those are the numbers), the performance difference is about 10% in most modern games, and that gap will only grow as the next generation of games comes out.

This is why I'm getting the 256MB: I won't be upgrading again any time soon after this, so I need it to be able to run future games well.
 
So what is the best 9800?

I heard the Gigabyte one is fantastic.

I think i'm going to change my avatar soon :p
 
I'm getting the Sapphire ATi Radeon 9800 Atlantis PRO 256MB DDR II, as it's meant to be really good and I've looked at loads of benchmarks for it.
 
What do you guys suggest? I only want to spend about $200, and I'll be waiting till mid-October to buy a new card. Do you think the 9700 Pro will be in the $200 range by then? Or can I get a 9700 and flash the BIOS so it acts like a 9700 Pro? Either way, I don't have the cash for a 9800 Pro, or even a 9700 Pro at its current price.
 
This site is getting a lot of publicity from the "info from Valve" thread.
 
Can you really flash a 9800 and get it to run like a Pro?

Heheh, so it does. Found this post on a forum:

The flash DOES work, but you need to be careful what you use!! I took the plunge and flashed to what was billed as the "latest" 9800 Pro BIOS (filename 98-8004-NEW.rom), which actually sent me backwards in my OC'ing. Had to go back to 412/325 from my non-Pro OC of 412/344 to get rid of artifacts. Boy was I disappointed!! I was getting ready to flash back to the non-Pro BIOS when I thought I might as well try the other BIOS I found in posts here. Flashing to the "standard" one (filename 98-8004.Bin) made my 9800 non-Pro scream, just as the forum posts suggest. It's magic!! Running at 425/380 with no problems. I'm not trying to go any higher, as I'm ecstatic to be at 425/380 on a $250 Sapphire 9800.
 
Originally posted by d8cam
Thanks for the info!

I am currently a GeForce 3 Ti200 owner and am switching to ATI as soon as HL2 comes out. The 9800 is surely the best card out right now. No arguing, please.

I would wait if I were you. With the 9800 Pro you will probably get about 30-40 fps in MP at medium settings, and that's IMO not enough...
If I were you I would keep my GF3 until the new ATi R340 (or something like that) comes out; it shouldn't take that much longer, sometime in the spring. Then you will be able to play HL2 MP at 50-60 fps on high settings :)
 
I would wait if I were you. With the 9800 Pro you will probably get about 30-40 fps in MP at medium settings, and that's IMO not enough...

False.

Get the 9800 Pro; it should play HL2 perfectly. Some of it will depend on your processor, but if you are at 2 GHz+ it will be great. I mean, Gabe wouldn't have a 9800 Pro in his home machine if he were only planning to play the game at medium settings and get 30 fps, would he? Bah, nope.
 
No, he just paid a lot for it; the ATi 9800 Pro 256MB costs $500 in the USA ;o

I bought it for 500 euros :O~
 
lmfao, thanks. I was planning on buying an FX 5900 Ultra 256MB as well LOL
 
Half-Life 2 benchmarks could help us get a life.

So where to start? Well, I've recently been looking at revising the benchmark suite we use on bit-tech. There's been plenty written about the unreliability of 3DMark2003 and other benchmarks, so we figured it was time to re-evaluate. We have been looking at the joys of Gun Metal recently - Nvidia's flagship DX9 game (well, I say flagship - it's the only one they have right now). Despite being a prime part of Graphzilla's The Way It's Meant To Be Played regime, it turns out it runs just as well on ATi hardware. Strange, no?

Likewise, the Dawn tech demo has caused a lot of wrangling as fans try to get it to run on ATi cards - with success. The bizarre thing is that it runs just as well there, if not better. That's despite an Nvidia tech telling me it used shader routines and the like that simply wouldn't work on ATi hardware.

So this leads me to ask, is there really that much difference between the hardware being pushed our way?

Yes, we have different memory bandwidths, clock speeds, shader instruction support and the like. But in the end, does that really give us any different result? There is very little to separate high-end Nvidia cards from high-end ATi cards now in terms of performance, so are we being fed all this marchitecture for nothing?

The worrying thing is that I think the graphics companies are onto this, and are taking more and more drastic measures to try and differentiate their products. I mentioned benchmarking at the beginning - well you may have read the reports that imminent stunner, Half-Life 2, will be coming with a benchmarking tool. 'Heavens!' thinks every hardware reviewer in the world - a game people will want that we can use to benchmark products.

Except for the rumours recently reaching my ears that ATi, following its collaboration to show HL2 at E3 in Las Vegas, has since paid a seven-figure sum to Valve Software to optimise for ATi hardware. Rumour? Possibly, but an unnamed Nvidia source said 'it wouldn't surprise him'. ATi PR refuses to comment, and Valve Software was extremely diplomatic and just fed us the official company line. Read into that what you will. If this is the case, the question is: how much of a difference will it make? If Nvidia's overt TWIMTBP regime can only produce a game that runs five per cent faster, how far will what are now late-stage covert optimisations get ATi? The proof of the pudding will be in the gaming, and I for one can't wait for the next interesting chapter in this saga.


Source: www.theinquirer.net


:cheers:
 
I was going to buy an FX 5900 Ultra as well when I saw it getting 30 fps more than the Radeon 9800 in a lot of benchmarks, but then I heard about the dodgy DX9 performance and decided to save myself some money and get the Radeon 9800 Pro 128MB - a card that's also more likely to perform in HL2. Some Doom 3 benchmarks do show the FX as better, but the test machines were set up by NV, so I can't put that much faith in them.

BTW, there is virtually no difference between the 128MB and 256MB versions of the R9800; the only way they could find any difference between the two cards was to run Doom 3 at ultra-high settings with only 4x AGP enabled.
 
Part of me has really enjoyed having a top-end card again with my 9800 Pro 128MB, and that makes me want to sell it when the next big thing comes out (the 9900, I guess). If the rumors about it being "a lot" faster than the 9800 are untrue, I guess I will hold onto my 9800 for a year or so, probably longer. But if there is a significant increase with the 9900, then I will probably sell the 9800 Pro. Anyone think I will still get good value for it?
 
From what I've read, the R360 is supposed to be the 9900, and it's supposed to be a revamped 9800. I think the one you might be talking about, Fluxcap, is the R400; whatever that is going to be called, no one knows for sure - even the R360's real name is up in the air. But the R400, from what I've read, is ATi's next generation of cards, and that's the one that's supposed to be possibly twice as fast as the 9800. Maybe what I've been reading is incorrect, but that's what I've seen. And as for the release of the 9900, I've heard September and October, but the R400 might not be out till next spring.

Edit: I actually might wait to upgrade till the R400 comes out. If it's supposed to be that much heftier than the 9800 series, then I don't want to shell out $400 for the 9800 Pro now when the same $400 early next year will get me something that's way more advanced. I hate buying graphics cards at the end of their release cycle. For example, I bought the GeForce 3 Ti 500 in February/March of 2002, and then in like April the new GeForce 4s came out. For what I had spent on my GF3, I could have gotten a top-of-the-line GF4 if I had only been smart enough to check when they were coming and waited a few months. Needless to say, I'm not going to do that again.
 
"Except for the rumours recently reaching my ears that ATi, following its collaboration to show HL2 at E3 in Las Vegas, has since paid a seven-figure sum to Valve Software to optimise for ATi hardware. Rumour? Possibly, but an unnamed Nvidia source said 'it wouldn't surprise him'."
Oh yeah, that confirms it - an unnamed Nvidia source says it wouldn't surprise him if ATI paid them :p

Doesn't matter, really. Nvidia has done the same in the past. Hell, some games hardly worked on Radeons when they came out... (one had an emergency patch about a week before gold; can't remember the name though), and Nvidia's The Way It's Meant To Be Played optimisations in Neverwinter Nights wreck performance on ATI, and even break certain features... But on the other hand, since Tomb Raider: AOD is a The Way It's Meant To Be Played game, we know how good that is :p

I would laugh if ATI can outperform, outfeature and simply own Nvidia in HL2. We KNOW Valve would never willingly cause worse performance on Nvidia cards, but it would show what an ATI card can do... And since the R300 == DX9, no one can even claim they optimise for ATI - they optimise for DX9 :)
 
I've seen 9800 Pros break 7000 before - keep in mind, that was on a severely tweaked and OC'd system/video card.
 
I'm getting a radeon 9800, and I'm so glad now :)
I knew ATi would own Nvidia :)
 
I wouldn't say "own", but ATI is improving and getting more and more people convinced every day. The competition is healthy for the market I think. Who knows, I may get an Nvidia card next, I will just have to wait and see what is the best bang for the buck.
 
I'm honestly concerned about this DX9 performance in the FX cards. How can it be that bad? How can this be reconciled by the fact that benchmarks so far have shown FX cards to be faster than Radeons in Doom III - which is a DX9 game, right?
 
Exactly.
Get the best card with the features you use at the price you can afford.

It's all we can do.
I bought my 9800 Pro AIW last month for the IQ, the 6x AA performance and the DX9 performance. I seriously considered the 5900 Ultra as well, but went another way.
 
Originally posted by dis
I'm honestly concerned about this DX9 performance in the FX cards. How can it be that bad? How can this be reconciled by the fact that benchmarks so far have shown FX cards to be faster than Radeons in Doom III - which is a DX9 game, right?
Doom III is OpenGL. And it was NOT using the same path on both cards. If you translated it, it would be like running HL2 on an "Nvidia" path versus a "DX9" path... The Nvidia path would automatically lower precision/quality. The DX9 path would be equally fast - just not until the next driver revision from Nvidia :)

And yes, I am sarcastic

/me kisses my 9700 Pro

/me looks around for a hospital that treats burned tongues
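For anyone wondering what "lower precision" actually means here: the FX-era argument was about running pixel shader math in 16-bit floats instead of 24/32-bit. As a rough, vendor-neutral illustration (nothing to do with any actual driver code), here is how much a value drifts when squeezed through 16-bit versus 32-bit floats, using Python's `struct` module:

```python
import struct

def round_to_fp16(x: float) -> float:
    """Round a Python float through IEEE 754 half precision (16-bit)."""
    return struct.unpack('e', struct.pack('e', x))[0]

def round_to_fp32(x: float) -> float:
    """Round a Python float through IEEE 754 single precision (32-bit)."""
    return struct.unpack('f', struct.pack('f', x))[0]

value = 0.1  # stand-in for a typical shader constant
fp32 = round_to_fp32(value)
fp16 = round_to_fp16(value)

print(f"fp32: {fp32:.10f}  error: {abs(fp32 - value):.2e}")
print(f"fp16: {fp16:.10f}  error: {abs(fp16 - value):.2e}")
# fp16 keeps only about 3 decimal digits of precision, so chained
# per-pixel math drifts far sooner than with fp32 (or ATi's fp24).
```

The half-precision error is several orders of magnitude larger, which is why a "lower precision" rendering path can look visibly different even when it is faster.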
 
The FX cards will probably beat the Radeons in Doom 3, which runs under OpenGL. As for HL2, it looks like the Radeons will be the best.

Whoever said HL2 will run at 30 fps on medium settings with a 9800 Pro: you're a moron. Gabe has said a 9600 Pro in a 2 GHz machine with 512MB RAM will run 30 fps at max settings. So I think it's safe to assume a 9800 Pro can handle max settings too, hmm?
 
The 9800 will surely handle HL2 at high settings at 1024x768 at least. And well, 1024x768 is enough for me :)
 
Originally posted by Typhon
The FX cards will probably beat the Radeons in Doom 3, which runs under OpenGL. As for HL2, it looks like the Radeons will be the best.
Don't fixate on OpenGL. Tests with UT2k3 in OpenGL showed that the 9800 is 2 to 3 times faster than the 5900!
The 5900 pulled along at 50-60 fps. The 9800 ran at 150-170 fps :p
 
I would just get a Sapphire 9800 and flash it to a 9800 Pro.

It's up to you guys though. A Sapphire 9800 is around $240.
 
dawdler, testing UT2003 in OpenGL is bullshit anyway because the game is optimized for DX, not OpenGL.. FX5900 cards are a bit faster in OpenGL DX8 games, no need to say otherwise to trash them even more only because they are crap with DX9.. Since HL2 is a DX9 game it's better to get a ATi 9800pro..
 
Originally posted by Inflatable
dawdler, testing UT2003 in OpenGL is bullshit anyway because the game is optimized for DX, not OpenGL.. FX5900 cards are a bit faster in OpenGL DX8 games, no need to say otherwise to trash them even more only because they are crap with DX9.. Since HL2 is a DX9 game it's better to get a ATi 9800pro..
Of course, but the thing is, the Detonators are ACCLAIMED for their OpenGL performance. Always have been. ATI's drivers have been on the exact opposite side, with poor performance in OpenGL.
What does it matter if it's unoptimised? The 5900 should tear the 9800 a new... ATIhole :) But it doesn't. Not only does it not do it, it gets totally OWNED, with speeds equal to those of the GF4. It should have enough raw speed to beat that, don't you think? Odd indeed...

To recap in short:
Unoptimised and generally poor OpenGL drivers, poorly coded engine: 170 fps.
Highly optimised and acclaimed OpenGL drivers, albeit the same poorly coded engine: 60 fps.

Is it only me that sees the odd thing here? :dozey:
 
Detonator 5x.xx will be released on 02-10-2003... I'll wait and see if Nvidia can fix their DX9.0 problems...
 
2-3 times faster? I find that a little hard to believe.
 
Originally posted by Typhon
2-3 times faster? I find that a little hard to believe.

http://www.3dnews***/documents/5599/ut2k3-2.gif
Lowest res:
9800 Pro: 130fps
5900: 60fps

http://www.3dnews***/documents/5599/ut2k3-6.gif
Lowest res:
9800 Pro: 180fps
5900: 60fps

Just looking at it, I would say the 5900 is somehow locked at 60 fps at low resolutions. But then look at it at 1600x1200 - the 9800 Pro is still twice as fast :)
 
Ok, time for me to post a pet peeve of mine.

[rant]
DirectX (DX) and OpenGL (OGL) are MUTUALLY EXCLUSIVE. There is no such thing as an OpenGL DX8 game. THEY ARE MUTUALLY EXCLUSIVE. They are 3D APIs. You can use one, you can use the other, you cannot use both. Again, there is NO SUCH THING as an OpenGL DX game. Or, for that matter, a DX OpenGL game. Doom 3 is OpenGL, not DX9. Quake 3 is OpenGL, not OpenGL DX6. Serious Sam 2 is OpenGL, not OpenGL DX8. Seeing a pattern here?
[/rant]

phew, I feel better now
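The point above can be sketched in a few lines: an engine binds exactly one rendering API, and every draw call goes through it. This is a made-up toy sketch (the class and names are hypothetical, not from any real engine):

```python
from enum import Enum

class RenderApi(Enum):
    OPENGL = "OpenGL"
    DIRECT3D = "Direct3D"

class Renderer:
    """Hypothetical engine renderer: exactly one API is bound at startup."""

    def __init__(self, api: RenderApi):
        self.api = api  # chosen once; you can't mix draw calls from both

    def describe(self) -> str:
        # A "DX9 game" and an "OpenGL game" differ right here:
        # the whole scene is drawn through one API or the other.
        return f"All rendering goes through {self.api.value}"

doom3 = Renderer(RenderApi.OPENGL)   # Doom 3 is an OpenGL game
hl2 = Renderer(RenderApi.DIRECT3D)   # HL2 is a Direct3D (DX9) game
print(doom3.describe())
print(hl2.describe())
```

So "OpenGL DX8 game" is a category error: the renderer holds one API, never a blend of the two.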
 
Actually, Doom 3 will still require DX9 because it uses DirectDraw, just like every other Windows game (I think).
 
I don't think I have ever seen a benchmark with more than a 5 fps difference between a 128MB and a 256MB video card... it's usually like 1 or 2 fps between them, and not always in favor of the 256MB card.

If you have seen one feel free to link to it.

OpenGL is often chosen because it is multi-platform. Microsoft doesn't force you to use any part of DirectX. In fact, it wouldn't be a good choice to code a game to use a combination of DirectDraw (the 2D component of DirectX) with OpenGL, because it would make the game harder to port to other operating systems and you would have access to fewer features than with full DX9. If you want to make a game just for Windows, you are better off going full DirectX because it can do more.
 
Originally posted by mrchimp
Actually, Doom 3 will still require DX9 because it uses DirectDraw, just like every other Windows game (I think).

While I'm not sure either way whether that's the case, a game requiring DX9 to be installed does not make it a DX9 game, since it doesn't necessarily use any part of the DX9-specific API. DirectDraw has been around since the first incarnation of DX.
 
Hey guys, I got a Gigabyte Radeon 9800 PRO 128MB, and this card is by God awfully fast and powerful. If you're looking to buy a Radeon 9800 PRO 128MB instead of the 256MB, then go with Gigabyte. I run everything at 1280x1024, full details and settings maxed, and I never drop below 40-50 FPS with the rig in my sig.
 
Originally posted by Forbidden Donut
Ok, time for me to post a pet peeve of mine.

[rant]
DirectX (DX) and OpenGL (OGL) are MUTUALLY EXCLUSIVE. There is no such thing as an OpenGL DX8 game. THEY ARE MUTUALLY EXCLUSIVE. They are 3D APIs. You can use one, you can use the other, you cannot use both. Again, there is NO SUCH THING as an OpenGL DX game. Or, for that matter, a DX OpenGL game. Doom 3 is OpenGL, not DX9. Quake 3 is OpenGL, not OpenGL DX6. Serious Sam 2 is OpenGL, not OpenGL DX8. Seeing a pattern here?
[/rant]

phew, I feel better now


DirectX isn't a 3D API. DirectX is a COLLECTION of APIs. There is a 3D API included in DirectX; however, you can use OpenGL with the DirectX networking API fex.
 
DirectDraw is updated every time they release a new version of DX, and I would assume Doom 3 uses the latest version.

It doesn't really matter whether a feature is specific to an API or not; different APIs use different function calls for the same purpose. In other words, the DX9 renderer (you can't have two renderers active in the same scene, as Forbidden said) couldn't use OpenGL function calls even if they had the same parameters and did exactly the same job as a DX9 function. A programmer would have to change the names of the OpenGL function call(s) to their DX9 partner(s) and do some other stuff ;)

I don't know why I wrote that
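The renaming point can be sketched as a thin porting layer: one engine-level operation maps to a differently named, differently shaped call in each API. The backend functions below are stand-ins I made up to keep the sketch runnable; the real entry points would be `glClear` on the OpenGL side and `IDirect3DDevice9::Clear` on the D3D9 side.

```python
# Hypothetical stand-ins for the real API entry points. A real port
# would call glClear(mask) on one side and IDirect3DDevice9::Clear(...)
# on the other, with completely different parameter conventions.
calls_made = []

def fake_glClear(mask):
    calls_made.append(("glClear", mask))

def fake_d3d_Clear(count, rects, flags, color, z, stencil):
    calls_made.append(("IDirect3DDevice9::Clear", flags))

def clear_screen(backend):
    """One engine-level operation, two differently shaped API calls."""
    if backend == "opengl":
        fake_glClear("GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT")
    elif backend == "d3d9":
        # Same job, different name, different parameter list:
        fake_d3d_Clear(0, None, "D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER",
                       0x00000000, 1.0, 0)
    else:
        raise ValueError("a renderer targets exactly one API")

clear_screen("opengl")
clear_screen("d3d9")
print([name for name, _ in calls_made])
```

Porting a renderer means rewriting every such call site (or routing them all through a wrapper like `clear_screen`), which is why you can't just swap APIs mid-scene.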
 
Originally posted by Ahnteis
DirectX isn't a 3D API. DirectX is a COLLECTION of APIs. There is a 3D API included in DirectX; however, you can use OpenGL with the DirectX networking API fex.

WTF is a DX networking fex???
 
It's something very, very evil that makes people with routers cry.
 