All Nvidia Users Read

As far as I can tell it's Valve's technology choices, not direct sabotage.
 
I would think this is simply because they optimized for ATI, not because Valve wanted to give us Nvidia users the shaft.

But only time will tell, and all the conjecture here doesn't prove or disprove anything. Regardless, it is a discussion board.
 
azz0r said:
As far as I can tell it's Valve's technology choices, not direct sabotage.
Right.
For example, D3 used the same path for ATI and Nvidia DX9 cards. They improved ATI cards' performance a bit by changing how it accessed the lookup tables. Of course, to go all the way, ATI would have needed to add UltraShadow 2 support to their hardware and improve their OpenGL drivers. ;)

You can help the FX cards a bit more with further optimization, but you can't change their CineFX engine, nor the fact that it does not execute operations in the standard order or fully support the DX9 standard. Remember, Nvidia left the group for a time when DX9 came out.

Valve spent much of the summer of 2003 optimizing for Nvidia FX cards, trying to improve performance. Not sure if it was fruitless or offered any improvement. I do know they came up with a fix for that AA/AF bug.
 
Am I the only one not getting this "brilliant marketing scam by ATI/Valve"? Okay, let's assume that Valve deliberately crippled FX cards. With what purpose? It's obvious something like that would have been ordered by ATI as part of the agreement, and I'm sure ATI is having one hell of a competition with X800 vs FX5900..... :rolleyes:

The real ATI competitors atm, the 6xxx series, work perfectly fine with HL2 and DX9. Why aren't those crippled, if it was a smart marketing scam to compete with Nvidia?

It doesn't make sense in the slightest; neither party gains jack shit.
 
umm, it doesn't work on my FX 5200, or I did it wrong....

- Run 3D-Analyze
- change to ATI Radeon
- force hook.dll
- force low precision pixel shaders
- dll path = half-life 2\bin
- exe = half-life 2\hl2.exe
- pre-run exe = steam.exe

then run... "create process failed"... then no more

Did I miss anything out?
Does Steam have to be closed or open for this to work?
Does it work on CSS?
 
I tried 3D-Analyze like he said and got an error, but when I started HL2 through Steam I got teensy artifacts on my 6800GT, like:

-the light cone from the flashlight has a sharp edge
-the flashlight looks jaggedy on doors (notice it's smooth on the wall)
-the water reflection "seethrough" renders weird

the last is how it looks normally
proof below
 
Valve did this on purpose. They had this plan set up even with their benchmarks way back last year: they used the crap Nvidia DX9 shaders instead of the good ones.
 
isis said:
umm, it doesn't work on my FX 5200, or I did it wrong....

- Run 3D-Analyze
- change to ATI Radeon
- force hook.dll
- force low precision pixel shaders
- dll path = half-life 2\bin
- exe = half-life 2\hl2.exe
- pre-run exe = steam.exe

then run... "create process failed"... then no more

Did I miss anything out?
Does Steam have to be closed or open for this to work?
Does it work on CSS?

Same problem here with my 5900XT. Seems it doesn't work.

Are there any other programs that can be used? RivaTuner etc.?
 
vertthrasher said:
Valve did this on purpose. They had this plan set up even with their benchmarks way back last year: they used the crap Nvidia DX9 shaders instead of the good ones.
Hate to tell ya, but the benchmarks last year were before they took bids from Nvidia and ATI on who to partner with. ATI won because they offered more money, plus the performance and the ease of programming for their cards worked out well.
 
NachoMan said:
Sorry, how does that suggest that i'm a fanboy exactly?

I guess I was jumping the gun a little in assuming that Valve hadn't optimised for Nvidia, but what with all the reports coming in of GeForce cards running slower and with poorer image quality, AND the fact that Valve endorsed ATI, it's hard not to jump to conclusions.

You are the one who called BloodMario a fanboy. The thread began with what appears to be an incorrect claim (that 16-bit and 24-bit precision are "indistinguishable") and a lot of people slamming Valve without even having proper benchmarks or screenshots. BloodMario questioned the conspiracy theory that Valve intentionally made FX cards run worse, noted that it is well known that FX cards stink at DX9/PS 2.0 (see my link), and even pointed out the conspiracy talk makes no sense because the 6xxx series runs very well.

BloodMario was correct in his statements, which were a tongue-in-cheek response to the previous posters.

Your response was to call him a fanboy and state, "it's a bit shocking that valve didn't even bother to do any basic optimisation for Nvidia cards", which is false. So he is a fanboy for being correct? The reports about the FX series being slower came out BEFORE the Shader Day event. The FX cards did poorly in 3DMark03's PS 2.0 tests. The link I put in my original post notes all the FX problems very well. Don't get defensive about the FX series--it was not one of nVidia's better products. It is a fact. Valve endorsed ATI because ATI follows DX more closely. Being defensive about a certain brand, like the FX series, is what most would call being a "fanboy". Now, you probably are not a fanboy, but it was not nice to call BloodMario one either.

And you have jumped the gun. Like I noted, the problems with the FX are not limited to HL2. The post with screenshots at 16-bit precision does show significant artifacting. All the previous posts about conspiracies against nVidia, how they lied, how they did not even bother to do basic optimisations, and so forth are fanboy attacks against Valve and HL2. I am not saying you have to like Valve or HL2, but before you guys jump all over them, at least get the facts first and test this stuff out. Calling people names because they disagree with rabid anti-Valve statements is fruitless.

In this case it seems the original poster is wrong (we will see with further testing), and all the comments cheering him on show how quickly people will complain without even testing something out first. This thread just goes to show there are a lot of people looking for any chance to slam Valve, whether it is legit or not. I guess it is popular these days to do that kind of thing :rolleyes: Valve is not perfect, but not jumping to conclusions and slamming them before getting results would be a fair request.

And that is why I turned your comment back on you... there are a lot of anti-Valve fanboys here who take every chance they can get to slam Valve--whether it is legit or not. Your comments to BloodMario were uncalled for, so I wanted you to think about how it would feel. Obviously you did not like it :) The lesson is, let's all talk nice and not call names :) And I do hope MORE of the sweet shaders can be enabled for FX users. I feel bad for you guys :( The game still rocks with DX8.1 though, so it is not too bad!
 
Goo3y said:
Same problem here with my 5900XT. Seems it doesn't work.

Are there any other programs that can be used? RivaTuner etc.?

It will tell you "create process failed", but it changed the values nonetheless. Just start HL2.
 
When I opened Half-Life 2 I tried to change the settings, but it didn't work and it ran like a dog.
 
Great post, Acert93. I can vouch for what you're saying, since I read numerous articles on the whole thing. But that was over a year ago... I thought we had gone over this?

Guess not. :dozey:
 
Worked perfectly for me on a GeForce FX 5900XT. I am getting DirectX 9 water reflections and better frame rates than before during Water Hazard. The game almost screeched to a halt during that level, but now it looks better than ever and runs great. I deleted my cfg file after running 3D-Analyze just in case. I also went into Steam and forced the game to run in DirectX 9. I seriously think Valve knew about this and chose to undermine NVidia cards. There should be a legitimate patch for the game from Valve. I feel like playing the whole game over now, and I haven't even finished the first time! GGGRRRRRR
 
BloodyMario said:
Oh noes! A great conspiracy is uncovered once again by the great Mr. Tinfoil Hat.

Get a clue, people. How is it Valve's fault that the FX series of cards just plain sucks? If Valve was really trying to undermine NVidia, wouldn't it make sense to make the GeForce 6 series run worse than ATI's latest offerings? But that's clearly not the case. So blame your own stupid ass for buying a card without checking the benchmarks.
obviously this guy is an ATI card owner, who's now pissed cause his crap card can't compete with teh 1337 nvidias...
 
I haven't found anything relating this to Nvidia's newest cards, but as far as I can tell they work just as well as the newest ATI cards. If that is the case, then I find it highly unlikely that Valve intentionally crippled FX cards. After all, if they wanted to cripple Nvidia cards, as so many people say, then why wouldn't they have done that to the newest Nvidia cards as well?
 
boy*N.F.i* said:
obviously this guy is an ATI card owner, who's now pissed cause his crap card can't compete with teh 1337 nvidias...

wow .. you've reached some new level of stupidity.
 
Acert93 said:
needlessly long post...

You make it sound like he posted an eloquent, well-thought-out response, when really he was just fanboy trolling...

Get a clue, people. How is it Valve's fault that the FX series of cards just plain sucks...
So blame your own stupid ass for buying a card without checking the benchmarks.

anyway, it's not important, so let's drop it :p
 
I tried it on my FX. Framerates seem to be way better in environments with fewer buildings and such; it helped a bit. But I wouldn't advise those of you with 6800GTs and the like to try it; I mean, you already get beyond-playable framerates IMHO, but whatever floats your boat.
 
You do the Steam thing when it is turned off (what I did).

Next, go into your Half-Life 2\bin folder and look for dxsupport.cfg.
Back up this file.
Find an ATI card similar to your own card (5700 -> 9600, 9800 Pro -> 5950).

Start at "VendorID" (in the ATI card's info) and drag-select down to the next }. Copy this and paste it over your own card's info.
 
NachoMan said:
You make it sound like he posted an eloquent, well-thought-out response, when really he was just fanboy trolling... anyway, it's not important, so let's drop it :p

No problem, Nacho :) Thanks for the smiley! Btw, can you edit the quotes in your last post for me? I did not find them in my posts and do not remember saying them. Thanks :)
 
Acert93 said:
The only fanboy I see is the one saying, "it's a bit shocking that valve didn't even bother to do any basic optimisation for Nvidia cards". The facts are they spent more time optimizing for nVidia cards than ATI cards.

The fact is, nVidia cards are slow at 32-bit precision and do not support 24-bit. That makes it hard to do a lot of DX9 effects. Without BENCHMARKS and side-by-side, full-size comparisons of the tweak versus standard DX9, we cannot be sure what performance or artifacting issues there are. A scaled-down screenshot may be OK, but how does it look in motion at full size next to a true DX9 setup?


WOW! Something we agree on! And of course, it must be said that Nvidia's drivers could solve the problem... Shader Day '03, anyone?
 
There is a large quality hit when you drop to fp16; a perfect example is how the windows and metal look in the game. There is a topic with screens somewhere on the steampowered forums.

Oh, and when you run fp16 shaders, you're no longer doing DX9.

The minimum requirement for DX9 is fp24, which the FX cards can't do, and they're way too slow to even use fp32.
 
Nvidia FX cards don't support 24-bit precision.

So the shaders on nvidia FX cards run at 16-bit instead, but are made to look 24-bit using the same method used to make 256-color images look like they have more colors: the artifacts seen on the nvidia FX cards are from DITHERING (http://www.webstyleguide.com/graphics/dither.html) the shader output to look more like 24-bit. I'd imagine this dithering causes a lot of overhead, since it has to be done on the fly because all the textures are already in 24-bit.

Forcing 16-bit disables the dithering... but also makes the shaders more pixelated.
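To get a feel for the precision gap being described, here is a rough, simplified sketch (my own illustration, not anything from Valve or 3D-Analyze): it quantizes a shader value to a given mantissa width, ignoring floating-point exponents entirely. fp16 carries roughly 10 mantissa bits versus fp24's roughly 16, so the same smooth gradient lands on much coarser steps at fp16 -- which is what the dithering tries to paper over.

```python
# Simplified model of shader precision banding (hypothetical illustration,
# not real GPU behavior: exponents and rounding modes are ignored).

def quantize(value: float, mantissa_bits: int) -> float:
    """Round a value in [0, 1) to the nearest representable step."""
    steps = 1 << mantissa_bits
    return round(value * steps) / steps

x = 0.123456789
fp16_approx = quantize(x, 10)   # ~fp16: steps of 1/1024
fp24_approx = quantize(x, 16)   # ~fp24: steps of 1/65536

print(fp16_approx, fp24_approx)
print(abs(x - fp16_approx) > abs(x - fp24_approx))  # fp16 error is larger
```

The point of the sketch is only that the quantization error at 10 mantissa bits is much larger than at 16, which on screen shows up as visible bands unless dithering hides them.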
 
jacen said:
I tested it on a GFFX 5200 and it almost doubled my framerate: from 5-6 fps @ DX9, everything high, to 10-13 @ DX9, everything high.

...

Yes.....A dx8 card running dx9 features....nice try...
 
DiSTuRbEd said:
Yes.....A dx8 card running dx9 features....nice try...

The 5200 is a DX9 capable card. Slow at it, but still a DX9 card.
 
Help, please:

I have an FX 5600 XT, but where the hell is the FORCE HOOK.DLL option in 3D-Analyze? I can't find it. And what file should I use at the beginning, Steam.exe or hl2.exe? I can't use both... help me, please. Valve is a little bastard for doing that, you know, changing the options to favour the ATI users. Please, help me. Thank you.
 
Acert93 said:
In this case it seems the original poster is wrong (we will see with further testing), and all the comments cheering him on show how quickly people will complain without even testing something out first.
I tried this method out on my computer and found it to degrade the frame/second. It would've been nice if it worked for me. But oh well.
As long as HL2 works well enough that I can get an enjoyable experience out of it, then I am happy... even with my subpar PCI 128 Nvidia card.
Now if only I could get my hands on one of those Wildcat 512MB 8x AGP video cards from 3DLabs. Or even the 640MB PCI Express card. Then I wouldn't have to worry about graphics for a couple years. :LOL:
 
Opheli@r said:
Help, please:

I have an FX 5600 XT, but where the hell is the FORCE HOOK.DLL option in 3D-Analyze? I can't find it. And what file should I use at the beginning, Steam.exe or hl2.exe? I can't use both... help me, please. Valve is a little bastard for doing that, you know, changing the options to favour the ATI users. Please, help me. Thank you.


Force hook.dll is in the lower left corner. The .exe for HL2 goes in the second field from the top, and in the third one from the top put steam.exe.
 
DrkBlueXG said:
Now if only I could get my hands on one of those Wildcat 512MB 8x AGP video cards from 3DLabs. Or even the 640MB PCI Express card. Then I wouldn't have to worry about graphics for a couple years. :LOL:
Those cards actually give poor gaming performance. Sorry :x
 
Help

Can someone help me here plz? Can anyone give a full detailed explanation of what to do exactly, so I can get on and play HL2? I already have 3D-Analyze, but I'm stuck on the ATI Radeon vendor ID and ATI Radeon device ID. I have an Nvidia GeForce FX 5900 Ultra. Can anyone help? THX
:D
 
OK, here is an easy explanation, but I still need some info as well. I got a GeForce FX 5900. Changing the DeviceID to that of a Radeon 9800 Pro creates artifacts, so I want to change it to a "normal" Radeon 9800. But how do I find out the DeviceID of the 9800 non-Pro?

Here for the explanation:

1. Start up 3DA.
2. At Nr.1 (it's the second button somehow), choose your hl2.exe. It should be in your Steam\Steamapps\"Username"\Half-Life 2 folder.
3. On Nr.2 go to the \bin folder in the Half-Life 2 folder and choose one of the files, which one doesn't matter.
4. On Nr.3 choose Steam.exe which should be in your Steam folder.
5. Check the boxes mentioned above. They are "Force low precision pixel shader", "performance mode" and "force hook.dll" (they are all there, just look for them, it can be quite hard ;)).
6. Change the VendorID and the DeviceID to an ATI card that is about the same as your Nvidia card.
7. Click "RUN" at Nr.4
8. Close the error message and start HL2.


So if anybody knows how to find out the DeviceIDs exactly, please let me know!
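One way to dig the IDs out yourself is to pull them straight from dxsupport.cfg. Here is a quick sketch of that (my own helper, not part of any tool; it assumes the file uses the quoted "key" "value" layout shown in the snippets posted later in this thread):

```python
# Sketch: list card names and device IDs from Valve's dxsupport.cfg.
# Assumes entries are quoted "key" "value" pairs, one per line, with
# each card entry starting at a "name" key; the real file may differ.
import re

def parse_cards(cfg_text: str) -> list:
    """Group "key" "value" pairs into one dict per card entry."""
    cards, current = [], {}
    for key, value in re.findall(r'"([^"]+)"\s+"([^"]+)"', cfg_text):
        if key == "name" and current:   # a new "name" starts a new card
            cards.append(current)
            current = {}
        current[key] = value
    if current:
        cards.append(current)
    return cards

sample = '''
"name" "ATI Radeon 9550 (RV350LX)"
"VendorID" "0x1002"
"MinDeviceID" "0x4153"
"MaxDeviceID" "0x4153"
'''

for card in parse_cards(sample):
    print(card["name"], card.get("MinDeviceID"))
```

Point it at the real file (read it with `open(...).read()` from your half-life 2\bin folder) and you get every card name next to its device ID range, which should answer the 9800 non-Pro question directly.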
 
thx Braska, but yeah I'm stuck with the device ID, although I'm guessing the vendor ID is the same as the 2 ATI graphics cards shown there:

vendor ID = 4098

I'm only guessing, but it makes sense. I would still like the device ID though. Anyways, thx for the explanation.
:D
 
Now I'm stuck again. When I look in hl2\bin it says there aren't any files there, but if I go through My Computer there are. Ummmm. This is gonna take a long time, I feel. Anyways, gtg to bed. Sorry for the double post. I'll see what happens tomorrow. :dork:
 
"name" "ATI Radeon 9550 (RV350LX)"
"VendorID" "0x1002"
"MinDeviceID" "0x4153"
"MaxDeviceID" "0x4153"
"m_nDriverVersion_Build" "6240"
"DefaultRes" "800"
"CentroidHack" "1"

do I paste the vendor ID into the program?
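For what it's worth, the "4098" posted above and the "0x1002" in the cfg appear to be the same vendor ID: the cfg stores it in hexadecimal, and the decimal form is what was quoted earlier (which suggests 3D-Analyze's fields take decimal, though I'm inferring that from the thread, not from its docs). A quick conversion check, just plain Python:

```python
# dxsupport.cfg stores IDs in hex; the thread quotes them in decimal.
vendor_hex = "0x1002"   # ATI's vendor ID, as it appears in the cfg
device_hex = "0x4153"   # Radeon 9550 device ID, from the cfg excerpt above

vendor_dec = int(vendor_hex, 16)
device_dec = int(device_hex, 16)

print(vendor_dec)  # 4098 -- matches the value posted earlier
print(device_dec)  # 16723
```

So if a field rejects the 0x... form, try the decimal number instead.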
 
All the GFX card Vendor IDs are listed in the following file:

Code:
C:\Program Files\Valve\Steam\SteamApps\USERNAME\half-life 2\bin\dxsupport.cfg

Open in Notepad.

Hope this helps.
 
Is there any point in doing this on my GeForce Ti 4600 128MB?
 
Back
Top