Valve and NVidia?

Originally posted by dawdler
For all those wondering about the GF4 Ti, that isn't an issue here. All the "optimisations" done by Nvidia only apply to the FX series. The questionable, the less questionable, the straight-out deceit, everything. ONLY the FX series. They aren't in effect for the GF4 and lower series. You actually get much higher quality images with a GF4 than with an FX super-duper-uber-cool 5900 Ultra.
You are exaggerating way, way, WAY too much. Did you actually read the article you previously linked to (3dcenter.org)? The article says that the latest NVidia Det52 beta drivers have hardly any optimizations. The authors of that article could only notice the differences in image quality by using a special tool. There's no way the human eye would notice any loss in image quality during actual gameplay.

Furthermore, there's another article on 3dcenter.org which reveals how ATI's drivers quietly switch to a fake, downgraded version of trilinear filtering when anisotropic filtering is on. Again, this is hardly noticeable with the naked eye, but it is a questionable optimization by ATI.
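For anyone wondering what that "downgraded trilinear" actually changes, here's a rough sketch of the idea in C. This is not ATI's actual driver code, just an illustration: the function names and the blend-band width are made up. Full trilinear blends the two nearest mipmap levels for every sample, while the reduced version takes a single cheaper bilinear sample for most LOD values and only blends inside a narrow band around the mip transition.

```c
#include <stdio.h>

/* Stand-in for a real bilinear texture fetch -- returns a dummy
   per-mip value so this sketch compiles and runs on its own. */
static float sample_bilinear(int mip, float u, float v)
{
    (void)u; (void)v;
    return (float)mip;   /* pretend each mip level has a distinct color */
}

/* Full trilinear: blend the two nearest mip levels for every sample. */
static float trilinear(float lod, float u, float v)
{
    int   mip  = (int)lod;
    float frac = lod - (float)mip;   /* position between mip and mip+1 */
    return (1.0f - frac) * sample_bilinear(mip, u, v)
         + frac * sample_bilinear(mip + 1, u, v);
}

/* Reduced ("brilinear") filtering: one cheap bilinear sample for most
   LOD values; blend only in a narrow band around the transition.
   The band width here is invented for illustration. */
static float reduced_trilinear(float lod, float u, float v)
{
    int   mip  = (int)lod;
    float frac = lod - (float)mip;
    const float band = 0.125f;

    if (frac < 0.5f - band) return sample_bilinear(mip, u, v);
    if (frac > 0.5f + band) return sample_bilinear(mip + 1, u, v);

    /* Remap frac across the band and blend only there. */
    float t = (frac - (0.5f - band)) / (2.0f * band);
    return (1.0f - t) * sample_bilinear(mip, u, v)
         + t * sample_bilinear(mip + 1, u, v);
}

int main(void)
{
    /* Sweep the LOD range: full trilinear changes smoothly, while the
       reduced version stays flat and then jumps across a narrow band. */
    for (float lod = 0.0f; lod < 1.0f; lod += 0.1f)
        printf("lod=%.1f  trilinear=%.2f  reduced=%.2f\n",
               lod, trilinear(lod, 0.0f, 0.0f),
               reduced_trilinear(lod, 0.0f, 0.0f));
    return 0;
}
```

Most pixels end up looking the same either way, which is exactly why you need a special tool (or a colored-mipmap test) to spot the difference.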
 
ATi > NVidia, need I say more?

I'm no "fanboy" (I hate that word) of ATi, I currently have a GF4 4200 and plan to use it to play HL2.
 
Originally posted by Arno
You are exaggerating way, way, WAY too much. Did you actually read the article you previously linked to (3dcenter.org)? The article says that the latest NVidia Det52 beta drivers have hardly any optimizations. The authors of that article could only notice the differences in image quality by using a special tool. There's no way the human eye would notice any loss in image quality during actual gameplay.



I've looked at the image comparisons for the new Nvidia drivers... I CAN notice differences when looking at pics. Would I notice them while playing a game? Not likely...

That doesn't mean they're not there.
 
Yeah, there are some slight differences noticeable when you analyse images directly. But Dawdler's statement that "a GeForce4 will produce much higher quality images than an FX" is either false or incredibly exaggerated. And that's my point.
 
3Dcenter is one of those sites I wouldn't trust for an impartial observation (they've always been on Nvidia's side no matter what). Just as I wouldn't trust the guys at Rage3D to give an Nvidia card a fair shot.

Bottom line here is that Nvidia's drivers significantly lower the overall image quality to eke out more performance. Compare a 9800 screen to a 5900 screen side by side at the same settings and the differences are obvious. Nvidia screwed the pooch with its latest cards. They need to suck it up and head back to the drawing board for next season instead of using the Detonator shovel to dig themselves in deeper.
 
my GeForce 4 Ti4200 w/AGP8x works pretty damned well @ 1024x768 res., with 2xAA etc.

i'm still going to end up buying a 9600XT, 9600pro, or 9700pro within the next 4 weeks. *thanks God for birthday cards with money in them* ;)
 
I think that Nvidia's hardware isn't the problem, it's their drivers. The FX was delayed and released pretty late, so ATI has had time to work out their drivers and get them squeaky clean. As soon as Nvidia can pull their heads out of their asses (no offence :cheers: ) and make kick-ass drivers, I can see them regaining their title as King of teh 1337 Graphics.

I'm just pissed at NVidia. I'm a fan and am willing to wait out the dry period of crappiness, unlike others...
 
Does anyone have any links to benchmarks for the Catalyst 3.8's?
 
no, which has nothing to do with this topic. start a new thread if u want those shaith quality ati drivers, lol (jk, ati drivers are teh ownage, i will admit that...)
 
Hallucinogen

Earlier I posted saying things u said were funny, and in that forum they were, but come on man, don't go low. Down syndrome is not funny. That's schoolyard stuff, are u at school?
 
Look, I'm only gonna say this once: ATi and Valve are like a couple. But the Nvidia 52.14 drivers own. Why? Because I have them and have used them to play many games, including the unspoken, lol. Of course they didn't want to use these drivers, 'cause there's hardly any diff between the FX and the Radeon. Like 5 fps, muhahaahha
 
A Radeon 128 MB better than a GeForce FX 128 MB? A 256 MB?

That's not what the benchmarks say!
 
The new 52.14 drivers are much better than either the 51.xx or the 45.xx series. The image quality issues are corrected from 51.xx, and a lot of speed has been eked out over the 45.xx drivers. We have actually been very impressed with the speed, image quality, and playability enhancements we have seen. As long as NVIDIA doesn't take a step backwards before the official 50 series drivers are released, we think everyone who owns a GeForce FX card will be very pleased with what they get.
http://www.anandtech.com/video/showdoc.html?i=1896&p=60

Regarding Valve's treatment of nVidia, you really do have to question Valve's motivation when they torpedo ATi's biggest competitor at a media event sponsored by ATi. You'd have to be naive to think Valve wasn't playing a game of "You scratch my back, I'll scratch yours."
 
Originally posted by ShaithEatery
I think that Nvidia's hardware isn't the problem, it's their drivers. The FX was delayed and released pretty late, so ATI has had time to work out their drivers and get them squeaky clean.

It's not a matter of the drivers. Here's a little history:

Nvidia has been making video cards for a long time. My first dedicated 3D accelerator was a Riva 128. Ever since the Riva series they've made their cards faster while adding a few key features here and there. The TNT series got 32-bit color, the original GeForce added T&L, the GeForce 2 picked up DDR memory and got faster still, and the GeForce 3 got pixel shaders and got even faster.

The thing is that all the designs from the earliest Rivas to the GeForce4 Ti were based on the same chip design. The Riva core was simply made smaller, reused, and added on to. That's key to how graphics card manufacturers manage to roll out some really complicated designs every six months: every card builds off of the previous design.

The problem is that eventually you hit a wall somewhere in the design that you can't work around. At that point, in order to build a next-gen chip, you need to scrap the old designs and start over again. That's also what drives a lot of companies belly up. 3DFX, for example, was working on its new "from scratch" design when they went bankrupt. And that is what Nvidia did with the GeForce FX.

The downside of a totally new design is that you never really know just how well it will perform until you get it done and into the wild. It's at that point that all the weaknesses of the new design get pointed out. Then you head back to the drawing board with your notes about what didn't work right in the old design, and you fix it for the next generation. Expect Nvidia's next card to be really nice, but the FX is never going to be what everybody wanted it to be.

And what about ATI? Don't they have to do chip redesigns? Yes they do. And they already have. The original Radeon core wasn't up to snuff when it came to adding features like cutting-edge pixel shaders, so they scrapped it and started fresh with the 8500. And the 8500 had its share of bad problems. Its memory controller was a huge issue that hampered performance. Guess where they fixed that? That's right, the 9700. They learned what made the 8500 good and what held it back, then they got rid of all the stuff that held it back, and that is why the 9700 was such a huge jump in performance over the previous cards.

Interesting side note to all of the redesign stuff: did you know that the GeForce FX doesn't have any native support for T&L or pixel shader versions 1.0-1.4? It uses its 2.0 pixel shaders to emulate the older-generation processes (and does it very well). And no ATI card since the 8500 supports T&L or pixel shaders 1.0-1.3 natively either. They use their 1.4 (in the case of the 8500) or 2.0 pixel shaders to emulate the older ones.

Sorry for the long post...:cheese:
 
Originally posted by Unnamed_Player
Sorry for the long post...:cheese:
No problem. That was a very interesting read. :thumbs:
 
Valve and nVidia?
Eww!
That's like Britney and Madonna.
Valve's Gabe was quite notorious for mentioning ATI a lot.
"You'd get better FPS with ATI"
"ATI"etc...
 
Originally posted by Tredoslop
Valve and nVidia?
Eww!
That's like Britney and Madonna.


Mmmmmmmm... sorry, I drifted off into daydream land there for a minute. I'm better now. :cheese:
 
Nice post, but I still think that the performance of my FX 5900 has the potential for great improvement once Nvidia gets some better drivers out.

And by the way, Voodoomacnine, where'd u get those drivers? I got some beta 50 drivers from a friend and couldn't get them to work. Link?
 
Originally posted by ShaithEatery
Nice post, but I still think that the performance of my FX 5900 has the potential for great improvement once Nvidia gets some better drivers out.

I'm sure there is some more potential there; all I'm saying is don't expect miracles, and expect to feel a bit miffed when Nvidia releases its cards next spring.

I speak from experience in that area as the owner of a Radeon 8500. ;(
 
Originally posted by Mr. Redundant
I don't even know why I bother posting sometimes :/
It's alright, I read it ;) and you are absolutely right!
 
Yea, it's gonna suck when they ditch the FXs (only cuz I got one when they first came out and shelled out 400 bux like an idiot). I can only hope they will catch up to ATI and give them a swift kick in the bouncies... :bounce:
 
I can only hope they will catch up to ATI and give them a swift kick in the bouncies...

LOL, no offense, but I've always found brand loyalty pathetic. nVidia doesn't care about you, so why should you care about them?

Surely, if they're not up to current standards it would make far more sense to buy from the company that is than to wait for nVidia to "catch up".

I don't get it...
 
Your problem. My answer to why I like NVidia would be the same as if u asked a Dallas Cowboys fan why he likes the Cowboys.
 
Just buy what you can afford that gives you the best bang for your buck. That's what I say. Screw "siding" with a company.

I also agree that people buying into this "our older cards don't cut it anymore, buy this new updated version" BS makes me sick. I'm still using a GF3 until I get my 9600XT, and it's doing just fine.

One thing though: this buddy-buddy thing that ATI and Valve have makes it pretty damn obvious that nVidia would get a big ol' **** you from Valve. Of course Valve and ATI have this partnership going... more Radeons sold to play HL2... both companies make more money that way. Now, I know that the Radeon cards, hardware-wise, are SLIGHTLY faster than nVidia cards, but it's so damn obvious that nVidia is gettin' the shaft that it ALSO makes me sick... for a couple-a reasons.
1. Valve's trying to be like: "Oh, we're just promoting the better card (and they are SLIGHTLY better, don't get me wrong)... we aren't leaving out nVidia." BULLSHIT.
2. Did I mention bullshit?

I say, cut the crap. I've seen this before, Valve/ATI. Keep your little buddy-buddy thing going... just stop hiding the big old dildo with the words "**** you" written on it that you're sticking up nVidia's ass :p
 
Originally posted by ShaithEatery
Your problem. My answer to why I like NVidia would be the same as if u asked a Dallas Cowboys fan why he likes the Cowboys.

That's called being a fan... AKA fanboy. Don't be offended, that's what it is.
 
In all fairness, the Radeons are more than slightly faster. It's quite significant, especially on more demanding software at high resolutions.
 