Possible solution for anti-aliasing on nVidia video cards

Axyon

ntfs.org has posted an article claiming that there may be a solution to the anti-aliasing problem in Source for nVidia cards. The current Detonator drivers have this problem, but the next line of drivers won't:

"A friend at NVIDIA has let me know that while the current release of drivers, referred to as Rel40 do have this problem, the Rel50 based drivers will not. The drivers, which will be more commonly known as Detonator 5.x will hopefully be available by the time Half-life 2 begins shipping."

While this can only be regarded as a rumour for now, it at least restores some hope for nVidia card owners.

View the article here.
 
I hope this isn't just something that got blown out of proportion. It certainly never seemed like Valve meant to publicly announce these problems in order to gain any sort of good or bad press: they were just answering a fan honestly about the technical issues they were currently facing.

I mean, can you imagine if every bug or issue in a game was released to the public and became a big controversy? :)

"John Carmack announces that the current build of D3 crashes when clients join MP!!!!!"

"Texture errors in Stalker!!!!!"

"As of this time, Far Cry has yet to get MP working without crashes!!!!"
 
Indeed. If anything, it's made Valve look like ATi fanboys, which is the last thing they need when they're trying to sell a game to a broad variety of people. But hey, at least it's getting halflife2.net some publicity, can't be bad, eh? ;)
 
If there's any truth to this and fixing the problem on nVidia cards is as simple as installing a new driver, it does cause one to wonder why Gabe Newell has been so adamant that there was no hope for GeForce owners. And if he's really been working closely with the video card manufacturers, why didn't he know about this, and why didn't nVidia say something? I'm sure they're none too happy that it is being widely reported that their cards won't work 100% with the biggest game of the decade.

Or perhaps this is just an unverifiable "friend of a friend" type rumor that sounds just promising enough to get people to believe it.
 
Usually whenever you hear "driver problem" and "will be fixed" from nVidia, it means they are going to do something very bad.
 
cause one to wonder why Gabe Newell has been so adamant that there was no hope for GeForce owners.

Gabe Newell hasn't.
 
I think Valve had nothing to do with the rumors (at least not the company and G. Newell), and it was a bunch of ATI fanboy sites (*cough* X-bit Labs) that blew it out of proportion.
If someone at Valve had made a comment, then that would just be speculation, as I doubt they would have access to the heavily guarded secrets of nVidia or ATI.
 
Well, as long as the problem gets sorted out, I think the damage won't be so bad for nVidia. Almost all of the people that knew of the problem are people that get their news on the internet. Now that this has been released, those same people will get the news, and nVidia should probably be all right.
 
Well, if it wasn't Gabe Newell himself, here is the exact quote from the Valve development team (probably Gabe):

1) Is this a problem that can be fixed with new drivers, or would we have to buy a whole new card to rectify it? If so, are there any cards on the horizon that would offer it?

Valve Answer: Drivers aren't likely to fix the problem, with the exception of the ATI 9500-9800. There's hope there for being able to use FSAA properly. You are out of luck on NVidia unless either NVidia or us come up with some clever way of solving this problem.

2) Is this a problem unique to hardware + Source?

Valve Answer: It's a problem for any app that packs small textures into larger textures. The small textures will bleed into each other if you have multisample FSAA enabled. The best thing to do right now is either buy an ATI card in the hopes that it will be solved there, or wait until the next generation of cards come out.
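For readers wondering what "small textures bleeding into each other" actually looks like, here's a minimal sketch of the atlas-packing problem the answer describes (all names and numbers are invented for illustration, not Valve's code): once several small textures share one big texture, a sample taken just past a sub-texture's edge lands in its neighbour, and clamping the coordinate back inside (inset by half a texel) avoids it.

```python
# Two 4x4 "textures" packed side by side into one 8x4 atlas.
ATLAS_W, ATLAS_H = 8, 4
# Left half of the atlas is texture A (value 0), right half texture B (value 1).
atlas = [[0 if x < 4 else 1 for x in range(ATLAS_W)] for _ in range(ATLAS_H)]

def sample(u, v):
    """Point-sample the atlas at normalised coordinates (u, v)."""
    x = min(int(u * ATLAS_W), ATLAS_W - 1)
    y = min(int(v * ATLAS_H), ATLAS_H - 1)
    return atlas[y][x]

# Sub-texture A occupies u in [0.0, 0.5]. A multisample position generated
# slightly outside the triangle can land at u just past 0.5...
bleed = sample(0.51, 0.5)   # fetches a texel of texture B: the "bleeding"

def clamp_sample(u, v, u_min, u_max):
    """A shader-side fix: clamp u into the sub-texture's safe range,
    inset by half a texel, before fetching."""
    half_texel = 0.5 / ATLAS_W
    u = max(u_min + half_texel, min(u, u_max - half_texel))
    return sample(u, v)

fixed = clamp_sample(0.51, 0.5, 0.0, 0.5)   # pulled back into texture A
```

Running this, `bleed` reads the neighbouring texture while `fixed` stays inside the intended one, which is the whole problem in miniature.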
 
Originally posted by JackiePrice
Well, if it wasn't Gabe Newell himself, here is the exact quote from the Valve development team (probably Gabe):

1) Is this a problem that can be fixed with new drivers, or would we have to buy a whole new card to rectify it? If so, are there any cards on the horizon that would offer it?

Valve Answer: Drivers aren't likely to fix the problem, with the exception of the ATI 9500-9800. There's hope there for being able to use FSAA properly. You are out of luck on NVidia unless either NVidia or us come up with some clever way of solving this problem.

2) Is this a problem unique to hardware + Source?

Valve Answer: It's a problem for any app that packs small textures into larger textures. The small textures will bleed into each other if you have multisample FSAA enabled. The best thing to do right now is either buy an ATI card in the hopes that it will be solved there, or wait until the next generation of cards come out.

This sounds like a publicity stunt. He keeps saying that ATI cards are the only ones that'll work fully and that you should go out and buy one. He even says that it's "the best thing to do"! This is total BS! Plus, is there really that huge of a difference between cards? They're both DX9 cards and they only have a few hardware differences (stock clock speed, number of pipelines). Just another reason I'll never buy from ATI.
 
They're both DX9 cards and they only have a few hardware differences (stock clock speed, number of pipelines). Just another reason I'll never buy from ATI.

Yeah: they're exactly the same: just another reason why I'll never buy one of them!

The reality is, it wasn't any sort of publicity stunt. It wasn't publicly released info, just a tech guy explaining their current hurdles to ONE fan who had asked about AA. And the fact is, everything he said is true: ATI really does support centroid sampling in hardware, and NVIDIA really doesn't. The fix really was what they said it was, and thankfully they really did find a way to borrow shader memory to make it work on NVIDIA cards at a slight performance hit. All in all, a happy ending.
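A rough illustration of why centroid sampling avoids the bleed (a hedged sketch; the sample positions and numbers are invented for illustration): with multisample AA, texture coordinates are normally interpolated at the pixel centre, which for a partially covered pixel can fall outside the triangle and thus outside the packed sub-texture. Centroid sampling instead evaluates at the centroid of only the *covered* sample positions, which always stays inside.

```python
# One pixel whose 4x multisample positions span u in [0.49, 0.55] along one
# axis; the triangle (and its sub-texture) only covers u < 0.5.
SUBSAMPLES = [0.49, 0.51, 0.53, 0.55]   # hypothetical sample positions
TRIANGLE_EDGE = 0.50                    # sub-texture ends here

covered = [u for u in SUBSAMPLES if u < TRIANGLE_EDGE]

# Standard interpolation: evaluate at the pixel centre, which here lies
# past the triangle's edge, so the fetch bleeds into the neighbour.
pixel_centre_u = sum(SUBSAMPLES) / len(SUBSAMPLES)

# Centroid sampling: evaluate at the centroid of the covered samples only,
# which by construction is inside the triangle.
centroid_u = sum(covered) / len(covered)
```

The centre drifts past the edge while the centroid cannot, which is why hardware support for centroid sampling sidesteps the whole problem without any shader-side clamping.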
 
Hmmm...

They're both DX 9 cards and they only have a few hardware differences (stock clock speed, amount of pipelines). Just another reason I'll never buy from ATI.

Boy... that's good logic. :D

So, if you never looked at an ATI compared to a GF... you might be susceptible to this type of logic. SEEING the difference for yourself will pretty much change your mind, unless you're just opinionated.

Personally, I think the best people at nVidia were driven away after the acquisition of 3dfx. Those people ended up working for ATi, and ATi ended up developing the best hardware ever. Basically, the engineers are the video card... not the name.

So, the way I see it, I will stay with the best hardware, regardless of what brand name is on it. Right now, the best hardware is made by ATi, and it's no surprise that the Valve techs recognize that as well as I do.
 
Well, what I see about this whole video card problem is this: Tom's Hardware Guide got a benchmarking build of Doom 3, and the FX 5900 kind of spanked the 9800. At 1600x1200 with no AA turned on, it beat it by about 10-20 fps on medium quality consistently, and it kept beating it even with FSAA turned on. Here's the link, decide for yourselves really:

http://www.tomshardware.com/graphic/20030512/geforce_fx_5900-10.html#doom_iii_special_preview

there yah go

laters
austin
 
Oh yeah, one other thing... id is kind of using nVidia as their main card...

laters
austin
 
I thought ATi was sponsoring DooM³?

By what we have seen, the 5900 Ultra performs better in DooM³ than the 9800 Pro does. I think the same'll go for Half-Life², since I think the DooM engine is a lot heavier.

;)
 
So... what would you guys recommend to a person/gamer who is going to buy a new-generation graphics card? What brand and what model?
 
FSAA Already fixed

In a recent chat interview that was on planethalflife.com, they said that the FSAA problem was fixed on both nVidia and ATI cards. I can't find the chat log now, but it should be on planethalflife.com; it was an interview between the developers and people from fan sites and game magazines.
 
Re: FSAA Already fixed

Originally posted by nico187
In a recent chat interview that was on planethalflife.com, they said that the FSAA problem was fixed on both nVidia and ATI cards. I can't find the chat log now, but it should be on planethalflife.com; it was an interview between the developers and people from fan sites and game magazines.
Yes, but it doesn't matter until we find out how... If it was "clamped with shaders" (no idea what it means, hehe), it means even poorer performance on the FX...
I just saw a guy posting an awesome TRAOD comparison with the Cg shaders installed on a 5900 Ultra; it was like 35 fps using max settings (like PS2.0) and 34 fps using Cg... "W00T!" you think, nearly no performance loss! Didn't make much image difference (none that I could find, and I really tried), and then a guy posted the same screen for his 9800 Pro. At 85 fps :D
(when using Cg it went down to 54 fps! Talk about Nvidia favored :dozey: )
 
*Correction: 82 fps to 58 fps. Darn limit for editing :)
 
Uhm, wait, I just saw on Rage3D that they apparently use different methods: shader clamping on nVidia and centroid sampling on ATI.
If it's true, that is, but it was a GabeMail (TM).

And I want to complain on the darn forum, I want my edit!
 
Re: Re: FSAA Already fixed

Originally posted by dawdler
Yes, but it doesn't matter until we find out how... If it was "clamped with shaders" (no idea what it means, hehe), it means even poorer performance on the FX...

No flame, but... maybe if you don't know what "clamping using shaders" means, you should not say something about its performance impact. I personally think that shader clamping (IF they do it that way - they never said that!) is a really fast process and should not take more than 1% off the performance.
Back on to the question of which card to buy - if you follow recent benchmarks, you can see that the 5900 and 9800 cards are nearly the same speed. That is not the case in that Doom 3 test, but if you read John Carmack's latest .plan you'll see that that outcome has more to do with internal accuracy (inside the graphics chip) and can be tweaked further.
I would personally prefer the 5900 over the 9800, but that has more to do with the quality of the drivers. ATI's drivers are getting better though, so it is really up to you which card to choose.
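For what it's worth, the low-cost claim above is plausible: if "clamping using shaders" just means pinning the texture coordinate into the sub-texture's safe range before the fetch, it amounts to a min and a max per coordinate. A hedged sketch (the function and the numbers are invented for illustration):

```python
def clamp_uv(u, lo, hi):
    """Clamp one texture coordinate into [lo, hi]. On GPU hardware this is a
    single min and a single max per coordinate, hence the expectation that
    the performance cost is tiny."""
    return max(lo, min(u, hi))

# A sub-texture occupying u in [0.0, 0.5] in an 8-texel-wide atlas,
# inset by half a texel (1/16) on each side:
safe_lo, safe_hi = 0.0 + 1/16, 0.5 - 1/16

inside = clamp_uv(0.25, safe_lo, safe_hi)    # already safe: unchanged
outside = clamp_uv(0.51, safe_lo, safe_hi)   # pulled back to the safe edge
```

Whether the drivers or the engine actually do it this way was never confirmed in the thread; this just shows why two extra ALU operations per fetch would be cheap.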
 
Strange things with Nvidia GeForce FX5600 Ultra.

I just bought a GeForce FX5600 Ultra w/128 megs RAM to replace a GeForce4 Ti4400 w/128 megs RAM. I moved the Ti4400 to another computer. The main differences I've seen have been minimal.

I have been noticing some strange "slide-show like" performance in certain game scenarios/mods. Playing BF1942 and its mods, most recently the Secret Weapons add-on demo, I am playing on servers with 25-30ms pings, and in-game I'm constantly stuttering. This is with everything turned on and at 1152x864 resolution. While I do use a 17-inch LCD monitor, it is NOT a ghosting problem.

Now, if you believe 3dMark03, there are some MAJOR differences in the "Fill Rate (Multi-Texturing)" and both "VGA Memory and Core Clock" speeds.

I honestly believe the memory and core clock speeds are bogus or artificial in some respect, however, I DID notice an improvement in how the tests ran/perform.

Click here to see my results. BTW, the differences between result2 and result3 were attained by "tweaking" the memory and core-clock values from within the drivers.

I bought the FX5600 because it was "affordable" and I don't see the FX5900 Ultras or the ATI 9800s dropping in price anytime soon. It seems like the price has been consistently going up. Has anyone else noticed this? Also, does it seem like nVidia is playing PR games on their website and in the stores? You can buy the FX5200/5600/5900 Ultras w/128 megs of RAM, but in order to get 256 megs of RAM it's not an Ultra.

Here is an example of what I'm talking about. On that page, the title of the page is simply "GeForce FX 5600" and yet in the specs section Nvidia refers to the Ultra for the specs and lists the memory at 256.

Regardless of what's going on, I have to say I am still somewhat biased towards Nvidia because their driver support has always been phenomenal. Say what you will about Nvidia versus ATI, but ATI (until recently) had very poor customer/driver support. I'm not ATI-bashing, just stating facts. I honestly believe both companies make excellent products, but I'm old enough to remember MANY people, myself included, suffering in games because of the ATI drivers. I have an ATI Rage Pro 3d that I played on for years.

Anywho, if anyone has any suggestions on why I'm slide-showing in BF1942, I'm all ears and eyes. I hope the end result of all this "gossip" to date is some kind of fix for the FSAA issue, regardless of whether it's driver-based or DirectX.
 
Re: Re: Re: FSAA Already fixed

Originally posted by Tropics
back on to the question which card to buy - if you follow recent benchmarks, you can see that the 5900 and 9800 cards are nearly same speed.
Hrm? Since when was the question which card to buy? This is not an nVidia VS ATI thread. And you didn't follow my [very recent] post about the 5900U getting 35 fps in TRAOD and the 9800 Pro getting 82 fps in the same scene. With poor math you can get them to nearly equal speeds; sadly my brain isn't that poor. I go by games, not benchmarks. Every single one that plays and has access to a 5900 and a 9800 says the 9800 rulez it completely, in every game (I read nvnews, rage3d and beyond3d mostly). That is GAMING, not benchmarking games.

Say what you will about Nvidia versus ATI, but ATI (until recently) had very poor customer/driver support
Even if ATI had had good driver support for a decade, those with a GeForce would say the exact same line.
Concerning the slideshow: Be sure no programs are running in the background. Be sure vertical sync is off (you might be on the lower end of the half). If neither is the case... SUCKY DRIVERS!!!! :p


Edit: A guy on one of the forums (in conjunction with the Cg and DX9 performance issue, I think) made plans for a full DX9 test, using only these games and benchmarks. Maybe we'll see it soon :)
 
Thanks, WhoCares?


for your benchmarks. I'm happy that there is little difference between the GF4 and GF5600.
Thanks
 
I hope my GeForce3 will at least be suitable to play (I don't care about jaggies on the G-Man or anything trivial like that in gameplay).
But if it totally ruins colors and makes everything look retarded and colors get intermixed, then I'll be a very sad camper without enough money for a new card.
 
Originally posted by RakuraiTenjin
I hope my GeForce3 will at least be suitable to play (I don't care about jaggies on the G-Man or anything trivial like that in gameplay).
But if it totally ruins colors and makes everything look retarded and colors get intermixed, then I'll be a very sad camper without enough money for a new card.

The game's designed to also run on cards two generations behind yours, so I'm sure they've optimized it to work fine. Sure, you'll have to ease off on AA and AF somewhat, but as you said, you don't really mind.
 
I'm buying a Gainward GeForce FX 5600 (Gainward are the best card manufacturers IMO) with 256MB of RAM... and I'm glad all this nonsense about AA is over... personally AA doesn't matter much to me, I still play old games, but I want a new graphics card, and I personally don't like ATi (I tend to dislike sellouts), so it's a mid-range GeForce for me :)
 
Originally posted by Abom|nation
The game's designed to also run on cards two generations behind yours, so I'm sure they've optimized it to work fine. Sure, you'll have to ease off on AA and AF somewhat, but as you said, you don't really mind.
Ok, that's awesome then :)

It just worried me when I read that blurb about color bleeding on the nVidia cards and such; I thought of some really bad color warping/malformed things. I should be fine :)
 
OK guys, wake up... LOADS AND LOOOOOOOOOOAAAADS of PC owners have a GF card in their puter, and I don't think Valve will release HL2 just for the ATI card holders... I mean, think about it, how much money would they lose? ... Well, I guess this is a rumor started by some kid who holds an ATI card, just like when I was 12 and hated Nintendo, and well, I guess most of you gamers can relate... It's just not likely that HL2 won't work with nVidia cards... but whateva, I'm waiting for Call of Duty...
 
I own a GF4 Ti 4200, and I have seen the latest Radeon cards since I work with testing hardware and such things, and I haven't seen any card that works better or gives better framerates than the nVidia. It also depends on the system specs; the graphics card does not make the computer. I mean, you must all be real l4m4s or just stupid to think that... The real difference is the CPU and the amount of RAM your system has, and also what type of mainboard you are using - a lot of things. Now plz let's put this nVidia vs ATI bulls**t down; they are both GREAT cards and they solved the problem, OK?



:cheers:

Originally posted by BBA
Hmmm...



Boy... that's good logic. :D

So, if you never looked at an ATI compared to a GF... you might be susceptible to this type of logic. SEEING the difference for yourself will pretty much change your mind, unless you're just opinionated.

Personally, I think the best people at nVidia were driven away after the acquisition of 3dfx. Those people ended up working for ATi, and ATi ended up developing the best hardware ever. Basically, the engineers are the video card... not the name.

So, the way I see it, I will stay with the best hardware, regardless of what brand name is on it. Right now, the best hardware is made by ATi, and it's no surprise that the Valve techs recognize that as well as I do.
 
Yes, in fact they are both good video card makers, but what if they aren't lying about the problem with nVidia's cards???
 