New 6800 problem peculiar to CS:S

mikey388

Hi, after my problems with a 6600GT I sent it back and replaced it with a standard 6800 (the link below shows the one):

http://www.dabs.com/uk/channels/components...?quicklinx=37XH

Out of the box it has run like a dream, and I'm now glad that my 6600GT failed, as this card was only £25 more.

However, I managed to unlock the extra pipelines and shader unit using RivaTuner, and my 3DMark score improved from 8600 to 9200. When I play Far Cry with the graphics set on high it works great and looks superb, but when I play Counter-Strike: Source I get a picture similar to a TV that isn't quite tuned in, as shown below:

[Screenshot: de_prodigy0000.jpg]


If I disable the extra pipelines and shader unit the problem goes away. I have tried Nvidia's 71.89 and XG[1].win2kxp76.45a-HD drivers, and both give the same problem.

Any ideas would be very much appreciated. If I have to leave the extra pipes turned off I won't be disappointed, as I'm made up with the card anyway, but the fact that unlocking them improved the 3DMark score gives me hope it could be a driver issue.

I have also tried turning the graphics options down to minimum, and the problem stays the same.

Thanks
Mike

PS: My system is:

ASRock K7VT6 mobo
AMD Athlon XP 3200+ (400MHz FSB)
1GB of PC3200 RAM
 
At first I was going to tell you that you OCed too high, but then I read your entire post and it seems pretty clear: the pixel shader pipeline that was locked is defective. The artifacting is the result of a die flaw. How pronounced the artifacting is depends on how the game uses the hardware, so your mileage may vary (i.e. Far Cry may show very small artifacts and HL2 large ones).

Just some inside info on this: NV and ATI make one chip at first for a new generation. e.g. NV launched the 6800 LE, 6800 [vanilla], 6800 GT, and 6800 Ultra, and they all had the same 222M-transistor GPU. But the chips were "different" in their retail implementations:

6800 LE : 8 PS units
6800 : 12 PS units
6800 GT : 16 PS units
6800 Ultra : 16 PS units (higher core/mem clocks)

So how do they choose which chips to put in which card? The defects, of course. The PS pipelines come in quads (groups of 4), so a GPU with 2 bad quads is made into an LE card, a GPU with 1 bad quad is made into a 6800 or a 6800 LE, and a GPU with no bad quads can be made into any of the four. The little sketch below models the idea.
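
Purely to illustrate the binning idea (this is my own sketch, not NV's actual test flow; the function name and SKU lists are made up for illustration):

[code]
# Toy model of the quad-binning described above: count the defective
# pixel-shader quads on an NV40 die (4 quads of 4 pipelines each) and
# list the retail SKUs it could ship as.
def bin_nv40(defective_quads):
    if defective_quads == 0:
        return ["6800 Ultra", "6800 GT", "6800", "6800 LE"]  # perfect die
    if defective_quads == 1:
        return ["6800", "6800 LE"]  # 12 working pipes
    if defective_quads == 2:
        return ["6800 LE"]  # 8 working pipes
    return []  # too damaged to sell

for bad in range(4):
    print(bad, "bad quad(s) ->", bin_nv40(bad) or "scrap")
[/code]

Unlocking the disabled quad with RivaTuner is a bet that it was fused off for market segmentation rather than defects; the artifacting says your card lost that bet.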

Of course they test the chips, and the ones that show the best promise for heat/voltage get made into the Ultra editions. This lets NV and ATI get better effective yields, and thus offer more parts and maximize their early investment (yields are poor on first runs).

So basically you got a card with a bad quad. Sorry dude, it happens.

Btw, the same principle applies (in a way) to CPUs. Basically a Winchester chip is a Winchester chip, and a Northwood is a Northwood. There are some process refinements along the way, but they are basically the same.

e.g. an AMD Athlon 64 3000+ 90nm is identical to a 3200+ 90nm and a 3500+ 90nm. The only difference is that the BETTER chips (i.e. the ones that can handle higher clock speeds) are binned into the premium slots and given a higher multiplier.

But AMD's 90nm chips are so good that 3000+ chips are routinely getting a 600MHz overclock (i.e. as fast, frequency-wise, as a 3800+).

Another example is the P4 2.4C, which routinely overclocked to 3.0GHz and above. I know people who got their P4 2.4C to 3.6GHz on stock cooling with no heat issues :shock: This was possible because the 2.4C was the "slowest" chip on the new process/design. Basically the P4 3.2GHz chips were the same as the 2.4s, just with different multipliers. The sketch below shows the multiplier maths.
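
Since the multiplier is locked, the maths behind those overclocks is just core clock = multiplier x bus clock. The stock multipliers and bus speeds below are from memory, so treat them as illustrative rather than authoritative:

[code]
# Core clock is multiplier x external bus clock.
def core_mhz(multiplier, bus_mhz):
    return multiplier * bus_mhz

# Athlon 64 90nm bins: same die, different multiplier.
print(core_mhz(9, 200))        # 3000+ -> 1800 MHz
print(core_mhz(11, 200))       # 3500+ -> 2200 MHz
print(core_mhz(9, 200) + 600)  # 600 MHz OC on a 3000+ -> 2400 MHz (3800+ speed)

# P4 2.4C: multiplier locked at 12, so the overclock comes from
# raising the 200 MHz bus (800 MT/s quad-pumped).
print(core_mhz(12, 200))  # stock -> 2400 MHz
print(core_mhz(12, 250))  # 250 MHz bus -> 3000 MHz
print(core_mhz(12, 300))  # 300 MHz bus -> 3600 MHz
[/code]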
 
Thanks for that. Like I said, I'm not too worried; the card does what it said it would, and anything else is a bonus. It's a big step up from my FX5200, and hopefully it will keep me going for a couple of years.
 