Will nVidia's cheating influence your video card decision?


Xtasy0

Guest
Well, will it? They've been found to be cheating in 3DMark03 (and in a few other things, like the game timedemos used for reviews). Does this sway your confidence in nVidia? Does it make you wonder why, when 3DMark03 was updated, nVidia suddenly lost 29% of their performance while ATI only lost a mere 1% or 2%?

Or why the visual quality in games such as Splinter Cell (and others) is noticeably lowered by nVidia's drivers (no matter what graphical settings you use) to maintain an (obviously false) performance lead.

Just some things to think about.

These things will definitely be influencing my decision.

What are your thoughts?
 
The fact that nVidia purposely reduced image quality to gain FPS over the Radeon cards in benchmarking and testing has put me off them for a long time. That, and the 9800 is a much more solid card.
 
No, it will not, because after I saw how shitty the 5800 was I decided that ATI was the way to go for now. Things like this change; we will see next year, when PCI Express is up and running, who is better: ATI or nVidia, Intel or AMD.

I already got a 9500; I'll be unlocking it soon, when I get the rest of my rig.
 
I care about the performance you get for the price. I don't care if it's ATI or nVidia; they both rule :D
 
I care about DX9 features at a very low cost. That is why the FX 5200 is my dad's next card of choice.
 
I think the fact that nVidia cheated proves how committed they are to giving people what they want. They're always trying, even if it's through unethical means, to give the people what they want. Even if they couldn't give us speeds faster than a 9800 Pro, they surely tried to convince us they could, and that makes me proud. I wish ATI would've cheated too, but they didn't. So my choice: the FX 5900 Ultra!
 
Originally posted by deepers
I think the fact that nVidia cheated proves how committed they are to giving people what they want. They're always trying, even if it's through unethical means, to give the people what they want. Even if they couldn't give us speeds faster than a 9800 Pro, they surely tried to convince us they could, and that makes me proud. I wish ATI would've cheated too, but they didn't. So my choice: the FX 5900 Ultra!

Wow. Just wow. So you're proud of nVidia because they're cutting the image quality on their cards? You're actually saying you condone this kind of act?

...
 
Originally posted by jhero
I care about DX9 features at a very low cost. That is why the FX 5200 is my dad's next card of choice.

Good luck getting any DX9 features to run on that card...
 
Originally posted by jhero
I care about DX9 features at a very low cost. That is why the FX 5200 is my dad's next card of choice.


The FX 5200 is a really, really bad card. Do NOT buy it!
It's too weak to even use any DX9 features, and in some tests it's beaten by the GF4 MX.
 
Originally posted by deepers
I think the fact that nVidia cheated proves how committed they are to giving people what they want. They're always trying, even if it's through unethical means, to give the people what they want. Even if they couldn't give us speeds faster than a 9800 Pro, they surely tried to convince us they could, and that makes me proud. I wish ATI would've cheated too, but they didn't. So my choice: the FX 5900 Ultra!



Aaaaaahhahahahahahaha! Mwaaaaahahahahaha!
 
nVidia tried to shut VisionTek down with the help of BFG, which is made up of former VisionTek employees. VisionTek switched to ATI, and I followed them. I won't switch to nVidia unless they become the "performance underdogs" that we see so much of today (AMD, ATI, VIA).
 
Yeah, not only did they lie and use tweaked drivers, but all the benchmarks they did are at like 1600x1200... the point is that the 5900 Ultra performs better at higher res, but a 9800 Pro would be a bit better at, say, 1024x768. And besides, no one would play a game at 1600x1200; it's totally ****ing stupid.

I'm getting a 9800 Pro.
 
Originally posted by [Hunter]Ridic
Yeah, not only did they lie and use tweaked drivers, but all the benchmarks they did are at like 1600x1200... the point is that the 5900 Ultra performs better at higher res, but a 9800 Pro would be a bit better at, say, 1024x768. And besides, no one would play a game at 1600x1200; it's totally ****ing stupid.

I'm getting a 9800 Pro.

Lots of people play games at 1600x1200, and the 9700/9800 Pros perform better at that res as well :)

Plus, with ATI you can play at a nice high res and enable AA and AF.
 
Hrm... let's try to be informed about this... ATi was on the development team that created 3DMark03, which is why ALL of the tests favor them. nVidia making drivers to take advantage of the tests isn't as bad as what ATi did. If you look at identical tests between 3DMark03 and 3DMark01 SE, the difference between the two companies drops dramatically.
 
Uhhh... identical tests between 3DMark03 and 3DMark01 SE? Is that possible? In-game tests have shown that nVidia is strong in some cases and ATI is strong in others. If you want to be informed: nVidia is consistently dishonest about its products (paper launches, cheats, etc.), whereas ATI may be dishonest but isn't caught as much. And they've vowed to take all "optimizations" out of their drivers.
 
Originally posted by SidewinderX143
Hrm... let's try to be informed about this... ATi was on the development team that created 3DMark03, which is why ALL of the tests favor them. nVidia making drivers to take advantage of the tests isn't as bad as what ATi did. If you look at identical tests between 3DMark03 and 3DMark01 SE, the difference between the two companies drops dramatically.

nVidia was on the development team as well, but they got mad at Futuremark because they didn't have any DX9 products out and 3DMark03 relies heavily on DX9 features, which would give nVidia cards a lower score since they can't complete some of the tests. So they left, and their DX9 products still suck at DX9.
 
ATI was not on the development team; they were in the beta program. nVidia was also in the beta for a time, but they left.

nVidia's reason for dropping out of the beta was the high cost of being a part of it.

The tests don't favor any hardware, Sidewinder (or that's what Futuremark says); if you have proof otherwise, please present it.

nVidia's drivers were also made to lower visual quality in games to achieve higher performance, as well as to detect shaders and replace them with their own, so the card didn't have to render everything the way any other card would.

Futuremark found close to 10 separate cheats in nVidia's drivers and released a lengthy paper on the whole thing.

I suggest you check it out.

[edit]

Here is a link to Futuremark's report on the cheating:

http://www.futuremark.com/companyinfo/3dmark03_audit_report.pdf
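
For anyone unclear on what "detecting shaders and replacing them" means in practice, here is a toy sketch (plain C++ with made-up shader names, not real driver code) of the basic trick: fingerprint the shader source an application submits and, if it matches a known benchmark shader, silently hand back a cheaper replacement instead.

Code:
#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>

// FNV-1a hash: a cheap fingerprint of the shader source text.
static uint64_t fingerprint(const std::string& src) {
    uint64_t h = 1469598103934665603ull;
    for (unsigned char c : src) { h ^= c; h *= 1099511628211ull; }
    return h;
}

// "Compile" a shader, substituting a cheaper version if it is recognized.
std::string compile_shader(const std::string& src) {
    // Fingerprints of shaders the "driver" recognizes (hypothetical names),
    // each mapped to a hand-written replacement that does less work.
    static const std::unordered_map<uint64_t, std::string> replacements = {
        { fingerprint("benchmark_water_shader_v1"), "cheap_water_approximation" },
    };
    auto it = replacements.find(fingerprint(src));
    if (it != replacements.end())
        return it->second;   // silent substitution: the "cheat"
    return src;              // anything unrecognized is compiled as submitted
}

int main() {
    std::cout << compile_shader("benchmark_water_shader_v1") << '\n'; // swapped
    std::cout << compile_shader("some_game_shader") << '\n';          // untouched
}

This is also why renaming a benchmark's executable or tweaking its scenes (as Futuremark did in the 330 patch) can make this kind of "optimization" fall apart: the fingerprints no longer match.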
 
Though it is fun to argue about, I don't think it matters all that much. Both cards are great; nobody would be disappointed with either one of them.
 
Originally posted by [Hunter]Ridic
Though it is fun to argue about, I don't think it matters all that much. Both cards are great; nobody would be disappointed with either one of them.

Well, I dunno. If I bought a 5900 Ultra because its 3DMark score was 2000 points better than a 9800's, but then when I played games the performance was much lower than the 9800's, I'd be pretty disappointed.
 
I sit corrected. ATi was on the beta team, not the development team.

...nVidia's drivers were also made to lower visual quality in games to achieve higher performance...

Isn't that the purpose of graphics drivers? Aren't they designed to optimize the performance of the card?

...as well as to detect shaders and replace them with their own, so the card didn't have to render everything the way any other card would...

Same reason. If the card doesn't handle certain shaders well, shouldn't the drivers be programmed to make up for that deficiency? Would you want to buy a card with non-functioning shaders? That nVidia made routines to get around that just means they are trying to make the card perform better, which is the whole idea behind drivers.

And I'm not saying all of this because I'm an nVidia fanboy; I'm currently looking at buying a 9500 card to replace my GF3.
 
Originally posted by SidewinderX143
I sit corrected. ATi was on the beta team, not the development team.

...nVidia's drivers were also made to lower visual quality in games to achieve higher performance...

Isn't that the purpose of graphics drivers? Aren't they designed to optimize the performance of the card?

...as well as to detect shaders and replace them with their own, so the card didn't have to render everything the way any other card would...

Same reason. If the card doesn't handle certain shaders well, shouldn't the drivers be programmed to make up for that deficiency? Would you want to buy a card with non-functioning shaders? That nVidia made routines to get around that just means they are trying to make the card perform better, which is the whole idea behind drivers.

And I'm not saying all of this because I'm an nVidia fanboy; I'm currently looking at buying a 9500 card to replace my GF3.



Well, here is what John Carmack has to say regarding shaders:

Rewriting shaders behind an application's back in a way that changes the output under non-controlled circumstances is absolutely, positively wrong and indefensible.

Rewriting a shader so that it does exactly the same thing, but in a more efficient way, is generally acceptable compiler optimization, but there is a range of defensibility from completely generic instruction scheduling that helps almost everyone, to exact shader comparisons that only help one specific application. Full shader comparisons are morally grungy, but not deeply evil.

The significant issue that clouds current ATI / Nvidia comparisons is fragment shader precision. Nvidia can work at 12 bit integer, 16 bit float, and 32 bit float. ATI works only at 24 bit float. There isn't actually a mode where they can be exactly compared. DX9 and ARB_fragment_program assume 32 bit float operation, and ATI just converts everything to 24 bit. For just about any given set of operations, the Nvidia card operating at 16 bit float will be faster than the ATI, while the Nvidia operating at 32 bit float will be slower. When DOOM runs the NV30 specific fragment shader, it is faster than the ATI, while if they both run the ARB2 shader, the ATI is faster.

When the output goes to a normal 32 bit framebuffer, as all current tests do, it is possible for Nvidia to analyze data flow from textures, constants, and attributes, and change many 32 bit operations to 16 or even 12 bit operations with absolutely no loss of quality or functionality. This is completely acceptable, and will benefit all applications, but will almost certainly induce hard to find bugs in the shader compiler. You can really go overboard with this -- if you wanted every last possible precision savings, you would need to examine texture dimensions and track vertex buffer data ranges for each shader binding. That would be a really poor architectural decision, but benchmark pressure pushes vendors to such lengths if they avoid outright cheating. If really aggressive compiler optimizations are implemented, I hope they include a hint or pragma for "debug mode" that skips all the optimizations.


The difference is that nVidia is modifying the shaders so that they have worse-looking output. It's not what the game designer intended; it's nVidia telling you that in order to have the performance of a 9800 Pro you need to lower the details and make the game look ugly.

I don't see how you can possibly say that is beneficial at all. And yes, the drivers are supposed to make games run better, but not by degrading their visual quality.
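
To make Carmack's precision point concrete, here's a small, purely illustrative C++ sketch (this is not how real hardware works; it just zeroes mantissa bits to mimic lower-precision fragment math) showing how error accumulates when the same arithmetic runs at reduced precision:

Code:
#include <cstdint>
#include <cstring>
#include <iostream>

// Zero the low mantissa bits of a 32-bit float, keeping `keep_bits` of the 23.
// (Real FP16 also has a smaller exponent range; this toy ignores that.)
float truncate_mantissa(float x, int keep_bits) {
    uint32_t u;
    std::memcpy(&u, &x, sizeof u);          // reinterpret the float's bits
    u &= ~((1u << (23 - keep_bits)) - 1u);  // clear the discarded mantissa bits
    std::memcpy(&x, &u, sizeof x);
    return x;
}

int main() {
    // Accumulate a small value many times, as a long pixel shader might.
    float full = 0.0f, low = 0.0f;
    for (int i = 0; i < 10000; ++i) {
        full += 0.1234567f;                              // full 23-bit mantissa
        low = truncate_mantissa(low + 0.1234567f, 10);   // FP16-like 10-bit mantissa
    }
    std::cout << "full precision sum: " << full << '\n';
    std::cout << "truncated sum:      " << low << '\n';
    // The truncated sum drifts noticeably: precision traded away for speed.
}

When the lost bits never reach the framebuffer the trade is free, which is Carmack's "completely acceptable" case; when they do, you get exactly the degraded output people are complaining about.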
 
Originally posted by SidewinderX143

Isn't that the purpose of graphics drivers? Aren't they designed to optimize the performance of the card?

Same reason. If the card doesn't handle certain shaders well, shouldn't the drivers be programmed to make up for that deficiency? Would you want to buy a card with non-functioning shaders? That nVidia made routines to get around that just means they are trying to make the card perform better, which is the whole idea behind drivers.

And I'm not saying all of this because I'm an nVidia fanboy; I'm currently looking at buying a 9500 card to replace my GF3.

1. It must be the user's choice to lower quality to gain performance. nVidia didn't lower quality because they thought people's computers weren't able to handle the graphics; they did it because their card was being beaten by the competition, and in a benchmark too, which makes it MUCH worse, and even more useless.

2. The card handles the shaders fine; everything was rendered as it should be. nVidia changed it because the card wasn't performing on par with the competition, so they lowered the quality without the developer's consent, just so they could have a higher score. That is NOT a very nice thing to do, since it automatically renders the benchmark useless: one card is doing more work than the other, and at a different quality.
 
Nobody needs a 9800 Pro or a 5900 Ultra. Honestly, you should never have to spend over $300 on a video card. I would say get a Radeon 9700 (non-Pro) for a little over $200. Then again, I'm a cheap ass.
 
Originally posted by reever2
That is NOT a very nice thing to do, since it automatically renders the benchmark useless: one card is doing more work than the other, and at a different quality.

Isn't that the idea of a benchmark: to see which card does more work, better?

Also, this thread was created to see if nVidia's drivers made you rethink your card choice. Obviously, for you guys, it did. I am just saying these things to present the other side of the argument. Sure, nVidia could have better image quality, and I'm sure they will address these issues. This is the whole reason to have two companies out here...
 
Originally posted by SidewinderX143
Isn't that the idea of a benchmark: to see which card does more work, better?

The work they do has to be the same; you can't cut corners when the benchmark is telling you to do something. You must do it exactly as the benchmark specifies. The developers of the benchmark dictate how the benchmark is run and to what extent, not a card's specific drivers.
 