[H] + ATi respond to Tom's and nVidia's accusations


Pr()ZaC

Guest
I hope people will stop reading such a biased site when they need a good review...

ATi's part:
ATI Responds to Cheating:
At the recent editors' day held by NVIDIA, the company informed the media of instances where ATI was "cheating" in two games and a benchmark based on a game engine. Here is what ATI has to say.


We would like to respond to the recent allegations of benchmark cheating levied against ATI.

We take these allegations very seriously and are distressed by the implication that we are cheating our customers. At ATI we have taken a strong stance against application specific optimizations in benchmarks. Further, we never force optimizations that deliver lower image quality than the game developer has intended.

Three applications have been specifically questioned: UT2003, Halo and AquaMark3.

UT2003: We are working with Epic to address a bug that has a slight impact on the image quality in the game and benchmark in certain situations. This is a known issue that has been recognized by both ATI and Epic as a bug. ATI is not cheating in any way in this application.

Halo: Although allegations of cheating have been made, no one has been able to find any example of decreased image quality in the game. In fact, many examples have been documented on the web where ATI cards and drivers are rendering a superior image to competing products. Again, ATI is not cheating in any way in this application.

AquaMark3: We are currently investigating our rendering in AquaMark3. We have identified that we are rendering an image that is slightly different from the reference rasterizer's, but at this point in time we are unable to identify why that is. We believe that this does not have any impact on our performance. Our investigation will continue to identify the cause and resolve it as soon as possible. One point to note is that we render the same image using our latest driver (CATALYST 3.8) as we do with a driver that pre-dates the release of AquaMark3 by almost six months (CATALYST 3.2). Also, in all of our dealings with the developer of AquaMark3, at no point have they advised us that they are unsatisfied with the images that we are rendering. We do not have any application-specific optimizations in our driver and we are not cheating in this application.

If you have further questions about the way RADEON graphics cards render in these applications, we encourage you to contact the games' developers directly.

We are disappointed that certain media outlets chose to perpetuate the allegations, made originally by a competitor, in their articles without an understanding of what was happening and without contacting us for comment.

We are committed to delivering the best gaming experience possible, without sacrificing image quality for increased performance in benchmarks.

[H]'s part:
If you go back and look at our recent GeForceFX 5950 Ultra Preview, you will notice that we covered the two games that are actually referred to above. As for the benchmark put out by the NVIDIA "TWIWMTBP" partner, we find little reason to evaluate it, as we do not use it and you cannot play it. After evaluating multiple screenshots of each game and hours of in-game evaluation, we saw no differences in UT2K3 that were worth mentioning. We did find that ATI had a better overall image quality in Halo. So if ATI is cheating in Halo, NVIDIA is just doing a bad job rendering the scene. You take your pick of which you would rather have.

NVIDIA has been caught red-handed cheating this year. They have admitted this to me in private meetings. Before they stand in front of us and accuse their competition, they owe you and me a public apology. This is not the first time we have said this. Quite frankly, I have seen NVIDIA do some pretty questionable things in the past but this all takes the cake on the hypocrisy-meter.

It is pretty sad when the only way to advance your product is to tear down your competition's. NVIDIA's product is sub-standard when compared to ATI's in the DX9 gaming arena. We know it, ATI knows it, and NVIDIA damn sure knows it. NVIDIA needs to concentrate its resources on making a product better than the competition's instead of spreading what we see as FUD to the press, who will inevitably regurgitate it to you guys.

NVIDIA should be ashamed of themselves. I know we are ashamed of them.

http://www.hardocp.com/
http://www.rage3d.com/#1067372704
 
Originally posted by Pr()ZaC
I hope people will stop reading such a biased site when they need a good review...
IMO, this just shows that [H]ardOCP is even more biased than Tomshardware. They show blind faith in a statement from ATI and simply lie about Nvidia.

The fact that Halo looks better on a Radeon 9800 has nothing to do with Nvidia cheating; the Radeon simply uses a better FSAA algorithm. But [H]ardOCP wants its readers to believe that this difference is actually caused by Nvidia deliberately cheating. Nvidia does cheat with its trilinear filter, but this has virtually no effect on the image quality.
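
(Side note for the curious: here's a toy sketch of why one FSAA pattern can be "better" than another at the same sample count — a rotated grid yields more distinct coverage shades on near-vertical edges than an ordered grid. The patterns and angle below are assumptions for illustration, not any vendor's actual sample layout.)

```python
import math

# Toy comparison of 4x FSAA sample patterns: an axis-aligned ordered grid
# vs. the same grid rotated ~26.6 degrees. Illustrative only.
ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
theta = math.radians(26.6)
rotated = [((x - 0.5) * math.cos(theta) - (y - 0.5) * math.sin(theta) + 0.5,
            (x - 0.5) * math.sin(theta) + (y - 0.5) * math.cos(theta) + 0.5)
           for x, y in ordered]

def coverage(samples, edge_x):
    """Fraction of samples on the left side of a vertical edge at x = edge_x."""
    return sum(1 for x, _ in samples if x < edge_x) / len(samples)

def coverage_levels(samples):
    """Distinct shades produced as a vertical edge sweeps across the pixel."""
    return sorted({coverage(samples, i / 16) for i in range(17)})

print("ordered grid:", coverage_levels(ordered))  # 3 shades: 0, 0.5, 1
print("rotated grid:", coverage_levels(rotated))  # 5 shades: smoother edge
```

More shades along an edge means a finer gradient, which is why a rotated pattern can look noticeably smoother on the worst-case near-vertical and near-horizontal edges.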

You can read more about the image quality differences here.
Conclusion from the article:
There are a few points to take away from our little venture into the world of image quality. Most notably, NVIDIA is in a much better position today than it was several months ago, though that much was probably apparent from our GeForce FX 5950 review.

At the end of the day, we’d still conclude that ATI’s anti-aliasing is superior to NVIDIA’s, a situation that is amplified when you consider game play. There’s no way you’ll be able to run around in Unreal Tournament 2003 at 1600x1200 with 8x anti-aliasing on the GeForce FX 5950. Even 4x AA is pushing it when you’re employing such a high resolution.

That said, in certain games, ATI texture quality has dipped below what we witnessed in our first image quality piece. Whether this is a result of changes made between CATALYST 3.6 and 3.8, we’ll be exploring in the very near future.
 
Heh, they aren't more biased...
...unless you are biased like Tom.

There was no mention of Nvidia cheating in Halo. They just said it may not do as good a job rendering the scene.
 
Re: Re: [H] + ATi respond to Tom's and nVidia's accusations

Originally posted by Arno

...The fact that Halo looks better on a Radeon 9800 has nothing to do with Nvidia cheating; the Radeon simply uses a better FSAA algorithm...

FSAA doesn't even work in Halo...
 
Re: Re: Re: [H] + ATi respond to Tom's and nVidia's accusations

Originally posted by Sideshow
FSAA doesn't even work in Halo...
I checked the article again and you're right, there is no FSAA in Halo. They're talking about some flashlights that are incorrectly displayed on Nvidia cards. Wow, what a "bad job" from Nvidia. As if there are drivers in existence without bugs.

What gets me the most is this:
Quite frankly, I have seen NVIDIA do some pretty questionable things in the past but this all takes the cake on the hypocrisy-meter.
If you turn the tables, ATI's past is also questionable and they still have questionable things in their drivers, yet they felt it necessary to report that Nvidia is cheating. I fail to see how ATI is any less hypocritical than Nvidia. [H]ardOCP only looks at one side of the story.
 
[H]ardOCP used to be the other way around, always on the Nvidia side, but I guess they could not defend them any longer :)

Nvidia cheating with its drivers is no big news. But the bad thing, as you said, is that whenever someone brings up an issue with ATI's drivers, cheat or not, everybody just says that Nvidia is worse.

Nvidia's great cheating campaign is making it easy for ATI to "cheat" just a little; if someone questions it, they just point at Nvidia.
 
[H] WAS nVidia-biased, then something happened and *poof* [H] stopped defending nVidia's cheats.
ATi knows how a cheat can damage a company's reputation; that's why there are no cheat optimizations in the pictures presented. The glitches you see there, as explained, are driver bugs that will be fixed, and they DON'T modify the overall speed in any way.

You should read everything again, carefully, if you think that Tom's gives fair product reviews.
 
I'm not defending Tomshardware, as it is quite obvious to me that their reviews are biased towards Nvidia. But you can't blame Tomshardware for being biased while at the same time presenting [H] as an alternative. I don't know what's got into the reviewers at [H], but the pure hatred they show towards Nvidia in their news updates really makes you wonder how objective their reviews are.
 
[H] isn't biased toward ATI or against Nvidia, I don't think.
Rather, they try to write in a tone that will appeal to their readers, who happen to be hardcore gamers. Right now Nvidia isn't the best choice for those gamers, as they want a solid DX9 solution.
That's why.
If [H] wrote about gfx cards for an audience that works mainly with 2D graphics/DX programming/OpenGL industry programs, then right now they would use different benchmarks and write in a good tone about Nvidia, because that's what would appeal to their readers.

They used to be "go Nvidia" because they actually thought that way, rather than for their readers. They changed about when IQ started to become part of the benchmarks, I think.
 
Where is all this "biased" bullsh*t coming from?... All I get from those reviews is passion, which IMO is a good way of reporting and interesting to read.

If nVidia produced a kick-ass card, [H] would be shouting how great it was... they just speak the truth with passion. All you nVidia fanboys should stop being so damn protective; you are being a little too sensitive. Hell, I used to own a GeForce 2, and it was an amazing card at the time... things change, and that's why [H]'s opinions have. There isn't a law saying "once a fanboy, always a fanboy".
 
Originally posted by pHATE1982
Where is all this "biased" bullsh*t coming from?... All I get from those reviews is passion, which IMO is a good way of reporting and interesting to read.
IMO, passion is for commercials. I don't want to have to filter out all this "passion" (read: fanboyism) to get some useful hardware info from an article.

Originally posted by pHATE1982
If nVidia produced a kick-ass card, [H] would be shouting how great it was...
They would probably try to mention the advantages of the card as briefly as possible, followed by an in-depth analysis of any shortcomings the card might have.

Originally posted by pHATE1982
All you nVidia fanboys should stop being so damn protective; you are being a little too sensitive.
Nvidia fixed all of the image quality issues (except for one invisible issue) and yet [H] still demands a public apology? In fact, they are ashamed of Nvidia??? Wow, those people are sensitive!
 
They didn't solve the pseudo-trilinear issues, and there's something fishy going on in the AM3 renders.
 
Hehe, Arno is even more fanatical about defending Nvidia than I am about defending ATI. :)

"IMO, this just shows that [H]ardOCP is even more biased then Tomshardware. They show blind faith in a statement from ATI and simply lie about Nvidia."

This basically shows you haven't followed much of the Nvidia-vs-ATI battle. As the others said, [H]ardOCP was HEAVILY Nvidia-biased once. Of course, not even he (and not Tom's either) denied the fact that the R300 was an awesome card, but the two sites have defended Nvidia to the death (fun note: Tom's often has "errors" in its articles... oddly enough, they all make the GeForce look good and the Radeon look bad. Every time). You should have seen [H]'s old articles about it. He put blind faith in Nvidia, saying everything was bugs and nothing was a cheat or optimisation, and that the image quality was fine (when it was proven otherwise over and over again). Then he actually started to come to his senses when Nvidia failed to listen to anything the community put against them. And now this.

Besides, we don't even have to take it on blind faith. Elite Bastards did an article (independent of ATI) to address the accusations from Tom's. They came to the same conclusions.
 
Originally posted by Pr()ZaC
They didn't solve the pseudo-trilinear issues, and there's something fishy going on in the AM3 renders.
The pseudo-trilinear issue is the invisible issue I was talking about. So far I haven't seen a single website that could point out any visual difference between this pseudo-trilinear filtering and normal trilinear filtering. And yes, Nvidia does render a certain AquaMark3 scene differently than ATI, but this is not necessarily Nvidia's fault. In fact, ATI's statement in the first post acknowledges that the way ATI renders it is slightly off when compared to a reference rasterizer.

Originally posted by dawdler
Hehe, Arno is even more fanatical about defending Nvidia than I am about defending ATI.
Wow, did I go that far? :cheese:
Seriously, I'm well aware that the 9800XT is the best card out there right now. I wouldn't recommend anyone buy a GeForceFX 5950. The only GeForceFX card that might be considered a good buy is the FX5700. Amazingly enough, that's what [H] concluded in their FX5700 review. They seem to be more reasonable in their reviews than in their news updates. I can't comment on [H]'s past as I only recently started reading the website.

I'm defending Nvidia here because I've been a happy Nvidia customer since the TNT2, and that rant on [H]'s news page seems to be written by somebody who forgot to take his medication.

[H] wrote:
So if ATI is cheating in Halo, NVIDIA is just doing a bad job rendering the scene. You take your pick of which you would rather have.
Uhmmm..... I think I'll pick option #3: asking both companies to fix their issues.
 
And ATI most likely will, but I doubt Nvidia will increase their rendering quality for the scene. :)
 
Originally posted by Arno
Wow, did I go that far? :cheese:
Seriously, I'm well aware that the 9800XT is the best card out there right now. I wouldn't recommend anyone buy a GeForceFX 5950. The only GeForceFX card that might be considered a good buy is the FX5700. Amazingly enough, that's what [H] concluded in their FX5700 review. They seem to be more reasonable in their reviews than in their news updates. I can't comment on [H]'s past as I only recently started reading the website.
Yes, you did go that far, but if you don't know its past, it's understandable :)

And of course, the FX5700 is disputed. How about:

"So who should consider this card as an upgrade? Put up against the 9600 XT I just can’t recommend anyone to buy the 5700 Ultra"

"any 5700 Ultra user wanting to buy TRAOD would be sorely disappointed gaming would be choppy, even at 1024x768. "

"If the conclusion/review stopped here we would be very satisfied with the card, unfortunately we have the issue of AA/AF performance on the 5700 Ultra."

"cards are almost frame for frame in 0xAA 0xAF but 2xAA and 2xAF creates a gulf in performance, this is something Nvidia need to address"

From a single conclusion on DriverHeaven. Want to play a game? Max Payne 2 at high quality (2xAA/AF, 10x7): 84.9 fps on the 9600XT, 56.2 fps on the 5700.
TRAOD? 40.3 fps on the 9600XT (no AA/AF, 10x7), 25.2 fps on the 5700. How on EARTH can anyone come to the conclusion it's a good card against the ATI competition???
 
Originally posted by dawdler
How on EARTH can anyone come to the conclusion it's a good card against the ATI competition???
The FX5700 has better OpenGL performance than the 9600XT, which matters to crazy people like me, who play RTCW and Enemy Territory 90% of the time.
The comment about FX5700 users being disappointed while playing TRAOD is rather funny, as TRAOD is a disappointing game no matter how you look at it.
And I don't trust the UT2K3 benchmarks, as DriverHeaven.net didn't mention whether they used application settings or control panel settings.
The FX5700 scores in Max Payne 2 are disappointing, though.
 
The 5700 is the strongest FX relative to the card it is up against (5700 vs 9600XT, 5950 vs 9800XT), but that is probably why they felt they could move up from the 5600 name. :)
From what I saw in the [H] review, the FX series seems to hold good FPS for a while, but it can dip and make some steep changes in FPS pretty quickly.
ATI's cards seemed to hold steady at whatever FPS, but seem to have some pretty good highs. Even when ATI's did drop, it did so more slowly than the FX. Holding a steady FPS in-game is good... having your FPS jump all over makes it seem laggy. ;)
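
(To put some made-up numbers on why steady FPS feels better: two cards can average the exact same FPS while one of them stutters. A minimal sketch; the frame-time logs below are invented for illustration.)

```python
# Made-up frame-time logs (milliseconds per frame) for two hypothetical
# cards: same average fps, very different smoothness.
steady = [20, 21, 20, 19, 20, 21, 20, 19]   # flat pacing
spiky  = [12, 12, 45, 12, 12, 45, 12, 10]   # same total time, big dips

def avg_fps(frame_times_ms):
    """Average fps = frames rendered / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def worst_fps(frame_times_ms):
    """fps implied by the single slowest frame (the perceived 'lag spike')."""
    return 1000.0 / max(frame_times_ms)

for name, log in (("steady", steady), ("spiky", spiky)):
    print(f"{name}: avg {avg_fps(log):.1f} fps, worst frame {worst_fps(log):.1f} fps")
# Both average 50 fps, but the spiky card dips to ~22 fps on its worst
# frames -- that's the jumpiness that feels laggy in-game.
```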
 
Originally posted by Arno
The FX5700 has better OpenGL performance than the 9600XT, which matters to crazy people like me, who play RTCW and Enemy Territory 90% of the time.
The comment about FX5700 users being disappointed while playing TRAOD is rather funny, as TRAOD is a disappointing game no matter how you look at it.
And I don't trust the UT2K3 benchmarks, as DriverHeaven.net didn't mention whether they used application settings or control panel settings.
The FX5700 scores in Max Payne 2 are disappointing, though.
True, the FX is slightly better than the XT in OpenGL. But both cards are very, very playable. Can you say the same about DX9 games? Can you say the same when you start using FSAA and AF? The OpenGL gain is so small it's insignificant. Unless you intend to use the card for hardcore OpenGL modelling and rendering, in which case it's far too slow; you wouldn't use an FX at all.

About TRAOD: it is not your place to say that; only the buyer decides. But it is a game nonetheless. I have actually played it, and if you get past the damn annoying beginning (which nearly no one has), it's a pretty nice game (I can't say it's good, but it's better than poor at least).

Concerning the application settings, does it matter? If they used the control panel, both would have roughly the same IQ. If they used application settings, the FX would get an unfair advantage.

The scores in MP2 are not just disappointing, they're yet another sign that the FX series is a very poor choice for future gaming. The game is BLAZING on my 9700 Pro at high FSAA/AF. What if it had been "average" (around 60 fps)? The FX would choke and probably explode.
 
Originally posted by Arno
The pseudo-trilinear issue is the invisible issue I was talking about. So far I haven't seen a single website that could point out any visual difference between this pseudo-trilinear filtering and normal trilinear filtering.

Let me show you then:
[Thumbnails: screenshot09_tn.jpg, screenshot10_tn.jpg, screenshot11_tn.jpg, screenshot12_tn.jpg]


So nVidia blatantly cheated: they gained speed but lost some of the trilinear filtering. Can you see the difference with the pseudo-trilinear now? :)
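
(For anyone wondering what pseudo-trilinear actually does, here's a minimal sketch of the idea as I understand it: full trilinear blends two mipmap levels across the whole transition, while the shortcut only blends in a narrow band and falls back to cheaper bilinear everywhere else. The blend function and the "band" knob are illustrative assumptions, not the actual driver code.)

```python
def trilinear_weight(lod):
    """Full trilinear: the blend factor between mip N and mip N+1
    ramps linearly across the entire transition."""
    return lod - int(lod)  # fractional part of the level-of-detail

def brilinear_weight(lod, band=0.25):
    """Pseudo-trilinear ('brilinear'): blend only inside a narrow band
    around the mip transition; elsewhere, snap to cheaper bilinear.
    'band' is a made-up tuning knob, not a real driver setting."""
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f < lo:
        return 0.0           # pure bilinear on mip N
    if f > hi:
        return 1.0           # pure bilinear on mip N+1
    return (f - lo) / band   # short blend ramp in between

# Blending two mips costs roughly twice the texel fetches of one, so the
# narrower the band, the bigger the speedup -- and the more visible the
# abrupt mip transitions in screenshots like the ones above.
for lod in (1.1, 1.4, 1.5, 1.6, 1.9):
    print(f"lod {lod}: trilinear {trilinear_weight(lod):.2f}, "
          f"brilinear {brilinear_weight(lod):.2f}")
```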
 
Originally posted by Pr()ZaC
Let me show you then:
[Thumbnails: screenshot09_tn.jpg, screenshot10_tn.jpg, screenshot11_tn.jpg, screenshot12_tn.jpg]
So nVidia blatantly cheated: they gained speed but lost some of the trilinear filtering. Can you see the difference with the pseudo-trilinear now? :)


Errr... very small pictures, bro... I'm not even going to strain my eyes trying to see the differences...
 
Nvidia Vs ATI

I think it is quite funny that when Nvidia was found cheating, ATI thought it was "their duty" to step up and inform the world of Nvidia's activities, but now that ATI has been found cheating, people are quick to defend ATI's actions.

As for Hardocp and Tom's Hardware...

In the end you can't really trust one person's opinion; most of the time you have to see for yourself. This becomes a problem when benchmarks are forged, as you don't know whether the "in game" performance will be what it's meant to be. I believe that ATI has created one of the best-performing cards available today, but it was through a long "trial and error" period with their older cards. Nvidia's GeForceFX is going through the same stages while they work out their driver bugs and card design faults. I don't believe one card is better than the other; I believe that a card can be the "best at the time", and ATI just happens to have that advantage... for now.

To HardOCP and Tom's Hardware I say: Cut the crap! We want to know what video card is best for what we want, not what your personal opinion of a video card company is! Tell us the truth, goddamn it!
WE CAN HANDLE THE TRUTH!!!

-Razor2K
 
I just stopped trusting all those sites. I'm happy with my FX 5900 (non-Ultra). I do believe that NVidia has attempted to mend the error of their ways with the 52.16 drivers, and from what I've heard on the forums, they haven't been found to kill their cards with heating issues, unlike the ATI 3.8 drivers. I don't trust benchmarks anyway. I would rather get hold of a true DX9 title (waiting for HL2) to test. I don't hold Halo as valid for testing anything because it doesn't even support AA. I get excellent quality when I play whatever game I want. Sure, I have to change the AA and AF settings if I go from CS to UT2003, but I don't care. I'm happy with my card, and that's that.
 
Re: Nvidia Vs ATI

Originally posted by Razorscott2YK
I think it is quite funny that when Nvidia was found cheating, ATI thought it was "their duty" to step up and inform the world of Nvidia's activities, but now that ATI has been found cheating, people are quick to defend ATI's actions.


-Razor2K

Ummm... I'm lost here... where is the concrete evidence that they are cheating and it's not driver bugs?
 
...

Fanboys. If I wasn't stuck at work all night with nothing but the internet to keep me occupied, I wouldn't even bother with these boards. All I ever read on these boards is ATi fanboys screaming how their product is the best and Nvidia fanboys screaming how their product is catching up. I've read reviews at both Tom's and [H], and reading an [H] review of an nVidia card is like Cuba running the human rights council of the UN. Tom, while biased toward Nvidia, still tries to give unbiased reviews and even removed Nvidia from his recommended hardware for a time due to the cheats/optimisations.
 
Re: ...

Originally posted by Detharin
Fanboys. If I wasn't stuck at work all night with nothing but the internet to keep me occupied, I wouldn't even bother with these boards. All I ever read on these boards is ATi fanboys screaming how their product is the best and Nvidia fanboys screaming how their product is catching up. I've read reviews at both Tom's and [H], and reading an [H] review of an nVidia card is like Cuba running the human rights council of the UN. Tom, while biased toward Nvidia, still tries to give unbiased reviews and even removed Nvidia from his recommended hardware for a time due to the cheats/optimisations.

Thank you for your useless input...
 
I think it is quite funny that when Nvidia was found cheating, ATI thought it was "their duty" to step up and inform the world of Nvidia's activities, but now that ATI has been found cheating, people are quick to defend ATI's actions.
Since I love to comment, I will :)
Everyone is always quick to defend, as were the Nvidia people (i.e. [H]). But at least here we get a comment from ATI, and not a review site. And if you read the text, you will see that previous drivers are darker too. It can't be cheating if it's been there from the beginning. Is it even a bug then? It's simply the way the drivers render. There is a difference between rendering the screen a little darker globally and "forgetting" to render lights, or "forgetting" to do AF on certain textures in certain applications, or reducing AF quality over versions.

Originally posted by ShaithEatery
I just stopped trusting all those sites. I'm happy with my FX 5900 (non-Ultra). I do believe that NVidia has attempted to mend the error of their ways with the 52.16 drivers, and from what I've heard on the forums, they haven't been found to kill their cards with heating issues, unlike the ATI 3.8 drivers.
No, one of the drivers just made the fan stop at random when going into 3D mode :p
The heating issue is a non-issue. Official drivers don't have an issue at all (except a nasty FSAA/AF selection bug; I was really let down by the 3.8). However, a leaked version did make screen frequencies odd, which could break your screen if it couldn't handle it.
 
Originally posted by dawdler
The OpenGL gain is so small it's insignificant.
You made a lot of good points in your post, but a 10 fps difference is not insignificant if you're into OpenGL gaming like me. And how about a 17 fps difference in Jedi Knight? 31 fps on the 9600XT and 48 fps on the FX5700, that's quite a difference. Not that Jedi Knight is such a highly popular game, but neither is TRAOD.

I'm sorry, I didn't make myself clear enough. What I meant was real gaming screenshots. Nobody is gonna play a game with those funky colors. I want to see a screenshot comparison with the real colors from the game. So far I haven't seen a single website that could point out any visual differences (without using a special non-gaming render mode or tool).

Originally posted by dawdler
But at least here we get a comment from ATI, and not a review site.
I would sooner trust a comment from the former Iraqi minister of information than an official comment made by a large company. You wouldn't trust a comment made by Nvidia, so why would you trust ATI? How can a comment made by ATI possibly be objective?

Originally posted by dawdler
It can't be cheating if it's been there from the beginning. Is it even a bug then? It's simply the way the drivers render.
There's no guarantee that ATI's initial drivers were cheat-free. The problem here is that the difference between a cheat and a bug exists only in the mind of the programmer. We can't tell whether it was the programmer's intention to reduce detail in some textures or whether he just made a programming error.
 
Originally posted by Arno
You made a lot of good points in your post, but a 10 fps difference is not insignificant if you're into OpenGL gaming like me. And how about a 17 fps difference in Jedi Knight? 31 fps on the 9600XT and 48 fps on the FX5700, that's quite a difference. Not that Jedi Knight is such a highly popular game, but neither is TRAOD.
True, that's a pretty big difference for Jedi Knight. But look at the averages instead. In NWN, there's a 7 fps difference (though we all know how Nvidia-optimised that game is). In RTCW:ET the difference is 5 fps. In EF2 the difference is 3 fps.

I would sooner trust a comment from the former Iraqi minister of information than an official comment made by a large company. You wouldn't trust a comment made by Nvidia, so why would you trust ATI? How can a comment made by ATI possibly be objective?
I would trust them (or at least not dismiss them) until proven otherwise. The contents of the ATI announcement were proven BEFORE the announcement was made, hence one has reason to trust it.

There's no guarantee that ATI's initial drivers were cheat-free. The problem here is that the difference between a cheat and a bug exists only in the mind of the programmer. We can't tell whether it was the programmer's intention to reduce detail in some textures or whether he just made a programming error.
Nope. But that isn't the issue: the rendered image is simply darker in AM3. It's not a cheat; they show the same render load.


Btw, here's a pretty neat read: http://www.penstarsys.com/editor/tech/graphics/nv_ati_dx9/index.html
He notes at the end that it will apparently be updated soon. And of course, I disagree with some of his conclusions, for example that the FX would be more future-proof with its fancy features. BOTH cards will be obsolete before that becomes an issue.
 
In some cases I think the Nvidia darker look is better, although in others it looks just plain odd. I think it has something to do with contrast settings for certain types of light sources. As you can't even get the same ATi brightness in Photoshop, it is impossible to get certain parts of the picture in sync.
 
Originally posted by mrchimp
In some cases I think the Nvidia darker look is better, although in others it looks just plain odd. I think it has something to do with contrast settings for certain types of light sources. As you can't even get the same ATi brightness in Photoshop, it is impossible to get certain parts of the picture in sync.
Hehe, but the fun part is that in this case it's actually ATI rendering the picture darker than the DX9 reference, which Nvidia nearly gets spot on :)
What it comes down to for Nvidia is skipping some lighting effects completely.
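
(If anyone wants to check this kind of claim themselves, a straight per-pixel diff of two screenshots is enough to tell a global brightness offset apart from skipped light sources. A rough sketch using Pillow; the filenames are placeholders, and the two screenshots are assumed to be the same size and the same frame.)

```python
from PIL import Image, ImageChops  # Pillow; pip install Pillow

# Placeholder filenames -- substitute two same-size screenshots of the
# same frame (e.g. a card's AM3 output vs. the DX9 reference rasterizer).
a = Image.open("card_am3_frame.png").convert("L")      # luminance only
b = Image.open("refrast_am3_frame.png").convert("L")

diff = ImageChops.difference(a, b)                     # per-pixel |a - b|
pixels = list(diff.getdata())
mean_diff = sum(pixels) / len(pixels)
peak_diff = max(pixels)

# A modest mean with a low peak suggests a uniform brightness/gamma
# offset (the "darker" render). A low mean with a high, localized peak
# suggests something was skipped outright, such as a light source.
print(f"mean per-pixel difference: {mean_diff:.1f}/255, peak: {peak_diff}/255")
diff.save("am3_diff.png")  # open this to see where the differences cluster
```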
 
Do you have any links to screens from the DX9 reference? I don't think I've ever seen any.
 