this whole "cheating" thing....

sigh... lol... i told him not to argue. i told him that he would lose... but noooooo, he had to tempt fate... and guess what, he still lost.
 
Re: Re: Re: Re: Re: Re: this whole "cheating" thing....

Originally posted by Xtasy0
anything you heard about john carmacks internal memo regarding ati leaking doom 3 is false, that memo was a hoax, he hasnt said publicly who leaked it, about the only thing he said was on slashdot, and it didn't place blame anywhere, so you're again, assuming ATI leaked it, and you know what happens when you make assumptions right?


yes, i do know the old adage about "assuming"... YOU assumed i was referring to the hoax memo, which i never mentioned... LOL! the link i posted referred to the slashdot article where carmack did in fact express his displeasure, which you just backed me up on, no less! one of the articles i posted mentioned the inquirer, which in other posts i have already said was dubious at best... but when they posted the article on nvidia's "supposed" problems with pixel shader 2, THEN the inquirer becomes a "source" ROFL!!! you ATI fanboys sure are entertaining!

do you need more hay for your straw man? he is looking mighty thin!! ;)


so far it seems the ATI club can only attack me personally. oh well, being called "immature" and other things by some of the people on this forum does not really mean much... considering where it is coming from! ;)
 
Re: Re: Re: Re: Re: Re: Re: this whole "cheating" thing....

Originally posted by Shad0hawK
yes, i do know the old adage about "assuming"... YOU assumed i was referring to the hoax memo, which i never mentioned... LOL! the link i posted referred to the slashdot article where carmack did in fact express his displeasure, which you just backed me up on, no less! one of the articles i posted mentioned the inquirer, which in other posts i have already said was dubious at best... but when they posted the article on nvidia's "supposed" problems with pixel shader 2, THEN the inquirer becomes a "source" ROFL!!! you ATI fanboys sure are entertaining!

do you need more hay for your straw man? he is looking mighty thin!! ;)


so far it seems the ATI club can only attack me personally. oh well, being called "immature" and other things by some of the people on this forum does not really mean much... considering where it is coming from! ;)


the slashdot article said nothing about ATi. as i said, there is no public knowledge of who really leaked doom 3, so stop assuming you know.

and why would an nvidia employee at a convention say pixel shader 2.0 is defective if it wasn't? find some other people who were there and confirm it with them if you want. i'm not taking EVERYTHING the inquirer says as fact, i'm selective, i choose what sounds real.

and how about posting those examples other people were asking for? why avoid them? oh, and in case you missed it, i own an nvidia card, and i don't own an ATI card other than my old rage IIC. i'm not a fanboy of either company.

blah blah, i don't know if there was any more meat to your post, i can only deal with "ROLF LAWLZ OMG WTF" for so long. i'm done with this thread.
 
Wow

Shadowhawk, how many threads do you have to make trying to say that Nvidia is better than ATI?

You have had a part in about 3-4 threads, if I remember right, and you and the other people supporting Nvidia lost every one of them.

Please stop trying to change our minds and make us think Nvidia is better.

I will admit the 5900 is a good card, and I liked Nvidia all the way up until they started making some REALLY bad cards. Most of the FX series sucked, and that is what made me change my mind; some of the FX series cards were even slower than some of the GeForce MX series cards.

ATI seems to be doing just the opposite of Nvidia, they are putting out some very nice cards right now, almost all of the 9X00 series is really good.

Until I see some benchmarks where Nvidia has a commanding lead over ATI, I will not buy one of their cards.

Right now ATI's cards are also cheaper, and they perform just as well if not better.
 
Re: Re: Re: Re: Re: Re: Re: Re: this whole "cheating" thing....

Originally posted by Xtasy0
the slashdot article said nothing about ATi, as i said there is no public knowledge of who really leaked doom 3, so stop assuming you know.

i mentioned that in reference to carmack being upset about the leak, which the article said he was.

Originally posted by Xtasy0
and why would an nvidia employee at a convention say pixel shader 2.0 is defective if it wasnt? find some other people who were there and confirm it with them if you want. i'm not taking EVERYTHING the inquirer says for fact, im selective, i choose what sounds real.

i do not really know that an nvidia employee did that. maybe so, maybe not. would this be the first thing the inquirer posted that turned out not to be true? ;)

Originally posted by Xtasy0
and how about posting those examples other people were asking for? why avoid them? oh and incase you missed it, i own an nvidia card, i dont own an ati other than my old rage IIC, i'm not a fanboy of either company.

i have been rather busy, and most likely missed some stuff, my apologies :) but when people reply to me starting off with the sentence "i did not read your post, but..." it can make one wonder why i am bothering. i am not a "fanboy" of either myself, although i tend to prefer nvidia's cards for stability.

Originally posted by Xtasy0
blah blah i dont know if there was any more meat to your post, i can only deal with "ROLF LAWLZ OMG WTF" for so long. i'm done with this thread.

as far as i can remember i have not typed anything like that. as far as "meat to my post" goes, the original intent was to address the ATI fanboys constantly saying "nvidia cheated blah blah blah, i won't buy a card from a company that cheats blah blah blah" when in fact ATI cheated a tad over 2 years ago and got busted again 3 months ago.

as for the merits of each company's flagship cards, a more rational discussion is here:

http://www.halflife2.net/forums/showthread.php?s=&postid=60332#post60332

now i am off to bed!! goodnight :)
 
so when ATI changes copyrighted code it is an "optimization" but when nvidia changes copyrighted code it is "cheating" ROFL!!!
This is why this should be kept to the hardware/software boards... Many people here obviously have no idea what Nvidia did in the first place that classified as a cheat rather than an optimisation. ATI changed a few shaders, yes, and gained some 1.9% I believe. Nvidia changed nearly everything and gained 30+%. Even that change was not the harshest part; it still looked relatively good (for an Nvidia card), though the actual change to the Nvidia shaders reduced workload, while the ATI changes didn't.

The thing was, they used clip planes, and this was the drop that made everything spill over. That is NOT a valid optimisation. That is straight-out cheating. It cannot be applied to ANYTHING except static camera paths, and you can gain a lot by determining in advance what to render and what not. It had only one purpose: to gain speed in 3DMark03.
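A rough sketch (in Python, with a made-up plane and made-up object positions) of why a hard-coded clip plane only works along a fixed benchmark camera path:

```python
# Hypothetical illustration: a clip plane baked into the driver for a
# benchmark's known camera path. The plane and points are invented.

def behind_plane(point, plane):
    """True if the point is on the non-rendered side of the plane.
    plane = (a, b, c, d) culls points where a*x + b*y + c*z + d < 0."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d < 0

# A plane tuned so that, along the benchmark's canned camera path,
# everything behind it is never on screen -- skipping it saves work.
HARDCODED_PLANE = (0.0, 0.0, 1.0, -10.0)  # cull everything with z < 10

scene = [(0, 0, 5), (0, 0, 15), (3, 1, 2)]

# On the fixed path this looks identical and runs faster...
rendered = [p for p in scene if not behind_plane(p, HARDCODED_PLANE)]

# ...but move the camera off the path and the culled geometry is simply
# missing, which is why it is a cheat rather than a general optimisation.
```

The point of the sketch: the plane is chosen per benchmark, not computed from the live camera, so it is only correct for one pre-recorded flythrough.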

Then there is the fact that Nvidia's drivers recognise 70+ applications, while ATI's recognise none (that they could find).
 
Re: Re: Re: Re: Re: Re: Re: Re: Re: this whole "cheating" thing....

First of all, let me start by saying that I am NOT a fanboy of either company; I'm a hardware enthusiast.

Let's start with the most "important" question:
Which card is the better one?

Now, what is the definition of "the best card"? Is it the card that gets a few more fps in a game than another card, or is it the cheapest high-end card on the market? Is it the overall performance, perhaps? Well, in my opinion, it's the overall performance.

There is no "proof" that either one is better (9800pro and GFX5900U). There are many reviews out there, and I'm sure you've read a lot of them (or perhaps not?). Many reviews have come to many different conclusions, and they've also come to different results in benchmarks, games and the like.

These two reviews/comparisons show that the 9800pro is better than the GFX5900U:

Review 1

Review 2

These two, though, show that the FX5900 is better than the 9800pro:

Review 3

Review 4

When deciding which graphics card you want to buy, you'll obviously read a number of reviews to get a picture of which is the better one. Today, it's much harder to get the "right" picture because both ATI and Nvidia have been known to cheat in benchmarks by "optimizing" their drivers (some said, or still say, that Nvidia's "crimes" were worse, though that comes down to a matter of opinion).

While doing a little review of them both myself (9800pro vs. FX5900U), I came upon something interesting. I had come to the point where I would run 3d mark 2003. First, I ran the benchmark on both cards, and I could see that the FX5900U had a small though very noticeable lead over the radeon card. What I did then was to change the name of the .exe file from 3dmark03.exe to whatever.exe. To my slight surprise, the 9800pro now came out on top of the 5900U. The 9800pro also dropped a percent or so, though that wasn't as much as the nvidia card lost in points. By this I'm not saying that the 9800pro is better than the FX5900U, or vice versa; it's just one example of the cheating we've been hearing so much about lately.
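The rename experiment can be sketched roughly like this. The driver's name-based detection is simulated with a made-up lookup table and made-up scores (the real detection lives inside the driver binary), but the logic of the test is the same: same program, two names, compare results.

```python
# Hypothetical simulation of name-based benchmark detection.
# The recognised-app table and the 30% boost are invented numbers.

RECOGNISED_APPS = {"3dmark03.exe": 1.30}  # pretend app-specific boost

def driver_score(exe_name, baseline=1000):
    """Simulated benchmark score: boosted only if the name is recognised."""
    return int(baseline * RECOGNISED_APPS.get(exe_name.lower(), 1.0))

# Run under the real name, then copy the same binary to a neutral name:
as_named = driver_score("3dmark03.exe")    # detection fires
as_renamed = driver_score("whatever.exe")  # same binary, no detection

# A big gap between the two runs is the tell-tale of name detection.
suspicious = (as_named - as_renamed) / as_renamed > 0.05
```

Because nothing changes between runs except the file name, any consistent score difference has to come from the driver treating the two names differently.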

One thing every reviewer seems to agree on is that the 9800pro has better image quality than the FX5900U. Do you know how they came to this conclusion? They took a screenshot of a game (most commonly UT2003), studied it for several minutes, then came to the conclusion that the radeon had slightly better image quality. Well, let me tell you something: you will *not* notice any difference when playing games.

One thing that I can tell you, though, is that the radeon card has a slight performance lead when it comes to FSAA (full-screen anti-aliasing) and AF (anisotropic filtering), though the difference is hardly "good enough" to make you buy a 9800pro instead of a GFX5900U.

I noticed that the 9800pro was the better overclocker. It could go from 380/340 (680 effective) to 470/370 (740 effective) with the standard cooling. The FX card came up to 480/880 from 450/850. Both ran 100% stable after the overclocking, and they ran with the standard cooling system that came with the card. Though I must remind you all that this has more to do with which 9800pro/5900U card you buy than with the chipset itself. The cards I had were a Sapphire Atlantis Radeon 9800pro and a Gainward GeForceFX 5900 Ultra. Now, the Sapphire card is known to be very overclocker-friendly in comparison to, for example, the Hercules version of the card. Sometimes you will notice very large differences in how much you can overclock the cards. I'm sure that many people have 5900U cards that will overclock quite a lot more than my Gainward.
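As a quick sanity check on the "effective" memory clocks quoted above: DDR memory transfers data on both clock edges, so the effective figure is simply double the real clock.

```python
# DDR transfers data on both clock edges, so the "effective" clock
# quoted in reviews is twice the real memory clock.
def effective_ddr(real_mhz):
    return real_mhz * 2

# The 9800pro figures quoted above: 340 MHz real -> 680 effective,
# overclocked to 370 MHz real -> 740 effective.
stock_9800 = effective_ddr(340)
oc_9800 = effective_ddr(370)

# The FX's "850" is quoted as the effective figure, i.e. 425 MHz real.
real_5900 = 850 // 2
```

This is why the same card can be listed as "340 MHz" by one reviewer and "680 MHz" by another without contradiction.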

Now, back to the benchmarks. I wanted to get a good view of how well the cards would perform, so I ran quite a lot of games and benchmarks to test them. I'll list some of them:

*UT2K3*
*Unreal2
*GTA: VC
*Half-Life: CS -
*Splinter Cell
*NWN*
*Mafia
*BF1942
*Serious Sam: the second encounter*
*Quake3*
*Jedi knight 2: Jedi outcast*
*Anarchy Online
*Max Payne
*Aquamark
*3d mark 2001SE -
*3d mark 2003

Well, I actually ended up listing all of them. As you can probably see, this was *very* time consuming. As you may've noticed, I've put a little star "*" after a few of them. The 5900U was the winner in the games/benchmarks that have a "*" after them.
Something to note was that the 9800pro performed horribly with FSAA turned on in NWN. I've heard that NWN is nvidia optimized, though this is something I cannot confirm. It should be noted, though, that neither card ever had any real "über power performance lead". I've put a "-" after 3d mark 2001SE. It was so f***ing close that I just couldn't put ATI down as the winner (there was a 12 point difference). I got 20087 with the 9800pro (processor and graphics card overclocked) and 20075 with the FX5900U. There was a 54 point difference when they weren't overclocked (in favor of the 5900U), though that still wasn't enough to put one of 'em down as a winner.

Moving on to driver stability. I've heard a lot from both "sides" that the ATI/Nvidia drivers are more stable/unstable. Now, it is true that the FX5800U could fry with some older versions of the detonator drivers, because the fan would turn off when running 3d screensavers. That problem, however, really didn't cause as much damage as some sites made it look. Oh, and if you wonder, that problem doesn't exist anymore. The latest detonator drivers (currently 44.03) and the latest catalyst drivers (currently 3.6) are both rock solid. I haven't had any problems whatsoever with either of them.

Conclusion: Neither card is the real winner here. They're very close in performance. If you're a fanboy, go with the card from your company. :) If you're just interested in getting the best card, get the cheapest one. When I did the review, the 5900U was more expensive, though as far as I know, the price difference is now close to nothing. The 9800pro is a little better than the 5900U when you turn on FSAA and AF, but the 5900U, on the other hand, has a slight lead in games that demand more raw speed from the card (Q3, for example).
I'm sitting on a 9800pro myself, but that's because I could get it extra cheap; I found an offer on a site that had got too many cards and was selling them for 350 dollars (keep in mind that this was last month). You couldn't go wrong with either one.

By the way, I didn't post the results of the games/benchmarks in picture format. For one, the server hosting them is down, but it would also take a lot more time to put them all up (not that this post didn't take a long time to write, but still). Though, as I hope you can all see, this is a very fair review with no "fanboyism".
Hope you enjoyed reading the text as much as I enjoyed writing it. ;)

*edit* I forgot to put a "-" after CS, meaning that it was a tie, but I've done that now. :)
 
Something to note was that the 9800pro performed horribly with FSAA turned on in NWN. I've heard that NWN is nvidia optimized, though this is something I cannot confirm
Look at the FSAA settings in the in-game menu; you can pick quincunx AA on a radeon ;)
It is very much Nvidia-developed, and the horrible performance is a known issue. As an example, 4xFSAA pulls it down HORRIBLY, while 2xFSAA is 3 times faster; 2x is actually the only mode that works for ATI. Shadows also bring slowdowns, and pixel-shaded water is broken. It's all in the game...

The 9800pro is a little better than the 5900U when you turn on FSAA and AF, but the 5900U, on the other hand, has a slight lead in games that demand more raw speed from the card (Q3, for example).
I disagree with that part of the conclusion. Quake 3 is old technology now, and isn't more demanding than, for instance, a game like Max Payne, which your tests show the 9800 won.
I still see the 9800 as the obvious best card for modern and new games, which your tests show too. That should weigh rather heavily... not a game that's years old.
 
A card that gets 200-400 fps doesn't seem very "demanding for speed" to me.
 
Originally posted by nsxownzme
http://firingsquad.gamers.com/hardware/radeonquack/default.asp
Benchmarking Ethics and the ATI Radeon
October 24, 2001

This article is old as **** and who cares.. talking about the 7500 and 8500..
Yes, and then Nvidia declared that they would never do such a thing, and they have a bunch of "rules" written down that forbid, for instance, an optimisation that applies to only one game. That is not a valid optimisation according to Nvidia's own policies.

Then came the FX, and all those policies were blown out the back of it... :D
 