E3 Videos - Anti-Aliasing?

Warchild

Guest
I'm just wondering if they have AA (anti-aliasing) on in the E3 videos... maybe a stupid question, but it looks like they had it on. ;)
 
I would say they almost definitely have Anti-Analysing on, probably 8x
 
Originally posted by Bilko
I would say they almost definitely have Anti-Analysing on, probably 8x

lol omg then I'm happy. If they could run the game at a perfect 100fps on a P4 2GHz with a GF4 with AA on, then my P4 1.7GHz with a GF4 won't have any problems running it, since I don't use that AA crap.

BTW, anti-aliasing is a feature of the video card that smooths out the uneven edges of objects, worlds and characters. Ya know how at low resolution the edges of things are all squarish and not smooth? Well, anti-aliasing smooths that out. Kinda pointless, I think.
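For the curious, the simplest form of AA (supersampling) can be sketched in a few lines of Python. This is purely an illustrative toy, not how any real card implements it: render at a higher resolution, then average each block of samples down to one output pixel, which turns a hard black/white edge into in-between greys.

```python
# Toy sketch of supersampled anti-aliasing (SSAA): render at 2x the
# output resolution, then average each 2x2 block of samples into one
# pixel. The intermediate grey values are what make edges look smooth.

def render_supersampled(render_pixel, width, height, factor=2):
    """render_pixel(x, y) returns a greyscale value 0-255 for one sample."""
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            # Average factor*factor sub-samples for this output pixel.
            total = 0
            for sy in range(factor):
                for sx in range(factor):
                    total += render_pixel(x * factor + sx, y * factor + sy)
            row.append(total // (factor * factor))
        out.append(row)
    return out

# A hard diagonal edge: white where y < x, black elsewhere.
def edge(x, y):
    return 255 if y < x else 0

aliased = [[edge(x, y) for x in range(4)] for y in range(4)]   # only 0 or 255
smoothed = render_supersampled(edge, 4, 4, factor=2)           # greys appear on the edge
```

The aliased image contains only pure black and pure white, so the diagonal looks like a staircase; the supersampled one picks up in-between values along the edge.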
 
Re: levels of AA:

HL2 was shown at ATi's booth. The maximum level of anti-aliasing supported in hardware on ATi cards is 6x. This causes severe performance loss. ATi's 2x AA is viable in most environments for a very small performance hit, and really does clean up jaggies pretty well.

8x AA - as far as I know, only supported on GF FX products - causes crippling performance losses in present games, let alone next-generation engines...

Trying to derive the nuances of graphical settings from compressed video is kind of silly. Hence Bilko's rather clever dig: "anti-analysing 8x"
 
they could have run the demos at 480 * 320 with no anti-aliasing and you wouldn't be able to tell, as the demos were recorded with a video camera with a probable resolution of 400 lines (the resolution of the .mov file is higher than this, but it contains no additional detail). All discussion of whether or not they used AA is utterly irrelevant. This thread stops. Here. I'll draw the line:

==============================
 
Originally posted by Hammer
Re: levels of AA:

HL2 was shown at ATi's booth. The maximum level of anti-aliasing supported in hardware on ATi cards is 6x. This causes severe performance loss. ATi's 2x AA is viable in most environments for a very small performance hit, and really does clean up jaggies pretty well.

8x AA - as far as I know, only supported on GF FX products - causes crippling performance losses in present games, let alone next-generation engines...

Trying to derive the nuances of graphical settings from compressed video is kind of silly. Hence Bilko's rather clever dig: "anti-analysing 8x"

FX can't do 8x AA. ATI cards have much better FSAA and AF than GeForce. A GeForce needs 8x-10x AA to get close to a Radeon's 6x AA.

I'm running all my games with 4x FSAA and 16x AF and I don't feel any performance hit at all... and it looks much better than on a GeForce card.

My tip: go buy an ATI card. Much better picture quality!
 
Fear my Geforce 2 MX! Can't even do Anti-Aliasing! Woooo...excuse me while I cry in a corner. I can't wait until I get my Alienware computer....two more weeks!

Bit off topic, but meh, I'm bored...

-Vert
 
Originally posted by Vertigo
Fear my Geforce 2 MX! Can't even do Anti-Aliasing! Woooo...excuse me while I cry in a corner. I can't wait until I get my Alienware computer....two more weeks!

Bit off topic, but meh, I'm bored...

-Vert
don't worry, i got one of those too :cheers:
I'd get a Radeon 9700 Pro or a 9800 Pro, but they're known for driver problems and have also only just started to support OpenGL properly; nVidia have been doing it for a lot longer and are proven to be more stable, so I'm gonna wait for the new nVidia. Even though both nVidia and ATI are as bad as each other, read...

What's wrong with this pixel shader?
The Register writes:

Benchmark firm Futuremark has uncovered extensive cheating by NVidia in its 3DMark03 suite.

After an initial report at ExtremeTech, Futuremark revisited the tests and discovered eight instances of cheating, which improved the performance of NVidia's Detonator FX and WHQL drivers by as much as 24.1 per cent.

But NVidia achieved this not through brilliant optimization, but by alternative means which omit graphic details: so the output, while at times similar, does not resemble what it should.

As Futuremark explains:

"The cheating described here is totally different from optimization. Optimizing the driver code to increase efficiency is a technique often used to enhance game performance and carries greater legitimacy, since the rendered image is exactly what the developer intended."

NVidia deployed a variety of cheats: introducing its own pixel shaders and vertex shaders specifically for the benchmark. The shaders improved performance at the expense of image quality.

Futuremark is revisiting some ATI benchmarks, too. ATI, which has grabbed the lead from NVidia in a ferociously competitive battle, was caught cheating before.

In 2001 ATI was discovered to have sacrificed quality for frame rate in a Quake 3 benchmark: when the Quake executable was renamed "Quack", the drivers produced much better output, but frame rate performance fell.

NVidia withdrew from FutureMark's beta program earlier this year.®

Related Link
FutureMark's NVidia audit (750 kb PDF)
 
...and that's why you never trust any one benchmark when measuring performance.

Derby is right, there's no way you can tell if AA is on from those videos or even what the resolution or frame rate was. We're going to have to wait for someone to benchmark it.
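To put a rough number on that point, here's a toy Python sketch (purely illustrative, nothing from the actual videos): render a hard-edged diagonal and a softened stand-in for an anti-aliased version of the same edge, then box-downsample both the way a low-resolution camera recording would. The difference between the two largely washes out.

```python
# Toy illustration: after downsampling (roughly what filming a monitor
# with a video camera does), an aliased and an anti-aliased rendering
# of the same edge become much harder to tell apart.

def downsample(img, factor=4):
    """Box filter: average each factor x factor block into one pixel."""
    h, w = len(img), len(img[0])
    return [
        [
            sum(img[y * factor + sy][x * factor + sx]
                for sy in range(factor) for sx in range(factor)) // factor**2
            for x in range(w // factor)
        ]
        for y in range(h // factor)
    ]

size = 16
# Hard diagonal edge, no AA: pure black/white staircase.
aliased = [[255 if y < x else 0 for x in range(size)] for y in range(size)]
# Softened edge standing in for an anti-aliased render of the same line.
aa = [[min(255, max(0, 128 + 64 * (x - y))) for x in range(size)]
      for y in range(size)]

def max_diff(a, b):
    """Largest per-pixel difference between two same-sized images."""
    return max(abs(p - q) for ra, rb in zip(a, b) for p, q in zip(ra, rb))

full_res_diff = max_diff(aliased, aa)  # large: easy to tell apart up close
downsampled_diff = max_diff(downsample(aliased), downsample(aa))  # much smaller
```

At full resolution the two images differ by up to 128 grey levels along the edge; after a 4x box downsample the worst-case difference drops to 32, and a real camera adds blur and compression on top of that.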
 
"FX can't do 8x AA. ATI cards have much better FSAA and AF than GeForce. A GeForce needs 8x-10x AA to get close to a Radeon's 6x AA."

No arguments about quality - and the GF FX 8x mode (yes, GF FX do support 8x FSAA) is definitely worse, in looks and performance, than ATi's 6x.

http://www.nvidia.com/docs/lo/2415/SUPP/Intellisample_2603.pdf

http://www.hardocp.com/article.html?art=NDcyLDY=

"I'm running all my games with 4x FSAA and 16x AF and I don't feel any performance hit at all... and it looks much better than on a GeForce card."

ATi's R300/350 cards do a nice job of AA and AF, but they do cost you...

http://www.beyond3d.com/reviews/ati/r350/index.php?p=19#comp
http://www.beyond3d.com/reviews/ati/r350/index.php?p=17


"My tip: go buy an ATI card. Much better picture quality!"

Thanks; I've owned an 8500LE, 8500 128MB, and I now have a 9500 Pro, on my way to, I hope, an R360 or R400...
 
For information, the PC running HL2 was a P4 3GHz with a Radeon 9800 Pro. The game ran at 1152*864 with anti-aliasing, of course (don't ask me if it's 2x, 6x or 32x, I don't care about that anyway :])
 
i don't think the computer was running the game..


i think it was just a video, like pre-recorded. ACTUAL in-game footage, but done a while ago and not using the pc.


so you never know.
 
Originally posted by bonanzaguy
ew, alienware r t3h sux0r and overpriced
Bah, Alienware rules. They're overpriced, yeah, but they're like the Porsche of computers, man. People are willing to pay it, so why not? I would build one myself, but I'm going to University in a couple months, and want to avoid any hassles, and Alienware is a very trustworthy company. They also tweak everything to a very high-performance level. Anyway, back on topic...

-Vert
 
yeah i didn't really think they sucked, i'm just an ass :p

but i'd much rather build my own (which i did), not to mention save a sh*t load of money
 
I was shocked at how expensive Alienware was. It's totally ridiculous. I could build a similar system out of individual parts and save almost $1000. That's MAJOR moola.
 
Originally posted by bonanzaguy
yeah i didn't really think they sucked, i'm just an ass :p

but i'd much rather build my own (which i did), not to mention save a sh*t load of money

Your reasoning is compelling...just two flaws there, though:
1) Me = lazy.
2) Me = incompetent.

...sadly enough, I'm a computer science major. Heh....wow, amazing how far off topic this thread's gone. Ah well.

-Vert
 
Originally posted by Apos
I was shocked at how expensive Alienware was. It's totally ridiculous. I could build a similar system out of individual parts and save almost $1000. That's MAJOR moola.
Yeah, once again, I'll admit it's overpriced, but if you've seen the build quality of those machines, it's wayyyy better than I could ever do. The cooling system is nothing short of revolutionary, the wiring's exquisite, and they tweak all the hardware to a perfect balance between performance and the hardware's safety (overclocking a vid card, for example). Anyway, I think it's a worthy investment. It'll last me a while.

-Vert
 
hehe, fair enough vertigo. i'm almost on the verge of being too lazy myself, but there's that self-satisfaction that comes along with it... it's like having a baby. but less work
 
Hmmm...anyway, back on topic...hey, bonanza, mind if I have one of those burned critters you got there? They sure sound tasty.

-Vert
 
Hmmm...I really oughta shut up soon before somebody thinks I'm insane and decides the only way to save humanity is to institutionalize me...that would never be good....mainly because I'd never get my Alienware computer! Anyway, thanks for the burned critter there. It was definitely tasty. You could sell them on the street one day if you could get the right marketing appeal for it.

-Vert
 
it's ok vert, we can get you the help you need. pay no attention to those large burly men in the white lab coats coming up behind you...
 
But...but noooo! I'm Gordon Freeman! I have to save the earth! Wait, maybe these people in white lab coats are those rebel fighters I've heard of...and this strait jacket they're forcing me to wear is, in fact, a new sort of armour/weapon! I can single-handedly defeat my foe with this strait jacket.

-Vert
 
silence him, he's figuring out our most intimate secrets!! just don't let him know i'm actually the G-Ma.. DOH!
 
Damn you and your Gollum-esque lisp! If only I weren't wearing this uber-weapon strait jacket, I'd take out a knife and stab you!...or a crowbar...too bad that just makes a metallic clink and somehow doesn't affect you at all...bah!

-Vert
 
Originally posted by subs
i don't think the computer was running the game..

i think it was just a video, like pre-recorded. ACTUAL in-game footage, but done a while ago and not using the pc.

Actually it was a real-time demo running the Source engine on that PC, but with a set sequence of events. It's the same as a game benchmark (e.g. 3DMark03): it's running like a game, but you can't control what's happening.
 
/me thwacks bonanza's HL2 logo...
/me draws cheesy looking HL1 bullet-shaped crowbar dent on logo
/me runs away in strait jacket (still not sure how i managed to do all that in a strait jacket....)

-Vert
 
*buries head crab*
*head crab jumps out of grave, latching onto Vert*
*Vert turns into zombie. Hilarity ensues*

-Vert
 
Which just goes to show how ineffective you are, derby. :)

*Vert zombie kills derby*
 
Critics Rave: The most off topic gut busting thread so far!
 
Ha! The only way to end this thread, derby, is just to not reply. Just like the rules..."Don't feed the trolls"...cept, I'm not a troll...I'm more like an annoying fly...so...don't feed...the fly? Yeah. That'll do. (please don't institutionalize me)

-Vert
 
Originally posted by Z|insane
Critics Rave: The most off topic gut busting thread so far!
Don't feed the fly!
*Vert zombie kills Z|insane*

-Vert
 