NVIDIA designated the "official" card of Doom 3.

DarkStar

http://www.nvidia.com/object/nzone_doom3_home.html

Sorta near the bottom of the page:

"NVIDIA is pleased to announce that id Software recommends the GeForce FX family of graphics processing units for DOOM 3. The rich feature set and programmability of GeForce FX GPUs enable the griping world of DOOM 3 to come to life with real-time dynamic lighting and shadow, while raw horsepower delivers non-stop multi-player action at lightning-fast frame rates. Look for this sticker on video cards equipped with a GeForce FX GPU, and witness the mind-blowing detail of the DOOM 3 experience - the way it's meant to be played."
 
well this is interesting

Half Life 2 vs. Doom 3.

ATI vs. Nvidia.
 
I believe that is high-grade bullsh*t. We all know Nvidia's DX9 architecture is horrible compared to ATI's. However, their new cards might actually show some change, but until then.... GO ATI!
 
stigmata said:
I believe that is high-grade bullsh*t. We all know Nvidia's DX9 architecture is horrible compared to ATI's. However, their new cards might actually show some change, but until then.... GO ATI!

Doom 3 isn't DX9 :|
 
Sure id may recommend Nvidia... they may also recommend ATI! Who said recommendations are exclusive? Generally, having an ATI or Nvidia card is a good idea...
 
Letters said:
Sure id may recommend Nvidia... they may also recommend ATI! Who said recommendations are exclusive? Generally, having an ATI or Nvidia card is a good idea...

I got the impression from this announcement that they seem to be siding with NVIDIA on this one. If you're recommending both, what's the point of saying anything? And if you plan on recommending both, why NVIDIA today and not ATI?
 
DarkStar said:
I got the impression from this announcement that they seem to be siding with NVIDIA on this one. If you're recommending both, what's the point of saying anything? And if you plan on recommending both, why NVIDIA today and not ATI?
It's on an Nvidia site, no? Why the Hell would they include what Id says about ATI?
 
Letters said:
It's on an Nvidia site, no? Why the Hell would they include what Id says about ATI?

Well, I assume if they were going to be recommending both, then they would have given each manufacturer the news at the same time. And we would have subsequently heard something from ATI on this matter today as well.

Plus, I think this was actually a press release that they gave to a bunch of media outlets as well, not just the folks at NVIDIA.
 
Doesn't look quoted to me... that's just THEIR WORDING of it... :upstare:

It's a very general statement that shouldn't be taken seriously... at all.
 
OpenGL or D3D doesn't matter... nVidia's FX cards have poor shader performance. Both OpenGL and D3D can use shaders. The FX cards are slower in both when shaders are involved.

All that statement says is that id Software can be paid to say anything... and nVidia is willing to pay whatever it takes to get them to say it.
 
OCybrManO said:
All that statement says is that id Software can be paid to say anything... and nVidia is willing to pay whatever it takes to get them to say it.

This seems the most likely explanation to me.

Basically I see it like this:

1. NVIDIA really got their act together and improved their shader performance (not likely)
2. NVIDIA paid id to say these wonderful things about their company (most likely)
3. NVIDIA totally went behind id's back on this one and are just waiting to see if they can get away with it (this would be sort of funny)
 
The fun part will be to run a game benchmark through the standard OpenGL rendering paths, and see which one comes out on top :)

Sidenote: DX9 is more than rendering. KOTOR requires DX9 installed too, but it's an OpenGL 1.4 engine. It's just a crummy way of saying "Yeah, even though we use this awesome independent multiplatform rendering engine, we are too lazy to port it to Linux or similar, so we just use something small from DX9 and voila! It's a Windows game".
 
S.T.A.L.K.E.R. is Nvidia-optimised too, but GSC says there will be little difference in performance between GeForce/Radeon. About 5 fps or something, if any at all.

DOOM III: id says GeForce has much more stable drivers. The fps will also be capped at 60 = max 60 fps.
 
So we know that Nvidia is 5 fps faster. What about the IMAGE QUALITY?

Who will win the race in this aspect?
 
If I remember correctly ATI leaked some stuff about DOOM III and id told them to freak off. Nvidia has always been better at OpenGL than ATI cards have. It's just the simple truth.

It's like my 5900 Ultra 256MB card owning my friend's 9800 Pro at Quake III in Linux. :)
 
thenerdguy said:
If I remember correctly ATI leaked some stuff about DOOM III and id told them to freak off. Nvidia has always been better at OpenGL than ATI cards have. It's just the simple truth.

It's like my 5900 Ultra 256MB card owning my friend's 9800 Pro at Quake III in Linux. :)
ATI actually didn't leak "stuff". It was ATI that leaked the alpha :)
 
Not ATI, a person who WORKED for ati...
 
With Id Software's upcoming Doom3 title looming on the not so distant horizon, we've been taking a sincere interest in Id's new game engine. Since it will obviously become a definitive tool in our benchmarking test suite, for next generation graphics products and related articles, we were also more than interested when our colleagues at Anandtech posted preliminary benchmark figures in their GeForce FX 5900 Ultra launch article. The scores represented here showed a significant lead for both the NV35 and NV30, versus ATi's Radeon 9700 and 9800 products. It piqued our interest for sure and we decided to ask NVIDIA what they felt may be contributing to the GeForce FX's obvious strength, at least currently, with Doom3.

The answer we got from Senior PR Manager, Brian Burke, was "UltraShadow"... It seems that Doom3's advanced lighting and shadowing effects place a heavy load on the graphics pipeline and NVIDIA feels that UltraShadow technology is a clear advantage in their architecture, versus other products on the market. We conducted a quick Q&A with Brian on the topic of UltraShadow and Doom3, as well as its relative impact on performance in next generation game titles that will also utilize the Doom3 engine. Before you dig in here, we suggest taking a look at NVIDIA's UltraShadow Tech Brief, if you need a refresher course on the technology.

That was from the INQ.com news

Here is the information concerning UltraShadow:

If UltraShadow technology is a "hardware assist" feature that programmers at Id have taken advantage of, in what way does the hardware architecture actually enhance the capabilities of this method of "Z-Cull" for shadows? Is there some sort of programmable register or mechanism within the chip that allows the developer to define their clip plane parameters? Is there a memory buffer for caching of this information, with respect to what is and is not supposed to be processed? Can you define better for us how your chip differs from a hardware perspective, versus ATi, in this regard?
We modified our GPU architecture to be able to process depth bounds methods. These methods are called by the "depth_bounds" API which we presented to developers @ GDC. When invoked, these methods provide the information necessary for our GPUs with UltraShadow to cull out shadow pixels beyond the constraints set by depth bounds. It involved modifications to several units within our GPUs.

This technology is currently patent pending by NVIDIA, and as such, it would be difficult for ATI to provide the same benefit without violating NVIDIA IP. This technology is over and above such advances as 2-sided stencil and clip planes, which both ATI R3xx GPUs and NVIDIA GeForce FX GPUs support. The benefit depends on the scene and # of lights, but can be up to 30% in some cases.

Source : hardocp, warp2search and xbitlabs
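
(For the curious: the "depth_bounds" API mentioned above is exposed in OpenGL as the GL_EXT_depth_bounds_test extension. Here's a rough sketch of how an engine might use it around its stencil shadow passes. The enum and glDepthBoundsEXT call are from the extension spec; the Light type and helper functions are made up for illustration, and on Windows you'd load glDepthBoundsEXT through wglGetProcAddress first.)

#include <GL/gl.h>
#include <GL/glext.h>

/* Pixels whose stored depth falls outside [zmin, zmax] are rejected
 * before any stencil update, so shadow-volume fill outside the light's
 * reach is skipped entirely. This is the hardware path NVIDIA brands
 * as UltraShadow. */

typedef struct Light Light;                       /* hypothetical engine type */
extern int  has_depth_bounds_ext;                 /* set by checking the extension string at startup */
extern void light_depth_extent(const Light *l, double *zmin, double *zmax);
extern void render_shadow_volume(const Light *l); /* the usual two-sided stencil passes */

void draw_shadow_volume_with_bounds(const Light *light)
{
    double zmin, zmax;

    if (has_depth_bounds_ext) {
        /* Window-space depth range this light can possibly affect,
         * derived from its bounding volume. */
        light_depth_extent(light, &zmin, &zmax);
        glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
        glDepthBoundsEXT(zmin, zmax);             /* cull stencil work outside the light */
    }

    render_shadow_volume(light);

    if (has_depth_bounds_ext)
        glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
}

The claimed win is exactly what Burke describes: when a light only touches a slice of the scene's depth range, far fewer shadow pixels go through the stencil pipeline.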

Does it mean that NVIDIA's only advantage is UltraShadow technology? :dozey:
 
MaxiKana said:
Not ATI, a person who WORKED for ati...
Thus ATI. Any person that works for the company represents the company.

And no, Ultrashadow is obviously not the only advantage the Geforce have. There is so much you can tune for optimum performance on the Geforce, compared to the standard path.

Btw, UltraShadow is also only usable in OpenGL rendering, making it less than useful...
 
dawdler said:
Thus ATI. Any person that works for the company represents the company.

And no, Ultrashadow is obviously not the only advantage the Geforce have. There is so much you can tune for optimum performance on the Geforce, compared to the standard path.

Btw, UltraShadow is also only usable in OpenGL rendering, making it less than useful...

What do you mean less than useful???? I find it very useful. :)
 
thenerdguy said:
What do you mean less than useful???? I find it very useful. :)
How many upcoming big games are OpenGL? How many upcoming games actually use this Ultrashadow in OpenGL?

Think on it.
 
In general, ATI and NVidia BOTH have good OpenGL performance... though IMO ATI has a bit more power in their cards
 
I predict this'll be the same old story; even though nVidia got John Carmack twisted around their little finger, the game will run equally well (if not better) on ATi cards.
 
You have to remember that nVidia went with 32-bit shaders for their FX cards. ATi went with Microsoft's requirement of 24-bit shaders. Look who's in whose pocket now ;)

Just to point out, I have an ATi card thrumming in my PC. I'm no fanboy :)
 
dawdler said:
How many upcoming big games are OpenGL? How many upcoming games actually use this Ultrashadow in OpenGL?

Think on it.

well... a lot of future games are probably going to use the Doom3 engine :cheers:
 
Yeah but how many DirectX games will be ported over to Linux?????? It's good for those of us that use Linux. :)
 
A2597 said:
In general, ATI and NVidia BOTH have good OpenGL performance... though IMO ATI has a bit more power in their cards
That wasn't the issue. The point was that UltraShadow is only supported under OpenGL, meaning you can't use it under DirectX.

well... a lot of future games are probably going to use the Doom3 engine
By the looks of it, the Doom III engine is quickly becoming obsolete, and a thing of the past. In the future you want Far Cry style engines to compete, not Quake style.

You have to remember that nVidia went with 32-bit shaders for their FX cards. ATi went with Microsoft's requirement of 24-bit shaders. Look who's in whose pocket now
Meaning exactly nothing. They couldn't run a game with full use of 32 bit shaders even if their lives depended on it. I honestly doubt future games will tax the card *less* than current ones.

EDIT: Btw, here http://members.shaw.ca/spiritwalker/pics/dx9diff.html anyone can see what the high-definition-ultra-high-quality-RULEZ-ALLZ 32 bit shaders do in practice for modern gaming. I.e. they revert to FX12.
 
dawdler said:
By the looks of it, the Doom III engine is quickly becoming obsolete, and a thing of the past. In the future you want Far Cry style engines to compete, not Quake style.
At the moment, Far Cry is the only game being developed on the CryTek engine, whereas several games are being developed on the Doom3 engine. The CryTek engine can produce nice graphics, but stability, performance and customisability are also important factors for game developers.
 
Well I'll stick with Nvidia and OpenGL.
So you're saying that the man who created some of the most used game engines of all time so far is wrong to use OpenGL??
 
Meh, I'm not exactly in the know about this, but OpenGL doesn't seem quite so... versatile is the only word I can think of.

I think the Doom 3 engine, when it was first seen, was undoubtedly the best. However, people here are right: other engines are beginning to surface that rival and possibly surpass the Doom 3 engine. OK, so CryTek's engine may not seem the most stable, but think about the time that's gone into developing Doom 3.

Anyway, time for Stargate, see you all in an hour.
 
Farrowlesparrow said:
Meh, I'm not exactly in the know about this, but OpenGL doesn't seem quite so... versatile is the only word I can think of.
Lol, it's DX that's not versatile :)
OpenGL is theoretically the best when it comes to versatility. But DX is definitely the biggest.

Anyway, when I said Far Cry style, I didn't mean Far Cry specifically. I meant expansive engines: large terrains, advanced LOD and a foliage engine, etc etc... Example: Can you alter Doom 3 levels dynamically ingame?
 
u guys never knew that?? what hole have u been living under?
 
dawdler said:
Lol, it's DX that's not versatile :)
Can you alter Doom 3 levels dynamically ingame?
Yes, you can alter the level dynamically in the Doom 3 engine. In fact, so far it's the only engine which can do that, because the new map file format no longer involves BSP compilation and everything in the D3 engine can be set up as dynamic. Because of the static lighting you can't alter the map in the HL2 or Far Cry engines.
 
harrys said:
Yes, you can alter the level dynamically in the Doom 3 engine. In fact, so far it's the only engine which can do that, because the new map file format no longer involves BSP compilation and everything in the D3 engine can be set up as dynamic. Because of the static lighting you can't alter the map in the HL2 or Far Cry engines.

Have you seen the CryTek engine videos? You can play the game while you build the map. Anyway, Doom3 still needs to compile collision maps.
 