Nvidia selling lies with PhysX?

The gist is that the code PhysX uses when running on a CPU is mostly x87 and single-threaded, when it could have used SSE instructions and been multithreaded. Currently, when run without an Nvidia GPU, with the PhysX work done on the CPU instead, PhysX is very slow. But it COULD be very fast... possibly even faster than when run on an Nvidia card (that is, a modern CPU with 4 cores free and current double-precision SSE, versus the older CPUs the original comparisons were made against).
And support for the old Ageia PhysX cards is gone from current drivers, so backward compatibility can't be what's holding Nvidia back.
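
To make the x87 vs. SSE point concrete, here's a rough sketch (a made-up particle integration loop, not Nvidia's actual code) of one float at a time versus four floats per SSE instruction:

Code:
#include <xmmintrin.h>  // SSE intrinsics

// Scalar loop: x87-style math handles one float at a time.
void integrate_scalar(float* pos, const float* vel, int n, float dt) {
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;
}

// SSE loop: same work, four floats per instruction.
void integrate_sse(float* pos, const float* vel, int n, float dt) {
    __m128 vdt = _mm_set1_ps(dt);
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);
        __m128 v = _mm_loadu_ps(vel + i);
        _mm_storeu_ps(pos + i, _mm_add_ps(p, _mm_mul_ps(v, vdt)));
    }
    for (; i < n; ++i)  // leftover elements
        pos[i] += vel[i] * dt;
}

Same math either way; the SSE version just does four updates per instruction, which is where the "roughly double" estimates come from.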

So did Nvidia pick the slowest code path for CPUs to make their cards look like they run PhysX better, as a selling point, or do they just not care?
 
Yeah, you must have an Nvidia card installed or they nerf the PhysX code. On purpose. It's part of the agreement; otherwise you aren't licensed to use PhysX at all (and without PhysX, you can't play certain games).
 
Here's an example of scaling with PhysX (last graph).

So that ATI card + CPU setup is being held back, since the CPU is running the PhysX code as a single thread using x87 instructions.
SSE alone would double performance (per the TR article); multithreading would help even more.
If you had a 6-core CPU running a game that didn't take advantage of quad core (say, using 2 cores with 4 free) but used PhysX, you could run the game as well as the benchmark setup with 2 Nvidia GPUs.
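
To illustrate the multithreading half of that, here's a minimal sketch (using std::thread, and assuming the particle updates are independent, which real collision code is not) of spreading the same kind of loop across free cores:

Code:
#include <thread>
#include <vector>

// Split the particle range across however many cores the game leaves free.
void integrate_mt(float* pos, const float* vel, int n, float dt,
                  unsigned num_threads) {
    std::vector<std::thread> pool;
    int chunk = n / static_cast<int>(num_threads);
    for (unsigned t = 0; t < num_threads; ++t) {
        int begin = static_cast<int>(t) * chunk;
        int end = (t + 1 == num_threads) ? n : begin + chunk;
        pool.emplace_back([=] {
            for (int i = begin; i < end; ++i)
                pos[i] += vel[i] * dt;   // each thread owns its own slice
        });
    }
    for (auto& th : pool) th.join();
}

Real physics has dependencies (collisions, constraints) that make the split harder than this, but nothing about the PC forces it onto one thread.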
 
I've never really liked this PhysX stuff; sure, the realistic-looking cloth effects are great.
But compared to a lot of CPU-based physics engines, the 8 FPS I get in Mirror's Edge (with PhysX on) on my i7-860 and HD 5850 doesn't really convince me.
 
Yeah, I'm not a fan of proprietary computational code. It's bad for competition.

I suppose licensing a physics engine like Havok costs a developer much more money, so they may opt for PhysX (unfortunately for ATI owners). EDIT: and it sounds like (after reading the response below) that PhysX is free for developers to use, which encourages them further.
 
What's funny about this speculation is that it might be possible to get not just equal but better performance on a PC with 4-8 cores if they switched what they optimized for... (the CPU running PhysX instead of the GPU)

See, when AMD bought ATI, they could have made CrossFire 'run faster' on all-AMD machines (AMD CPU, ATI chipset, and ATI GPU), which really just means slowing it down on everything else. But no, they kept their departments separate. They even let Intel's chipsets run CrossFire...
So why does Nvidia disable the ability to run an ATI GPU for graphics with an Nvidia GPU as the PPU? Since Vista, you need 2 Nvidia cards to have a dedicated PPU. PhysX could have been kept separate when they bought Ageia. Licensing could have been where they made their money, while still letting PhysX run on the old PPU cards, or on any GPU or CPU, rather than tying it to Nvidia graphics cards.
 
No, they aren't playing fair. It's called the network effect. For example, Apple's FaceTime video chat on the iPhone requires both parties to be using an iPhone.

The more people who have an iPhone, the more people will want an iPhone so they'll be compatible with everyone else. Outsiders (Android users, for example) are left out of the loop.

eBay did it, Twitter did it, etc. When the network effect works, it works amazingly well. But customers suffer.

EDIT: http://en.wikipedia.org/wiki/Network_effect
 
Exactly, virus.
I deleted part of my reply above since this sums it up better than what I wrote.

"Kanter notes that there's no technical reason not to use SSE on the PC—no need for additional mathematical precision, no justifiable requirement for x87 backward compatibility among remotely modern CPUs, no apparent technical barrier whatsoever. In fact, as he points out, Nvidia has PhysX layers that run on game consoles using the PowerPC's AltiVec instructions, which are very similar to SSE. Kanter even expects using SSE would ease development: "In the case of PhysX on the CPU, there are no significant extra costs (and frankly supporting SSE is easier than x87 anyway)."

So even single-threaded PhysX code could be roughly twice as fast as it is with very little extra effort.

Between the lack of multithreading and the predominance of x87 instructions, the PC version of Nvidia's PhysX middleware would seem to be, at best, extremely poorly optimized, and at worst, made slow through willful neglect. Nvidia, of course, is free to engage in such neglect, but there are consequences to be paid for doing so. Here's how Kanter sums it up:

The bottom line is that Nvidia is free to hobble PhysX on the CPU by using single threaded x87 code if they wish. That choice, however, does not benefit developers or consumers though, and casts substantial doubts on the purported performance advantages of running PhysX on a GPU, rather than a CPU.(hence the question about Nvidia selling lies)

Indeed. The PhysX logo is intended as a selling point for games taking full advantage of Nvidia hardware, but it now may take on a stronger meaning: intentionally slow on everything else.
"
 
It really does look like dirty tactics, gimping their code on non-Nvidia hardware. However, they did have the option of not letting it work at all, which is what most companies do, so it makes sense that they let it work somewhat. The reason: if they completely dropped PhysX support [for non-Nvidia hardware], fewer developers would use PhysX in their games, since fewer people would be able to play them. Imagine a game that doesn't support ATI cards at all. [Very roughly] half of gamers couldn't play it, and thus wouldn't buy it. So that would be a terrible idea on Nvidia's part.

Right now, they give you an advantage for running an Nvidia GPU. If they can lock in a large enough portion of customers (say, Nvidia outnumbering ATI 3 to 1), they may pull the noose and make it completely incompatible with ATI, which would have a snowball effect, putting ATI out to pasture.

However, like Tee Kyoo Em pointed out, the developer of a game can tone down the PhysX effects to be just a complement, so a strong CPU is not required to run the PhysX code.
 
That's more relevant if you're looking at a specific market. ATI gets quite a bit of its business from people unwilling to spend £300 on a graphics card. It wasn't until last year that you could even spend that much on a single ATI card.

Just because 'everyone does it' and it's a viable business scheme doesn't mean it's fair or right. In the same way that Valve puts its customers first, I feel PhysX is not a reason to purchase an Nvidia card. Encouraging business that's unfair to me is a silly idea.

Basically, I am a consumer, and I don't give a **** whether it's a good tactic for them; it puts me at a disadvantage.
 
I hope Nvidia does something about this 'scandal' and improves their CPU support; after reading the following, as an ATI-using graphics whore, my heart is broken:

Code:
Mafia 2 system requirements

MINIMUM SYSTEM REQUIREMENTS
    Operating System: Microsoft Windows XP (SP2 or later) / Windows Vista / Windows 7
    Processor: Pentium D 3Ghz or AMD Athlon 64 X2 3600+ (Dual core) or higher
    RAM: 1.5 GB
    Video Card: nVidia GeForce 8600 / ATI HD2600 Pro or better
    Hard Disc Space: 8 GB
    Sound Card: 100% DirectX 9.0c compatible sound card
    Peripherals: Keyboard and mouse or Windows compatible gamepad

    RECOMMENDED SYSTEM REQUIREMENTS
    Operating System: Microsoft Windows XP (SP2 or later) / Windows Vista / Windows 7
    Processor: 2.4 GHz Quad Core processor
    RAM: 2 GB
    Video Card: nVidia GeForce 9800 GTX / ATI Radeon HD 3870 or better
    Hard Disc: 10 GB
    Sound Card: 100% DirectX 9.0c compliant card
    Peripherals: Keyboard and mouse or Windows compatible gamepad

    PHYSX/APEX ENHANCEMENTS SYSTEM REQUIREMENTS
    Operating System: Microsoft Windows XP (SP2 or later) / Windows Vista / Windows 7
    Minimum Processor: 2.4 GHz Quad Core processor
    Recommended Processor: 2.66 GHz Core i7-920
    RAM: 2 GB

    Video Cards and resolution: APEX medium settings
    Minimum: NVIDIA GeForce GTX 260 (or better) for Graphics and a dedicated NVIDIA 9800GTX (or better) for PhysX
    Recommended: NVIDIA GeForce GTX 470 (or better)

    Video Cards and resolution: APEX High settings
    Minimum: NVIDIA GeForce GTX 470 (or better) and a dedicated NVIDIA 9800GTX (or better) for PhysX
    Recommended: NVIDIA GeForce GTX 480 for Graphics and a dedicated NVIDIA GTX 285 (or better) for PhysX
    NVIDIA GPU driver: 197.13 or later.
    NVIDIA PhysX driver: 10.04.02_9.10.0522. Included and automatically installed with the game.
 
What PC games in the pipeline that utilize half the power of these GPUs will actually take advantage of PhysX? The market isn't there. It's not really a loss for consumers or ATI. Nvidia, however, appears to have ****ed up.
 
Is PhysX even desirable in multiplayer? How would the physics calculations be done, on the client or the server? It's fine as long as the PhysX debris is visual-only and has no effect on the gameplay itself (i.e. you can't get killed by a piece of flying shrapnel); otherwise it would generate more useless data to clog up the network connection. It's hard enough as it is to find a decent server that doesn't spaz out every 120 seconds.
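
If it stays client-side only, it could look something like this (a rough sketch, all names made up): cosmetic debris kept entirely out of the replicated state, so it never touches the network.

Code:
// Hypothetical split between networked and local-only state.
struct ReplicatedState {    // simulated on the server, sent to every client
    float player_pos[3];
    float projectile_pos[3];
};

struct CosmeticDebris {     // simulated locally, never sent over the wire
    float pos[3];
    float vel[3];
    float lifetime;
};

// Client-side tick: this debris can't kill anyone, so it costs no bandwidth.
void debris_tick(CosmeticDebris* debris, int n, float dt) {
    for (int i = 0; i < n; ++i) {
        debris[i].vel[1] -= 9.81f * dt;               // gravity
        for (int k = 0; k < 3; ++k)
            debris[i].pos[k] += debris[i].vel[k] * dt;
        debris[i].lifetime -= dt;                     // expires eventually
    }
}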
 