Why would you want to "rip its low-level multithreaded guts out" anyway?
I'm pretty sure Havok wrote their code with synchronization with developers' code in mind. Otherwise, if it were as you said, it would be useless and every developer would need to "rip it out"...
And as far as...
Well, according to ATI, their X1900 (the best card at the time of the article) can calculate physics about 9x faster than the dedicated PhysX card. They also mentioned that the X1600 (a rather cheap card these days) was able to beat the PhysX card as well. But it's yet to be...
I do read tech articles, thanks. And why would Valve go out of their way to add multicore support to the Havok engine (which they use) when Havok already has???
I'm not talking about the AI in this article, my friend... I'm sure they've made the Source engine support multicore CPUs, however...
I'd say it's Havok that wrote the multi-CPU code... their website says they've made updates to support multi-CPU physics calculations. I'm assuming Valve has just used the updated code for Ep2.
And yes I do have a dual-core CPU :)
Does anyone know if Ep2/TF2/Portal etc. are going to be able to use the GPU physics that Havok now supports?
I've got a spare X1600 now that I've upgraded to a 2900XT...
I had the same problem with my PC, which was:
Athlon64 3000
1GB RAM
Radeon X800GT
Onboard Sound
Then I upgraded the RAM to 1.5GB... problem solved (much smoother too)
I would also check to make sure he's dedicating enough memory to the video card. If he's only running with 8 or 16MB, that may not be enough for HL2 (I'm not sure what the minimum is these days for the integrated video on Intel chipsets). If he's running with 32 or 64MB...
...And a perfect example of pointless crap is the useless post you just made...
HL2's physics rip off real life?? You've got to be kidding me... you just said that so you'd sound like you knew something, right?
Developers taking ideas from other games is bad...
Developers taking ideas from other games and EXPANDING on them is good...
From the sounds of it, id is doing the former... which doesn't really surprise me, as other than the new graphics engine, Doom 3 didn't really have anything new to...
Honestly, the G450 is a rubbish card for games, and Matrox's drivers aren't what you'd call "updated" to handle the latest games.
I'd suggest you either dump the G450 and go with one of ATI's or Nvidia's dual-head solutions, or... well... yeah ;)
It really depends on the graphics core, not the amount of RAM on the card.
The GeForce4 MX was really a souped-up GeForce2 MX. It ran fine for its time but was only DirectX 7 capable, yet you can still buy them now with up to 128MB of RAM onboard. It's quite a FAST DirectX 7 card, so...
There's also Cool'n'Quiet from AMD... that could be what's causing the CPU to be listed as 800MHz (although it should ramp up to its normal speed while in-game).
Try going into your BIOS, check for the Cool'n'Quiet setting, and try turning it off if it's on.
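If you want to confirm whether the CPU is actually ramping up, one way (assuming Windows XP or later, where the WMIC tool is included) is to run this from a command prompt:

wmic cpu get CurrentClockSpeed,MaxClockSpeed

If CurrentClockSpeed sits around 800 at the desktop but climbs toward MaxClockSpeed while the game is running, Cool'n'Quiet is behaving normally rather than stuck at the low multiplier.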
You may get held back by the RAM... if you've got a 1.2GHz P4 (was there ever such a beast?? I thought the slowest was a 1.4?), then you're running either PC100/PC133 SDRAM or RAMBUS memory... either way you should really swap it out...
For $200-$250 I'd do the following:
- AthlonXP Mainboard...
I believe they were throwing around ideas of what they would like to "do" in a game and built the engine based on what they wanted a game to do. They weren't just thinking "ooooh, let's have a physics engine so we can watch ragdolls!"; they actually wanted to do something meaningful in the...
Who knows how much money they've made... but I know that if I had forked over $40 million of my own money and spent six years creating something, I'd sure as hell want a fair bit in return...
I'm not saying you're a warez monkey... but if that's how they think, then that's completely wrong...