Originally Posted By: PHeMoX
I have no intention of arguing with you about numbers, percentages and what more, but those graphs show how the GPU in those games become the bottleneck, not how much a multi-core increases performance in those games.

As said using a Q6600 or a Core 2 Duo 6600 to run Crysis makes a lot of difference. I could list many many more games in which I've experienced myself that it matters, but I feel it would be pointless.

Well, I don't understand what you're trying to prove with this. You say a quad core gives you more performance, yet at the same time you admit that a former high-end GPU already limits today's games, and you still don't think it's a good idea in such a situation to invest more into the GPU if you're a gamer? Quite contradictory...

Originally Posted By: PHeMoX
As far as this game is concerned.. it's obvious it's not optimized for quad cores, but more so for single or dual cores... so yeah, doh, not much benefit of quad cores there. It's still pretty likely that the game will run better on a quad core still as background processes could run on different threads and so on.

I didn't mention Call of Duty 4 at all, so why are you arguing about it? Now that you bring it up, it actually proves my point that few state-of-the-art games have quad core support...

And even for the tests where the quad core comes out ahead: do you know how this was tested? They simply deactivated some cores of a quad core, so the clock stayed the same. In reality (without overclocking to the same level) the difference will be smaller, because dual cores ship with higher clocks. With that said, and looking at it price for price, you can get an E6850 clocked at 3 GHz for the same money as a Q6600 clocked at 2.4 GHz (or an E8400 with the small Penryn advantage on top, making it a few percent faster per clock). Since today's games gain more from higher clocks than from additional cores, you'll mostly find dual core recommendations in hardware forums for a situation like this one, where you're a gamer on a budget...
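To make the clock-versus-cores point above concrete, here is a minimal sketch with purely assumed numbers (the 40% parallel fraction is made up for illustration, not measured from any real game):

```python
# Rough sketch, assumed numbers only: why a higher-clocked dual core can beat
# a lower-clocked quad core when a game is only partly multithreaded.

def effective_speed(clock_ghz, cores, parallel_fraction):
    """Relative throughput: clock scaled by an Amdahl-style speedup."""
    speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
    return clock_ghz * speedup

p = 0.4                               # assumed parallel fraction of the engine
e6850 = effective_speed(3.0, 2, p)    # dual core @ 3.0 GHz -> ~3.75
q6600 = effective_speed(2.4, 4, p)    # quad core @ 2.4 GHz -> ~3.43
print(f"E6850 ~ {e6850:.2f}   Q6600 ~ {q6600:.2f}")
# With these assumptions the dual core wins; an engine threaded much more
# heavily (p close to 1) would tip the result towards the quad core.
```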

Originally Posted By: PHeMoX
A good quad core really isn't all that expensive anymore. A good 3D card will easily be far more expensive if you choose right.

In this example the recommendation is to choose a dual core from the E-series (or an AMD alternative if preferred) and go for an HD4870 instead of the HD4850. That makes more sense for a gamer, as you indirectly admitted...

Originally Posted By: PHeMoX
Because it won't run.. simple as that. A HD2900XT, Core 2 Duo and 350Watt power supply WON'T boot. I know this, because I've actually thought the same thing, tried it and found out I had to buy a bigger PSU. You're pretty stubborn aren't you?

Also, why else do you think that Nvidia and AMD/ATI both recommend much higher PSUs, especially with Quad Core high-end PCs and multiple graphics cards... even for single card setups the bare minimum is about 450Watt,

Well, first of all: a power supply being rated at 350 watts (or even more) doesn't tell you whether a system like the one you mention will boot at all. There are other very important specs, above all the 12V rail and how many amperes it can deliver (and the number of rails if there is more than one). So tell me which power supply you tried with that system and give its specs...

That's also why you see such high wattage recommendations for the graphics cards: modern cards really do have much higher requirements (especially on that 12V rail I mentioned), so the manufacturers had to make sure nobody tries to run a new card on an "ancient" power supply, and they quote high wattage numbers because supplies of that size will most likely deliver what's actually needed. You can also see this when you compare what people were told to have for SLI with two 6800 GTs back then and what they are told nowadays: although the power consumption of the GPUs really has gone up, there is little to no difference between the recommendations (I don't know the exact numbers people were told back then). You can also look up power supply threads in forums: with old supplies, apart from the wattage they can deliver, the question about the 12V rail also comes up (or at least it should). Today you'll rarely find people with such an issue, but back then you really had to ask whether the 12V rail could deliver around, let's say, 18-20 A. If not, the system most likely wouldn't boot even if the supply was rated at 500 watts or more...
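For illustration, a quick back-of-the-envelope check of the 12V rail (all component figures here are assumptions picked for the example, not measurements of any specific PSU or card):

```python
# Rough sketch with assumed numbers: the wattage on the label vs. what the
# 12V rail can actually deliver (P = V * I).

def rail_watts(volts, amps):
    """Power a single rail can deliver, in watts."""
    return volts * amps

label_watts  = 350                       # what's printed on the PSU
rail_12v     = rail_watts(12.0, 15.0)    # assumed weak 12V rail: 15 A -> 180 W
assumed_load = 150 + 65 + 15             # rough 12V draw: GPU + CPU + drives/fans

print(f"Label: {label_watts} W, 12V rail: {rail_12v:.0f} W, "
      f"estimated 12V load: {assumed_load} W")
print("Enough on the 12V rail?", rail_12v >= assumed_load)
# A rail with 20 A (240 W) would cover this example load, while the 15 A rail
# clearly doesn't - even though both supplies could carry "350 W" on the label.
```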

Apart from that, I'm not stubborn: not only does my system prove you wrong, I've also been a moderator on a hardware board for some years and have recommended dozens of systems (if not hundreds by now) with dimensions like these, and guess what: not once did a bad review like "this system doesn't boot" come back. Simply because you have to know which specs and recommendations to trust and which not. Power supplies were one of the trickier topics, because people not only fell into the "watt delusion" (more watts = better, as the manufacturers advertise) but also didn't give the power supply much attention, allotting it far too small a budget and buying "junk" with big wattage numbers printed on it (which it couldn't actually deliver)...

So you see, my knowledge isn't based on rumors but on facts, and it has been proven many, many times. Just don't believe everything you're told; there is so much "bullshit" out there. Products get tagged as "silent", people compare the dB figures the manufacturers publish to judge how loud something really is... the list goes on and on, and I think you know some other examples of this yourself...

Originally Posted By: Dan Silverman
Huh? I have not run into a single problem running any of my 32-bit components under Vista 64-bit. I have an older Canon scanner, an older Canon printer, a Netgear wireless WiFi PCI card, etc and the drivers are all 32-bit for them. However, they function quite fine. That is not to say that there isn't something somewhere out there that will refuse to run, but I have not had this problem.

Well, you're lucky then. You'll find lots of people using old (well, "old" depends on how you define it) pieces of hardware who just can't get their stuff to run under Vista 64-bit, and that's why my standard recommendation here is still to check whether all your hardware will run, and if not, whether you really need a 64-bit Vista at all...

Enjoy your meal
Toast