No, I did not say that. I said that in these particular game tests the GPU clearly bottlenecks the whole outcome, making it impossible to see a difference between quad-core and dual-core setups.
Well, I'm slowly losing your train of thought on all this. First you bring up Crysis as a good example of quad-core support, which is probably the most GPU-limited game out there, and now you "moan" about GPU limitations. That actually proves my point that quad cores currently don't make much sense just for gaming, and that in this special case a dual core + stronger GPU is the better solution...
You've posted the links, not me. Anyway, Call of Duty 4 is DirectX 9 and optimized to scale well. It may look incredible and run/play great, but it's not a state-of-the-art game in the sense of demanding the maximum from a system; it was actually designed around a high-end single-core system.
Well, the page must be malfunctioning in your browser then, because the only thing I can think of now is that the link to Assassin's Creed doesn't work correctly for you, since CoD 4 is directly beneath it. If that's all you're arguing about, well, I can't help it that they put multiple benches below each other on one page, but it should be clear which benches I pointed out...
This is downright false; it has increased hugely.
Well to prove you wrong:

This is from my old Asus A8N-SLI Premium mainboard, which is now over three years old, from the time when SLI first showed up. So you see, the official recommendation is not only 500 watts and more, but they also mention the 12V rail (I think I always wrote "trail" until now; that was wrong). Now, looking at the consumption of two 6800 Ultras, that's nearly as much as a single 4870 draws. Still, the recommendation for a new mainboard just says 600 watts for a fully equipped system (I hope you trust me without an image quote), although the consumption has pretty much doubled and although nowadays there are even two GPUs on one PCB. So you see, there's not much to those vendor recommendations...
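To put rough numbers on the 12V-rail point: the sticker wattage matters less than how many amps the cards actually pull from the 12V rail. A minimal sketch (all per-card wattage figures below are my own rough assumptions for illustration, not official specs):

```python
def rail_current(watts_on_12v):
    """Amps a load pulls from the 12V rail: I = P / V."""
    return watts_on_12v / 12.0

# Assumed 12V loads under gaming, in watts (illustrative guesses):
two_6800_ultra = 2 * 75   # two older SLI cards at ~75 W each
one_4870 = 150            # one newer card at ~150 W

print(f"2x 6800 Ultra: {rail_current(two_6800_ultra):.1f} A on the 12V rail")
print(f"1x 4870:       {rail_current(one_4870):.1f} A on the 12V rail")
```

With these assumed figures the two setups load the 12V rail about equally, which is why a vendor recommendation that names a rail current is more useful than one that only names a total wattage.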
Also, I tried a 500Watt before buying a 750Watt. The rather new 500Watt supply actually failed as well when I installed my second 3D card. You can't tell me this has to do with 12V rails or cables, as these two power supplies were brand new, had all the latest features, and were of the same brand with the same features except for the higher wattage on one of them. Obviously 500 watts just wasn't enough for the system to run, as 350 watts wasn't when I installed my new 3D card.
Well, as you can see now, even the vendors mention the 12V rail, and you still haven't said which power supply you used back then...
Also, I've just tried deactivating cores in several games, and lo and behold, it did NOT INCREASE PERFORMANCE as in those tests; in fact, performance decreased. Explain that to me...
Well, I don't know what you're talking about now, so I can't explain anything. Deactivating cores that are in use will of course give you lower performance. If you mean that in one or two of the benches the "single-core" CPU gets slightly higher results (although only in the tests without AA, afaik), that's indeed an interesting result. There was quite a discussion about this, but it's most likely about what happens when you deactivate cores on a quad (remember, there's cache to split up among all the cores) and you hit a situation where this does you good...
But if you're sceptical about those tests, I can link you to others if you want. I'd also like to see those 40-60% gains you were talking about proven in a normal gaming setting...
Power supplies in the past labeled 350Watt probably didn't output exactly 350 watts either.
In fact, power supplies nowadays are getting more and more efficient, losing less energy as heat. So I would have expected the 500Watt supply I used to run fine with two 3D cards. It didn't, and it wasn't a budget supply at all.
Well, apart from the fact that efficiency has rather little influence on what a power supply can provide, what I meant is that there is, for example, a "500 watt" power supply from LC Power where those "500 watts" are no lie, but the way they arrive at that number doesn't work in reality. That's also the reason why the guys at planet3dnow (at least I think it was there) don't really like those PSUs anymore: they've run tests for years, and again and again they prove that this PSU literally "explodes" when delivering 400 watts. That's what I meant: vendors not only trick with the rails but are also a bit generous with their wattage specs, at least on some of those cheap PSUs...
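The labeling trick and the efficiency point can both be shown with simple arithmetic. A hedged sketch (the per-rail limits below are hypothetical example values, not actual LC Power specs): summing each rail's individual maximum gives a big sticker number, even when the unit can never sustain all rails at maximum at once. And efficiency only relates wall draw to output; it doesn't raise the output side.

```python
# Hypothetical per-rail maxima, in watts (example values, not real specs):
rail_maxima = {"3.3V": 130, "5V": 150, "12V": 220}

label = sum(rail_maxima.values())  # the "500 W" on the sticker
combined_limit = 400               # what the unit can actually sustain at once

print(f"sticker rating: {label} W, real sustained output: {combined_limit} W")

# Efficiency: at 80%, delivering 400 W means drawing 500 W from the wall,
# but the deliverable 400 W doesn't change.
efficiency = 0.80
wall_draw = combined_limit / efficiency
print(f"at 80% efficiency, {combined_limit} W out means {wall_draw:.0f} W from the wall")
```

So a more efficient unit wastes less at the wall, but that says nothing about whether it can actually deliver its labeled wattage.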
Vista 64bit runs 32bit applications fine; however, it cannot handle x86 (32bit) drivers. Rumor has it Microsoft will change this in the near future though, as they want to push the 64bit platform more.
You mean they want to make the 64bit Vista support 32bit drivers too? That would be quite nice...
Enjoy your meal
Toast