Just out of curiosity...you wouldn't be testing the game/demo on DIFFERENT generations of cards, would you?...like say, a DX9 ATI card, supporting those pixel shader effects and a DX7-8 NVIDIA card that doesn't?
ATI Radeon 9500+ and NVIDIA GeForce FX+ = DX9 cards...the ones before that aren't.
Re: and again : nvidia vs. radeon
[Re: iSO_BigD]
#40361 03/30/05 19:54
From my reading up on my Radeon 9800 I learned that it supports 2.0 shaders, so with much anticipation I tried some of the new shaders presented here...
Now OK, in all honesty it's most likely that I have done something wrong, but it seems that most of these cool shaders just give me a black object and don't seem to run.
I wonder why this is? Maybe the 9800 isn't really 2.0? Or is there some kind of difference between implementing an ATI shader and an NVIDIA one? That would explain why game developers tend to pick one over the other....
The Art of Conversation is dead : Discuss
Re: and again : nvidia vs. radeon
[Re: indiGLOW]
#40362 03/30/05 20:16
Quote: ah.. so I think if I want to create shaders seriously I'll have to buy an NVIDIA card as well. OK, which one is the current state of the art? Something like a 6800?
6800 cards come as 6800LE, 6800, 6800GT, and 6800 Ultra (in increasing performance and price). They all support VS/PS 3.0 and differ only in clock speed, memory, bandwidth, and pipeline count. For testing purposes a 6800LE might suffice; for regular use the 6800 or 6800GT is a good deal.