I have some particles filling a room with smoke & fire. Since filling the whole level with particles would make the framerate unacceptable, I'm using fog to make the bulk of the level appear to be smoke-filled.
I want the particles to appear only as the player gets within a certain range of them. Since the particles themselves aren't affected by the fog, this is my way of simulating that they are. I've set up a distance where the particles start to appear and a range over which they slowly fade into view.
I calculate what I call an "alpha_factor" variable based on the distance of the camera from the particles, and use it as a global value that scales the opacity of the whole effect. As the particles are created, the alpha_factor is worked into any lines that adjust a particle's alpha. This affects the particles in a general way, but still lets them adjust their own alphas over their lifetimes. So if alpha_factor works out to 0.20, the entire particle effect is drawn at 20% of its normal opacity. Surprisingly, this part is working very well.
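Roughly, the idea looks like this (a simplified sketch; the names and constants are made up for illustration, not my actual code):

```c
#include <math.h>

/* Illustrative sketch only -- FADE_START, FADE_END, Particle, etc.
   are placeholders, not my real engine code. */

#define FADE_START 400.0f  /* distance where the particles begin to appear */
#define FADE_END   200.0f  /* distance where they are fully faded in */

typedef struct {
    float base_alpha;  /* the alpha the particle would normally use */
    float alpha;       /* the alpha actually used for rendering */
} Particle;

/* One global factor for the whole effect, based on how far the camera
   is from the emitter: 0.0 beyond FADE_START, 1.0 inside FADE_END,
   linear in between. */
float compute_alpha_factor(float cam_x, float cam_y, float cam_z,
                           float fx, float fy, float fz)
{
    float dx = fx - cam_x, dy = fy - cam_y, dz = fz - cam_z;
    float dist = sqrtf(dx * dx + dy * dy + dz * dz);

    if (dist >= FADE_START) return 0.0f;
    if (dist <= FADE_END)   return 1.0f;
    return (FADE_START - dist) / (FADE_START - FADE_END);
}

/* Every line that would normally set a particle's alpha gets the
   factor multiplied in, so alpha_factor = 0.20 leaves the whole
   effect at 20% of its normal opacity while the particles can still
   adjust their own alphas over their lifetimes. */
void apply_alpha_factor(Particle *p, float alpha_factor)
{
    p->alpha = p->base_alpha * alpha_factor;
}
```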
The problem is that when the entire particle effect (we're probably talking 1000 individual particles here) is set to a low alpha value and is only faintly visible, there is what looks like a lot of dithering.
The dithering doesn't look too bad if I set the resolution high, but I need to keep it around 800x600 because of framerate issues. What makes it look so bad is that I can see lots of tiny black pixels, and at the lower resolution they really stand out.
Is there any way to remove this dithering?
I'm running in 32-bit mode and the particle bitmaps are 24-bit images, so I can't figure out where it's coming from.
Thanks,
Ron