I'm trying to achieve camera movement that decreases in speed the closer the camera gets to its optimal position above the player model. The obvious way to do this is to subtract a percentage of the remaining distance from the camera's position each frame:

camera.x -= .02 * (camera.x - intended.x);
camera.y -= .02 * (camera.y - intended.y);
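// each frame this leaves 98% of the remaining offset, so with a fixed target
// only pow(0.98, n) of the original offset remains after n frames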

This is nice and smooth, but the camera moves faster on faster hardware, and simply multiplying the percentage by time_step doesn't work because the movement is exponential (subtracting a percentage twice as big at half the frequency yields a different result). Any ideas on how to factor time_step into an exponential increase or decrease?
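To make the mismatch concrete, here's a quick throwaway check (the 100.0 starting offset and the variable names are just made up for the example): two 2% steps leave 0.98 * 0.98 = 96.04% of the offset, while a single 4% step leaves 96%.

#include <stdio.h>

int main(void)
{
    double offset = 100.0;                          // arbitrary starting distance to the target

    double two_small_steps = offset * 0.98 * 0.98;  // two updates at 2% each
    double one_big_step    = offset * 0.96;         // one update at 4%

    printf("%f vs %f\n", two_small_steps, one_big_step);  // prints 96.040000 vs 96.000000
    return 0;
}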
thanks people!

Btw, this problem pops up in several places in my project; the camera scenario is just the easiest way to explain it.
