time_step is the duration of the last frame (that is, how long the computer needed to display the last frame) in !ticks!, or sixteenths of a second. Meaning: if time_step is equal to 1, the last frame took 1/16 of a second to display on the screen.
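In Lite-C terms (just a sketch to illustrate the unit conversion; the variable name is made up):
Code:
var frame_sec = time_step / 16; // duration of the last frame in seconds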

It isn't explained well in the manual...

So,
at 60 fps, time_step is roughly equal to 16/60 ≈ 0.267 ticks (16 ticks per second divided by 60 frames).
If you want to move your entity at 60 quants per second, you have to do this:
Code:
my.x += (60 / 16) * time_step; // 60 quants per second = 3.75 quants per tick


which means:
Code:
my.x += 3.75 * 0.267; // ~1 quant every frame, 60 quants over 60 frames


!This is frame rate independent in theory!
At 30 fps, moving 60 quants per second, the code stays exactly the same:
Code:
my.x += (60 / 16) * time_step; // the very same line as at 60 fps


which means:
Code:
my.x += 3.75 * 0.533; // ~2 quants every frame, 60 quants over 30 frames



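Putting it all together, a complete entity action could look something like this (just a sketch, the action name is made up):
Code:
// frame rate independent movement: 60 quants per second along x
action move_right()
{
	while (1)
	{
		my.x += (60 / 16) * time_step; // 3.75 quants per tick * ticks of last frame
		wait(1); // wait one frame
	}
}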
I hope this sheds some light on that matter.

In theory, 16 / time_step should give you the FPS (meaning "wait(16/time_step)" should wait 1 sec). I had my problems the last time I tried it, but it could have been some other piece of code in my game (it's around 20k lines, so it's hard to notice...).
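If wait(16/time_step) misbehaves for you too, a sketch of a more robust one-second wait is to sum up time_step each frame until 16 ticks have passed (function name made up):
Code:
// wait roughly 1 second by accumulating time_step;
// stays correct even if the frame rate changes mid-wait
function wait_one_sec()
{
	var ticks = 0;
	while (ticks < 16) // 16 ticks = 1 second
	{
		wait(1);
		ticks += time_step;
	}
}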