Quote:
time_step is the duration between frames in ticks.
one tick = 1/16 of a second.


So a framerate of 10 fps would work out to 160 ticks per second, but a framerate of 100 fps would work out to 1600 ticks per second.
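
To spell out the arithmetic I'm doing (just a rough sketch, so it's clear where I might be going wrong): I'm multiplying the frame rate by the 16 ticks that make up a second. No engine calls here, only arithmetic, and TICKS_PER_SECOND is just a name I made up.

Code:
# Rough sketch of my arithmetic: multiply the frame rate by the
# 16 ticks that make up one second.
TICKS_PER_SECOND = 16  # one tick = 1/16 s

for fps in (10, 100):
    print(f"{fps} fps -> {fps * TICKS_PER_SECOND} ticks per second")
    # prints: 10 fps -> 160 ticks per second
    #         100 fps -> 1600 ticks per second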

If that's the case, any duration measured with time_step would be totally dependent on CPU or graphics muscle. Or so it appears to me.

I must be missing something, because on my computer both methods give the same result regardless of frame rate.
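
For reference, here's a minimal, engine-free sketch of the kind of comparison I mean. It fakes frames with time.sleep and measures the same stretch of time two ways: once with the wall clock and once by accumulating a per-frame time_step expressed in 1/16-second ticks. That reading of time_step is my own assumption, and run_test is just a made-up helper, not anything from the engine.

Code:
import time

TICK = 1.0 / 16.0   # one tick = 1/16 of a second

def run_test(fps, frames):
    """Time `frames` simulated frames at roughly `fps`, two different ways."""
    wall_start = time.perf_counter()
    last = wall_start
    ticks_accumulated = 0.0
    for _ in range(frames):
        time.sleep(1.0 / fps)                  # stand-in for drawing one frame
        now = time.perf_counter()
        time_step = (now - last) / TICK        # this frame's duration, in ticks
        ticks_accumulated += time_step
        last = now
    wall_elapsed = time.perf_counter() - wall_start
    tick_elapsed = ticks_accumulated * TICK    # ticks converted back to seconds
    print(f"{fps:3d} fps: wall clock {wall_elapsed:.2f} s, "
          f"time_step total {tick_elapsed:.2f} s")

run_test(10, 20)    # about 2 seconds of frames at 10 fps
run_test(100, 200)  # about 2 seconds of frames at 100 fps

Both totals come out the same for me at 10 fps and at 100 fps, which matches what I'm seeing in the actual program.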