Actually...

I think I can identify one thing that may be contributing to my confusion...

This bit:

Quote:

...with 1 tick = 1 / 16 seconds = 62.5 milliseconds. The relation between time_step and the frame rate (fps) is this:

time_step = 16 / fps...


If I'm reading it right, he's saying 1 tick = one sixteenth of a second, i.e. 62.5 milliseconds...

... but then he goes on to say time_step = 16 divided by the FPS...

... Huh?

That doesn't even make sense to me. Where is the 16 coming from, and what does it represent? It's sorta introduced out of the blue and then used as "the standard" for the rest of the explanation, with no mention of what it is or why it's being used. It seems rather arbitrary.
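The closest I can get to making the two statements line up is to guess that the 16 is meant to be the number of ticks per second (the quote never actually says that, this is just me working through the units):

1 tick = 1/16 of a second, so 1 second = 16 ticks
one frame at fps frames per second lasts 1/fps seconds
that frame length measured in ticks is (1/fps) * 16 = 16 / fps

... which would make time_step = 16 / fps the length of a frame expressed in ticks rather than seconds. But if that's the intent, the quoted text sure doesn't spell it out.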

Maybe that's what's throwing me off?


