The important part is this:
Code:
if(my.movement_input == 1)
{
	// quants per millisecond, multiplied by the latency in ms, gives the distance to move ahead
	temp_speed[0] = (((16 / time_step) / 1000) * (my.movement_speed * time_step)) * latency_time;
	temp_speed[1] = 0;
}
c_move(me, vector(temp_speed[0], temp_speed[1], temp_speed[2]), NULLVECTOR, GLIDE);


This is the code that predicts where the entity should be for the given "latency_time": it simply moves the entity forward by the distance it would have covered during that latency.

I checked the math; although there is an easier way to get quants/ms, this works out correctly. I don't understand where the problem is, because I even include the actual latency time to correct for lag. I'm testing everything at 7 ms latency (on one computer).
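For reference, here is how I read the units (just a sketch; it assumes time_step is the last frame's duration in ticks, 16 ticks = 1 second, and that latency_time is in milliseconds):

Code:
// (16 / time_step) / 1000     -> frames per millisecond
// movement_speed * time_step  -> quants covered per frame
// their product               -> quants per millisecond
// The time_step factors cancel, so the expression is equivalent to:
var quants_per_ms = my.movement_speed * 16 / 1000;   // = movement_speed * 0.016
temp_speed[0] = quants_per_ms * latency_time;        // quants to move ahead for the given latency

This is the "easier way to get quants/ms" I mentioned; it should produce the same number as the long form.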
The client moves locally through c_move; the server also moves the client's entity with c_move to catch up and to use more accurate collision detection, and it uses prediction to send the 'next' expected position.
So I'm thinking of including the prediction function on the client as well, and interpolating between the actual client position and the expected future server response (see the sketch below).
This is all without changing pan angles or anything: simply moving the entity forward on client and server simultaneously, and using this function to predict where the client should be by the next update sent. I will also tune the time_frame variable to get an average frame time to use for time_step.
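Something like this is what I have in mind on the client (only a sketch; server_predicted_pos and correction_rate are placeholder names, not part of the current code):

Code:
VECTOR server_predicted_pos;  // filled in whenever a server update arrives
var correction_rate = 0.1;    // fraction of the remaining gap closed per tick (to be tuned)

// assumed to run inside the player's entity action, so 'my' is valid
function smooth_to_server_prediction()
{
	VECTOR diff;
	// gap between the server's predicted position and the current client position
	vec_diff(diff, server_predicted_pos, my.x);
	// close only a fraction of that gap each frame instead of snapping
	vec_scale(diff, correction_rate * time_step);
	vec_add(my.x, diff);
}

Feeding diff into c_move instead of adding it to the position directly would keep the collision detection, but this shows the idea.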

Maybe even adding a 'speed-up' effect on the client could reduce the error introduced by the engine.
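For the 'speed-up' idea, I picture something like this (again just a sketch; base_movement_speed, server_predicted_pos and the 25% cap are made up for illustration):

Code:
// Scale the client's movement speed up a little while it is behind the server's
// predicted position, so it catches up through its normal movement code instead
// of being warped. Assumed to run in the player's entity action.
var gap = vec_dist(my.x, server_predicted_pos);   // distance still to catch up, in quants
var boost = clamp(1 + gap * 0.01, 1, 1.25);       // never more than 25% faster
my.movement_speed = base_movement_speed * boost;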

I don't know why it gives wrong results, because in theory it should be extremely accurate.

PLEASE, HELP! grin


Extensive Multiplayer tutorial:
http://mesetts.com/index.php?page=201