I'm using A5.51.
I have just run an internet test of a network game that runs fine over a 10 Mbit LAN. I have worked extensively on reducing the amount of data sent to a minimum, by using client-side functions and the like (roughly as sketched below).
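To give an idea of what I mean by client-side functions, the pattern is along these lines (a stripped-down sketch, not my actual code; the function name and skill are placeholders):

    // purely cosmetic behaviour runs on the client only, so the server
    // never has to broadcast entity updates for it
    function local_effects()
    {
      if (connection == 2) // 2 = this machine is a client
      {
        // spawn and animate local eye candy here;
        // nothing about these entities crosses the wire
        while (my.skill1 < 100)
        {
          my.skill1 += 2 * time;
          wait(1);
        }
        ent_remove(me);
      }
    }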
Now here's what happened over the internet connection: the frame rate on both the server and the client dropped dramatically, to about 4 FPS on the server and about 10 FPS on the client. The frame rates are normally 60+ when running over a LAN, and I just don't get it. Have I missed something? I have set pos_resolution = 0 to help speed it up, but no change.

I also tried the connection the other way round, with my most powerful machine running the server, and it still dropped to 10 FPS on the server, with the client running at about 20 FPS max. I don't understand why the whole thing pretty much dies on me as soon as a client connects. The server shouldn't be affected by the client's speed, should it?
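For reference, the only engine-side tuning I've applied so far is this, set in the main function before the session is opened (the comment is my reading of the manual, which may well be off):

    // supposed to coarsen the resolution of the position updates the
    // engine sends for entities, and so shrink each update packet
    pos_resolution = 0;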
Now, the connection I was using had a high latency of around 800+ ms, which I expected to affect the results, but I definitely did not expect this kind of effect on the server.
For a general idea of the data being sent: the values in the debug panel max out at about 450 over a LAN connection, but I did not get very far over the internet, where they maxed out at about 80 (before the game even starts).
Is this normal (I hope not), or am I missing some vital ingredient?
Thanks