You have a third-party client that acts as the run loop for the server and updates it over the wire. This has several disadvantages. First, your system is no longer self-contained, and you have gained a lot of single points of failure. The second problem is latency, which you always have to deal with in a networked environment, but yours is entirely avoidable. The third problem is concurrency, and this is the one you will already feel in lab tests (the other two will most likely only show up in a real-world environment). Once clients are logged in you will face many concurrency issues, ranging from wrongly updated labels to cheating opportunities for users.
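To make the alternative concrete, here is a minimal sketch (all names and the tick rate are my own assumptions, not taken from your design) of a server that drives its own fixed-rate run loop and serializes state changes with a lock, instead of letting a remote client pump it over the wire:

```python
import threading
import time

class GameServer:
    """Hypothetical sketch of an authoritative, self-driving game server."""

    TICK_RATE = 20  # simulation ticks per second (assumed value)

    def __init__(self):
        self.state = {"players": {}}
        self.lock = threading.Lock()  # serializes access from client-handler threads
        self.running = True

    def apply_input(self, player_id, move):
        # Called from client-handler threads; the lock prevents the
        # "wrongly updated label" style of race an external driver invites.
        with self.lock:
            players = self.state["players"]
            players[player_id] = players.get(player_id, 0) + move

    def run(self, max_ticks=None):
        # The server ticks itself at a fixed rate; it never waits for a
        # third-party client to push the next update over the network.
        tick = 0
        interval = 1.0 / self.TICK_RATE
        while self.running:
            start = time.monotonic()
            with self.lock:
                tick += 1  # advance the simulation one step under the lock
            if max_ticks is not None and tick >= max_ticks:
                self.running = False
            remaining = interval - (time.monotonic() - start)
            if remaining > 0:
                time.sleep(remaining)
        return tick
```

With this shape, client connections only feed inputs in; the simulation itself keeps running at full rate even if every client drops.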

The second big mistake is the choice of PHP. While PHP is great for dynamic websites, it is unbelievably slow for this use case and costs far too much performance: every update pays the full request cycle of interpreter startup, database connect and teardown. What you are doing amounts to "hey everyone, please DDoS my server. kthxbye", and it won't take many clients to bring your server down.
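By contrast, a single long-lived process keeps the game state resident in memory and answers every client over a persistent connection. This is only an illustrative sketch (the counter and port handling are invented for the example, and it handles exactly one client), but it shows state surviving across requests, which a per-request PHP script cannot do:

```python
import socket
import threading

def start_server():
    """Hypothetical sketch: one persistent process, state held in memory."""
    counter = {"hits": 0}
    srv = socket.create_server(("127.0.0.1", 0))  # OS picks a free port
    port = srv.getsockname()[1]

    def loop():
        conn, _ = srv.accept()
        with conn:
            while True:
                data = conn.recv(64)
                if not data:
                    break
                counter["hits"] += 1  # state survives between requests
                conn.sendall(str(counter["hits"]).encode())
        srv.close()

    threading.Thread(target=loop, daemon=True).start()
    return port
```

No interpreter spin-up, no reconnecting to the database per update: the process answers "1", then "2", from the same in-memory state.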

The third big mistake is scalability: one day you simply can't scale up any further without hitting really bad bugs, mostly because of the external run loop. You might also want to reconsider the choice of MySQL. While it is great and well tested (someone ran into any given problem long before you did), PostgreSQL, for example, might be far better suited.

There are many, many more mistakes in your design, even if it is just a rough "let's do it this way and figure out the details on the fly" sketch (which is itself a big mistake). But I'm not going to list all of them for a project that will be canceled in a few weeks, so if you want to hear more, or an actual proposal for a new infrastructure with a protocol, write me a PM and tell me what you are ready to pay for it.

In the meantime I have to stay with my conclusion:
Looks like the worst and most inefficient "protocol" ever.

Last edited by JustSid; 06/18/11 10:31.
