posted September 24, 2001 03:34 PM
The client netrate is the maximum rate the client is willing to handle.
The server netrate is the maximum rate the server communicates with each client at.
If the client netrate is lower than the server netrate, the client netrate takes precedence.
If the server netrate is lower than the client netrate, the server netrate takes precedence.
If the sum of all effective netrates in the communication between server and clients exceeds what the server's network connection can handle, problems arise.
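A minimal sketch of the rule above (helper names are mine, rates in kBytes per second): the effective rate per client is the lower of the two netrates, and the server's outgoing load is the sum of those effective rates.

```python
def effective_rate(client_netrate_kb, server_netrate_kb):
    """The lower of the two netrates takes precedence (kBytes/s)."""
    return min(client_netrate_kb, server_netrate_kb)

def server_load_kb(client_netrates_kb, server_netrate_kb):
    """Total outgoing bandwidth the server must sustain (kBytes/s)."""
    return sum(effective_rate(c, server_netrate_kb)
               for c in client_netrates_kb)

# 30 clients all at 10 kB/s, server netrate also at 10 kB/s:
print(server_load_kb([10] * 30, 10))  # → 300 kB/s
```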
Yesterday we had between 25 and 30 participants. If each of those had their netrate at 10k (kBytes per second), the server would have had to prepare, pack and send up to 300 kB per second - equivalent to a 2.4 Mbit line, not counting the computing power needed to calculate that much data on the fly.
That means that either the server netrate has to be dropped for huge matches such as those we had last night (WMan said it was set to 10k, if I remember right), OR everyone has to lower their client netrate until both the server AND the line can handle the bandwidth.
Another problem is that some people tend to keep their netrate at 10k even though their internet connection can't handle 10 kBytes per second (equivalent to 80 kbit per second - more than ISDN or dialup can handle!).
Those people always cause and experience lag on populated servers.
Yet another problem is that even if your line should be able to handle the bandwidth on paper, it may in fact not be able to at the very moment you need it. Traffic on exactly those nodes your packets are routed through can cause slowdowns.
4k is a little jumpy, but pretty safe unless you're using a 28k modem.
(32 users at 4k need just over one Megabit per second of bandwidth, which a good connection can handle - and 32 is the max number of players allowed by TM, afaik.)
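A quick check of the figures above, assuming 1 Mbit = 1000 kbit and the usual 8 bits per Byte:

```python
def kb_to_kbit(kb_per_s):
    """Convert kBytes per second to kbit per second."""
    return kb_per_s * 8

print(kb_to_kbit(10))      # 80 kbit/s - beyond ISDN (64) or dialup (56)
print(32 * kb_to_kbit(4))  # 1024 kbit/s - just over one Megabit
```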