I am wondering how much bandwidth a Mumble server uses over the span of a month. I have a dedicated Linux server with 10,000 GB of bandwidth/month.
In their FAQ they have this formula and information:
Worst case scenario: Number of users × Number of talking users × 133.6 kbit/s. With less aggressive quality settings, it's ~60 kbit/s, and the bare minimum is 17.4 kbit/s. Note that Mumble is geared towards social gaming; its quality enables people to talk naturally to each other instead of just barking short commands, so the amount of "users talking at the same time" can be somewhat higher than expected. This means that a server with 20 players and 2 players talking at once requires 1-3 Mbit/s, depending on quality settings. In the server's .ini file, you can specify the maximum allowed bitrate for users as well as the maximum number of clients to allow.
But I am not sure how to calculate the monthly usage from that formula. Say I use their example on the high end, 3 Mbit/s. Since 100 Mbps = 12.5 MB/s, 3 Mbit/s works out to 0.375 MB/s, which is 1.35 GB/hour, about 32.4 GB/day, or roughly 972 GB per 30-day month.
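To double-check myself, here is how I'd sketch the math in Python (the helper names are my own, assuming the FAQ's users × talking users × bitrate formula, a 30-day month, and decimal GB):

```python
def server_bandwidth_kbit(users, talkers, bitrate_kbit=60.0):
    """Mumble FAQ worst-case formula: users x talking users x per-user bitrate (kbit/s)."""
    return users * talkers * bitrate_kbit

def monthly_gb(bandwidth_kbit_s, days=30):
    """Convert a sustained kbit/s rate into GB transferred over a month."""
    seconds = days * 24 * 3600
    bits = bandwidth_kbit_s * 1000 * seconds
    return bits / 8 / 1e9  # bits -> bytes -> decimal GB

# 20 users, 2 talking, ~60 kbit/s quality setting
rate = server_bandwidth_kbit(20, 2, 60.0)  # 2400 kbit/s = 2.4 Mbit/s
print(monthly_gb(rate))                    # ~777.6 GB/month sustained
print(monthly_gb(3000))                    # 3 Mbit/s sustained -> 972.0 GB/month
```

So a constant 3 Mbit/s for a full 30-day month would be about 972 GB, well under my 10,000 GB cap.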
That may be right, I'm not 100% sure of my calculations, but it doesn't seem like one server should use that much bandwidth (granted, that assumes 20 people connected with 2 people talking at all times).
Does this seem right, or am I missing something? Can anyone with experience running this kind of thing give me some insight?
Thank you