I run a small game server hosting business for a niche game. To keep costs down I rent dedicated servers, each of which runs many game server instances. Game servers can be created and deleted at any time, and I need to decide which machine a game server will be installed on when it's created. After a server is created, it can be transferred to another machine if the admin wishes.
I'm working on improving the backend, and the main concern I have is with allocating game servers to machines. I currently use a naive algorithm:
- I assign each machine a weight, roughly proportional to its CPU and RAM.
- To roughly approximate the resources each game server uses, I assign it a weight based on its maximum player count.
- Whenever somebody creates a game server, or requests to transfer an existing one to another machine, I calculate, for each machine,
  (sum of the weights of the game servers on the machine) / (weight of the machine)
  and allocate the game server to the machine with the lowest result.
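In code, the allocation boils down to something like the sketch below (simplified; the `Machine` and `GameServer` classes are stand-ins for my actual models):

```python
from dataclasses import dataclass, field

@dataclass
class GameServer:
    name: str
    weight: float  # currently derived from the max player count

@dataclass
class Machine:
    name: str
    weight: float  # roughly proportional to CPU/RAM
    servers: list[GameServer] = field(default_factory=list)

def pick_machine(machines: list[Machine]) -> Machine:
    # Pick the machine with the lowest (total server weight) / (machine weight) ratio.
    return min(machines, key=lambda m: sum(s.weight for s in m.servers) / m.weight)
```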
My main concern is with how the weights for each game server are calculated. Many different factors determine resource usage: the number of players currently online, installed mods, server settings, and so on. As a result, I don't think it's possible to assign an accurate weight from static server properties, even with a more complex formula than the max-player-count one I'm using now.
I suspect it would be more efficient to allocate servers based on empirical resource-usage data, either with some form of load-balancing software or with a custom-built solution. But I'm not sure how to do this, and I realize there may be completely different approaches I haven't considered, so any tips would be greatly appreciated!
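To make the question more concrete, here's a very rough sketch of the kind of thing I'm imagining. `fetch_usage()` is a hypothetical helper that would ask each machine (via some agent or metrics endpoint) for its recent average CPU and memory utilization:

```python
from dataclasses import dataclass

@dataclass
class MachineLoad:
    name: str
    cpu: float  # recent average CPU utilization, 0.0-1.0
    mem: float  # recent average memory utilization, 0.0-1.0

def fetch_usage(machine_name: str) -> MachineLoad:
    # Hypothetical: query a metrics agent/exporter running on the machine.
    raise NotImplementedError

def pick_machine_by_load(machine_names: list[str], headroom: float = 0.1) -> str:
    # Treat the most loaded resource (CPU or memory) as how "full" a machine is,
    # skip machines within `headroom` of saturation, and pick the emptiest one.
    loads = [fetch_usage(name) for name in machine_names]
    candidates = [m for m in loads if max(m.cpu, m.mem) < 1.0 - headroom]
    if not candidates:
        raise RuntimeError("all machines are near capacity")
    return min(candidates, key=lambda m: max(m.cpu, m.mem)).name
```

The idea is that measured load on the most constrained resource, rather than a guessed weight, decides where a new server goes, but I don't know whether this is a sensible direction or what the usual way to do it is.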