I'm working on a simple multiplayer game with Socket.io and Node.js. I ran into some performance problems and tried to debug them, thinking there might be a memory leak. What actually happens is that CPU usage increases by about 10% for every player added to the game, then stays steady until I either add another player (it increases by another 10%) or a player disconnects (it decreases by 10%).
After looking into it for a while, I found that the cause of the increase is the 'emit' on the server side. The server looks something like this:
var clients = [];

io.sockets.on('connection', function(socket){
    // logic for adding this player to the game arena; runs
    // only once (when the client connects)
    var id = addToGame();

    // add this player's id to the 'clients' list
    clients.push({
        id: id,
        socket: socket
    });

    socket.on('disconnect', function(){
        // remove the player from the game
        // and delete the client's id from the 'clients' list
    });
});

setInterval(update, 30);

function update() {
    for (var client of clients) {
        var player = findPlayerById(client.id);
        var message = getMessageForPlayer(player);
        client.socket.emit('update client', message);
    }
}
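Just to put numbers on where the time goes, this is roughly how I would time one tick of that loop. It's only a sketch: the timedUpdate wrapper and the tick counter are not in my real code, they just reuse the update function and clients list from above, and the wrapper would replace the existing setInterval(update, 30) call.

var tick = 0;
function timedUpdate() {
    var start = process.hrtime.bigint();
    update();
    var elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
    // log roughly once a second instead of on every 30 ms tick
    if (++tick % 33 === 0) {
        console.log('update() took ' + elapsedMs.toFixed(2) + ' ms for ' + clients.length + ' clients');
    }
}
setInterval(timedUpdate, 30); // replaces setInterval(update, 30)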
At first I thought the problem was that I was using anonymous functions in a couple of places, so I changed that, but it didn't seem to make any difference.
So then, after messing a bit with the code, I realized that when I comment out this line
client.socket.emit('update client', message);
the CPU usage doesn't seem to increase at all when new players come along. Which kinda makes sense, because the game is designed so that there is always a fixed number of players in the arena. Those players are initially CPU controlled and do exactly the same things a human player would be able to do. When a human player joins the game, they simply take the place of an existing CPU player, so the computations involved in updating the game are roughly the same regardless of whether all players are CPUs or actual human players.
The only difference is the computation that runs right before emitting the 'update client' message, which only happens for human players. So I thought it would make sense that, because that for loop queries the game for every single human player, it could be the reason for the increase in CPU usage. But to my big surprise, even if you leave all those computations in place and simply take out the 'emit' part, the problem disappears (the CPU usage stays steady no matter how many players you add).
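What I still haven't figured out is whether the expensive part is serializing the message (as far as I understand, socket.io JSON-encodes the payload on every emit by default) or the actual write to the socket. A rough way to separate the two, which I haven't tried yet, would be something like this (same loop as above, with the emit swapped for a plain stringify):

function update() {
    for (var client of clients) {
        var player = findPlayerById(client.id);
        var message = getMessageForPlayer(player);
        // serialization only, no network write:
        // if CPU still climbs per player, the cost is mostly building the JSON;
        // if it stays flat, the cost is in the socket write itself
        JSON.stringify(message);
        // client.socket.emit('update client', message);
    }
}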
I figured this is something I should expect, since I'm using a dedicated test server from Digital Ocean with 1 GB of RAM and 1 CPU (also, the emitted message is an array of about 150 elements on average, where each element is itself an array of 10 strings).
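For context, here is a rough back-of-envelope on the message size; the ~20 bytes per string is my own guess, not something I've measured:

// assumed figures: ~150 elements per message, 10 strings each,
// ~20 bytes per string (guess), one emit every 30 ms per human player
var elementsPerMessage = 150;
var stringsPerElement = 10;
var bytesPerString = 20;            // assumption, not measured
var emitsPerSecond = 1000 / 30;     // ~33 emits per second per player

var bytesPerMessage = elementsPerMessage * stringsPerElement * bytesPerString;  // ~30 KB
var bytesPerPlayerPerSecond = bytesPerMessage * emitsPerSecond;                 // ~1 MB/s

console.log((bytesPerMessage / 1024).toFixed(0) + ' KB per message, about ' +
    (bytesPerPlayerPerSecond / (1024 * 1024)).toFixed(1) + ' MB/s of JSON per player');

So, if those guesses are anywhere near right, the server is building and pushing roughly a megabyte of JSON per second for each human player, which seems consistent with the emit being the part that scales with player count.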
My question is: how do I go about getting a server with enough CPU capacity to host 500 players or more? With the one I currently have, the game logic itself takes about 40% of the CPU, and each added player increases that by another 10%, so it's hard to reach even a mere 10 players before CPU usage hits 100% and the server reboots.
I was about to get a 32 GB, 4-core server from OVH, but before I do I want the point of view of people who know better than I do. What should I look for in a CPU to be able to emit those messages without maxing it out?