Dedicated Server Performance
We're currently developing a client-server game using Unreal Engine 4.19.2.
At the moment we are trying to optimize the server side of the online game. Our goal is to use the computing power of the hardware server as efficiently as possible.
For testing, we use a server based on an Intel Xeon E3-1220 processor at 3 GHz, with Ubuntu Server 18 LTS as the guest OS running under a Windows 10 hypervisor. The hypervisor allocates 30 GB of RAM to the Ubuntu guest and gives maximum priority to all 4 CPU cores.
We configured the engine for our game so that the Dedicated Server ticks 10 times per second (NetServerMaxTickRate=10), i.e. the Dedicated Server process has 100 ms to complete all of its per-frame calculations in time.
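For reference, this is roughly how we set the tick rate; the setting lives in the net driver config (section name per the stock UE4 DefaultEngine.ini, values here are our own):

```ini
; DefaultEngine.ini (server build)
[/Script/OnlineSubsystemUtils.IpNetDriver]
; Cap the dedicated server at 10 frames per second -> 100 ms frame budget
NetServerMaxTickRate=10
```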
According to data obtained from profiling with the standard tools, the GameThread spends about 6 ms performing all of its calculations; for the rest of the frame the thread sleeps, which can be seen from the profiling statistics: FEngineLoop_UpdateTimeAndHandleMaxTickRate ≈ 94 ms.
So far we have managed to run 45 Dedicated Servers under load while playing on client PCs without any lag. At that point the server OS shows 100% CPU usage, yet there is still enough free RAM for the OS plus roughly 10 more game-server instances.
The question arises: why are the 45 server processes so CPU intensive? Given that the effective computation time of a server does not exceed 10 ms per frame (summing the execution time of all of the Dedicated Server's threads), theoretically it should be possible to run somewhat fewer than 100 servers, but certainly more than 50.
The most likely cause of this problem, as we see it, is the Dedicated Servers' "struggle" for OS resources: in a given scheduling interval, more game servers "wake up" than the hardware can service simultaneously.
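One mitigation we are considering for this wake-up contention is pinning each instance to a core so simultaneous ticks are spread across the CPUs instead of all landing on the scheduler at once. A minimal launcher sketch using `taskset` (the binary name `./MyGameServer` is a placeholder, and this dry-run version only prints the commands rather than executing them):

```shell
#!/bin/sh
# Sketch: assign each dedicated-server instance to one CPU core,
# round-robin over the 4 physical cores of the Xeon E3-1220.
CORES=4
NUM_SERVERS=8

launch_cmd() {
  # Core assignment: instance index modulo number of cores.
  echo "taskset -c $(( $1 % CORES )) ./MyGameServer -log"
}

i=0
while [ "$i" -lt "$NUM_SERVERS" ]; do
  launch_cmd "$i"   # a real launcher would eval/execute this line
  i=$((i + 1))
done
```

Whether hard affinity actually helps depends on the workload; it trades the kernel's load balancing for predictable placement, so it is worth measuring both ways.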
Could you please help us solve this problem if you have any ideas? We assume you probably faced a similar load-distribution problem when you deployed the server infrastructure for Fortnite or other games. Or perhaps someone in the Unreal Engine 4 community has already solved this problem?
asked Apr 24 '19 at 10:31 AM in C++ Programming
If we assume that UE uses the processor with 100% efficiency, then based on your numbers you could run 4 cores × 1000 ms / (10 ticks × 6 ms) = 66.7 servers. It is not clear to us why UE uses the CPU with such low efficiency, or whether it is possible to improve it.
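The estimate above can be written as a small reproducible calculation: each server burns work-per-tick × ticks-per-second of CPU time every second, and total capacity is cores × 1000 ms divided by that (parameter values are the ones from this thread):

```shell
#!/bin/sh
# Theoretical server capacity: cores * 1000 ms / (ms of work per tick * ticks/s).
# With 6 ms of work at 10 Hz, each server needs 60 ms of CPU per second.
estimate_servers() {
  # $1 = cores, $2 = work per tick (ms), $3 = tick rate (Hz)
  awk -v c="$1" -v w="$2" -v t="$3" 'BEGIN { printf "%.1f\n", c * 1000 / (w * t) }'
}

estimate_servers 4 6 10   # -> 66.7
```

The gap between this figure and the 45 servers observed in practice is exactly the overhead (scheduling, hypervisor, networking) being asked about.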
answered Apr 25 '19 at 06:47 AM