HiveBrain v1.2.0
pattern · csharp · Minor

UDP Server Design and Performance

Submitted by: @import:stackexchange-codereview
design · udp · performance · server

Problem

What I'm doing

I am writing a server to work as my game's backend. The server is written in C# using UDP as the protocol, while the game is in C++. The biggest thing I've been focusing on is trying to make the packet messages more OOP than what I've typically seen for game servers. As a result, I wonder whether performance will suffer due to allocations of the packet messages (the IDatagram interface represents the UDP datagram).

The environment

Running a client app on a MacBook Pro with a 3.8 GHz quad-core i7 on OS X with 16 GB of RAM, I see the CPU max out. I'll show the client code below, and I'm sure there's a better way of trying to slam my server; I just threw it together today to see what initial numbers looked like (am I getting 5 round-trips a second or 1,000?). If I run the client on the same machine as the server using loopback, I hit 100,000 round-trips per second. I expected that to be fast, but I wasn't expecting such a massive drop-off when moving it over to the Surface (below) and running it across the network: I'm seeing 3,634 round-trips a second with the CPU maxed.

The server is running on a Surface Pro 4 with a dual-core i7 on Windows 10 with 8 GB of RAM. When I run the server it uses roughly 15% of the CPU and very little RAM. With this setup, between the two machines, I'm averaging 3,634 round-trips from client, to server, back to client. That has me pretty confident, considering I'm hardly using the CPU on the Surface and it's only a dual-core. I do, however, see a GC happen roughly every 10 seconds, which has me a little concerned that I could do something better with my allocations.

This shows the GC frequency. I know that the diagnostic tools in VS are not an accurate representation of the server running in a production environment with a Release build, but I use them to gauge the worst-case performance scenario.

The Design

I have a general datagram interface, IDatagram, that represents all datagrams. The client and server each have their own datagram interfaces derived from it.

Solution


  • First things first: you should check out C# naming guidelines and try to follow those.



-

I'm leaning towards converting all of the IDatagram implementations in to structs

Don't. You should read this page, which explains when you should use struct and when you shouldn't. The relevant part is:


AVOID defining a struct unless the type has all of the following characteristics:

  • It logically represents a single value, similar to primitive types (int, double, etc.).

  • It has an instance size under 16 bytes.

  • It is immutable.

  • It will not have to be boxed frequently.

Your datagrams violate rules 1, 2, and 3. If you are certain that garbage collection is going to become a bottleneck, you should implement object pools for your datagrams. The same goes for the byte arrays and memory streams you are going to use: you're going to need a byte array pool. Also, as your player base grows, you won't be able to handle all the connections on a single server due to performance and bandwidth issues. Sooner or later you will have to scale your system so that it uses multiple servers. Keep this in mind when making design decisions.
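To make the pooling idea concrete, here is a minimal sketch of an object pool. The Datagram class, its Buffer size, and Reset() are illustrative placeholders, not names from the original code:

```csharp
using System.Collections.Concurrent;

// Minimal thread-safe object-pool sketch. Rent() reuses a returned
// instance when possible; Return() resets state and stores it for reuse.
public sealed class DatagramPool
{
    private readonly ConcurrentBag<Datagram> _items = new ConcurrentBag<Datagram>();

    public Datagram Rent()
    {
        // Reuse a pooled instance if one is available, otherwise allocate.
        return _items.TryTake(out var item) ? item : new Datagram();
    }

    public void Return(Datagram item)
    {
        item.Reset(); // clear state so the next caller gets a clean object
        _items.Add(item);
    }
}

// Hypothetical pooled type: a fixed buffer plus the number of valid bytes.
public sealed class Datagram
{
    public byte[] Buffer { get; } = new byte[512];
    public int Length { get; set; }
    public void Reset() => Length = 0;
}
```

With this in place, the receive loop rents a datagram, fills it, and returns it once the message has been handled, so steady-state traffic allocates nothing and the ten-second GC pauses should largely disappear.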

-
I'm not sure I understand why you need different datagram interfaces for server and client. It looks like overkill to me. The header should already identify where the datagram comes from. By using different interfaces and different base classes, you will have trouble implementing datagrams which could be sent by both server and client.

-
The bool IsMessageValid(); method should be a bool IsValid { get; } property.
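For illustration, the change looks like this; the LoginDatagram implementation and its Token member are hypothetical, just to show a computed property:

```csharp
// A parameterless, side-effect-free query reads better as a property.
public interface IDatagram
{
    // before: bool IsMessageValid();
    bool IsValid { get; }
}

// Hypothetical implementation: validity is derived from the payload.
public sealed class LoginDatagram : IDatagram
{
    public string Token { get; set; }
    public bool IsValid => !string.IsNullOrEmpty(Token);
}
```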

-
DatagramNames implies that this class contains a bunch of names, i.e. strings. But that is not the case. Maybe you should rename it to DatagramCodes or something similar.

-
DatagramFactory looks like an attempt to reinvent a DI container. :) You should consider using one of the existing frameworks, such as Castle.Windsor, Ninject, etc. It will allow you to register and resolve datagrams with a few lines of code.

-
Using attributes to store things such as the protocol version can lead to trouble. Only do it if you are absolutely sure that all your clients are always going to use up-to-date software and you are never going to need your server to be compatible with older versions of clients. Protocols tend to change, and it is usually a good idea to send a protocol version with a datagram. This extra byte can solve a lot of problems in the long run.
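As a sketch of what "send a protocol version with a datagram" means on the wire, the version can simply be the first byte of every datagram. ProtocolVersion and the Wire helper are illustrative names, not from the original code:

```csharp
using System.IO;

public static class Wire
{
    // Bump this constant whenever the wire format changes.
    public const byte ProtocolVersion = 1;

    // Prepend the version byte to an already-serialized payload.
    public static byte[] Frame(byte[] payload)
    {
        using (var ms = new MemoryStream(payload.Length + 1))
        {
            ms.WriteByte(ProtocolVersion);
            ms.Write(payload, 0, payload.Length);
            return ms.ToArray();
        }
    }

    // Read the version byte; reject empty or newer-than-known datagrams.
    public static bool TryReadVersion(byte[] datagram, out byte version)
    {
        version = datagram.Length > 0 ? datagram[0] : (byte)0;
        return datagram.Length > 0 && version <= ProtocolVersion;
    }
}
```

The server can then dispatch on the version byte and keep older deserializers around for as long as it wants to support old clients.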

-
while (server.IsRunning())
{
    await Task.Delay(1);
}


Don't use Task.Delay or Thread.Sleep in a loop with a 1 ms delay. This loop does no meaningful work whatsoever, yet it can easily consume up to one CPU core. You should figure out a way to either put your main thread to work or put it to sleep until you explicitly signal it to wake up (by using a WaitHandle, for example).
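A minimal sketch of the signalling approach, using ManualResetEventSlim (the Server shape and Stop() method are illustrative, not the original code):

```csharp
using System.Threading;

public sealed class Server
{
    private readonly ManualResetEventSlim _shutdown = new ManualResetEventSlim(false);

    public void Run()
    {
        // ... start the receive loop on worker threads here ...

        // Block without burning CPU until Stop() signals shutdown.
        _shutdown.Wait();
    }

    public void Stop() => _shutdown.Set();
}
```

The main thread now costs nothing while the server is idle, and shutdown becomes an explicit, immediate signal instead of something noticed on the next 1 ms poll.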

-
In your server code you use memoryStream.GetBuffer(). This method returns the entire inner array of the memory stream, which is larger than the datagram. Sending the whole array will inflate your traffic. To avoid this, you should also save the actual datagram size and use it instead of buffer.Length. Also, if you use the default MemoryStream constructor, it will dynamically resize the inner array, hurting both performance and memory usage. Ideally you want to use MemoryStream(byte[]) instead.
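Putting both points together, a sketch: serialize into a fixed, preallocated buffer and send only the bytes actually written. The opcode/payload values are illustrative:

```csharp
using System.IO;

public static class DatagramWriter
{
    // Writes an illustrative datagram into the caller's buffer and returns
    // the number of bytes actually written (the real datagram size).
    public static int BuildDatagram(byte[] buffer)
    {
        using (var ms = new MemoryStream(buffer))     // fixed-size: no inner-array regrowth
        using (var writer = new BinaryWriter(ms))
        {
            writer.Write((byte)7);    // illustrative opcode
            writer.Write(42);         // illustrative 4-byte payload
            return (int)ms.Position;  // 5 bytes here, not buffer.Length (1024, say)
        }
    }
}
```

The send call then uses the returned length, e.g. socket.SendTo(buffer, length, SocketFlags.None, remoteEndPoint), so only the real datagram goes over the wire. Note that MemoryStream(byte[]) throws if you write past the end of the buffer, which also catches oversized datagrams early.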

-
I would recommend against using sockets directly. They use a pretty obsolete callback-ish API, which was the way to go in older versions of C#, but nowadays it is pretty much replaced by the TPL and async programming. Instead you should consider using the highest level of abstraction available. In your case that's the UdpClient class, which has a much cleaner API and is way easier to use and debug. Check this answer out for a simple example.
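For a flavor of what the async UdpClient API looks like, here is an echo-style receive loop sketch; a real server would replace the echo with its own dispatch logic:

```csharp
using System.Net.Sockets;
using System.Threading;
using System.Threading.Tasks;

public static class EchoServer
{
    public static async Task RunAsync(int port, CancellationToken token)
    {
        using (var udp = new UdpClient(port))
        {
            while (!token.IsCancellationRequested)
            {
                // Await a datagram; no callbacks, no Begin/End pairs.
                UdpReceiveResult result = await udp.ReceiveAsync();

                // Echo the payload back to whoever sent it. A real server
                // would parse result.Buffer and dispatch a typed reply here.
                await udp.SendAsync(result.Buffer, result.Buffer.Length, result.RemoteEndPoint);
            }
        }
    }
}
```

ReceiveAsync gives you the sender's endpoint and the exact payload bytes in one awaitable call, which removes the buffer/endpoint bookkeeping that raw Socket callbacks force on you.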


Context

StackExchange Code Review Q#122969, answer score: 2
