Why does network serializing a bool use 4 bytes?

Heya!

I’m building a multiplayer VR game, and I’m trying to conserve every bit I can over the network: quantizing floats, etc. I understand that UE4’s networking uses a bit stream (i.e. FBitWriter), so that’s great.

However, diving into Archive.h I see that the SerializeBool() method, and the corresponding << operator for bools, end up writing four bytes for every bool. This comment is found in both Archive.h and Archive.cpp:

// Serialize bool as if it were UBOOL (legacy, 32 bit int).

And sure enough, tracing the code we end up with:

Ar.Serialize(&OldUBoolValue, sizeof(OldUBoolValue));

which is just the normal integer serializer, with a length of 4 bytes. And upon tracing my NetSerialize() method for a custom struct of mine, FBitWriter::Serialize() gets called with 4 bytes (plus I can see the stream’s buffer advance by 4 bytes).
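
To sanity-check this outside my struct, here’s a rough repro sketch (the function name and buffer size are just illustrative, and it assumes a normal UE4 module where LogTemp is available):

#include "CoreMinimal.h"
#include "Serialization/BitWriter.h"

void CheckBoolBits()
{
    FBitWriter Writer(64);        // small fixed-size bit buffer
    bool bValue = true;
    Writer << bValue;             // routes through FArchive::SerializeBool()
    // With the legacy path described above, this reports 32 bits used, not 1.
    UE_LOG(LogTemp, Log, TEXT("Bits used: %lld"), Writer.GetNumBits());
}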

It is shocking to see the engine using more bytes for a bool than I am for my custom quantized floats. My bit hoarding is being wasted!

Are my findings correct? Is every bool being sent across the network actually using 4 bytes? If so, why? Can’t we take advantage of the fact that we’re using a bit stream for the very property that takes up a single bit?

Thanks!

I too would like to understand why this is. It seems super backwards that it would be better to serialize a bool as char or int8 over the network, rather than as a bool.

I am a programmer, but not super familiar with C++ or game development, so this comment may just be idiotic. I promise I am not being snide; just figured I’d throw my thoughts out there:

  • They may do this because of legacy code that is still integral to the engine as a whole.
  • They may do this because cross-platform end units (PC vs Mac vs console, etc.) have fundamentally different hardware/base software, which forces a developer to adhere to how that device operates. For instance, you cannot install a PC game on a Macintosh, nor play a PS* game on an XBOX* unit; and vice versa for both (among other examples).
  • They may do this because someone unified the way that data is transmitted throughout the engine, for whatever reason, even though you could transmit the same data (a bool in this case) in a smaller package. For example, you could write a function that accepts different input types (int, bool, str, etc.) and handles each individually, or write one that accepts a single input type and determines how to handle the data on the fly.

Again, none of this is based on technical know-how of the guts of the engine, just stupid things “as to why it is done that way” I’ve encountered over the years programming and working with computers and devices of all sorts.

If this comment is not constructive at all please remove it!

Heya!

While it is great that you want to help out with some generic, high-level thoughts, they don’t really apply here. Game network programming is all about minimizing the bytes sent between machines. The reason Epic would have moved from a byte-based networking system to a bit-based one is to pack as much as possible into the data stream.

Legacy reasons should not apply as all live machines should be running on the same engine version. And platform differences are handled when you pack and unpack the bit stream, rather than imposing them on the bit stream.

Now, if we were talking about reading and writing files, then some of these things may come into play. Legacy formats could indeed impact what you’re writing to a file. And if I had to guess, that is what has happened with the boolean code I came across. It is legacy file writing code that accidentally got left in for the network code.

I originally wrote this question against UE4 4.17. I’m hoping that since then, Epic has updated the networking code to handle 100-player Fortnite servers. This spring I’ll have to dive back in and see what’s up with 4.19, unless someone beats me to it and provides an answer here. :)

Wow, I just stumbled across this issue myself. I was checking that my hand-crafted NetSerialize was working as efficiently as possible and stepping through the code, e.g.
bool bTest = false;
Ar << bTest;
Will send 32 bits.
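
For comparison, the single-bit version (a minimal sketch that round-trips through a uint8 instead of relying on bool's in-memory layout) would be something like:

uint8 Bit = bTest ? 1 : 0;
Ar.SerializeBits(&Bit, 1);   // advances the stream by a single bit
bTest = (Bit != 0);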

I’m using UE4.26 and I’m surprised to see this is still an issue. There’s so much code in UE4 that quantizes floats, yet the bools are 32 times the size they should be!

Easy for me to fix my NetSerialize function, but it’s going to be a pain to remove all the 32-bit bools from our whole code base.

They were probably already using Oodle compression, so it wasn’t something that would have had much real impact.

Maybe they just forgot to write a special override for bool, or maybe it had some design problem.
Use WriteBit() / ReadBit() instead, as in the sketch below.
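
A rough sketch of that, assuming you hold the concrete stream types (FBitWriter / FBitReader) rather than a plain FArchive&, with made-up variable names:

#include "Serialization/BitWriter.h"
#include "Serialization/BitReader.h"

void BitRoundTripExample()
{
    FBitWriter Writer(64);                 // small fixed-size bit buffer
    bool bMyFlag = true;
    Writer.WriteBit(bMyFlag ? 1 : 0);      // exactly one bit on the wire

    FBitReader Reader(Writer.GetData(), Writer.GetNumBits());
    bool bReadBack = (Reader.ReadBit() != 0);
}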

If you take a look at FBoolProperty::NetSerializeItem(), you’ll see that regular UPROPERTY() serialization does actually serialize it to a single bit (plus, of course, the required property header/offsets).

I’m not sure why serializing it via the << operator uses four bytes; it’s probably an oversight in FBitArchive, or perhaps it’s to handle bitfields. But if you are using custom serialization, you can easily serialize the property to a single bit yourself via:

Ar.SerializeBits(&MyBool, 1);
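
Putting that together, a minimal custom-struct sketch (the struct, field, and header names are just examples, not engine code):

#include "CoreMinimal.h"
#include "MyReplicatedState.generated.h"   // hypothetical generated header for this example

USTRUCT()
struct FMyReplicatedState
{
    GENERATED_BODY()

    UPROPERTY()
    bool bIsSprinting = false;

    bool NetSerialize(FArchive& Ar, class UPackageMap* Map, bool& bOutSuccess)
    {
        // Pack the bool into a single bit; round-tripping through a uint8
        // avoids depending on bool's in-memory representation.
        uint8 Bit = bIsSprinting ? 1 : 0;
        Ar.SerializeBits(&Bit, 1);
        bIsSprinting = (Bit != 0);

        bOutSuccess = true;
        return true;
    }
};

// Opt the struct into the custom NetSerialize path.
template<>
struct TStructOpsTypeTraits<FMyReplicatedState> : public TStructOpsTypeTraitsBase2<FMyReplicatedState>
{
    enum { WithNetSerializer = true };
};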