Using FMemoryReader to deserialize a TArray: corruption breaks memory allocation

I have an FMemoryReader that's deserializing a packet of network data into two integers and a TArray, like so:

FMemoryReader reader = FMemoryReader(CommandBytes);
reader << ModValue;  // an int32
reader << IntValue;  // another int32
reader << Metadata;  // Metadata is a TArray<uint8>

On occasion, this packet of data may become corrupted (technically, the decryption fails), meaning bogus data is in the CommandBytes variable.

Unfortunately, this means that upon reaching the reader << Metadata; line, the serialized array-length integer may be completely wrong, and the deserializer ends up trying to allocate an unreasonably large TArray.

I’ve tried setting limits like so:

reader.SetLimitSize(CommandBytes.Num());//Accident forgiveness
reader.ArMaxSerializeSize = CommandBytes.Num();

But it seems to have no effect?

I already know the maximum number of bytes it should deserialize (courtesy of the source TArray, it’s not compressed in any way), so I just need to prevent it from trying to deserialize into a massive TArray (logic elsewhere validates the results, the packet would just be discarded).
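Since you already know the upper bound, one option is to validate the length prefix yourself before letting the archive allocate anything: a serialized TArray starts with its element count, so you can reject any count that exceeds the bytes actually remaining in the buffer (in Unreal that would mean comparing against reader.TotalSize() - reader.Tell()). Below is a minimal standalone C++ sketch of the idea, not engine code; ByteReader and its methods are hypothetical stand-ins for illustration.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical stand-in for a memory reader; in Unreal you would use
// FMemoryReader plus Tell()/TotalSize() instead.
struct ByteReader
{
    const std::vector<uint8_t>& Bytes;
    size_t Pos = 0;

    bool ReadInt32(int32_t& Out)
    {
        if (Pos + sizeof(int32_t) > Bytes.size()) return false;
        std::memcpy(&Out, Bytes.data() + Pos, sizeof(int32_t));
        Pos += sizeof(int32_t);
        return true;
    }

    // Reads the length prefix, then rejects it if it claims more bytes
    // than the buffer still holds -- so corrupt data can never force a
    // huge allocation.
    bool ReadByteArray(std::vector<uint8_t>& Out)
    {
        int32_t Num = 0;
        if (!ReadInt32(Num)) return false;
        if (Num < 0 || static_cast<size_t>(Num) > Bytes.size() - Pos)
            return false; // bogus length: discard the packet
        Out.assign(Bytes.begin() + Pos, Bytes.begin() + Pos + Num);
        Pos += Num;
        return true;
    }
};
```

The key point is that the check happens before any allocation, so a corrupted length costs you nothing but an early return.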

How can I stop this insanity?

In practice a threshold would be fine, but in theory values that large could be valid integers (in future versions).
I think the core of the problem is that decryption is failing, and the library we're using apparently has no way to detect whether decryption failed (it's XOR-based). I'll probably have to solve it at that level…

My first thought is to just not attempt to deserialize the bad data in the first place. Add a simple if check and bail/print a warning if you hit some huge value.

if (IntValue > SOME_THRESHOLD)
{
   UE_LOG(LogTemp, Error, TEXT("Detected bad data! Skipping deserialization!"));
   return;
}
else
{
   reader << Metadata;
}

Essentially what was said above: find a way to validate the data before trying to deserialize it. In my specific implementation we did a checksum of sorts.
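For the XOR case specifically, the checksum approach also solves the "can't detect failed decryption" problem: append a checksum before encrypting, and verify it after decrypting; a wrong key garbles the checksum too, so the mismatch flags the packet before any deserialization happens. Here's a minimal standalone sketch of that round trip; Seal/Open and the simple rolling hash are illustrative inventions (in Unreal you could use FCrc::MemCrc32 for the checksum instead).

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Simple 32-bit rolling hash for illustration; not cryptographic.
static uint32_t Checksum(const std::vector<uint8_t>& Data)
{
    uint32_t Sum = 0;
    for (uint8_t B : Data)
        Sum = Sum * 31 + B;
    return Sum;
}

// XOR is symmetric: the same call encrypts and decrypts.
static void XorCrypt(std::vector<uint8_t>& Data, uint8_t Key)
{
    for (uint8_t& B : Data)
        B ^= Key;
}

// Append the checksum to the payload, then encrypt the whole thing.
std::vector<uint8_t> Seal(std::vector<uint8_t> Payload, uint8_t Key)
{
    uint32_t Sum = Checksum(Payload);
    for (int i = 0; i < 4; ++i)
        Payload.push_back(static_cast<uint8_t>(Sum >> (8 * i)));
    XorCrypt(Payload, Key);
    return Payload;
}

// Decrypt, then verify the trailing checksum; false means the packet
// is corrupt (or the key was wrong) and should be discarded.
bool Open(std::vector<uint8_t> Packet, uint8_t Key, std::vector<uint8_t>& OutPayload)
{
    if (Packet.size() < 4) return false;
    XorCrypt(Packet, Key);
    uint32_t Stored = 0;
    for (int i = 0; i < 4; ++i)
        Stored |= static_cast<uint32_t>(Packet[Packet.size() - 4 + i]) << (8 * i);
    Packet.resize(Packet.size() - 4);
    if (Checksum(Packet) != Stored) return false;
    OutPayload = std::move(Packet);
    return true;
}
```

With this in place, the FMemoryReader code only ever sees payloads that already passed the checksum, so the bogus-length case never arises.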

I'm actually having exactly the same problem. Have you managed to solve this?

Yeah, a checksum would work well. Glad you got it figured out.