How to use a Bitflags parameter in a BlueprintCallable function

I’m trying to write a BlueprintCallable function that takes a Bitflags parameter. Inside the function I want to use basic bitwise operations on the parameter. I finally got it working, but it feels hacky, like I’m doing something wrong. If anyone has any suggestions it would be greatly appreciated. This is what I have.

Enum (defined as bitflags):

UENUM(meta = (Bitflags))
enum class EFlipFlags : uint32
{
	kNone = 0 UMETA(DisplayName = "None"),
	kDontSpawnEnemies = 1 << 0 UMETA(DisplayName = "DontSpawnEnemies"),
	kDontSpawnDust = 1 << 1 UMETA(DisplayName = "DontSpawnDust"),
	kDontEffectTVs = 1 << 2 UMETA(DisplayName = "DontEffectTVs"),
	kDontEffectClocks = 1 << 3 UMETA(DisplayName = "DontEffectClocks"),
};
ENUM_CLASS_FLAGS(EFlipFlags)

Function with bitflag parameter:

void ARoom::Flip(int32 flags)
{
	...

	flags >>= 1; // hacky: shift so the bits line up with the enum values (see question below)
	bool flag_on = (flags & static_cast<uint32>(EFlipFlags::kDontSpawnEnemies)) == static_cast<uint32>(EFlipFlags::kDontSpawnEnemies);
	if (flag_on == false) // flag not set, so spawn enemies
	{
		...
	}
}

The main issue I’m having is that EFlipFlags is of type uint32 but the parameter, flags, is of type int32. This forces me to do a kind of hacky bitshift, flags >>= 1, to convert the signed bits to unsigned bits. My question is kind of vague, but is there a better way I should be doing this? If I could just change the parameter from an int32 to a uint32 it would seem a lot better to me, but the fact that the enum has to be unsigned and the parameter has to be signed makes me think I’m doing something wrong. Any tips?
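(For reference: since ENUM_CLASS_FLAGS defines the bitwise operators on the enum class, the check can be written with a single cast at the function boundary and no raw-integer masking. A standalone sketch of that pattern outside the engine, with the operators hand-rolled to stand in for what the macro generates — names here are illustrative, not from the original post:)

```cpp
#include <cstdint>

enum class EFlipFlags : uint32_t
{
    kNone             = 0,
    kDontSpawnEnemies = 1 << 0,
    kDontSpawnDust    = 1 << 1,
};

// Hand-rolled equivalents of the operators UE's ENUM_CLASS_FLAGS macro generates.
constexpr EFlipFlags operator|(EFlipFlags a, EFlipFlags b)
{
    return static_cast<EFlipFlags>(static_cast<uint32_t>(a) | static_cast<uint32_t>(b));
}
constexpr EFlipFlags operator&(EFlipFlags a, EFlipFlags b)
{
    return static_cast<EFlipFlags>(static_cast<uint32_t>(a) & static_cast<uint32_t>(b));
}

// One cast at the function boundary, then work in enum terms throughout.
bool HasFlag(int32_t raw, EFlipFlags flag)
{
    const EFlipFlags flags = static_cast<EFlipFlags>(raw);
    return (flags & flag) == flag;
}
```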

You could use reinterpret_cast; it casts the type without touching memory, reading the memory as it is and leaving the bits intact:

https://en.cppreference.com/w/cpp/language/reinterpret_cast
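(One caveat worth knowing: reinterpret_cast can't convert between integer *values* directly — it works on pointers and references. A minimal standalone example of reading the bits of a signed int as unsigned this way; aliasing an int32_t through uint32_t is permitted because they are the corresponding signed/unsigned types:)

```cpp
#include <cstdint>

// Reads the bit pattern of a signed int as unsigned, without any conversion.
uint32_t BitsOf(const int32_t& v)
{
    return *reinterpret_cast<const uint32_t*>(&v);
}
```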

You can also use the Bitmask specifier for Blueprint. Of course this won't change how C++ deals with it, so you still need to do the above, but it should help on the Blueprint side:

https://docs.unrealengine.com/en-US/Engine/Blueprints/UserGuide/Variables/Bitmask/index.html

Not sure if this works with UPARAM for function arguments; you'd need to test it.
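(An untested sketch of what that would look like, assuming UPARAM accepts the same Bitmask / BitmaskEnum metadata documented for properties — this needs the engine headers and won't compile standalone:)

```cpp
UFUNCTION(BlueprintCallable, Category = "Room")
void Flip(UPARAM(meta = (Bitmask, BitmaskEnum = "EFlipFlags")) int32 flags);
```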

Thanks for the reply. I’ll try reinterpret_cast out tomorrow, though it still seems weird that the bitflag parameter has to be a signed integer. Also, I’m pretty sure UENUM(meta = (Bitflags)) does the same thing as checking that box in blueprints so I don’t think that will change anything.

That's because Blueprints only support int32 and uint8, and since 4.22 also int64.

Yeah, that’s what I figured. Hopefully Epic does something about this issue soon, though. Also, I tried using reinterpret_cast to cast the signed int to an unsigned int, but it didn’t change the value, and I still have to do flags >>= 1 to get the bits to align correctly. Can you explain further what you meant?

Hmm, try turning flags into a pointer and then reinterpret_cast the pointer.
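(Side note on the casts involved, independent of the engine: a plain static_cast from int32 to uint32 already keeps the bit pattern intact — conversion to unsigned is defined modulo 2^32, which on two's-complement hardware means the same bits — so no pointer gymnastics should be needed just to reinterpret the value. A minimal standalone check:)

```cpp
#include <cstdint>

// Signed-to-unsigned conversion is defined modulo 2^32,
// which leaves the underlying bit pattern unchanged.
uint32_t ToUnsignedBits(int32_t v)
{
    return static_cast<uint32_t>(v);
}
```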

They're never going to do that, because it was a conceptual decision to support only those types and keep things simple, with just Integer and Byte. They added Integer64 to support bigger integers and make full use of what a 64-bit CPU can do.

That’s basically what I tried. Oh well, I’ll just keep it as is… it isn’t the end of the world. I just wanted to make sure there wasn’t a better way to do it. Thanks for your help.