Bitmask Enum declaration in 4.12, what's missing?

I’m trying to use the Bitmask Enum support in Blueprints, introduced in UE 4.12. This article explains its usage in Blueprints, but only touches on declaring them in C++. I can’t get a C++-declared (bitmask) enum to show up in my Blueprint as a Bitmask Enum type (for integers, or for use in the Make Bitmask node). My UENUM declaration:

UENUM( BlueprintType )
enum class EState : uint8
{
  None            =   0,  // Clear flag. Required by ENUM_CLASS_FLAGS macro

  WHITE_TO_MOVE   =   1 	UMETA( DisplayName = "White To Move" ),
  BLACK_TO_MOVE   =   2 	UMETA( DisplayName = "Black To Move" ),
  CHECK           =   4 	UMETA( DisplayName = "Check" ),
  CHECKMATE       =   8 	UMETA( DisplayName = "Checkmate" ),
  DRAW            =  16 	UMETA( DisplayName = "Draw" )
};
ENUM_CLASS_FLAGS( EState )

The article mentions a BitmaskEnum= metadata tag for C++ bitmasks, but doesn’t show its usage. How/where am I meant to include that?

My aim is to eventually create an event with the current state flags as payload, but EState is not recognised as a bitmask.

Thanks.

Made some progress with this, so I’ll jot down what I found.

To declare a bitmask enum in C++:

UENUM( BlueprintType, meta=(Bitflags) )
enum ETest
{
  BITFLAG1 = 1  UMETA( DisplayName = "BitFlag 1" ),
  BITFLAG2 = 2  UMETA( DisplayName = "BitFlag 2" ),
  BITFLAG3 = 4  UMETA( DisplayName = "BitFlag 3" )
};
ENUM_CLASS_FLAGS( ETest )

Note the Bitflags meta tag and the use of a raw ‘enum’. UE will only allow uint8 as the base of an enum class, but in that case the Blueprint bitwise operators complain about connecting a byte value to integer inputs.

The above enum now shows up in Blueprints as an available Bitmask Enum. E.g. define an integer variable in your Blueprint, mark it as a bitmask, and the Bitmask Enum dropdown will let you select the above enum. Flags on the bitmask are shown by name. It also correctly populates Set and Make Bitmask nodes.

Further, to expose a bitmask using the specified enum from C++ for use in Blueprints, I use

UPROPERTY( BlueprintReadWrite, Category = "Test", meta=(Bitmask, BitmaskEnum=ETest) )
int32 TestBitmask;

This can be used in Blueprints, or in C++ with the macro-defined operators, e.g. TestBitmask |= ETest::BITFLAG1 to set a flag.

So far so good. It compiles, all blueprint nodes using the enum have the correct flags and I expect we should be good to go.

New problem: the Make Bitmask Blueprint node, with ETest set as its enum, returns the wrong values. If I do in code

TestBitmask = 0;
TestBitmask |= ETest::BITFLAG1;
TestBitmask |= ETest::BITFLAG3;

The integer (decimal) value of the above should be 5. Indeed, if I print that out in a Blueprint using ‘Print String’, I see 5 as expected. But if I now create a Make Bitmask node to replicate the C++ version, set its enum to ETest and tick the first and third flags, then print the return value, I get 18.

Bug? Or is there still an issue in the declaration?

UEnum bitflags end up getting re-mapped in BP: Blueprint treats each enum entry's value as a bit position (the flag becomes 1 << value), not as the mask itself. So if you manually specify power-of-two bit values in C++, they won’t line up once the enum is marked as a bitflag.

This is how I am using them in C++:

UENUM(BlueprintType, meta = (Bitflags))
enum class EMyEnum : uint8
{
	None, //0
	Item1, //1
	Item2, //2
	Item3, //3
};
ENUM_CLASS_FLAGS(EMyEnum)

...
UPROPERTY(SaveGame, EditAnywhere, BlueprintReadWrite, Category = "Stats", meta = (Bitmask, BitmaskEnum = "EMyEnum"))
	int32 Flags;
...

bool HasFlag(const int32 Flags, EMyEnum TestFlag)
{
	const int32 BitFlag = 1 << static_cast<int32>(TestFlag); // shift to the bit corresponding to the enum value
	return (Flags & BitFlag) != 0;
}

Hmm… thanks for the clarification; I wish these things were documented and less of a black box. I submitted a bug report (here) regarding the discrepancy between C++ and BP bitmasks. The Epic staffer who responded logged it as a bug, so maybe things will change in a future release. I’ll update this thread if I hear more from Epic.