C++ bitmask enums appear to be offset by 1

I created the following enum for use as a bitmask.

UENUM(BlueprintType, meta = (Bitflags))
enum class EMovementTrackingFlags : uint8
{
	None = 0x00,
	X = 0x01,
	Y = 0x02,
	Z = 0x04
};
ENUM_CLASS_FLAGS(EMovementTrackingFlags);

I then implement it as a public variable as follows.

	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Tracking Settings", meta=(Bitmask, BitmaskEnum = "EMovementTrackingFlags"))
	uint8 movementTrackingFlags;

This causes it to appear in the Details panel as follows:

[Screenshot: 107221-flagssettings.png — the flag checkboxes in the Details panel]

The values may be selected with a check mark, permitting mixed flags as intended. However, when I evaluate the flags I get erroneous results. Outputting the value of Movement Tracking Flags as an integer shows that “None” is equal to a value of 1, “X” is equal to a value of 2, “Y” is equal to a value of 4, and “Z” is equal to a value of 16. This is despite their exact values being set explicitly as 0, 1, 2, and 4, respectively.
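For reference, the raw value was dumped with a simple log call along these lines (a sketch; the exact logging call used isn't shown above):

// Sketch: print the stored bitmask so it can be compared against the enum constants.
UE_LOG(LogTemp, Warning, TEXT("movementTrackingFlags = %d"), movementTrackingFlags);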

Please note that the value of “None” is designated as a requirement on the Unreal 4 Coding Standard page.

I’ve found a work-around by eliminating all the explicit values and removing the “None” enumerator. X, Y, and Z take the expected values when this is the case.

UENUM(BlueprintType)
enum class EMovementTrackingFlags : uint8
{
	X,
	Y,
	Z
};
ENUM_CLASS_FLAGS(EMovementTrackingFlags);

This necessitates work-arounds in order to evaluate the flags.

bool UTargetTrackingComponent::IsMovementFlagged(EMovementTrackingFlags testTrackingFlag)
{
	// Cast the stored uint8 back to the enum type so the ENUM_CLASS_FLAGS operators apply
	EMovementTrackingFlags flagValue = (EMovementTrackingFlags)movementTrackingFlags;
	// Any overlap between the stored flags and the flag under test counts as "flagged"
	return ((flagValue & testTrackingFlag) != (EMovementTrackingFlags)0);
}

and…

movementTrackingFlags != 0

This is in direct contradiction to the code shown on the Unreal Coding Standards page, e.g.:

enum class EFlags
{
	None  = 0x00,
	Flag1 = 0x01,
	Flag2 = 0x02,
	Flag3 = 0x04
};

ENUM_CLASS_FLAGS(EFlags)

As well as…

if (Flags & EFlags::Flag1)

This causes my code to behave and appear in the editor as expected. However, I am leaving this thread open should staff weigh in on this.

I am also running into a similar problem.

I have my Bitflags set up as follows:

UENUM(BlueprintType, meta= (Bitflags) )
enum class EHeroActionTypes : uint8
{
	None = 0x00,
	Movement = 0x01,
	Attack = 0x02,
	Dodge = 0x04,
	Climb = 0x08,
};
ENUM_CLASS_FLAGS(EHeroActionTypes);

And the following variable declared in my class:

	UPROPERTY(EditAnywhere, Category = "Actions", meta = (DisplayName = "Action Suppress Flags", Bitmask, BitmaskEnum = "EHeroActionTypes"))
	uint8 HeroActionsToSuppress;	

Whichever value I set HeroActionsToSuppress to via the editor, it always comes back shifted one bit position higher when I inspect it with the debugger. For instance:

[Screenshot: 107519-bit_mask.png — debugger view of HeroActionsToSuppress]

You would expect HeroActionsToSuppress to be 0x02 in this case, but the debugger reports it as 0x04.

My theory is that this is an issue with how the editor sets the value of the bitflag, rather than something being inherently wrong on the C++ side. I’m going to dig in a bit more today.

This observation appears to be correct. I noticed when doing this that values set at the C++ level evaluated correctly, but values set through the editor would be offset as noted here. The only way to fix them was to explicitly set the integer value – not using hex or bit shift operators, but with a constant integer: 1, 2, 4. If I did that, everything behaved consistently.

It’s pretty inconvenient.

Unfortunately setting them to a constant integer did not resolve the issue for me.

UENUM(BlueprintType, meta=(Bitflags))
enum class EHeroActionTypeFlags : uint8
{
	None = 0,
	Movement = 1,
	Attack = 2,
	Dodge = 4,
	Climb = 8,
};
ENUM_CLASS_FLAGS(EHeroActionTypeFlags);

Ah. I did have to get rid of “None.”

UENUM(BlueprintType)
enum class EMovementTrackingFlags : uint8
{
	X = 1,
	Y = 2,
	Z = 4,
};
ENUM_CLASS_FLAGS(EMovementTrackingFlags);

Ok - After talking to the Epic staff this appears to be the correct solution. Posting here so others can utilize this awesome feature!

There does appear to be an inconsistency between the Coding Standards and how the Bitflag enum should be declared. See the Properties Documentation for the proper syntax.

Here is an example of how to declare a Bitflag enum:

UENUM(Blueprintable, Meta = (Bitflags))
enum class EHeroActionTypeFlags
{
	Movement,
	Attack,
	Dodge,
	Climb,
	CustomReaction,
};

Note that I’m not using the ENUM_CLASS_FLAGS macro any more. Instead I’ve created three macros that handle the testing, setting, and clearing for me:

// The enum value is treated as a bit index (matching how the editor stores bitmask properties)
#define TEST_BIT(Bitmask, Bit) (((Bitmask) & (1 << static_cast<uint32>(Bit))) > 0)
#define SET_BIT(Bitmask, Bit) (Bitmask |= 1 << static_cast<uint32>(Bit))
#define CLEAR_BIT(Bitmask, Bit) (Bitmask &= ~(1 << static_cast<uint32>(Bit)))

And here is an example of using the TEST_BIT macro to check whether an action is allowed:

		if(TEST_BIT(HeroActionsToAllow, EHeroActionTypeFlags::Movement))
		{
			bMovementAllowed = true;
		}
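For completeness, setting and clearing work the same way with the other two macros; a small sketch (the choice of flag here is just for illustration):

// Sketch: grant the Dodge action, then revoke it again later.
SET_BIT(HeroActionsToAllow, EHeroActionTypeFlags::Dodge);
// ...
CLEAR_BIT(HeroActionsToAllow, EHeroActionTypeFlags::Dodge);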

UE-32816 is tracking the issue, but as noted, this is actually the designed behavior. You need to shift by your enum value when doing all of your testing, setting, and clearing.


Thanks so much for doing the legwork on that one!

Does that mean the enum needs to be declared in code as uint32? I’m referring to the macros above. I’d like to use this in code as well; I’m just not sure yet how to declare them properly. It’s a mess :)

Had the same problem and ended up doing something similar.

I declared the enum normally, as you did.

But for operations, I instead created a conversion from the enum to a flag:

#define TOFLAG(Enum) (1 << static_cast<uint8>(Enum))

That way, you can just do normal flag operations:

bool bFlagFound = (Flags & TOFLAG(EGridPointFlags::Discovered)) != 0;
// or
bool bFlagFound = (Flags & TOFLAG(MyEnumVariable)) != 0;

That was my final solution; I found it to be the easiest way.

Greetings!

As per UE-32816,

This is actually as-designed behavior - the enum values are currently assumed to be flag indices and not actual flag mask values. That is, the editor will compute (1 << N) where N is the value of the enum. This is consistent with how user-defined enums work on the editor side. Can see how it would be useful to be able to designate native C++ enums as literal mask values though, especially when used alongside ENUM_CLASS_FLAGS().
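To illustrate with the enum from the original post: because the editor treats the enum value as a flag index rather than a mask, the stored values end up shifted, which matches the numbers reported above.

// The editor computes (1 << N), where N is the enum value:
// None = 0x00  ->  editor stores (1 << 0) = 1
// X    = 0x01  ->  editor stores (1 << 1) = 2
// Y    = 0x02  ->  editor stores (1 << 2) = 4
// Z    = 0x04  ->  editor stores (1 << 4) = 16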

Fortunately, to get the OP’s desired behaviour, all you have to do since UE-32816 is add the following to the top of your enum declaration:

UENUM(BlueprintType, meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor="true"))
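Applied to the enum from the original post, the declaration would look something like this (a sketch; I haven't re-verified how the None entry displays in the bitmask dropdown):

UENUM(BlueprintType, meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor = "true"))
enum class EMovementTrackingFlags : uint8
{
	None = 0x00,
	X = 0x01,
	Y = 0x02,
	Z = 0x04
};
ENUM_CLASS_FLAGS(EMovementTrackingFlags);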

For those who need to expose the UENUM as a BlueprintType and need to get beyond the uint8 limit, you can define the UENUM using the namespace method:

UENUM(BlueprintType, Meta = (Bitflags))
namespace EPlayerInputFlags
{
	enum Type
	{
		PIF_Move,
		PIF_Turn,
		PIF_Afterburn,
		PIF_Stealth,
		PIF_Booster,
		PIF_Bash,
		PIF_Hack,
		PIF_Unhack,
		PIF_Equipment,
		PIF_Weapon,
		PIF_PassengerDecoy,
		PIF_PassengerEquipment,

		PIF_MAX
	};
}

With that, you can have variables and functions like this:

// Determine whether player input is enabled/disabled
UPROPERTY(Category = "Input", EditAnywhere, BlueprintReadWrite, Meta = (Bitmask, BitmaskEnum = "EPlayerInputFlags"))
int32 PlayerInputFlags = -1;

// Returns true if player input flag is set
UFUNCTION(Category = "Input", BlueprintCallable)
bool GetPlayerInputFlag(const TEnumAsByte<EPlayerInputFlags::Type> InFlag) const;

// Sets specified player input flag
UFUNCTION(Category = "Input", BlueprintCallable)
void SetPlayerInputFlag(const TEnumAsByte<EPlayerInputFlags::Type> InFlag, const bool bSet);
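The implementations aren't shown above, but under the index-based convention they might look roughly like this (a sketch; AMyPlayerPawn is a placeholder class name, and the shift mirrors the TEST_BIT/SET_BIT/CLEAR_BIT macros from earlier):

// Sketch only: assumes PlayerInputFlags stores one bit per enum index,
// matching how the editor writes bitmask properties.
bool AMyPlayerPawn::GetPlayerInputFlag(const TEnumAsByte<EPlayerInputFlags::Type> InFlag) const
{
	return (PlayerInputFlags & (1 << static_cast<int32>(InFlag.GetValue()))) != 0;
}

void AMyPlayerPawn::SetPlayerInputFlag(const TEnumAsByte<EPlayerInputFlags::Type> InFlag, const bool bSet)
{
	const int32 FlagBit = 1 << static_cast<int32>(InFlag.GetValue());
	if (bSet)
	{
		PlayerInputFlags |= FlagBit;   // set the bit
	}
	else
	{
		PlayerInputFlags &= ~FlagBit;  // clear the bit
	}
}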

Thank you for this addition! I think it was the missing piece preventing me from correctly using bitflags. I either had all the bitflag values but wrongly mapped to the previous one, or the last one was missing and the first one couldn’t be ticked. So, for reference, here’s a code sample that works the expected way on my machine (UE 4.17):

UENUM(BlueprintType, meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor = "true"))
enum class EPointSequenceType : uint8
{
	Linear = 1,
	HorizontalSineWave = 2,
	VerticalSineWave = 4,
	Whirl = 8
};

UPROPERTY(EditAnywhere, Category = FLZSplinePCG, meta = (Bitmask, BitmaskEnum = "EPointSequenceType"))
uint8 SequenceType;
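Since the stored value is now the literal mask, a test in code can compare against the enum value directly; a minimal sketch (the cast is needed here because this sample doesn't use ENUM_CLASS_FLAGS):

// Sketch: SequenceType holds the literal mask bits, so test against the enum value itself.
const bool bWhirl = (SequenceType & static_cast<uint8>(EPointSequenceType::Whirl)) != 0;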

Is this still working for you guys? I’m trying to use this for online play, and it works, but I think my “TEST_BIT” isn’t correct and it’s causing my if statement to fail.