C++ bitmask enums appear to be offset by 1

I created the following enum for use as a bitmask.

 UENUM(BlueprintType, meta = (Bitflags))
 enum class EMovementTrackingFlags : uint8
 {
     None = 0x00,
     X = 0x01,
     Y = 0x02,
     Z = 0x04
 };

I then implement it as a public variable as follows.

     UPROPERTY(EditAnywhere, BlueprintReadOnly, Category = "Tracking Settings", meta=(Bitmask, BitmaskEnum = "EMovementTrackingFlags"))
     uint8 movementTrackingFlags;

This causes it to appear in the Details panel as follows:

[Image: flagssettings.png]

The values may be selected with a check mark, permitting mixed flags as intended. However, when I evaluate the flags I get erroneous results. Outputting the value of Movement Tracking Flags as an integer shows that "None" is equal to a value of 1, "X" is equal to a value of 2, "Y" is equal to a value of 4, and "Z" is equal to a value of 16. This is despite their exact values being set explicitly as 0, 1, 2, and 4, respectively.

Please note that a "None" value equal to 0 is designated as a requirement on the Unreal Engine 4 Coding Standard page.

Product Version: UE 4.13

asked Sep 16 '16 at 01:31 AM in C++ Programming


MikePrinke Sep 16 '16 at 03:15 AM

I've found a work-around by eliminating all the explicit values and removing the "None" enumerator. X, Y, and Z take the expected values in that case.

 enum class EMovementTrackingFlags : uint8
 {
     X,
     Y,
     Z
 };

This necessitates work-arounds in order to evaluate the flags.

 // Note: the bitwise & on an enum class requires operators such as those
 // generated by ENUM_CLASS_FLAGS(EMovementTrackingFlags).
 bool UTargetTrackingComponent::IsMovementFlagged(EMovementTrackingFlags testTrackingFlag)
 {
     EMovementTrackingFlags flagValue = (EMovementTrackingFlags)movementTrackingFlags;
     return (flagValue & testTrackingFlag) != (EMovementTrackingFlags)0;
 }


as well as simple checks like:

 movementTrackingFlags != 0

This is in direct contradiction to the code shown on the Unreal Code Standards page, e.g.:

 enum class EFlags
 {
     None  = 0x00,
     Flag1 = 0x01,
     Flag2 = 0x02,
     Flag3 = 0x04
 };

As well as...

 if (Flags & EFlags::Flag1)

This causes my code to behave and appear in editor as expected. However, I am leaving this thread open should staff weigh in on this.

Danny.Bulla Sep 18 '16 at 02:51 AM

I am also running into a similar problem.

I have my Bitflags set up as follows:

 UENUM(BlueprintType, meta = (Bitflags))
 enum class EHeroActionTypes : uint8
 {
     None = 0x00,
     Movement = 0x01,
     Attack = 0x02,
     Dodge = 0x04,
     Climb = 0x08
 };
And the following variable declared in my class:

     UPROPERTY(EditAnywhere, Category = "Actions", meta = (DisplayName = "Action Suppress Flags", Bitmask, BitmaskEnum = "EHeroActionTypes"))
     uint8 HeroActionsToSuppress;    

Whichever value I set HeroActionsToSuppress to via the editor, it is always one value higher when I inspect it with the debugger. For instance:


You would expect HeroActionsToSuppress to be 0x02 in this case, but the debugger reports its value as 0x04.

[Image: bit_mask.png]
Danny.Bulla Sep 19 '16 at 05:36 PM

I have a theory that this is an issue with how the editor sets the value of the bitflag, rather than the C++ being inherently wrong. I'm going to dig in a bit more today.

MikePrinke Sep 19 '16 at 05:55 PM

This observation appears to be correct. I noticed when doing this that values set at the C++ level evaluated correctly, but values set through the editor would be offset as noted here. The only way to fix them was to explicitly set the integer value -- not using hex or bit shift operators, but with a constant integer: 1, 2, 4. If I did that, everything behaved consistently.

It's pretty inconvenient.

Danny.Bulla Sep 19 '16 at 09:30 PM

Unfortunately setting them to a constant integer did not resolve the issue for me.

 UENUM(BlueprintType, meta = (Bitflags))
 enum class EHeroActionTypeFlags : uint8
 {
     None = 0,
     Movement = 1,
     Attack = 2,
     Dodge = 4,
     Climb = 8
 };

MikePrinke Sep 19 '16 at 10:00 PM

Ah. I did have to get rid of "None."

 enum class EMovementTrackingFlags : uint8
 {
     X = 1,
     Y = 2,
     Z = 4
 };

2 answers

OK, after talking to the Epic staff, this appears to be the correct solution. Posting here so others can utilize this awesome feature!

There does appear to be an inconsistency between the Coding Standards and how the Bitflag enum should be declared. See the Properties Documentation for the proper syntax.

Here is an example of how to declare a Bitflag enum:

 UENUM(Blueprintable, Meta = (Bitflags))
 enum class EHeroActionTypeFlags
 {
     None,
     Movement,
     Attack,
     Dodge,
     Climb // values default to sequential flag indices (0, 1, 2, ...)
 };

Note that I'm not using the ENUM_CLASS_FLAGS macro any more. Instead I've created three macros that handle the testing for me:

 #define TEST_BIT(Bitmask, Bit) (((Bitmask) & (1 << static_cast<uint32>(Bit))) > 0)
 #define SET_BIT(Bitmask, Bit) (Bitmask |= 1 << static_cast<uint32>(Bit))
 #define CLEAR_BIT(Bitmask, Bit) (Bitmask &= ~(1 << static_cast<uint32>(Bit)))

And an example of me using this macro to test if an action is allowed:

         if(TEST_BIT(HeroActionsToAllow, EHeroActionTypeFlags::Movement))
             bMovementAllowed = true;

UE-32816 is tracking the issue, but as noted, this is actually the designed behavior. You need to shift by the enum value in all of your testing, setting, and clearing.


answered Sep 22 '16 at 07:16 PM


MikePrinke Oct 03 '16 at 03:52 AM

Thanks so much for doing the legwork on that one!

Vertex Soup Dec 11 '16 at 03:23 PM

Does that mean the enum needs to be declared as uint32 in code? I'm referring to the macros above. I'd like to use it in code as well; I'm just not sure yet how to declare them properly. It's a mess :)

Muit Apr 30 '18 at 08:59 AM

Had the same problem and ended up doing something similar.

I declared the enum normally, as you did.

But for operations, I instead created a conversion from the enum to a flag:

 #define TOFLAG(Enum) (1 << static_cast<uint8>(Enum))

That way, you can just do normal flag operations:

 bool bFlagFound = Flags & TOFLAG(EGridPointFlags::Discovered);
 // or
 bool bFlagFound = Flags & TOFLAG(MyEnumVariable);

That was my final solution; I found it to be the easiest way.


Ryan Darcey May 20 '19 at 12:03 AM

For those that need to expose the UENUM as a BlueprintType and need to get beyond the uint8 limit, you can define the UENUM using the namespace method:

 UENUM(BlueprintType, Meta = (Bitflags))
 namespace EPlayerInputFlags
 {
     enum Type
     {
         // flag entries go here; values are treated as flag indices
     };
 }

With that, you can have variables and functions like this:

 // Determine whether player input is enabled/disabled
 UPROPERTY(Category = "Input", EditAnywhere, BlueprintReadWrite, Meta = (Bitmask, BitmaskEnum = "EPlayerInputFlags"))
 int32 PlayerInputFlags = -1;
 // Returns true if player input flag is set
 UFUNCTION(Category = "Input", BlueprintCallable)
 bool GetPlayerInputFlag(const TEnumAsByte<EPlayerInputFlags::Type> InFlag) const;
 // Sets specified player input flag
 UFUNCTION(Category = "Input", BlueprintCallable)
 void SetPlayerInputFlag(const TEnumAsByte<EPlayerInputFlags::Type> InFlag, const bool bSet);


As per UE-32816,

This is actually as-designed behavior - the enum values are currently assumed to be flag indices and not actual flag mask values. That is, the editor will compute (1 << N) where N is the value of the enum. This is consistent with how user-defined enums work on the editor side. Can see how it would be useful to be able to designate native C++ enums as literal mask values though, especially when used alongside ENUM_CLASS_FLAGS().

Fortunately, since UE-32816 was addressed, all you have to do to get the OP's desired behaviour is add the following to the top of your enum declaration:

 UENUM(BlueprintType, meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor="true"))


answered Nov 01 '18 at 07:25 AM


Scylardor4242 Jun 10 '19 at 10:14 PM

Thank you for this addition! I think it was the missing piece preventing me from correctly using bitflags. I either had all the bitflag values but each was wrongly mapped to the previous one, or the last one was missing and the first one couldn't be ticked. So, for reference, here's a code sample that works as expected on my machine (UE 4.17):

 UENUM(BlueprintType, meta = (Bitflags, UseEnumValuesAsMaskValuesInEditor = "true"))
 enum class EPointSequenceType : uint8
 {
     Linear = 1,
     HorizontalSineWave = 2,
     VerticalSineWave = 4,
     Whirl = 8
 };

 UPROPERTY(EditAnywhere, Category = FLZSplinePCG, meta = (Bitmask, BitmaskEnum = "EPointSequenceType"))
 uint8 SequenceType;
