I can’t find any good examples, but what would be the correct way to use a UENUM as a bitmask?
For example:
UENUM()
namespace EBitMask
{
    enum BitMask
    {
        BM_FirstValue  = 1  UMETA(DisplayName = "First Value"),
        BM_SecondValue = 2  UMETA(DisplayName = "Second Value"),
        BM_ThirdValue  = 4  UMETA(DisplayName = "Third Value"),
        BM_FourthValue = 8  UMETA(DisplayName = "Fourth Value"),
        BM_FifthValue  = 16 UMETA(DisplayName = "Fifth Value"),
        // ...
        BM_Max UMETA(Hidden),
    };
}
In C++ it should work, I think, since an enum is an enum. In Blueprint it won’t, because Blueprint does not know how to handle bit flags; you would need to write some Blueprint-callable functions in C++ to be able to operate on them there.
It’s better to use hex for flags (0x01, 0x02, 0x04, 0x08, 0x10, 0x20, etc.); pad with as many leading zeros as you need to fit all the flags.
That’s entirely a personal preference. I either use bit shifts when setting them, or hexadecimals as suggested:
enum BitMask
{
    BM_FirstValue  = (1 << 0) UMETA(DisplayName = "First Value"),
    BM_SecondValue = (1 << 1) UMETA(DisplayName = "Second Value"),
    BM_ThirdValue  = (1 << 2) UMETA(DisplayName = "Third Value"),
    BM_FourthValue = (1 << 3) UMETA(DisplayName = "Fourth Value"),
    BM_FifthValue  = (1 << 4) UMETA(DisplayName = "Fifth Value"),
    // ...
    BM_Max UMETA(Hidden),
};
and
enum BitMask
{
    BM_FirstValue  = 0x01 UMETA(DisplayName = "First Value"),
    BM_SecondValue = 0x02 UMETA(DisplayName = "Second Value"),
    BM_ThirdValue  = 0x04 UMETA(DisplayName = "Third Value"),
    BM_FourthValue = 0x08 UMETA(DisplayName = "Fourth Value"),
    BM_FifthValue  = 0x10 UMETA(DisplayName = "Fifth Value"),
    // ...
    BM_Max UMETA(Hidden),
};