How to use a Bitflags parameter in a BlueprintCallable function

I'm trying to write a BlueprintCallable function that takes in a Bitflags parameter. Inside the function I want to use basic bitwise operations on the parameter. I was able to finally get it working but it seems hacky and like I'm doing something wrong. If anyone has any suggestions it would be greatly appreciated. This is what I have.

Enum (defined as bitflags):

 UENUM(meta = (Bitflags))
 enum class EFlipFlags : uint32
 {
     kNone = 0 UMETA(DisplayName = "None"),
     kDontSpawnEnemies = 1 << 0 UMETA(DisplayName = "DontSpawnEnemies"),
     kDontSpawnDust = 1 << 1 UMETA(DisplayName = "DontSpawnDust"),
     kDontEffectTVs = 1 << 2 UMETA(DisplayName = "DontEffectTVs"),
     kDontEffectClocks = 1 << 3 UMETA(DisplayName = "DontEffectClocks"),
 };
 ENUM_CLASS_FLAGS(EFlipFlags)

Function with bitflag parameter:

 void ARoom::Flip(int32 flags)
 {
     ...
 
     flags >>= 1;
     bool flag_on = (flags & static_cast<uint32>(EFlipFlags::kDontSpawnEnemies)) == static_cast<uint32>(EFlipFlags::kDontSpawnEnemies);
     if (flag_on == false) // spawn enemies?
     {
         ...
     }
 }

The main issue I'm having is that EFlipFlags is of type uint32 but the parameter, flags, is of type int32. This makes it so that I have to do a kind of hacky bitshift, flags >>= 1, in order to convert the signed bits to unsigned bits. My question is kind of vague, but is there a better way I should be doing this? If I could just change the parameter from an int32 to a uint32 it would seem a lot better to me, but the fact that the enum has to be unsigned and the parameter has to be signed makes me think I'm doing something wrong. Any tips?
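For reference, here is a minimal plain-C++ sketch (UE macros stripped so it compiles anywhere; the enum values are stand-ins copied from the question) showing that a static_cast between int32 and uint32 preserves the bit pattern on two's-complement platforms, so the signed parameter can be converted directly without a shift:

```cpp
#include <cassert>
#include <cstdint>

// Stand-in for the UENUM above, without UE macros.
enum class EFlipFlags : uint32_t
{
    kNone             = 0,
    kDontSpawnEnemies = 1u << 0,
    kDontSpawnDust    = 1u << 1,
};

// static_cast between int32 and uint32 keeps the same bit pattern
// (two's complement), so the signed Blueprint parameter can be tested
// against the unsigned enum directly.
inline bool HasFlag(int32_t Flags, EFlipFlags Flag)
{
    const uint32_t Bits = static_cast<uint32_t>(Flags);
    return (Bits & static_cast<uint32_t>(Flag)) != 0;
}
```

Whether a shift is still needed depends on how the editor packs the mask, which is the part the question is really about.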

Product Version: UE 4.22

asked Jun 19 '19 at 03:49 AM in C++ Programming

Blindopoly


1 answer

You could use reinterpret_cast; it casts the type without touching memory, reading the memory as it is and leaving the bits intact.

https://en.cppreference.com/w/cpp/language/reinterpret_cast
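A minimal sketch of what that looks like in plain C++ (no UE types; BitsOf is a hypothetical helper name): reinterpret_cast cannot convert an int32 to a uint32 by value, but casting a pointer re-reads the same bytes unchanged:

```cpp
#include <cstdint>

// reinterpret_cast works on pointers and references, not values: take the
// address of the signed flags, cast it, and dereference to read the same
// 32 bits as unsigned. The bit pattern is untouched, so -1 reads back as
// 0xFFFFFFFF. (Aliasing int32_t through uint32_t* is permitted, since they
// are signed/unsigned variants of the same type.)
inline uint32_t BitsOf(int32_t Flags)
{
    return *reinterpret_cast<uint32_t*>(&Flags);
}
```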

You can also use the Bitmask specifier for Blueprint. Of course this won't change how C++ deals with it, so you still need to do the above, but it should help on the Blueprint side:

https://docs.unrealengine.com/en-US/Engine/Blueprints/UserGuide/Variables/Bitmask/index.html
https://docs.unrealengine.com/en-US/Programming/UnrealArchitecture/Reference/Properties/#asbitmasks

I'm not sure if this works with UPARAM for function arguments; you'd need to test it.
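For anyone testing that, a hypothetical declaration sketch (not standalone-compilable; function and enum names taken from the question) of how the Bitmask/BitmaskEnum specifiers from the links above would attach to the parameter:

```cpp
// Hypothetical header sketch. Whether UPARAM accepts the Bitmask meta on a
// function argument is exactly the part that needs testing.
UFUNCTION(BlueprintCallable)
void Flip(UPARAM(meta = (Bitmask, BitmaskEnum = "EFlipFlags")) int32 Flags);
```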


answered Jun 19 '19 at 10:10 PM

Shadowriver

Blindopoly Jun 20 '19 at 04:52 AM

Thanks for the reply. I'll try reinterpret_cast out tomorrow, though it still seems weird that the bitflag parameter has to be a signed integer. Also, I'm pretty sure UENUM(meta = (Bitflags)) does the same thing as checking that box in Blueprints, so I don't think that will change anything.

Shadowriver Jun 21 '19 at 11:29 AM

That's because Blueprints only support int32 and uint8, and since 4.22 also int64.

Blindopoly Jun 21 '19 at 03:57 PM

Yeah, that's what I figured. Hopefully Epic does something about this issue soon though. Also, I tried using reinterpret_cast to cast the signed int to an unsigned int but it didn't change the value of it and I still have to do flags >>= 1 to get the bits to align correctly. Can you further explain what you meant?

Shadowriver Jun 22 '19 at 02:58 AM

Hmm, try turning flags into a pointer and then reinterpret_cast the pointer.

They're never going to do this, because it's a conceptual decision to support only those types to keep things simple, having just Integer and Byte types; they added Integer64 to support bigger integers and make full use of what a 64-bit CPU can do.

Blindopoly Jun 22 '19 at 03:53 AM

That's basically what I tried. Oh well, I'll just keep it as is... it isn't the end of the world. I just wanted to make sure there wasn't a better way to do it. Thanks for your help.
