Bitmask operations in blueprint return incorrect results

I have a simple bitmask enum with just three values in it, and I put together a function to let me quickly read and modify specific bits of a bitmask variable on the player (using this enum).

As you can see, the function should simply turn a bit on if the boolean passed in is true, and turn it off otherwise. However, the results are not as expected; when turning on a bit, it seems to access the wrong bit index. In my testing, for instance, I had the three bits default to 100 (4), then used the function to turn the third bit on again and turn it off after a short delay. Since the third bit is already on by default, I expected nothing to change, but the result came out as 110 (6). Turning the bit off has the same result and accesses the same incorrect bit index, although it does, of course, turn that bit off.

Both the Make Bitmask node and the Player Props variable are properly set to use the bitmask enum I created. As an additional note, the result is the same with or without the Make Bitmask node.

Hi patience,

  • Does this occur in a clean, blank project with no additional content, or is it limited to one project?
  • Have you tried this in the event graph? Do you see the same thing happen there, or does the error occur only within custom functions?
  • What steps can I take to reproduce this on my end?
  • Do you have any other functionality that could be changing/overriding the bitmask data?

  • I’ve just put one together, and yes, it does still happen.
  • I did so in the test project, and sure enough it happens in either one.
  • See attachment below. In short, create a bitmask enum with some values in it, and have a bitmask variable in a blueprint (I recommend changing the default values so the effect is more noticeable). Recreate the function in the image of my initial post, and run it at any point on the bitmask (replacing references to the Player Props variable with your own bitmask variable). You’ll see that the values are changed incorrectly after calling the function.
  • I do not; this was a new implementation I was putting in to replace another system. This function is the only place where these bitmasks are modified, and it is called only once from player input, and then again after a timed delay.

I’ve attached a completely barebones project that replicates the exact problem. Run the level, and the values will be printed to the screen as they are changed after a few seconds.

Hi patience,

On my end, the values are expressed both in the function and without as the same, 6 and 4. Are you expecting different values? What specifically should I be seeing on my end?

Hi,

Sorry if the test seems a bit obscure. The problem is that the bitmask operations access the wrong bit index. In the test project I attached (which I’ll refer to from here on out), the TestBitmask variable has the ‘Third’ bit set to 1 and the others to 0. Binary 100 is 4 in decimal, which is expected.

If you follow the function, you’ll see that the True branch should enable a bit using a bitwise OR operation. In the test I provided, I try to set the “Third” bit on again even though it is already on by default, yet as you’ve seen in the test, for some reason the “Second” bit is turned on instead, making it 110 in binary, which is 6 in decimal. This leads me to suspect it is targeting the wrong bit of the bitmask: one behind the actual desired index.

The same bit is accessed in the False branch, so I believe it has to do with the enum index not matching the bit index it is supposed to represent. The only way this would make sense as-is is if enums are zero-based while bitmask indices are not, but that would seem like an odd inconsistency (and certainly something worth mentioning in the docs).

As I have not run this test with a larger enum or bitmask set, I’m also wondering if the Input variable using the enum treats it like a normal enum instead of the bitmask it should be. If that is the case, again it seems like an inconsistency.

Any news on this? I’m trying to convert a bitmask enum parameter to a bitmask (of the same enum type), but the result is wrong after casting the enum to an int.

Hello patience,

After taking a quick look over your project, I found that this appears to be working as intended. I see that you are converting a byte into an int within your function. This has the effect of passing the current enum index. This means that if you select your Third option and pass it into the “Make Bitmask” node (as seen in your function), you will be passing the number 2 (010 in binary). Please keep in mind that your “Test Bitmask” variable is set to 4 (100 in binary). These two numbers are then run through a Bitwise OR node (more info below). The expected result is 6 (110 in binary). I have also provided a test below that will hopefully help explain the statements above. I hope that this information helps.

Bitwise OR:

A bitwise OR takes two bit patterns of equal length and performs the logical inclusive OR operation on each pair of corresponding bits. The result in each position is 0 if both bits are 0, while otherwise the result is 1. For example: 0101 (decimal 5) OR 0011 (decimal 3) = 0111 (decimal 7)

Example:

96712-bytetointhelp.png

Make it a great day

Alright I understand the information you’ve provided, but then I have to ask what I could do to set a specific bit on a bitmask variable. I cannot set the input type to any form of bitmask; I can only set it to the enum directly. The enum byte is not compatible with Make Bitmask and not using the Make Bitmask node simply gets me the same results I had before. Is there a solution for this?

Actually, I figured out a solution. For anyone else: it just takes a little math with exponents. Raising 2 to the power of the enum index produces the integer value of the single bit you want, which you can then combine with the bitmask. Here is the working BP function: