Triggers vs. On Event Tick

As far as I can see, the code that checks whether a trigger should fire is in C++, which means it's already compiled to machine code.

But the Tick function is defined by you, and since we are talking about Blueprints, the logic inside the Tick() event is not compiled to machine code; it is compiled to Blueprint bytecode and interpreted by the Blueprint VM at runtime.

This would mean a trigger is more efficient than doing the tests in Tick().

Also keep in mind that the code behind a trigger is probably low level and has access to many of the internal data structures used by the engine, making it much more optimized than higher-level logic implemented in Blueprint.
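
If it helps to make that concrete, here is roughly what the native path looks like in C++. The physics scene performs the actual overlap test during its own update; your handler only runs when the delegate fires. (AMyTriggerActor and OnOverlapBegin are placeholder names of my own; OnComponentBeginOverlap and AddDynamic are the real component API.)

```cpp
// MyTriggerActor.h -- minimal sketch of a native trigger actor.
// The physics engine performs the overlap test internally; the
// handler below only runs when an overlap actually begins.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/BoxComponent.h"
#include "MyTriggerActor.generated.h"

UCLASS()
class AMyTriggerActor : public AActor
{
    GENERATED_BODY()

public:
    AMyTriggerActor()
    {
        TriggerBox = CreateDefaultSubobject<UBoxComponent>(TEXT("TriggerBox"));
        RootComponent = TriggerBox;
    }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Event-driven: no per-frame polling in game code.
        TriggerBox->OnComponentBeginOverlap.AddDynamic(
            this, &AMyTriggerActor::OnOverlapBegin);
    }

    UFUNCTION()
    void OnOverlapBegin(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
                        UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
                        bool bFromSweep, const FHitResult& SweepResult)
    {
        UE_LOG(LogTemp, Log, TEXT("%s entered the trigger"), *OtherActor->GetName());
    }

    UPROPERTY(VisibleAnywhere)
    UBoxComponent* TriggerBox;
};
```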

What is the actual trade-off in efficiency between using a Trigger and using OnEventTick?

The way I understand it, a trigger still has to be checked every tick until the conditions to fire it are true. If that is the case, how is that more efficient? I am just trying to wrap my head around what is going on under the hood so that I can make better decisions about how my game is implemented. A vague “Don’t use OnEventTick unless absolutely necessary” or “Use timers instead” is not very informative.
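
For concreteness, my understanding of the “use timers instead” advice is something like the sketch below: rather than comparing a counter every frame, you ask the engine's timer manager to call you back once after the delay. (This assumes an AActor subclass; AMyActor, DoSomething, and MaxSec are placeholder names, while GetWorldTimerManager and SetTimer are the real API.)

```cpp
// Sketch: replace a per-frame counter check with a one-shot timer.
void AMyActor::BeginPlay()
{
    Super::BeginPlay();

    FTimerHandle TimerHandle; // keep as a member if you need to cancel it later
    GetWorldTimerManager().SetTimer(
        TimerHandle, this, &AMyActor::DoSomething,
        /*InRate=*/MaxSec, /*InbLoop=*/false);
}

void AMyActor::DoSomething()
{
    // Runs once, MaxSec seconds after BeginPlay. The timer manager does
    // the countdown natively, so no Blueprint VM work happens per frame.
}
```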

It would still be nice to see some speed trade-offs listed between checking a trigger and evaluating, say, a single-condition check in Event Tick (if cursec == maxsec, dosomething; else cursec++).
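
Written out as native Tick code, that check is tiny; the Blueprint version runs the same steps, but each node (compare, branch, increment) is dispatched through the Blueprint VM every frame. A sketch, assuming CurSec and MaxSec are float members of the same hypothetical AMyActor:

```cpp
// Native equivalent of the per-frame Blueprint counter check.
void AMyActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (CurSec == MaxSec) // exact float equality is fragile; an int
    {                     // counter or a >= test is safer in practice
        DoSomething();
    }
    else
    {
        ++CurSec;
    }
}
```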

I thought the same for some things, but that simple floating-point comparison was exactly what I was doing, and I was told by staff that the trigger event would be faster than Event Tick. That is what got me thinking about it. The Blueprint leading up to the if statement was what I wrote above (if cursec == maxsec, dosomething; else cursec++).

A trigger has collision detection logic in it, so obviously it's going to take more time than a simple floating-point comparison in Blueprint. However, depending on how the trigger is actually implemented, this collision detection might come for free, since UE is already doing collision detection for all the dynamic stuff anyway.
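
That also matches how overlap events are exposed per component: the physics scene tracks overlap pairs for dynamic objects during its normal update, and a component merely opts in to having begin/end events generated from that data. Roughly, these are the relevant knobs (shown on the TriggerBox component from the earlier sketch):

```cpp
// Sketch: settings that control a component's participation in overlap
// tracking. Events are only generated when an overlap begins or ends,
// not every frame the overlap persists.
TriggerBox->SetCollisionEnabled(ECollisionEnabled::QueryOnly); // queries/overlaps only, no physical blocking
TriggerBox->SetCollisionResponseToAllChannels(ECR_Overlap);    // overlap (rather than block or ignore) everything
TriggerBox->SetGenerateOverlapEvents(true);                    // fire the OnComponentBegin/EndOverlap delegates
```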