Difference between Tick & Timer at same interval?

As the title says, what would the difference be between using tick with an interval of 1s and using a timer with an interval of 1s?

Tick is supposed to occur every frame, meaning if your game runs at 60 fps then the tick will be triggered 60 times per second.
Ticks can be inconsistent because they depend on the game’s fps.
A timer, on the other hand, tries to stay relative to actual time and therefore should be more consistent across a variety of computers.

Yeah, I know that much, but does that include pre-set intervals as well? I mean, 1s should be 1s in both cases, but I guess I’m missing fundamental knowledge about how tick works behind the scenes.

That is what I meant: if I have the tick interval set to 1s, is there a performance difference or something between tick and timer if both are set to 1s?

You can actually modify the tick interval in BPs too, so there’s another thing to consider.
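For reference, here’s a minimal UE4 C++ sketch of both setups at the same 1 s interval. The class, handle, and function names are placeholders made up for this example; in Blueprints the equivalents are the actor’s Tick Interval property and Set Timer by Event/Function Name.

```cpp
// MyActor.h — minimal sketch; AMyActor, EverySecondHandle, and OnEverySecond
// are placeholder names for this example, not anything from the thread.
#pragma once
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyActor.generated.h"

UCLASS()
class AMyActor : public AActor
{
    GENERATED_BODY()
public:
    AMyActor();
    virtual void BeginPlay() override;
    virtual void Tick(float DeltaSeconds) override;
    void OnEverySecond();                 // fired by the looping timer
private:
    FTimerHandle EverySecondHandle;
};

// MyActor.cpp
#include "MyActor.h"
#include "TimerManager.h"

AMyActor::AMyActor()
{
    // Option 1: keep using Tick, but only let it run once per second.
    PrimaryActorTick.bCanEverTick = true;
    PrimaryActorTick.TickInterval = 1.0f;     // seconds between Tick calls
}

void AMyActor::BeginPlay()
{
    Super::BeginPlay();
    // Option 2: a looping timer at the same 1 s rate, handled by the world's timer manager.
    GetWorldTimerManager().SetTimer(EverySecondHandle, this, &AMyActor::OnEverySecond, 1.0f, true);
}

void AMyActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    // With TickInterval = 1.0, DeltaSeconds here is roughly 1.0 (time since the last Tick).
}

void AMyActor::OnEverySecond()
{
    UE_LOG(LogTemp, Log, TEXT("Timer fired after ~1 second"));
}
```

Either way you get a callback roughly once per second; the rest of this thread is about how “roughly” that actually is.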

If there is a performance difference between tick and timer set to the same interval, it would not be much, as all timer routines function pretty much the same in terms of interrupt processing.

And the tick is a timer, there is no way around it. One can code a system up to always voluntarily release the processor, but that path is fraught with problems; it is much better to set the Tick to X milliseconds and let interrupt processing handle any context switch, even if it is nothing more than a “watchdog” routine.

There is one difference between Tick and Timer: if you wish to have a very general-purpose “Timer” routine, then using the tick to drive the timer is a far better way to go, because you can set it up such that you can pass parameters to the functions invoked by the Tick. You are completely in control of it.

Whereas with the Timer you do not have that luxury; it doesn’t allow passing a structure as a parameter to the function invoked when the Timer fires. As soon as I saw this shortcoming, I never used timers at all and just wrote my own “timer function” using the Tick to drive it.
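To illustrate the kind of tick-driven timer described above, here’s a rough C++ sketch. The payload struct, class, and function names are all made up for the example; the idea is just accumulating delta time and invoking a callback you fully control (in Blueprints, the same thing is accumulating Delta Seconds in Tick).

```cpp
#include "CoreMinimal.h"

// Payload struct — purely illustrative; put whatever the callback needs in here.
struct FMyTimerPayload
{
    int32 WaveNumber = 0;
    float SpawnRadius = 500.f;
};

// A tick-driven "timer": feed it DeltaSeconds every frame, and when the
// interval elapses it calls your function with the payload you gave it.
class FTickDrivenTimer
{
public:
    void Start(float InInterval, const FMyTimerPayload& InPayload,
               TFunction<void(const FMyTimerPayload&)> InCallback)
    {
        Interval = InInterval;
        Elapsed  = 0.f;
        Payload  = InPayload;
        Callback = MoveTemp(InCallback);
    }

    // Call this from your actor's Tick(DeltaSeconds).
    void Update(float DeltaSeconds)
    {
        Elapsed += DeltaSeconds;
        if (Callback && Elapsed >= Interval)
        {
            Elapsed -= Interval;      // keep the remainder so error doesn't accumulate
            Callback(Payload);        // you decide exactly what gets passed, and when
        }
    }

private:
    float Interval = 1.f;
    float Elapsed  = 0.f;
    FMyTimerPayload Payload;
    TFunction<void(const FMyTimerPayload&)> Callback;
};
```

You’d keep one of these as a member, call Start once with whatever payload you want delivered, and call Update(DeltaSeconds) from your actor’s Tick.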

Hope this helps,

Thank you. TheDreamCatcher wrote about timers being more precise. Is that over a long time, or is it just overall precision each time it fires?

Actually it comes down to how “nitpicky” you wish to be about a timer. (Also presume that I’m talking about a game where you do not have runaway processes that will just hog the processor.)

The reason I say “nitpicky” is that with most Timers, or time-driven events, the timer hardware in a PC/Mac, etc. will not actually give a timing resolution of less than 13 to 15 milliseconds. All the timing I have done of the tick shows it ranging between 18 and 22 milliseconds. This is on an Intel i7 at 2.5 GHz. What the numbers are on phones/mobile, I really don’t know, and don’t care.
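If you want to repeat that kind of measurement yourself, a quick sketch is to log the delta each Tick with the default tick interval (placeholder actor/method names again):

```cpp
// Logs how long each frame actually took, i.e. the real interval between Ticks.
void AMyActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    UE_LOG(LogTemp, Log, TEXT("Tick delta: %.3f ms"), DeltaSeconds * 1000.0f);
}
```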

Now there are what are called high-speed timers that will try to get to nanosecond resolution, but the operating system needs to have a driver installed to handle that kind of resolution. Is it needed for a game? Nope. Are timers of that resolution ever needed? They sure as heck are. When you’re interfacing to real-time hardware that is pumping in data about the temperature of a drill bit when drilling for oil, or the depth, or the nature of the rock being drilled through, yep, that kind of resolution is needed.

But back to timers. TheDreamCatcher is referring to a kind of resolution whereby one asks for a timer to fire at XXXXXXX milliseconds/microseconds etc. in the future. Assuming that you don’t have processes in the game that are hogging the machine (especially with I/O, or a loop that is going to execute a few million times), if you check the actual time that the timer fired IN THE OPERATING SYSTEM, it will have fired exactly at that time. BUT…

There’s a problem: if the timer fired at that time, the “event” is just starting to be handled by the operating system at that time. This event has not reached your code yet. In order for the event to reach your code, it has to pass from the operating system (which, remember, may or may not do a context switch to Epic’s UE4; it may just queue up the interrupt to be handled later

(end of part 1)

(Start of Part 2)

by the operating system, so when that happens UE4 will be scheduled and dispatched according to the priority and the algorithms for scheduling and dispatching, and if pages need to be brought in from disk, via a paging algorithm or a swap, well, that’s gonna take a lot more time).

Once UE4 is dispatched to actually run, it will then determine when to schedule the thread that your blueprint is going to be running on, and at that time UE4 more than likely will give priority to the timer that fired, so that finally your “Timer” routine gets control and does whatever you have it coded up to do.

So does the function that the Timer calls actually execute precisely at the time that you requested? Absolutely not. As you can see, with the amount of work that has to be done, it cannot possibly be EXACTLY when you wanted. Will it be close to that time? Heck yes it will, probably no more than 15 to 20 milliseconds AFTER your Timer popped.

What I believe TheDreamCatcher was referring to is that if you have a function/code driven by the Tick that takes far too long to process (remember, driving code using the tick is not bad, but you don’t want code driven by the tick that takes 2 whole seconds or more to run), then you can get in trouble. In relation to a Timer popping, this should not happen, as a Timer executed by UE4 should have its pop dispatched internally at a higher priority for your blueprint than the standard Tick.

But all things considered, an argument can be made that it is more efficient to use the tick vs. a Timer, because tick processing is always going to happen, and we can hope that the folks at Epic have the Tick code streamlined as much as possible, whereas the Timer is extra processing that must occur if you use it.

Hope this helps,

There should not be much difference, only in the way they operate and the order in which they execute (I don’t know what order myself ;p). Tick indeed is intended to run on every frame, but timers don’t magically interrupt when time passes; the timer manager checks on every frame whether the time has passed, and if it has, it executes the timer.
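To illustrate that “checked on every frame” idea, here’s a bare-bones sketch of the pattern in plain C++. This is not Epic’s actual FTimerManager code, just the general shape of it: a timer can never fire between frames, only on the first frame after its deadline has passed.

```cpp
#include <functional>
#include <vector>

struct PendingTimer
{
    double FireTime;                  // absolute time (seconds) at which to fire
    std::function<void()> Callback;
};

// Called once per frame with the current game time; any timer whose deadline
// has already passed fires now, however late the frame happens to be.
void TickTimers(std::vector<PendingTimer>& Timers, double Now)
{
    for (auto It = Timers.begin(); It != Timers.end();)
    {
        if (Now >= It->FireTime)
        {
            It->Callback();
            It = Timers.erase(It);    // one-shot: remove after firing
        }
        else
        {
            ++It;
        }
    }
}
```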

I think a timer should give better accuracy than tick, as tick was not really made to be a time interval; tick is only good if you want to update things on every frame, and the timer code has all the code needed to improve accuracy. But tick is also good for counting time on every frame, as you can count time by accumulating delta time. This gives you 100% control over the time measurement, and you can update things on the go according to it. Tick actually should give you better accuracy in that case, since setting a timer to something like 10ms is not guaranteed to execute on time if the next frame suddenly comes at 8ms, while tick is guaranteed to run on every frame and you can update things on every frame.
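A rough sketch of the accumulate-delta-time approach mentioned above (Accumulated and DoFixedStep are illustrative placeholders); the while loop is what gives you that control, since a long frame can catch up on the intervals it missed instead of silently drifting:

```cpp
// Accumulated is a float member initialised to 0; DoFixedStep is whatever
// you want to run every 10 ms of game time.
void AMyActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    Accumulated += DeltaSeconds;
    const float Interval = 0.010f;      // 10 ms logical step

    // A 16 ms frame runs this once or twice, so long frames catch up.
    while (Accumulated >= Interval)
    {
        Accumulated -= Interval;
        DoFixedStep();
    }
}
```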

CPUs (x86 at least) don’t have timer instructions, and even OS time functions cannot guarantee that timed code will be executed at exactly the right time, as the CPU also needs to run other software, so there is always inaccuracy. In other words, for software a sense of time does not exist; programs can only check how much time has passed since the last check and act according to that, and UE4 is no different. The fact that everything runs at the same time on your PC is just an illusion; in reality each program takes its turn on the CPU to be executed, and some are run on different cores to speed things up, which is why more cores = better multitasking.

There is zero difference in performance. Regardless of what you do, it will still execute your code at a specific interval (in the case of tick it will be on every frame), your code will still take its time to execute on the CPU (in your case in the Blueprint VM, as I assume you use blueprints), and it will take the same time (unless you have some conditional code) regardless of when it is executed. UE4 will run Tick on objects and check timers if there are any either way. So performance depends purely on that interval, regardless of method: the shorter it is, the more often your code will be executed and eat CPU performance, provided your code does not slow down UE4’s frame-by-frame cycling.

Thank you all for your insights and information. I’ve just seen people hate on tick in every thread, despite the fact that you can change the interval and use delta seconds to make it work like a timer.