Is Retriggerable Delay as Accurate as a Timer?

Hey, I’ve been using Delay and Retriggerable Delay for some time in my character’s blueprint. But over time I’ve noticed they are affected by performance; I’m just not sure in which ways. If they are bound to frame rate, or influenced in some other way that makes the delays inexact, I’ll go through and change them all to Timers, which I may do anyway to be safe. It’s just a bit of work, since I’ve been using them a lot.

Are Delays as accurate and exact as Timers?

Both the Timer and the latent action systems are tick based, so they trigger on the next frame after the time has passed; they are only as accurate as your frame time.
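For context, the Delay node is implemented as a latent action that the latent action manager ticks once per frame. A simplified sketch of the idea (modeled loosely on the engine’s FDelayAction; not the exact source, and the class name here is made up):

```cpp
// Simplified sketch of how a Delay-style latent action counts down.
// Modeled loosely on the engine's FDelayAction; not the exact source.
#include "LatentActions.h"

class FDelaySketchAction : public FPendingLatentAction
{
public:
	float TimeRemaining;
	FName ExecutionFunction;
	int32 OutputLink;
	FWeakObjectPtr CallbackTarget;

	// The latent action manager calls this once per frame.
	virtual void UpdateOperation(FLatentResponse& Response) override
	{
		// Subtract real elapsed time, so the total wait is correct...
		TimeRemaining -= Response.ElapsedTime();
		// ...but the "done" pin can only fire on a frame boundary.
		Response.FinishAndTriggerIf(TimeRemaining <= 0.0f,
			ExecutionFunction, OutputLink, CallbackTarget);
	}
};
```

The takeaway is that the countdown uses real elapsed time, so the total wait is correct, but completion can only happen on a frame boundary.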

Indeed, but I’m wondering about how they act differently, not how they are implemented.

I believe that answer is wrong. Timers, Delays and Timelines are not frame reliant. I tested it myself: I told a timer to trigger every 0.01677 seconds, and even with the fps locked to 10 (which is terribly low) it still ticked properly, as if there were no restriction from the low frame rate.
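A rough way to reproduce that kind of test in C++, assuming a made-up ATimerTestActor (lock the frame rate with the `t.MaxFPS 10` console command and compare the fire count against elapsed game time):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "TimerManager.h"
#include "TimerTestActor.generated.h"

UCLASS()
class ATimerTestActor : public AActor
{
	GENERATED_BODY()

	int32 TimerFireCount = 0;
	FTimerHandle TestHandle;

	virtual void BeginPlay() override
	{
		Super::BeginPlay();
		// Fire roughly 60 times per second, looping forever.
		GetWorldTimerManager().SetTimer(
			TestHandle, this, &ATimerTestActor::OnTimerFired,
			0.01677f, /*bInLoop=*/true);
	}

	void OnTimerFired()
	{
		++TimerFireCount;
		// Compare the fire count against wall-clock game time.
		UE_LOG(LogTemp, Log, TEXT("Fire %d at t=%.3f"),
			TimerFireCount, GetWorld()->GetTimeSeconds());
	}
};
```

As far as I understand, when the interval is shorter than a frame, the timer manager catches up by firing the callback several times within a single frame, which is why the counts stay correct even at 10 fps.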


All types of timers, delays, timer delegates, and timelines are not dependent on frame rate or performance. What can be affected is what happens before or after them. So in a bad performance situation, your event may call the delay late, and the things after the delay then get delayed as well. It may look like the “Delay” node’s fault, but in all likelihood it is not.

You are not clear enough about the problem you are having with “late delays”, so I can’t really help with it. I personally use many delays as well, but I haven’t hit a performance problem yet.


Maybe they use different tick channels. They definitely use ticks and delta time to count time; I checked the code, and the Delay and latent action systems are independent, so there might indeed be a difference in behavior. I didn’t study the code closely, I just looked at it, so I don’t know for sure; since it’s tick based, I assume it might be hooked up to other ticks.

I guess use Timers, then.

Yeah, that’s what I’ve learned from my experience too, but I’m not sure, which is partially why I’m asking here. A while back I had my stamina system, particularly the regen, based on a Delay that looped back into itself, kicked off from Event BeginPlay. The Delay worked, but its accuracy depended on frame rate: the stamina regenerated based on performance rather than on an exact amount of time. After discovering that, I switched it over to a Timer. The Timer is exact and seems to always be accurate. It’s this experience that made me curious about the differences between Timers and Delays as far as accuracy goes.

As far as my use of delays and retriggerable delays goes, many if not most are used to block flow until that flow stops, so that the last call through can finally count down the Retriggerable Delay and set a bool or something. Others are just simple delays there to space out time before something can be done, as usual. But the more I think about it, the more I wonder if Timers might be better to use instead of the retriggerable delays, since they act very similarly and at least I know the Timer is exact. I don’t know enough about how these work, since I don’t know much about how they were written in C++. My fear is that someone with a faster PC might have the delays trigger quicker and gain an advantage in multiplayer, or perhaps the server and the clients will handle them differently because of performance, if that makes sense.
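For what it’s worth, a Retriggerable Delay can be approximated with a Timer in C++, because calling SetTimer again with the same FTimerHandle restarts the countdown. A minimal sketch, where AMyCharacter, RetriggerHandle, bCanAct, and the function names are all hypothetical:

```cpp
// Assumes AMyCharacter declares: FTimerHandle RetriggerHandle; bool bCanAct;

void AMyCharacter::OnActionInput()
{
	bCanAct = false;
	// Calling SetTimer again with the same handle restarts the countdown,
	// which mirrors what Retriggerable Delay does each time it is hit.
	GetWorldTimerManager().SetTimer(
		RetriggerHandle, this, &AMyCharacter::OnActionSettled,
		1.0f, /*bInLoop=*/false);
}

void AMyCharacter::OnActionSettled()
{
	// Runs only once a full second has passed with no new input.
	bCanAct = true;
}
```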

I believe the reason Timers work better than delays is that a Timer is a self-contained tick with a delay built into it, while a regular Delay or a Retriggerable Delay needs something to call it first.

So again, the delay not working properly for regen is not really the Delay’s fault, but a matter of how reliably it is being called. In this case a Timer can be far better.
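The regen case maps naturally to a looping timer. A minimal sketch, assuming hypothetical members (StaminaRegenHandle, Stamina, MaxStamina) on the same hypothetical AMyCharacter:

```cpp
// Assumes AMyCharacter declares: FTimerHandle StaminaRegenHandle;
// float Stamina; float MaxStamina;

void AMyCharacter::StartStaminaRegen()
{
	// Looping timer: fires every 0.1 s of game time, independent of frame rate.
	GetWorldTimerManager().SetTimer(
		StaminaRegenHandle, this, &AMyCharacter::RegenStamina,
		0.1f, /*bInLoop=*/true);
}

void AMyCharacter::RegenStamina()
{
	// A fixed amount per fire means total regen tracks time, not performance.
	Stamina = FMath::Min(Stamina + 1.0f, MaxStamina);
}
```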

About multiplayer-related delays: for such things it’s the server that should take care of the delay, not the clients. So what you need to worry about is how good or bad the server’s own performance is. :slight_smile:
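In C++ terms, that usually means guarding the timer with an authority check. Another sketch with made-up names (CooldownHandle, StartAbilityCooldown, OnCooldownFinished):

```cpp
void AMyCharacter::StartAbilityCooldown()
{
	// Gameplay-critical timing runs only on the server; clients see
	// the replicated result instead of running their own timer.
	if (HasAuthority())
	{
		GetWorldTimerManager().SetTimer(
			CooldownHandle, this, &AMyCharacter::OnCooldownFinished,
			5.0f, /*bInLoop=*/false);
	}
}
```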

I just ran another check on the difference between Delays and Timers, and it seems that Timers are indeed more accurate at low fps.

An event looped through a 0.016 s Delay, expected to fire 4 times, with fps locked to 10 = fires 3 times, missing one.

An event triggered by a Timer looping every 0.016 s, expected to fire 4 times, with fps locked to 10 = fires 4 times, missing none.

I believe I will get rid of as many delays as I can, too.

PS: This is still not the Delay node’s fault. It’s just that the event is frame reliant while the Delay node is not. Yet the Timer works better because it loops on its own, free from outside factors.

Ah I see, yeah that makes a good bit of sense, and it’s what I was assuming too but just wasn’t sure about. I guess the moral of this story is:

Rely on Timers for loops?
Delays are reliable for non-loops?

Either way, though, thank you for the fast response and for doing a bit of your own research. A part of me knows what to do from experience, but being paranoid and wanting to make sure I understand what I’m doing as much as I can, I gotta ask these kinds of things even if they come off stupid haha.

As for the server, that’s what I was assuming also, ha. I just need to make sure that all those sections of my code are running on the server, I suppose. I’ve been trying to get the client to handle as much of it as it can and have the server run the important bits, but I’m not sure how that will work out, since it leaves too much up to the client machine.

Right, I don’t need it to be that exact, though. I’ll leave it to the people who designed and manage Blueprint to sort those quantum timings out. As long as it functions, and functions for multiplayer, I’m all for it. Also, rather chilling end to your statement there, with the smiley face and all.

Right, well, if my game were running at low frames, I think timers would be the least of my issues there. At highest so far it’s at around 90 fps; of course it will most likely end up around 60, depending on the map and other variables.

We might be speculating at this point if we are now talking about performance and other things, so I suppose I’ll stick with what I seem to know now and wait till I reach a valid fork in the road where I decide to change Delays into Timers and such. I brought it up now because I wanted to be preemptive about it, but the information I got here filled in the blanks I needed, so I’ll consider it answered.

What you need to take care of is where the delay is being triggered.

I noticed that Timers are only good enough when you have solid 60+ fps performance.

There is this one problem with Blueprints: despite claims that they are as fast as C++, they actually are not. A massive amount of Blueprint code can still cause the delay to be triggered slightly late. For example, a Blueprint event calling another Blueprint is actually not THAT instant; the overhead is almost impossible to see, but it’s still there.

Your main problem currently is not delays, but performance. There should be a way to optimize your game, because as your game grows, things will pile up and even Timers won’t save you. :slight_smile:
