I’ve managed to get a timeline to fire on a fixed interval.
The problem is that I can’t get the timer to fire at exactly the same time as the tick function.
My reasoning for wanting this is performance.
For example, I suppose I could just set the interval to 0.01 seconds, but then it fires between frame updates, which seems wasteful. At the same time, I don’t want it firing any less frequently than the tick event, as I wouldn’t want it to look juddery.
It’s highly doubtful that you will get the timer to fire on the tick (I would say it’s impossible, but eventually it would happen once in 100 billion times). The biggest reason is that the tick is not guaranteed to fire at any fixed interval that you could sync to. It fires once per frame, but neither you nor I know whether it fires at the beginning of the frame, the end of the frame, or somewhere in the middle. It is also affected by the number of objects that need to tick and by how long those objects spend processing their tick (assuming there is no multithreading across all the objects doing tick processing, and I really doubt there is). So I don’t think you’re ever going to find a fixed-rate “tick”.
In truth, as far as “jitter” is concerned, your timer would need to drop below firing 24 times a second before it became noticeable (a TV runs at roughly 25 to 30 FPS, and a movie in your local theater is 24 FPS or so). So I really wouldn’t worry about “jitter” for the average user unless the FPS falls below 16 or so, and if that’s occurring, it’s highly likely something else is majorly wrong.
If you wish to have something fire once per tick, just dump your timers, do your processing on the “tick thread”, and be done with it.