Time and framerate independence

Hey guys,

I’m new to the engine and my knowledge is probably inaccurate, so I’m not completely sure what the question is in the first place. I’ll give you a brief description of what the problem is:

I’m trying to create missile simulations, and precision in timing is rather important (or my stuff misses by a few hundred meters). I want the simulation to be frame-rate independent, but the available information on this seems to be a bit conflicting.

It’s being said that “Apply force” is frame-rate invariant, and with the following setup, it does indeed perfectly counteract gravity, regardless of frame rate.
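
In C++ terms, the setup I mean is roughly this (AMissile and its physics-simulating Body component are placeholder names; the Blueprint presumably does the equivalent with an Add Force node on Event Tick):

    // Counteract gravity by applying m * g upward every tick.
    // "Body" is assumed to be a UPrimitiveComponent* simulating physics.
    void AMissile::Tick(float DeltaTime)
    {
        Super::Tick(DeltaTime);

        // GetGravityZ() is negative (about -980 cm/s^2 by default),
        // so negating it gives the upward acceleration we need.
        const float Gravity = -GetWorld()->GetGravityZ();
        Body->AddForce(FVector(0.f, 0.f, Body->GetMass() * Gravity));
    }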

However, if I want to apply a force over time, I’d assume I use a timer:
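
Sketched in C++, what I mean is something like this: a looping timer applying the force at a fixed rate instead of on every tick (same placeholder names as above, plus an assumed FTimerHandle member).

    // Apply the anti-gravity force from a looping timer instead of Tick.
    // Assumed member: FTimerHandle ForceTimerHandle;
    void AMissile::BeginPlay()
    {
        Super::BeginPlay();
        GetWorldTimerManager().SetTimer(
            ForceTimerHandle, this, &AMissile::ApplyForce,
            /*InRate=*/0.1f, /*bLoop=*/true);
    }

    void AMissile::ApplyForce()
    {
        const float Gravity = -GetWorld()->GetGravityZ();
        Body->AddForce(FVector(0.f, 0.f, Body->GetMass() * Gravity));
    }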

Assuming that ‘time’ itself is frame-rate invariant, I’d think the latter case should produce the same result. However, the amount of force applied actually changes depending on the timer frequency, and setting a -G acceleration on an object with 1 kg mass does not make the object hover.

Similarly, if I use a timer to fire an otherwise ‘good’ (supposedly tick invariant?) thruster for a brief time, like this:
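
Roughly, reconstructed in C++ with the numbers quoted below (again, all names are placeholders):

    // The thruster pushes every tick (the supposedly tick-invariant part);
    // a one-shot timer shuts it off after Duration seconds.
    // Assumed members: bool bThrusterOn; FTimerHandle StopTimerHandle;
    void AMissile::FireThruster(float Duration /* = 1.f */)
    {
        bThrusterOn = true;
        GetWorldTimerManager().SetTimer(
            StopTimerHandle, this, &AMissile::StopThruster,
            Duration, /*bLoop=*/false);
    }

    void AMissile::StopThruster()
    {
        bThrusterOn = false;
    }

    void AMissile::Tick(float DeltaTime)
    {
        Super::Tick(DeltaTime);
        if (bThrusterOn)
        {
            Body->AddForce(FVector(0.f, 0.f, 8000.f)); // force = 8000
        }
    }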

I get a massive variance in the total force applied. (In the above example, with force = 8000 and time = 1, the 1 kg object without linear damping reaches a maximum altitude anywhere between 1500 and 2400 meters.)

This makes me think that either

  1. apply force is actually not framerate-invariant, but it’s broken in the same way as gravity, so when used together they appear to work correctly.
  2. or time itself is not framerate-invariant?

The second case seems like the most logical explanation I can think of, but that would defeat the point of using timers in the first place. If a unit of elapsed time is proportional to the number of ticks calculated instead of the actual time elapsed, I fail to see how in-game time is supposed to be used (are timers in practice just tick counters that fire along with the ticks?).

I might be missing something here, so first of all: what could the actual problem be? And given that, what do you think is a suitable workaround to make sure the amount of force applied is always constant and the time period is close to constant?

Force is frame rate invariant; the timer is not. Multiply your (current seconds * delta time) to get a frame-rate independent time value, and pump that into your timer. Also, bear in mind that even if your force & timer are frame rate independent, your actual movement updates may not be. Anywhere you are going to use a time variable, you need to multiply it by delta time to get actual framerate independence.
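
For instance (a generic sketch; the names are placeholders): moving an actor a fixed distance per tick depends on the framerate, while scaling that step by delta time gives a fixed distance per second.

    // Framerate-dependent: moves Speed units *per tick*, so a faster
    // framerate covers more distance per second:
    //     AddActorWorldOffset(FVector(Speed, 0.f, 0.f));

    // Framerate-independent: scale the per-tick step by that tick's
    // duration, so the actor moves Speed units *per second*.
    void AMissile::Tick(float DeltaTime)
    {
        Super::Tick(DeltaTime);
        const float Speed = 500.f; // cm/s, arbitrary example value
        AddActorWorldOffset(FVector(Speed * DeltaTime, 0.f, 0.f));
    }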

Could you please provide an example of the timer part of this being implemented? I’m not sure how I’m supposed to set up the timer, as delta time seems to be a constantly changing value. The closest thing I found is “Get World Delta Seconds”, but that seems to return zero if it’s not requested as part of an “Event Tick” event.

There is a better explanation of Time*deltaTime there (including images). Also, let me restate something: the timer, to my knowledge, is based on the tick rate, so anything that affects your tick rate would affect your timer. As a possible solution, if the timer is not working right, you could also:

  1. get the seconds node
  2. get delta seconds
  3. time = seconds * delta seconds (returns a framerate-independent time)

There is a lot you could do with that, depending on the level of precision you need. You could pipe the result into a modulo node, with the modulus being the value you had set on your timer. Floor the result and check each tick to see if the result = 0; if so, fire your event. Not as straightforward as timers, but it is framerate independent (mostly).
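
In C++ the same tick-checking idea is usually written as a running accumulator instead of a modulo/floor check; same idea, just fewer floating-point edge cases (sketch; Interval and FireEvent are placeholders):

    // Fire an event every Interval seconds, driven from Tick.
    // Carrying the remainder (Accumulated -= Interval) keeps the long-run
    // rate exact even though each firing lands on a tick boundary.
    // Assumed members: float Accumulated = 0.f; float Interval = 1.f;
    void AMissile::Tick(float DeltaTime)
    {
        Super::Tick(DeltaTime);

        Accumulated += DeltaTime;
        while (Accumulated >= Interval)
        {
            Accumulated -= Interval;
            FireEvent(); // placeholder for whatever the timer was calling
        }
    }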

Also something to look out for:

Thank you very much for your answers; it seems like I’ve got to a point where I understand what my problem is. :)

If timers are in fact tick rate dependent, that would explain why the method in the 2nd blueprint doesn’t work. Based on my internet searches I had initially reached the same conclusion as you, but it still seems to conflict with what I actually see in-game. Since this was still bothering me, I decided to run some quick tests with timers:
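
The test was essentially this kind of thing: a 1 s looping timer logging the actual elapsed world time against the expected multiple, so any drift shows up as a growing error (sketch; members assumed in the header):

    // Assumed members: FTimerHandle TestTimerHandle;
    // float StartTime = 0.f; int32 FireCount = 0;
    void AMissile::BeginPlay()
    {
        Super::BeginPlay();
        StartTime = GetWorld()->GetTimeSeconds();
        GetWorldTimerManager().SetTimer(
            TestTimerHandle, this, &AMissile::LogTimerError, 1.0f, true);
    }

    void AMissile::LogTimerError()
    {
        ++FireCount;
        const float Elapsed = GetWorld()->GetTimeSeconds() - StartTime;
        const float Error = Elapsed - FireCount * 1.0f; // expected N seconds
        UE_LOG(LogTemp, Log, TEXT("fire %d  elapsed %.4f  error %.4f"),
               FireCount, Elapsed, Error);
    }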

In the short term, timer times and delta time seem to stay consistent.

(If UE4 runs for a longer period, the error seems to increase (last column), implying that what you are saying is in fact true and I’m just being a stubborn idiot right now. On the other hand, I noticed that if I run more time-consuming calculations inside the timer, it tends to lag behind, which makes me think the “repeating” part of the timer fires after the bound function has returned, not immediately after the timer has expired. This would explain why the error increases over time while the timers themselves are actually still tick rate independent.)

If the timers are tick rate dependent, the only way to achieve a tick rate independent simulation would be to constantly reference system time, which makes little sense to me. On the other hand, if we assume that timers are not tick rate dependent, I’m clueless as to why the 2nd implementation works the way it does.
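
(By “constantly referencing system time” I mean something like this clunky sketch, using FPlatformTime::Seconds() to measure real wall-clock time between firings. Note it ignores pause and time dilation, which game time respects.)

    // Scale each firing by the real wall-clock time elapsed since the
    // last one, instead of trusting the timer's nominal rate.
    // Assumed member: double LastFireTime;
    void AMissile::ApplyThrust()
    {
        const double Now = FPlatformTime::Seconds();
        const double RealDt = Now - LastFireTime;
        LastFireTime = Now;
        // An impulse of Force * dt delivers the intended momentum even
        // when the timer fires late or early.
        Body->AddImpulse(FVector(0.f, 0.f, 8000.f * float(RealDt)));
    }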

This would probably be easier if I knew exactly how these things are implemented in the C++ code, but navigating something as big as the engine is beyond me; I have no idea where to even start looking. We’ve deviated quite a bit from the original question. Should I maybe re-phrase it to reflect what I’m actually interested in?

Although I don’t have a complete solution to the problem, further investigation revealed that the question at hand is more complex than I originally thought, and should probably be investigated in smaller parts. Therefore the original question is obsolete, and I’m marking it as resolved.