Why is deltaTime different in BP and C++?

If I add this to my character controller's C++ Tick():

 Sleep(1000);

then C++ says the deltaTime is 400 ms while BP says it is 125 ms.
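For reference, the whole Tick looks roughly like this (the class name is just a placeholder, and FPlatformProcess::Sleep(1.0f) is the cross-platform equivalent of the Windows Sleep(1000) above):

    // Rough sketch of the repro (class name is a placeholder).
    // FPlatformProcess::Sleep takes seconds, so 1.0f ~ Sleep(1000) on Windows.
    void AMyCharacter::Tick(float DeltaTime)
    {
        Super::Tick(DeltaTime);

        // Stall the game thread for ~1 second to simulate a severe hitch.
        FPlatformProcess::Sleep(1.0f);

        // Print the delta time this C++ Tick actually received.
        if (GEngine)
        {
            GEngine->AddOnScreenDebugMessage(-1, 2.5f, FColor::Blue,
                FString::Printf(TEXT("C++ DeltaTime: %f"), DeltaTime));
        }
    }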

Max Physics Delta Time is set to 0.033333

So, if Physics, C++ Tick & BP Tick all use different values, there is a risk that things can get out of sync.

Is this undefined behavior, or can I read about this somewhere?

Generally you should trust delta time… as for why there is no documentation about it, all you can do is study the source code. This is the main code for ticking:

https://github.com/EpicGames/UnrealEngine/blob/c9f4efe690de8b3b72a11223865c623ca0ee7086/Engine/Source/Runtime/Engine/Private/LevelTick.cpp

If you want something to be 100% in sync, then make multiple objects rely on one object's data.
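For example, something like this (just a rough sketch; the class and property names are made up, and header boilerplate like the .generated.h include is omitted):

    // Hypothetical "single source of truth" actor: it computes one clamped
    // time step per frame, and every other object reads from it instead of
    // using its own DeltaTime.
    UCLASS()
    class AGameClock : public AActor
    {
        GENERATED_BODY()

    public:
        AGameClock()
        {
            PrimaryActorTick.bCanEverTick = true;
        }

        virtual void Tick(float DeltaTime) override
        {
            Super::Tick(DeltaTime);

            // Clamp once, in one place, so every consumer sees the same step.
            LastStep = FMath::Min(DeltaTime, 1.f / 30.f);
            Elapsed += LastStep;
        }

        // Other actors and Blueprints read these instead of their own delta.
        UPROPERTY(BlueprintReadOnly, Category = "Time")
        float LastStep = 0.f;

        UPROPERTY(BlueprintReadOnly, Category = "Time")
        float Elapsed = 0.f;
    };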

Also, considering you are freezing the code, everything will freeze during that 400 ms or 125 ms difference.

I’d prefer reading “official” documentation about this so I know what to expect and my assumptions won’t break during some future update :confused:

There is nothing more official here than the engine source code; it’s the truth itself. What are you actually doing that has you so obsessed over delta time?

I’m trying to make a game that works correctly on all computers at varying fps.

I’m new to using variable DeltaTime.

Before, I hardcoded my games for 60 fps and enabled vSync, and then they would just work deterministically.

Now I’m learning Unreal and don’t know the correct way to make games with it.

I want to make something like this
https://www.youtube.com/watch?v=ny7mLmslkCg

But I don’t know how to do that flawlessly.

Do I have animations drive the gameplay?

Or do I have to preprocess everything into physics calculations & events and sync with that?

Hey -

I compared DeltaTime in code by printing it every tick:

GEngine->AddOnScreenDebugMessage(-1, 2.5f, FColor::Blue, FString::SanitizeFloat(DeltaTime));

I also created a blueprint from the class and printed DeltaTime from the Event Tick node. In both instances it printed 0.008333 to the screen. How were you measuring the difference in DeltaTime when you were testing?

I do this in my C++ character controller tick:

34537-bug_cpp.png

And I do this in my blueprint debug HUD (widget blueprint):

34538-bug_bp.png

I did run it as Standalone from the editor.

The sleep is there to easily reproduce lag; it can also be triggered by alt+tabbing and dragging the window.

How are you actually comparing your DeltaTime in each case? From what is shown, neither of your cases returns the actual DeltaTime. Additionally, FMath::Max() is going to give you the larger of the two values, so I’m curious what the default value of maxTime is. In your code, if DeltaTime is ever larger than maxTime, it’s going to reset the value of maxTime (is this intentional?). In the blueprint you are taking a value of 1.0 and dividing it by a number between 0 and 1; this division is going to return a number larger than one that you’re then saving as your MyFps variable.
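For example, something along these lines would keep a running maximum without overwriting it and print the raw DeltaTime, so the C++ and Blueprint readings can be compared directly (the class name is just a placeholder):

    // Hypothetical example: track the largest DeltaTime seen so far and print
    // the raw per-frame value so both sides can be compared directly.
    void AMyCharacter::Tick(float DeltaTime)
    {
        Super::Tick(DeltaTime);

        // Running maximum; a static local keeps this sketch self-contained.
        static float MaxDeltaSeen = 0.f;
        MaxDeltaSeen = FMath::Max(MaxDeltaSeen, DeltaTime);

        const float Fps = (DeltaTime > 0.f) ? 1.f / DeltaTime : 0.f;

        if (GEngine)
        {
            GEngine->AddOnScreenDebugMessage(-1, 2.5f, FColor::Green,
                FString::Printf(TEXT("DeltaTime: %f  Fps: %.1f  MaxDelta: %f"),
                    DeltaTime, Fps, MaxDeltaSeen));
        }
    }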

My point is that Sleep(1000) causes the game to run at ~1 fps.

In the C++ Tick this is clamped to 1/2.5 s (the 400 ms I measured).

In Blueprint this is clamped to 1/8 s (the 125 ms I measured).

So my question is: why are they clamped differently?

Are these clamps arbitrary?

Or are they documented somewhere?

Should C++ code substep until the fps drops down to 2.5 fps, while BP should only substep until the fps drops down to 8 fps?

Is it documented somewhere what actually happens if the fps drops?

Dragging & moving the window with the mouse causes a temporary freeze in most programs, and I don’t want this to potentially cause things to go out of sync!

I want to know the best way to deal with this!

I guess the solution is to do as suggested:

Changing this in LevelTick.cpp

	// Clamp time between 2000 fps and 2.5 fps.
	DeltaSeconds = FMath::Clamp(DeltaSeconds,0.0005f,0.40f);    

To

	// Clamp time between 2000 fps and 30 fps.
	DeltaSeconds = FMath::Clamp(DeltaSeconds,0.0005f,1/30.0f);
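If compiling the engine is not an option, I could also clamp the value myself at the point of use (just a workaround sketch, not an engine setting; the class name is a placeholder):

    // Workaround sketch: clamp the step in gameplay code instead of patching
    // LevelTick.cpp. Anything driven from ClampedDelta never sees a step
    // longer than 1/30 s, no matter how the engine clamps DeltaTime itself.
    void AMyCharacter::Tick(float DeltaTime)
    {
        Super::Tick(DeltaTime);

        const float ClampedDelta = FMath::Min(DeltaTime, 1.f / 30.f);

        // ...drive movement and gameplay from ClampedDelta instead of DeltaTime...
    }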

Where can I request that this magic constant be exposed so we don’t have to compile the engine?

Hey -

I’m still a little confused about how you are doing your math. When comparing the value of DeltaTime from code against the value of DeltaSeconds from blueprints, I get the same result for each. As for the proposed solution in your answer, you can submit a pull request on GitHub (www.github.com) with the changes to be merged directly into the source code.

Please close this one!

It is more a complaint over unexpected behavior than a real bug.

How can you possibly get 0.008333 if you add Sleep(1000)?

Sleep(1000) causes it to run at ~1 fps, but you get 120 fps?

You know, after reading this thread, I came to the realisation that you’re off your rocker: you’re not answering any questions from anyone else, and you failed to acknowledge that you made a mistake in your calculations. It would seem you’re one of those types of people who never admit they’re wrong.