How to get an accurate interval for Timers

I’m working on a gesture recognition algorithm for which I would like to record sensor values every 10 ms. However, when I implement my timer using the “Set Timer by Function Name” node and look at the results, the interval ranges from 1 to 30 ms even though it is set to 0.01 seconds. Is there a way to increase the accuracy of the timer interval?

Graph: (attached screenshot: 234510-blueprint1.png)

Code:

void APlayer_pawn::SetControllerFromCenterPosition_Implementation(FRotator rotation1, FVector position1, FRotator rotation2, FVector position2) {
	if (recordLocations) {

		// Time of this sample and time elapsed since the previous one, in milliseconds
		double millis = FDateTime::UtcNow().GetTimeOfDay().GetTotalMilliseconds();
		double deltaMillis = millis - lastMeasurement.timestamp;

		MeasurementValue val;
		val.timestamp = millis;
		val.x1 = position1.X;
		val.y1 = position1.Y;
		val.z1 = position1.Z;
		// Velocity = distance travelled since the last sample / elapsed time in seconds
		val.velocity1 = FVector(position1.X - lastMeasurement.x1, position1.Y - lastMeasurement.y1, position1.Z - lastMeasurement.z1).Size() / (deltaMillis / 1000.0);
		val.x2 = position2.X;
		val.y2 = position2.Y;
		val.z2 = position2.Z;
		val.velocity2 = FVector(position2.X - lastMeasurement.x2, position2.Y - lastMeasurement.y2, position2.Z - lastMeasurement.z2).Size() / (deltaMillis / 1000.0);

		lastMeasurement = val;
		controllerLocationsArray.Add(val);
	}
}

No. In UE4 the game thread operates on a frame-to-frame basis, and timers are only checked once per frame, so the best accuracy you can get is the frame render time.
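You can verify this yourself. The sketch below is illustrative only (AMyPawn, OnSampleTimer and the member variables are placeholders, not code from the question): it requests a 0.01 s looping timer with FTimerManager::SetTimer and logs the real elapsed time between calls using FPlatformTime::Seconds(). The logged interval will follow the frame time rather than the requested 10 ms.

// Sketch: measure how often a 0.01 s looping timer actually fires.
// Header (illustrative): FTimerHandle SampleTimerHandle; double LastSampleTime = 0.0;

void AMyPawn::BeginPlay()
{
	Super::BeginPlay();

	// Ask for a 10 ms looping timer; timers are only evaluated when the game thread ticks.
	GetWorld()->GetTimerManager().SetTimer(SampleTimerHandle, this, &AMyPawn::OnSampleTimer, 0.01f, true);
	LastSampleTime = FPlatformTime::Seconds();
}

void AMyPawn::OnSampleTimer()
{
	const double Now = FPlatformTime::Seconds();
	// Measured interval tracks the frame time (e.g. ~16.6 ms at 60 FPS) rather than 10 ms,
	// which is why the recorded intervals vary instead of being a steady 0.01 s.
	UE_LOG(LogTemp, Log, TEXT("Timer interval: %.1f ms"), (Now - LastSampleTime) * 1000.0);
	LastSampleTime = Now;
}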

The sense of time in programming is mostly an illusion. Because of differing CPU clocks, different compilation results, the out-of-order behavior of modern CPUs and, most importantly, the varying CPU time share each application gets, a program can no longer predict how much time has passed from its own instruction flow the way it could in the old days (NES, Game Boy era). It can only find out how much time has passed by looking at the clock, if it gets the chance, since the CPU may be executing something else during the window you care about. So it is close to impossible to run code at a very precise time interval.

Considering you are looking for a 10 ms interval, why not just use Tick to check the sensor state? Since you planned to run at 10 ms I assume there is not much leniency involved in that operation. You need that data on the frame that displays it anyway: at 60 FPS a frame takes 16.6 ms, so if you read the sensor every 10 ms you would regularly collect data that is already outdated before the frame is even completed. At 30 FPS you would read the sensor twice per frame for no reason.

So use Tick. Tick exists to update an object in preparation for rendering the frame, and you need to update the game state to match the state of the sensor, so Tick makes more sense in this case.

And just factor in the delta time to keep the measurements consistent between frames, as in the sketch below.
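A minimal sketch of that approach, reusing the names from the question (recordLocations, MeasurementValue, lastMeasurement, controllerLocationsArray); GetControllerPosition1/2 are placeholders for however the two positions are actually obtained. The velocity is divided by DeltaSeconds, so it stays correct no matter how long the frame took.

// Sketch only: sample the controllers once per frame and use the real frame time
// (DeltaSeconds) instead of assuming a fixed 10 ms interval.
void APlayer_pawn::Tick(float DeltaSeconds)
{
	Super::Tick(DeltaSeconds);

	if (!recordLocations || DeltaSeconds <= 0.0f)
	{
		return;
	}

	// Placeholders for however the controller positions are obtained.
	const FVector position1 = GetControllerPosition1();
	const FVector position2 = GetControllerPosition2();

	MeasurementValue val;
	val.timestamp = FDateTime::UtcNow().GetTimeOfDay().GetTotalMilliseconds();
	val.x1 = position1.X;
	val.y1 = position1.Y;
	val.z1 = position1.Z;
	// Distance moved this frame divided by the actual frame time -> velocity in units per second.
	val.velocity1 = FVector(position1.X - lastMeasurement.x1,
	                        position1.Y - lastMeasurement.y1,
	                        position1.Z - lastMeasurement.z1).Size() / DeltaSeconds;
	val.x2 = position2.X;
	val.y2 = position2.Y;
	val.z2 = position2.Z;
	val.velocity2 = FVector(position2.X - lastMeasurement.x2,
	                        position2.Y - lastMeasurement.y2,
	                        position2.Z - lastMeasurement.z2).Size() / DeltaSeconds;

	lastMeasurement = val;
	controllerLocationsArray.Add(val);
}

If you still want samples roughly every 10 ms rather than strictly every frame, you can accumulate DeltaSeconds and only record when the accumulator crosses 0.01 s, but the resolution will still be limited by the frame rate.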