Timer bug using Get Game Time

Hey guys!
I’m making a simple timer for my game. I want it to display like 0.1… 0.2… 0.3… So I take Get Game Time, multiply it by 10, round it to an int (I tried Truncate, Round, and other approaches, all with the same result), and then divide it by 10 again.
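In case my description isn’t clear, here’s the same logic written out as a standalone C++ snippet. It’s not my actual Blueprint and the sample times are made up, but the math is the same:

```cpp
#include <cstdio>
#include <cmath>

int main()
{
    // Plain C++ version of the Blueprint logic: take the game time,
    // multiply by 10, round to an int, then divide by 10 again for display.
    const float sampleTimes[] = { 15.93f, 16.02f, 16.17f }; // pretend Get Game Time returned these
    for (float gameTime : sampleTimes)
    {
        int   tenths  = static_cast<int>(std::lround(gameTime * 10.0f));
        float display = static_cast<float>(tenths) / 10.0f; // this is the value I print on screen
        std::printf("game time = %f -> tenths = %d -> display = %f\n", gameTime, tenths, display);
    }
    return 0;
}
```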

It works, but at exactly 16 seconds I start getting weird values like this:

And what’s interesting: it works fine without the divide by 10, but that’s not exactly what I need. So maybe the problem is in the cast from int to float?
It looks like some kind of variable type overflow, or maybe even an engine bug. Hope you guys can help!
Cheers.