We are working on an interactive VR film project that plays back filmed material in different screen formats (360°, flat 16:9, etc.). One approach we are considering for implementing smooth, gapless transitions between screen formats would require the following:
When a video is playing in a Media Player, is it possible to determine the exact timecode/index of the video frame that will go on screen after the current tick?
The main requirement is that we need to modify the scene (switch the playback surfaces) precisely on a given frame, robustly. A C++ solution is fine (and seems inevitable).
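To make the requirement concrete, here is a minimal sketch of the mapping we would need from the engine: given a playback time and a constant frame rate, compute the index of the frame on screen. The function name and the epsilon are our own assumptions for illustration, not an engine API; the real problem is obtaining a playback time that is accurate for the *next* presented frame.

```cpp
#include <cmath>
#include <cstdint>

// Hypothetical helper (not an Unreal API): map a playback time in
// seconds to a zero-based frame index for a constant-frame-rate video.
// Frame-accurate switching would need the engine to report a time for
// which this mapping matches the frame actually going on screen.
int64_t FrameIndexForTime(double TimeSeconds, double FrameRate)
{
    // Small epsilon so times sitting exactly on a frame boundary
    // (subject to floating-point error) do not fall on the previous frame.
    return static_cast<int64_t>(std::floor(TimeSeconds * FrameRate + 1e-6));
}
```

For example, at 60 fps a reported time of 0.5 s maps to frame 30; an error of one tick (~16.7 ms) shifts the result by a whole frame, which is exactly the failure mode we need to avoid.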
Our current workaround plays the film through two separate Media Players when near a format switch, which creates a video-decoding performance bottleneck that we are trying to eliminate.
Platform: Windows
Video file(s): H.264 or H.265, 4K, stereo; we would love to hit 60 fps
Planned minimum system specs: VR Ready PC
Some possible approaches:
- UMediaPlayer::GetTime() (BP: Media Player > Get Time): The value returned by this function seems to lag behind the actual playback position by several ticks, so it does not seem usable.
- Embed frame markers in the video file, then read them from the media texture in UE: Based on some AnswerHub answers, it appears possible to access the content of video frames via the video texture object, so reading markers during playback should be feasible. However, I suspect this approach might not give reliable access to the exact frame that is about to go on screen (for example, if the decoder runs in its own thread, or writes directly to GPU memory and the game thread only gets a copy later). Are these suspicions correct, or does this sound like a viable approach?
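For the marker idea, a minimal sketch of the decoding side, assuming each frame carries its index baked into a corner as a row of black/white squares (most significant bit first). Everything here is an assumption for illustration: the function name, the 0-255 luminance samples, and the threshold of 128; how those samples are read back from the media texture (and whether they belong to the frame currently on screen) is exactly the open question.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical marker decoder (not an Unreal API): given the sampled
// luminance of each marker square (0-255, one sample per square,
// MSB first), recover the frame index encoded in the video frame.
uint32_t DecodeFrameMarker(const std::vector<uint8_t>& SquareLuminance)
{
    uint32_t Index = 0;
    for (uint8_t Luma : SquareLuminance)
    {
        // Treat bright squares as 1, dark squares as 0.
        Index = (Index << 1) | (Luma >= 128 ? 1u : 0u);
    }
    return Index;
}
```

The decoding itself is trivial and robust to compression noise if the squares are large enough; the hard part remains guaranteeing that the pixels sampled on the game thread come from the frame that will actually be presented next.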
Predictively compensating for the lag is not an option: the prediction could easily be off by a tick or two, which would produce visible artifacts in our case, especially in VR.
I would be grateful for any thoughts, or for ideas about alternative approaches.