Media Framework 360 Video?

Hello,

Is it possible to create a VR-friendly 360 video using UE4’s Media Framework? Would we go about doing that by applying it to a spherical object? And could we use it for a Samsung Gear VR experience as well?

Yes, you could map a video onto a sphere. There is at least one other project I know of that is doing exactly that. You will likely have to apply some distortion to the video itself so it looks right when projected onto the sphere.
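For context, the distortion in question is typically an equirectangular projection: the video stores the full sphere as a rectangle, and the sphere’s UVs have to follow the same mapping. A minimal sketch of that mapping in UE4-style C++ (the function name is mine, not an engine API):

    #include "Math/UnrealMathUtility.h"

    // Maps a unit direction D (from the sphere's center out to its surface)
    // to the UV at which an equirectangular 360 video should be sampled.
    FVector2D DirectionToEquirectangularUV(const FVector& D)
    {
        const float U = 0.5f + FMath::Atan2(D.Y, D.X) / (2.0f * PI); // longitude
        const float V = 0.5f - FMath::Asin(D.Z) / PI;                // latitude
        return FVector2D(U, V);
    }

If the video looks mirrored when viewed from inside the sphere (a question that comes up later in this thread), negating the U term, or scaling the material’s texture coordinate U by -1, is the usual fix.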

The only problem with this right now is that the video has to have a very high resolution - at least 2K x 2K, ideally 4K x 4K - otherwise it won’t look good in the HMD. The way we currently upload video textures to the GPU is particularly slow for large videos, so you won’t get a good frame rate. We are in the process of fixing this and hopefully will have a solution in 4.8. In the meantime, you can use a low-res video for development and testing; it won’t look great, but at least you’re not blocked.

When I convert a 360 video into a material and put it onto a sphere, the video appears reversed (mirrored). How can I solve that problem?

Is this still an improvement to look for in 4.8?

Nope, sorry, I was out of office for three weeks and did not have time to work on this. Pushed to 4.9 now.

Fengkan, do you have a Blueprint or a description of how you did this?
Would be most kind :slight_smile:

I found a solution thanks to help from Rich at Hammerhead VR. The setup below worked for me; I started with 6 cameras and upped it to 18 for the best result, in combination with Autopano Video:

A 120 FOV should work.
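As a rough illustration (my sketch, not from the original post), spawning such a rig in C++ might look like the following: a single ring of scene captures at one point, evenly spaced in yaw. A real rig would also need up- and down-facing cameras for full coverage.

    #include "Engine/World.h"
    #include "Engine/SceneCapture2D.h"
    #include "Components/SceneCaptureComponent2D.h"

    // Spawns NumCameras scene-capture actors at the same location, rotated
    // evenly around the vertical axis, each with a 120 degree FOV.
    void SpawnCaptureRing(UWorld* World, const FVector& RigLocation, int32 NumCameras)
    {
        for (int32 Index = 0; Index < NumCameras; ++Index)
        {
            const float Yaw = 360.0f * Index / NumCameras;
            ASceneCapture2D* Capture = World->SpawnActor<ASceneCapture2D>(
                RigLocation, FRotator(0.0f, Yaw, 0.0f));
            if (Capture)
            {
                // 120 FOV, as suggested above, gives overlap between shots.
                Capture->GetCaptureComponent2D()->FOVAngle = 120.0f;
            }
        }
    }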

In order to achieve the desired result we will need to create a template stitch, since our standard UE4 scene isn’t detailed enough for the automatic algorithms to work. I’m afraid I’ve lost the screenshots of this, but I’ll try to explain it and find an example image on Google.

Essentially what we want is a perfect stitch from UE4, using the exact same camera setup used in our Matinee, and to splice that footage into the first few frames of each matching video.

The only way I’ve been able to do this thus far is to create a cube with inverted normals (or BSP) and apply a unique, high-fidelity photograph to each face, then capture the matinees (just a few frames). This will mean that the Autopano / video stitch algorithms will be able to recreate the image. Splice those frames into the beginning of each video and stitch from that frame; this means the remainder of the footage in your video will be perfectly stitched without any extra work. The process is repeatable as well: once you have your template frames you can use them over and over again on new shots.

You may like to label each face of the cube with a number in order to keep track of which is which, which way is up and down, orientation etc. I had to create a cardboard net out of a cereal box, number the sides and keep it on my desk in order to keep track of how my cube was arranged.

Please see attached for image examples of a cube room and a HDR image used as a texture in that room.

When working with 360 video in real life, people generally download a template stitch for their 360 Hero rig or Freedom360 GoPro rig in order to speed up processing or solve technical challenges. Since we don’t have a standard rig, that won’t work for us, but the benefit is that we can place an infinite number of cameras on the exact same pixel, so there should be less ghosting; of course, each extra video will increase rendering time and workload. This is already an incredibly labour-intensive process, so be careful. I think this technique will work best on static shots, since when the camera moves there is a higher chance of seams.

You should also be aware of automatic exposure control on your UE4 cameras, since this will affect the continuity of your footage between the individual shots (see the sketch below for one way to lock it).

One problem you’ll run into is that with 6 cameras you may not have enough shots with overlap to create a completely smooth stitch. The best way to get around this is to put a fish-eye warp on your lens. You could either create this as a post-process on-camera effect or reverse engineer the Oculus integration.

When rendering in Matinee, for some reason the renderer takes a little while to spin up and produce high-quality footage. It will be difficult to render each video individually this way and make them sync up, so instead we should sequence each camera in the director group and then split the JPEG frames apart into folders. Be aware that Matinee sometimes loses frames for no reason, so it’s more tricky / slower for us to import them into Premiere as a frame sequence.
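On the auto-exposure point above, here is a hedged sketch of how one might lock exposure on each scene capture in C++ so every camera records with the same fixed brightness (the FPostProcessSettings field names are real; the values are illustrative):

    #include "Components/SceneCaptureComponent2D.h"

    // Lock auto-exposure to a fixed range so footage from all cameras matches.
    void LockExposure(USceneCaptureComponent2D* Capture)
    {
        FPostProcessSettings& Settings = Capture->PostProcessSettings;
        Settings.bOverride_AutoExposureMinBrightness = true;
        Settings.bOverride_AutoExposureMaxBrightness = true;
        // Equal min and max brightness disables the eye-adaptation swing.
        Settings.AutoExposureMinBrightness = 1.0f;
        Settings.AutoExposureMaxBrightness = 1.0f;
    }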

Great to hear this! Could you please link the commit here for reference?
Would this also help with Gear VR?

We still haven’t had time to work on this. I’m currently completely backed up with other high priority work, and I will be traveling all of next month. Unless someone else can look into it, this won’t happen before September, I’m afraid :confused:

@julianque, how are you getting around the current resolution limitation, or are you? We tried to import some 4K footage into UE4 and it simply wasn’t possible to play it.

So this is not going to get into 4.9??? This is the feature I was looking forward to most!! Is there a way to capture your scene out to a 360 video? Currently I use Allar’s method: https://www.youtube.com/watch?v=3m85QBjyFGE

But I cannot get the frame rate of my movie textures to match the frame rate of the game, even when I specifically set the game’s frame rate in the project settings to match the frame rate of the video. Anyone know why??

Definitely interested in this. Ultimately hoping for 60fps 4K playback on GearVR …

Use this:

Make a folder in Content called Movies and then import your movie file there.
Right-click the movie to make a media texture, then right-click that texture to make a material.
Put the material on a static mesh.
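For anyone who needs the same wiring at runtime rather than in the editor, a rough C++ sketch follows. The Media Framework API moved around between 4.x releases, so treat the exact calls as illustrative; the material is assumed to expose a texture parameter named “MediaTex” fed by your media texture asset.

    #include "MediaPlayer.h"
    #include "Components/StaticMeshComponent.h"
    #include "Materials/MaterialInstanceDynamic.h"

    // Points the mesh's material at the media texture and starts playback.
    void PlayMovieOnMesh(UStaticMeshComponent* Mesh, UMediaPlayer* Player,
                         UTexture* MediaTexture)
    {
        if (UMaterialInstanceDynamic* MID = Mesh->CreateAndSetMaterialInstanceDynamic(0))
        {
            MID->SetTextureParameterValue(TEXT("MediaTex"), MediaTexture);
        }
        // Example path; in practice this is the movie imported under Content/Movies.
        Player->OpenUrl(TEXT("file://../../Content/Movies/MyMovie.mp4"));
    }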

Video playback in Unreal as a whole is “hit and miss”, even in 4.11.1.

It seems fine in the dev environment… (with low-res HD from a Ricoh Theta S) …and looks OK with 4K (demo footage),
but as soon as it’s compiled to an exe it’s hit and miss whether it works… some setting seems to stop things working, even on a PC… on a Samsung S7 Gear VR it worked once… and I haven’t had it working since…

Would love it all to work so we could start using this for lots of clients !!!

I do not fully understand what makes a video playable or not with Media Framework just yet.

I asked about what works for Gear VR and have had no answers.

I tried downloading sample videos that other users said they got to play with Media Framework but I have had no such luck. I am so confused. Please help.

The Android-based player plug-in has pretty good support. I’d recommend H.264-encoded .mp4 files. You won’t be able to play this same file inside the Editor though, because the Windows-based player plug-in doesn’t support H.264 yet (but it will work when you run on device).

The only format that works on both Android and Windows is MPEG-4 encoded .mp4 files, but the quality will be much worse.
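For reference (my addition, not from the original reply), these ffmpeg invocations produce the two kinds of files described above; the input name and quality settings are just examples:

    # H.264 in an .mp4 container: plays on the Android player plug-in.
    ffmpeg -i input.mov -c:v libx264 -pix_fmt yuv420p output_h264.mp4

    # MPEG-4 Part 2 in an .mp4 container: plays on both Android and Windows,
    # at noticeably lower quality for the same bitrate.
    ffmpeg -i input.mov -c:v mpeg4 -q:v 5 output_mpeg4.mp4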

In general, 360 video is not supported very well right now, because of performance limitations. I am currently refactoring the Media Framework to address these and other issues.

I have had limited success …but only really managed to get it to work on a PC…

I too want to target Gear VR… but it fails miserably every time when compiled for Android… weird screen results…

To actually get it to work on a PC I have had to go back to 4.10, as 4.11 no longer seems to work… something’s broken…

Have you tried it without GearVR to see if the movie plays on device? We have seen an odd interaction between the GearVR API rendering and the native movie player if the surface isn’t in view.

Yes, the movies play fine on the phone, and they aren’t particularly high quality (working up to that!)… so it’s not a “bandwidth” issue… all very frustrating… especially as your platform is fantastic; just the part that I need doesn’t work…

It looks to be maybe a flushing or state issue; I’ll check into it more and see if I can get a proper fix, but I have a workaround you can try.

Create a UMG widget and add a tiny border in the upper left corner. Set the brush color alpha to 0.01 so it isn’t visible. Create and add it to the viewport in your level blueprint. This will make sure it renders something last. This cleared up the problem in our testing.
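If it helps, here is a rough C++ equivalent of that workaround (my sketch; it assumes a widget Blueprint asset, here called WBP_RenderKick, already set up with the near-invisible border described above):

    #include "Blueprint/UserWidget.h"
    #include "Engine/World.h"

    // Creates the near-invisible widget and adds it to the viewport so the
    // UI pass always renders something last each frame.
    void AddRenderKickWidget(UWorld* World)
    {
        // Asset path is an example; point it at your own widget Blueprint.
        UClass* WidgetClass = LoadClass<UUserWidget>(nullptr,
            TEXT("/Game/UI/WBP_RenderKick.WBP_RenderKick_C"));
        if (WidgetClass)
        {
            if (UUserWidget* Widget = CreateWidget<UUserWidget>(World, WidgetClass))
            {
                Widget->AddToViewport();
            }
        }
    }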