Media Framework 360 Video?
Is it possible to create a VR-friendly 360 video using UE4's Media Framework? Would we go about doing that by applying it to a spherical object? Could we use it for a Samsung Gear VR experience as well?
asked Mar 12 '15 at 06:19 PM in Using UE4
Yes, you could map a video onto a sphere. There is at least one other project I know of that is doing exactly that. You will likely have to apply some distortion to the video itself so it looks right when projected onto the sphere.
The only problem with this right now is that the video has to have a very high resolution - at least 2k x 2k, ideally 4k x 4k - otherwise it won't look good in the HMD. The way we currently upload video textures to the GPU is particularly slow for large videos, so you won't get a good framerate. We are in the process of fixing this and hopefully will have a solution in 4.8. In the meantime, you can use a low-res video for development and testing; it won't look great, but at least you're not blocked.
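As a rough illustration of the distortion involved: standard 360 video is stored in an equirectangular (latitude/longitude) layout, and mapping it onto a sphere amounts to converting each view direction into a coordinate in that frame. A minimal sketch (the function name and axis convention are illustrative, not UE4 API):

```python
import math

def direction_to_equirect_uv(x, y, z):
    """Map a unit view direction to (u, v) in an equirectangular
    360 video frame. u wraps horizontally (yaw), v runs from the
    top pole (0) to the bottom pole (1). z is treated as 'up'."""
    u = 0.5 + math.atan2(y, x) / (2.0 * math.pi)
    v = 0.5 - math.asin(z) / math.pi
    return u, v

# Looking straight ahead and level lands in the centre of the frame.
print(direction_to_equirect_uv(1.0, 0.0, 0.0))  # → (0.5, 0.5)
```

A sphere material with flipped normals effectively evaluates this mapping per pixel; the video itself has to be authored (or stitched) into this layout for it to look right.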
answered Mar 27 '15 at 03:46 PM
I found a solution thanks to help from Rich at Hammerhead VR. The workflow below worked for me; I started with 6 cameras and upped it to 18 for the best result in combination with Autopano Video:
A 120° FOV should work.
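As a sanity check on camera counts: for cameras spaced evenly around a single horizontal ring, the overlap between adjacent shots is simply the FOV minus the angular spacing. A small sketch (a real 18-camera rig would normally be spread over several rings to cover the poles, so this single-ring arithmetic is only illustrative):

```python
def ring_overlap_deg(num_cameras, fov_deg):
    """Angular overlap (degrees) between adjacent cameras in a
    single horizontal ring covering the full 360 degrees."""
    spacing = 360.0 / num_cameras
    return fov_deg - spacing

# 6 cameras at 120 deg FOV: 60 deg of overlap per seam.
print(ring_overlap_deg(6, 120))   # → 60.0
# 18 cameras at 120 deg FOV: 100 deg of overlap per seam.
print(ring_overlap_deg(18, 120))  # → 100.0
```

More overlap gives the stitcher more shared features to match, which is why bumping the camera count from 6 to 18 helped.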
In order to achieve the desired result we will need to create a template stitch, since our standard UE4 scene isn't detailed enough for the automatic stitching algorithms to work. I'm afraid I've lost the screenshots of this, but I'll try to explain it and find an example image on Google.
Essentially, what we want is a perfect stitch from UE4 using the exact same camera setup as our Matinee, and to splice that footage into the first few frames of each matching video.
The only way I've been able to do this so far is to create a cube (with inverted normals, or from BSP) and apply a unique, high-fidelity photograph to each face, then capture just a few frames of the Matinee. This gives the Autopano / video-stitch algorithms enough detail to recreate the image. Splice those frames into the beginning of each video and stitch from that frame; the remainder of the footage in your video will then be perfectly stitched without any extra work. The process is repeatable, too: once you have your template frames, you can reuse them over and over on new shots. You may want to label each face of the cube with a number to keep track of which is which, which way is up and down, orientation, etc. I had to create a cardboard net out of a cereal box, number the sides, and keep it on my desk to keep track of how my cube was arranged.
Please see attached for image examples of a cube room and a HDR image used as a texture in that room.
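The splicing step above can be scripted once the Matinee export lands as numbered image sequences. A minimal sketch, assuming one folder of template frames and one folder per camera, with zero-padded names like `frame_0000.jpg` (the names and pattern are assumptions; adjust them to your exporter):

```python
import shutil
from pathlib import Path

def splice_template_frames(template_dir, camera_dir, num_template_frames=5):
    """Prepend the template-stitch frames to one camera's frame
    sequence: shift the existing frames up by num_template_frames,
    then copy the template frames into the freed-up slots."""
    camera = Path(camera_dir)
    frames = sorted(camera.glob("frame_*.jpg"))
    # Rename highest index first so nothing gets clobbered.
    for i, f in enumerate(reversed(frames)):
        new_index = len(frames) - 1 - i + num_template_frames
        f.rename(camera / f"frame_{new_index:04d}.jpg")
    templates = sorted(Path(template_dir).glob("frame_*.jpg"))[:num_template_frames]
    for i, t in enumerate(templates):
        shutil.copy(t, camera / f"frame_{i:04d}.jpg")
```

Run it once per camera folder; the stitcher then sees the detailed cube-room frames at the start of every sequence and can lock the stitch from there.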
When working with 360 video in real life, people generally download a template stitch for their 360 Hero rig or Freedom360 GoPro rig to speed up processing or solve technical challenges. Since we don't have a standard rig, that won't work for us, but the benefit is that we can place an unlimited number of cameras on the exact same point, so there should be less ghosting. Of course, each extra video increases rendering time and workload, and this is already an incredibly labour-intensive process, so be careful.

I think this technique works best on static shots, since when the camera moves there is a higher chance of visible seams. You should also be aware of automatic exposure control on your UE4 cameras, since it will affect the continuity of your footage between the individual shots.

One problem you'll run into with 6 cameras is that you may not have enough overlap between shots to create a completely smooth stitch. The best way around this is to put a fish-eye warp on your lens; you could either create this as a post-process camera effect or reverse-engineer the Oculus integration.

When rendering in Matinee, for some reason the renderer takes a little while to spin up and produce high-quality footage. It would be difficult to render each video individually this way and keep them in sync, so instead sequence each camera in the director group and then split the JPEG frames into per-camera folders. Be aware that Matinee sometimes loses frames for no reason, which makes it trickier / slower to import them into Premiere as a frame sequence.
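Since Matinee can silently drop frames, it's worth scanning each exported folder for gaps in the numbering before importing it into Premiere as a frame sequence. A minimal sketch, again assuming zero-padded names like `frame_0042.jpg` (adjust the pattern to match your exporter):

```python
import re
from pathlib import Path

def find_missing_frames(frame_dir, pattern=r"frame_(\d+)\.jpg"):
    """Report gaps in a numbered frame sequence so a dropped
    Matinee frame is caught before editing."""
    indices = sorted(
        int(m.group(1))
        for p in Path(frame_dir).iterdir()
        if (m := re.fullmatch(pattern, p.name))
    )
    if not indices:
        return []
    have = set(indices)
    return [i for i in range(indices[0], indices[-1] + 1) if i not in have]
```

If it returns anything, re-render that camera (or duplicate the neighbouring frame) before stitching, or the cameras will drift out of sync.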
answered Jun 01 '15 at 08:22 PM
Video playback in Unreal is hit and miss, even in 4.11.1.
It seems fine in the dev environment (with low-res HD footage from a Ricoh Theta S) and looks OK with 4K demo footage, but as soon as it's compiled to an .exe it's hit and miss whether it works. Some setting seems to stop things working, even on a PC. On a Samsung S7 Gear VR it worked once, and I haven't had it working since.
Would love it all to work so we could start using this for lots of clients!
answered Apr 09 '16 at 07:54 AM