Matching 2 cameras for seamless 360

Hey all,

I have two cameras (a. the Player Camera, b. a SceneCapture2D camera). I want to match the SceneCapture camera to the player camera, but I keep getting odd distortions. The player will be surrounded by a spherical mesh that is meant to match 1:1 what the SceneCapture camera is seeing.

Things to note:

Both cameras are using the same FOV (90°)

Both cameras share the same location and rotation in their relative space

I don’t believe the sphere itself is the cause, since I’ve tried displaying the capture on multiple surfaces and they all show the same distortion around the edges.

Here’s the material I’m using. Because it samples with ScreenPosition, it’s meant to project what the SceneCapture camera sees straight down its forward vector onto the player camera’s view. Thus, when the user looks around the sphere, they should see exactly what the SceneCapture camera is seeing.

Here’s the BP for the cameras.

What the view looks like from the SceneCapture camera’s position.

What the interior of the sphere looks like with the projected texture.

Side note: this is eventually going to be part of a VR app. I just tried it in VR and got very strange results. I believe the problem is coming from the ScreenPosition node. Anyone have ideas on how to get around it?

It seems like ScreenPosition is treating each eye as though it’s just a different part of the screen. Maybe I need to get a per-eye ScreenPosition somehow?