

Screenspace Portals in VR

Hi Everyone, I am trying to create a portal system for use in VR, but have come across issues with the way that it renders.

Currently I have a SceneCapture2D component that renders a view to a 1920x1080 render texture. Using screen-space UVs, this is then applied in a material to a plane in the world. This method works brilliantly for games played on a monitor; when played in VR, however, there is an awful amount of skew on the texture.

The problem is not the texture itself: when it is applied to a BSP without screen-space UVs, it works totally fine, with no distortion.

My colleagues and I think the problem is with the way the screen-space UVs function grabs the screen position. We have tried changing this function so that we calculate the screen position manually, but we still end up with the same results in VR.

We have tried to find out where we can get the camera's view matrix or projection matrix, but this seems near impossible from Blueprints and the material shader. We have also tried using the World To Clip function.

Does anyone have an appropriate solution to stop the skewing of the render target texture?
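For context on where the skew comes from: the engine's screen-space UV path assumes the view-projection matrix of the main (mono) view, which no longer matches either eye's view in VR. The core operation the material ultimately has to perform is small enough to sketch in plain C++. The types below are minimal stand-ins rather than UE's FMatrix/FVector, and the V-flip convention is an assumption, not something stated in the thread:

```cpp
#include <array>
#include <cmath>

// Minimal stand-ins for UE's FMatrix / FVector4.
struct Vec4 { float x, y, z, w; };
using Mat4 = std::array<std::array<float, 4>, 4>; // row-major, M[row][col]

// Row-vector convention, as UE uses: v' = v * M.
Vec4 Mul(const Vec4& v, const Mat4& m) {
    return {
        v.x * m[0][0] + v.y * m[1][0] + v.z * m[2][0] + v.w * m[3][0],
        v.x * m[0][1] + v.y * m[1][1] + v.z * m[2][1] + v.w * m[3][1],
        v.x * m[0][2] + v.y * m[1][2] + v.z * m[2][2] + v.w * m[3][2],
        v.x * m[0][3] + v.y * m[1][3] + v.z * m[2][3] + v.w * m[3][3],
    };
}

// Project a world-space point through a view-projection matrix, then map
// from clip space to [0,1] UV space. The divide by w is the perspective
// divide the material has to do per-vertex; the V flip is assumed here.
bool WorldToScreenUV(const Vec4& worldPos, const Mat4& viewProj,
                     float& outU, float& outV) {
    Vec4 clip = Mul(worldPos, viewProj);
    if (clip.w <= 0.0f) return false; // behind the camera plane
    outU = 0.5f + 0.5f * (clip.x / clip.w);
    outV = 0.5f - 0.5f * (clip.y / clip.w);
    return true;
}
```

If you feed this the matrix of the *actual* rendering eye instead of the mono view, the sampled texture lines up; that is essentially what the answer below ends up doing inside the material.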



Product Version: UE 4.7
portaltest01.jpg (54.7 kB)
portaltest02.jpg (48.5 kB)
portaltest03.jpg (50.4 kB)

asked May 28 '15 at 12:14 PM in Rendering


BenCone Aug 12 '16 at 05:04 PM

Could you show the end results of this? I just want to make sure this is worth the time and effort to implement. I've been looking and have found multiple ways to do portals, and none of them have worked; they all suffer from the exact issue you had in the OP.

BenCone Aug 15 '16 at 09:20 PM

I actually tried to go through this step by step and I'm hitting a lot of bumps along the way. If you could help me out, that would be awesome. I'm a little new to C++ in Unreal, so I don't know how to properly implement these functions and make them work. Also, is there a Capture Component and Render Target in the scene already? It doesn't seem as though the code creates one, but I'm also not sure where it would pick one up. Questions like these are preventing me from getting further with this. Any insight would be great, thanks.


1 answer

My colleague and I managed to fix this (mostly). Hold on to your hats!

We had to create our own Blueprint class in C++ (people who use BP, do not worry — you can find code to copy and paste below).

In the header, we created a UPROPERTY for a SceneCaptureComponent2D that could be fed into a function. We also created a Calculate FOV function and an Update Portal View Projection Matrix Parameters function.

The Calculate FOV function takes a player controller as input. It checks whether an HMD is connected and, if so, grabs the FOV from the VR camera, since the FOV changes between VR and PC.

The Update Portal View Projection Matrix Parameters function takes a dynamic material instance, a player camera transform, and a player controller. First of all we get the SceneCaptureComponent2D's capture size (x, y). We then grab the view matrix and view location from the player's camera transform, and swap the axes to match Unreal's coordinate space so that Z is up. Lastly we grab the SceneCaptureComponent2D's FOV.
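That axis swap is a pure permutation, so it can be sanity-checked standalone. Under UE's row-vector convention (v' = v * M), the matrix written out in the code below maps a point (x, y, z) to (y, z, x). Plain C++ stand-ins are used here instead of UE's FMatrix/FPlane:

```cpp
#include <array>

using Vec3 = std::array<float, 3>;
using Mat4 = std::array<std::array<float, 4>, 4>; // row-major, M[row][col]

// The permutation matrix from the answer, one row per FPlane argument.
const Mat4 kAxisSwap = {{
    {{0, 0, 1, 0}},
    {{1, 0, 0, 0}},
    {{0, 1, 0, 0}},
    {{0, 0, 0, 1}},
}};

// Row-vector convention (v' = v * M), matching UE's FMatrix multiply.
Vec3 SwapAxes(const Vec3& v) {
    Vec3 out{};
    for (int j = 0; j < 3; ++j)
        out[j] = v[0] * kAxisSwap[0][j]
               + v[1] * kAxisSwap[1][j]
               + v[2] * kAxisSwap[2][j];
    return out;
}
```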


If the viewport is wider than it is tall, the XAxisMultiplier is 1 while the YAxisMultiplier is viewport.x / viewport.y. Otherwise, the XAxisMultiplier is viewport.y / viewport.x and the YAxisMultiplier is 1.

We then create a projection matrix, feeding the FOV value, the axis multipliers, and the clipping-plane values (we used 10 and 1000) into the constructor.

Then a view projection matrix is created by multiplying the view matrix and the projection matrix.

We then break the VP matrix up into its column components (the X, Y, Z and W axes) and feed them into our dynamic material instance.


 // Fill out your copyright notice in the Description page of Project Settings.
 #pragma once

 #include "GameFramework/Actor.h"
 #include "Kismet/GameplayStatics.h"
 #include "Camera/PlayerCameraManager.h"
 #include "Engine/SceneCapture2D.h"
 #include "Classes/Components/SceneCaptureComponent2D.h"
 #include "MyTestPortalActor.generated.h"

 UCLASS()
 class AMyTestPortalActor : public AActor
 {
     GENERATED_BODY()

 public:
     UPROPERTY(VisibleAnywhere, BlueprintReadWrite, Category = "RDSceneCapComp")
     USceneCaptureComponent2D* RDSceneCapComp2D;

     // Sets default values for this actor's properties
     AMyTestPortalActor();

     // Called when the game starts or when spawned
     virtual void BeginPlay() override;

     // Called every frame
     virtual void Tick( float DeltaSeconds ) override;

     UFUNCTION(BlueprintCallable, Category = "RDMatrix")
     float CalculateFOV(APlayerController* playerController);

     UFUNCTION(BlueprintCallable, Category = "RDMatrix")
     void UpdatePortalVPMParameters(UMaterialInstanceDynamic* material, FTransform PlayerCameraXForm, APlayerController* playerController);
 };

 #include "MyTestPortalActor.h"
 #include "Kismet/GameplayStatics.h"
 #include "Classes/Components/SceneCaptureComponent2D.h"
 #include "Classes/Engine/TextureRenderTarget2D.h"
 #include "Classes/Camera/CameraComponent.h"
 #include "Runtime/HeadMountedDisplay/Public/IHeadMountedDisplay.h"

 // Sets default values
 AMyTestPortalActor::AMyTestPortalActor()
 {
     // Set this actor to call Tick() every frame. You can turn this off to improve performance if you don't need it.
     PrimaryActorTick.bCanEverTick = true;
     RDSceneCapComp2D = NULL;
 }

 // Called when the game starts or when spawned
 void AMyTestPortalActor::BeginPlay()
 {
     Super::BeginPlay();
 }

 // Called every frame
 void AMyTestPortalActor::Tick( float DeltaTime )
 {
     Super::Tick( DeltaTime );
 }

 float AMyTestPortalActor::CalculateFOV(APlayerController* playerController)
 {
     float fov = 90.0f;
     if (playerController != NULL && playerController->PlayerCameraManager != NULL)
     {
         fov = playerController->PlayerCameraManager->GetFOVAngle();
     }

     // FOV changes when we have a VR headset enabled
     if (GEngine->HMDDevice.IsValid() && GEngine->IsStereoscopic3D())
     {
         float HFOV, VFOV;
         GEngine->HMDDevice->GetFieldOfView(HFOV, VFOV);
         if (VFOV > 0 && HFOV > 0)
         {
             fov = FMath::Max(HFOV, VFOV);
             // AspectRatio won't be used until bConstrainAspectRatio is set to true,
             // but it doesn't really matter since the HMD calculates its own projection matrix.
             //OutViewInfo.AspectRatio = HFOV / VFOV;
             //OutViewInfo.bConstrainAspectRatio = true;
         }
     }
     return fov;
 }

 void AMyTestPortalActor::UpdatePortalVPMParameters(UMaterialInstanceDynamic* material, FTransform PlayerCameraXForm, APlayerController* playerController)
 {
     float captureSizeX = RDSceneCapComp2D->TextureTarget->GetSurfaceWidth();
     float captureSizeY = RDSceneCapComp2D->TextureTarget->GetSurfaceHeight();

     const FTransform& Transform = PlayerCameraXForm;
     FMatrix ViewMatrix = Transform.ToInverseMatrixWithScale();
     FVector ViewLocation = Transform.GetTranslation();

     // Swap axes s.t. x=z, y=x, z=y (Unreal coordinate space) so that z is up
     ViewMatrix = ViewMatrix * FMatrix(
         FPlane(0, 0, 1, 0),
         FPlane(1, 0, 0, 0),
         FPlane(0, 1, 0, 0),
         FPlane(0, 0, 0, 1));

     // Half-FOV in radians
     const float FOV = RDSceneCapComp2D->FOVAngle * (float)PI / 360.0f;

     // Build projection matrix
     float XAxisMultiplier;
     float YAxisMultiplier;
     if (captureSizeX > captureSizeY)
     {
         // if the viewport is wider than it is tall
         XAxisMultiplier = 1.0f;
         YAxisMultiplier = captureSizeX / captureSizeY;
     }
     else
     {
         // if the viewport is taller than it is wide
         XAxisMultiplier = captureSizeY / captureSizeX;
         YAxisMultiplier = 1.0f;
     }

     // Near and far clipping planes of 10 and 1000, as described above
     FMatrix ProjectionMatrix = FReversedZPerspectiveMatrix(FOV, FOV, XAxisMultiplier, YAxisMultiplier, 10.0f, 1000.0f);

     const FMatrix ViewProjectionMatrix = ViewMatrix * ProjectionMatrix;

     // Break the matrix into columns (plus the bottom row for the W components)
     // so it can be passed as four vector parameters.
     FVector Xaxis = ViewProjectionMatrix.GetColumn(0);
     FVector Yaxis = ViewProjectionMatrix.GetColumn(1);
     FVector Zaxis = ViewProjectionMatrix.GetColumn(2);
     FVector Waxis = ViewProjectionMatrix.GetColumn(3);
     float XaxisW = ViewProjectionMatrix.M[3][0];
     float YaxisW = ViewProjectionMatrix.M[3][1];
     float ZaxisW = ViewProjectionMatrix.M[3][2];
     float WaxisW = ViewProjectionMatrix.M[3][3];

     material->SetVectorParameterValue("PortalVPM_Xaxis", FLinearColor(Xaxis.X, Xaxis.Y, Xaxis.Z, XaxisW));
     material->SetVectorParameterValue("PortalVPM_Yaxis", FLinearColor(Yaxis.X, Yaxis.Y, Yaxis.Z, YaxisW));
     material->SetVectorParameterValue("PortalVPM_Zaxis", FLinearColor(Zaxis.X, Zaxis.Y, Zaxis.Z, ZaxisW));
     material->SetVectorParameterValue("PortalVPM_Waxis", FLinearColor(Waxis.X, Waxis.Y, Waxis.Z, WaxisW));
 }

Now how does the material know how to use these variables I wonder?

We created our own version of the engine's screen-space UV function, as the engine's does not work properly in VR.

So below is how we set up the material, with the function's output UVs going into the vertex shader's customised UVs rather than the pixel shader UVs (see the attached screenshot).

And here is a screenshot of the function.

In order for the custom node to work, we added some HLSL to make sense of the variables we are feeding into it.

Here is the HLSL.

 float4x4 vpm = float4x4(
     PortalVPM_Xaxis.x, PortalVPM_Yaxis.x, PortalVPM_Zaxis.x, PortalVPM_Waxis.x,
     PortalVPM_Xaxis.y, PortalVPM_Yaxis.y, PortalVPM_Zaxis.y, PortalVPM_Waxis.y,
     PortalVPM_Xaxis.z, PortalVPM_Yaxis.z, PortalVPM_Zaxis.z, PortalVPM_Waxis.z,
     PortalVPM_XaxisW,  PortalVPM_YaxisW,  PortalVPM_ZaxisW,  PortalVPM_WaxisW);
 return mul(float4(In.xyz, 1), vpm);

As you can see, it takes all the parameters we are feeding in and creates a matrix to be used in the rest of the custom maths.
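It is worth convincing yourself that this round trip is lossless: the C++ side sends column j of the matrix (plus the bottom-row entry M[3][j]), and the HLSL float4x4 constructor consumes its sixteen arguments row by row, so feeding columns in this order rebuilds the original matrix exactly. A quick standalone check in plain C++ (with hypothetical helper names, mirroring the two sides):

```cpp
#include <array>

using Mat4 = std::array<std::array<float, 4>, 4>; // row-major, M[row][col]

struct Col { float x, y, z, w; }; // one "PortalVPM_*axis" parameter + its W

// Mimics the C++ side: column j of the matrix, plus the bottom-row entry.
Col ExtractColumn(const Mat4& m, int j) {
    return { m[0][j], m[1][j], m[2][j], m[3][j] };
}

// Mimics the HLSL float4x4(...) call: row i of the rebuilt matrix is
// (X.i, Y.i, Z.i, W.i), i.e. arguments are consumed row by row.
Mat4 Rebuild(const Col& X, const Col& Y, const Col& Z, const Col& W) {
    return {{
        {{ X.x, Y.x, Z.x, W.x }},
        {{ X.y, Y.y, Z.y, W.y }},
        {{ X.z, Y.z, Z.z, W.z }},
        {{ X.w, Y.w, Z.w, W.w }},
    }};
}
```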

Admittedly this got a little out of my depth and my colleague helped out a lot. But if you have any questions about how or why things need to be done, please give me a shout and I will make sure I get an answer for you.

[1]: /storage/temp/45741-ue4portalanswer01.jpg


answered Jun 09 '15 at 04:23 PM


SrGeneroso Apr 06 '16 at 12:05 PM

Sure, it looks amazing, and it could also help me see how off-axis projection (parallax) works, or whatever this concept is called. My main interest is to achieve something similar using head tracking, treating the monitor as one of those portals. I'm researching how to begin, and it seems I have a lot to learn from your work, so thanks for sharing.

Trex. Apr 20 '17 at 01:22 PM

Hi! Thanks for the snippet of code. I'm also trying to achieve a portal in VR.

So far, with the code above applied to a simple plane, I get a different result for each triangle of the plane. Any idea?

What I'm trying to achieve next is to get this portal working in stereo (maybe with 2 scene captures)

AgentMilkshake1 Apr 20 '17 at 01:57 PM

Hey Trex,

I haven't looked at this for a long time, but I have a feeling you should be able to do it without engine changes. I believe they added a stereo option to scene capture components. I haven't had the chance to play around with it, but it might be that they added this for creating VR scene captures with a stereo view (so they don't appear flat).

If you get a chance to try it out, let me know how you get on, I would love to revisit this one day, and that is one of the first things I want to investigate.

P.S. I'm not sure the code snippet is still 100% correct for current engine versions. I think I did this during 4.7, so some of the code might not apply if they have made changes in the classes mentioned above. :)

BlackFangTech Sep 14 '17 at 08:18 AM

What is PlayerCameraXForm intended to be? World-space transform of the player camera? Would it be possible for you to post a screenshot of the portal blueprint using these nodes?
