Getting UVs of SceneTexture

Hi there,

I’m trying to modify the UVs of SceneTexture through a Post Process Material.
I currently have a Post Process Material in the UE4 Editor with the following connectivity: TexCoord->SceneTexture:PostProcessInput0 (UV in, Color out)->MyMaterial (Emissive in).

So I’m essentially trying to replace TexCoord with my own UVs, defined in C++ and dynamically updated every frame.

Here are my questions (I’m trying to avoid modifying the engine source code itself):

  • How do I get a C++ pointer to the “UV” member of “SceneTexture” (Editor)?
  • What is the expected format for UVs in SceneTexture? An array of Vector2D, a 2D array of floats (represented as a 1D array?), …
  • Is there a way to access the SceneTexture’s UVs from the Editor, but using Blueprints?
  • Is there a C++ sample that shows how to dynamically modify the UVs of SceneTexture?

Additional question:

  • Is there a way of injecting my own pixel shader into the UE pipeline without touching the UE source code?

Please note that I initially tried to use Textures as inputs for that, but there is no 32-bit floating-point format. I managed to get the textures modified in real time, but since there is no float format the result was far too aliased and unusable.

Best Regards,

Phoenix.

How do I get a C++ pointer to the “UV” member of “SceneTexture” (Editor)?

The UV material expression/node is only there to generate some HLSL text that passes the UV from the vertex shader to the pixel shader. A C++ pointer would not help you in any way.

What is the expected format for UVs in SceneTexture? An array of Vector2D, a 2D array of floats (represented as a 1D array?), …
Is there a way to access the SceneTexture’s UVs from the Editor, but using Blueprints?

The UV comes from the mesh you hook up. It can come in many formats, e.g. Skeletal Mesh, Static Mesh, Particle, …

In order to provide custom UVs you can create/manipulate your own mesh and render it with a shader that simply uses the normal UVs.
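
Roughly, with the ProceduralMeshComponent plugin (assuming your engine version includes it), a mesh with custom UVs built from C++ can look like this; the function name and the quad layout are only illustrative:

    #include "ProceduralMeshComponent.h"

    // Build a simple quad whose UV channel is filled entirely from C++.
    // CustomUVs must contain one FVector2D per vertex (4 here) and can be
    // regenerated every frame and pushed again with UpdateMeshSection.
    void BuildQuadWithCustomUVs(UProceduralMeshComponent* Mesh, const TArray<FVector2D>& CustomUVs)
    {
        TArray<FVector> Vertices;
        Vertices.Add(FVector(0.f, 0.f, 0.f));
        Vertices.Add(FVector(100.f, 0.f, 0.f));
        Vertices.Add(FVector(100.f, 100.f, 0.f));
        Vertices.Add(FVector(0.f, 100.f, 0.f));

        TArray<int32> Triangles;
        Triangles.Add(0); Triangles.Add(1); Triangles.Add(2);
        Triangles.Add(0); Triangles.Add(2); Triangles.Add(3);

        TArray<FVector> Normals;            // left empty, the component uses defaults
        TArray<FColor> VertexColors;
        TArray<FProcMeshTangent> Tangents;

        Mesh->CreateMeshSection(0, Vertices, Triangles, Normals, CustomUVs,
                                VertexColors, Tangents, /*bCreateCollision=*/false);
    }

The material on that mesh then just reads TexCoord, and whatever you wrote into CustomUVs arrives in the pixel shader.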

Is there a C++ sample that shows how to dynamically modify the UVs of SceneTexture?

The TextRender component does that but you might find even better examples. You should be able to do that outside the engine but I would tackle that later.

Additional question: is there a way of injecting my own pixel shader into the UE pipeline without touching the UE source code?

You can add HLSL code to Common.usf or MaterialTemplate.usf and reference it in your material with the Custom node. This might not be portable (GLSL, ES2, ES3, PS4, Metal, …) and it might break when we change the engine internals. It also means many shaders need to be recompiled. Press “Ctrl Shift .” to recompile all shaders, or modify a single material and press Apply to recompile that material only. Because of those issues we generally try to avoid that direction.

Please note that I initially tried to use Textures as inputs for that,

That can be solved but texture coordinates are likely to be faster anyway.

BTW: Look at the Oculus Rift code in our engine - that is already doing things you want to do.

Hi Martin,

Thank you for the fast answer.

Let me try to explain what I have in mind.
When I use a Gradient Material Function and take its output to feed the UV of the SceneTexture, I get the exact result I am looking for.
So I am wondering if there is any way to feed such a function with a float array directly from C++ or from a Blueprint?
That would be the easiest way. Does that make sense?

With regards to the mesh, that sounds a bit weird to me, so I probably don’t get the way UE is designed.
I’m using a post-process material here, so there is no real mesh. The screen is the target, if you see what I mean. It is supposed to have UVs (which is the case according to SceneTexture) that can be replaced by a parameter. I am looking for the parameter that I would have to add in the Editor which is both modifiable from C++ and connected to the UV field of SceneTexture. Is that possible?
Or shall I create a dynamic mesh, change its UVs from C++, and then attach it to the SceneTexture in the Editor?

There is indeed code in your engine to do what I’d like to do, but I’m trying not to add my own classes to the engine. I would like to use the C++ SDK and the Blueprint features. So if I do not intend to modify the Engine, the Oculus Rift example is of no help. Am I right to say that?

Regards,

Phoenix.

I get the exact result I am looking for.

Your material function uses UV as input - UV doesn’t exist on the CPU side. It’s generated by the rasterizer for each pixel.
You cannot just pass a single value, as it would look the same for every pixel. If you pass a 1D or 2D array to the GPU, it’s a texture or a buffer. Textures can be interpolated/filtered, which is faster than doing it manually.
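
For the texture route, here is a minimal sketch of what passing a 2D array to the GPU can look like, assuming a two-channel 32-bit float format (PF_G32R32F) and UTexture2D::CreateTransient; the Lock/UpdateResource path below is the simple way to refresh the data every frame, not the fastest:

    #include "Engine/Texture2D.h"

    // One UV pair per texel, stored as two 32-bit floats.
    UTexture2D* CreateUVTexture(int32 Width, int32 Height)
    {
        UTexture2D* Tex = UTexture2D::CreateTransient(Width, Height, PF_G32R32F);
        Tex->SRGB = false;            // raw data, no gamma correction
        Tex->Filter = TF_Bilinear;    // let the sampler interpolate between texels
        Tex->UpdateResource();
        return Tex;
    }

    // Upload new UV data (2 floats per texel, so Width * Height * 2 entries).
    void UpdateUVTexture(UTexture2D* Tex, const TArray<float>& UVData)
    {
        FTexture2DMipMap& Mip = Tex->PlatformData->Mips[0];
        void* Dest = Mip.BulkData.Lock(LOCK_READ_WRITE);
        FMemory::Memcpy(Dest, UVData.GetData(), UVData.Num() * sizeof(float));
        Mip.BulkData.Unlock();
        Tex->UpdateResource();        // simple, but re-creates the RHI resource each call
    }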

Or shall I create a dynamic mesh, change its UVs from C++, and then attach it to the SceneTexture in the Editor?

You either use a texture (filtering can be fixed as said in another thread) or a mesh.

I would like to use the C++ SDK and the Blueprint features.

Rendering algorithms need to be very fast, and exposing flexibility costs performance. We intend to make it more flexible, but at the moment you have to change engine internals in some cases.

Hi Martin,

I finally solved my problem by trying textures again.
I created a texture in PF_G32R32F and overrode the material texture in C++ by grabbing a pointer to my custom Distortion Material created in the Editor. So I’m now streaming floating-point UV maps in real time and applying them to the SceneTexture.
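
In code the override looks roughly like this; the parameter name "UVMap" is just a placeholder for the TextureSampleParameter2D in the Distortion Material, and the AddBlendable call assumes the material is hooked into post processing as a blendable:

    #include "Materials/MaterialInstanceDynamic.h"
    #include "Engine/PostProcessVolume.h"
    #include "Engine/Texture2D.h"

    void ApplyDistortion(UObject* Outer, UMaterialInterface* DistortionMaterial,
                         UTexture2D* UVTexture, APostProcessVolume* Volume)
    {
        // Instance the Editor-authored material so it can be changed at runtime.
        UMaterialInstanceDynamic* MID = UMaterialInstanceDynamic::Create(DistortionMaterial, Outer);

        // "UVMap" stands in for the texture parameter feeding the SceneTexture UV input.
        MID->SetTextureParameterValue(TEXT("UVMap"), UVTexture);

        // Blend the instance into post processing; weight 1.0 means fully applied.
        Volume->Settings.AddBlendable(MID, 1.0f);
    }

After that, only the texture data needs to be refreshed every frame; the material instance keeps pointing at the same texture object.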

However, I would like to suggest an additional feature: adding more texture formats in the Editor to support 32-bit floating-point textures and … buffers! That would make it much easier…
I would be happy to do it myself, but I would then need to know where to modify the GUI in your engine…

I’m now facing two more issues:

  • How do I add filtering to that new texture? Any example I could use?
  • How to render the scene at a bigger scale and then rescale it to the right size? For example, with a fisheye lens many points project outside the boundaries of the picture, so I would need to render the picture at a bigger size, distort it and rescale it…

All the best,

Phoenix.

However, I would like to suggest an additional feature

We add those as we need them. It should not be hard to add one yourself.

How do I add filtering to that new texture

I answered this before; that should be solved now, shouldn’t it?

How to render the scene at a bigger scale and then rescale it to the right size

When the view is requested (you can look at the editor code) you can request it at a higher resolution. In game, r.ScreenPercentage allows you to do something similar. You need to look at the code related to that.
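
From C++ that can look roughly like this (the value 150 is only an example, and the include path may differ between engine versions); the same thing can be done from the console or Blueprint with "r.ScreenPercentage 150":

    #include "HAL/IConsoleManager.h"

    // Render the scene at 150% of the target resolution and let the engine
    // rescale it to the output size.
    void SetHigherScreenPercentage()
    {
        IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage"));
        if (CVar)
        {
            CVar->Set(150.0f);
        }
    }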

Hi Martin,

Just got back today from travelling.
Let me have a look at that.

Cheers,

Sam.