Using CustomDepth in material

I’ve successfully rendered stuff to CustomDepth. However, it seems to be impossible to read this back in a material. I get an error message saying: “SceneTexture expression cannot be used in opaque materials”. Is there a way to read in the custom depth?

Yes, you can use it in a material. However, SceneTexture is intended only for translucent materials, for doing post-process-like effects through geometry.

Hi,

We think this post contains useful information which we would like to share with our public UE4 community. With your approval, we would like to make a copy of this post on the public AnswerHub which includes the discussion but strips out your username and company name. Please let us know if you are okay with this.

Thanks!

Maybe I’m approaching the issue in the wrong way. I need to render an object into a depth buffer with front face culling turned on and then render the object with back face culling turned on. This way I can calculate the depth of the object for each pixel. How should I proceed with this?

As it’s quite a specific task, short of making a more complex pass that sorts and keeps the best two depths/triangles, I think your dual-depth-layer approach is a good one. As an alternative accumulation idea, you could disable all culling and depth testing, then output all depths to a custom buffer with min and max blending. There are multiple good ways to tackle this first step.
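The min/max accumulation idea can be sketched on the CPU as follows. This is a minimal, hypothetical illustration (the names are not engine types): each fragment writes its depth to two channels, one blended with MIN and one with MAX, so after the pass each pixel holds its nearest and farthest surface, and thickness falls out as the difference.

```cpp
#include <algorithm>
#include <cfloat>

// Hypothetical per-pixel state for the "no culling, min/max blending" idea.
struct DepthPixel {
    float MinDepth = FLT_MAX;  // MIN-blended channel
    float MaxDepth = 0.0f;     // MAX-blended channel
};

// Emulates what MIN/MAX blend ops would do per incoming fragment.
void AccumulateFragment(DepthPixel& Pixel, float FragmentDepth) {
    Pixel.MinDepth = std::min(Pixel.MinDepth, FragmentDepth);
    Pixel.MaxDepth = std::max(Pixel.MaxDepth, FragmentDepth);
}

float Thickness(const DepthPixel& Pixel) {
    // No fragment ever landed on this pixel: treat as zero thickness.
    if (Pixel.MinDepth > Pixel.MaxDepth) return 0.0f;
    return Pixel.MaxDepth - Pixel.MinDepth;
}
```

On the GPU the same effect would come from MIN/MAX blend operations on a two-channel target; the sketch just makes the bookkeeping explicit.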

After either approach, put the depth comparison in a material/post-process that fits as mentioned above, or add a new pre-pass that uses the two depths to calculate your required output in a standalone buffer. Again it depends on your needs, but you could likely skip at least one pass by binding a new buffer during the (flipped-culling) custom depth pass and directly calculating thickness or similar to the output.

I’ll check with a few colleagues to be sure I haven’t missed any editor features to make this more convenient, but the most straightforward way would likely be to modify the custom depth pass into a depth comparison pass. It’s worth noting that there’s a lot of scope for optimization here if you’re after specific data for a fixed subset of passes, though.

Chris

As a coder I know the techniques and am able to do those in our own engine but the limiting factor for me is UE at the moment.

Found it. In BasePassRendering.h there is an if-statement:

if (!(Parameters.TextureMode == ESceneRenderTargetsMode::DontSet && bNeedsSceneTextures))

that prevents rendering with materials that use scene textures… I don’t feel comfortable disregarding the statement since it’s there for a purpose, right? However, there is a comment about it being a sanity check… Might it be okay to comment it out entirely?

I’m quite new to UE so a bit more specific instructions would be appreciated. From what I understood from the answer, there seems to be a way to define additional passes, correct? And rendering to custom buffers can be done in those passes? How do I do these things in practice?

For the custom depth rendering I’d recommend taking a look through FSceneRenderer::RenderCustomDepthPass() in SceneRendering.cpp. It’s a pretty straightforward draw of the objects with the custom depth flag into the custom depth buffer. If you wanted to modify the target(s) used, you could take a look at how FSceneRenderTargets::RequestCustomDepth() in SceneRenderTargets.cpp works and see where the created targets are utilized.

As an alternative to the above methods we discussed, a colleague informed me that it may be simpler to allow the custom depth use in opaque materials and carry on with your original method, as it should be safe to do so. For that you could try taking a look in HLSLMaterialTranslator.h and seeing how ‘bNeedsSceneTextures’ and ‘bNeedsGBuffer’ are used. You should be able to change the code to not give the error when ‘SceneTextureId == PPI_CustomDepth’.
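The relaxed check described above could look something like the following. This is a self-contained, hypothetical sketch: the enum values and function name only mirror the engine’s `SceneTextureId` handling for illustration; the real logic lives in HLSLMaterialTranslator.h.

```cpp
// Hypothetical mirror of a few scene texture IDs for illustration only.
enum ESceneTextureId { PPI_SceneColor, PPI_SceneDepth, PPI_CustomDepth };

// Original behaviour: any scene texture in an opaque material is an error.
// Relaxed behaviour: let CustomDepth through, since it is rendered in its
// own pass and does not depend on the GBuffer being written first.
bool IsSceneTextureErrorInOpaque(ESceneTextureId SceneTextureId) {
    return SceneTextureId != PPI_CustomDepth;
}
```

The point of gating on the specific ID, rather than disabling the error wholesale, is that the other scene textures genuinely aren’t available yet during the base pass.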

I believe this would require moving where RenderCustomDepthPass() is called in DeferredShadingRenderer.cpp to somewhere before the SetAndClearViewGBuffer() call. I’ve not tried that change myself, so I’d be interested to know how well it works.

Sure thing.

Thanks for the info. Sorry for the late response, I’ve been out of office for a couple of weeks. I’ll look into the suggested methods and let you know how it went.

I’m following the second instruction of your previous reply. I’ve commented out this line for now:

Errorf(TEXT("SceneTexture expressions cannot be used in opaque materials"));

That way I’m able to use the custom depth target as an input. I run into problems when I want to call RenderCustomDepthPass() before RenderPrePass(). The problem is that RenderCustomDepthPass() calls FinishRenderingCustomDepth() which calls:
auto& CurrentSceneColor = GetSceneColor();

I think the problem boils down to AllocSceneColor() not having been called yet. Is there a specific reason that FSceneRenderTargets::FinishRenderingCustomDepth() needs to do that CopyToResolveTarget() on the SceneColor target? I seek your advice on this because I’m sure that with my method I would end up rewriting stuff unnecessarily. I’m sure there is a simple fix to this, right?

Something funny is going on… When I use the CustomDepth buffer in the material, suddenly the object doesn’t get rendered in the base pass. I’m using RenderDoc to try to figure things out. Is there a check somewhere in the base pass so that if the material uses the GBuffer or other scene buffers, it doesn’t get rendered?

Sorry for the delayed response, I wasn’t sure if you were documenting what you were trying or asking for advice, but as you didn’t follow up I assume it was the latter.

I had a quick look at the code you mentioned and I believe you’re right: it’s safe to disable for now. As mentioned in the comment you referred to, most of those edge cases should be dealt with elsewhere, but be on the lookout for any corruption, or even set a breakpoint in there manually and see what calls through aside from your desired case. You could always extend the code to check and only let your specific requirements through, if you know the buffers in question are safe to use.

I’m not aware of any specific issues with re-ordering the code to allow this case to pass, but if you’re not seeing any progress I’ll ask the rest of the rendering team for someone more familiar with the area to comment.

Thanks for the answer. I’ll try to be more clear when asking questions. So far I’ve made some progress and am getting closer to what we need. I’m able to render back faces into CustomDepth and use it in the object shaders. That way I can compute the thickness of the object. Next I will look into how to bind PostProcessInputs to the same shader. The idea is that I’ll use the thickness to blend the PostProcessInputs with the current color. I believe I need to somehow include the FPostProcessPassParameters::SetPS() call to have the targets bound… I will look into it. After I’m done I could report the steps required to get these things done, so that if somebody else needs similar features my findings could be shared. Or better yet, included in the UE engine itself.
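The thickness-driven blend described here can be sketched as a plain lerp. This is a minimal CPU-side illustration, not shader code; `ThicknessScale` is an assumed tuning parameter (not from the original discussion) controlling how much thickness is needed before the post-process input fully takes over.

```cpp
#include <algorithm>

struct Color { float R, G, B; };

// Blend the current colour toward a post-process input by object thickness.
// ThicknessScale is a hypothetical tuning parameter: at Thickness >= scale
// the result is entirely the post-process input.
Color BlendByThickness(Color Current, Color PostProcessInput,
                       float Thickness, float ThicknessScale) {
    float T = std::min(std::max(Thickness / ThicknessScale, 0.0f), 1.0f);
    return { Current.R + (PostProcessInput.R - Current.R) * T,
             Current.G + (PostProcessInput.G - Current.G) * T,
             Current.B + (PostProcessInput.B - Current.B) * T };
}
```

In the actual material this would be a saturate-and-lerp on the thickness computed from the CustomDepth comparison.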

I need some help regarding binding the postprocessinputs to the object shader. My current approach is that I add two new FShaderResourceParameters to FDeferredPixelShaderParameters. These are:

FShaderResourceParameter PostprocessInputParameter;
FShaderResourceParameter PostprocessInputParameterSampler;

I bind those in FDeferredPixelShaderParameters::Bind().

I’ve also added them to the FArchive like this:

Ar << Parameters.PostprocessInputParameter;
Ar << Parameters.PostprocessInputParameterSampler;

in FDeferredPixelShaderParameters::Set() I set them like this:

IPooledRenderTarget* Texture = GSystemTextures.BlackDummy;
FSamplerStateRHIParamRef Filter = TStaticSamplerState<>::GetRHI();
SetTextureParameter(RHICmdList, ShaderRHI, PostprocessInputParameter, PostprocessInputParameterSampler, Filter, Texture->GetRenderTargetItem().ShaderResourceTexture);

Note that I’m using BlackDummy as I do not yet know how to get the actual targets…

However, this approach does not work. It turns out that the inputs to the shader are just some random textures, including the previously successfully bound CustomDepth.

How to do this properly?

The post-process inputs are defined in PostProcessCommon.usf and their names include their numbers, e.g.

Texture2D PostprocessInput0;
SamplerState PostprocessInput0Sampler;

It might be fine to use those existing inputs if they’re included in the shader you’re trying to use, but that may not be the case. You might also find it clearer to add a new name yourself; then you can be pretty certain whether or not the code is in place as needed.

I’d suggest having a look in DeferredShadingCommon.usf and seeing what happens in there. Follow through one of the other targets, e.g. CustomDepth; hopefully you can then spot where you’re either not binding or not setting a particular texture, whether it’s missing from a list that populates a drop-down, or whether the parameter simply doesn’t exist in that shader.

The problem was solved by adding this to FHLSLMaterialTranslator::Translate() in HLSLMaterialTranslator.h:

if (bNeedsSceneTexturePostProcessInputs)
{
    OutEnvironment.SetDefine(TEXT("POST_PROCESS_MATERIAL"), 1);
}
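The mechanism behind that fix can be illustrated with a standalone sketch: during translation, flags accumulated from the material graph are turned into preprocessor defines in the shader compile environment. The types and names below are hypothetical stand-ins for the engine’s, kept self-contained for illustration.

```cpp
#include <map>
#include <string>

// Hypothetical stand-in for the shader compile environment.
struct FShaderEnvironment {
    std::map<std::string, int> Defines;
    void SetDefine(const std::string& Name, int Value) { Defines[Name] = Value; }
};

// Mirrors the shape of the fix: a translator flag becomes a define, which
// makes the material compile as if it were a post-process material, so the
// PostprocessInputN declarations become available to it.
void ApplyTranslatorDefines(FShaderEnvironment& OutEnvironment,
                            bool bNeedsSceneTexturePostProcessInputs) {
    if (bNeedsSceneTexturePostProcessInputs) {
        OutEnvironment.SetDefine("POST_PROCESS_MATERIAL", 1);
    }
}
```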