
Clarification on SceneDepth Visualization


I am looking at the SceneDepth buffer visualization and am somewhat confused about what it's doing. It takes the scene depth texture (which I assume is the depth buffer), multiplies it by 0.0001, and then does a frac on it? Can someone explain what the 0.0001 is and why it's doing a frac?

Also, I am trying to create a buffer visualization that gives something similar to SceneDepthWorldUnits, but instead of measuring radially from the camera position, it would be the accurate z distance from the camera plane. I tried using the PixelDepth node as a start, but it just returns 1.0. Is that being clamped due to shader precision, or is it something I am doing wrong?

I tried the SceneDepth node, and while the relative values seem right, the absolute values don't seem accurate. For example, I placed a plane parallel to the camera plane, 50cm in front of the camera in the world. The SceneDepth values for the parts of the viewport covered by that plane all have the same value, but it is 5.88672, which doesn't appear to be Unreal units, cm, or meters. And a mesh 470cm away has a SceneDepth value of 15.7891. Is there any way to convert the output to a real-world unit?

P.S. Is there a way to upload patches for fixing bugs in the engine code like there was for UE3?


Product Version: UE 4.10

asked Nov 13 '15 at 01:43 AM in Rendering


AndrewHurley Nov 13 '15 at 04:56 PM

Hey zhtet,

I believe our documentation on Depth expressions will help explain some of your questions regarding Scene Depth. Keep in mind that only translucent materials can utilize scene depth.

Depth Expressions - Scene Depth


If you have further questions after taking a look at this documentation, let me know and I will do my best to provide you with an in 'depth' answer.

To answer your second question simply: yes. However, this does require a source build of the engine. What you can do is submit a pull request via GitHub. To get set up for this process, follow the links and documentation below.

GitHub Setup


Getting Started: Engine Sources References


Let me know if you have further questions.


Andrew Hurley

zhtet Nov 13 '15 at 06:54 PM

Hi Andrew, thanks for the response. However, as stated in https://answers.unrealengine.com/questions/228348/how-to-normalise-the-scenedepth.html, the documentation on that page might be a little outdated? Also, I am still not sure how to interpret the results from SceneDepth to get an actual real-world distance. And only translucent materials can use SceneDepth? I'm not sure why. So what do I use to just get the values in the depth buffer regardless of object type? I've hooked up a path in the engine source to get FSceneRenderTargets.GetSceneDepthTexture(), but figured I'd use the buffer visualization if that works better. Sorry if I am missing anything obvious.

Ok I am only working with the source build, but do you mean to submit a push request?



1 answer

Hey zhtet,

I needed to loop in some further support to get the correct information, but I was provided with some really helpful insight in regards to how the Scene Depth and its visualizer is calculated.

"The normalization equation provided by the documentation was not necessarily accurate, but the principle of normalization still applies. The multiplication by 0.0001 and the frac are what create that striped pattern you see in the Scene Depth visualization, because otherwise it would be a solid gradient from 0 to 2^24-1, which to most people looks like a bunch of whiteness. Multiplying Scene Depth by a fractional number (0.0001) is the same as dividing by a whole number (10,000), and both limit the range of the depth sampled. The frac is useful only in the visualization, because it helps users see the depth based on the divided amount. So x0.0001 is saying: make a gradient from 0 to 10,000 units of Scene Depth, and anything beyond 10,000 units will be 1. The frac then essentially tells the gradient to repeat, because it only keeps fractional values." - Eric Ketchum

The correct GitHub terminology for submitting code to be added to the engine is a pull request, since you are requesting that your submitted code be pulled from GitHub and integrated into the engine.

Hopefully this clarifies things, but let me know if you have further questions.

Thank you,

Andrew Hurley


answered Nov 13 '15 at 08:14 PM

zhtet Nov 13 '15 at 09:01 PM

Hi Andrew, thanks for that detailed response; it clarified some of my questions. One question I still have is whether there is a way to translate SceneDepth into a real-world value, whether uu, cm, or meters?


AndrewHurley Nov 14 '15 at 03:18 PM

You might have to create your own conversion table, since the visualizer is using its own units to map the 0 to 10,000 unit distance from the camera.

In other words, create an object and place it 1 cm or 1 Unreal unit (whichever you prefer) in front of the camera. Compare that single unit to the Scene Depth distance, and use this as your base per-unit measurement. This is really the only way I can see to convert the visualizer output into real-world units.


Andrew Hurley

zhtet Nov 17 '15 at 04:16 AM

Hi Andrew, thanks for the response. So I tried taking the depth buffer values and applying FSceneView::ScreenToWorld to the values ([-1,1], [-1,1], z, 1.0), and I am still not getting either the absolute or the camera-relative world position. I've tried the output from SceneDepth as the z, as well as the value in FSceneRenderTargets.GetSceneDepthTexture(). Neither seems to give the correct values.

The closest is SceneDepth with ScreenToWorld, but for a relative position of 50cm it gives a value of -5448; for 100cm, -10904; for 200cm, -21808, which obviously runs out of accuracy since just 200cm already produces a huge value. Do you have any suggestions for approaching this? Has no one really done this yet? The way you mentioned doesn't work, since the depth values are projected coordinates and won't be linear.


AndrewHurley Nov 17 '15 at 03:55 PM

Hey zhtet,

I consulted another team member to find a helpful solution, and we came up with an approach that requires only a little bit of setup and works well within the scope of its design.

What we did was take the vector length (a float value) from the camera to an object, divide it by a value (a max distance divisor), and clamp it to a 0 to 1 output. We then created a material with a Material Parameter Collection that changes from 0 (black) to 1 (white) based on the distance from the camera, using the max distance divisor.

[image: material setup for the camera-distance visualization]

The material is a simple Material Parameter Collection (scalar) plugged into the base color. The only catch to this setup is that you would need to apply this material to every object for which you want to calculate your custom scene depth. As for whether someone else has tried this, you might have some luck posting to and searching the forums for more answers from the community.


Andrew Hurley
