How to read float values from a texture?
I have a couple of issues/questions regarding the material editor.
First: I would like to use a single channel 32 bit grayscale image as a texture in order to pass float values within it. Does Unreal support any image format with these characteristics? If so, which image formats would be advisable?
Assuming that isn't possible, the only way I can think of to accomplish the same result would be to encode a float value into an RGBA pixel (8 bits per channel) and use that texture instead. The problem would then be decoding the RGBA pixel back into a float value. What I would really need are bitwise operations in the material editor to decode the pixel.
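To sketch what I mean (plain Python for illustration, not engine code): a float in [0, 1) can be packed into four 8-bit channels and recovered with only multiplies, floors, and a weighted sum, so in principle no true bit-shift operation is required — the same arithmetic could run in a shader.

```python
# Illustration only: pack a [0, 1) float into four 8-bit channels and
# decode it back using just multiply/floor/fraction arithmetic.

def encode_rgba(value):
    """Pack a float in [0, 1) into four 8-bit channels (R, G, B, A)."""
    channels = []
    for _ in range(4):
        value *= 256.0
        byte = int(value)          # integer part becomes this channel
        channels.append(byte)
        value -= byte              # carry the fractional remainder on
    return tuple(channels)

def decode_rgba(rgba):
    """Recover the float: a weighted sum (a dot product, in shader terms)."""
    weights = (1 / 256.0, 1 / 256.0**2, 1 / 256.0**3, 1 / 256.0**4)
    return sum(c * w for c, w in zip(rgba, weights))

original = 0.3785
rgba = encode_rgba(original)
# Round-trip error is bounded by 256^-4, i.e. about 2.3e-10.
assert abs(decode_rgba(rgba) - original) < 1e-8
```

The decode step is just a dot product against constant weights, which every material system can express without bitwise nodes.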
So, my second question is: are there any nodes that perform bitwise operations (like bit shifts) in the material editor?
Again, assuming there aren't, I know I could probably do it in HLSL with a custom node. However, that leads me to my third (and hopefully last :-) ) question: would any custom HLSL code that I write in the material node be translated into an appropriate language if the target platform doesn't use HLSL?
Thanks in advance!
asked May 27 '14 at 08:44 AM in Rendering
I am not sure about the HLSL custom node question, but any texture is essentially already a series of float values once it is sampled. You may have to counteract gamma curves, depending on where your values came from, and uncheck the sRGB flag in the texture properties in the engine.
For example, I recently did a material function that simulates a per-object shadow by projecting the object's captured, locally normalized depth onto its local sphere bounds and comparing depths to determine whether any given pixel is shadowed by one above it (or closer to the light source). The texture data can be grayscale or compressed and it doesn't matter; both are interpreted as 0-1, and RGBA just has more channels. In Photoshop, colors are read as 0-255, so 127 reads as roughly 0.5 and 255 reads as 1.0. You can bias/scale your values however you need to on the editor side.
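As a quick sanity check on that mapping (plain Python arithmetic, not engine code):

```python
# Photoshop shows channel values as 0-255; the material editor sees the
# same data normalized into the 0-1 range.

def to_material_range(byte_value):
    """Convert an 8-bit channel value (0-255) to the 0-1 range."""
    return byte_value / 255.0

assert to_material_range(0) == 0.0
assert to_material_range(255) == 1.0
assert abs(to_material_range(127) - 0.498) < 0.001  # i.e. ~0.5
```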
Now, the bit about counteracting gamma curves. In my example case, the shadow depth texture is captured in the editor using an orthographic camera Blueprint with a shader that computes the local depth, applies it to the mesh, and then takes a screenshot in unlit mode. But unlit mode has a default gamma curve. I can either apply a power of 2.2 in the material to counteract the gamma at the material level (the best option for preserving even precision across layers), or counteract the gamma afterwards in Photoshop. To do it in Photoshop, you make a levels adjustment and set the midpoint to 0.4545. You may notice that 0.4545 is approximately the reciprocal of 2.2 :)
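The arithmetic behind those two numbers (again, just Python as illustration): raising to the power 2.2 exactly undoes raising to the power 1/2.2, and 0.4545 is that reciprocal.

```python
# A simple power-law gamma curve and its inverse. pow(x, 2.2) undoes
# pow(x, 1/2.2), which is why either fix restores linear values.

def apply_gamma(x, gamma=2.2):
    """Gamma-encode a linear value (brightens mid-tones)."""
    return x ** (1.0 / gamma)

def remove_gamma(x, gamma=2.2):
    """Undo the gamma curve, restoring the linear value."""
    return x ** gamma

v = 0.5
encoded = apply_gamma(v)                    # ~0.7297 after the curve
assert abs(remove_gamma(encoded) - v) < 1e-9

# The Photoshop levels midpoint: 1/2.2 is approximately 0.4545.
assert abs(1.0 / 2.2 - 0.4545) < 0.0001
```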
After doing either of those tweaks, I can make a material that outputs the value 0.5 to emissive, take an unlit screenshot (compensating for gamma either in the editor or in Photoshop), then import a texture of that screenshot, use a color picker on the rendered image, and get my original 0.5 value back (which the color picker reads as ~127).
One more reminder to uncheck the sRGB box in the texture properties. Leaving it checked applies another, unnecessary gamma curve that will screw up your values. I have missed this step a few times even though I should know better by now, and wasted a few minutes debugging my material thinking it was busted :)
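To show how badly that pitfall skews things (Python illustration; the engine's actual sRGB transfer function is close to, but not exactly, a 2.2 power curve):

```python
# Illustration of the sRGB pitfall: with the sRGB flag checked, the
# engine linearizes the texture (roughly pow(x, 2.2)), so a stored
# 0.5 no longer reads back as 0.5 in the material.

stored = 0.5
read_with_srgb_flag = stored ** 2.2   # ~0.218 -- values are "screwed up"
read_without_flag = stored            # 0.5, as intended

assert abs(read_with_srgb_flag - 0.218) < 0.001
assert read_without_flag == 0.5
```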
I hope that answers your question. It is possible to encode float values and read them in a predictable way with a bit of careful workflow.
One more tip: the material function DebugScalarValues is VERY useful here. It lets you read the exact value the material editor sees from a texture. It reads per pixel, so for testing values it is best to use solid-color textures.