HLSL compute shader FViewUniformShaderParameters not populated.

Hey, I'm running some HLSL that uses View.ScreenPositionScaleBias to cast a ray from the camera position, but it seems to be (0.0, 0.0) every time. I'm using the code from Temaran's compute shader plugin example to render out to a texture based on a ray cast from the camera into world space. My HLSL looks like this:

   #include "Common.usf"

   void MainPixelShader( in float4 uv : TEXCOORD0, 
                         out float4 OutColor : SV_Target0 )
   {
     OutColor = world_space_color( uv_projection(uv) );
   }

   float2 uv_projection( float4 uv )
   {
     float2 screen_uv = uv.xy / uv.w * View.ScreenPositionScaleBias.xy + View.ScreenPositionScaleBias.wz;
     float  scene_depth = CalcSceneDepth(screen_uv);
     float4 h_world_pos = mul(float4(uv.xy * uv.w * scene_depth, scene_depth, 1), View.ScreenToWorld);
     float3 world_pos = h_world_pos.xyz / h_world_pos.w;
     float3 camera_dir = normalize(world_pos - View.ViewOrigin.xyz);
     return intersect_ray_plane( world_pos, camera_dir);
   }

   float2 intersect_ray_plane(float3 origin, float3 direction)
   {
     float3 planePos = float3( 0,0,0 );
     float3 planeNormal = float3( 0,0,1 );
     float denom =  dot(direction, planeNormal);
     if (abs(denom) < 0.0000001 )
     {
       return float2(0.0, 0.0);
     }
     float t = dot((planePos - origin), planeNormal) / denom;
     return (origin + t * direction).xy;
   }

but View.ScreenPositionScaleBias.xy is always (0.0, 0.0), at which point everything falls apart. I'm using ENQUEUE_UNIQUE_RENDER_COMMAND_ONEPARAMETER to add the shader work to the render thread (the enqueue call itself is sketched after the snippet below). Does that mean I happen to be rendering before FViewUniformShaderParameters has been populated, or do I need to somehow bind the data to my pixel shader when it executes within its threaded function? At the moment that function is:

    // Runs on the render thread.
    FRHICommandListImmediate& RHICmdList = GRHICommandList.GetImmediateCommandList();

    // Render into the procedural texture's render target.
    SetRenderTarget( RHICmdList, current_render_target_->GetRenderTargetResource()->GetRenderTargetTexture(), FTextureRHIRef() );
    RHICmdList.SetBlendState( TStaticBlendState<>::GetRHI() );
    RHICmdList.SetRasterizerState( TStaticRasterizerState<>::GetRHI() );
    RHICmdList.SetDepthStencilState( TStaticDepthStencilState<false, CF_Always>::GetRHI() );

    // Bind a simple pass-through vertex shader and the pixel shader above.
    static FGlobalBoundShaderState BoundShaderState;
    TShaderMapRef<FBasicVertexShader> VertexShader( GetGlobalShaderMap( feature_level_ ) );
    TShaderMapRef<FTerrainShader> PixelShader( GetGlobalShaderMap( feature_level_ ) );

    SetGlobalBoundShaderState( RHICmdList, feature_level_, BoundShaderState, GTextureVertexDeclaration.VertexDeclarationRHI, *VertexShader, *PixelShader );

    PixelShader->SetUniformBuffers( RHICmdList, constant_parameters_, variable_parameters_ );

    // Full-screen quad drawn as a triangle strip.
    FTextureVertex verts[ 4 ];
    verts[ 0 ].Position = FVector4( -1.0f,  1.0f, 0, 1.0f );
    verts[ 1 ].Position = FVector4(  1.0f,  1.0f, 0, 1.0f );
    verts[ 2 ].Position = FVector4( -1.0f, -1.0f, 0, 1.0f );
    verts[ 3 ].Position = FVector4(  1.0f, -1.0f, 0, 1.0f );
    verts[ 0 ].UV = FVector2D( 0, 0 );
    verts[ 1 ].UV = FVector2D( 1, 0 );
    verts[ 2 ].UV = FVector2D( 0, 1 );
    verts[ 3 ].UV = FVector2D( 1, 1 );

    DrawPrimitiveUP( RHICmdList, PT_TriangleStrip, 2, verts, sizeof( verts[ 0 ] ) );

    PixelShader->UnbindBuffers( RHICmdList );
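
For context, the enqueue call that kicks this off from the game thread looks roughly like this (simplified sketch; the command name, the runner class and ExecutePixelShaderInternal are stand-in names for my own code, which follows Temaran's plugin layout):

    ENQUEUE_UNIQUE_RENDER_COMMAND_ONEPARAMETER(
      FDrawProceduralTexture,              // arbitrary command type name
      FMyShaderRunner*, Runner, this,      // single parameter: the object that owns the code above
      {
        // Runs on the render thread and executes the function shown above.
        Runner->ExecutePixelShaderInternal();
      }
    );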

I'm then using the result in a material (screenshot attached as capture.png),

so that my procedural texture is projected onto our terrain through the material system. I can see that it is the View structure that has no data: debugging the shader by outputting different colors for different failure cases all traced back to this structure not holding the data I need. Do I need to populate some of the data myself, or is this a render-order issue (like how the decal system is deferred)?

Product Version: UE 4.10
asked Mar 31 '16 at 01:21 AM in Rendering by Wolflight

1 answer

OK, never mind, I figured it all out; shaders have a way of making me feel stupid a lot of the time. All I really needed was the inverse view-projection matrix to do the ray cast, so I bound that up in my shader variables (BEGIN_UNIFORM_BUFFER_STRUCT), passed it through, and did my ray trace with a bit of the following (the C++ declaration of TerrainVariables is sketched after the shader code):

 float2 uv_projection( float4 uv )
 {
   // Build a clip-space position for this pixel (flip Y, arbitrary depth)...
   float4 clip;
   clip.x =  2.0 * uv.x - 1.0;
   clip.y = -2.0 * uv.y + 1.0;
   clip.z =  0.5;
   clip.w =  1.0;

   // ...and unproject it with the inverse view-projection matrix passed in from C++.
   float4 world_pos = mul(clip, TerrainVariables.screen_to_world);
   world_pos.w    = 1.0 / world_pos.w;
   world_pos.xyz *= world_pos.w;

   // Ray from the camera through that world position, intersected with the ground plane.
   return intersect_ray_plane( world_pos.xyz, normalize( world_pos.xyz - TerrainVariables.view_origin ) );
 }
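
The C++ side is just a uniform buffer struct carrying the inverse view-projection matrix and the camera position. A minimal sketch, assuming the 4.10-era macros (the member names mirror the HLSL above; my actual declaration differs slightly):

  // Declared in the shader's header; "TerrainVariables" is the name the HLSL sees.
  BEGIN_UNIFORM_BUFFER_STRUCT( FTerrainVariables, )
    DECLARE_UNIFORM_BUFFER_STRUCT_MEMBER( FMatrix, screen_to_world )  // inverse view-projection
    DECLARE_UNIFORM_BUFFER_STRUCT_MEMBER( FVector, view_origin )      // camera world position
  END_UNIFORM_BUFFER_STRUCT( FTerrainVariables )

  // In the .cpp, tie the struct to the HLSL name used above:
  IMPLEMENT_UNIFORM_BUFFER_STRUCT( FTerrainVariables, TEXT("TerrainVariables") )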

I did end up poking around trying to figure out why my shader had an empty View object, and it was because I was not binding the FViewUniformShaderParameters to my specific shader with SetUniformBufferParameter.
I also had some issues with a delay on the view-projection matrix being out of sync with the camera, so the projection would lag. I managed to fix that by reconstructing the matrix using a thread-safe set of code from this post, but I'm still having some issues with the material update/projection occurring before we have rendered the asset from the active camera. I'm not sure how, or if, I can adjust when my render-thread task occurs, e.g. whether I can give it a higher priority.
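
For anyone hitting the same empty View problem, the binding I was missing looks roughly like this (sketch only; how you obtain the view's TUniformBufferRef<FViewUniformShaderParameters> depends on where you grab the scene view, and the member name differs between engine versions):

  // On the shader class (FTerrainShader here), bind the engine's View uniform buffer
  // before drawing, alongside the custom buffers.
  void FTerrainShader::SetViewParameters(
    FRHICommandList& RHICmdList,
    const TUniformBufferRef<FViewUniformShaderParameters>& ViewUniformBuffer )
  {
    SetUniformBufferParameter(
      RHICmdList,
      GetPixelShader(),
      GetUniformBufferParameter<FViewUniformShaderParameters>(),
      ViewUniformBuffer );
  }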


answered Mar 31 '16 at 06:23 AM by Wolflight