World to Screen space - aligning screen space UVs to world location

So, I have a screen-space effect set up, which cuts pieces out of certain objects to allow me to see behind them. It currently just cuts out a circle in the centre of the screen, with the effect mapped in screen space. However, the purpose of this is to allow the player to see their character through obstacles, and the camera is not guaranteed to be centred on the player. The effect is obtained by sampling a circle texture using custom screen-aligned UV coordinates and applying the result to the opacity mask channel of specific objects.
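In rough terms, the current centred cut-out amounts to evaluating a mask per pixel from its screen-space UV. A minimal Python sketch of that logic, using an analytic circle in place of the texture sample (the `radius` value is illustrative, not from the actual material):

```python
def circle_mask(uv, centre=(0.5, 0.5), radius=0.25):
    """Return 0.0 inside the cut-out circle, 1.0 outside.

    uv: screen-space coordinates in [0, 1] x [0, 1].
    An analytic circle stands in for the circle texture sample;
    in the real material this value feeds the opacity mask.
    """
    dx = uv[0] - centre[0]
    dy = uv[1] - centre[1]
    inside = dx * dx + dy * dy < radius * radius
    return 0.0 if inside else 1.0
```

Recentring the effect on the player then comes down to replacing the fixed `centre` with the player's position expressed in the same UV space, which is exactly the conversion the question is about.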

I would like to make this effect focus on the player rather than the centre of the screen, but I am struggling to do so, because it requires converting the player's world-space position into screen-space coordinates. I have tried various options, but none work as I would like:

  • Shader Transform Vector node from World to View space (coordinates are not even close to accurate)
  • Shader WorldToClipSpace node (best so far. Coordinates are at least on screen, but not focusing on the player)
  • Blueprint Convert World Location to Screen Location node (same results as World to View shader conversion)
  • Custom HLSL nodes (documentation on the available functions, beyond basic HLSL syntax, is so sparse that I wasn’t able to get any working code out of them)

I have tried several variations on these, but none work correctly.

Can anyone help me to create screen space UV coordinates that centre on a point in world space?

Hmm. I seem to have solved this, but I’m not sure why it works. Perhaps someone could shed some light on this?

My final transformation uses the WorldToClipSpace node, but then multiplies the result by the vector (-0.5, 0.5). This appears to remap the coordinates correctly, but I am not sure why, and I am concerned that it will not hold up consistently later.
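For what it's worth, the usual path from clip space to texture UVs is a perspective divide followed by a remap from [-1, 1] NDC to [0, 1] UVs with a Y flip, which is where half-scale factors like yours tend to come from. A Python sketch of that pipeline, assuming D3D-style conventions rather than anything taken from UE's nodes (the exact signs depend on which axis the renderer treats as "up" in clip space):

```python
def clip_to_uv(clip):
    """Map a clip-space position (x, y, z, w) to texture UVs.

    Assumes: after the perspective divide, x and y are in [-1, 1]
    with y pointing up, while UVs run from (0, 0) at the top-left
    to (1, 1) at the bottom-right.
    """
    x, y, z, w = clip
    ndc_x = x / w           # perspective divide
    ndc_y = y / w
    u = ndc_x * 0.5 + 0.5   # [-1, 1] -> [0, 1]
    v = ndc_y * -0.5 + 0.5  # [-1, 1] -> [1, 0]: flip Y for top-left-origin UVs
    return (u, v)
```

Under these assumptions, offsetting each pixel's screen UV by (player UV − 0.5) before the circle texture sample would recentre the circle on the player; a working (-0.5, 0.5) factor is presumably this same remap with one axis mirrored by the engine's own convention.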

Nice find! Some potentially useful info here?

http://http.developer.nvidia.com/CgTutorial/cg_tutorial_chapter04.html

Haven’t really looked into this but I’m guessing it’s the difference between Parameters.ScreenPosition.xy and ScreenAlignedPosition(Parameters.ScreenPosition).xy.

IIRC, the range on ScreenPosition.xy is (-1,1) and ScreenPosition.y “starts” at the top of the screen, whereas ScreenAlignedPosition(…).y “starts” at the bottom.
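If that recollection is right, the two differ by exactly a half-scale, half-offset remap plus a Y flip, which would account for the (±0.5) factors above. A quick numeric sketch of that remap, under the assumed ranges rather than UE's documented behaviour:

```python
def remap_flip_y(pos):
    """Remap (x, y) from [-1, 1] to [0, 1], flipping the y axis --
    the kind of transform that would separate the two
    ScreenPosition conventions described above."""
    x, y = pos
    return ((x + 1.0) * 0.5, (1.0 - y) * 0.5)
```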