How can I set a TouchInterface item's position on the screen with Blueprint?

Hello guys! I come here with a specific question: I can't figure out how Touch Interface item positioning works.

First of all, I created my WidgetBlueprint and TouchInterface, but the positions of my virtual joysticks don't match when I change the size of the play viewport.

I know that UMG elements have anchors, but in the TouchInterface we only get fields with coordinates and nothing else.

I figured I could GET the first/second item of the TouchInterface and SET its screen location to where I planned it in my WidgetBlueprint, but I don't know HOW.

So here is my question: how can I set a TouchInterface item's position on the screen with Blueprint? Or am I doing something wrong, and do you have another idea for my case?

I haven't found any way to do that with the current touch interface. It's like a static object, and you can't change its elements at runtime. It's also a bad idea to combine a widget and the touch interface, because I ran into issues with touch order.

All you can do is add buttons and elements directly to the touch interface and map input to them.

Here is my setup for crouch and jump buttons:
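Roughly, in C++ terms, that kind of setup looks like the sketch below. The field names follow the stock FTouchInputControl struct from GameFramework/TouchInterface.h; the coordinates, key choices, and the helper function itself are only illustrative, not my exact values, and they assume the Controls array on UTouchInterface is accessible from game code (the real setup is just done in the Touch Interface asset's details panel):

```cpp
#include "GameFramework/TouchInterface.h"
#include "InputCoreTypes.h"

// Illustrative helper: adds jump and crouch buttons to a touch interface and
// maps each of them to a gamepad key, which is then bound to the Jump/Crouch
// actions in the project's input mappings.
static void AddCrouchAndJumpButtons(UTouchInterface* TouchInterface)
{
	if (!TouchInterface)
	{
		return;
	}

	// Center values <= 1.0 are treated as a fraction of the screen size,
	// so the buttons keep their relative position when the viewport changes.
	FTouchInputControl Jump;
	Jump.Center          = FVector2D(0.85f, 0.80f);
	Jump.VisualSize      = FVector2D(80.0f, 80.0f);
	Jump.InteractionSize = FVector2D(80.0f, 80.0f);
	Jump.MainInputKey    = EKeys::Gamepad_FaceButton_Bottom; // bound to Jump in the input mappings
	// Jump.Image1 / Jump.Image2 would point at the button textures.
	TouchInterface->Controls.Add(Jump);

	FTouchInputControl Crouch;
	Crouch.Center          = FVector2D(0.85f, 0.65f);
	Crouch.VisualSize      = FVector2D(80.0f, 80.0f);
	Crouch.InteractionSize = FVector2D(80.0f, 80.0f);
	Crouch.MainInputKey    = EKeys::Gamepad_FaceButton_Right; // bound to Crouch in the input mappings
	TouchInterface->Controls.Add(Crouch);
}
```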

So either use only custom widgets, or create a modified version of the touch interface in C++ (rough sketch below).

It's the only way, I think.
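For the original question about moving things at runtime, a minimal sketch of that "modified version in C++" could look like this. The class name, function name, and re-activation trick are my own assumptions, not an engine API; it relies on UTouchInterface::Controls being accessible to subclasses and on calling APlayerController::ActivateTouchInterface again to re-apply the controls:

```cpp
// MyDynamicTouchInterface.h
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "GameFramework/TouchInterface.h"
#include "MyDynamicTouchInterface.generated.h"

UCLASS(Blueprintable, BlueprintType)
class UMyDynamicTouchInterface : public UTouchInterface
{
	GENERATED_BODY()

public:
	// Moves one control of this touch interface and re-activates it on the given
	// player controller so the on-screen virtual joystick picks up the change.
	// NewCenter follows the usual convention: values <= 1.0 are a fraction of the
	// screen, values > 1.0 are absolute pixels.
	UFUNCTION(BlueprintCallable, Category = "Touch Interface")
	void SetControlCenter(APlayerController* PlayerController, int32 ControlIndex, FVector2D NewCenter)
	{
		if (!PlayerController || !Controls.IsValidIndex(ControlIndex))
		{
			return;
		}

		Controls[ControlIndex].Center = NewCenter;

		// Re-activating applies the updated Controls array to the virtual joystick.
		PlayerController->ActivateTouchInterface(this);
	}
};
```

From Blueprint you would then keep a reference to this touch interface asset, activate it on your player controller, and call SetControlCenter whenever you want to move a joystick (for example, based on the viewport size you read in your widget).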

But this is extremely inconvenient! I would like to see the end result immediately, as in UMG, and not pick coordinates by typing in numbers :frowning: