Touch input location is set to 0,0,0 in final execution

So in my game, I store the player's first touch press as the beginning of a spline, and then store and update the subsequent Moved presses as the end of the spline. The spline is meant to help with aiming and show the power of your shot. Nine times out of ten it works perfectly. However, when I do it very quickly, the second location for my spline comes back as 0,0,0, even though my inputs were nowhere near that location. I found it really baffling, because to check my work I printed that vector on every Moved event, and it updated to the proper world location every time except for the last value. My guess is that when I let go, the Touch Input's final event stores the "location" of a touch that no longer exists, which would be 0,0,0. At least that's my guess.

You can see my BP and the issue below. I ended up solving it, after a bunch of trial and error, by updating the second spline point's location vector on Event Tick instead of from the Input Touch node.
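Another way to work around the same symptom, sketched in plain Python rather than Blueprints (all names here are illustrative, not actual engine identifiers): cache the last good Moved location and ignore the spurious 0,0,0 that arrives on release.

```python
# Minimal sketch of the workaround logic. The spurious final event
# reports exactly (0, 0, 0), so we only accept non-zero locations
# and keep the last good Moved location otherwise.

ZERO = (0.0, 0.0, 0.0)

class SplineAim:
    def __init__(self):
        self.start = None  # first touch press (spline start)
        self.end = None    # latest valid touch location (spline end)

    def on_pressed(self, world_loc):
        self.start = world_loc

    def on_moved(self, world_loc):
        if world_loc != ZERO:   # ignore the bogus zero location
            self.end = world_loc

    def on_released(self, world_loc):
        if world_loc != ZERO:
            self.end = world_loc
        # otherwise keep the last good Moved location

aim = SplineAim()
aim.on_pressed((10.0, 0.0, 5.0))
aim.on_moved((40.0, 0.0, 5.0))
aim.on_released((0.0, 0.0, 0.0))  # the bad final event is discarded
```

This keeps the fix in the touch events themselves instead of moving the update onto Tick.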

Hi,

I have not been able to reproduce this error on my end. Have you tried using the touch location from the Input Touch event? If you use that, you should be able to do the trace down accurately without using Tick.

I was never able to get the correct touch location from the touch input, which is why I went with Get Hit Result Under Finger. Every time I tried to use the touch input's location vector, it was radically different from where I was touching. I played with it for a few days before moving on; no matter what I did, I couldn't compensate correctly. Maybe I'll take another stab at it tonight.

Yeah, it's like I remember. The location given by the InputTouch node is just wrong; it's offset from where it's supposed to be. Is there some extra step to turn that vector into something usable? Some conversion or whatnot? I just assumed the node was broken.

I'm using the touch inputs to place points for a mesh spline, and from the way it's drawing I can tell it's at least offset: it lands in the wrong quadrant relative to where I press, and is rotated 90 degrees. I'm not feeding it to and from the screen, so I didn't think I'd need to do any real screen-space conversion.

And I don't think I'm alone; this might be where I got the idea to use the under-cursor approach:

Yes, what I would do is get that location, as I believe it is the location on the viewport that it is registering (or a point just outside of the viewport). Once you have it, run a line trace along the forward vector of the camera for X units; this should let you trace down to the actors in question, just as a touch trace by channel would. The problem you are currently experiencing is that the trace by channel requires an active touch input: once you have released the touch, there is nothing for it to trace from, as it doesn't inherently remember the touch location. By using the release location and tracing down, you should get the same effect. Can you try this and see if it works for you?
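The trace endpoints described above are just vector arithmetic, so here is a minimal sketch in plain Python (no engine calls; the function name and parameters are illustrative): start the trace at the world-space touch location and end it X units along the camera's forward vector.

```python
import math

def trace_endpoints(touch_world, cam_forward, distance):
    """Return (start, end) points for a line trace: start at the
    touch location, end `distance` world units along the camera's
    forward vector."""
    # Normalize the forward vector so `distance` is in world units.
    mag = math.sqrt(sum(c * c for c in cam_forward))
    fwd = tuple(c / mag for c in cam_forward)
    end = tuple(t + f * distance for t, f in zip(touch_world, fwd))
    return touch_world, end

# A top-down camera looking straight down (-Z), tracing 10,000 units:
start, end = trace_endpoints((100.0, 50.0, 300.0), (0.0, 0.0, -1.0), 10000.0)
```

The resulting start/end pair is what you would feed into a line trace by channel.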

Thank you for the responses. I'll give that a shot tonight. :slight_smile:

So I attempted what you suggested last night, but I wasn't 100% certain what you meant.

From what I understand, you're suggesting I take the touch input vector as the start of a line trace and use the camera's forward vector multiplied by some distance as the end of the trace? If I understand you correctly, that means on the Pressed, Released, and Moved executions I'll be doing a trace to get a location which I can then store?

If that is so, I must have failed somewhere, because where I pressed was radically different from where my debug meshes were drawn, very far off screen. I'd love to get this working if it's more efficient than my current setup.

Thank you for your time.

Can you show me a screenshot of your trace setup? I may be able to see something that will help fix the trace to work as intended.

Hi,

We have not heard from you in several days. I am marking this as answered for tracking purposes. If you are still experiencing this error, please comment with the requested information.

For anyone else with the same problem: instead of bothering with the vector from TouchInput (which was giving offset coordinates) and doing a trace, I ended up just converting the mouse location to world space from the Player Controller, and got what I needed. I hope it can help.
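Converting a screen location to world space gives you a ray (a world-space origin plus a direction), not a point on the playfield, so you still need to intersect that ray with your ground plane. A minimal sketch of that last step, assuming a horizontal plane (the function and plane choice are illustrative, not engine API):

```python
def ray_plane_z(origin, direction, plane_z=0.0):
    """Intersect a ray with the horizontal plane z = plane_z.
    Returns the intersection point, or None if the ray is parallel
    to the plane or the plane is behind the ray origin."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0.0:
        return None  # ray runs parallel to the plane
    t = (plane_z - oz) / dz
    if t < 0.0:
        return None  # plane is behind the camera
    return (ox + dx * t, oy + dy * t, oz + dz * t)

# Camera 500 units up, looking straight down:
hit = ray_plane_z((100.0, 200.0, 500.0), (0.0, 0.0, -1.0))
```

A line trace against world geometry does the same job when the ground isn't a flat plane.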

So I wasn't sure if I followed what you suggested 100%, but here is what it looks like. You can see in the blueprint below that the trace is offset from where my mouse is making the touch press (the first press and latest press are represented by the beginning and end of the white line). For some reason my result is offset, and regardless of how I try to correct for it, it ends up off camera by a little or a lot. I appreciate all the help; I'm just confused why mine isn't working. If it were working correctly, wouldn't my lines be drawn at the height of the camera and not on the play space?

Hi,

This will require just a bit more math than I initially stated. Try something like this:

You will need to run a line trace to ensure everything ends up at the same Y location, but this should allow you to translate the screen-space location to a world location based on where your touch input is.
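Pinning every point to the same Y can also be done with a little algebra instead of a trace: slide the converted point along the camera's forward vector until its Y matches the depth your spline points should share. A hypothetical sketch of that math (names are illustrative):

```python
def project_to_y(point, forward, target_y):
    """Move `point` along `forward` until its Y equals `target_y`.
    Returns None if the forward vector never changes Y."""
    px, py, pz = point
    fx, fy, fz = forward
    if fy == 0.0:
        return None  # forward vector is parallel to the Y plane
    t = (target_y - py) / fy
    return (px + fx * t, target_y, pz + fz * t)

# A point at Y = 300, camera facing -Y, target depth Y = 100:
snapped = project_to_y((10.0, 300.0, 50.0), (0.0, -1.0, 0.0), 100.0)
```

A line trace gives the same result against actual geometry; this version only works when the target depth is a known constant.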
