Hi!
I’m making a small demo project for Windows 10 with multi-touch input support.
I handle the generic touch input events (TouchPressed, TouchMoved and TouchReleased) in a custom PlayerController blueprint. Based on these events, some maths is performed and my custom Tap or Zoom events are fired.
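For context, here is a minimal, engine-agnostic sketch of the kind of maths involved: tracking active touches keyed by finger index and computing the pinch distance used for a Zoom event. The `TouchTracker` type and its method names are purely illustrative assumptions, not Unreal API; the actual logic lives in blueprint.

```cpp
#include <cmath>
#include <map>
#include <utility>

// Illustrative sketch (not Unreal API): track active touches by finger
// index, as a PlayerController might on TouchPressed/Moved/Released.
struct TouchTracker {
    std::map<int, std::pair<float, float>> touches; // finger index -> (x, y)

    void Pressed(int finger, float x, float y)  { touches[finger] = {x, y}; }
    void Moved(int finger, float x, float y)    { touches[finger] = {x, y}; }
    void Released(int finger)                   { touches.erase(finger); }

    // Distance between the first two active touches; a Zoom event can be
    // derived from how this distance changes between frames.
    float PinchDistance() const {
        if (touches.size() < 2) return 0.0f;
        auto it = touches.begin();
        const auto [x1, y1] = it->second;
        ++it;
        const auto [x2, y2] = it->second;
        return std::hypot(x2 - x1, y2 - y1);
    }
};
```

Keying by whatever finger index the engine reports (rather than assuming indices start at 1) keeps the logic robust regardless of which index the first touch arrives with.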
Until now I used the mouse as a touch input device by enabling the “Use mouse for touch” checkbox in the project settings, and everything worked fine: only one “Touch pressed” event, with finger index “Touch 1”, was fired when I clicked the screen.
But now I’ve tested my project with “Use mouse for touch” enabled on a 46″ multi-touch monitor and got strange results: two “Touch pressed” events are fired when I touch the screen, and then two “Touch moved” events when I move my finger. The first event has finger index “Touch 2” and the second has “Touch 1”, like the following:
Touch2 pressed at X=1133.200 Y=565.140
Touch2 moved to X=1125.700 Y=569.100
Touch1 pressed at X=1121.000 Y=569.000
Touch1 moved to X=1121.000 Y=569.000
Then I turned off the “Use mouse for touch” setting, and now only one touch event is fired, but it has the “Touch 2” finger index. If I touch the screen with two or more fingers, I get finger indices 2, 3, 4… not starting from 1.
I guess finger index 1 is “reserved” for the mouse?
So, is this a bug or a feature?