Oculus Touch Input Mapping

Howdy Folks

I’ve just started prototyping a few ideas in Unreal for a VR project, but I’m having trouble with input from the Oculus Touch motion controllers. I have keyboard inputs set up, and I’d love to move them over to the Touch controllers.

I’m confused about what the Touch controller button scheme is inside Unreal. Following on from the VR example, I’m trying to reference the MotionController buttons, but I’m lost as to which button each of the ‘FaceButtons’ refers to.

I’m trying to reference input from the Right B and Left Y buttons.

Thanks

Here you go for the buttons (there’s a quick C++ binding sketch after the list too):

A == MotionController (R) FaceButton1

B == MotionController (R) FaceButton2

Y == MotionController (L) FaceButton2

X == MotionController (L) FaceButton1
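
If it helps, the same mapping can also be bound in C++ rather than through the editor’s Input settings. This is just a minimal sketch of mine, assuming a pawn class I’ve called AMyVRPawn; it relies on the generic EKeys::MotionController_*_FaceButton* keys that exist in the UE4 versions this thread is about (I believe newer engine releases deprecate these in favour of device-specific key names).

```cpp
// MyVRPawn.cpp -- minimal sketch; AMyVRPawn and the OnRight*/OnLeft* handlers are
// placeholder names and need matching declarations in the header.
#include "MyVRPawn.h"
#include "Components/InputComponent.h"
#include "InputCoreTypes.h"

void AMyVRPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Right controller: A = FaceButton1, B = FaceButton2
    PlayerInputComponent->BindKey(EKeys::MotionController_Right_FaceButton1, IE_Pressed, this, &AMyVRPawn::OnRightA);
    PlayerInputComponent->BindKey(EKeys::MotionController_Right_FaceButton2, IE_Pressed, this, &AMyVRPawn::OnRightB);

    // Left controller: X = FaceButton1, Y = FaceButton2
    PlayerInputComponent->BindKey(EKeys::MotionController_Left_FaceButton1, IE_Pressed, this, &AMyVRPawn::OnLeftX);
    PlayerInputComponent->BindKey(EKeys::MotionController_Left_FaceButton2, IE_Pressed, this, &AMyVRPawn::OnLeftY);
}

void AMyVRPawn::OnLeftY()
{
    UE_LOG(LogTemp, Log, TEXT("Left Y (FaceButton2) pressed"));
}

// OnRightA, OnRightB and OnLeftX follow the same pattern.
```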

One user posted a full diagram of the buttons here:

Brilliant! Thanks so much

How did you get the input buttons to trigger?

Hi, I’ve put an "Any Key" event in my character Blueprint, and I print the name of the key I press on my Oculus Touch using the "Get Key Display Name" node.

Do you know why, when I press Y on my Oculus Touch, it prints “Motion Controller (L) Shoulder”? No button prints a FaceButton name, and pressing A or X shows nothing.

I disabled the OculusVR plugin in the settings to make things work with all headsets and controllers.

In fact, it’s pressing the thumbstick in a specific direction that prints FaceButton 1 to 4, for both controllers.
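
For what it’s worth, the same debug trick can be done in C++ too. This is only a rough sketch of mine (AMyCharacter and OnAnyKey are placeholder names), and it assumes the FKey-parameter overload of BindKey so the handler can see which key fired, mirroring the Blueprint "Any Key" + "Get Key Display Name" setup:

```cpp
// MyCharacter.cpp -- sketch only; AMyCharacter and OnAnyKey are placeholder names.
#include "MyCharacter.h"
#include "Components/InputComponent.h"
#include "Engine/Engine.h"
#include "InputCoreTypes.h"

void AMyCharacter::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Equivalent of the Blueprint "Any Key" event: fire on every key/button press
    // and hand the FKey to the handler so we can read its display name.
    PlayerInputComponent->BindKey(EKeys::AnyKey, IE_Pressed, this, &AMyCharacter::OnAnyKey);
}

// Declared in the header as: void OnAnyKey(FKey Key);
void AMyCharacter::OnAnyKey(FKey Key)
{
    // Same idea as "Get Key Display Name" + "Print String" in Blueprint.
    if (GEngine)
    {
        GEngine->AddOnScreenDebugMessage(-1, 2.0f, FColor::Green, Key.GetDisplayName().ToString());
    }
}
```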

This page is nice. I’ve recently found that the right Start/menu button is unmappable; it shows up as “reserved” on the diagram. The way to handle Oculus pausing is to use a “Has Input Focus” node in a Blueprint.
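
In case anyone wants that focus check outside Blueprint, here’s a rough C++ sketch of the same idea. I’m assuming the “Has Input Focus” node corresponds to UOculusFunctionLibrary::HasInputFocus in the OculusHMD module (worth verifying in your engine version); the AFocusPauseHandler actor and the pause/unpause reaction are just placeholders of mine.

```cpp
// FocusPauseHandler.cpp -- sketch only. Assumes the OculusHMD module is in the
// Build.cs dependencies and that UOculusFunctionLibrary::HasInputFocus backs the
// Blueprint "Has Input Focus" node; AFocusPauseHandler and bHadFocus are placeholders.
#include "FocusPauseHandler.h"
#include "Kismet/GameplayStatics.h"
#include "OculusFunctionLibrary.h"

AFocusPauseHandler::AFocusPauseHandler()
{
    PrimaryActorTick.bCanEverTick = true;
    // Keep ticking while the game is paused, otherwise we could never unpause.
    PrimaryActorTick.bTickEvenWhenPaused = true;
}

void AFocusPauseHandler::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // When the player opens the Oculus system menu (the "reserved" right Start button),
    // the app loses input focus; pause until focus comes back.
    const bool bHasFocus = UOculusFunctionLibrary::HasInputFocus();
    if (bHasFocus != bHadFocus) // bHadFocus: bool member declared in the header, default true
    {
        UGameplayStatics::SetGamePaused(this, !bHasFocus);
        bHadFocus = bHasFocus;
    }
}
```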


Where in the hell are you seeing those inputs? All I’m finding is this

WHERE