3D cursor for VR 3D UMG

Hello, I’m very happy that 3D UI with UMG has finally arrived.
The problem is that I can’t find a way to interact with buttons by any means other than the cursor (traces, overlaps).
That would be fine if my game were meant to be played on a flat screen; however, it is a VR project, so using a 2D cursor in VR is a no-go. Since the cursor is displayed across the whole screen rather than once per eye, it is useless, or at least very difficult to use.
I’d also like to use the HMD orientation and a Razer Hydra to point at buttons.
But it seems the line trace and overlap events are not yet able to detect what is inside a 3D UMG widget.

Is there a way to achieve this, or do I have to wait for the next update?
Thanks

Hello,

I have a suggestion you could try, though depending on how your game functions it may or may not be a viable workaround. You could make a widget component for each part of your widget that you would like to interact with. For example, make a widget component with a drawn size of 60 by 40 that contains a button that is also 60 by 40. Once this is set up, you can use the hit event to fake your click events. With this workaround you would build your UI in the Components tab of the actor blueprint that holds your 3D widgets. I hope this helps.

Make it a great day

Hi Rudy!
If I understand correctly: if I want to make, let’s say, a keypad (digicode) panel, do I have to make a widget for each button and then place them as 3D widgets in the actor’s components, or make them children inside another widget (like in the demo from the Twitch livestream)?

If it’s the first solution, that is what I wanted to avoid.
If it’s the second, wouldn’t all the sub-widgets overlap with the main one?
Could you post an image of what you mean?

I will give the second solution a try, the way I understood it.

Well, it seems I can’t get it working :-/

I experimented a bit to see how and why the classic cursor isn’t working. I already knew that the screen being divided in two would cause a problem, but here are some explanations.

So what we have here is the game image on a flat screen. The red square represents a widget button.
No problem here; we just have to click it.

Then we have the stereoscopic 3D version of the render. What I thought at first is that we had to click either one of the squares. The fact is that the cursor can only be on one eye.

I removed the lenses of my DK2 to see what was going on, and I discovered that even if the game is rendered stereoscopically, all the interactions virtually happen on a flat screen. It means the red square in this image is at the exact same position as in the first, non-stereo render.

I don’t know if that gives you any direction on the way to go, but…

Hello,

I may have a slightly easier way to do what I mentioned before.

Steps:

  1. Set up your widget as normal
  2. Add this to a 3D widget component
  3. Add a static mesh for every button (I used planes)
  4. Place the meshes in the same place as the buttons
  5. Set the meshes to Hidden in Game
  6. Set the widget component to have no collision (so that the projectile will only hit the static meshes)
  7. Use the hit events generated from the planes as your new “click” events
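For anyone more comfortable reading code than Blueprint steps, the routing idea in steps 3–7 can be sketched outside Unreal like this. All the names here (HitProxyRouter, BindPlane, OnComponentHit) are made up for illustration — this is not UE4 API, just the shape of the idea:

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>

// Sketch of the hit-proxy idea: each invisible plane mesh is identified
// by name, and a hit on it is routed to the "click" handler of the
// matching widget button.
class HitProxyRouter {
public:
    // Register the handler that a hit on the named plane should fire.
    void BindPlane(const std::string& PlaneName, std::function<void()> OnClicked) {
        Handlers[PlaneName] = std::move(OnClicked);
    }

    // Called from the plane's hit event; returns false if the plane is unknown.
    bool OnComponentHit(const std::string& PlaneName) {
        auto It = Handlers.find(PlaneName);
        if (It == Handlers.end())
            return false;
        It->second();
        return true;
    }

private:
    std::map<std::string, std::function<void()>> Handlers;
};
```

In Unreal terms, BindPlane would correspond to wiring each plane's hit event in the actor blueprint, and the handler body would be whatever the button's OnClicked was supposed to do.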

I hope this helps

Make it a great day

Hi Rudy, thanks for the explanation. When I spoke about a picture, I should have been more precise: I meant a blueprint picture. I can’t get the “OnClicked” event from the widget button to appear. The only events from the widget I can see are custom events. Also, when I try to set up anything concerning a widget, the editor crashes.
I tried with a cast and with assign/bind event (even if I didn’t totally understand that system), but the call event does not seem to exist.

Hello,

After further investigation into the method I suggested above, I have found that it will not work. When I first tried it by adding plane components, I was using print strings to confirm my results. However, I later found an issue that prevents custom events from firing off correctly inside widgets when they originate from another blueprint.

I do, however, agree that alternate methods of interaction with 3D widgets would be a good idea. I have submitted a request (UE-7621) to the developers for further consideration. I will provide updates with any pertinent information as it becomes available. I do not currently have a workaround for this issue. Thank you for your time and information.

Make it a great day

Hey all, I’m late to the game and not as knowledgeable as you in UE4, and probably in programming in general… so I’m going to ask a question in more layman’s terms. (I’m working on a VR project as well, where I’m using line traces and a timer to detect an “active” state.)

I’ve created a UMG_Object that is a button, then I created a BP_Object that has the UMG_Object as a component. When I do a line trace, should this BP at least return an Actor? Because I’m not even getting that… unless I’m doing something wrong. (Do I need to manually add a collider of some sort?)

THEN, if I can get that working, should I be able to just get the UWidgetComponent off of my actor, and through that get my UUserWidget object (using GetUserWidgetObject())?

Thanks a bunch!

EDIT: I’m not going to delete this in case someone else does what I’ve done… but the issue was the collision preset I was using: it was set to UI. I went ahead and made one called GazePoint that blocks a trace coming from the camera.

Hey all…

I’ve created a 3D user interface object using UMG and a Widget Blueprint. I have a class called GazePoint that derives UUserWidget, and it has a UButton variable called MyButton:

/** The button tied to the gaze point. */
UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "GazePoint")
UButton *MyButton;

In my UMG “Blueprint Properties”, I set the Parent Class property in Globals to GazePoint. I then created a graph to initialize MyButton:

EDIT - I was also able to do this in C++:

MyButton = Cast<UButton>(GetWidgetFromName(FName(TEXT("Btn_GazePoint"))));

Once I was able to get a reference to my GazePoint class, I checked whether my pointer contained a value by simply changing the background color of my button (I chopped my code a bit to save space):

TArray<UWidgetComponent*> WidgetComponents;
Actor->GetComponents(WidgetComponents);

if (WidgetComponents.Num() > 0)
{
    UWidgetComponent *MyComponent = WidgetComponents[0];

    // GetUserWidgetObject() returns the UUserWidget instance driving the component
    UGazePoint *CurGazePoint = Cast<UGazePoint>(MyComponent->GetUserWidgetObject());
    if (CurGazePoint && CurGazePoint->MyButton)
    {
        CurGazePoint->MyButton->SetBackgroundColor(FLinearColor::Red);
    }
}

Quick blueprint set up as well to give you a basic idea of the above code. You will need something to begin the execution of the blueprint nodes:

After this… my button’s background turned red.

I hope this is what you were looking for. Feel free to correct me if I did something wrong, if something didn’t make sense, or if I was just completely off the mark for what you’re looking for. Best of luck!

  • Austin

Hi, well that sounds nice.
The issue I was encountering was that I could trace the widget component, but not the items inside it (buttons, sliders…), so I couldn’t interact with them.
So if you’ve found out how, go ahead, I’m listening! :)

Just in case it doesn’t e-mail you to tell you I’ve updated my answer… I’m responding to your comment in the hope this notifies you :)

Hi, sorry, it updated shortly after I posted mine.
Unfortunately I’m not familiar with C++ and only work with Blueprints. If you do have a Blueprint-only solution, it will be greatly welcome :D

Here is a quick BP I did up… I didn’t test it, but I hope it gives you an idea of what to do. You will need something to trigger the execution of the nodes so you can do your casting and such… I’ll add this to the bottom of my answer as well.

Hi Austin, I gave your solution a try; unfortunately it doesn’t solve my problem.
What I’m trying to achieve is to trigger an “On Clicked” event on the specific button that was hit (suppose there are several buttons on the widget; let’s say it’s the panel they showed in the Twitch livestream).
The problem is that triggering an event from a BP to a widget doesn’t work; those events simply don’t appear in the right-click menu.
The other problem is getting a reference to the specific button that was hit.

Ohhhh, yeah. It seems you can’t call the Broadcast() functionality of the OnClicked event through Blueprints… that’s odd. But it looks like that’s not what you want anyway… you want an object with a delegate that passes the clicked button as a parameter. That will require a new class, I think.

Sorry I couldn’t help that much! It seems you’ll have to wait for them to expose the Broadcast functionality for delegates.

Hello,

I have come up with another method. However, it can only be done in 4.7 at the moment and relies on a tick event. It uses a sort of custom cursor: just an image that follows the mouse position. But since this is done inside a widget, when it is applied to a 3D widget the custom cursor only shows up on that 3D widget. This will allow you to interact with a 3D widget even in VR. I hope that this helps.

Here is the example that I have come up with:

This is what it looks like in game. You can see the green square (my custom cursor) applied to a 3D widget. This means it is limited to the plane that the 3D widget is applied to.

This is the event graph for my widget. I would suggest breaking parts of this out into functions when you start to add more buttons. It simply takes a bool from the player character that tells it when the mouse button is down (On Mouse Down cannot be used for various reasons; one is that you would need to add the widget to the viewport for it to work correctly). After that, it opens and closes a gate depending on whether the mouse button is down and whether the image is over the button.
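Stripped of the Blueprint nodes, the gate logic described above boils down to something like this standalone C++ sketch (ClickGate and Rect are hypothetical names for illustration, not engine types — the real thing is the Gate node plus a Branch in the widget graph):

```cpp
#include <cassert>

// Axis-aligned button rectangle in widget-space coordinates.
struct Rect { float X, Y, W, H; };

inline bool Contains(const Rect& R, float PX, float PY) {
    return PX >= R.X && PX <= R.X + R.W && PY >= R.Y && PY <= R.Y + R.H;
}

// One widget "tick": fires a simulated click only on the tick where the
// mouse button is down AND the cursor image is over the button while the
// gate was closed — so holding the button does not re-fire every frame.
class ClickGate {
public:
    bool Tick(bool bMouseDown, const Rect& Button, float CursorX, float CursorY) {
        const bool bOver = Contains(Button, CursorX, CursorY);
        const bool bShouldFire = bMouseDown && bOver && !bGateOpen;
        bGateOpen = bMouseDown && bOver;   // gate stays open while held over the button
        return bShouldFire;
    }
private:
    bool bGateOpen = false;
};
```

The bool fed to Tick is the "mouse is down" flag read from the player character, exactly as in the event graph above.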

This is my character blueprint.


Make it a great day

Just wanted to say that I used this approach, along with some scrolling hacks, for my VR browser

http://i.imgur.com/oARViiA.gif

and it worked well! I do hope though that down the line Epic makes custom input to UMG widgets a bit more streamlined and less dependent on hacks.

I took a different route to 3D VR gaze input: I implemented basic gaze-to-actor-with-widget-component and widget-cursor display in Blueprint, then made a GazeInput plugin using code adapted from SlateApplication’s input-processing code. This way the gaze input works with all widgets as-is, without the need to add new detection geometry or create new widget types.
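The core of the gaze-to-widget step is a ray/plane intersection converted into widget-local 2D coordinates, which can then be fed to the widget as a cursor position. Roughly something like this standalone sketch — Vec3, WidgetPlane, and GazeHit are stand-in names for illustration, not Unreal or Slate API:

```cpp
#include <cassert>
#include <cmath>
#include <optional>
#include <utility>

struct Vec3 {
    float X, Y, Z;
    Vec3 operator-(const Vec3& O) const { return {X - O.X, Y - O.Y, Z - O.Z}; }
    Vec3 operator+(const Vec3& O) const { return {X + O.X, Y + O.Y, Z + O.Z}; }
    Vec3 operator*(float S) const { return {X * S, Y * S, Z * S}; }
    float Dot(const Vec3& O) const { return X * O.X + Y * O.Y + Z * O.Z; }
};

struct WidgetPlane {
    Vec3 Origin;          // corner of the widget in world space
    Vec3 Normal;          // facing direction of the widget plane
    Vec3 AxisU, AxisV;    // unit vectors along widget width and height
    float Width, Height;  // drawn size in world units
};

// Returns the widget-local (u, v) hit position of the gaze ray, or
// nothing if the ray misses the widget's plane or its bounds.
std::optional<std::pair<float, float>>
GazeHit(const Vec3& EyePos, const Vec3& GazeDir, const WidgetPlane& W) {
    const float Denom = GazeDir.Dot(W.Normal);
    if (std::fabs(Denom) < 1e-6f)
        return std::nullopt;               // gazing parallel to the widget
    const float T = (W.Origin - EyePos).Dot(W.Normal) / Denom;
    if (T < 0.0f)
        return std::nullopt;               // widget is behind the viewer
    const Vec3 Hit = EyePos + GazeDir * T;
    const Vec3 Local = Hit - W.Origin;
    const float U = Local.Dot(W.AxisU);
    const float V = Local.Dot(W.AxisV);
    if (U < 0.0f || U > W.Width || V < 0.0f || V > W.Height)
        return std::nullopt;               // outside the widget rectangle
    return std::make_pair(U, V);
}
```

The plugin approach then hands that (u, v) position to Slate's input-processing path so ordinary buttons react to it; the math above is only the detection half.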

I hope to clean it up and release the plugin with instructions in the future, unless Epic beats me to it and releases similar functionality themselves.

Any news on a plugin release (or just some source code / pointers)? Your approach sounds exactly like what I want to implement.