Hi,
I’ve been racking my brain and searching the internet for a good week now, trying to implement an interactable 3D UI widget for VR. I’ve tried many approaches within the editor, and I’m now looking to either modify the engine or write an extension to UUserWidget that lets me emulate all of the touch invocations on interactable subwidgets from Blueprints. I intend to do this the way getnamo suggested: by implementing a recursive geometry collision solver.
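To make that concrete, here’s roughly the shape of solver I have in mind: a depth-first walk of the Slate hierarchy that hit-tests each widget’s geometry and returns the innermost widget under a point. This is an untested sketch, and it assumes per-widget geometry is actually retrievable via GetCachedGeometry(), which is exactly where I’m stuck (see below):

```cpp
#include "SlateBasics.h"

// Untested sketch of the recursive hit test. Assumes each SWidget's cached
// geometry is valid, i.e. the tree has been painted at least once.
static TSharedPtr<SWidget> FindWidgetUnderPoint(const TSharedRef<SWidget>& Widget, const FVector2D& AbsolutePoint)
{
	const FGeometry& Geometry = Widget->GetCachedGeometry();
	if (!Geometry.IsUnderLocation(AbsolutePoint))
	{
		return nullptr;
	}

	// Recurse back-to-front so the innermost/topmost child wins.
	if (FChildren* Children = Widget->GetChildren())
	{
		for (int32 Index = Children->Num() - 1; Index >= 0; --Index)
		{
			TSharedPtr<SWidget> Hit = FindWidgetUnderPoint(Children->GetChildAt(Index), AbsolutePoint);
			if (Hit.IsValid())
			{
				return Hit;
			}
		}
	}

	// No child was under the point, so this widget is the deepest hit.
	return Widget;
}
```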
Looking through the engine code, the whole widget library is quite a mess, and it looks like it hasn’t seen an overhaul in years; it was not built to be extensible. This surprises me, but I suppose it was designed to map directly onto an on-screen overlay, which is the antithesis of UI design in VR. There’s UUserWidget, which holds a WidgetTree of UWidget objects that aren’t actual widgets until they are mapped onto SWidget objects, which apparently receive their geometry from their parents at layout time, every frame. I can’t seem to get the geometry of any SWidget, either from the widget itself or from its children. Am I going about this in the wrong direction?
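For context on what I’ve tried: the closest hook I’ve found is SWidget::GetCachedGeometry(), which returns the geometry from the last layout pass but appears to be zero-sized until the widget has actually been painted once. Assuming that pans out, feeding a synthetic press to the hit widget would look something like this (everything other than the engine calls is my own naming, building on the sketch above):

```cpp
#include "SlateBasics.h"
#include "Blueprint/UserWidget.h"

// Untested sketch: synthesize a left-button press at an absolute screen-space
// point and route it to the innermost widget found by FindWidgetUnderPoint.
void SimulateTouchPress(UUserWidget* UserWidget, const FVector2D& AbsolutePoint)
{
	TSharedRef<SWidget> Root = UserWidget->TakeWidget();
	TSharedPtr<SWidget> Hit = FindWidgetUnderPoint(Root, AbsolutePoint);
	if (!Hit.IsValid())
	{
		return;
	}

	// Build a synthetic pointer event at the hit location.
	TSet<FKey> PressedButtons;
	PressedButtons.Add(EKeys::LeftMouseButton);
	const FPointerEvent PressEvent(
		/*InPointerIndex=*/0,
		AbsolutePoint,
		AbsolutePoint,
		PressedButtons,
		EKeys::LeftMouseButton,
		/*InWheelDelta=*/0.0f,
		FModifierKeysState());

	// Route the press straight to the widget; a real solution would also send
	// the matching button-up and respect the FReply that comes back.
	Hit->OnMouseButtonDown(Hit->GetCachedGeometry(), PressEvent);
}
```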
I just want to be able to use the UMG Editor to design my 3D VR widgets, instead of the apparently sanctioned method of manually dragging collision components over my UI components for hit detection. UE4 users should not have to roll their own VR UI for every project, especially when all we want is a simple UI to interact with a quick tech demo.
Is anyone interested in working on this with me? The community should really make a push to understand, fork, fix, and merge the existing UMG code so that VR fits into the ecosystem. If I end up with a separate class for a VR UI widget at the end of this, I’ll share it on the forums. If it requires engine modifications and I get it working robustly enough, I’ll gladly open a pull request against the source.
Thanks for your help,