Terminal screen and display emulation

Hello everyone!

I am trying to write a terminal/screen emulator for my game. I would need to be able to edit a dynamic texture at runtime and display it both on a GUI and in-world on a mesh.

Another thing is capturing user input via keyboard and mouse. In the GUI I want to capture both mouse and keyboard; in the world, only the keyboard, plus some sort of ray casting/unprojection to find where the user is looking on the screen, to get something like touch-screen capabilities.
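Adding a note that wasn't in the original post: once a trace hits the screen mesh, Unreal can report the UV of the hit via `UGameplayStatics::FindCollisionUV` (this requires enabling "Support UV From Hit Results" in project settings). From there, turning the UV into a texel on the dynamic texture is plain arithmetic; a sketch (the function name is my own):

```cpp
#include <algorithm>
#include <utility>

// Map a [0,1]x[0,1] hit UV (e.g. from UGameplayStatics::FindCollisionUV in
// Unreal, with "Support UV From Hit Results" enabled) to integer texel
// coordinates on a Width x Height dynamic texture.
std::pair<int, int> UVToTexel(float U, float V, int Width, int Height) {
    U = std::clamp(U, 0.0f, 1.0f);
    V = std::clamp(V, 0.0f, 1.0f);
    // Clamp to the last texel so U == 1.0 or V == 1.0 stays in range.
    int X = std::min(static_cast<int>(U * Width), Width - 1);
    int Y = std::min(static_cast<int>(V * Height), Height - 1);
    return {X, Y};
}
```

That texel position is what you would feed into the terminal logic as a "touch" event.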

I want to implement a computer-like object in my game, but I have no idea how to capture user input or generate a dynamic texture that can be edited and rendered at runtime.

If anybody knows how to do that, I’d love to know how to do it.

Can anyone from Epic Games please help us out here? This would also be an essential piece of framework, especially in the context of the #bigDataVR challenge.

Is this what you’re looking for?
https://docs.unrealengine.com/latest/INT/Engine/UMG/HowTo/Create3DWidgets/index.html

Hmm, sort of. If I could drive this from C++, get the position the player is looking at, and display a dynamic texture there, that would be what I need.
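A suggestion beyond what was posted: if the in-world screen is a flat quad, "where the player is looking" can be computed with no engine support at all by intersecting the camera's view ray with the quad and reading off a UV. In Unreal you would more likely use `GetWorld()->LineTraceSingleByChannel` plus `FindCollisionUV`, but the underlying math is just this (all names below are my own, not engine types):

```cpp
#include <cmath>
#include <optional>
#include <utility>

struct Vec3 { float X, Y, Z; };

static Vec3 Sub(Vec3 A, Vec3 B) { return {A.X - B.X, A.Y - B.Y, A.Z - B.Z}; }
static float Dot(Vec3 A, Vec3 B) { return A.X * B.X + A.Y * B.Y + A.Z * B.Z; }
static Vec3 Cross(Vec3 A, Vec3 B) {
    return {A.Y * B.Z - A.Z * B.Y, A.Z * B.X - A.X * B.Z, A.X * B.Y - A.Y * B.X};
}

// Intersect a view ray with a rectangular screen quad and return the hit UV.
// QuadOrigin is one corner of the quad; EdgeU/EdgeV are its two side vectors.
// Returns std::nullopt when the ray is parallel to the quad, the quad is
// behind the ray origin, or the hit falls outside the rectangle.
std::optional<std::pair<float, float>> RayQuadUV(
    Vec3 RayOrigin, Vec3 RayDir, Vec3 QuadOrigin, Vec3 EdgeU, Vec3 EdgeV) {
    Vec3 Normal = Cross(EdgeU, EdgeV);
    float Denom = Dot(RayDir, Normal);
    if (std::fabs(Denom) < 1e-6f) return std::nullopt; // ray parallel to quad
    float T = Dot(Sub(QuadOrigin, RayOrigin), Normal) / Denom;
    if (T < 0.0f) return std::nullopt; // quad is behind the camera
    Vec3 Hit = {RayOrigin.X + T * RayDir.X, RayOrigin.Y + T * RayDir.Y,
                RayOrigin.Z + T * RayDir.Z};
    Vec3 Local = Sub(Hit, QuadOrigin);
    // Project the local hit position onto the quad's edges to get UV.
    float U = Dot(Local, EdgeU) / Dot(EdgeU, EdgeU);
    float V = Dot(Local, EdgeV) / Dot(EdgeV, EdgeV);
    if (U < 0.0f || U > 1.0f || V < 0.0f || V > 1.0f) return std::nullopt;
    return std::make_pair(U, V);
}
```

The ray origin/direction would come from the player camera (in a GUI context, `APlayerController::DeprojectScreenPositionToWorld` gives you the same pair from a mouse position), and the resulting UV feeds straight into the dynamic-texture lookup.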