Render Images to Stream Real Time

Problem:

We are designing a simulator and would like to have multiple views accessible by an external process. Let's say the scene has 4+1 cameras in it: the 4 cameras need to be rendered to a video/image stream that another program can access (either through shared memory or sockets) and process with computer vision algorithms in close to real time. Ideally, with a very low-detail scene, we would like to get roughly 15 fps with as little latency as possible.

I do not know where to start looking to solve this, and I may not be googling the right keywords to find the answer, so any help in the form of "hey, look at this" is much appreciated, as is a pointer to anyone who has solved a similar problem.

Well, you can render a camera to a material. How to get that to a different display is FAR beyond me, and 99.99999% beyond Blueprints. I HIGHLY suggest posting in the C++ category on the forums; hopefully some expert knows how to send a material to display adapter X. I know you can choose display adapter X on startup, but I've never worked with or read about multiple monitors.

It is most likely going to require editing the engine code, or at the very least some complex C++.

Hope this helps
Don't forget to accept the answer that best clears up or answers your question, so that when the community finds your question in the future via search they know exactly what you did to fix it.

When you say rendering to a material, do you mean render-to-texture? And if so, am I able to access this texture as image data that I could then share with an external process? We don't need to render to multiple monitors; we just need to expose the renders to another process.
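For what it's worth, in engine-side C++ this readback step is usually done with a `USceneCaptureComponent2D` writing into a `UTextureRenderTarget2D`, whose pixels can be copied back to CPU memory via `FRenderTarget::ReadPixels`. A rough sketch (this only compiles inside an Unreal project; the actor/function names are made up for illustration):

```cpp
#include "Engine/TextureRenderTarget2D.h"
#include "TextureResource.h"

// Hypothetical helper: copy the current contents of a render target
// (filled by a USceneCaptureComponent2D each frame) into a CPU-side array.
// Must be called on the game thread.
static bool GrabFrame(UTextureRenderTarget2D* RenderTarget, TArray<FColor>& OutPixels)
{
    if (!RenderTarget)
    {
        return false;
    }

    FTextureRenderTargetResource* Resource =
        RenderTarget->GameThread_GetRenderTargetResource();
    if (!Resource)
    {
        return false;
    }

    // ReadPixels blocks until the GPU has finished, which stalls the pipeline.
    // At a ~15 fps target that may be acceptable; for lower latency, look at
    // asynchronous GPU readback instead of a synchronous ReadPixels call.
    return Resource->ReadPixels(OutPixels); // SizeX * SizeY FColor values
}
```

Once the pixels are in a `TArray<FColor>`, forwarding them over shared memory or a socket is ordinary systems programming.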

Render targets live directly in GPU memory (as far as I understand it), so you will still have to make the data available externally. Doable, but probably not in Blueprints.