Render Images to Stream Real Time
We are designing a simulator and would like to have multiple views accessible by an external process. Let's say the scene has 4+1 cameras in it, but the 4 cameras need to be rendered to a video/image stream that another program can access, either through shared memory or sockets, and process with computer vision algorithms in close to real time. Ideally, with a very low-detail scene, we would like to get roughly 15 fps with as little latency as possible.
I do not know where to start looking to solve this, and I may not be googling the right keywords to find the answer, so any help in the form of "hey, look at this" is much appreciated, as is any pointer to a similar problem/solution.
asked Sep 30 '15 at 05:03 PM in Rendering
Well, you can render a camera to a material. How to get that to a different display is far beyond me, and almost certainly beyond Blueprints. I highly suggest posting in the C++ category in the forums; hopefully some expert knows how to send a material to display adapter X. I know you can choose a display adapter on startup, but I've never worked with or read about multiple monitors.
It is most likely going to require editing the engine code, or at the very least some complex C++.
Hope this helps. Don't forget to accept the answer that best clears up or answers your question, so that when the community finds it in the future via search/Google they know exactly what you did to fix it.
answered Sep 30 '15 at 08:56 PM