Live video capture into UE4?

Uff, I would say yes, but that is something so specialized that I have no idea how to do it. I have already seen people streaming a webcam into UE4, though.

Is it possible to stream live video, grabbed by a capture card, into UE4 as a texture?

Hi eXi,
I guess you meant this: Augmented Reality "virtual workspace" with Unreal Engine, CoherentUI, OpenCV - YouTube?

As I am not a programmer, I cannot program OpenCV in C++. I was thinking more of a plugin, or of a feature of the new Media Framework, that could help me integrate a consumer card like the Blackmagic Intensity Pro or, in the very near future, the Datapath DisplayPort card, which can capture up to UHD.
Any more recommendations?

I am still LOOKING FOR A PROFESSIONAL VIDEO CAPTURE SOLUTION FOR UNREAL :slight_smile:
I would like to add that the solution I am looking for is something like the Kinect4Unreal plugin, for example.

Btw: it can send its HD camera picture to Unreal, which could come in handy in a different application.

We don’t support this out of the box just yet, but it’s something we’re working on.
See also the Media Framework Roadmap that I just posted on the forums.

Hello Gmpreussner,
thanks for taking the time to reply.
It is good to hear that you are working on a professional solution.
:slight_smile:

It would be great if you could include support for cards from Blackmagic, Active Silicon, and Datapath.

This would open Unreal up to totally different applications, I guess!

Best regards

Blackmagic will definitely be integrated, as we need it for one of our bigger licensees. I will look into the other products as well, thanks!

I just started researching this myself. Glad to hear it is being worked on!

I’m going to mark this thread resolved, because we already have several threads like this one. Please refer to the forum thread for future updates, thanks!

After such a long time, this feature is still not supported. :(

Hoping to keep this alive, I cast a vote for this Media Framework feature (link to UE-35408).
Live video input to texture would be useful in the broadcast industry.

There are currently two options:

For anyone looking for a starting point on how to implement this:

A little background: I've written a UE4 plugin (company internal only) that does essentially this. The rough steps are below; a short sketch of how they fit together follows the list.

Use UTexture2D::CreateTransient() to create a UE4 texture to hold the final image data. I would suggest doing this only once, right at the initialization of your plugin.

Get a pointer to the raw data buffer of your video feed (I used the DeckLink SDK).

Use OpenCV to convert your raw data buffer into a format recognized by a UE4 texture (e.g. BGRA8).

Get a pointer to the raw data inside the UTexture2D object you received from UTexture2D::CreateTransient(), with something along the lines of: void* FormattedImageData = m_Texture->PlatformData->Mips[0].BulkData.Lock(LOCK_READ_WRITE);

Write each pixel of your data using the pointer returned by Lock(), then be sure to call Unlock() followed by UpdateResource().
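To make those steps more concrete, here is a minimal sketch of how they could fit together. This is not the internal plugin's actual code: the function names (CreateVideoTexture, UpdateVideoTexture), the assumption that the capture card delivers UYVY frames, and the OpenCV conversion choice are all placeholders you would adapt to your own capture SDK. The UE4 calls are the ones named in the steps above (CreateTransient, BulkData.Lock/Unlock, UpdateResource).

```cpp
// Minimal sketch, not the plugin author's actual code. Assumes the capture
// card (e.g. via the DeckLink SDK) delivers UYVY frames and that OpenCV is
// linked into the plugin; names like CreateVideoTexture, UpdateVideoTexture
// and RawUyvyBuffer are made up for illustration.

#include "CoreMinimal.h"
#include "Engine/Texture2D.h"
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

// Step 1: create the transient texture once, e.g. during plugin initialization.
UTexture2D* CreateVideoTexture(int32 Width, int32 Height)
{
	UTexture2D* Texture = UTexture2D::CreateTransient(Width, Height, PF_B8G8R8A8);
	Texture->UpdateResource();
	return Texture;
}

// Steps 2-5: convert one captured frame to BGRA8 and copy it into the texture.
// RawUyvyBuffer points at the frame delivered by your capture SDK.
void UpdateVideoTexture(UTexture2D* Texture, const uint8* RawUyvyBuffer, int32 Width, int32 Height)
{
	if (!Texture || !RawUyvyBuffer)
	{
		return;
	}

	// Step 3: OpenCV conversion from the capture format (UYVY assumed here) to BGRA8.
	cv::Mat SourceFrame(Height, Width, CV_8UC2, const_cast<uint8*>(RawUyvyBuffer));
	cv::Mat BgraFrame;
	cv::cvtColor(SourceFrame, BgraFrame, cv::COLOR_YUV2BGRA_UYVY);

	// Steps 4-5: lock the top mip, copy the pixels in, then unlock and update.
	FTexture2DMipMap& Mip = Texture->PlatformData->Mips[0];
	void* FormattedImageData = Mip.BulkData.Lock(LOCK_READ_WRITE);
	FMemory::Memcpy(FormattedImageData, BgraFrame.data, Width * Height * 4);
	Mip.BulkData.Unlock();
	Texture->UpdateResource();
}
```

Keep in mind that the lock/unlock and UpdateResource() calls should happen on the game thread, and that for higher frame rates you may want to look at UTexture2D::UpdateTextureRegions() instead, which avoids rebuilding the whole texture resource every frame.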

Let me know if you have any questions.

Please clone the plug-in into /Engine/Plugins

There was a change in 4.15 that I missed that prevents the project from compiling as a project plug-in. I will fix that later today for the upcoming 4.16 release.

The NDI plugin sounds promising!
I am having some compiling issues when I try to rebuild NdiMedia-4.15 on UE 4.15. It does not compile; I tried it on two different installs.
Anyone else having this issue?

Thanks for the quick response!
With a little help, I have managed to install it in the project. :slight_smile:
Is there an example that shows what I need to set up in the Blueprint to stream out of the engine?

The plug-in only supports NDI input at this time.

Apologies for the self-promotion, but if you guys are comfortable using a third-party product, you could have a look at the Lightact media server.

Have a look at this tutorial:

It’s for a webcam, but the same procedure works with live video capture cards. We’ve worked mostly with Datapath cards.

I think the Datapath FX4 would be ideal to use.