iOS Camera as MediaTexture

I’m trying to capture the back-facing camera on iOS into a texture, basically following the code from AVFMediaTracks and using a MediaTexture as shown below (full code in this gist: header, source). The code runs without errors, but the texture is black. Any ideas?

void FIOSCamera::HandleSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t Width = CVPixelBufferGetWidth(pixelBuffer);
    size_t Height = CVPixelBufferGetHeight(pixelBuffer);
    if (TextureCache == nil)
    {
        id<MTLDevice> MetalDevice = (id<MTLDevice>)GDynamicRHI->RHIGetNativeDevice();
        CVMetalTextureCacheCreate(NULL, NULL, MetalDevice, NULL, &TextureCache);
    }
    CVMetalTextureRef texture = NULL;
    CVReturn status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, TextureCache, pixelBuffer, NULL, MTLPixelFormatBGRA8Unorm_sRGB, Width, Height, 0, &texture);
    if(status == kCVReturnSuccess)
    {
        FRHIResourceCreateInfo CreateInfo;
        auto Wrapper = new FAvfTexture2DResourceWrapper(texture);
        CreateInfo.BulkData = Wrapper;
        CFRelease(texture);
        CreateInfo.ResourceArray = nullptr;
        
        uint32 TexCreateFlags = TexCreate_SRGB;
        TexCreateFlags |= TexCreate_Dynamic | TexCreate_NoTiling;
        
        TRefCountPtr<FRHITexture2D> ShaderResource;
        
        ShaderResource = RHICreateTexture2D(Width, Height, PF_B8G8R8A8, 1, 1, TexCreateFlags | TexCreate_ShaderResource, CreateInfo);
        if (ShaderResource)
        {
            VideoSink->UpdateTextureSinkResource(ShaderResource, ShaderResource);
            CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            FTimespan DisplayTime = FTimespan::FromSeconds(CMTimeGetSeconds(timestamp));
            //UE_LOG(LogTemp, Log, TEXT("Display TextureSink %d, %d"), Width, Height);
            VideoSink->DisplayTextureSinkBuffer(DisplayTime);
        }
        else
        {
            UE_LOG(LogTemp, Error, TEXT("IOSCamera: Couldn't create shader resource"));
            delete Wrapper;
        }
    }
    else
    {
        UE_LOG(LogTemp, Error, TEXT("IOSCamera: Couldn't get texture from cache"));
    }
}

I think the problem might be that the ShaderResource you’re creating is destroyed when the containing scope exits. TRefCountPtr is a reference-counted pointer that auto-destroys the pointed-to object when the count drops to zero. The resource is passed to the video sink as a raw pointer, and since FIOSCamera::HandleSampleBuffer is not on the render thread, the render command will also just have a copy of that raw pointer, which doesn’t increment the reference count.

As a workaround, you could either hold on to the resource in FIOSCamera (perhaps reusing it for each frame), or execute the video-sink-related code on the render thread.
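The lifetime issue can be reproduced outside Unreal. Below is a standalone sketch with hypothetical minimal stand-ins (`FFakeTexture` for FRHITexture2D, `TRefPtr` for TRefCountPtr — not the real engine types) showing why the texture dies when HandleSampleBuffer returns, and why holding a second strong reference fixes it:

```cpp
#include <cassert>

// Hypothetical stand-in for an intrusively ref-counted RHI resource.
struct FFakeTexture
{
    static int LiveCount;        // how many textures are currently alive
    int RefCount = 0;
    FFakeTexture()  { ++LiveCount; }
    ~FFakeTexture() { --LiveCount; }
    void AddRef()  { ++RefCount; }
    void Release() { if (--RefCount == 0) delete this; }
};
int FFakeTexture::LiveCount = 0;

// Hypothetical stand-in for TRefCountPtr.
template <typename T>
struct TRefPtr
{
    T* Ptr = nullptr;
    TRefPtr() = default;
    explicit TRefPtr(T* In) : Ptr(In) { if (Ptr) Ptr->AddRef(); }
    TRefPtr(const TRefPtr& O) : Ptr(O.Ptr) { if (Ptr) Ptr->AddRef(); }
    TRefPtr& operator=(const TRefPtr& O)
    {
        if (O.Ptr) O.Ptr->AddRef();
        if (Ptr) Ptr->Release();
        Ptr = O.Ptr;
        return *this;
    }
    ~TRefPtr() { if (Ptr) Ptr->Release(); }
};

// What the posted code does: the only strong reference is scope-local,
// so the texture is destroyed before the render thread ever samples it.
FFakeTexture* LeakyHandleSample()
{
    FFakeTexture* Raw;
    {
        TRefPtr<FFakeTexture> ShaderResource(new FFakeTexture());
        Raw = ShaderResource.Ptr;   // the video sink only keeps a raw copy
    }                               // ref count hits zero -> deleted here
    return Raw;                     // dangling pointer!
}

// The workaround: the camera object keeps its own strong reference,
// so the texture outlives the function scope.
void HeldHandleSample(TRefPtr<FFakeTexture>& HeldByCamera)
{
    TRefPtr<FFakeTexture> ShaderResource(new FFakeTexture());
    HeldByCamera = ShaderResource;  // bumps the ref count to 2
}                                   // drops back to 1, texture survives
```

In the real code the equivalent is keeping the `TRefCountPtr<FRHITexture2D>` as a member of FIOSCamera, or enqueueing the sink calls on the render thread so the reference is held until the copy completes.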

PS: This kind of stuff is going to be much easier with Media Framework 3.0 in the upcoming 4.18 release. In the new API, video samples are actual shared objects that can be passed around. You can already take a look at the new code in the master branch on GitHub.
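To illustrate what “video samples are shared objects” buys you, here is a hypothetical sketch using `std::shared_ptr` in place of Unreal’s shared references (the real Media Framework 3.0 interfaces and names differ): the sample owns its pixel data and stays valid for the consumer even after the producer’s local reference is gone, which is exactly the failure mode of the raw-pointer sink above.

```cpp
#include <cstdint>
#include <memory>
#include <queue>
#include <vector>

// Hypothetical shared video sample: owns its pixels, lives as long as
// any producer or consumer still holds a reference to it.
struct FVideoSample
{
    int Width = 0;
    int Height = 0;
    std::vector<uint8_t> Pixels;   // BGRA payload owned by the sample
};

using FSamplePtr = std::shared_ptr<FVideoSample>;

// The capture callback produces a sample and hands it to a queue; its own
// local reference dying no longer invalidates the frame.
void ProduceFrame(std::queue<FSamplePtr>& Queue, int W, int H)
{
    auto Sample = std::make_shared<FVideoSample>();
    Sample->Width = W;
    Sample->Height = H;
    Sample->Pixels.assign(static_cast<size_t>(W) * H * 4, 0);
    Queue.push(std::move(Sample));
}

// The consumer (e.g. the render thread) pops a still-valid shared sample.
FSamplePtr ConsumeFrame(std::queue<FSamplePtr>& Queue)
{
    FSamplePtr Sample = Queue.front();
    Queue.pop();
    return Sample;
}
```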

I forgot to mention that Media Framework 3.0 also has better support for video capture devices. However, we haven’t had time to implement iOS/macOS yet. There is only experimental support on Windows and Android so far, but other platforms will follow.

Thanks! I have it working now, and I’ve updated the gists. Looking forward to 4.18, although it seems I’ll have a lot to do to migrate.

Yes, there are quite a few changes; I rearchitected much of the API. I’ll be here to help, though, so just post new questions if you get stuck. Thanks!