How to read correct pixel values from 32-bit EXR image?

I’m trying to read in pixel values from a 32-bit EXR using the following code:

FLinearColor UBlueprintFunctionLibrary::TextureTest(UTexture2D* TextureInput, int32 PixelIndex)
{
	// Sentinel returned when no texture is supplied.
	FLinearColor PixelColor = FLinearColor(1270, 1270, 1270, 1270);

	if (TextureInput)
	{
		FTexture2DMipMap* CurrentMipMap = &TextureInput->PlatformData->Mips[0];
		FByteBulkData* RawImageData = &CurrentMipMap->BulkData;

		// Assumes four 32-bit floats per pixel (PF_A32B32G32R32F).
		FLinearColor* FormattedImageData = static_cast<FLinearColor*>(RawImageData->Lock(LOCK_READ_ONLY));

		PixelColor = FormattedImageData[PixelIndex];

		RawImageData->Unlock();
	}
	return PixelColor;
}

Unfortunately, this gives me completely incorrect pixel values.

Does anyone know how to read correct (32-bit / HDR) values from an EXR in C++? I’ve been bashing my head against this for ages. I feel like the correct values must be in there and, in theory, accessible, but I can’t work out how to extract them.

Any help massively appreciated!

Hello.
Did you solve this issue?

I’d like to know this too. I want to render scene depth to a 32-bit render target and read pixels back from it.
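For the render-target case, a simpler route than locking mip data may be to read the target back on the game thread. This is an untested sketch assuming a float render target format such as RTF_RGBA32f; `ReadDepthTarget` is a hypothetical helper name, and the assumption that depth lives in the R channel depends on how your material or scene capture writes it:

```cpp
#include "Engine/TextureRenderTarget2D.h"
#include "TextureResource.h"

// Sketch: read one value back from a float render target.
float ReadDepthTarget(UTextureRenderTarget2D* RenderTarget, int32 X, int32 Y)
{
	if (!RenderTarget) return -1.0f;

	FRenderTarget* Resource = RenderTarget->GameThread_GetRenderTargetResource();
	TArray<FLinearColor> Pixels;
	if (!Resource || !Resource->ReadLinearColorPixels(Pixels))
	{
		return -1.0f;
	}

	// Assumes scene depth was written into the R channel.
	const int32 Index = Y * RenderTarget->SizeX + X;
	return Pixels.IsValidIndex(Index) ? Pixels[Index].R : -1.0f;
}
```

ReadLinearColorPixels stalls the game thread waiting on the GPU, so it is fine for occasional queries but too slow to call every frame.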