Low quality graphics only when playing sequences
My game has some cutscenes where the sequence camera always ends at exactly the position the gameplay camera will take over from, so the end of the cutscene blends smoothly into the rest of the game.
This worked fine up to 4.12 (I think), but not anymore: now, when the sequence ends, the screen "blinks". The graphics quality visibly changes between the last frame of the cutscene and the first frame of gameplay. It didn't happen before, but it does now (I'm on 4.14).
Here are two screenshots:
The ending of the sequence:
The beginning of the gameplay:
Look at the water, the horizon, the aliasing... (click the links below to open the images at full resolution if needed)
I could live with the lower graphics quality, but I can't live with the clunky transition from the sequence to the gameplay...
No matter how I configure the engine scalability (Epic, or even Cinematic), the result is the same.
Does anyone know what's happening?
FIXED. I had a Blueprint that manages the view target. It was calling SetViewTargetWithBlend every frame (in the Tick function) as a workaround for an old problem that no longer matters.
I don't know exactly why, but that call was causing the problem. I removed it from the Tick function and now the sequence plays fine.
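For anyone hitting the same thing, here is a minimal C++ sketch of the kind of change described above. The class and function names (AMyCameraDirector, SwitchToGameplayCamera, GameplayCamera) are hypothetical placeholders; SetViewTargetWithBlend is the real APlayerController function. The point is simply: don't re-set the view target every Tick, set it once when it actually needs to change (e.g. from the sequence's finished event). Re-blending every frame fights Sequencer's camera cuts, which can produce exactly this kind of pop at the end of a sequence.

```cpp
// Hypothetical actor illustrating the fix. Not a drop-in class;
// assumes GameplayCamera is a valid camera actor reference.
void AMyCameraDirector::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    // BAD (old behavior): calling SetViewTargetWithBlend here runs
    // every frame and interrupts the Level Sequence's camera.
    // PC->SetViewTargetWithBlend(GameplayCamera, 0.5f);
}

void AMyCameraDirector::SwitchToGameplayCamera()
{
    // GOOD: call once, e.g. bound to the sequence's OnFinished event.
    if (APlayerController* PC = GetWorld()->GetFirstPlayerController())
    {
        PC->SetViewTargetWithBlend(GameplayCamera, /*BlendTime=*/0.5f);
    }
}
```

The same idea applies in Blueprint: move the Set View Target with Blend node out of Event Tick and fire it from a one-shot event instead.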
answered Jan 15 '17 at 03:52 PM