Engine constantly recompiles

Hi,

We have recently moved our project over from 4.13.2 to 4.16.2. The project is partly developed in C++. When working on the project within 4.13, it was possible to make changes to our game code, hit Build Game in VS (2015), and only the game code would be compiled. Occasionally it was necessary to force a recompile of the FMOD plugin too, but that is fairly small, so it has not been a problem.

Since moving to 4.16.2, I have made a small change to one of our source files, no change at all to the engine source, compiled the game source successfully with no errors, then when I have attempted to start the Editor, it claims that some DLLs are out of date, requiring a recompile. This then triggers a FULL engine rebuild.

I waited out the roughly 1-hour build, spending the time looking for possible solutions to this scenario. Once it completed (I had to do the build from within VS, as it failed within the engine), I modified GenerateProjectFiles to remove the -engine option and rebuilt the project files, so that the .sln would not contain the engine code, ready for the next change. I did not start VS or change anything else. I started the engine, was told that a few DLLs were out of date, clicked OK (not much option really), and off we went AGAIN on another full engine build.

Please tell me how to stop UnrealBuildTool deciding that the engine needs rebuilding all the time. I have lost hours today watching already-built code being pointlessly recompiled. There were the odd strange recompiles within 4.13, but they were rare, and never back to back. I think this must be the 3rd full engine recompile without changes.

Edit: I have a new theory. I have just experienced another round of strange behavior; one symptom was Visual Studio claiming it couldn’t find the PDB for our project or most of the engine, despite them clearly being there and freshly built.

Due to disk space limitations, all my project and engine source is on an external HDD which was mounted into an NTFS folder… When looking at the debug modules in VS, I noticed that for DLLs stored on the local disk the path was given as c:… However, those stored on the external disk were something along the lines of /Device/Disk-2/Volume/blah/blah, and VS was then unable to find the PDB for some if not all of them. I tried the usual of shutting down VS and UE4, and even rebooting, with no joy.

I have just remounted that disk as a drive letter and without recompiling or making any changes attempted to once again debug our project. This time all the symbols were located without issue.

This strange behavior from VS itself has led me to wonder whether UBT wasn’t always understanding, or getting sensible file locations back from, Windows, and hence was assuming the files hadn’t been compiled.

Hi Graeme,

I checked with one of our engineers who works on UBT, and he said that he believes it should only work with drives mounted as individual drives with unique drive letters. It sounds as though that may be working for you for now. If the issue returns, please let us know and we will keep digging.

Tim

The whole source is now on its own drive mounted as drive D:. The problem still occurs, although less frequently. It also happens on a different machine (same source) which only has a C: drive. I have just followed the steps that cause a full recompile for me:

  1. I built the project in Visual Studio, with a target of Development Editor for Win64.
  2. I then ran our build/cook/package script, which has the target Development Win64.

This produced a lot of log files with verbose enabled. I made no changes to the project between the compile finishing in VS and starting the build script.

I am attaching one of the compressed UnrealBuildTool logs from that single run. It looks like it has produced several (I guess one for each of the various components it has rebuilt). I can zip them all up if you like.

I have had a scan through myself, and it seems to be reporting that “.obj files are produced by an outdated action”, which I am guessing means it thinks they are out of date (despite VS having just compiled).

This is the thing: I made NO changes to anything. I literally compiled in VS, then ran the build script to build a packaged build of the project.

If, on starting the editor, it deems the .dll to be out of date and I allow the editor to trigger a compile, that can trigger a full recompile; the same is true of the build scripts. If I stick to only ever compiling with Visual Studio, on the whole I am not forced to constantly recompile everything.

I have also had it where if I add a new C++ class in the editor, the next time I hit compile in VS, it re-compiles the full engine again.

It feels like whenever I switch between applications to compile (Visual Studio, or Unreal), the system thinks everything is out of date. If I stick to the same application, this doesn’t happen (as much).

The log file lists the actions which are outdated and need to be re-run. In particular, I see these lines:

SharedPCH.UnrealEd.cpp: Prerequisite SharedPCH.UnrealEd.h.pch.response is newer than the last execution of the action: 18/08/2017 18:22:17 vs 02/08/2017 17:00:13
SharedPCH.Engine.cpp: Prerequisite SharedPCH.Engine.h.pch.response is newer than the last execution of the action: 18/08/2017 18:22:17 vs 02/08/2017 17:00:14
SharedPCH.CoreUObject.cpp: Prerequisite SharedPCH.CoreUObject.h.pch.response is newer than the last execution of the action: 18/08/2017 18:22:17 vs 02/08/2017 16:59:51
PCH.Core.cpp: Prerequisite PCH.Core.h.pch.response is newer than the last execution of the action: 18/08/2017 18:22:20 vs 02/08/2017 16:59:50

Those PCHs are used by everything, so that looks like what’s causing the full rebuild. To diagnose the problem, you could diff those .response files between builds from different programs.

It’s possible that some target you’re building is forcing on a define, which is invalidating the PCHs used by another target.