Large BP Graphs: Performance / Compile Time Improvements

Use Functions: separate your logic and reuse parts of it; there's no reason to have everything in one graph. My biggest Blueprint, which has around 250 functions plus events in it, takes about 2 seconds to compile, has no lag, and keeps only one graph on the main EventGraph. Using functions also lets you debug and update things a lot faster.

Some BPs tend to become rather large over time, crowded with many nodes. Generally I'm fine with having many nodes in one graph; I can work faster that way than with things split into hundreds of functions. However, after a while the editor slows down too much.

I think there are various ways to improve on this, but I'm unsure which is the best solution in terms of editor responsiveness, compile time, and runtime performance:

- New Graphs
- Collapsed Graphs/Nodes
- Functions
- Macros
- anything else

I definitely make use of functions and macros for pieces of code that can be reused. But there's lots of logic that doesn't gain any benefit from them; it just ends up cluttered and slow to access.

I'm really only interested in a comparison of the above methods with regard to editor/BP speed. If they are all about the same, I will probably use a combination of new graphs and collapsed graphs.