FMemoryWriter does not support data larger than 2GB - not real

Hi everyone, sometimes when I try to cook my projects, Unreal gives me the 2 GB error. I've never had levels that weigh that much; in fact I split every part of my project into sub-levels precisely to avoid the 2 GB error. Every time before, I solved it by checking the project and finding some imperfect link, a blueprint not connected correctly, or similar little problems. This time I made some really simple changes to a project I packaged six days ago with no problem, and since yesterday every cook I try gets stuck on this error. I've checked every new part many times and haven't found anything wrong. Here is a copy of the output log it gives me every time:

Cook output log

If anyone finds something, please tell me: I'm open to any solution, but I'd rather not redo the project in another engine version or try anything that requires rebuilding it, since I'd have to rebuild the lighting twice and that would take about 10 hours each time.
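For context on where the limit comes from: as far as I understand it, FMemoryWriter serializes into a byte array addressed with a signed 32-bit offset, which is why a single cooked archive caps out around 2 GB. The toy sketch below only illustrates that kind of check; it is not the actual engine source.

    #include <cstdint>
    #include <vector>

    // Illustrative only: a writer backed by a 32-bit-indexed byte array has to
    // refuse any payload that would push the total past MAX_int32 (~2 GB),
    // which mirrors the spirit of the engine's
    // "FMemoryWriter does not support data larger than 2GB" error.
    class FToyMemoryWriter
    {
    public:
        bool Serialize(const void* Data, int64_t Num)
        {
            const int64_t NewSize = static_cast<int64_t>(Bytes.size()) + Num;
            if (NewSize > INT32_MAX)
            {
                return false; // the engine raises the 2 GB error here instead
            }
            const uint8_t* Src = static_cast<const uint8_t*>(Data);
            Bytes.insert(Bytes.end(), Src, Src + Num);
            return true;
        }

    private:
        std::vector<uint8_t> Bytes; // stands in for the int32-indexed TArray<uint8>
    };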

Thanks for your attention.

Product Version: UE 4.11
cook output log.txt (508.9 kB)

asked Sep 15 '16 at 08:24 AM by Hainzgrimmer in Packaging & Deployment

Hainzgrimmer Sep 15 '16 at 09:08 AM

In the middle of my trials I changed two parameters and it worked:

  • In the PostProcessVolume -> Misc -> Screen Percentage, I had supersampled it at a value of 200 (see the sketch after this list)

  • I packaged for Shipping instead of Development
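If anyone wants to apply the first fix from code rather than the Details panel, a minimal sketch could look like the following. It assumes the FPostProcessSettings::ScreenPercentage member as it existed around UE 4.11 (the screen-percentage pipeline was reworked in later engine versions), so treat it as illustrative rather than copy-paste ready.

    #include "Engine/PostProcessVolume.h"

    // Sketch of a helper (my own function, not an engine API) that drops a
    // PostProcessVolume's supersampling back to 100% before cooking.
    void ResetScreenPercentage(APostProcessVolume* Volume)
    {
        if (Volume != nullptr)
        {
            Volume->Settings.bOverride_ScreenPercentage = true; // keep the override explicit
            Volume->Settings.ScreenPercentage = 100.0f;         // was supersampled at 200
        }
    }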

I think the first one might be the real solution, but it's not clear why: my heaviest level weighed around 300 MB, so even doubling it would only reach 600 MB, and even then I should be well under the 2 GB limit. On top of that, I'm working on a parallel project, very close to this one with the same settings but way bigger, and it hasn't had any problems!

I'd like to hear from somebody at Epic what the real issue is!


1 answer

Hello Hainzgrimmer,

That would make a big difference. While it may seem that going from 100% to 200% would simply double the memory, it is much more than that: the percentage applies to each screen dimension, so the cost grows with its square rather than linearly. I wouldn't be surprised if that is taking you well over the 2 GB limit. I would suggest experimenting with the value to see how high you can go before you hit the 2 GB limit, so you have an idea of how much of a difference it's making.
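To make that concrete, here is a back-of-the-envelope sketch; the 1920x1080 base resolution and the 16 bytes per pixel are assumptions picked purely for illustration, not measured numbers:

    #include <cstdio>

    // Screen percentage scales BOTH screen dimensions, so the pixel count,
    // and any per-pixel buffer built from it, grows with the square of the
    // percentage rather than linearly.
    int main()
    {
        const double BaseWidth = 1920.0;
        const double BaseHeight = 1080.0;
        const double BytesPerPixel = 16.0; // assumed per-pixel cost, illustration only

        for (int Percentage : {100, 150, 200})
        {
            const double Scale = Percentage / 100.0;
            const double Pixels = (BaseWidth * Scale) * (BaseHeight * Scale);
            const double MegaBytes = Pixels * BytesPerPixel / (1024.0 * 1024.0);
            std::printf("%3d%% -> %.0f pixels (~%.0f MB per buffer)\n",
                        Percentage, Pixels, MegaBytes);
        }
        return 0;
    }

At 200% the buffer is four times its 100% size rather than double, which is why the jump can be so much larger than you might expect.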


answered Sep 15 '16 at 06:17 PM

L34D3R Dec 23 '17 at 04:59 AM

I'm facing the same problem in UE 4.18.2.

Matthew J Dec 28 '17 at 05:16 PM

Hello L34D3R,

As I told Hainzgrimmer, have you tried different values to see whether you are actually hitting the 2 GB limit? Going from 100% to 200% isn't simply "doubling" the amount; it's much more than that.
