Standalone database for Unreal?

Is there any known local, standalone database (so no SQL Server or anything like that) that works with Unreal Engine 4 and is not abandoned/deprecated?

This is the best I found so far but no updates in over 9 months:

Another one but not updated for the last ~2 years:

And this solution:

But… it also contains outdated guides and whatnot… I couldn't get it to work. I wish there was a really fool-proof & up-to-date guide for it somewhere.

I just need something to dump and retrieve local level data (for a voxel game), because I would run out of RAM otherwise (the total stored data can be enormous). Databases like SQL Server and the like are complete overkill. I'd prefer something local that just resides within the game directory, instead of having to install huge platforms onto the end-user's system and worry about licenses.

Unreal DataTables are probably not designed to handle such large amounts of data, and would probably just crash instantly due to RAM problems, since I believe the entire table gets loaded in one go. And it would be too big to ever open in Excel :P.

The only other solution I'd have would be XML files, but I worry about performance when the server has to read/write an XML file that already contains (hundreds of) millions of entries, plus it's kind of a dirty solution. Unless anyone has a better one?

Hi. Why not use the file writers already included in UE? There are also compression functions included that let you compress your data stream on the fly.

Compression: Engine\Source\Runtime\Core\Public\Misc\Compression.h
A file writer and reader can be created via IFileManager::Get() and let you operate on files like any other common UE archive.
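As a minimal sketch of that approach (the file path, function names and the TArray<int32> payload are just placeholder assumptions), writing and reading a chunk could look roughly like this:

#include "HAL/FileManager.h"
#include "Serialization/Archive.h"

// Sketch: write one chunk's data to a binary file and read it back.
// Note: TArray's operator<< takes a non-const reference, so the save
// function takes the array by non-const reference as well.
void SaveChunkData(const FString& FilePath, TArray<int32>& ChunkData)
{
    // CreateFileWriter returns nullptr on failure, so always check it.
    if (FArchive* Writer = IFileManager::Get().CreateFileWriter(*FilePath))
    {
        *Writer << ChunkData;
        Writer->Close();
        delete Writer;
    }
}

void LoadChunkData(const FString& FilePath, TArray<int32>& OutChunkData)
{
    if (FArchive* Reader = IFileManager::Get().CreateFileReader(*FilePath))
    {
        *Reader << OutChunkData;
        Reader->Close();
        delete Reader;
    }
}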

You may be right, there is perhaps no reason to go with a database (I was considering regular binary files anyway).

I'm not gonna be using compression, but I might combine a lot of voxel chunks together, write them to a file, and then put the index into the filename so that the server knows which file to use instead of a database.


I suppose its performance is the same as any other disk I/O operation, which should be okay. Perhaps also add folder structures, because if too many files are created within the same directory it will slow down performance, and creating one huge file can't be efficient either.
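A hypothetical path scheme for that (all names here are made up; on older engine versions FPaths::ProjectSavedDir() is FPaths::GameSavedDir() instead) could encode the chunk index in the filename and bucket the files into subfolders:

#include "Misc/Paths.h"

// Hypothetical helper: build a per-chunk file path under the game's save
// directory, bucketing chunks into subfolders so no single directory
// grows too large, e.g. .../Saved/VoxelData/12_-3/12_-3_7.chunk
FString MakeChunkFilePath(int32 ChunkX, int32 ChunkY, int32 ChunkZ)
{
    const FString Bucket   = FString::Printf(TEXT("%d_%d"), ChunkX, ChunkY);
    const FString FileName = FString::Printf(TEXT("%d_%d_%d.chunk"), ChunkX, ChunkY, ChunkZ);
    return FPaths::ProjectSavedDir() / TEXT("VoxelData") / Bucket / FileName;
}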

Consider using the Save Game system; you can generate GUIDs or use any other unique ID to get/set records. You can spread the data out amongst various save files.

https://docs.unrealengine.com/latest/INT/Gameplay/SaveGame/Blueprints/
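In case it helps, a minimal sketch of that route (class name, slot naming and the int array payload are placeholder assumptions; the UCLASS would of course live in its own header):

#include "GameFramework/SaveGame.h"
#include "Kismet/GameplayStatics.h"
#include "VoxelChunkSaveGame.generated.h"

// Placeholder save-game class: one slot per chunk (or per group of chunks).
UCLASS()
class UVoxelChunkSaveGame : public USaveGame
{
    GENERATED_BODY()
public:
    UPROPERTY()
    TArray<int32> ModifiedVoxels;   // only the modified voxel values
};

// Saving: the slot name doubles as the unique record ID (e.g. a GUID or chunk index).
void SaveChunkToSlot(const FString& SlotName, const TArray<int32>& ModifiedVoxels)
{
    UVoxelChunkSaveGame* Save = Cast<UVoxelChunkSaveGame>(
        UGameplayStatics::CreateSaveGameObject(UVoxelChunkSaveGame::StaticClass()));
    Save->ModifiedVoxels = ModifiedVoxels;
    UGameplayStatics::SaveGameToSlot(Save, SlotName, /*UserIndex=*/0);
}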

Also interesting, but the values I'm saving are mostly just integers (integers perform better than bytes on modern CPUs). So instead of saving the voxel object itself, I only save a voxel if it was modified, and then I only save the new value (just an integer). And because there can be multiple modifications, it becomes an array like:

VoxelChunk[x + width * (y + depth * z)] = 0; // 0 = removed/no block
VoxelChunk[x + width * (y + depth * z)] = 1; // 1 = grass block.

There can be tens of thousands of arrays like that, each having between 1 and 125,000 entries (50x50x50). I only save the modifications; the rest is generated client-side from the seed. I'm not sure, but I think the built-in savegame system is slower than the FileManager (it creates a class and whatnot, which I don't need). I just need to save/load a 1D array, basically. Well okay, maybe an array of arrays, because one savegame file could probably hold multiple arrays to reduce the number of files.

So in this case is the savegame system really better than the FileManager (aside from it being easier to code in UE4)?

Crap… Now that I think about it, if I only modify the last block (index 49, 49, 49) then the array probably still has 125,000 entries, but all the other entries are empty… I may have to rethink that part… There's no point in saving, loading & networking 124,999 empty entries. Voxel games are more complex than they look -.-
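One common way around that (just a sketch, not something from this thread) is to keep the modifications sparse: store only the changed voxels in a map keyed by the linear index from above, and serialize the map instead of the full array:

#include "CoreMinimal.h"

// Sketch: sparse storage of modifications only, keyed by the linear voxel index.
// Unmodified voxels are regenerated from the seed, so they never hit disk or the network.
void ModifyVoxel(int32 X, int32 Y, int32 Z, int32 NewValue,
                 int32 Width, int32 Depth, TMap<int32, int32>& Modifications)
{
    const int32 Index = X + Width * (Y + Depth * Z);   // same indexing as above
    Modifications.Add(Index, NewValue);                // overwrites any earlier change
}

// TMap has its own operator<<, so the whole map can go straight into an FArchive:
// Archive << Modifications;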

If you need archives you can use
FArchive* OutputFile = IFileManager::Get().CreateFileWriter(OutPath, FILEWRITE_EvenIfReadOnly);
to write and
FArchive* FileAr = IFileManager::Get().CreateFileReader(*InFilename);
to read your data.

An archive allows you to save and load objects like you usually do in savegames.
To serialize ints, arrays or other primitives you just have to use operator<<.
You can also write a whole USTRUCT or UOBJECT to an archive:

	ObjectClass->SerializeBinEx(Archive, (uint8*)ObjectData, (uint8*)ObjectClass->GetDefaultObject(), ObjectClass);

The first argument is the archive to save the object to (or to load from, if the archive is a reader). The second is your object's memory dump; a pointer to an instance may be used. The third is a CDO, the class default state: the save/load process skips entries that never changed from the CDO, which may be helpful with voxels. The last argument is the class of the object given as the CDO; usually it's the same object class, but you can use a parent class to limit the exclusion of unchanged properties if you need to.

Also, I wrote about compression earlier: it's fast enough and can effectively remove your empty blocks from the resulting file.
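A rough sketch of compressing a buffer on the fly before writing it out (this follows the pattern from the tutorial linked below and assumes a newer engine version where the proxy takes a compression format FName such as NAME_Zlib; older versions take ECompressionFlags like COMPRESS_ZLIB instead):

#include "CoreMinimal.h"
#include "Serialization/BufferArchive.h"
#include "Serialization/ArchiveSaveCompressedProxy.h"
#include "Misc/FileHelper.h"

// Sketch: serialize the payload to a memory buffer, compress it, write it to disk.
void SaveCompressedChunk(const FString& FilePath, TArray<int32>& VoxelData)
{
    FBufferArchive RawData;
    RawData << VoxelData;                               // uncompressed payload

    TArray<uint8> CompressedData;
    FArchiveSaveCompressedProxy Compressor(CompressedData, NAME_Zlib);
    Compressor << RawData;                              // runs of empty blocks shrink well
    Compressor.Flush();

    FFileHelper::SaveArrayToFile(CompressedData, *FilePath);
}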

There is also a nice tutorial about that on the (now community-hosted) Unreal Engine Wiki: Read & Write Any Data to Compressed Binary Files.

The only thing is that a struct, I believe, is created on the heap instead of the stack (not 100% sure). And a CDO would likely perform even worse when created in insane numbers. BUT for determining which voxels to save to disk (only the modified ones), a CDO may be a good idea. I'd probably still prefer an additional integer value acting as a 'flag' that determines whether the value was modified or not.
Regardless, the methods you two posted should still work.

I suppose I can add multiple pieces of data to the same binary array without using structs, by just appending more, as long as I load them back in the same order:

FBufferArchive ToBinary;
ToBinary << Array1;
ToBinary << Array2;
ToBinary << Array3;
// etc.
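
And the matching load side could look roughly like this (a sketch only: FilePath is a placeholder, the arrays are assumed to be TArray<int32>, and the buffer would first have been written to disk, e.g. with FFileHelper::SaveArrayToFile):

#include "Serialization/MemoryReader.h"
#include "Misc/FileHelper.h"

// Sketch: read the saved bytes back and pull the arrays out in exactly
// the same order they were written.
void LoadArrays(const FString& FilePath, TArray<int32>& Array1,
                TArray<int32>& Array2, TArray<int32>& Array3)
{
    TArray<uint8> RawBytes;
    if (FFileHelper::LoadFileToArray(RawBytes, *FilePath))
    {
        FMemoryReader FromBinary(RawBytes, /*bIsPersistent=*/true);
        FromBinary << Array1;
        FromBinary << Array2;
        FromBinary << Array3;
        // etc.; the order must match the write order exactly
    }
}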

Well, it looks like the database idea is going to be dropped. Which I personally find a bit weird, because databases are meant for things like this. But the file-based approach seems much easier, probably performs just as well, if not better, and has more possibilities.