Data Quota Errors When Working with GitHub Fork of UE

This just started earlier today, but since its first occurrence, nothing I’ve done with my repo has been able to get past it:

Checking out files: 100% (84201/84201), done.
Downloading Engine/Binaries/DotNET/GitDependencies.exe (42 KB)
Error downloading object: Engine/Binaries/DotNET/GitDependencies.exe (d4d8f00): Smudge error: Error downloading Engine/Binaries/DotNET/GitDependencies.exe (d4d8f005cdf942439f2f105012fa76dcbe5952551811463e56e09bb111943cd8): batch response: This repository is over its data quota. Purchase more data packs to restore access.

Every single time I try to pull/push/fetch/checkout/etc., the operation ends up failing with that same error. Needless to say, I’ve spent a lot of time trying to fix it. The thing I can’t figure out is why the data quota would be an issue at all: both my personal GitHub account and my organization account (where my fork of UE lives) are well under any data limits (bandwidth or storage). It also isn’t affecting any of my other repositories (not even a private UE repo I have that isn’t a fork).

(I also can’t even clone my repo without it failing.)
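In case it’s useful to anyone hitting the same thing: skipping the LFS smudge filter should at least let a clone complete, since no LFS downloads are attempted (this assumes a reasonably recent git-lfs; the URL below is a placeholder):

# clone without invoking the LFS smudge filter (pointer files stay as pointers)
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/<your-org>/UnrealEngine.git

# or turn off smudging for an existing clone
git lfs install --local --skip-smudge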

Any help would be appreciated, because between when this occurred and the nature of the operations it’s interrupting, this has made for an annoying and pretty thoroughly blocked day.

More info:
My LFS migration info check finished just after I posted this (I only use LFS for a handful of directories unrelated to the base UE repo, and this was the only thing I found online that was even remotely related to data quotas), and… uh. The results were unexpected:

λ  git lfs migrate info --top=100
migrate: Fetching remote refs: ..., done
migrate: Sorting commits: ..., done
migrate: Examining commits: 100% (61668/61668), done
*.xml           85 GB     14213/14213 files(s)  100%
*.cpp           6.5 GB  167949/167952 files(s)  100%
*.h             1.8 GB  163860/163873 files(s)  100%
*.cs            613 MB    23856/23856 files(s)  100%
*.udn           338 MB    40440/40440 files(s)  100%
*.so            218 MB          62/62 files(s)  100%
*.pdb           144 MB        322/322 files(s)  100%
*.dylib         122 MB          18/18 files(s)  100%
*.ini           121 MB      1996/1999 files(s)  100%
*.gnf           118 MB          20/20 files(s)  100%
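(One caveat about reading those numbers, as far as I understand git-lfs: migrate info reports everything in history matching each extension, i.e. what could be migrated into LFS, not what is already stored in LFS. To see what is actually LFS-tracked right now, something like this should do it:)

# list files actually stored in LFS on the current branch
git lfs ls-files
# show the patterns tracked via .gitattributes
git lfs track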

For what it’s worth: on my forked repo, I keep all documentation, extras, templates, feature packs, and samples ignored (and generally deleted) and my ignore file only makes very explicit exceptions for a handful of items.

[EDIT] I eventually got my repo back to a “working” state, matching my last commit from yesterday exactly (I think; it’s doing a full rebuild). I basically removed all remotes (including my own origin), pruned everything, cloned a new repo into another folder (which failed to fully complete due to the data quota thing) and took its git folder, did a pretty hard nuke on all files, did a full setup again, re-attached my repo as the origin, and… well, that’s in line with my last commit. That’s all I’m touching for now.
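For anyone curious, the rough sequence was something like this (URLs and folder names are placeholders, and I’m assuming “full setup” means re-running Setup.bat from the repo root):

# drop every remote, then prune
git remote remove origin
git gc --prune=now
# fresh clone into a sibling folder (this is the step the quota error interrupted)
git clone https://github.com/<your-org>/UnrealEngine.git UnrealEngine-fresh
# after copying UnrealEngine-fresh/.git over the old repo's .git:
git reset --hard HEAD
git clean -xdf
# re-run the engine setup, then re-attach the fork as origin
Setup.bat
git remote add origin https://github.com/<your-org>/UnrealEngine.git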

Also: this all started when I changed a couple of files and tried to push them today; as of last night, everything was hunky-dory.

Just to add to this: GitHub got back to me, and this is apparently not caused by my repo itself, but by its being a fork of the UE4 repo. (As I understand it, LFS storage and bandwidth for a fork count against the root repository of the fork network rather than against the fork’s owner, which would explain why my own quotas looked fine.)

I have no idea how this is a thing, but I’m even more confused as to why no one else seems to be having issues.

Basically: don’t try to use LFS in an Unreal fork. It’s not going to work.

I’m not sure how GitDependencies got converted to an LFS file; the only things I use LFS for are binaries in my fork that are unrelated to the UE4 repo (custom engine-side libraries and assets), and this fork has been working fine for about a year. So “it’s not going to work” seems somewhat inaccurate. Other than GitDependencies (which, again, I have no idea how it got over to LFS, since it wasn’t done manually, and as far as I know files already committed to Git don’t move into LFS automatically), the only LFS-tracked files are not part of UE4’s repo.
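For reference, these are the sorts of checks that should show how (and when) a file like GitDependencies ended up LFS-tracked; the paths and the grep are just illustrative:

# is GitDependencies.exe currently an LFS pointer on this branch?
git lfs ls-files | grep GitDependencies
# which filter, if any, does .gitattributes assign to it?
git check-attr filter -- Engine/Binaries/DotNET/GitDependencies.exe
# inspect the committed blob; an LFS pointer starts with "version https://git-lfs..."
git show HEAD:Engine/Binaries/DotNET/GitDependencies.exe | head -c 200
# and when .gitattributes last changed
git log --oneline -- .gitattributes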