I've tried adding 100 GB of LFS data packs to my GitHub account, but I still get:

"This repository is over its data quota. Purchase more data packs to restore access."

For this repo:

https://github.com/aarch64-laptops/prebuilt

I've also tried forking the repo to a new repo under my account, but I get the same issue. Is there any other way to get these large files?

1 Answer

VonC

Forking wouldn't solve the problem, as stated in nabla-c0d3/nassl issue 17:

To be honest, I'm very disappointed with how GitHub supports git-lfs for public projects. Charging for storage and bandwidth makes no sense, especially since switching to git-lfs reduces bandwidth usage / cost for GitHub, compared to just storing big files in git. The fact the quotas are for every fork, which I did not know, is even crazier.

I agree with you that git-lfs leaves a lot to be desired, and I think you raise an excellent point about charging users for something that saves GitHub money.

If you don't care about the history of those large files, you could, within the additional credit you purchased:

  • create a new local repository
  • add one of those large files in it
  • add, commit, and push
  • repeat for each remaining large file, and see whether the process blocks at any point (a command-line sketch follows this list)
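
A rough sketch of that per-file workflow, assuming you can obtain the files themselves from somewhere other than the blocked repository; the repository name (large-files-mirror), the *.img pattern, and the file paths are hypothetical placeholders, not names taken from aarch64-laptops/prebuilt:

    # Minimal sketch of the per-file workflow above.
    # "large-files-mirror", "*.img", and the file paths are placeholders.
    git init large-files-mirror
    cd large-files-mirror

    # Track the large binaries with git-lfs so they count against
    # your LFS data pack rather than regular repository storage.
    git lfs install
    git lfs track "*.img"
    git add .gitattributes

    # Add ONE large file, then commit and push before moving on.
    cp /path/to/first-large-file.img .
    git add first-large-file.img
    git commit -m "Add first large file"
    git remote add origin https://github.com/<your-user>/large-files-mirror.git
    git push -u origin HEAD

    # Repeat the copy / add / commit / push cycle for each remaining
    # file, and watch whether any push is rejected for exceeding the quota.

Pushing one file per commit keeps each transfer small, so you can see exactly which push, if any, hits the storage or bandwidth limit.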