Closed stephensamonte closed 4 years ago
```
*.tif filter=lfs diff=lfs merge=lfs -text
*.cubemap filter=lfs diff=lfs merge=lfs -text
*.tga filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
*.raw filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
*.psd filter=lfs diff=lfs merge=lfs -text
*.mov filter=lfs diff=lfs merge=lfs -text
*.mb filter=lfs diff=lfs merge=lfs -text
*.duf filter=lfs diff=lfs merge=lfs -text
*.jpg filter=lfs diff=lfs merge=lfs -text
*.hdr filter=lfs diff=lfs merge=lfs -text
*.bin.fbx filter=lfs diff=lfs merge=lfs -text
*.uasset filter=lfs diff=lfs merge=lfs -text
*.umap filter=lfs diff=lfs merge=lfs -text
*.upk filter=lfs diff=lfs merge=lfs -text
*.udk filter=lfs diff=lfs merge=lfs -text
```
Enabled GitHub LFS
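The tracking rules above live in `.gitattributes`; with the `git-lfs` CLI installed they are normally generated by running `git lfs track "*.png"` and so on for each pattern. A minimal sketch of producing the same rules by hand in a scratch repository (the pattern list is shortened here, and the paths are illustrative):

```shell
# Create a throwaway repository to demonstrate in.
repo=$(mktemp -d)
git init --quiet "$repo"
cd "$repo"

# Each `git lfs track "<pattern>"` call appends a line like these
# to .gitattributes; writing the lines directly has the same effect.
for ext in png jpg wav psd uasset umap; do
  echo "*.${ext} filter=lfs diff=lfs merge=lfs -text" >> .gitattributes
done

# Stage the file so collaborators pick up the same rules on clone.
git add .gitattributes
cat .gitattributes
```

Note that LFS only affects files added *after* the rules are in place; binaries already committed to history stay in the regular object store unless the history is rewritten (e.g. with `git lfs migrate`).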
I think there's an issue with the Git versioning. It's taking forever to update a copy. It's most likely that all versions of large binary files are saved in the history. https://stackoverflow.com/questions/3055506/git-is-very-very-slow-when-tracking-large-binary-files
💯 there's something wrong with Git because it takes way too long, but I don't know what. Everything seems fine otherwise. I think it's because the repository already has a very long history of changes, but that shouldn't be an issue.
This is interesting. The repository is currently 9.46 GB, but 5 GB of that is the .git folder.
Diagnosing why Git is so slow
Managing Huge Repositories with Git
https://stackoverflow.com/questions/5613345/how-to-shrink-the-git-folder
You should not delete all changes older than 30 days (I think it's somehow possible by exploiting Git, but it's really not recommended).
You can call `git gc --aggressive --prune`, which performs garbage collection in your repository and prunes old objects. Do you have a lot of binary files (archives, images, executables) which change often? Those usually lead to huge .git folders (remember, Git stores a snapshot for each revision, and binary files compress badly).
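The gc step can be sketched in a scratch repository like this (paths and sizes are illustrative; `--prune=now` is used so the effect is visible immediately, whereas a bare `--prune` keeps anything newer than two weeks):

```shell
# Build a throwaway repository with a few commits of a binary file.
repo=$(mktemp -d)
cd "$repo"
git init --quiet
git config user.email demo@example.com   # identity needed for commits
git config user.name demo

# Simulate a changing binary asset: each commit stores a new blob.
for i in 1 2 3; do
  head -c 100000 /dev/urandom > asset.bin
  git add asset.bin
  git commit --quiet -m "update asset ($i)"
done

du -sh .git    # size before garbage collection

# Repack loose objects aggressively and drop unreachable ones now.
git gc --aggressive --prune=now --quiet

du -sh .git    # size after: all loose blobs are now in a pack
```

After the gc run, `git count-objects` reports zero loose objects; everything reachable has been repacked into `.git/objects/pack/`. Random binary data like this compresses poorly, which is exactly why such files bloat history.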
Removing large files from Git without losing history
GitHub Help: Working with large files
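Before removing large files from history, it helps to know which blobs actually dominate it. A sketch using plain Git plumbing (the file names here are made up; in the real repository this would list the offending assets):

```shell
# Throwaway repository with one small and one large file.
repo=$(mktemp -d)
cd "$repo"
git init --quiet
git config user.email demo@example.com
git config user.name demo

echo "readme" > README.md
head -c 500000 /dev/urandom > big_texture.bin
git add .
git commit --quiet -m "add files"

# List every object in history with its type, size, and path,
# then keep only blobs and sort by size (largest last).
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectsize) %(rest)' \
  | awk '$1 == "blob" { print $2, $3 }' \
  | sort -n
```

The last line of the output names the biggest blob in history, which is the first candidate for LFS migration or history rewriting.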
GitHub Desktop Issue to support Shallow Copies: https://github.com/desktop/desktop/issues/3480
Shallow copy: `git fetch --depth 3`
How to get a shallow clone: `git clone --depth n`, then `git fetch --depth n` to get more history.
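A sketch of what a shallow clone looks like against a local repository (the `file://` source stands in for the real remote):

```shell
# Build a source repository with five commits.
src=$(mktemp -d)
git -C "$src" init --quiet
git -C "$src" config user.email demo@example.com
git -C "$src" config user.name demo
for i in 1 2 3 4 5; do
  echo "$i" > "$src/file.txt"
  git -C "$src" add file.txt
  git -C "$src" commit --quiet -m "commit $i"
done

# Shallow clone: only the most recent commit is downloaded.
dst=$(mktemp -d)
git clone --quiet --depth 1 "file://$src" "$dst/clone"
git -C "$dst/clone" rev-list --count HEAD    # prints 1

# Deepen the history later if it is needed.
git -C "$dst/clone" fetch --quiet --depth 3
git -C "$dst/clone" rev-list --count HEAD    # prints 3
```

The trade-off is that some operations (blame across old commits, describing tags) need history the shallow clone doesn't have, so it suits CI and large asset-heavy projects more than day-to-day archaeology.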
@AlisaCK We may have to switch to the Git terminal to be able to make shallow clones of the project. This tells Git to fetch only a certain number of commits rather than the entire history.
I re-cloned the project and now .git seems fine. It's no longer 5.4 GB; it's back to 2 GB.
The project is currently 2.3 GB at the time of this writing, which leads to very slow fetching, pushing, pulling, branch switching, etc. against GitHub.
How to handle big repositories with Git