Closed: pontaoski closed this issue 2 years ago
I always use the console and docker logs during import; they contain information about the progress of file processing. Unfortunately, I do not always have remote access to that console.
Component creation has a log visible in the application. Repository scanning merely consists of a git clone.
...
To figure out which operation is really expensive, you can try it without Weblate:

```
git ls-remote --symref repo:url HEAD
git clone --depth 1 --branch repo:branch repo:url repo:destination
translation-finder repo:destination
```
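To see which of the three steps dominates, one option (a minimal sketch, assuming Python 3, and assuming you substitute real values for the `repo:*` placeholders above) is to time each command:

```python
import subprocess
import time

def timed(cmd):
    """Run a command and report its wall-clock duration in seconds."""
    start = time.perf_counter()
    result = subprocess.run(cmd, capture_output=True, text=True)
    elapsed = time.perf_counter() - start
    print(f"{' '.join(cmd)}: {elapsed:.2f}s (exit {result.returncode})")
    return elapsed

# Fill in the repo:* placeholders from the commands above, then:
# timed(["git", "ls-remote", "--symref", "repo:url", "HEAD"])
# timed(["git", "clone", "--depth", "1", "--branch", "repo:branch",
#        "repo:url", "repo:destination"])
# timed(["translation-finder", "repo:destination"])
```

Whichever step takes the bulk of the time tells you whether the clone or the translation discovery is the bottleneck.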
But with ~150k files, my guess is also that translation-finder is the bottleneck here, and https://github.com/WeblateOrg/weblate/issues/7251 could address this.
I've looked at translation-finder and there is a lot of room to improve performance there. https://github.com/WeblateOrg/translation-finder/commit/510ef7a2664d400b3f650a089cfbb3d6a051fdc2 should remove ~300k syscalls in your case.
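For context on where such syscalls typically come from (an illustrative sketch, not the actual translation-finder code or the change in that commit): a common pattern is issuing an extra `stat()` per file while walking a tree, when `os.scandir` already returns cached type information with each directory entry:

```python
import os

def files_with_stat(root):
    """One extra stat() syscall per entry: os.path.isfile stats each path."""
    found = []
    for name in os.listdir(root):
        path = os.path.join(root, name)
        if os.path.isfile(path):  # stat() under the hood
            found.append(name)
    return sorted(found)

def files_with_scandir(root):
    """os.scandir yields DirEntry objects whose is_file() can usually answer
    from information the OS already returned with the directory listing,
    avoiding a separate stat() per file."""
    with os.scandir(root) as entries:
        return sorted(e.name for e in entries if e.is_file())
```

On a tree with ~150k files, dropping one redundant syscall per file removes syscalls on the same order of magnitude as the commit mentions.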
This issue has been automatically marked as stale because there wasn’t any recent activity.
It will be closed soon if no further action occurs.
Thank you for your contributions!
Describe the issue
When importing large repositories into Weblate, the scanning phase takes a very long time without any indication of what exactly it is doing or how long it will take.
I already tried
Steps to reproduce the behavior
Expected behavior
Screenshots
No response
Exception traceback
No response
How do you run Weblate?
PyPI module
Weblate versions
Weblate deploy checks
No response
Additional context
No response