Closed: nathanael-ruf closed this issue 3 weeks ago
Same here, still loading
I deleted the files at the location listed under "Data Files" on this page: https://docs.gitbutler.com/development/debugging
That fixed both the loading issue and the 400+ changes in `git status`.
I only just downloaded the app though, so I'm not sure whether that might cause any data loss.
To me this sounds like an error occurred but the UI didn't notice, so it keeps waiting for a successful response that never arrives.
Are there errors in the logs you could share?
I have never been able to see the “Workspace” part of the app, only the loading spinner.
Here is the tail of the log:
2024-05-06T17:46:02.512024Z INFO handle: crates/gitbutler-watcher/src/handler/mod.rs:70: new event=GitFileChange(a93a347c-39a1-48f9-9fbd-741efca4b4ca, FETCH_HEAD)
2024-05-06T17:46:02.512080Z INFO handle:calculate_virtual_branches: crates/gitbutler-watcher/src/handler/mod.rs:162: new event=GitFileChange(a93a347c-39a1-48f9-9fbd-741efca4b4ca, FETCH_HEAD)
2024-05-06T17:46:02.512548Z INFO list_remote_branches: crates/gitbutler-tauri/src/virtual_branches.rs:439: new project_id=a93a347c-39a1-48f9-9fbd-741efca4b4ca
2024-05-06T17:46:02.551819Z WARN list_remote_branches: crates/gitbutler-core/src/virtual_branches/remote.rs:117: ignoring branch Some("refs/remotes/origin/HEAD") as peeling failed err=NotFound(Error { code: -3, klass: 3, message: "the reference 'refs/remotes/origin/HEAD' cannot be peeled - Cannot resolve reference" }) project_id=a93a347c-39a1-48f9-9fbd-741efca4b4ca
2024-05-06T17:46:02.551937Z INFO list_remote_branches: crates/gitbutler-tauri/src/virtual_branches.rs:439: close time.busy=39.4ms time.idle=10.0µs project_id=a93a347c-39a1-48f9-9fbd-741efca4b4ca
2024-05-06T17:46:02.613757Z INFO get_base_branch_data: crates/gitbutler-tauri/src/virtual_branches.rs:104: new project_id=a93a347c-39a1-48f9-9fbd-741efca4b4ca
2024-05-06T17:46:02.653396Z INFO get_base_branch_data: crates/gitbutler-tauri/src/virtual_branches.rs:104: close time.busy=39.6ms time.idle=22.2µs project_id=a93a347c-39a1-48f9-9fbd-741efca4b4ca
2024-05-06T17:46:02.655417Z INFO list_remote_branches: crates/gitbutler-tauri/src/virtual_branches.rs:439: new project_id=a93a347c-39a1-48f9-9fbd-741efca4b4ca
2024-05-06T17:46:02.694873Z WARN list_remote_branches: crates/gitbutler-core/src/virtual_branches/remote.rs:117: ignoring branch Some("refs/remotes/origin/HEAD") as peeling failed err=NotFound(Error { code: -3, klass: 3, message: "the reference 'refs/remotes/origin/HEAD' cannot be peeled - Cannot resolve reference" }) project_id=a93a347c-39a1-48f9-9fbd-741efca4b4ca
2024-05-06T17:46:02.695008Z INFO list_remote_branches: crates/gitbutler-tauri/src/virtual_branches.rs:439: close time.busy=39.6ms time.idle=13.5µs project_id=a93a347c-39a1-48f9-9fbd-741efca4b4ca
More likely: I have realised that some folders, like a Python virtual environment, make `git status -s | wc -l` total more than 28,000 files. This seems to be too much for GitButler.
Same problem here, no errors in the log. I have 378 changes, a lot, but not crazy.
Thanks for sharing! The logs look like it's doing the same thing over and over again, getting into a self-inflicted loop: a filesystem change triggers an action, which triggers a filesystem change again, and so on.
378 changes aren't a problem, I believe; it's something else.
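The loop I'm suspecting can be sketched (with entirely hypothetical names, just to illustrate the shape of the problem) as an event queue that refills itself:

```rust
use std::collections::VecDeque;

// Hypothetical simulation of the suspected feedback loop: handling an event
// for a file re-writes that same file, so the watcher immediately queues the
// event again and the queue never drains on its own.
fn handle_events(initial: &str, cap: usize) -> usize {
    let mut queue = VecDeque::from([initial.to_string()]);
    let mut handled = 0;
    while let Some(path) = queue.pop_front() {
        handled += 1;
        if handled >= cap {
            break; // a real app without this cap would spin forever
        }
        // side effect of handling: the same file changes again,
        // and the file monitor reports it as a fresh event
        queue.push_back(path);
    }
    handled
}

fn main() {
    // the queue only empties because we gave up, not because work finished
    assert_eq!(handle_events("FETCH_HEAD", 10), 10);
    println!("loop never drains without filtering self-inflicted events");
}
```

Filtering out events for paths the handler itself writes would break the cycle.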
Oh, now that I remember: when opening the Linux kernel repository, it took 2 hours because it went through all the history to eagerly fetch profile pictures. Maybe the same is happening here?
This would mean that opening a clone created with `git clone --depth 1` would work. Maybe that's something you could try?
Also, creating a new branch didn't work until I stacked all changes. Then it created a lot of branches, because I had clicked the "new branch" button many times while it was unresponsive.
I think it has something to do with a large modified `yarn.lock` file. Other changes are fine, but my lock file is 40,000 lines with a lot of diffs.
Thank you! Indeed I have also heard that large diffs (or small diffs of large files?) can take a long time to compute.
And since this issue isn't about 'stuck loading after adding project', I think there is no need to see if `git clone --depth 1` changes anything.
Since I reproduced it on Windows, I thought I'd look into it, but I couldn't see what the actual problem is, except for a project watcher not being available, something that might be more related to a race than anything else.
The endless spinner I saw was unrelated to big files or big repositories; it was something else entirely, and I couldn't make sense of it. I could, however, catch on video what happened here.
https://github.com/gitbutlerapp/gitbutler/assets/63622/ce921d45-6dcb-4847-aa22-2587b440da58
The setup is the latest master with modifications from here, with exactly the same state compiled as a release binary. The release binary apparently has issues with watching files and shows the endless spinner, whereas the debug/dev version has no such issues. Further, once the dev version had changed the vbranches once, the nightly build would at least be able to open the project again. However, trying to change anything with it would do nothing and then result in an endless spinner.
It's notable that the release version has a particular error in its logs, which might well be the reason for the lack of update:
ERROR fetch_from_target: crates\gitbutler-tauri\src\virtual_branches.rs:545: error=watcher not found project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c action=Some("auto")
The `emit_vbranches()` function is what fails like that.
How come that it's only happening in release mode? Is something too fast?
Also worth noting: This issue only reproduces for me on Windows right now.
Further research shows the following (improved) error:
2024-05-25T16:50:44.067645Z INFO git_head: crates\gitbutler-tauri\src\commands.rs:68: new project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c
2024-05-25T16:50:44.067706Z INFO set_project_active:watch: crates\gitbutler-tauri\src\watcher.rs:100: new id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c
2024-05-25T16:50:44.074662Z INFO fetch_from_target: crates\gitbutler-tauri\src\virtual_branches.rs:487: new project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c action=Some("auto")
2024-05-25T16:50:44.119971Z INFO git_head: crates\gitbutler-tauri\src\commands.rs:68: close time.busy=51.9ms time.idle=380µs project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c
2024-05-25T16:50:44.186607Z ERROR fetch_from_target: crates\gitbutler-tauri\src\virtual_branches.rs:545: error=matching watcher to post event not found, wanted 31ca6173-bfb9-43d0-a18f-3e84250b7f6c, got None project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c action=Some("auto")
Despite seeing a `set_project_active:watch` call for `7f6c`, the watcher is unavailable for query in `fetch_from_target`, or should I say "not yet available"?
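A minimal sketch of what I suspect is happening, using made-up types rather than GitButler's actual ones: commands look up a per-project watcher in a shared registry, but registration completes asynchronously, so a command that runs too early finds the slot empty.

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex};

// Hypothetical registry of per-project watchers, keyed by project id.
#[derive(Default, Clone)]
struct Watchers(Arc<Mutex<HashMap<String, ()>>>);

impl Watchers {
    // registration, as done by something like `set_project_active:watch`
    fn watch(&self, project_id: &str) {
        self.0.lock().unwrap().insert(project_id.to_string(), ());
    }

    // posting an event, as done by something like `fetch_from_target`
    fn post(&self, project_id: &str) -> Result<(), String> {
        if self.0.lock().unwrap().contains_key(project_id) {
            Ok(())
        } else {
            Err(format!("watcher not found for {project_id}"))
        }
    }
}

fn main() {
    let watchers = Watchers::default();
    // release build: the post races ahead of registration and fails
    assert!(watchers.post("31ca6173").is_err());
    watchers.watch("31ca6173");
    // once registration has finished, the same post succeeds
    assert!(watchers.post("31ca6173").is_ok());
}
```

If this is the shape of it, the debug build may simply be slow enough that registration wins the race.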
Later in the log, we see this:
2024-05-25T16:50:44.335125Z INFO set_project_active:watch:file monitor: crates\gitbutler-watcher\src\file_monitor.rs:115: new id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c
2024-05-25T16:50:44.357996Z INFO list_virtual_branches:workdir: crates\gitbutler-core\src\git\diff.rs:131: close time.busy=41.6ms time.idle=199µs project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c commit_oid=Oid { oid: 3021ce35397e174ed6904b2c1c0e6c69c5046ab9 }
2024-05-25T16:50:44.359506Z INFO list_virtual_branches: crates\gitbutler-tauri\src\virtual_branches.rs:39: close time.busy=108ms time.idle=444µs project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c
2024-05-25T16:50:45.091689Z INFO set_project_active:watch:file monitor:handle debounced events: crates\gitbutler-watcher\src\file_monitor.rs:118: new id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c
2024-05-25T16:50:45.093098Z INFO set_project_active:watch:file monitor:handle debounced events: crates\gitbutler-watcher\src\file_monitor.rs:118: close time.busy=1.20ms time.idle=229µs id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c project_id=31ca6173-bfb9-43d0-a18f-3e84250b7f6c fs_events=10 fs_events=10 ignored=0 ignored=0 git_noop=10 git_noop=10 git=0 git=0 project=0 project=0
A whopping ~240 milliseconds later, the file_monitor, a crucial part of the watcher, finally starts. Later, filesystem events come in as well. But the system never recovers from that first missed `emit_vbranches` call.
The action posted there is implemented by the last piece of business logic still executed in the watcher, and it is still used by the watcher internally.
Ideally, it would just be re-implemented by hand where needed, but in order to keep it DRY, we'd have to ensure the watcher is available before everything else happens.
And indeed, with this patch…
diff --git a/app/src/routes/[projectId]/+layout.ts b/app/src/routes/[projectId]/+layout.ts
index 2f10753fa..80037da9e 100644
--- a/app/src/routes/[projectId]/+layout.ts
+++ b/app/src/routes/[projectId]/+layout.ts
@@ -29,7 +29,10 @@ export async function load({ params, parent }) {
let project: Project | undefined = undefined;
try {
project = await projectService.getProject(projectId);
- invoke('set_project_active', { id: projectId }).then((_r) => {});
+ // We need to await this as some initialization done here is needed
+ // for follow-up commands we will emit. If it's not yet done or ready,
+ // these commands fail and put the UI in an invalid state.
+ await invoke('set_project_active', { id: projectId });
} catch (err: any) {
throw error(400, {
message: err.message
the watcher-related error goes away. Unfortunately, the issue persists, and there is no error left in the log. The next question could be how a dev version of the UI can 'fix' the project.
Comparing the `.git/gitbutler` content yielded some interesting insights. For example, a broken variant of this folder has a `virtual_branches.toml` with the following content:
ownership = """
hi:1-4-806ecdc5607fadd8c9604bd53db5659b
nothing:0-0-d41d8cd98f00b204e9800998ecf8427e-1716650736195
"""
The timestamp is notably missing on the 'hi' file.
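Judging purely from the two entries above, an ownership entry seems to follow `path:start-end-hash[-timestamp]`. A speculative check for the missing field (a hypothetical helper, not GitButler code):

```rust
// Sketch, assuming the `path:start-end-hash[-timestamp]` shape inferred from
// the two ownership lines above: an entry whose claim has fewer than four
// `-`-separated fields is missing its timestamp.
fn missing_timestamp(entry: &str) -> bool {
    entry
        .split_once(':')
        .map(|(_path, claim)| claim.split('-').count() < 4)
        .unwrap_or(false)
}

fn main() {
    // the broken 'hi' entry lacks the trailing timestamp
    assert!(missing_timestamp("hi:1-4-806ecdc5607fadd8c9604bd53db5659b"));
    // the 'nothing' entry carries all four fields
    assert!(!missing_timestamp(
        "nothing:0-0-d41d8cd98f00b204e9800998ecf8427e-1716650736195"
    ));
}
```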
The folders are attached here: dot_git-gitbutler.zip
But there was no need to go into detail here, as another reproduction is to `rm -Rf .git/gitbutler` and then activate the `dua-cli` repository again. Now one gets a `.git/gitbutler` directory with just this file:
❯ cat .git/gitbutler/virtual_branches.toml
[default_target]
branchName = "main"
remoteName = "origin"
remoteUrl = "https://github.com/Byron/dua-cli"
sha = "66e01667ddf6cc8da8a2387ef9d457aef63f42a5"
pushRemoteName = "origin"
[branch_targets]
[branches]
And the UI popped up a timeout while loading virtual branches. Nothing was emitted into the log file after setting the project.
Finally, this could be boiled down to this dataset, which also fails on MacOS in release and dev versions:
Timeouts due to performance issues aren't covered by this, but it's a step in the right direction.
If one of you can produce a sample repo that shows the performance issue, I'd love to get my hands on it as it should then be fixable one way or another.
Great that you're looking into this! I'm on macOS, btw.
If you are on nightly, this could already work for you. But if it doesn't and there is high CPU usage, it's probably a performance issue, and it would be great if the file (like `yarn.lock`) could be shared so I can test with it.
Oh yes, nightly does not crash with the same lock file, works a lot faster!
That's great, and also interesting; at least it probably wasn't me who fixed it 😁.
Edit: I will just keep waiting for reproducible performance issues, and if nothing comes up for a while, close this issue.
I'm glad to hear this is fixed! Feel free to create another issue if you come across any other problems. Thanks!
When I first tried GitButler it worked as expected. I then closed it, worked with git branches as usual, and later opened GitButler again. It then prompted me to switch back to its branch, which I accepted. Now it's just showing a loading spinner where the workspace branches normally are. I can still see the trunk and the remote branches on the left. A `git status` shows pretty much every file in the repo as `new file`, so I'm not sure what it's doing. Restarting doesn't help. It's a real bummer because I "stashed" a virtual branch that is now unretrievable. Let me know if I can provide more info somehow. Does GitButler store some state for a repo somewhere that I can manually delete?