Closed — webextensions closed this issue 1 month ago
Hey webextensions! Thanks for the detailed bug report!
Do you know if running git gc mitigates the issue?
I still agree that there is value in a setting to not blame files that are too large, but it's very difficult to know how character count, line count, the number of commits the file is part of, or the number of commits in general influence the resource requirements.
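For anyone wanting to try the git gc suggestion, a minimal sketch follows; the file path is illustrative, and the timing command mirrors the one used later in this thread:

```shell
# Repack objects and prune loose ones; a freshly packed repository can
# sometimes speed up blame on files with long histories.
git gc

# Re-time the blame on the suspect file afterwards to compare.
time git blame -C --incremental -- package-lock.json | wc -l
```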
@Sertion Thank you for the response.
git gc might have made it work intermittently now (for one particular file), but that doesn't solve the core problem.
It seems that both large file size and long history play a part.
For a particular package-lock.json file, it works and fails randomly across multiple attempts. In case the following helps to give a picture of the file size and command time:
$ time git blame -C --incremental -- /path/to/project/package-lock.json | wc -l
23056
real 0m7.143s
user 0m6.946s
sys 0m0.169s
For another use case where the file contains more than 100,000 lines and is part of over 1,000 commits, the CPU usage never comes down.
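For reference, the two size metrics quoted in this thread (line count and the number of commits touching the file) can be gathered with standard tools; the path here is illustrative:

```shell
# Total line count of the file.
wc -l < package-lock.json

# Number of commits that touched the file on the current branch.
git rev-list --count HEAD -- package-lock.json
```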
A way to exclude specific files by name/glob would probably also be a decent workaround.
PS: GitLens seems to work fine even for the file with 100,000 lines (it might take a few seconds to load/render, but not too long considering the scale).
I've spent some time trying to figure this out and have made a pre-release version with a few changes. It would mean a lot if you could try it out and report back if it solved the issue!
It would also be helpful to know what state (the S column in htop) the process is in when it locks at 100%, if the pre-release version does not resolve the issue.
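The process state can also be checked without an interactive htop session; a quick sketch using ps (the git grep pattern is illustrative):

```shell
# STAT is htop's S column: R = running, S = interruptible sleep,
# D = uninterruptible disk wait, Z = zombie.
ps -o pid,stat,comm -p "$$"                   # current shell, as a demo

# Any running git processes and their state (|| true: empty is fine).
ps axo pid,stat,args | grep '[g]it' || true
```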
@Sertion Thank you for the update.
The git blame operation runs with state R, and the code instance (/usr/share/code/code --type=utility --utility-sub-type=node.mojom.NodeService --lang=en-US ...) that is eating up the CPU also runs with state R.
I tried out the pre-release version, but it didn't seem to help much. It still works only intermittently for the file with 20,000+ lines (out of 10 attempts, around 7 times it did not load and CPU usage stayed at 100% for a code instance). When it works, CPU usage returns to normal once loading/rendering completes, within 5 to 20 seconds (a rough range; I didn't monitor accurately). Otherwise, the code instance's CPU usage stays at 100% until that particular instance is killed or the extensions are disabled.
Thank you for your help but I have to admit that I'm stumped. I've spent a few hours trying to reproduce the issue with this quite large file (25764 lines, 1678 commits) and have not been able to do so. I've made another pre-release version with some very minor changes from the last but I am unsure if it will solve anything.
Thank you for trying further.
While I have yet to try the next pre-release version, I attempted to further establish reproducibility.
I was also unable to reproduce it with the next.js project in the few attempts I tried :+1:
But https://github.com/facebook/create-react-app/blob/main/package-lock.json fails on my machine 3 to 6 times out of 10 attempts. It has more than 50k lines of content.
@Sertion gitblame-10.11.1-pre-release-2.zip doesn't seem to help. The problematic behavior is still the same.
The pre-release below adds gitblame.maxLineCount: a setting that prevents files with more lines than the configured value from being blamed at all.
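Assuming gitblame.maxLineCount is configured like any other VS Code setting, it would go in settings.json; the value below is illustrative, not a documented default:

```jsonc
// .vscode/settings.json — skip blaming files longer than 20000 lines
{
  "gitblame.maxLineCount": 20000
}
```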
@Sertion This solution / workaround is helping the situation. Thank you :smile:
Marking closed :+1:
Environment:
Steps to reproduce:
1. Use the Git Blame extension on a large file (e.g. package-lock.json). In my case, the file has been part of more than 300 commits ($ git rev-list --count HEAD -- package-lock.json), not sure if that is also relevant for this issue.
2. Observe in htop: initially a git blame operation runs for a few seconds (e.g. between 2 and 7 seconds), and after that, code consumes 100% CPU and that number never comes down.

Suggested solution: In case it is tough to solve the issue due to the file size/history being too large, kindly consider providing a manual and/or automatic turn-off feature for this extension based on the file name and/or content length.
Similar issues: