Open TFTomSun opened 2 years ago
> Are there any network calls done, or is the git height determined based on the locally cloned repository?
It's all done based on local git history.
> We generate version.json files per project with longer lists of path filters.
NB.GV does a lot of caching to prevent the git history walk from being repeated during a build, but when you have a separate version.json file per-project, each project has to do its own walk, compounding the time required. I don't know how much path filters impact the build perf.
What version of Nerdbank.GitVersioning are you using? The 3.4+ versions default to a much faster all-managed implementation of git.
We use the latest 3.5.* version, but the behavior is the same with 3.4. So I assume it gets worse because of the increasing number of projects and/or path filters. Is 1.2 seconds per GetBuildVersionCore run a usual duration? I'd like to know whether the number of GetBuildVersionCore executions is the problem or the duration of each execution.
We generate a per-project version.json to increase the version of a project when changes have been made in the project folder or in one of its dependencies. We wanted to avoid all projects in the repo getting a new version even when neither they nor their dependencies were touched. Is that approach not recommended?
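For reference, a per-project version.json of the kind described might look something like this (the directory names are hypothetical; in NB.GV, pathFilters entries are interpreted relative to the version.json's directory, while a leading `/` makes them relative to the repo root):

```json
{
  "version": "1.2",
  "pathFilters": [
    ".",
    "/src/SomeSharedDependency"
  ]
}
```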
> Is 1.2 seconds per GetBuildVersionCore run a usual duration?
That's definitely on the long side of the curve, but it's not completely crazy. I just tried on a large repo and it took 395ms for one call. It gets slower with the 'version height', so rev'ing the version in version.json will reset it to its 'minimum' for your particular project, path filters, etc.
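The walk described above can be modeled with a minimal sketch. This is not Nerdbank.GitVersioning's actual implementation, just a simplified model: walk the history newest-first, count commits that touch the filtered paths, and stop at the commit that last changed the version field.

```python
def matches(path, path_filters):
    """True if `path` equals or is nested under any filter directory.
    (A simplified stand-in for real path-filter matching.)"""
    return any(path == f or path.startswith(f.rstrip("/") + "/")
               for f in path_filters)

def git_height(commits, path_filters):
    """`commits` is newest-first; each commit is a dict with 'paths'
    (the files it touched) and 'version_bump' (True where the version
    field in version.json was changed)."""
    height = 0
    for commit in commits:
        if any(matches(p, path_filters) for p in commit["paths"]):
            height += 1
        if commit["version_bump"]:
            break  # the height resets at the last version change
    return height

# Hypothetical four-commit history, newest first:
history = [
    {"paths": ["src/A/Program.cs"], "version_bump": False},
    {"paths": ["docs/readme.md"], "version_bump": False},
    {"paths": ["src/A/version.json"], "version_bump": True},
    {"paths": ["src/A/Old.cs"], "version_bump": False},
]
print(git_height(history, ["src/A"]))  # 2: walk stops at the version bump
```

Every additional commit since the last version bump lengthens this walk, which is why rev'ing the version in version.json brings the height (and the cost of computing it) back to its minimum.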
> We generate a per-project version.json to increase the version of a project when changes have been made in the project folder or in one of its dependencies. We wanted to avoid all projects in the repo getting a new version even when neither they nor their dependencies were touched. Is that approach not recommended?
Personally, I have never bothered with that. It's very difficult to maintain (it sounds like you have a tool that helps with that, which is great), but mostly it's a hassle, and I haven't seen evidence that it pays off for customers. If I have a repo that builds a bunch of packages, I tend to find it more useful to push all the packages together; everyone can then readily see that a set of packages go together because all their version numbers match exactly. There are certainly arguments on both sides of this.
By having just one version.json file in the repo (and no path filters), you become compatible with the most effective build perf optimizations that drop GetBuildVersionCore impact on your build to almost zero.
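Concretely, that simple setup is a single version.json at the repository root and nothing else (the version number here is illustrative):

```json
{
  "version": "1.2"
}
```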
Thanks, I read the whole history and it seems there was already a good solution to the problem. I just wonder what happened to this PR: https://github.com/dotnet/Nerdbank.GitVersioning/pull/499#issuecomment-698676161
Good point. That PR may have slipped through the cracks. I've created another.
thanks... just for tracking purposes, here's a link to the new PR https://github.com/dotnet/Nerdbank.GitVersioning/pull/801
We have recently been seeing very bad performance in our builds, caused by GetBuildVersionCore, in a solution with about 25 projects. Each call takes about 1.2 to 1.5 seconds.
What could be the root cause of that problem? Are there any network calls done, or is the git height determined based on the locally cloned repository?
We generate version.json files per project with longer lists of path filters. Could that cause problems?