sluongng opened 8 months ago
I'm not sure whether we can really pull off a "Go SDK to action definition" automation. But maybe we could implement some kind of diff test in rules_go that compares `-x` output with whatever Gazelle and rules_go generate? That could help us stay closer to vanilla Go and also detect regressions and missing features on SDK updates.
In a typical open source Go project, the build often happens by invoking `go build <package>` or `go install <package>` (which, in turn, invokes the former). The idea is that with `go build -a -p=1 <package>`, we force everything to rebuild sequentially. By adding `-x` into the mix, we get an "execution log" of all actions that `go build` performs. Using this execution log, we should be able to map each action to a rules_go action and corresponding rule to construct the appropriate BUILD files.

Use cases:
- Split `GoStdlib` into smaller packages. Right now we are installing the Go standard library as one big bulk using `go install ./...`. We should be able to generate fine-grained BUILD files for the stdlib this way, meaning that the cache would be more granular: a typical project would not need to compile and install the `net` stdlib package unless it depends on it.
- Be a lot more flexible to changes in the upstream `go build` tool. For example: the recent coverage change is taking us quite a long time to reflect in rules_go. We are also having a harder time adopting the new BuildID and ended up filtering BuildID out entirely.

Footnotes:
- Leveraging `-x` is what rules_go maintainers have to do anyway each time `go build` is updated upstream. We should find a way to automate this discovery process entirely.
- Perhaps Gazelle is not a good place for this? Instead of generating BUILD files, we would first want to generate a custom set of Starlark rules to define the set of actions used. Then BUILD file generation in Gazelle can kick in. 🤔
cc: @fmeum