Open quic-sbjorkle opened 1 week ago
What does FOO look like here? In particular, does it contain any ~ characters?
bazel-out/ubuntu22-fastbuild/bin/external/FOO/external/FOO/lib/modified_runpath/libBar.so.5
also looks very weird as it has two external segments. Is that what you see in the cache entry or is it a result of redacting information?
@fmeum, no. Just [A-Z0-9_] characters.
bazel-out/ubuntu22-fastbuild/bin/_solib_k8/_U@@CLibs_UGt_Ulinux_U64_S_S_CCLibs_UGt_Ucc_Ulibrary___Uexternal_SCLibs_UGt_Ulinux_U64_Sexternal_SCLibs_UGt_Ulinux_U64_Slib_Smodified_Urunpath/libBar.so.5
You are correct with the observation of 2 external/ directories: 333424, bazel-out/ubuntu22-fastbuild/bin/external/CLibs_Gt_linux_64/external/CLibs_Gt_linux_64/lib/modified_runpath/libBar.so.5
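To make the mangled _solib_k8 directory name above easier to compare with the plain path, here is a small decoder sketch. The escape mapping (_U for "_", _S for "/", _C for ":") is inferred from the paths in this issue, not taken from Bazel's sources, so treat it as an assumption:

```python
# Sketch of a decoder for the mangled _solib_k8 directory names seen in
# this issue. The escape mapping (_U -> '_', _S -> '/', _C -> ':') is
# inferred from the paths above, not from Bazel's source code.
ESCAPES = {"U": "_", "S": "/", "C": ":"}

def demangle(name: str) -> str:
    out, i = [], 0
    while i < len(name):
        # A two-character escape: '_' followed by a known escape letter.
        if name[i] == "_" and i + 1 < len(name) and name[i + 1] in ESCAPES:
            out.append(ESCAPES[name[i + 1]])
            i += 2
        else:
            out.append(name[i])
            i += 1
    return "".join(out)

mangled = ("_U@@CLibs_UGt_Ulinux_U64_S_S_CCLibs_UGt_Ucc_Ulibrary__"
           "_Uexternal_SCLibs_UGt_Ulinux_U64_Sexternal_SCLibs_UGt_"
           "Ulinux_U64_Slib_Smodified_Urunpath")
# Yields a label-like prefix followed by a path that contains the
# external/CLibs_Gt_linux_64 segment twice, matching the observation above.
print(demangle(mangled))
```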
@justinhorvitz Not sure who to ask about this, but do you happen to have an idea how such corrupted paths could end up in the action cache?
@bazel-io flag
@bazel-io fork 7.3.0
The action cache indexes the path strings of outputs' exec paths. I'm not sure why you've concluded that there is action cache corruption. Is the correct output path
bin/external/CLibs_Gt_linux_64/lib/modified_runpath/libBar.so.5
and not
bin/external/CLibs_Gt_linux_64/external/CLibs_Gt_linux_64/lib/modified_runpath/libBar.so.5
?
Yes, I don't know of any source of the latter type of path that wouldn't be a bug. I don't know whether this comes from action cache logic though.
I'm afraid I can't be of too much help here. Is it at all possible that multiple concurrent builds are writing to the same action cache (wild guess)?
If it could be reproduced, I would try to see if I could hit a breakpoint where the questionable path string is being written to the action cache. That would be in PersistentStringIndexer#getOrCreate. Or perhaps it's being loaded from disk incorrectly, which would be when instantiating the PersistentIndexMap.
We don't have concurrent builds on these machines, so multiple simultaneous writes to the action cache are not possible. It also happens fairly frequently on multiple machines in the cluster: maybe 1 in 50 builds ends up in this corrupt state when --enable_bzlmod is toggled. But it will be tricky to catch with a debugger.
cc @Wyverald, since this seems related to bzlmod. I do not have any further ideas beyond adding more verbose logging to try and gather data.
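Pending more verbose logging, one low-effort way to gather data would be to scan the action cache after each build for the doubled external/&lt;repo&gt; pattern. This is only a sketch; it assumes the output of `bazel dump --action_cache` contains the exec path strings in plain text:

```python
import re
import subprocess

# Match exec paths that repeat the same external/<repo>/ segment twice,
# e.g. external/CLibs_Gt_linux_64/external/CLibs_Gt_linux_64/..., which
# this issue treats as a corrupt entry.
DOUBLE_EXTERNAL = re.compile(r"external/([^/\s]+)/external/\1/")

def find_corrupt_entries(dump_text: str) -> list[str]:
    """Return the lines of an action cache dump that contain a doubled
    external/<repo> segment."""
    return [line for line in dump_text.splitlines()
            if DOUBLE_EXTERNAL.search(line)]

if __name__ == "__main__":
    # `bazel dump --action_cache` prints the cache contents to stdout.
    dump = subprocess.run(["bazel", "dump", "--action_cache"],
                          capture_output=True, text=True).stdout
    for entry in find_corrupt_entries(dump):
        print("suspect entry:", entry)
```

Running this in CI after each build (and failing loudly when it matches) would at least pinpoint which build first writes the bad entry.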
Description of the bug:
Switching from --noenable_bzlmod to the default --enable_bzlmod occasionally causes builds to flake, resulting in "dangling symbolic link" errors. Although this issue was reported in #20886 and supposedly fixed in commit 52adf0b, the problem persists.
Error message from build:
This error occurs quite often when running in CI, though less so on local development machines. It appears to be linked to the local action cache somehow becoming corrupted when toggling --enable_bzlmod. Once the error manifests on a node, it persists on that node until the action cache is cleared or a ./bazel clean is executed. Below are extracted parts from a decoded action cache:
In index:
bazel-out/ubuntu22-fastbuild/bin/_solib_k8/_U@@FOOcc_Ulibrary___Uexternal_SFOO_Sexternal_SFOO_Slib_Smodified_Urunpath/libBar.so.5 <==> 333425 <-- Missing entry
A known workaround is to use the flag --nouse_action_cache.
Which category does this issue belong to?
Local Execution
What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
It's not clear how to reproduce this issue with ease. Attempts to manually induce this state on a local machine in a controlled manner have been unsuccessful.
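For completeness, the toggling itself can be automated. This is a sketch of such a harness; the target pattern and round count are placeholders chosen from the reported roughly 1-in-50 failure rate, not part of any confirmed repro:

```python
import subprocess

def toggle_builds(run=None, target="//...", rounds=50):
    """Alternate builds with bzlmod off and on; return (round, flag) of
    the first failing build, or None if all builds succeed.

    `run` is injectable for testing; by default it invokes bazel.
    """
    if run is None:
        run = lambda args: subprocess.run(args).returncode
    for i in range(1, rounds + 1):
        for flag in ("--noenable_bzlmod", "--enable_bzlmod"):
            if run(["bazel", "build", flag, target]) != 0:
                return i, flag
    return None

if __name__ == "__main__":
    result = toggle_builds()
    print("no failure" if result is None
          else f"flaked on round {result[0]} with {result[1]}")
```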
Which operating system are you running Bazel on?
Linux
What is the output of bazel info release?
release 7.2.0
If bazel info release returns development version or (@non-git), tell us how you built Bazel.
No response
What's the output of git remote get-url origin; git rev-parse HEAD?
No response
If this is a regression, please try to identify the Bazel commit where the bug was introduced with bazelisk --bisect.
No response
Have you found anything relevant by searching the web?
No response
Any other information, logs, or outputs that you want to share?
No response