bazelbuild / bazel

a fast, scalable, multi-language and extensible build system
https://bazel.build
Apache License 2.0

failed: I/O exception during sandboxed execution: No such file or directory #22951

Open mattyclarkson opened 4 months ago

mattyclarkson commented 4 months ago

Description of the bug:

rules_git, when checking out the working directory into a declare_directory output, will occasionally report:

ERROR: github-mozilla-deepspeech/BUILD.bazel:3:20: Testing //github-mozilla-deepspeech:checkout failed: I/O exception during sandboxed execution: /var/lib/buildkite-agent/.cache/bazel/_bazel_buildkite-agent/a1208da49aaa9451b147b4d0696a68a7/execroot/_main/bazel-out/k8-fastbuild/bin/external/_main~_repo_rules~github-mozilla-deepspeech-0.9.3/checkout/tensorflow/native_client/ctcdecode/third_party/openfst-1.6.7/src/include/fst/extensions/pdt (No such file or directory)

GitLab CI Log, Buildkite Log.

Which category does this issue belong to?

Local Execution, Remote Execution, Rules API

What's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.

git clone https://git.gitlab.arm.com/bazel/rules_git.git
cd rules_git/e2e
while true; do
    bazelisk clean
    bazelisk test github-mozilla-deepspeech:checkout
done

The bug is flaky, so the loop may need to run many times before the failure appears.
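Since the failure is intermittent, the clean-and-test loop above can be wrapped in a small driver that stops at the first failure and keeps its log. This is a sketch only, not part of rules_git or Bazel; the command to run is passed in by the caller.

```python
#!/usr/bin/env python3
"""Sketch of a repro driver for a flaky test: rerun a command until it
fails, returning the attempt number and the failing run's output."""
import subprocess
import sys


def run_until_failure(cmd, max_attempts=100):
    """Run `cmd` repeatedly; return (attempt, output) for the first
    failing run, or (None, None) if it never fails within max_attempts."""
    for attempt in range(1, max_attempts + 1):
        proc = subprocess.run(cmd, capture_output=True, text=True)
        if proc.returncode != 0:
            return attempt, proc.stdout + proc.stderr
    return None, None


if __name__ == "__main__" and len(sys.argv) > 1:
    # Hypothetical usage, e.g.:
    #   python3 repro.py bazelisk test github-mozilla-deepspeech:checkout
    # (a `bazelisk clean` between runs, as in the loop above, would be
    # added here for the actual repro.)
    attempt, log = run_until_failure(sys.argv[1:])
    if attempt is not None:
        print(f"failed on attempt {attempt}")
        sys.stderr.write(log)
```

Capturing the output of the failing run matters here because, with a flaky sandbox error, the interesting log is gone by the time the next clean build starts.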

Which operating system are you running Bazel on?

Linux

What is the output of bazel info release?

release 7.2.1rc2

If bazel info release returns development version or (@non-git), tell us how you built Bazel.

No response

What's the output of git remote get-url origin; git rev-parse HEAD?

git@git.gitlab.arm.com:bazel/rules_git.git
0faf7ebb7a3289e337261c50ff916cd1f5acae95

If this is a regression, please try to identify the Bazel commit where the bug was introduced with bazelisk --bisect.

No response

Have you found anything relevant by searching the web?

#22151 is a bug in a similar area

Any other information, logs, or outputs that you want to share?

No response

coeuvre commented 4 months ago

Can you rerun with --verbose_failures so we have a stacktrace?

amit-mittal commented 3 months ago

I think the error that I am seeing here is also related.

mattyclarkson commented 3 months ago

Can you rerun with --verbose_failures so we have a stacktrace?

Attempting to get a stack trace, but the conditions for triggering the failure are challenging to reproduce, especially locally. We should catch one soon on CI and I'll update.