Open zqj2333 opened 1 year ago
Hi, did you solve the error? I built chipyard-1.9.0 on Google Colab and hit the same error: utils/fireperf/FlameGraph failed to check out, and br-base.json failed to build in Step 9 of build-setup.sh.
Hi, unfortunately not.
Hi zqj2333, the timeout is expected: the script checks whether you are on EC2, and if you are not, that check times out. Your problem may instead be caused by a small ulimit -Hn; there is a warning about it in your log.
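For anyone hitting that warning: it refers to the hard limit on open file descriptors. A quick way to inspect it and raise the soft limit for the current shell is sketched below (the value 16384 is an assumed comfortable target, not a documented FireMarshal requirement):

```shell
# Print the hard limit on open file descriptors; image-building tools such as
# libguestfs can fail when this is low.
ulimit -Hn

# Raise the soft limit for this shell session before re-running build-setup.sh
# (16384 is an assumption, not a value documented by FireMarshal):
# ulimit -n 16384
```

Note that the soft limit can only be raised up to the hard limit without root privileges.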
Hi, thanks for your reply. I was wondering whether your log also has this warning, or whether you have built it successfully.
On Google Colab, the log has no such warning, but step 9 of the build still fails, and I do not know why.
Okay, I will keep trying and will post an update if there is any news.
I have the same issue on Ubuntu 20.04. My log file ends with:
2023-04-25 10:43:32,472 [run ] [DEBUG] OBJCOPY platform/generic/firmware/fw_payload.bin
2023-04-25 10:43:32,504 [print_deps ] [DEBUG] Running task /home/jan/chipyard/software/firemarshal/images/firechip/br-base/br-base.img because one of its targets does not exist anymore: /home/jan/chipyard/software/firemarshal/images/firechip/br-base/br-base.img
2023-04-25 10:43:32,791 [makeImage ] [DEBUG] Applying overlay: /home/jan/chipyard/software/firemarshal/boards/firechip/base-workloads/br-base/overlay
2023-04-25 10:43:32,791 [run ] [DEBUG] Running: "guestmount --pid-file guestmount.pid -a /home/jan/chipyard/software/firemarshal/images/firechip/br-base/br-base.img -m /dev/sda /home/jan/chipyard/software/firemarshal/disk-mount" in /home/jan/chipyard/software/firemarshal
2023-04-25 10:43:33,508 [run ] [DEBUG] libguestfs: error: /usr/bin/supermin exited with error status 1.
2023-04-25 10:43:33,508 [run ] [DEBUG] To see full error messages you may need to enable debugging.
2023-04-25 10:43:33,508 [run ] [DEBUG] Do:
2023-04-25 10:43:33,508 [run ] [DEBUG] export LIBGUESTFS_DEBUG=1 LIBGUESTFS_TRACE=1
2023-04-25 10:43:33,508 [run ] [DEBUG] and run the command again. For further information, read:
2023-04-25 10:43:33,508 [run ] [DEBUG] http://libguestfs.org/guestfs-faq.1.html#debugging-libguestfs
2023-04-25 10:43:33,508 [run ] [DEBUG] You can also run 'libguestfs-test-tool' and post the *complete* output
2023-04-25 10:43:33,508 [run ] [DEBUG] into a bug report or message to the libguestfs mailing list.
2023-04-25 10:43:33,543 [main ] [ERROR] Failed to build workload br-base.json
2023-04-25 10:43:33,543 [main ] [INFO ] Log available at: /home/jan/chipyard/software/firemarshal/logs/br-base-build-2023-04-25--10-31-00-NH7MUAV3HG8VQBQD.log
2023-04-25 10:43:33,544 [main ] [ERROR] FAILURE: 1 builds failed
Did you solve this issue?
[Edit: Downgrading to 1.8.1 does help.]
Hello, unfortunately I still haven't solved it.
I have the same issue.
Is there a way to skip the step that builds firemarshal? I keep getting this issue which is completely stopping me from using Chipyard.
I found the same issue in release 1.9.1, but managed to build everything else by running the following (which effectively disables FireSim and FireMarshal in the setup process), as described in the usage message of that script:
./build-setup.sh riscv-tools -s 6 -s 7 -s 8 -s 9
Hello, I have an issue when running ./build-setup.sh riscv-tools: /root/mambaforge/lib/python3.10/site-packages/pydantic/_internal/_config.py:257: UserWarning: Valid config keys have changed in V2:
Thanks, I also had the same issue, but this is a good workaround!
Hi @jerryz123, responding to https://github.com/ucb-bar/chipyard/issues/1609#issuecomment-1746347714
To reproduce my issue, I cloned Chipyard's main branch into ~/chipyard_main and then ran the setup script.
The first time I ran it, I got a different error than the one I was used to:
Cloning into '/home/drak/chipyard_main/software/firemarshal/wlutil/busybox'...
remote: Enumerating objects: 18140, done.
remote: Counting objects: 100% (15/15), done.
remote: Compressing objects: 100% (14/14), done.
remote: Total 18140 (delta 1), reused 1 (delta 1), pack-reused 18125
Receiving objects: 100% (18140/18140), 4.29 MiB | 4.35 MiB/s, done.
Resolving deltas: 100% (1340/1340), done.
Submodule path 'boards/default/distros/br/buildroot': checked out 'd48a8beb39275a479185ab9b3232cd15dcfb87ab'
Submodule path 'boards/default/firmware/opensbi': checked out '5ccebf0a7ec79d0bbef36d6dcdc2717f25d40767'
error: RPC failed; curl 56 OpenSSL SSL_read: OpenSSL/3.1.2: error:0A000119:SSL routines::decryption failed or bad record mac, errno 0
error: 8186 bytes of body are still expected
fetch-pack: unexpected disconnect while reading sideband packet
fatal: early EOF
fatal: fetch-pack: invalid index-pack output
fatal: could not fetch b62836419ea3e2d19de323071132fa26135c2f10 from promisor remote
fatal: Unable to checkout '71bece669db27e8c5bf1b6e25780bab194d23103' in submodule path 'boards/default/linux'
After this, I just ran the setup script a second time.
+ git submodule update --progress --filter=tree:0 --init boards/default/linux boards/default/firmware/opensbi wlutil/busybox boards/default/distros/br/buildroot boards/firechip/drivers/iceblk-driver boards/firechip/drivers/icenet-driver
Submodule path 'boards/default/linux': checked out '71bece669db27e8c5bf1b6e25780bab194d23103'
Submodule path 'boards/firechip/drivers/iceblk-driver': checked out '4e6f183337b27aa5be99dc4873ea507572aceb9f'
Submodule path 'wlutil/busybox': checked out '70f77e4617e06077231b8b63c3fb3406d7f8865d'
To check on progress, either call marshal with '-v' or see the live output at:
/home/drak/chipyard_main/software/firemarshal/logs/br-base-build-2023-10-04--18-32-32-34U5DIC95HPJQN82.log
. /home/drak/chipyard_main/software/firemarshal/boards/firechip/base-workloads/br-base/host-init.sh
. /home/drak/chipyard_main/software/firemarshal/images/firechip/br.8fff/br.8fff.img
Attempting to download cached image: https://raw.githubusercontent.com/firesim/firemarshal-public-br-images/main/images/firechip/br.8fff/br.8fff.img.zip
Unzipping cached image: /home/drak/chipyard_main/software/firemarshal/boards/firechip/distros/br/br.8fff.img.zip
Skipping full buildroot build. Using cached image /home/drak/chipyard_main/software/firemarshal/images/firechip/br.8fff/br.8fff.img from /home/drak/chipyard_main/software/firemarshal/boards/firechip/distros/br/br.8fff.img.zip
. build_busybox
. /home/drak/chipyard_main/software/firemarshal/images/firechip/br-base/br-base-bin
TaskError - taskid:/home/drak/chipyard_main/software/firemarshal/images/firechip/br-base/br-base-bin
PythonAction Error
Traceback (most recent call last):
File "/home/drak/chipyard_main/.conda-env/lib/python3.10/site-packages/doit/action.py", line 461, in execute
returned_value = self.py_callable(*self.args, **kwargs)
File "/home/drak/chipyard_main/software/firemarshal/wlutil/build.py", line 544, in makeBin
makeModules(config)
File "/home/drak/chipyard_main/software/firemarshal/wlutil/build.py", line 462, in makeModules
wlutil.run(makeCmd + " clean", cwd=driverDir, shell=True)
File "/home/drak/chipyard_main/software/firemarshal/wlutil/wlutil.py", line 527, in run
raise sp.CalledProcessError(p.returncode, prettyCmd)
subprocess.CalledProcessError: Command 'make LINUXSRC=/home/drak/chipyard_main/software/firemarshal/boards/firechip/base-workloads/br-base/../../linux clean' returned non-zero exit status 2.
ERROR: Failed to build workload br-base.json
Log available at: /home/drak/chipyard_main/software/firemarshal/logs/br-base-build-2023-10-04--18-32-32-34U5DIC95HPJQN82.log
ERROR: FAILURE: 1 builds failed
I see that both times, it failed at approximately the same step, but with a different specific error.
Here's the full log that's mentioned in the second error: https://pastebin.com/w8eVnEY2
I suspect the first error (error: RPC failed; curl 56 OpenSSL SSL_read) was caused by network instability. This left the repo in a bad state, with things partially cloned, which caused the next error when you reran the setup command.
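If that is what happened, one way to recover is to deinit the partially cloned submodule and fetch it again from scratch. This is a sketch; the submodule path below is the one from the error message, and the working directory is assumed to be the FireMarshal checkout:

```shell
# Run from software/firemarshal inside the Chipyard checkout (assumption:
# that is where the failing submodule lives, per the error above).
cd software/firemarshal

# Drop the broken working tree of the submodule that failed mid-clone...
git submodule deinit -f boards/default/linux

# ...then fetch and check it out again.
git submodule update --init --progress boards/default/linux
```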
Hmmm. Network instability is definitely a possibility on the computer I tested it on. I'll try wiping, re-cloning, and re-running it again to see whether the error occurs in the exact same place (suggesting a deterministic bug) or somewhere semi-random (suggesting a network error).
I also encountered the same problem!
2023-10-06 17:11:16,889 [run ] [DEBUG] Running: "make LINUXSRC=/chipyard/software/firemarshal/boards/firechip/base-workloads/br-base/../../linux clean" in /chipyard/software/firemarshal/boards/firechip/base-workloads/br-base/../../drivers/icenet-driver
2023-10-06 17:11:16,893 [run ] [DEBUG] make: *** No rule to make target 'clean'. Stop. [translated; original message was in Chinese]
2023-10-06 17:11:16,899 [main ] [ERROR] Failed to build workload br-base.json
2023-10-06 17:11:16,899 [main ] [INFO ] Log available at: /chipyard/software/firemarshal/logs/br-base-build-2023-10-06--08-34-33-6N5DJJAYT18BY7DW.log
2023-10-06 17:11:16,899 [main ] [ERROR] FAILURE: 1 builds failed
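A "make: No rule to make target 'clean'" error in a driver directory usually means there is no Makefile there at all, i.e. the driver submodule was never checked out. A quick check is sketched below (assumption: it is run from software/firemarshal, and the paths are the driver submodules named in the logs above):

```shell
# If these submodule directories are empty, `make clean` has no Makefile to
# read and fails exactly as in the log above.
for d in boards/firechip/drivers/icenet-driver boards/firechip/drivers/iceblk-driver; do
  if [ -e "$d/Makefile" ]; then
    echo "$d: populated"
  else
    echo "$d: MISSING Makefile -- try: git submodule update --init $d"
  fi
done
```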
Now that #1614 is in a working state (yes, that whole PR was just an ADHD tangent of mine to help out with this issue), the error this time around is related to a guestmount command.
For people to reproduce:
git clone git@github.com:JL102/chipyard.git chipyard_test
cd chipyard_test
git checkout script-trycatch
./build-setup.sh
Last bit of the log:
========== BEGINNING STEP 9: Pre-compiling FireMarshal buildroot sources ==========
To check on progress, either call marshal with '-v' or see the live output at:
/home/drak/chipyard_test/software/firemarshal/logs/br-base-build-2023-10-07--00-11-12-SJSYZ8EGR4PWBZF5.log
. /home/drak/chipyard_test/software/firemarshal/boards/firechip/base-workloads/br-base/host-init.sh
. /home/drak/chipyard_test/software/firemarshal/images/firechip/br.8fff/br.8fff.img
Attempting to download cached image: https://raw.githubusercontent.com/firesim/firemarshal-public-br-images/main/images/firechip/br.8fff/br.8fff.img.zip
Unzipping cached image: /home/drak/chipyard_test/software/firemarshal/boards/firechip/distros/br/br.8fff.img.zip
Skipping full buildroot build. Using cached image /home/drak/chipyard_test/software/firemarshal/images/firechip/br.8fff/br.8fff.img from /home/drak/chipyard_test/software/firemarshal/boards/firechip/distros/br/br.8fff.img.zip
. build_busybox
. /home/drak/chipyard_test/software/firemarshal/images/firechip/br-base/br-base-bin
. calc_br-base_dep
. /home/drak/chipyard_test/software/firemarshal/images/firechip/br-base/br-base.img
TaskError - taskid:/home/drak/chipyard_test/software/firemarshal/images/firechip/br-base/br-base.img
PythonAction Error
Traceback (most recent call last):
File "/home/drak/chipyard_test/.conda-env/lib/python3.10/site-packages/doit/action.py", line 461, in execute
returned_value = self.py_callable(*self.args, **kwargs)
File "/home/drak/chipyard_test/software/firemarshal/wlutil/build.py", line 602, in makeImage
wlutil.applyOverlay(config['img'], config['overlay'])
File "/home/drak/chipyard_test/software/firemarshal/wlutil/wlutil.py", line 671, in applyOverlay
copyImgFiles(img, flist, 'in')
File "/home/drak/chipyard_test/software/firemarshal/wlutil/wlutil.py", line 652, in copyImgFiles
with mountImg(img, getOpt('mnt-dir')):
File "/home/drak/chipyard_test/.conda-env/lib/python3.10/contextlib.py", line 135, in __enter__
return next(self.gen)
File "/home/drak/chipyard_test/software/firemarshal/wlutil/wlutil.py", line 589, in mountImg
run(['guestmount', '--pid-file', 'guestmount.pid', '-a', imgPath, '-m', '/dev/sda', mntPath])
File "/home/drak/chipyard_test/software/firemarshal/wlutil/wlutil.py", line 527, in run
raise sp.CalledProcessError(p.returncode, prettyCmd)
subprocess.CalledProcessError: Command 'guestmount --pid-file guestmount.pid -a /home/drak/chipyard_test/software/firemarshal/images/firechip/br-base/br-base.img -m /dev/sda /home/drak/chipyard_test/software/firemarshal/disk-mount' returned non-zero exit status 1.
ERROR: Failed to build workload br-base.json
Log available at: /home/drak/chipyard_test/software/firemarshal/logs/br-base-build-2023-10-07--00-11-12-SJSYZ8EGR4PWBZF5.log
ERROR: FAILURE: 1 builds failed
build-setup.sh: Build script failed with exit code 1 at step 9: Pre-compiling FireMarshal buildroot sources
here's the trace:
libguestfs: trace: set_verbose true
libguestfs: trace: set_verbose = 0
libguestfs: create: flags = 0, handle = 0x55f4735c3480, program = guestmount
libguestfs: trace: set_recovery_proc false
libguestfs: trace: set_recovery_proc = 0
libguestfs: trace: add_drive "/home/drak/chipyard_test/software/firemarshal/images/firechip/br-base/br-base.img"
libguestfs: trace: add_drive = 0
libguestfs: trace: launch
libguestfs: trace: max_disks
libguestfs: trace: max_disks = 255
libguestfs: trace: get_tmpdir
libguestfs: trace: get_tmpdir = "/tmp"
libguestfs: trace: version
libguestfs: trace: version = <struct guestfs_version = major: 1, minor: 40, release: 2, extra: , >
libguestfs: trace: get_backend
libguestfs: trace: get_backend = "direct"
libguestfs: launch: program=guestmount
libguestfs: launch: version=1.40.2
libguestfs: launch: backend registered: unix
libguestfs: launch: backend registered: uml
libguestfs: launch: backend registered: libvirt
libguestfs: launch: backend registered: direct
libguestfs: launch: backend=direct
libguestfs: launch: tmpdir=/tmp/libguestfsRUcLIp
libguestfs: launch: umask=0022
libguestfs: launch: euid=1000
libguestfs: trace: get_cachedir
libguestfs: trace: get_cachedir = "/var/tmp"
libguestfs: begin building supermin appliance
libguestfs: run supermin
libguestfs: command: run: /usr/bin/supermin
libguestfs: command: run: \ --build
libguestfs: command: run: \ --verbose
libguestfs: command: run: \ --if-newer
libguestfs: command: run: \ --lock /var/tmp/.guestfs-1000/lock
libguestfs: command: run: \ --copy-kernel
libguestfs: command: run: \ -f ext2
libguestfs: command: run: \ --host-cpu x86_64
libguestfs: command: run: \ /usr/lib/x86_64-linux-gnu/guestfs/supermin.d
libguestfs: command: run: \ -o /var/tmp/.guestfs-1000/appliance.d
supermin: version: 5.1.20
supermin: package handler: debian/dpkg
supermin: acquiring lock on /var/tmp/.guestfs-1000/lock
supermin: build: /usr/lib/x86_64-linux-gnu/guestfs/supermin.d
supermin: reading the supermin appliance
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/base.tar.gz type gzip base image (tar)
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/daemon.tar.gz type gzip base image (tar)
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/excludefiles type uncompressed excludefiles
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/hostfiles type uncompressed hostfiles
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/init.tar.gz type gzip base image (tar)
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/packages type uncompressed packages
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/packages-hfsplus type uncompressed packages
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/packages-reiserfs type uncompressed packages
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/packages-xfs type uncompressed packages
supermin: build: visiting /usr/lib/x86_64-linux-gnu/guestfs/supermin.d/udev-rules.tar.gz type gzip base image (tar)
supermin: mapping package names to installed packages
supermin: resolving full list of package dependencies
supermin: build: 238 packages, including dependencies
supermin: build: 12322 files
supermin: build: 8885 files, after matching excludefiles
supermin: build: 8888 files, after adding hostfiles
supermin: build: 8885 files, after removing unreadable files
supermin: build: 8893 files, after munging
supermin: kernel: looking for kernel using environment variables ...
supermin: kernel: looking for kernels in /lib/modules/*/vmlinuz ...
supermin: kernel: looking for kernels in /boot ...
supermin: failed to find a suitable kernel (host_cpu=x86_64).
I looked for kernels in /boot and modules in /lib/modules.
If this is a Xen guest, and you only have Xen domU kernels
installed, try installing a fullvirt kernel (only for
supermin use, you shouldn't boot the Xen guest with it).
libguestfs: error: /usr/bin/supermin exited with error status 1, see debug messages above
libguestfs: trace: launch = -1 (error)
libguestfs: trace: close
libguestfs: closing guestfs handle 0x55f4735c3480 (state 0)
libguestfs: command: run: rm
libguestfs: command: run: \ -rf /tmp/libguestfsRUcLIp
I see the "failed to find a suitable kernel" message; could it be related to some RISC-V dependencies not being loaded?
Hmm. I'm now encountering the error described in the previous comment, but in an already-set-up instance of FireMarshal. After looking closer at the logs, I see it's attempting to load a file from the system /boot folder. Why's it doing that?
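For context: supermin (which guestmount uses) builds its appliance from the host's own kernel, so it scans the host's /boot and /lib/modules; this is unrelated to RISC-V. On Debian/Ubuntu hosts the kernel images in /boot are often readable only by root, which makes supermin fail exactly as in the trace above. A commonly suggested libguestfs workaround, sketched here (verify it is appropriate for your system before running it), is to make the installed kernels world-readable:

```shell
# supermin needs a readable kernel image on the HOST to build its appliance.
# Check whether the installed kernels are readable by your user:
ls -l /boot/vmlinuz-*

# If they are root-only (mode 0600), make them world-readable.
# This is a commonly cited libguestfs workaround on Debian/Ubuntu hosts:
sudo chmod 0644 /boot/vmlinuz-*
```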
Hello JL102, I have the same error now. Have you solved it?
You can just skip the FireMarshal step in the build-setup script (run ./build-setup.sh -h to see the flags).
When I run ./build-setup.sh -s 8 I get the error below; can you please guide me?
@waseem-10xe Yours is different from the other reports because your setup stops at the first step. -s 8 skips initializing FireMarshal, which is not yet your problem. You might want to configure conda from the beginning again.
Yes, I configured conda from the start, and my build setup completed and is working now.
Background Work
Chipyard Version and Hash
Release: 1.9.0 (stable)
OS Setup
CentOS, conda 23.3.1
Other Setup
No response
Current Behavior
There are some errors, as shown in the image.
Expected Behavior
build successfully
Other Information
I found that there is a timeout.