Closed dplyukhin closed 4 months ago
Hi Dan, thank you for trying the Docker image under MacOS! I have no experience with this error yet, so we will have to figure out what causes it.
Can you tell me a bit about your machine?
Also, I noticed a build warning in the log that you posted. Can you please try to rebuild the image using the following command instead of executing build.sh:
docker build --network=host --platform linux/amd64 --build-arg USER_ID=$(id -u ${SUDO_USER:-$(whoami)}) --build-arg GROUP_ID=$(id -g ${SUDO_USER:-$(whoami)}) -t boosting .
and then try to run the validation again?
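For reference, rerunning the validation afterwards is just the execute script again:
./execute.sh validation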
No problem :) I tried that command and the warning (and the segfault) still showed up. I changed it to --platform linux/arm64/v8, and that got rid of the warning, but the segfault persisted.
I have 36GB physical memory and I see that Docker is using about 6.5 GB of memory. I do not use sudo to run Docker.
The segfault does look slightly different, so here it is FWIW:
Starting validation
Running validation
2024-06-07 08:01:01 [main] org.variantsync.boosting.eval.util.VEVOSUtilities.cloneRepositories()
INFO: Starting experiment initialization.
2024-06-07 08:01:01 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping busybox
2024-06-07 08:01:01 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping openvpn
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Loading variability dataset.
2024-06-07 08:01:01 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping vim
2024-06-07 08:01:01 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping Marlin
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Started loading of dataset under data/ground-truth/argouml-spl
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Read commit ids.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Initialized SPL commits.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped success commits.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped error commits.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped empty commits.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped partial success commits.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapping commits to parents...
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Done.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Found a total of 1 commits.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Dataset loaded.
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Retrieving commits
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.RQRunner.run()
INFO: 3 size of variants out of 3 samples
2024-06-07 08:01:01 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.ExperimentRunner.prepareVariants()
INFO: Preparing new set of 3 variants for argouml-spl
[thread 32 also had an error]
[thread 31 also had an error]
[thread 30 also had an error]
[thread 10 also had an error]
[thread 29 also had an error]
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007ffffe8f3795, pid=8, tid=33
#
# JRE version: OpenJDK Runtime Environment (23.0+25) (build 23-ea+25-2094)
# Java VM: OpenJDK 64-Bit Server VM (23-ea+25-2094, mixed mode, sharing, tiered, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# V [libjvm.so+0x7f2795] G1ParScanThreadState::trim_queue_to_threshold(unsigned int)+0x1295
#
# No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/user/hs_err_pid8.log
#
# If you would like to submit a bug report, please visit:
# https://bugreport.java.com/bugreport/crash.jsp
#
./run-experiments.sh: line 7: 8 Aborted java -Xmx128g -Dtinylog.configuration=/home/user/tinylog.properties -jar traceboosting-eval-jar-with-dependencies.jar validation.properties
Done.
I notice you have the max java heap set to 128G! That could be the issue.
Thanks for the update! I made a minor change to the scripts used by Docker. Can you please update the project, rebuild the image using the command from my previous comment (with your --platform change), and then try to execute the container again?
If this does not work, perhaps we can first try whether a local setup works to check whether the problem is truly related to Docker.
To do so, you need Java and Maven. I do not want to force you to install things you do not need, so please tell me if what I am asking here is too much.
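A quick way to check that both are available on your PATH first (standard version checks, nothing project-specific):
java -version
mvn -version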
1. Run ./setup.sh to download and extract our dataset and ground truth. You might have to install curl and unzip to run the script. It is also possible to do the download and extraction manually by following the steps in the script.
./setup.sh
2. Build the artifact by running mvn clean package in the root of the repository.
mvn clean package
3. Start the validation by calling the built jar file with
java -jar target/traceboosting-eval-jar-with-dependencies.jar validation.properties
I notice you have the max java heap set to 128G! That could be the issue.
Indeed, I removed this with the last update.
Fix works so far! But I had to add --platform linux/arm64/v8 to build.sh. You should make a separate build script for Macs with the Apple chipset, with the platform set accordingly.
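For example, a minimal sketch of such a script, assuming it just mirrors the existing build command with only the platform changed:
docker build --network=host --platform linux/arm64/v8 --build-arg USER_ID=$(id -u ${SUDO_USER:-$(whoami)}) --build-arg GROUP_ID=$(id -g ${SUDO_USER:-$(whoami)}) -t boosting .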
Great! Thank you for your help! I will add a separate script with a hint in the README about setting the platform. I assume that it might have to be adjusted depending on the hardware used.
Let me know whether any other issues pop up.
Hmm, the evaluation was going fine but I aborted it by killing the job. Now when I run the evaluation the old issue is cropping up again:
INFO: Parsing variant 2...
2024-06-07 08:41:39 [pool-1-thread-1] org.variantsync.boosting.eval.util.RandomMapping.applyDistribution()
INFO: Assigning mappings to 0 of 69491 variable nodes (0%)
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007ffffe8f5d08, pid=8, tid=36
#
# JRE version: OpenJDK Runtime Environment (23.0+25) (build 23-ea+25-2094)
# Java VM: OpenJDK 64-Bit Server VM (23-ea+25-2094, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# V [libjvm.so+0x7f4d08] G1ParScanThreadState::trim_queue_to_threshold(unsigned int)+0x3808
#
# No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/user/hs_err_pid8.log
#
# If you would like to submit a bug report, please visit:
# https://bugreport.java.com/bugreport/crash.jsp
#
./run-experiments.sh: line 7: 8 Aborted java -Dtinylog.configuration=/home/user/tinylog.properties -jar traceboosting-eval-jar-with-dependencies.jar validation.properties
Done.
I tried to build locally, but failed at the maven build step. I don't have JDK 21 right now and I don't want to mess up my Java environment ;-;
Not good, but also interesting that this happens. There are two things that come to mind.
First, there might be spikes in the memory usage and the validation might sometimes require more than 32 GB of RAM in order to run. I changed the dataset that is used for the validation to one that might be smaller. Can you please update the repo and try to build and run the validation again? I also added a build script for MacOS.
Second, an undesired state might build up when executing the artifact several times (although I do not see how this could happen).
If the updated validation does not work, can you try cleaning all files in the results directory (rm -r results/*) and perhaps also removing the Docker image and container?
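In other words, roughly this sequence (a sketch using the repository's scripts as they appear in this thread):
git pull
rm -r results/*
bash build-on-mac.sh
./execute.sh validation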
can you try cleaning all files in the results directory (rm -r results/*) and perhaps also the docker image and container?
How do I do that?
If you do not use Docker for anything else, the easiest way is docker system prune -a, which removes all data related to previously built images and executed containers.
Otherwise, you can try to execute:
docker image rm boosting
docker container rm "$(docker ps -a -q --filter "ancestor=boosting")"
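To verify the cleanup afterwards, the standard listing commands can be used:
docker image ls boosting
docker ps -a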
Ran docker system prune -a, rebuilt the container, re-ran the validation, and the issue persisted. This is with the old dataset. Now I've pulled the latest version.
Running clean.sh:
Cleaning all related Docker data. This may take a moment...
Trying to stop running containers...
Error response from daemon: page not found
Removing boosting image...
Untagged: boosting:latest
Deleted: sha256:26c2e0fb9b7a1881931564efe27d13f3f74cefc98d955e06990fc0b41c6f25c1
Removing boosting containers...
Container name cannot be empty
...done.
Looks like there was an issue, so I ran docker prune again.
Running the latest dataset, I got further than before, but eventually got to the same error:
Starting validation
Running validation
2024-06-07 09:17:18 [main] org.variantsync.boosting.eval.util.VEVOSUtilities.cloneRepositories()
INFO: Starting experiment initialization.
2024-06-07 09:17:18 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping busybox
2024-06-07 09:17:18 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping argouml-spl
2024-06-07 09:17:18 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping vim
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Loading variability dataset.
2024-06-07 09:17:18 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping Marlin
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Started loading of dataset under data/ground-truth/openvpn
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Read commit ids.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Initialized SPL commits.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped success commits.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped error commits.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped empty commits.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped partial success commits.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapping commits to parents...
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Done.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Found a total of 1 commits.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Dataset loaded.
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Retrieving commits
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.RQRunner.run()
INFO: 3 size of variants out of 3 samples
2024-06-07 09:17:18 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.ExperimentRunner.prepareVariants()
INFO: Preparing new set of 3 variants for openvpn
2024-06-07 09:17:19 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.ExperimentRunner.prepareVariants()
INFO: Generated variants and received ground truths for 3 variants
2024-06-07 09:17:19 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.RQRunner.run()
INFO: loop mappingPercentage:0 runID=1
build TraceBoosting object
2024-06-07 09:17:19 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.initialize()
INFO: Collecting variant dirs...
2024-06-07 09:17:19 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.startProductCreation()
INFO: Creating products
2024-06-07 09:17:19 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.initialize()
INFO: ...done.
initial done
2024-06-07 09:17:19 [pool-2-thread-1] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 0...
2024-06-07 09:17:20 [pool-2-thread-2] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 1...
2024-06-07 09:17:20 [pool-2-thread-3] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 2...
2024-06-07 09:17:22 [pool-1-thread-1] org.variantsync.boosting.eval.util.RandomMapping.applyDistribution()
INFO: Assigning mappings to 0 of 20877 variable nodes (0%)
2024-06-07 09:17:22 [pool-1-thread-1] org.variantsync.boosting.eval.util.RandomMapping.applyDistribution()
INFO: Assigning mappings to 0 of 19942 variable nodes (0%)
2024-06-07 09:17:22 [pool-1-thread-1] org.variantsync.boosting.eval.util.RandomMapping.applyDistribution()
INFO: Assigning mappings to 0 of 20907 variable nodes (0%)
Boosting...
2024-06-07 09:17:22 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Merging product #0
2024-06-07 09:17:22 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Considering product 0...
2024-06-07 09:17:23 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: ...done.
2024-06-07 09:17:23 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Merging product #1
2024-06-07 09:17:23 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Considering product 1...
2024-06-07 09:17:23 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: ...done.
2024-06-07 09:17:23 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Merging product #2
2024-06-07 09:17:24 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Considering product 2...
2024-06-07 09:17:24 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: ...done.
2024-06-07 09:17:24 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.assignProactiveTraces()
INFO: Updated 0 of 5 associations by propagating mappings.
2024-06-07 09:17:24 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.computeMappings()
INFO: Translating mappings from associations back to products...
2024-06-07 09:17:24 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.computeMappings()
INFO: Mapping translation complete.
comparing...
number of nodes: 90562
2024-06-07 09:17:26 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.Evaluator.compare()
INFO: Evaluated the mapping of 90562 nodes.
scoring....
2024-06-07 09:17:26 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.RQRunner.run()
INFO: loop mappingPercentage:15 runID=1
build TraceBoosting object
2024-06-07 09:17:26 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.initialize()
INFO: Collecting variant dirs...
2024-06-07 09:17:26 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.startProductCreation()
INFO: Creating products
2024-06-07 09:17:26 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.initialize()
INFO: ...done.
initial done
2024-06-07 09:17:26 [pool-4-thread-1] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 0...
2024-06-07 09:17:26 [pool-4-thread-2] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 1...
2024-06-07 09:17:26 [pool-4-thread-3] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 2...
2024-06-07 09:17:27 [pool-1-thread-1] org.variantsync.boosting.eval.util.RandomMapping.applyDistribution()
INFO: Assigning mappings to 3131 of 20877 variable nodes (15%)
2024-06-07 09:17:28 [pool-1-thread-1] org.variantsync.boosting.eval.util.RandomMapping.applyDistribution()
INFO: Assigning mappings to 2991 of 19942 variable nodes (15%)
2024-06-07 09:17:28 [pool-1-thread-1] org.variantsync.boosting.eval.util.RandomMapping.applyDistribution()
INFO: Assigning mappings to 3136 of 20907 variable nodes (15%)
Boosting...
2024-06-07 09:17:28 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Merging product #0
2024-06-07 09:17:28 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Considering product 0...
2024-06-07 09:17:28 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: ...done.
2024-06-07 09:17:28 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Merging product #1
2024-06-07 09:17:29 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Considering product 1...
2024-06-07 09:17:29 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: ...done.
2024-06-07 09:17:29 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Merging product #2
2024-06-07 09:17:29 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: Considering product 2...
2024-06-07 09:17:29 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.traceExtractionAlgorithm()
INFO: ...done.
2024-06-07 09:17:29 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.assignProactiveTraces()
INFO: Updated 0 of 5 associations by propagating mappings.
2024-06-07 09:17:29 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.computeMappings()
INFO: Translating mappings from associations back to products...
2024-06-07 09:17:29 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.computeMappings()
INFO: Mapping translation complete.
comparing...
number of nodes: 90562
[thread 10 also had an error]
[thread 28 also had an error]
[thread 32 also had an error]
[thread 38 also had an error]
[thread 37 also had an error]
[thread 29 also had an error]
[thread 27 also had an error]
[thread 31 also had an error]
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007ffffe8f88ed, pid=8, tid=36
#
# JRE version: OpenJDK Runtime Environment (23.0+25) (build 23-ea+25-2094)
# Java VM: OpenJDK 64-Bit Server VM (23-ea+25-2094, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# V [libjvm.so+0x7f78ed] G1ParScanThreadState::steal_and_trim_queue(GenericTaskQueueSet<OverflowTaskQueue<ScannerTask, (MEMFLAGS)5, 131072u>, (MEMFLAGS)5>*)+0x4cd
#
# No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/user/hs_err_pid8.log
#
# If you would like to submit a bug report, please visit:
# https://bugreport.java.com/bugreport/crash.jsp
#
./run-experiments.sh: line 7: 8 Aborted java -Dtinylog.configuration=/home/user/tinylog.properties -jar traceboosting-eval-jar-with-dependencies.jar validation.properties
Thanks for trying everything! I am now pretty sure that the issue is related to the required memory. I will make the number of threads configurable and make some other config changes to reduce the memory required by the validation.
I pushed the changes. Can you please try to rebuild and rerun the validation?
No dice:
➜ trace-boosting-eval git:(main) ✗ bash build-on-mac.sh
[+] Building 85.1s (33/33) FINISHED docker:desktop-linux
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 1.80kB 0.0s
=> [internal] load metadata for docker.io/library/openjdk:23-slim 1.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [internal] load build context 0.7s
=> => transferring context: 1.86MB 0.7s
=> [stage-0 1/9] FROM docker.io/library/openjdk:23-slim@sha256:3b841dd4400a37578c9dad030aa455d3c1872c7dd376842376b53b3c7c822829 0.0s
=> CACHED [stage-0 2/9] RUN apt-get update && apt-get upgrade -y 0.0s
=> [stage-1 3/21] RUN apt-get install bash git python3 unzip curl -y 19.1s
=> [stage-0 3/9] RUN apt-get install -y maven 26.1s
=> [stage-1 4/21] RUN apt-get install -y python3 python3-pip python3-virtualenv 33.7s
=> [stage-0 4/9] WORKDIR /home/user 0.0s
=> [stage-0 5/9] COPY src ./src 0.0s
=> [stage-0 6/9] COPY python ./python 0.0s
=> [stage-0 7/9] COPY local-maven-repo ./local-maven-repo 0.1s
=> [stage-0 8/9] COPY pom.xml ./pom.xml 0.0s
=> [stage-0 9/9] RUN mvn clean package || exit 48.7s
=> [stage-1 5/21] RUN if ! getent group 20; then groupadd -g 20 user; fi 0.1s
=> [stage-1 6/21] RUN useradd --no-create-home --uid 502 --gid 20 --home-dir /home/user --shell /bin/bash user 0.1s
=> [stage-1 7/21] RUN mkdir -p /home/user 0.1s
=> [stage-1 8/21] RUN chown 502:20 /home/user 0.1s
=> [stage-1 9/21] WORKDIR /home/user 0.0s
=> [stage-1 10/21] COPY docker/* ./ 0.0s
=> [stage-1 11/21] COPY python ./python 0.0s
=> [stage-1 12/21] RUN mkdir data 0.1s
=> [stage-1 13/21] COPY data ./data 0.7s
=> [stage-1 14/21] COPY setup.sh ./setup.sh 0.0s
=> [stage-1 15/21] COPY --from=0 /home/user/target* ./ 0.1s
=> [stage-1 16/21] WORKDIR /home/user/python 0.0s
=> [stage-1 17/21] WORKDIR /home/user 0.0s
=> [stage-1 18/21] RUN chown 502:20 /home/user -R 7.8s
=> [stage-1 19/21] RUN chmod +x run-experiments.sh 0.1s
=> [stage-1 20/21] RUN chmod +x entrypoint.sh 0.1s
=> [stage-1 21/21] RUN chmod +x setup.sh 0.1s
=> exporting to image 0.9s
=> => exporting layers 0.9s
=> => writing image sha256:5a887434b90eaef6d028aeb8b73ccb2ed4681298837e03484b8c3cf4e0b442b8 0.0s
=> => naming to docker.io/library/boosting 0.0s
View build details: docker-desktop://dashboard/build/desktop-linux/desktop-linux/9kdadf87nbvcse5x41ojj010g
What's Next?
1. Sign in to your Docker account → docker login
2. View a summary of image vulnerabilities and recommendations → docker scout quickview
➜ trace-boosting-eval git:(main) ✗ ./execute.sh validation
Starting validation
Running validation
2024-06-07 09:50:33 [main] org.variantsync.boosting.eval.util.VEVOSUtilities.cloneRepositories()
INFO: Starting experiment initialization.
2024-06-07 09:50:33 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping busybox
2024-06-07 09:50:33 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping argouml-spl
2024-06-07 09:50:33 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping vim
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Loading variability dataset.
2024-06-07 09:50:33 [main] org.variantsync.boosting.eval.Main.main()
INFO: Skipping Marlin
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Started loading of dataset under data/ground-truth/openvpn
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Read commit ids.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Initialized SPL commits.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped success commits.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped error commits.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped empty commits.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapped partial success commits.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Mapping commits to parents...
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Done.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.vevos.simulation.io.data.VariabilityDatasetLoader.load()
INFO: Found a total of 1 commits.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Dataset loaded.
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.boosting.eval.util.VEVOSUtilities.loadGroundTruth()
INFO: Retrieving commits
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.RQRunner.run()
INFO: 3 size of variants out of 3 samples
2024-06-07 09:50:33 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.ExperimentRunner.prepareVariants()
INFO: Preparing new set of 3 variants for openvpn
2024-06-07 09:50:34 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.ExperimentRunner.prepareVariants()
INFO: Generated variants and received ground truths for 3 variants
2024-06-07 09:50:34 [pool-1-thread-1] org.variantsync.boosting.eval.experiments.RQRunner.run()
INFO: loop mappingPercentage:0 runID=1
build TraceBoosting object
2024-06-07 09:50:34 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.initialize()
INFO: Collecting variant dirs...
2024-06-07 09:50:34 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.startProductCreation()
INFO: Creating products
2024-06-07 09:50:34 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.initialize()
INFO: ...done.
2024-06-07 09:50:34 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.setNumThreads()
INFO: Updating thread pool...
2024-06-07 09:50:34 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.setNumThreads()
INFO: Shutting down old pool
2024-06-07 09:50:34 [pool-1-thread-1] org.variantsync.boosting.TraceBoosting.setNumThreads()
INFO: Created new pool with 4 threads.
initial done
2024-06-07 09:50:34 [pool-2-thread-1] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 0...
2024-06-07 09:50:35 [pool-2-thread-2] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 1...
2024-06-07 09:50:35 [pool-2-thread-3] org.variantsync.boosting.product.ProductInitializationTask.call()
INFO: Parsing variant 2...
#
# A fatal error has been detected by the Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x00007ffffe8f596a, pid=8, tid=39
#
# JRE version: OpenJDK Runtime Environment (23.0+25) (build 23-ea+25-2094)
# Java VM: OpenJDK 64-Bit Server VM (23-ea+25-2094, mixed mode, sharing, tiered, compressed oops, compressed class ptrs, g1 gc, linux-amd64)
# Problematic frame:
# V [libjvm.so+0x7f496a] G1ParScanThreadState::trim_queue_to_threshold(unsigned int)+0x346a
#
# No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/user/hs_err_pid8.log
assertion failed [node != nullptr]: There is a hole in the VmTracker tree at address 0xfffffffffffffffc
(VMAllocationTracker.cpp:137 vm_allocation_info_for_address)
./run-experiments.sh: line 7: 8 Trace/breakpoint trap java -Dtinylog.configuration=/home/user/tinylog.properties -jar traceboosting-eval-jar-with-dependencies.jar validation.properties
Done.
➜ trace-boosting-eval git:(main) ✗ git pull
Already up to date.
I did run docker prune before that command, btw.
Ahh... that is unfortunate. Can you please try one last thing? Please try to increase the memory available to docker (e.g., 24 GB). (See https://docs.docker.com/desktop/settings/mac/ for more information)
If you are using Docker Desktop / Dashboard, this should be possible under Settings > Advanced > Memory Limit. Otherwise, there should also be a settings.json file at ~/Library/Group Containers/group.com.docker/settings.json.
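If you edit settings.json directly, the memory value should (as far as I know) be the entry named memoryMiB; you can check the current value with something like:
grep -i memory "$HOME/Library/Group Containers/group.com.docker/settings.json"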
After changing the setting, simply restarting the validation without a rebuild should be sufficient.
My memory limit was 7.9GB. It was under Settings > Resources > Advanced. I updated it to 26GB, but it didn't fix the issue. I tried setting -m 26GB in the execute.sh script, but that also didn't fix the problem.
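For illustration, assuming execute.sh boils down to a docker run of the boosting image, the flag went roughly like this (paraphrased, not the actual script):
# hypothetical invocation; the real execute.sh passes more options
docker run --rm -m 26g boosting validation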
Okay, thank you for trying! At the moment, I do not know anything else we could try. Nevertheless, you already helped us a lot! Thank you very much!
Happy to help! If you think of anything, leave a reply here.
I tried running the validation with reduced RAM (8 GB) on my machine and it worked, so it might not be a memory problem after all. Perhaps it's instead related to issues with the JVM used in the container, or an incompatibility between the host and container file systems. I switched the JDK used by the container to an LTS version (JDK 21), but I am not sure whether this makes a difference.
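If it helps, an easy way to confirm which JDK actually ends up inside the rebuilt image (assuming the image lets you override the entrypoint) is:
docker run --rm --entrypoint java boosting -version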
Just tried that and it didn't work. Then I figured, if G1 is throwing the error why not try another GC? This patch got it working for me:
validation() {
echo "Running validation"
- java -Dtinylog.configuration=/home/user/tinylog.properties -jar traceboosting-eval-jar-with-dependencies.jar validation.properties
+ java -XX:+UnlockExperimentalVMOptions -XX:+UseZGC -Dtinylog.configuration=/home/user/tinylog.properties -jar traceboosting-eval-jar-with-dependencies.jar validation.properties
}
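As a side note on the flags: a quick sanity check that the JVM accepts this combination, without starting the full run, is:
java -XX:+UnlockExperimentalVMOptions -XX:+UseZGC -version
On JDK 15 and newer, ZGC is no longer experimental, so -XX:+UnlockExperimentalVMOptions should be unnecessary there, but it is harmless to pass.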
Thank you for this fix! I just integrated it in c2afa3c5 and checked whether it still works with my setup as well. Things are looking great.
If it is working on your setup now, we can close this issue, right?
I think so! As long as "validate" was enough and you don't need me to test anything else.
hey :) thanks again for checking this with MacOS! I'm gonna close this issue now, as the problem seems to be fixed :)
Thought I'd speed things up by creating an issue here. With the latest fixes, the docker image now builds on my machine. But running the evaluation produces a segfault in the JRE: