Ngiong closed this issue 3 years ago
Hello,
After modifying the search budget to account for 50% of the overall time (leaving the remaining 50% for the postprocessing steps), I found that EvoSuite generated flaky test cases (i.e., broken tests) for most subjects of the SBST 2020 benchmark in a single run: evosuite_singlerun.zip
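For concreteness, the 50/50 split can be sketched as below. The 60-second total and the property names in the comments (`-Dsearch_budget`, `-Dminimization_timeout`, `-Dassertion_timeout`) are assumptions for illustration, not the exact configuration I used:

```shell
# Sketch of the 50/50 budget split described above (values are illustrative).
TOTAL=60                     # overall time budget in seconds, as in the contest run
SEARCH=$((TOTAL / 2))        # 50% of the budget for the search phase
POST=$((TOTAL - SEARCH))     # remaining 50% for the postprocessing steps
# A corresponding EvoSuite invocation might then distribute $POST across
# postprocessing timeouts (assumed property names), e.g.:
#   -Dsearch_budget=$SEARCH -Dminimization_timeout=... -Dassertion_timeout=...
echo "search=${SEARCH}s post=${POST}s"
```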
Am I still heading in the right direction? I'm afraid I may have misconfigured something in EvoSuite, which would explain the discrepancy.
Looking forward to hearing from you :smile: Thank you ^^
Hello,
As the SBST Unit-Test Competition uses Docker for benchmarking, I wonder whether there are any plans to also publish a replication package for EvoSuite's runs in the SBST competitions?
Actually, I want to reproduce EvoSuite's results from the SBST 2020 competition, but I keep getting different results (almost all targets get 0% branch coverage) from those in EvoSuite's results paper, and I don't know why.
Context
I want to reproduce the results of EvoSuite in SBST 2020 as published in the results paper.
Steps to Reproduce
contest_generate_tests.sh evosuite 1 1 60
contest_compute_metrics.sh results_evosuite_60 > stat_log.txt 2> error_log.txt
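The two steps above can be wrapped in a small driver script. The script names and arguments are taken verbatim from my commands; my reading of the argument order (tool, config id, run id, time budget in seconds) is an assumption, and the commands are echoed rather than executed so the sketch runs standalone without the contest infrastructure:

```shell
#!/bin/sh
# Hypothetical wrapper around the SBST contest scripts used above.
TOOL=evosuite
BUDGET=60
# Step 1: generate tests (assumed argument order: tool, config id, run id, budget)
echo "contest_generate_tests.sh $TOOL 1 1 $BUDGET"
# Step 2: compute metrics on the generated results directory
echo "contest_compute_metrics.sh results_${TOOL}_${BUDGET} > stat_log.txt 2> error_log.txt"
```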
EvoSuite Arguments
Current Result
Although some test cases are successfully generated, branch coverage is 0% for (almost) all targets.
Here is a sample for GUAVA-128_1: GUAVA-128_1.zip
P.S. separateClassLoader = false and public Throwables_ESTest() { super(); } were manually added in Throwables_ESTest.java, but that didn't help. I also found this error message in metrics/log_detailed.txt: GUAVA-128_1-metrics.zip
Expected result
A result similar to the results paper.
Additional info
It's quite a dilemma for me whether to post this issue here or in the SBST Docker infrastructure's GitHub repo.
But coming back to my initial question: are there any plans to publish a replication package for EvoSuite?
Thank you. :)