Open hactar opened 3 years ago
Hi @hactar; I won't be able to guess why this bad junit is being generated. Have you reviewed it? Does it look valid? I see from the output that you provided that you used the --verbose
flag. I added a lot of output around junits and HTML reporting; do you see any clues there?
Are there any crashes in the test runner that you can see? Usually, full logs have more information before the problem that helps me figure out what is going on, or at least lets me add more logging around areas that I suspect are the culprit.
Hey @lyndsey-ferguson, thanks, a few updates. So, two things: first, I posted the junit already (BikemapTests-batch-3; the others, BikemapTests-batch-1, BikemapTests-batch-2, and BikemapTests-batch-4, look normal):
<?xml version='1.0' encoding='UTF-8'?>
<testsuites tests='0' failures='0'/>
That's it; it does not contain anything other than this. I've gone deeper into the logs and, yes, it appears that a test runner crashed during the run:
I took a look at Test-Transient Testing-2021.05.15_11-13-48-+0200.xcresult, but it does not appear to contain anything useful; see the screenshot. The build log file that it references does not exist.
Here's the full log, encrypted: build-multi-scan-enc.zip
Would a good workaround be adding something to collate_junit_reports.rb above line 13 that filters out XML files whose root object does not contain a .name value?
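A minimal sketch of the suggested filter (hypothetical helper, not the plugin's actual code): skip any junit report whose root element lacks a name attribute, which is exactly what the truncated report from the crashed runner looks like.

```ruby
require 'rexml/document'

# Hypothetical helper in the spirit of the suggestion above: a report is
# only usable for collation if its root element carries a 'name' attribute.
def usable_junit_report?(xml_string)
  root = REXML::Document.new(xml_string).root
  !root.nil? && !root.attributes['name'].nil?
rescue REXML::ParseException
  false # unparseable files are skipped too
end

good = "<?xml version='1.0'?><testsuites name='AllTests' tests='5' failures='0'/>"
bad  = "<?xml version='1.0'?><testsuites tests='0' failures='0'/>"

puts usable_junit_report?(good) # true
puts usable_junit_report?(bad)  # false
```

As noted later in the thread, filtering alone is not a complete fix, because the tests from the crashed runner still never get rerun.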
@hactar that would be a workaround, but I wonder why the file doesn't exist?
I have to move this to the backlog again; personal life has gotten too busy for me to pick this up.
So just to be clear, the file does exist, it's just malformed: it's not a full report, and only contains two lines because the test runner crashed:
<?xml version='1.0' encoding='UTF-8'?>
<testsuites tests='0' failures='0'/>
I tried the workaround, but it's not complete: the tests that were supposed to run in that test runner are never run or rerun. multi_scan completes with the workaround, but because only 38 or so of the 47 tests were run, it marks the result as a failure.
We have found a different workaround: we removed parallel_testrun_count from our multi_scan settings. By turning off the parallel test run we no longer have crashing test runners, and therefore the tests get run and rerun reliably, at the cost of parallelism.
Sure, it is malformed, but we don't know why. That's what concerns me. multi_scan uses the results of those junit files to determine which tests passed so it doesn't have to run them again. Without a valid junit, the program doesn't know.
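As a rough sketch of that mechanism (assumed logic, not the plugin's actual implementation): a retry loop can derive the set of still-failing tests from a junit report and re-run only those, which is why an empty report silently drops an entire batch.

```ruby
require 'rexml/document'

# Sketch: collect every <testcase> that contains a <failure> child,
# producing the list of tests a retry pass would re-run.
def failing_tests(junit_xml)
  doc = REXML::Document.new(junit_xml)
  failures = []
  doc.elements.each('//testcase') do |tc|
    next unless tc.elements['failure']
    failures << "#{tc.attributes['classname']}/#{tc.attributes['name']}"
  end
  failures
end

report = <<~XML
  <testsuites name='AllTests' tests='2' failures='1'>
    <testsuite name='BikemapTests' tests='2' failures='1'>
      <testcase classname='BikemapTests' name='testMap'/>
      <testcase classname='BikemapTests' name='testRoute'>
        <failure message='boom'/>
      </testcase>
    </testsuite>
  </testsuites>
XML

puts failing_tests(report).inspect # ["BikemapTests/testRoute"]
```

A report like `<testsuites tests='0' failures='0'/>` yields an empty list here, so the never-run tests look indistinguishable from an all-green batch.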
Thanks for the note about the parallelism. There were problems when running tests in parallel where extra params sent via a Scanfile broke things. I believe I fixed them, but make sure that you're not using a Scanfile, or that it is relatively empty, to test whether that is the problem.
@hactar I'm trying to decrypt the log files that you sent me. To decrypt the key that you encrypted with my public key, I am using this command:
openssl rsautl -decrypt -ssl -inkey ~/.ssh/id_rsa -in secret.txt.key.enc -out secret.txt.key
When I try to read the secret.txt.key, my computer tells me that it is a binary key (which it shouldn't be).
Did you encrypt that key with my public key like this?
# download https://github.com/lyndsey-ferguson/fastlane-plugin-test_center/files/5577804/lyndsey-ferguson-id_rsa.pub.pkcs8.zip -> lyndsey-ferguson-id_rsa.pub
openssl rsautl -encrypt -pubin -inkey <my public key> -in secret.txt.key -out secret.txt.key.enc
And if so, did you create the secret.txt.key and encrypt the log file with these commands?
The exact command I ran was this:
openssl rsautl -encrypt -pubin -inkey lyndsey-ferguson-id_rsa.pub.pkcs8 -in secret.txt.key -out secret.txt.key.enc
I encrypted the log file using openssl aes-256-cbc -in build-multi-scan.txt -out build-multi-scan.txt.enc -pass file:secret.txt.key
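For reference, the scheme in this exchange is hybrid encryption: the large log is encrypted with a random symmetric key, and only that small key file is then RSA-encrypted for the recipient. A minimal Ruby sketch of the symmetric half (illustrative only; it uses a raw key/IV rather than the exact key derivation of the openssl commands above):

```ruby
require 'openssl'
require 'securerandom'

log = "multi_scan output..." # stands in for build-multi-scan.txt

key = SecureRandom.random_bytes(32) # the role of secret.txt.key
iv  = SecureRandom.random_bytes(16)

# Encrypt the log with AES-256-CBC.
cipher = OpenSSL::Cipher.new('aes-256-cbc')
cipher.encrypt
cipher.key = key
cipher.iv  = iv
encrypted = cipher.update(log) + cipher.final

# Decrypt with the same key and IV; round-trips back to the log.
decipher = OpenSSL::Cipher.new('aes-256-cbc')
decipher.decrypt
decipher.key = key
decipher.iv  = iv
puts decipher.update(encrypted) + decipher.final
```

Only `key` would need to be RSA-encrypted with the recipient's public key, which is why the key file (not the log) is what goes through rsautl.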
Thanks for the tip with the Scanfile; we do still have a Scanfile around and will see if removing it changes anything. (The disabling of parallel testing is holding up, though; the issue has not occurred since.)
Isn't it just the one generated by running build-without-testing? I'm having the same issue and wasn't able to resolve it by setting the parallel test count to 1.
@hactar, I have been working on making multi_scan use xcresult files instead of junit files. If you're interested in testing it (alpha quality), follow the issue for the feature request.
@lyndsey-ferguson Cool, thanks! We're busy with our iOS 15 release but will test it once we find a moment.
Same issue here, trying the workaround:
multi_scan(
  workspace: ENV["WORKSPACE"],
  devices: [DEFAULT_SIMULATOR],
  scheme: ENV["UI_TEST_SCHEME"],
  skip_slack: true,
  result_bundle: true,
  include_simulator_logs: false, # does not work in lab (causes Error 74)
  output_types: "junit",
  try_count: 5,
  # parallel_testrun_count: 3, # disabled as a workaround for this issue: https://github.com/lyndsey-ferguson/fastlane-plugin-test_center/issues/339
  collate_reports: true
)
I want to apologize for not having responded to this issue. A family health issue is requiring me to narrow my focus to the essentials, and I don't have more than a minute to spend on the plugin. If you can get some other contributors to this project to help with this, I would consider merging a resulting PR.
New Issue Checklist
Updated fastlane-plugin-test_center to the latest version

Issue Description
multi_scan crashes when it encounters a junit file that does not include a "name" in its root XML object; the crash happens while collating the test reports. I don't know why such a junit file was generated in the first place; I have attached it to this issue. My guess is there are two things to do: 1) figure out why a "wrong" junit is generated in the first place, and 2) make collate_junit_reports.rb more robust so it ignores such junits in case they do occur.
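To illustrate the failure mode (the nil-handling detail is an assumption about the collation code, not taken from the plugin source): parsing the attached two-line report yields a root element with no name attribute, so any collation step that calls a string method on that attribute raises NoMethodError on nil.

```ruby
require 'rexml/document'

# The attached report from the crashed test runner, verbatim.
malformed = "<?xml version='1.0' encoding='UTF-8'?><testsuites tests='0' failures='0'/>"
root = REXML::Document.new(malformed).root

puts root.attributes['name'].inspect # nil

begin
  # Any string operation on the missing attribute blows up:
  root.attributes['name'].sub('Tests', '')
rescue NoMethodError => e
  puts "crash: #{e.class}" # crash: NoMethodError
end
```

This is consistent with both proposed fixes: either the runner crash that produces the nameless report is prevented, or collation tolerates a nil name.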
Complete output when running fastlane, including the stack trace and command used
I won't include the full output, only the parts I think are relevant, but I can provide more if required:
Tail of the output:
multi_scan settings:
How multi_scan is called:
The report.junit the plugin is choking on (note: the other three junit files look OK, and this does not happen on every run; sometimes all junit files are OK):