Yes, there are some flaky tests due to the tricky nature of testing concurrent code. Additionally, there are some issues with Instant truncation and flaky Testcontainers. I try to fix the painful ones. Typically I run the tests on a multicore machine; I have never tried a single-core one. Not sure why I would 🤔
I would rather we fixed the tests than add that type of disclaimer to the readme, though.
Pull requests making the tests more stable would be awesome. However, not by just adding Thread.sleeps :)
@kagkarlsson I wanted to provide some additional context about my efforts to fix the tests. I have spent considerable time trying to resolve the issues, but due to the project's complexity, finding a simple solution has been challenging. Despite exploring different approaches, I have not been able to achieve the desired stability.
This is the error message:
expected: <true> but was: <false>
It occurs on line 39 (link).
You mentioned running the tests on a multicore machine, but not on a single-core machine. While this may have worked for you, it's important to consider that users with different hardware configurations may encounter problems. By specifying the minimum system requirements in the project documentation, we can ensure that all users are aware of the baseline configuration necessary for stable test execution. This approach can prevent unexpected test failures and save developers the trouble of debugging unrelated issues.
I see your point. But how many would read that doc, and how many of those are building on single-core machines? I would guess not many. But I agree that it might be useful to add a section in the Readme on how to build the source, and possibly add this hint there.
Hello @kagkarlsson, can you take a look at this PR?
I made it based on what we discussed above and your last comment. Let me know if you think something can be improved.
🎉 This issue has been resolved in v12.4.0
(Release Notes)
Hello,
We tried running your project and discovered that it contains some flaky tests (i.e., tests that nondeterministically pass and fail). We found these tests to fail more frequently when running them on certain machines of ours.
To prevent others from running this project's tests on machines that may produce flaky results, we suggest adding information to the README.md file indicating the minimum resource configuration required to run the tests without observing flakiness.
If we run this project on a machine with 1 CPU and 500 MB of RAM, we observe flaky tests. The tests did not flake when we ran them on machines with 2 CPUs and 2 GB of RAM.
Here are the tests we have identified and their likelihood of failure on a system with less than the recommended 2 CPUs and 2 GB of RAM.
Please let me know if you would like us to create a pull request on this matter (possibly to the readme of this project).
Thank you for your attention to this matter. We hope that our recommendations will be helpful in improving the quality and performance of your project, especially for others who use it.
Reproducing
Build the image:
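A minimal sketch of the build step, assuming a Dockerfile at the repository root and an image tag of `flaky-test-env` (both the Dockerfile location and the tag are assumptions, not the exact setup we used):

```sh
# Build an image containing the project sources and build tooling
# (Dockerfile location and image tag are assumptions)
docker build -t flaky-test-env .
```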
Running:
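A sketch of the constrained run, assuming the tests are executed with Maven inside the container; the `--cpus` and `--memory` flags limit the container to the 1 CPU / 500 MB configuration described above:

```sh
# Constrain the container to 1 CPU and 500 MB of RAM to reproduce the flaky behaviour
# (image tag and test command are assumptions)
docker run --rm --cpus=1 --memory=500m flaky-test-env mvn test
```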