connext / indra

[LEGACY] Monorepo containing everything related to the core Connext protocols and network.
MIT License

Tests: Integrate xud simulation tests #1140

Open kilrau opened 4 years ago

kilrau commented 4 years ago

Is your feature request related to a problem? Please describe. To avoid regression for us, @LePremierHomme started working on integrating connext into our simulation test suite, which tests all typical flows from setup, sync and swap with many "unhappy" edge cases. The simulation test suite is dockerized and running on our travis.

Describe the solution you'd like I think it would make sense to run our simulation test suite as part of the connext CI as well, once the connext integration is done.

Describe alternatives you've considered Not sure; if integration into the CI proves to be difficult, maybe run these manually and locally as a final test before tagging a release, for starters.

ArjunBhuptani commented 4 years ago

Is there any way these tests can be run in the XUD CI? The other folks who are setting up Connext nodes for various usecases are running custom tests in their own CI/CD flow with new versions of Connext.

Our CI process is already pretty long/complex/extensive. Adding this would be another thing that we have to maintain which is tough. From our end, we're working on making our tests more robust and comprehensive. That should handle protection against regressions for the most part.

kilrau commented 4 years ago

Is there any way these tests can be run in the XUD CI?

They are already (in travis with every new commit). Connext-specific test scenarios are currently being integrated there.

Our CI process is already pretty long/complex/extensive.

Understood. But isn't it enough to run these kinds of tests from clients only before tagging a release, to make sure nothing broke? Honestly, since our tests are dockerized, one can just run them locally for now before tagging a release. My only point was to catch things before they get into a release.
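Since the suite is dockerized, the pre-release gate could be as simple as a script like the following. This is only a sketch: the compose file path, service name, and tag are hypothetical placeholders, not the actual xud-simulation layout.

```shell
#!/usr/bin/env sh
# Hypothetical pre-release gate: run the dockerized simulation suite locally
# and only tag the release if it passes. File/service names are illustrative.
set -e

# Build and run the simulation suite; propagate the test runner's exit code.
docker compose -f simulation/docker-compose.yml up \
  --build \
  --abort-on-container-exit \
  --exit-code-from simulation-tests

# Reached only if the suite exited 0.
echo "Simulation tests passed; safe to tag the release."
# git tag -a vX.Y.Z -m "release vX.Y.Z"
```

With `set -e` and `--exit-code-from`, a failing simulation run stops the script before any tagging happens, which matches the goal of catching regressions before they land in a release.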

Adding this would be another thing that we have to maintain which is tough.

We would maintain these.

The other folks who are setting up Connext nodes for various usecases are running custom tests in their own CI/CD flow with new versions of Connext.

Per above, we will do this anyway, but like this we can only detect things that broke after a release was tagged. If something broke, we will have to skip the release, dive into what broke and why, and then connect with you guys on how to fix it.

TL;DR: for the foreseeable "moving-fast" future, us running the tests against releases is fine. But mid-term we need stability for our use case, and that ideally means new releases reliably pass our tests. Hopefully we can find an elegant way to get there.