nextcloud / cdav-library

CalDAV and CardDAV client library for JavaScript
https://www.npmjs.com/package/@nextcloud/cdav-library
GNU Affero General Public License v3.0

Integration Tests #2

Open georgehrke opened 6 years ago

georgehrke commented 6 years ago

I'd like to have integration tests that test this library against the real Nextcloud CalDAV and CardDAV server.

Any suggestions for an integration test framework?

rullzer commented 6 years ago

Summoning @danxuliu for input

danxuliu commented 6 years ago

I am afraid I cannot be of much help here, as I have never used integration test frameworks with JavaScript. Anyway, here is some information just in case you find it useful (although I am sure you already know everything I will write below ;-) ).

The integration test framework used in the server is Behat, which is the PHP implementation of Cucumber. Cucumber is a Behaviour-Driven Development framework with implementations in several languages, including JavaScript, so you may want to use it to keep a consistent style in the test definitions between the server and this library.
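For illustration, a Gherkin scenario for this library could look something like the sketch below (the feature, user name and address book name are invented, not actual test definitions from the server repository); Cucumber then maps each Given/When/Then sentence to a step definition written in JavaScript:

```gherkin
Feature: Fetching address books

  Scenario: List the address books of a user
    Given a Nextcloud server with a user "admin"
    When the client connects to the CardDAV endpoint as "admin"
    Then an address book named "contacts" is returned
```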

However, Cucumber basically provides a translation between the human-readable Gherkin language (Given, When and Then sentences) and the programming language in which the tests really run. As far as I know it does not provide anything for the integration tests themselves, in the sense of starting the server before a test and cleaning it up afterwards. In my experience that (test isolation) is precisely the most complex part of integration (and acceptance) tests, and I am not aware of any tool or framework that takes care of it automatically :-(

In the integration tests of the server the clean-up is done explicitly; there are several methods annotated as @AfterScenario and @AfterSuite that act as an undo of their associated tests. The acceptance tests use a different approach: after each test is run, the server is stopped and then started again in its initial state.
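As a minimal sketch of that explicit-clean-up pattern (this is not the actual Behat/Cucumber API; every name below is invented for illustration): each step that mutates the server registers an undo action, and an after-scenario hook runs the undos in reverse order.

```javascript
// Invented sketch of explicit clean-up: mutating steps register an undo,
// and an "after scenario" hook replays the undos in reverse order,
// similar in spirit to the @AfterScenario methods mentioned above.
const cleanupActions = [];

function registerCleanup(action) {
    cleanupActions.push(action);
}

// Stand-in for server state touched by a hypothetical test step.
const serverState = { calendars: [] };

function createCalendar(name) {
    serverState.calendars.push(name);
    // Register the undo right next to the mutation it reverts.
    registerCleanup(() => {
        serverState.calendars = serverState.calendars.filter((c) => c !== name);
    });
}

function afterScenario() {
    // Undo in reverse order so later steps are reverted first.
    while (cleanupActions.length > 0) {
        cleanupActions.pop()();
    }
}

// Example scenario:
createCalendar('personal');
createCalendar('work');
console.log(serverState.calendars.length); // 2

afterScenario();
console.log(serverState.calendars.length); // 0, server back to its initial state
```

The fragile part, as noted above, is that every mutation needs a matching undo; one forgotten clean-up leaves state behind for the following tests.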

The advantage of fully resetting the server is that each test is fully isolated: with explicit clean-up, if a test fails and leaves the server in an unexpected state not covered by the clean-up method, further tests can fail in cascade. More importantly, explicit clean-up requires extra planning and maintenance to write the tests and then keep them in sync with their clean-up code. The drawback of resetting the server, however, is that the tests are slower to run and, more importantly, that the approach currently works when using SQLite, but not when the data is stored in a real database.

Initially, the acceptance tests ran in Docker containers, which made it trivial and quick to reset them to a known state. The problem with that approach is that they could not be run in Drone: Drone tasks themselves run in Docker containers, so it would require nested containers, which, due to the current architecture of Drone and Docker, would mean that anyone making a pull request to our repository could get root access to the machine the tests run on without much effort (great, isn't it? :-P ). So now the acceptance tests run in the built-in PHP server using SQLite, and a dirty trick is used to reset them: all the files are added to a local Git repository, which is checked out again at its initial state after each test is run. Unfortunately that trick limits them to the SQLite database :-(

End of the verbose yet barely useful explanation :-P