Closed emijrp closed 10 years ago
The difficulty would be configuring images for all sorts of (MediaWiki version, webserver) option pairs and triggering them to test each build. Even on gerrit.wikimedia.org, they only test against one version/configuration at a time AFAIK.
If we're lucky @hashar or @krinkle have some advice for us.
I mean to test against real sites, mainly those that have failed in the past.
For example some of #134, this one #127, etc.
But we can never know if they changed their configuration, upgraded their release, or have database/webserver/content/configuration issues... Could be better than nothing, sure.
As a start, at least in lucky periods like this when two or more of us are around, we could get into the habit of sending a PR and letting someone else merge it to master. I should have re-read before checking out the code, but I didn't; had I had the pressure of an open pull request, I would probably have read rather than skimmed, and caught that typo.
Hello there! Wikimedia has a cluster that runs the tip of the master branch of all our repositories. It is updated continuously (configuration after each merge, code every 10 minutes or so, and the databases every hour or so).
We only have a subset of the real projects https://github.com/wikimedia/operations-mediawiki-config/blob/master/all-labs.dblist and they barely have any content, but their configuration should match the production one.
Example URL:
http://en.wikipedia.beta.wmflabs.org/
We have a mobile version as well: http://en.m.wikipedia.beta.wmflabs.org/
The Simple wiki version has a full import of the last revision of each page from a couple of years ago, so there are plenty of articles there: http://en.simple.wikipedia.beta.wmflabs.org/
Let us know if you need assistance. There are some folks on Freenode in #wikimedia-qa who might help. You can also use the QA mailing list at https://lists.wikimedia.org/mailman/listinfo/qa
Or there, but you reach a smaller audience :-]
Just wrote a first unit test: d04c0e5bceaaf151041e863d5eb1daf184381c9c
It seems easy. Just create a test_function(), initialize a config for the wiki to test, call the function you want from dumpgenerator, and inspect the result.
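For reference, here is a minimal sketch of that pattern with the standard unittest module. The function name, config keys, and expected titles are hypothetical stand-ins, not dumpgenerator's actual API; a real test would import dumpgenerator and call its real functions.

```python
import unittest

# Hypothetical stand-in for a dumpgenerator function; a real test would
# `import dumpgenerator` and call the function under test instead.
def get_page_titles(config):
    # Stub result; the real function would query the wiki at config['api']
    return ['Main Page', 'Sandbox']

class TestDumpGenerator(unittest.TestCase):
    def setUp(self):
        # Minimal config for the wiki to test (keys are illustrative)
        self.config = {
            'api': 'http://en.wikipedia.beta.wmflabs.org/w/api.php',
        }

    def test_page_titles(self):
        titles = get_page_titles(self.config)
        # Explore the result: non-empty and containing an expected title
        self.assertTrue(len(titles) > 0)
        self.assertIn('Main Page', titles)
```

Run it with `python -m unittest`; a failing assertion then points directly at the dumpgenerator function that broke.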
The testing directory contains a script that is neat and works fine. Perhaps in the future we will need something more advanced, but for now it is very nice. Closing this.
I think we need to build a script to run some tests in an automated way, every time we produce a new version of dumpgenerator.py.
The testing script could include checks against small wikis running a variety of MediaWiki versions, with and without the API, and so on. We need a list of expected results to compare the test output against.
Does anyone have experience with testing? I have never done this before.
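One possible shape for such a script, just as a sketch (the wiki URLs, counts, and helper function here are all made up for illustration): keep a table of known-good results per wiki and compare each run against it.

```python
# Hypothetical stand-in for a dumpgenerator call; a real runner would
# import dumpgenerator and count the titles it actually retrieves.
def count_titles(wiki):
    sample = {
        'http://example-old.wiki/api.php': 42,  # e.g. an old MediaWiki, no API
        'http://example-new.wiki/api.php': 42,  # e.g. a recent MediaWiki, with API
    }
    return sample.get(wiki, 0)

# Each entry: (wiki URL, expected number of page titles)
EXPECTED = [
    ('http://example-old.wiki/api.php', 42),
    ('http://example-new.wiki/api.php', 42),
]

def run_checks():
    # Compare every wiki's result against its known-good value
    failures = []
    for wiki, expected in EXPECTED:
        got = count_titles(wiki)
        if got != expected:
            failures.append((wiki, expected, got))
    return failures

if __name__ == '__main__':
    for wiki, expected, got in run_checks():
        print('FAIL %s: expected %d, got %d' % (wiki, expected, got))
```

Running this after each new version of dumpgenerator.py would flag any wiki whose dump no longer matches the recorded expected result.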