Tools for downloading and preserving wikis. We archive wikis, from Wikipedia to the tiniest wikis. As of 2024, WikiTeam has preserved more than 600,000 wikis.
From nemow...@gmail.com on February 16, 2014 09:44:22
A number of non-API downloads terminate like this:
urllib2.HTTPError: HTTP Error 302: The HTTP server returned a redirect error that would lead to an infinite loop.
The last 30x error message was:
Found
tail: cannot open `mikroitetdk_fictionpedia-20140215-wikidump/mikroitetdk_fictionpedia-20140215-history.xml' for reading: No such file or directory
With variations in the 30x error code and in the error message. One example is http://www.ifnipedia.es/index.php?title=Especial:Todas (MediaWiki 1.16); another is http://mikroitet.dk/wiki/Special:AllPages (also 1.16). I can't reproduce this manually because I don't even see any "Next" links; maybe that's the actual problem?
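For context, that "would lead to an infinite loop" message is what urllib2's HTTPRedirectHandler raises once the same redirect repeats or too many hops accumulate. The caps are plain class attributes and can be raised, though that only helps when the redirect chain eventually terminates; a true loop, as suspected here, will still fail. A minimal sketch (shown with Python 3's urllib.request, which mirrors the urllib2 API this code uses):

```python
import urllib.request  # urllib2 in the Python 2 code this issue concerns

class PatientRedirectHandler(urllib.request.HTTPRedirectHandler):
    # The stock handler aborts after max_redirections hops, or when the
    # same URL repeats more than max_repeats times, raising the
    # HTTPError quoted above. Raising the caps tolerates long but
    # finite redirect chains; it cannot fix a genuine loop.
    max_repeats = 10       # default is 4
    max_redirections = 30  # default is 10

# An opener built this way replaces the default redirect handler.
opener = urllib.request.build_opener(PatientRedirectHandler)
```

Requests made through `opener.open(url)` then follow up to 30 redirects before giving up.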
I haven't audited all such wikis, because launcher.py just moves on to the next wiki and doesn't log the error (maybe worth a separate issue?), but the first thing to check is the logic of the Special:AllPages screen scraping.
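Whatever the server-side cause, the Special:AllPages walker could defend itself by remembering which listing URLs it has already fetched and bailing out when one repeats. This is a sketch of that guard only, not the actual dumpgenerator.py code; `fetch_next` is a hypothetical callable standing in for the screen-scraping step that extracts the "Next page" link (returning None when there is no such link, as on the wikis reported above):

```python
def follow_allpages(start_url, fetch_next):
    """Walk Special:AllPages listing pages, stopping on a loop.

    fetch_next(url) returns the URL of the next listing page,
    or None when there is no "Next page" link.
    """
    seen = set()
    pages = []
    url = start_url
    while url is not None:
        if url in seen:
            # Seeing the same listing URL twice means the "Next" links
            # (or redirects) cycle, so looping forever is the only
            # alternative to stopping here.
            raise RuntimeError("Pagination loop detected at %s" % url)
        seen.add(url)
        pages.append(url)
        url = fetch_next(url)
    return pages

# Simulated wiki where page B links back to A (the failure mode above):
links = {"A": "B", "B": "A"}
try:
    follow_allpages("A", lambda u: links.get(u))
except RuntimeError as e:
    print(e)  # Pagination loop detected at A
```

With a well-behaved wiki (e.g. `{"A": "B", "B": None}`) the same function simply returns the list of listing pages visited.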
Original issue: http://code.google.com/p/wikiteam/issues/detail?id=102