WikiTeam / wikiteam

Tools for downloading and preserving wikis. We archive wikis, from Wikipedia to the tiniest wikis. As of 2024, WikiTeam has preserved more than 600,000 wikis.
https://github.com/WikiTeam
GNU General Public License v3.0

PmWiki export #105

Open emijrp opened 10 years ago

emijrp commented 10 years ago

From nemow...@gmail.com on April 04, 2014 21:02:24

Apparently some way to fetch data from a PmWiki installation exists; worth investigating: https://www.ohloh.net/p/pm2media

Original issue: http://code.google.com/p/wikiteam/issues/detail?id=105

emijrp commented 10 years ago

From nemow...@gmail.com on April 21, 2014 02:47:00

gremilkar is interested in this for http://tvtropes.org/ export.

Cc: gremil...@gmail.com

nemobis commented 10 years ago

@gremilkar, are you still interested in exporting tvtropes? @walkingice, @pmwiki, @MichaelPaulukonis, @gambhiro, @fernao, do any of you have ideas on how one could archive/dump a PmWiki's data from the web?
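
For reference, a minimal sketch of one possible approach, assuming the target wiki still exposes PmWiki's standard `?action=source` view (which returns a page's raw markup) and that a list of page names is already known. The wiki URL and page list below are placeholders:

```python
import urllib.parse
import urllib.request

# Placeholder wiki URL and page list; PmWiki names pages as Group.Name.
WIKI_URL = "http://example.org/pmwiki/pmwiki.php"
PAGES = ["Main.HomePage", "Main.WikiSandbox"]

for page in PAGES:
    # PmWiki's built-in ?action=source view returns the raw wiki markup.
    url = "%s?n=%s&action=source" % (WIKI_URL, urllib.parse.quote(page))
    with urllib.request.urlopen(url) as resp:
        markup = resp.read().decode("utf-8", errors="replace")
    with open(page + ".txt", "w", encoding="utf-8") as out:
        out.write(markup)
```

The hard part is enumerating the pages in the first place; scraping links out of a page like Site.AllRecentChanges is one option, but it is not guaranteed to list every page.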

MichaelPaulukonis commented 10 years ago

I can't find a canonical reference, but previously I found that TvTropes no longer uses PmWiki as their engine; it's just left in the URL like a vermiform appendix.

fernao commented 10 years ago

I'm experimenting with dumping pmwiki to markdown, as my goal is to migrate it to http://ikiwiki.info/. Ikiwiki renders HTML directly from mdwn files.

There's a pmwiki module (requiring PHP < 5.4) that can be used to export pages in that format. I've written a very simple script to extract all the data of a wiki that exports to markdown, fetching the pages with wget: https://github.com/fernao/pmwiki2mdwn.
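
For illustration only (the actual script is at the link above), the sketch below shows the general wget-driven shape of such an export. The wiki URL, page list, and `action=markdown` name are all placeholders; the export module presumably registers its own action, so check its documentation for the real one:

```python
import subprocess
import urllib.parse

# Placeholders: the wiki URL, the page list, and the "markdown" action name.
WIKI_URL = "http://example.org/pmwiki/pmwiki.php"
PAGES = ["Main.HomePage", "Docs.Install"]

for page in PAGES:
    url = "%s?n=%s&action=markdown" % (WIKI_URL, urllib.parse.quote(page))
    # Fetch each exported page with wget, saving it as Group.Name.mdwn.
    subprocess.run(["wget", "-O", page + ".mdwn", url], check=True)
```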

nemobis commented 10 years ago

> gremilkar is interested in this for http://tvtropes.org/ export.

...though for that specific wiki we now enjoy regular server-side dumps of the fork. https://static.orain.org/common/dumps/allthetropes/