google-code-export / wiki2latex

Automatically exported from code.google.com/p/wiki2latex

Process multiple Wiki articles (+ Hierarchy extension?) #18

Open · GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
It would be great if Wiki2LaTeX could compile a large LaTeX document from
multiple Wiki articles.

Currently, I use the Hierarchy extension for MediaWiki to organise articles
into a tree structure. For this extension to work, an "index" article is
created, containing e.g.:
  <index>
  = Article1 =
  == Article2 ==
  = Article3 =
  </index>

This would organise the articles as follows:
1. Article1
1.1 Article2
2. Article3

Ideally, Wiki2LaTeX would be able to parse such an <index> ... </index>
specification and generate suitable LaTeX code. For example:

  \chapter{Article1}
  [LaTeX code for contents of Article1]

  \section{Article2}
  [LaTeX code for contents of Article2]

  \chapter{Article3}
  [LaTeX code for contents of Article3]

This would really help me a lot.
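
For concreteness, a minimal standalone sketch of the requested mapping, in Python (fetch_latex is a placeholder for whatever w2l uses to convert a single article; none of these names exist in Wiki2LaTeX):

  import re

  # Map heading depth inside <index> to LaTeX sectioning commands.
  LEVELS = {1: "chapter", 2: "section", 3: "subsection"}

  def index_to_latex(index_body, fetch_latex):
      """fetch_latex(title) is an assumed callback returning the
      already-converted LaTeX body of a single article."""
      out = []
      for line in index_body.splitlines():
          m = re.match(r"^(=+)\s*(.+?)\s*\1$", line.strip())
          if not m:
              continue  # ignore anything that is not a heading entry
          depth, title = len(m.group(1)), m.group(2)
          out.append("\\%s{%s}" % (LEVELS.get(depth, "subsection"), title))
          out.append(fetch_latex(title))
      return "\n\n".join(out)

  # Feeding it the index above with a stub callback reproduces the
  # \chapter/\section skeleton shown:
  print(index_to_latex("= Article1 =\n== Article2 ==\n= Article3 =",
                       lambda t: "[LaTeX code for contents of %s]" % t))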

Another possibility would be to allow the user to specify a list of articles, which would then be compiled into a single LaTeX document in which every article starts a new chapter.

However, for me, the connection with the Hierarchy extension would be particularly useful, because the "index" page already contains the list of articles together with the document structure.

Original issue reported on code.google.com by s.c.w.pl...@gmail.com on 9 Jul 2008 at 9:19

GoogleCodeExporter commented 9 years ago
Supporting the extension is possible, of course. It looks to me as though it wouldn't be too hard to write a small function that provides the same feature in Wiki2LaTeX. I will look into it.

The second way is already possible: you can create a page that transcludes other pages into it (much the way templates work). This page can then be processed by w2l in the usual way and should give the expected result.

Original comment by hansgeorg.kluge@gmail.com on 9 Jul 2008 at 12:27

GoogleCodeExporter commented 9 years ago
Sounds great, thanks for the tips!
BTW, the type of this issue should be "Enhancement", but I don't see a way of changing that.

Original comment by s.c.w.pl...@gmail.com on 9 Jul 2008 at 1:44

GoogleCodeExporter commented 9 years ago
Yesterday, I followed your advice about transcluding wiki pages in the following way. I created a hook for a new action, "hierarchy_one_page". This hook renders all articles referenced in a Hierarchy index on a single page. Effectively, the hook replaces the wiki text:
  <index>
  = Article1 =
  == Article2 ==
  = Article3 =
  </index>
by the following:
  = Article1 =
  {{:Article1}}
  == Article2 ==
  {{:Article2}}
  = Article3 =
  {{:Article3}}
and then uses $wgOut to render this wiki text.
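
In standalone form, the rewrite performed by the hook looks roughly like this (a Python sketch of the text transformation only; the real hook also has to deal with MediaWiki's parser and output interfaces):

  import re

  def index_to_transclusions(index_body):
      """Replace each heading entry of an <index> block by the heading
      itself followed by a transclusion of the article it names."""
      out = []
      for line in index_body.splitlines():
          m = re.match(r"^(=+)\s*(.+?)\s*\1$", line.strip())
          if m:
              out.append(line.strip())            # keep the heading
              out.append("{{:%s}}" % m.group(2))  # transclude the article
      return "\n".join(out)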

However, the page would not render for indexes containing a large hierarchy (about 40 articles): I would simply get a blank page. I think this is a memory issue: the amount of wiki text that MediaWiki has to parse and render because of the transclusions is simply too large. For smaller hierarchies it worked okay.

Therefore, I think the following solution would be better for extending Wiki2LaTeX:
- for every entry of the form "[=]+ X [=]+" in the <index> ... </index> tag:
  * convert the following wiki code to LaTeX:
      [=]+ X [=]+
      Wiki-contents of article X
  * store this LaTeX code in a separate .tex file in a tmp directory;
- create a master .tex file that includes all of the generated .tex files using \input{};
- run pdflatex on that master file.

This processes the articles one by one instead of first collecting all the wiki code and then passing that huge bulk to the w2l parser, which should prevent the memory problems.
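
A rough standalone sketch of that pipeline, in Python (the article_latex dict stands in for the per-article output of the w2l parser; the report class and the bare pdflatex call are assumptions for illustration, not actual w2l behaviour):

  import subprocess

  def build_document(article_latex, tmp):
      """article_latex maps article titles to their LaTeX bodies, in
      document order; tmp is a pathlib.Path for the build directory."""
      tmp.mkdir(parents=True, exist_ok=True)
      inputs = []
      for i, latex in enumerate(article_latex.values()):
          part = tmp / ("part%d.tex" % i)
          part.write_text(latex, encoding="utf-8")  # one .tex file per article
          inputs.append("\\input{%s}" % part.stem)
      master = tmp / "master.tex"
      master.write_text("\\documentclass{report}\n\\begin{document}\n"
                        + "\n".join(inputs)
                        + "\n\\end{document}\n", encoding="utf-8")
      # run pdflatex inside tmp so \input{} can find the part files
      subprocess.run(["pdflatex", master.name], cwd=tmp, check=True)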

Original comment by s.c.w.pl...@gmail.com on 10 Jul 2008 at 9:57
