GoogleCodeExporter opened 9 years ago
Supporting the extension is possible, of course, and it looks to me like it wouldn't be too hard to code a small function that provides the same feature in Wiki2LaTeX. I will look into it.
The second way is already possible: you can create a page that transcludes the other pages into it (much the same way templates work). This page can then be processed by w2l in the usual way and should give the expected result.
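For example (the page names here are just placeholders), a collection page could contain nothing but:

{{:Article1}}
{{:Article2}}

and w2l would see the full text of both articles when it processes that page.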
Original comment by hansgeorg.kluge@gmail.com
on 9 Jul 2008 at 12:27
Sounds great, thanks for the tips!
BTW, the type of this issue should be "Enhancement" but I don't see a way of
changing
that.
Original comment by s.c.w.pl...@gmail.com
on 9 Jul 2008 at 1:44
Yesterday, I followed your advice about transcluding wiki pages in the following way: I created a hook for a new action "hierarchy_one_page". This hook renders all articles referenced in a Hierarchy index on a single page. Effectively, the hook replaces the wiki-text:
<index>
= Article1 =
== Article2 ==
= Article3 =
</index>
by the following:
= Article1 =
{{:Article1}}
== Article2 ==
{{:Article2}}
= Article3 =
{{:Article3}}
and then uses $wgOut to render this wiki text.
However, the page would not render for indexes containing a large hierarchy (about 40 articles): I would simply get a blank page. I think this is a memory issue: the amount of wiki-text that MediaWiki has to parse and render because of the transclusions is simply too large. For smaller hierarchies it worked fine.
Therefore, I think the following solution would be better for extending
Wiki2LaTeX:
- for every entry of the form "[=]+ X [=]+" in the <index> ... </index> tag:
  * convert the following wiki code to LaTeX:
      [=]+ X [=]+
      Wiki-contents of article X
  * store this LaTeX code in a separate .tex file in a tmp directory.
- create a master .tex file that includes all of the created .tex files using \input{}.
- run pdflatex on that master file.
This processes the articles one by one instead of first collecting all the wiki-code and passing that huge bulk to the w2l parser. This should prevent the memory problems.
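To make the last two steps concrete, here is a rough sketch of what the generated master file could look like (the file names and the tmp/ path are only assumptions for illustration):

\documentclass{article}
\begin{document}
% one .tex file per article, produced earlier by the w2l parser
\input{tmp/Article1}
\input{tmp/Article2}
\input{tmp/Article3}
\end{document}

Running pdflatex on this master file would then pull in the per-article files one after another instead of parsing everything in a single pass.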
Original comment by s.c.w.pl...@gmail.com
on 10 Jul 2008 at 9:57
Original comment by hansgeorg.kluge@gmail.com
on 17 Feb 2011 at 10:52
Original comment by hansgeorg.kluge@gmail.com
on 22 Feb 2011 at 9:59
Original issue reported on code.google.com by
s.c.w.pl...@gmail.com
on 9 Jul 2008 at 9:19