The initial clone or fetch will fail if a page or revision contains "not well-formed" UTF-8 characters.
Since DokuWiki itself stores and displays these characters (for example, a page containing the UTF-8 decoder capability and stress test) while git raises the fatal error, the complaint could come from either the DokuWiki XML-RPC interface or the RPC::XML::Client.pm module.
I haven't looked too closely, but could git-remote-dokuwiki handle the transport error either by ignoring it and storing the malformed data, or by printing a warning and skipping any pages/revisions that cause problems?
e.g.
Retrieving 9659 of 24657...Transport error:
not well-formed (invalid token) at line 34, column 312, byte 3502:
[...]
at /Library/Perl/5.18/RPC/XML/Client.pm line 402. at /usr/local/Cellar/git/2.25.0_1/libexec/git-core/git-remote-dokuwiki line 69.
warning: Not updating refs/dokuwiki/origin/master (new tip 1abb00391d24fe4cd134f7a8c02e0a3c98103a68 does not contain 6fa75660f5fca63356cadc54c40289636dd65d24)
fatal: error while running fast-import
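To illustrate the warn-and-skip behaviour I'm suggesting, here is a minimal sketch in Python (the actual script is Perl; `fetch_all`, `fetch_page`, and the page ids are hypothetical stand-ins, not git-remote-dokuwiki's real API). The idea is to catch the XML parse error per page instead of letting it abort the whole fast-import:

```python
# Sketch: skip pages whose XML-RPC payload is not well-formed,
# warning on stderr instead of raising a fatal error.
import sys
import xml.parsers.expat


def fetch_all(page_ids, fetch_page):
    """Return (pages, skipped): successfully fetched pages and failed ids."""
    pages, skipped = [], []
    for pid in page_ids:
        try:
            pages.append(fetch_page(pid))
        except xml.parsers.expat.ExpatError as err:
            # Same class of error as "not well-formed (invalid token)":
            # warn and continue rather than aborting the clone.
            print(f"warning: skipping {pid}: {err}", file=sys.stderr)
            skipped.append(pid)
    return pages, skipped


def demo_fetch(pid):
    """Demo transport: one page deliberately returns broken XML."""
    doc = "<page>ok</page>" if pid != "bad" else "<page></oops>"
    parser = xml.parsers.expat.ParserCreate()
    parser.Parse(doc, True)  # raises ExpatError on malformed XML
    return pid
```

With this shape, a clone would complete with warnings for the problem pages rather than dying mid-fetch.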
This may or may not be related to #1.