ryandesign opened 4 months ago
I am working on fixing this (and #4) by using trurl to parse the URL (and, if the path is empty after stripping up to the last slash, using the name index.html) and then having wcurl specify the filename with `--output` instead of letting curl do it with `--remote-name`. But I'm having to rewrite the way the curl arguments are collected in `exec_curl`, because the current approach of putting everything into a single string does not accommodate quoting of special characters. I'd normally use a bash array, but if you're trying to maintain POSIX sh compatibility I'll need to be more inventive. POSIX sh only has one array, `$@`, which is already in use in `exec_curl` to hold the URLs, so I'll either try to dual-purpose `$@` or find a different way to store the URLs.
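Roughly, the filename derivation I have in mind looks like the sketch below. The variable names and the exact invocation are my assumptions for illustration, not wcurl's current code; `trurl --get '{path}'` prints the URL's path component.

```sh
#!/bin/sh
# Sketch only: derive curl's --output filename from a URL with trurl.
# The index.html fallback matches the approach described above.
url=$1
path=$(trurl --url "$url" --get '{path}')   # e.g. /dir/file name.html
filename=${path##*/}                        # strip everything up to the last slash
[ -n "$filename" ] || filename='index.html' # empty final component -> index.html
curl --output "$filename" "$url"
```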
@ryandesign Maybe we should reconsider doing it in bash... I'll speak to sergiodj about it.
I don't quite see the reason why `$URLS` is being moved into `$@`, and am attempting to just use `$URLS` directly in the loop and leave `$@` free for the curl arguments. I'm not opposed to the POSIX sh challenge.
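For illustration, the pattern I'm attempting looks something like the sketch below: iterate over `$URLS` directly and grow the curl argument list with `set --`, which keeps each argument properly quoted. The sample `URLS` value is made up, and the sketch assumes the URLs themselves contain no whitespace.

```sh
#!/bin/sh
# Sketch: leave $URLS alone and build curl's argument list in "$@".
# Assumes URLs are whitespace-separated and contain no literal spaces.
URLS='https://example.com/a%20b.html https://example.com/dir/'

set --                                        # start with empty positional parameters
for url in $URLS; do
    path=$(trurl --url "$url" --get '{path}')
    filename=${path##*/}
    [ -n "$filename" ] || filename='index.html'
    set -- "$@" --output "$filename" "$url"   # append, quoting preserved
done
curl "$@"   # curl matches each --output option to its corresponding URL
```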
I think I have it working and will submit it in a few hours.
This works for me but please test:
https://salsa.debian.org/debian/wcurl/-/merge_requests/4
Note there is a new dependency on trurl.
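Since the script now needs trurl at runtime, a startup guard along these lines might be worth having; a minimal sketch (the error message wording is mine, not necessarily the MR's):

```sh
# Sketch of a startup guard for the new trurl dependency.
if ! command -v trurl > /dev/null 2>&1; then
    printf '%s\n' 'wcurl: trurl is required but was not found in PATH' >&2
    exit 1
fi
```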
wget 1.24.5 percent-decodes filenames; wcurl 2024-07-02 doesn't.
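For example (the URL is hypothetical; the resulting filenames illustrate the behavior described above):

```sh
#!/bin/sh
# Hypothetical comparison; example.com and the file name are made up.
url='https://example.com/file%20name.txt'

wget "$url"    # saves as "file name.txt"   (percent-decoded)
wcurl "$url"   # saves as "file%20name.txt" (left encoded)
```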