Having the same issue: Downloading a directory d results in an empty file d.zip. I am running IFM with docker compose. Log:
ifm | [Sun Aug 2 12:14:38 2020] 192.168.128.2:58130 Accepted
ifm | [Sun Aug 2 12:14:38 2020] 192.168.128.2:58130 [200]: GET /?api=zipnload&dir=docker&filename=wiki
ifm | [Sun Aug 2 12:14:38 2020] 192.168.128.2:58130 Closing
Hi. I'm working on it, should be ready tomorrow.
This should be fixed in the latest commit (4c0c80b). Sorry for the really late response. Let me know if you still have trouble.
Thank you for looking into it. It seems to work for smaller downloads, but if the folder contains bigger files, a timeout occurs:
Fatal error: Maximum execution time of 30+2 seconds exceeded (terminated) in /usr/local/share/webapps/ifm/index.php on line 5395
Is it possible to zip the files "on the fly" somehow? With other file managers the zip download starts immediately, while with IFM the packing process seems to take place during the waiting time. Sorry to compare your solution with others, but I can only give you my user perspective...
Same for me. IFM even crashes due to a fatal error:
ifm | Fatal error: Maximum execution time of 30+2 seconds exceeded (terminated) in /usr/local/share/webapps/ifm/index.php on line 5395
ifm exited with code 124
Hm, I'll try to implement the 'on-the-fly' download of archives. The max_execution_time should be increased anyway, though.
So. Implementing an "on-the-fly" creation and streaming of a zip file is possible, but quite a bit of work, because I either have to hand-craft the zip format myself or use an entire library like maennchen/ZipStream-PHP, which in turn has some downsides, e.g. the Linux utility unzip does not handle unicode characters properly.
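For illustration, a minimal sketch of what such streaming could look like with maennchen/ZipStream-PHP (assuming its v2 API; the directory path and archive name are made up, this is not IFM's actual code):

```php
<?php
require 'vendor/autoload.php';

use ZipStream\Option\Archive;
use ZipStream\ZipStream;

// Send the zip headers and stream the archive as it is built,
// instead of assembling the whole file on disk first.
$options = new Archive();
$options->setSendHttpHeaders(true);

$zip = new ZipStream('wiki.zip', $options);

// Recursively add every file below the requested directory.
$dir = '/data/docker/wiki'; // hypothetical path
$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($dir, FilesystemIterator::SKIP_DOTS)
);
foreach ($files as $file) {
    if ($file->isFile()) {
        // Store entries relative to the downloaded directory.
        $relative = substr($file->getPathname(), strlen($dir) + 1);
        $zip->addFileFromPath($relative, $file->getPathname());
    }
}

$zip->finish();
```

Since each file is compressed and sent as it is read, the download starts immediately and memory usage stays flat regardless of the directory size.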
However, I added a php.ini which disables the max_execution_time and removes the upload size restrictions, so you should at least be able to download your directories. I might add the on-the-fly creation of zip files later, but for now I guess the php.ini is a reasonable solution.
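For anyone running IFM outside the Docker image, the relevant directives would look roughly like this (illustrative values; check the php.ini shipped with IFM for the actual ones):

```ini
; Illustrative php.ini overrides -- not necessarily IFM's exact values
max_execution_time  = 0     ; 0 disables PHP's execution time limit
upload_max_filesize = 8G    ; effectively lift the per-file upload limit
post_max_size       = 8G    ; must be at least upload_max_filesize
```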
@misterunknown I am having a somewhat similar issue with IFM. I am trying to download large directories containing multiple GBs of data and I am seeing a 502 as a result of the connection being reset by the peer. I have tried increasing max_execution_time, request_terminate_timeout, fastcgi_connect_timeout, fastcgi_send_timeout, fastcgi_read_timeout, and even fastcgi_buffers and fastcgi_buffer_size and none of these things are helping. Wondering if you have some other suggestions. Thanks.
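Not the maintainer, but for reference: in an nginx + php-fpm setup the fastcgi_* directives belong in the PHP location block, while request_terminate_timeout goes into the php-fpm pool config (e.g. www.conf), not into nginx. A sketch with illustrative values:

```nginx
# Hypothetical nginx location block for IFM behind php-fpm;
# socket path and timeouts are illustrative.
location ~ \.php$ {
    include                 fastcgi_params;
    fastcgi_param           SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass            unix:/run/php/php-fpm.sock;

    fastcgi_connect_timeout 60s;
    fastcgi_send_timeout    3600s;  # allow long-running zipnload requests
    fastcgi_read_timeout    3600s;
    fastcgi_buffering       off;    # stream large responses instead of buffering them
}
```

A 502 with "connection reset by peer" usually means php-fpm killed the worker, so it is worth double-checking that request_terminate_timeout = 0 actually took effect in the running pool.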
Using the latest Docker image from pinknet/ifm, trying to download a folder (zip) results in a 0 byte .zip file. I can't find any log. The ifm directory is readable and writable. Log:
Any ideas?