rhushikeshc / clients-oriented-ftp

Automatically exported from code.google.com/p/clients-oriented-ftp

Unable to download large files #228

Open GoogleCodeExporter opened 8 years ago

GoogleCodeExporter commented 8 years ago
Hi, here is another bug I found today. I was trying to download a 1 GB file but it isn't downloading, neither through a download manager nor through any browser. Small files are fine, no issues downloading those.

Is this a known issue, or is there some setting I have to change? Please let me know.

Waiting for your reply.

Thanks,
Manoj

Original issue reported on code.google.com by mk.sa...@gmail.com on 8 Mar 2013 at 8:06

GoogleCodeExporter commented 8 years ago
Thanks Manoj. Are you using r375?

Original comment by i...@subwaydesign.com.ar on 8 Mar 2013 at 8:17

GoogleCodeExporter commented 8 years ago
Yes, I am using r375, but I also tried r335. Neither of them works.

If you want, I can create a user for you so that you can check it yourself.

Thanks

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:19

GoogleCodeExporter commented 8 years ago
Do the files download corrupted, or do they not start downloading at all? It might take some time to start; figure a couple of seconds per GB.

Original comment by i...@subwaydesign.com.ar on 8 Mar 2013 at 8:21

GoogleCodeExporter commented 8 years ago
No, the files are not corrupted. They are all RAR files and each of them is 1 GB in size. I uploaded the files to the server through SSH and placed them inside the /projectsend/upload/files directory. Then I went to Orphan files and assigned them to the clients and groups.

Please let me know if I am doing anything wrong.

Thanks

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:25

GoogleCodeExporter commented 8 years ago
What you did is correct.
Files do not download directly, for security reasons. The download is processed by PHP; if you try to access a file URL directly, Apache returns a 403 Forbidden.
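
For illustration, the uploads folder is typically protected by an .htaccess deny rule along these lines (an assumed sketch in Apache 2.2 syntax, not necessarily the exact file ProjectSend ships):

# upload/files/.htaccess -- assumed sketch: any direct request
# for a stored file gets a 403 Forbidden from Apache.
Order deny,allow
Deny from all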

If you click the download button and wait, what happens next? Not immediately, but at any point: does the server give you an error? A timeout? Nothing at all after 1 or 2 minutes?

Original comment by i...@subwaydesign.com.ar on 8 Mar 2013 at 8:27

GoogleCodeExporter commented 8 years ago
When I click the Download button it takes a long time to respond, around 1 minute or so. Then the download manager grabs the download link but cannot connect. It just shows "connecting", sends a GET, shows "connecting" again, sends another GET, and this keeps going.

Nothing happens after that. I even waited for more than 10 minutes; still the same.

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:33

GoogleCodeExporter commented 8 years ago
And just now IDM threw the "Time Out" error. 

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:34

GoogleCodeExporter commented 8 years ago
I see. It seems like PHP isn't able to read the file in full. Do you know what your memory_limit setting is? That is the maximum amount of RAM a PHP script is allowed to use on your server.

Original comment by i...@subwaydesign.com.ar on 8 Mar 2013 at 8:36

GoogleCodeExporter commented 8 years ago
I just checked php.ini and this is what it shows.

memory_limit = 32M      ; Maximum amount of memory a script may consume (16MB)

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:39

GoogleCodeExporter commented 8 years ago
Shall I increase memory_limit to 128M?

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:39

GoogleCodeExporter commented 8 years ago
Please try to do that. I might take a few minutes for the server to recognize 
the new value. You should run a phpinfo(); script to check it out.
Are you familiar with this?
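
(For anyone following along, that check is just a one-line PHP script uploaded to the web root; the filename is only an example, and it should be deleted afterwards since it exposes server details:)

<?php
// info.php -- dumps the active PHP configuration,
// including memory_limit and max_execution_time.
phpinfo();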

Original comment by i...@subwaydesign.com.ar on 8 Mar 2013 at 8:41

GoogleCodeExporter commented 8 years ago
Yes, I will do that. 

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:42

GoogleCodeExporter commented 8 years ago
http://ingtechnologies.com/info.php

Please have a look

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:46

GoogleCodeExporter commented 8 years ago
How about increasing max_execution_time to 120 for example?
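
(That can be set globally in php.ini with "max_execution_time = 120", or per script; a minimal PHP sketch, the value is only an example:)

<?php
// Allow this request to run for up to 120 seconds instead of the default 30.
set_time_limit(120);                  // resets the timeout counter
ini_set('max_execution_time', '120'); // equivalent ini-style override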

Original comment by i...@subwaydesign.com.ar on 8 Mar 2013 at 8:50

GoogleCodeExporter commented 8 years ago
Did that.. But still no luck :(

I am sorry, my Notebook battery out of charge now. Please send me all possible 
steps/fixed I have to do after doing an investigation. I will follow that. I 
have to logout now. 

Thanks a lot for your help... 

Will check all your messages later.

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 8:52

GoogleCodeExporter commented 8 years ago
Hi, I am sorry.. I had to log out earlier because my notebook was out of charge.

Did you find any solution/fix for this issue?

Original comment by mk.sa...@gmail.com on 8 Mar 2013 at 11:47

GoogleCodeExporter commented 8 years ago
I had a similar issue. I increased my memory limit in a php.ini file, then added a line to my .htaccess pointing towards my php file. It seems to download files of any size now.

Original comment by twerk...@gmail.com on 9 Mar 2013 at 12:56

GoogleCodeExporter commented 8 years ago
Hey, thanks for posting the tweak. Could you please tell me what you increased the memory limit to, and what exact line you added to .htaccess?

Thanks

Original comment by mk.sa...@gmail.com on 9 Mar 2013 at 2:18

GoogleCodeExporter commented 8 years ago
Hi, any update yet?

I am still waiting for a fix for this issue. My clients keep asking about it.

Original comment by mk.sa...@gmail.com on 9 Mar 2013 at 7:35

GoogleCodeExporter commented 8 years ago
Sorry, I couldn't answer earlier.
Did you get any response from your hosting provider about the PHP error log?
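
(On a typical Debian VPS the errors go wherever php.ini points them; a minimal sketch, with paths that are assumptions and vary per setup:)

; php.ini -- surface PHP errors to a log file
log_errors = On
error_log = /var/log/php_errors.log  ; any path writable by the web server user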

Also, it would be great to know what that .htaccess line is, twerk! :)

Thanks!

Original comment by i...@subwaydesign.com.ar on 10 Mar 2013 at 9:07

GoogleCodeExporter commented 8 years ago
Hi, I have this hosted on my own VPS.

So please let me know how to get the PHP error log and I will get it for you.

One more thing: I am able to download files using other PHP scripts. The problem is only with ProjectSend.

Thanks

Original comment by mk.sa...@gmail.com on 10 Mar 2013 at 9:43

GoogleCodeExporter commented 8 years ago
No solution yet? Well, it's all right. I finally decided to go with another script, but I would still love to use ProjectSend. I hope the next version will have all these bugs fixed.

Thanks

Original comment by mk.sa...@gmail.com on 12 Mar 2013 at 12:58

GoogleCodeExporter commented 8 years ago
I have the same problem. A 1.5 GB file downloads fine; a 2.4 GB file crashes.

memory_limit is 128M.

Analyzing the source, I extracted the affected code from process.php, in function download() { ... } (the header lines and the readfile() call), and tested it standalone. There I can reproduce a "memory limit exceeded" error: readfile() by itself should stream the file, but with output buffering active the whole file can accumulate in memory.

Replacing readfile() with fpassthru() had no effect. Replacing readfile() with the following code resolved the problem.

$chunk = 10 * 1024 * 1024; // bytes per chunk (10 MB)
$fh = fopen($this->real_file, 'rb');
while (!feof($fh)) {
    echo fread($fh, $chunk); // stream the file piece by piece
}
fclose($fh);

The server runs Debian Squeeze, apache2-mpm-itk, PHP 5.3.3 and ProjectSend r514.
(r561 also uses readfile().)
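
(For completeness, a variant of the loop above that also clears PHP's output buffer and flushes each chunk to the client; an assumed sketch building on the snippet above, relevant because buffering is usually what pushes readfile()/fpassthru() over memory_limit:)

// Assumes the download headers were already sent,
// as in ProjectSend's download() in process.php.
while (ob_get_level() > 0) {
    ob_end_clean(); // drop buffers that would hold the whole file in RAM
}
$chunk = 10 * 1024 * 1024; // 10 MB per chunk
$fh = fopen($this->real_file, 'rb');
if ($fh !== false) {
    while (!feof($fh)) {
        echo fread($fh, $chunk);
        flush(); // push the chunk to the client instead of accumulating it
    }
    fclose($fh);
}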

Original comment by r...@netbreaker.de on 3 May 2014 at 12:04

GoogleCodeExporter commented 8 years ago
Additionally, the change as a diff:

r561> diff process.php.orig process.php
136c136,141
<                           readfile($this->real_file);
---
>                           $chunk = 10 * 1024 * 1024; // bytes per chunk (10 MB)
>                           $fh = fopen($this->real_file, 'rb');
>                           while (!feof($fh)) {
>                                   echo fread($fh, $chunk);
>                           }
>                           fclose($fh);

Original comment by r...@netbreaker.de on 3 May 2014 at 1:15