git-ftp / git-ftp

Uses Git to upload only changed files to FTP servers.
https://git-ftp.github.io/
GNU General Public License v3.0

curl: Argument list too long #102

Closed. mok0 closed this issue 10 years ago.

mok0 commented 10 years ago

Trying to do the initial push of a Drupal project directory, I ran into the following:

.
.
.
[3954 of 3954] Buffered for upload 'xmlrpc.php'.
Uploading ...
/Users/mok/bin/git-ftp: line 408: /usr/bin/curl: Argument list too long
fatal: Could not upload files., exiting...
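
For context: "Argument list too long" is the operating system's E2BIG error, returned when the combined size of the arguments (and environment) passed to a program exceeds the ARG_MAX limit. It can be reproduced outside git-ftp with a deliberately oversized argument list, for example:

# Show the per-process limit for command-line arguments, in bytes.
getconf ARG_MAX

# Expand far more argument data than ARG_MAX allows; the shell cannot exec /bin/echo.
/bin/echo $(seq 1 1000000)
# -> bash: /bin/echo: Argument list too long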
mozmorris commented 10 years ago

Take a look at #88; it looks like the same problem.

mok0 commented 10 years ago

I don't think it's the same problem, @MozMorris. The argument list is simply too long for curl to handle. I tried printing $CURL_ARGS and it's huge.

mozmorris commented 10 years ago

Take a look at @benjamin-rota's comment from April 22nd. He had the same error as yours and managed to solve it.

mok0 commented 10 years ago

@MozMorris, I did:

Lør Maj 31 13:37:34 CEST 2014: Uploading ... /Users/mok/bin/git-ftp: line 412: /usr/local/Cellar/curl/7.37.0/bin/curl: Argument list too long

mkllnk commented 10 years ago

I tested uploading 3955 files on my system and it worked. The error you got is not caused by curl itself; it comes from a limit in your operating system.

It would be helpful if some people posted the output of these two commands:

uname -osvr
getconf ARG_MAX

In my case:

Linux 3.2.0-4-amd64 #1 SMP Debian 3.2.57-3+deb7u1 GNU/Linux
2097152

Regarding pull request #103, I would probably just decrease the ARG_MAX variable. Did you try that?

Maybe we can use getconf to set ARG_MAX.
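
A minimal sketch of what that could look like inside the script, assuming the script keeps its ARG_MAX variable as the buffer limit (the halving and the 4096 fallback are assumptions, not code from git-ftp):

# Sketch: derive the buffer limit from the system instead of hard-coding it.
SYSTEM_ARG_MAX="$(getconf ARG_MAX 2>/dev/null)"
if [ -n "$SYSTEM_ARG_MAX" ] && [ "$SYSTEM_ARG_MAX" -gt 0 ] 2>/dev/null; then
    # Leave headroom for curl's own options and the environment.
    ARG_MAX=$((SYSTEM_ARG_MAX / 2))
else
    # Conservative fallback if getconf is not available.
    ARG_MAX=4096
fi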

mkllnk commented 10 years ago

http://www.in-ulm.de/~mascheck/various/argmax/ is a nice article about this problem. But looking at the table of maxima there, it seems that git-ftp already chose a safe value of 4096. Maybe the check doesn't work on some systems.

mkllnk commented 10 years ago

I looked into the check that decides whether to flush the upload buffer, and it is broken. Unless you try to upload a file with a name of about 4100 characters, it buffers the whole list and tries to upload everything at the end.

I added a fix to my pull request #104, and it works in my tests. But I've only fixed the upload; if you try to delete all of these files, I guess you will hit the same problem again.
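
To illustrate the kind of fix described here (a sketch with hypothetical variable and helper names, not the actual code from #104): the flush decision has to look at the accumulated length of the buffered curl arguments, not at the length of the single file name being added.

# Sketch: flush based on the total buffer length, not the current file name.
UPLOAD_BUFFER=""
for file in $FILES_CHANGED; do
    # The small constant leaves room for the per-file option text.
    if [ $(( ${#UPLOAD_BUFFER} + ${#file} + 16 )) -ge "$ARG_MAX" ]; then
        # upload_buffered_files is a hypothetical helper that invokes curl with $UPLOAD_BUFFER.
        upload_buffered_files
        UPLOAD_BUFFER=""
    fi
    UPLOAD_BUFFER="$UPLOAD_BUFFER -T $file"
done
# Upload whatever is left at the end.
[ -n "$UPLOAD_BUFFER" ] && upload_buffered_files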

Also interesting: my machine needs five minutes to iterate over the whole file list and add it to the buffer, while an all-in-one upload of the same (empty) files to my local machine takes only six seconds. So there is a performance issue somewhere in that loop.
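
A likely reason for the slow loop (my reading, not confirmed in the thread): appending to an ever-growing shell string copies the whole buffer on every iteration, which is quadratic in the number of files. One way around both that and the ARG_MAX limit is to write the list to a temporary file and let curl read it with --config, roughly like this (REMOTE_BASE_URL and the config-file layout are assumptions; authentication and other options are omitted):

# Sketch: build the upload list in a file instead of a shell variable.
TMP_CONF="$(mktemp)"
for file in $FILES_CHANGED; do
    # Appending to a file is cheap; growing a shell string re-copies it every time.
    printf 'upload-file = "%s"\nurl = "%s%s"\n' "$file" "$REMOTE_BASE_URL" "$file" >> "$TMP_CONF"
done
# curl reads its options from the file, so ARG_MAX never comes into play.
curl --config "$TMP_CONF"
rm -f "$TMP_CONF"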

mok0 commented 10 years ago

Here's the data you requested above:

Darwin 13.2.0 Darwin Kernel Version 13.2.0: Thu Apr 17 23:03:13 PDT 2014; root:xnu-2422.100.13~1/RELEASE_X86_64 Darwin

$ getconf ARG_MAX
262144

It also took my machine several minutes to build up the buffered file list, and I had the sense that it became slower and slower as it went along. When it finally started uploading files, it was even slower, and due to a strange bug in my pull request #103 it actually tried to upload the same files more than once. I only discovered that after the upload had been running for about 4 hours without significant progress. Testing was painfully slow, and half the time I forgot the --disable-epsv switch, in which case the server at the other end would hang. I should have done more intelligent debugging, but I was focused on getting on with the deployment of the web site.

My pull request #103 is probably not the right solution, since uploading files one at a time is obviously much slower. I was not aware of ARG_MAX, and I notice that my value is about 1/10 of yours (I assume the unit is bytes).

mok0 commented 10 years ago

Re: my comment above, a useful addition to git-ftp would be to store the server's --disable-epsv status in the config file (it is a property of the server, just like the username and password).
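
A sketch of how that could look, assuming git-ftp keeps using git config for per-project settings (the key name git-ftp.disable-epsv is hypothetical):

# Remember once per repository that this server needs EPSV disabled.
git config git-ftp.disable-epsv true

# git-ftp could then append the switch automatically instead of requiring it on every call:
if [ "$(git config --get git-ftp.disable-epsv)" = "true" ]; then
    CURL_ARGS="$CURL_ARGS --disable-epsv"
fi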

mkllnk commented 10 years ago

@mok0 Thanks for your report.

For the feature of storing the epsv setting, it would be good if you opened a separate feature request.

I could reproduce the scenario you described above; the upload buffer mechanism is completely broken, and setting ARG_MAX to a lower value doesn't help either. But have you uploaded everything by now? If not, you could try the version from my pull request; it should solve the problem.

By the way, you can also test with the unit testing script in the test directory. You just have to tell it where it may create some test directories to work in.

resmo commented 10 years ago

Which version of git-ftp is this issue related to?

mkllnk commented 10 years ago

I was working with the current develop branch from git. My pull request is based on that as well.

mkllnk commented 10 years ago

My pull request now fixes the upload buffer size check as well. It's ready to merge, I guess.