4udak / pyftpdlib

Automatically exported from code.google.com/p/pyftpdlib

Large file uploads hang when using TLS. #194

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Run python demo/tls_ftpd.py.
2. Connect to the server on port 8021.
3. Upload a large file.
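
For anyone reproducing this without FileZilla, a scripted upload with Python's ftplib exercises the same kind of explicit FTPS transfer. This is only a rough sketch; the credentials and the in-memory 50 MB payload are assumptions, not details taken from this report.

# Minimal FTPS upload against the demo server on port 8021.
# Credentials and payload size are placeholders; adjust for your setup.
import io
from ftplib import FTP_TLS

payload = io.BytesIO(b'abcde12345' * (5 * 1024 * 1024))  # ~50 MB in memory

ftps = FTP_TLS()
ftps.connect('localhost', 8021)
ftps.login('user', '12345')   # assumed demo credentials
ftps.prot_p()                 # protect the data channel with TLS as well
ftps.storbinary('STOR bigfile.bin', payload)
ftps.quit()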

What is the expected output? What do you see instead?

The transfer hangs after about 212 KB have been uploaded. The data channel eventually times out and FileZilla retries the transfer.

Original issue reported on code.google.com by btimby@gmail.com on 2 Dec 2011 at 11:06
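
For reference, the TLS demo server from step 1 boils down to roughly the script below. This is a sketch using the current module layout, where TLS_FTPHandler lives in pyftpdlib.handlers and requires PyOpenSSL; the 0.6.x releases shipped the handler in pyftpdlib.contrib.handlers instead, and the certificate path and credentials here are placeholders.

# Rough equivalent of demo/tls_ftpd.py: an FTPS server listening on 8021.
from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import TLS_FTPHandler
from pyftpdlib.servers import FTPServer

# One user with full read/write permissions in the current directory.
authorizer = DummyAuthorizer()
authorizer.add_user('user', '12345', homedir='.', perm='elradfmw')

handler = TLS_FTPHandler
handler.certfile = 'keycert.pem'   # placeholder self-signed certificate
handler.authorizer = authorizer

server = FTPServer(('', 8021), handler)
server.serve_forever()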

GoogleCodeExporter commented 9 years ago
I can't reproduce this with r929.
I tried with FileZilla 3.5.0 and managed to upload a 25 MB file.

Original comment by g.rodola on 3 Dec 2011 at 11:08

GoogleCodeExporter commented 9 years ago
I still have the same problem. I tested with the server on CentOS 5.7 and 
Fedora 16. I am using FileZilla 3.5.1.

Original comment by btimby@gmail.com on 3 Dec 2011 at 5:14

GoogleCodeExporter commented 9 years ago
Here is an example log from FileZilla. You can see that it connects, transfers
some bytes, times out, and restarts.

When I have some time later, I will start testing older versions. This only
started happening recently, so by bisecting I can probably locate the offending
revision quickly.

Original comment by btimby@gmail.com on 3 Dec 2011 at 5:18

Attachments:

GoogleCodeExporter commented 9 years ago
Please apply this patch to test/test_ftpd.py, then run test/test_contrib.py.
I'd like to know whether this test fails.

--- test/test_ftpd.py   (revision 926)
+++ test/test_ftpd.py   (working copy)
@@ -1511,6 +1511,32 @@
         self.assertEqual(f.read(), "")
         f.close()

+    def test_stor_big_file(self):
+        SIZE = 1048576 * 50  # 50 MB
+        chunk = 'abcde12345' * 100000
+        iterations = 0
+        while self.dummy_sendfile.tell() < SIZE:
+            iterations += 1
+            self.dummy_sendfile.write(chunk)
+        self.dummy_sendfile.seek(0)
+        try:
+            self.client.storbinary('stor ' + TESTFN, self.dummy_sendfile)
+            self.client.retrbinary('retr ' + TESTFN, self.dummy_recvfile.write)
+            self.dummy_recvfile.seek(0)
+            for x in range(iterations):
+                recv_chunk = self.dummy_recvfile.read(len(chunk))
+                self.assertEqual(recv_chunk, chunk)
+        finally:
+            # We do not use os.remove() because file could still be
+            # locked by ftpd thread.  If DELE through FTP fails try
+            # os.remove() as last resort.
+            if os.path.exists(TESTFN):
+                try:
+                    self.client.delete(TESTFN)
+                except (ftplib.Error, EOFError, socket.error):
+                    safe_remove(TESTFN)
+
+

Original comment by g.rodola on 3 Dec 2011 at 5:32

GoogleCodeExporter commented 9 years ago
All of the tests pass.

Original comment by btimby@gmail.com on 3 Dec 2011 at 7:26

GoogleCodeExporter commented 9 years ago
Looking further into this, it *might* be the FTP client. I am running Fedora 16.

I tested FileZilla 3.5.1 with GnuTLS 2.12.7, and it fails as described.

I also tried an older FileZilla (3.3.3) with the same GnuTLS version and it 
fails.

However, on Ubuntu, FileZilla 3.3.5.1 with GnuTLS 2.8.6 works.

I am going to try to find another TLS-enabled FTP server to connect to and
test against, but this does not seem like a pyftpdlib problem. Sorry.

Original comment by btimby@gmail.com on 5 Dec 2011 at 4:02

GoogleCodeExporter commented 9 years ago
I have closed this issue and opened a bug with FileZilla; I will continue
troubleshooting there.

http://trac.filezilla-project.org/ticket/7837

Original comment by btimby@gmail.com on 5 Dec 2011 at 5:33

GoogleCodeExporter commented 9 years ago
Thanks, I'll keep an eye on that FileZilla tracker bug.

Original comment by g.rodola on 5 Dec 2011 at 5:36