JackYeh / s3fs

Automatically exported from code.google.com/p/s3fs
GNU General Public License v2.0

ls in a currently large file uploading dir is blocked by upload #96

Closed GoogleCodeExporter closed 8 years ago

GoogleCodeExporter commented 8 years ago
What steps will reproduce the problem?
1. Start s3fs with -d (to see debug output; without it, s3fs tends to crash, fail the upload, or become unresponsive).
2. Create a directory and upload a huge file (e.g. 500 MB+) to it.
3. From a second terminal, run ls -la (or similar) in the SAME directory you are uploading to.
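The reproduction steps above can be sketched as a script. This is a minimal sketch, assuming a mountpoint path of your choosing; the `MNT` default here falls back to a local temp directory so the probe itself is harmless to run (on a local filesystem the listing returns immediately, which is the behavior the reporter expected from s3fs):

```shell
#!/bin/sh
# MNT is illustrative: point it at your s3fs mountpoint to reproduce.
# It defaults to a local temp dir so the script is safe to dry-run.
MNT=${MNT:-$(mktemp -d)}

mkdir -p "$MNT/testdir"

# Start a large (~500 MB) write in the background to trigger the upload.
dd if=/dev/zero of="$MNT/testdir/big.bin" bs=1M count=500 2>/dev/null &
DD_PID=$!

# Simulate the second terminal: does listing the same directory block?
if timeout 10 ls -la "$MNT/testdir" > /dev/null; then
  echo "listing returned"
else
  echo "listing blocked (no response within 10s)"
fi

wait "$DD_PID"
```

On the affected s3fs version, the `ls -la` would hang until the upload finished (or s3fs crashed), so the `timeout` branch makes the symptom visible.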

What is the expected output? What do you see instead?
I would expect to get the directory listing. Instead, the listing hangs until the upload completes, or s3fs crashes, fails, or disconnects.

What version of the product are you using? On what operating system?
r177 on 8.0-RELEASE-p2 FreeBSD 8.0-RELEASE-p2 #0: Tue Jan  5 21:11:58 UTC 2010  
   root@amd64-builder.daemonology.net:/usr/obj/usr/src/sys/GENERIC  amd64

Please provide any additional information below.
Using -d seems to be a good idea; it apparently prevents s3fs from crashing.
One idea to solve this could be to use a hidden upload directory, e.g. an
.s3fsupload, and move the file up a directory level once the upload finishes
(if that is technically possible).
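The hidden-staging idea suggested above can be illustrated in userspace. This is only a sketch of the pattern, not anything s3fs implements; the `.s3fsupload` directory name comes from the report, while `STAGE_ROOT` and the file names are hypothetical stand-ins for the mountpoint and uploaded file:

```shell
#!/bin/sh
# STAGE_ROOT is a stand-in for the s3fs mountpoint (hypothetical).
STAGE_ROOT=${STAGE_ROOT:-$(mktemp -d)}

mkdir -p "$STAGE_ROOT/.s3fsupload" "$STAGE_ROOT/incoming"

# The slow upload is directed at the hidden staging path...
printf 'payload\n' > "$STAGE_ROOT/.s3fsupload/big.bin.part"

# ...and the file only appears in the visible directory once complete,
# so listings of "incoming" never wait on an in-flight upload.
mv "$STAGE_ROOT/.s3fsupload/big.bin.part" "$STAGE_ROOT/incoming/big.bin"

ls -la "$STAGE_ROOT/incoming"
```

The point of the design is that directory listings only ever see fully uploaded files; whether a rename can be made cheap on top of S3 (which has no native rename) is exactly the "if technically possible" caveat in the report.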

Original issue reported on code.google.com by gratis...@gmail.com on 22 Aug 2010 at 9:27

GoogleCodeExporter commented 8 years ago
I'm not seeing these issues.  Please try using a more recent version.  If you 
can reproduce this with the newer version of s3fs, I'll spin up a FreeBSD 
virtual machine and try to reproduce/debug. Thanks.

Original comment by dmoore4...@gmail.com on 30 Dec 2010 at 7:22

GoogleCodeExporter commented 8 years ago
No response in over a month. Closing this issue.

Original comment by dmoore4...@gmail.com on 5 Feb 2011 at 1:53