yangljun / s3fs

Automatically exported from code.google.com/p/s3fs
GNU General Public License v2.0

No space left on device - file size 4.2GB - v1.74 #416

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
Detailed description of observed behavior:

Copying a file larger than 3.8 GB ends prematurely with the following errors:
cp: writing `/s3/file_path_and_name': No space left on device
cp: failed to extend `/s3/file_path_and_name': No space left on device

The file size on the S3 storage after the failure is 4127002624 bytes.
The source file size is 4438227459 bytes.

The machine is an Amazon EC2 instance.

What steps will reproduce the problem - please be very specific and
detailed. (if the developers cannot reproduce the issue, then it is
unlikely a fix will be found)?

mount the S3 drive: s3fs bucket_name /s3
cp source_file_name /s3/destination_path

===================================================================
The following information is very important in order to help us help
you. Omitting any of the following details may delay your support request
or cause it to receive no attention at all.
===================================================================
Version of s3fs being used (s3fs --version): installed 1.74 - syslog reports 
"init $Rev: 497 $"

Version of fuse being used (pkg-config --modversion fuse): 2.8.6

System information (uname -a): Linux ip-XXXX-31-virtual #50-Ubuntu SMP Fri Sep 
7 16:36:36 UTC 2012 x86_64 x86_64 x86_64 GNU/Linux

Distro (cat /etc/issue): Ubuntu 12.04.3 LTS \n \l

s3fs command line used (if applicable):
s3fs bucket_name /s3
cp source_file_name /s3/destination_path

/etc/fstab entry (if applicable):

s3fs syslog messages (grep s3fs /var/log/syslog):

Apr  1 13:40:07 ip-XX-XX-XX-XX s3fs: init $Rev: 497 $
Apr  1 13:40:07 ip-XX-XX-XX-XX s3fs: connecting to URL 
http://bucket_name.s3.amazonaws.com/
Apr  1 13:40:07 ip-XX-XX-XX-XX s3fs: HTTP response code 200
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: connecting to URL 
http://bucket_name.s3.amazonaws.com/daily
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: HTTP response code 404
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: HTTP response code 404 was returned, 
returning ENOENT
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: Body Text: 
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: connecting to URL 
http://bucket_name.s3.amazonaws.com/daily/
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: HTTP response code 200
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: add stat cache entry[path=/daily/]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/][time=1396352446][hit count=0]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/][time=1396352446][hit count=1]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: connecting to URL 
http://bucket_name.s3.amazonaws.com/daily/file_name.tar.gz
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: HTTP response code 200
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: add stat cache 
entry[path=/daily/file_name.tar.gz]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/file_name.tar.gz][time=1396352446][hit count=0]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: delete stat cache 
entry[path=/daily/file_name.tar.gz]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/][time=1396352446][hit count=2]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: connecting to URL 
http://bucket_name.s3.amazonaws.com/daily/file_name.tar.gz
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: HTTP response code 200
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: add stat cache 
entry[path=/daily/file_name.tar.gz]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/file_name.tar.gz][time=1396352446][hit count=0]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/][time=1396352446][hit count=3]
Apr  1 13:40:46 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/file_name.tar.gz][time=1396352446][hit count=1]
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: pwrite failed. errno(28)
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: failed to write 
file(/daily/file_name.tar.gz). result=-28
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: pwrite failed. errno(28)
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: failed to write 
file(/daily/file_name.tar.gz). result=-28
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/][time=1396352446][hit count=4]
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/file_name.tar.gz][time=1396352446][hit count=2]
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: stat cache hit 
[path=/daily/file_name.tar.gz][time=1396352503][hit count=3]
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: connecting to URL 
http://bucket_name.s3.amazonaws.com/daily/file_name.tar.gz?uploads
Apr  1 13:41:43 ip-XX-XX-XX-XX s3fs: HTTP response code 200

Original issue reported on code.google.com by gudul...@gmail.com on 1 Apr 2014 at 11:57

GoogleCodeExporter commented 9 years ago
s3fs copies the source file to a temporary local file while it uploads
the file.
So I think this error is caused by a lack of local disk space.
If you can, please check how much free space your local disk has.
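A quick way to check is to compare the free space in /tmp against the size of the file being copied. This sketch assumes the 4438227459-byte source file reported in this issue; adjust FILE_SIZE for your own file:

```shell
# Size of the source file in bytes (value taken from this report).
FILE_SIZE=4438227459

# Free space in /tmp, in 1 KiB blocks (POSIX-portable df output;
# column 4 of the second line is "Available").
AVAIL_KB=$(df -Pk /tmp | awk 'NR==2 {print $4}')
AVAIL_BYTES=$((AVAIL_KB * 1024))

if [ "$AVAIL_BYTES" -lt "$FILE_SIZE" ]; then
    echo "/tmp too small: $AVAIL_BYTES bytes free, $FILE_SIZE needed"
else
    echo "/tmp has room: $AVAIL_BYTES bytes free"
fi
```

If the first branch fires, the ENOSPC (errno 28) seen in the syslog above would come from the local temporary file, not from S3.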

Thanks in advance for your help.

* Please use the latest version of s3fs, which has moved to
GitHub (https://github.com/s3fs-fuse/s3fs-fuse).

Original comment by ggta...@gmail.com on 6 Apr 2014 at 1:58

GoogleCodeExporter commented 9 years ago
Thanks for the info!
If I believe what I read, s3fs uses the standard POSIX tmpfile() function,
and the temporary file is most likely created in /tmp. In that case it
confirms that the problem is related to the local disk capacity and not the
transfer itself.
I could not validate this yet, as I will first have to resize my root
partition or mount a new partition for /tmp.
If possible, it would be worth adding more specific debug output for such
issues.
Cheers.

Original comment by gudul...@gmail.com on 6 Apr 2014 at 7:56

GoogleCodeExporter commented 9 years ago
Hi,

By default s3fs uses the tmpfile() function, but if you specify the
use_cache option, s3fs creates its temporary files under that path instead.
Please try running s3fs with the "use_cache" option
(e.g. use_cache=/home/myaccount).
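Put together with the mount command from this report, that would look something like the following (bucket name and mount point taken from this thread; the cache path can be any directory on a partition with enough free space):

```shell
# Mount the bucket, staging temporary upload files under
# /home/myaccount instead of the default tmpfile() location in /tmp.
s3fs bucket_name /s3 -o use_cache=/home/myaccount
```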

Thanks,

Original comment by ggta...@gmail.com on 7 Apr 2014 at 12:58