GoogleCodeExporter opened this issue 9 years ago
After further investigation into the cause of the problem, I found what triggers it:
I'm using ProFTPD in SFTP mode; duplicity uses it to transfer backups to
my s3fs mount point.
As soon as a second duplicity instance connects to ProFTPD (while the first one
is still running), s3fs crashes.
I don't know how to provide more useful information, but I can if you tell me what you need :-)
Original comment by nicolas....@gmail.com
on 9 Jan 2014 at 10:15
Further investigation:
Using duplicity with paramiko fixes the problem (it was using pexpect before).
Original comment by nicolas....@gmail.com
on 9 Jan 2014 at 3:24
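For readers hitting the same issue, the workaround above could look like the following sketch. The host, source path, and target URL are placeholders, not values from this thread; duplicity's `--ssh-backend` option selects between its paramiko and pexpect SSH backends.

```shell
# Sketch: force duplicity's Paramiko SSH backend instead of pexpect.
# user, backuphost, and the paths are hypothetical placeholders.
duplicity --ssh-backend paramiko /data sftp://user@backuphost/backups
```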
Hi,
Could you try the test/sample_delcache.sh script?
(You can view it, and show its help with the -h option.)
This script clears the cache files created by s3fs; please give it a try.
Thanks in advance for your help.
Original comment by ggta...@gmail.com
on 21 Jan 2014 at 3:19
Hi,
Should I run this script with 0 as the limit size?
Since I'm not using the 'use-cache' option, can I set any path for the cache folder?
Thanks for your help
Original comment by nicolas....@gmail.com
on 21 Jan 2014 at 7:57
Hi, Nicolas
I'm sorry, I misunderstood this issue and sent you content irrelevant to this
problem.
(Please disregard that script.)
I am continuing to check the code for this problem; please wait...
Thanks in advance for your help.
Original comment by ggta...@gmail.com
on 22 Jan 2014 at 4:00
Hi,
(I'm sorry for the slow reply.)
We moved s3fs from Google Code to
GitHub (https://github.com/s3fs-fuse/s3fs-fuse).
s3fs on GitHub has some changes and a bug fix.
If you can, please try the latest release or the latest master branch on GitHub.
Also, if possible, please try running s3fs with the multireq_max option set to a small number.
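The suggestion above could look like this sketch. The bucket name, mount point, and credentials path are placeholders, not values from this thread; `multireq_max` is the s3fs option that caps the number of parallel requests.

```shell
# Sketch: mount with a small parallel-request limit.
# mybucket, /mnt/s3, and the passwd_file path are hypothetical placeholders.
s3fs mybucket /mnt/s3 \
  -o passwd_file=${HOME}/.passwd-s3fs \
  -o multireq_max=5
```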
Thanks in advance for your help.
Original comment by ggta...@gmail.com
on 1 Jun 2014 at 3:47
Original issue reported on code.google.com by
nicolas....@gmail.com
on 6 Jan 2014 at 8:41