yadayada / acd_cli

An unmaintained command line interface and FUSE filesystem for Amazon (Cloud) Drive

Limited to 100TB storage #370

Open zenjabba opened 8 years ago

zenjabba commented 8 years ago

acd_cli reports 100TB of total storage to FUSE, so once you go above that amount, writes fail with an "out of space" error.

Can we report 1PB of free space?
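For context on where these numbers come from: FUSE gets the totals from the filesystem's statfs handler, so advertising a bigger volume is just a matter of returning larger block counts there. A minimal sketch in the fusepy style (block size and names are my assumptions, not acd_cli's actual code):

```python
# Sketch of a FUSE statfs handler advertising a 1 PB volume.
# fusepy-style return value; NOT acd_cli's actual implementation.
BLOCK_SIZE = 512                # f_bsize/f_frsize we choose to report
TOTAL_BYTES = 10 ** 15          # advertise 1 PB instead of 100 TB

def statfs(path, used_bytes=0):
    total_blocks = TOTAL_BYTES // BLOCK_SIZE
    free_blocks = max(0, TOTAL_BYTES - used_bytes) // BLOCK_SIZE
    return {
        'f_bsize':  BLOCK_SIZE,
        'f_frsize': BLOCK_SIZE,
        'f_blocks': total_blocks,   # total blocks -> the "size" df shows
        'f_bfree':  free_blocks,    # free blocks (root)
        'f_bavail': free_blocks,    # free blocks (unprivileged users)
    }
```

Clamping free space at zero (rather than letting it go negative) avoids the garbage figures df prints when usage exceeds the advertised total.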

charlymr commented 8 years ago

Amazon will ask you questions before you reach that, I would say... It is unlimited, with a fair-usage policy... Are you close to that amount of space already? Just curious whether Amazon has told you anything.

zenjabba commented 8 years ago

yes I am close to that amount, and amazon has not questioned my amount stored in my two accounts.


charlymr commented 8 years ago

Fair play 👍

jetbalsa commented 8 years ago

I think that's quota data coming from Amazon's endpoint; if you look in .cache/endpoint_cache it lists the 100TB max size.

I think this might be Unlimited* on ACD's part

asabla commented 8 years ago

I've noticed this as well. @zenjabba, could you please comment here again with the result if you get past 100TB?

zenjabba commented 8 years ago

So I've looked at other systems, and they also report 100TB as the storage space available, but for example NetDrive reports

100TB of 100TB Free

(screenshot from 2016-08-05 showing NetDrive's report)

so I don't feel it's a limitation within amazon.

bryan commented 8 years ago

It appears to be a 100TB soft cap, where you can call Amazon to allow for more space (probably have to give them a good reason given that 100TB is quite a bit of storage). Not certain though since I can't seem to find anyone hitting the 100TB cap. Keep us in the loop!

https://github.com/dularion/streama/issues/63

roaima commented 7 years ago

If it's any help, S3QL (another cloud filesystem) reports double the usage with a minimum of 1TB, so you never exceed 50% of the reported space.
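The trick described above could be sketched like this (my own illustration, not S3QL's actual code):

```python
TB = 1 << 40  # 1 TiB in bytes

def advertised_size(used_bytes):
    # Report double the current usage, with a 1 TiB floor,
    # so reported usage never exceeds 50%.
    return max(2 * used_bytes, TB)
```

With that scheme, df would always show at least half the volume free, no matter how much is actually stored.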

zenjabba commented 7 years ago

The problem is S3QL doesn’t support ACD, so apples and oranges (last time I looked at S3QL it didn’t)

yadayada commented 7 years ago

Does the reported disk size really have any practical implications?

zenjabba commented 7 years ago

Yes. When you try to upload a file and the disk reports 100% usage, you get an error.


tristaoeast commented 7 years ago

@zenjabba did you try uploading using rclone? I'm just curious to see if it presents the same limitation. I'm currently using acdcli to mount my ACD and rclone to upload to it

(Sorry @yadayada for PRing the competition; it's just that I'm more familiar with the rsync syntax -- although you have some interesting options in your upload command that I still ought to try.)

zenjabba commented 7 years ago

So finally got a chance to get back to this

```
ACDFuse  107374182400  -9444732965617890843136  -14025401856  100%  /mnt/amazon
```
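Those negative figures look like integer wraparound once usage exceeds the advertised total; here is a quick illustration of the effect (my assumption about the cause, not a confirmed diagnosis):

```python
# Illustration (assumption): if free space is computed as total - used
# in fixed-width unsigned arithmetic, it wraps around to a huge value
# once used > total, which then prints as garbage or a negative number.
import ctypes

def free_1k_blocks(total, used):
    return ctypes.c_uint64(total - used).value

total = 107374182400            # 100 TiB in 1K blocks, as reported above
used = total + 13696681         # hypothetical usage just over the cap
wrapped = free_1k_blocks(total, used)
print(wrapped)                  # huge wrapped value instead of a small negative
```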

is what is reported when it's over 100TB. Can we please just get it to report 1PB of storage so it will never run out?

Thanks