mk-fg / onedrive-fuse-fs

Script to mount Microsoft OneDrive (formerly known as SkyDrive) folder as a FUSE filesystem
Do What The F*ck You Want To Public License

Question - Large File Transfers #1

Closed - KarmaPoliceT2 closed this 9 years ago

KarmaPoliceT2 commented 9 years ago

I have a lot of large (2-10GB) media files on my Linux server... I would like to be able to "archive" them into my OneDrive account since it is going "unlimited". Not intending to play back directly from there (I would copy them back to local storage first), but instead thinking of it like a hierarchical storage management solution, where files I haven't used in a while go up to the cloud to free up local storage space...

I explored doing this on Windows a bit but found a couple of issues: 1) the WebDAV upload won't handle files of this size, and 2) SmartFiles/Placeholders are going away, so I would need equivalent drive space on my local machine to use the current sync client(s), which kind of defeats the idea.

So, looking into the Linux route, it would appear that if I can mount OneDrive as a filesystem, then I can just copy files to it and they will exist only in the cloud (not synced to local storage too)... Have you been able to test this with media-type files?

Thanks for any advice you can offer. I'll try playing around with it personally soon.

mk-fg commented 9 years ago

Eh, you probably don't want to use this particular project - it's a thin wrapper around python-onedrive with very crude, probably incomplete, racy and sure-to-be-suboptimal handling of the usual POSIX-fs stuff, all in a single thread and in memory too, IIRC. I'm sure it will blow up horribly on files that large.

From recent developments, I'd say trying BITS in python-onedrive might work, and is probably the best chance I know of to upload such files in one piece. But that statement might be deceptive, as I don't follow OneDrive news at all, last looked at its docs (apart from that BITS gist) about two years ago, never used the commercial version, and haven't actually used it in any way for a year or so. So I'm probably a bad person to ask, unfortunately - you might want to try one of the Stack Exchange sites, or maybe some dedicated OneDrive forum.

If I wanted to do something like that right now, I wouldn't bother even looking for a way to upload such huge files as-is - it looks like a guaranteed pain however you approach it, BITS or not (and things like #39 sound like the tip of that iceberg). I'd definitely look first at backup-oriented tools (which sounds like what you want to do anyway), like duplicity or tahoe-lafs. They all do chunking, they work (and are made to work) with clouds and sluggish links, they are designed with such huge uploads (over huge timespans) in mind, plus they do client-side encryption, which I think is a must-have for anything you might care about, if only to not find "value adding" ads injected all over your "at rest" content one day.
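For a rough idea of how low-friction that route can be, a duplicity run could look something like the sketch below - paths are made up, and the onedrive:// backend shown here is an assumption (it may or may not exist in your duplicity version), so check duplicity(1) for the URL schemes it actually supports:

```bash
# hypothetical backup run: duplicity chunks the data into volumes,
# gpg-encrypts them client-side, and uploads incrementally
duplicity /srv/media onedrive://media-archive

# restoring the whole tree back later
duplicity restore onedrive://media-archive /srv/media.restored
```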

If that is not an option (for whatever reason), I'd pick any working tool - e.g. python-onedrive (biased) - and write a trivial (bash or python) script, e.g. "myvideos", that does only two simple things: myvideos -u /path/to/huge/somevideo.avi does the split && gpg --sign && upload, and myvideos somevideo does the opposite - finds (at hardcoded paths), downloads, checks and assembles the file. Writing/testing such a script should be easy and fun, an hour or a few at most, and it seems to solve the outlined problem. The major advantage here is that it describes this operation exactly, without any need to remember how to configure, manage and operate complex backup tools like the ones mentioned above (though I guess e.g. python-onedrive will still need to be configured).
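A minimal bash sketch of that idea, assuming a configured python-onedrive - the onedrive-cli put/get/ls invocations below are assumptions about that CLI rather than checked docs, so adjust to whatever onedrive-cli --help actually says:

```bash
#!/bin/bash
# myvideos - minimal sketch of the split/sign/upload idea above.
# onedrive-cli subcommands/args used here are assumptions, not checked docs.
set -e

chunk_size=200M       # pick whatever the API handles comfortably
remote_dir=myvideos   # hypothetical destination dir on OneDrive

if [ "$1" = -u ]; then
	# myvideos -u /path/to/huge/somevideo.avi - split, sign, upload
	src=$2; name=$(basename "$src"); tmp=$(mktemp -d)
	split -b "$chunk_size" -d "$src" "$tmp/$name."
	for chunk in "$tmp/$name."*; do
		gpg --sign "$chunk"                          # produces "$chunk.gpg"
		onedrive-cli put "$chunk.gpg" "$remote_dir"  # assumed invocation
	done
	rm -rf "$tmp"
else
	# myvideos somevideo.avi - find, download, verify, reassemble
	name=$1; tmp=$(mktemp -d)
	for chunk in $(onedrive-cli ls "$remote_dir" | grep -F "$name"); do # assumed
		onedrive-cli get "$remote_dir/$chunk" > "$tmp/$chunk"           # assumed
		gpg --decrypt "$tmp/$chunk" > "$tmp/${chunk%.gpg}" # checks the signature
		rm "$tmp/$chunk"  # drop the .gpg so the glob below only sees raw chunks
	done
	cat "$tmp/$name."* > "$name"
	rm -rf "$tmp"
fi
```

Point being, the whole operation is spelled out in a couple dozen lines you can re-read a year later, with no backup-tool configs to dig through.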

mk-fg commented 9 years ago

I recall that there's also the onedrive-d tool, which people often confuse with python-onedrive (and file bugs against the latter), and which can help with uploading (it syncs paths) if you don't want to write even simple scripts - though I do think scripts are best suited for describing algorithms, even ones as simple as split/sign-encrypt/upload, to avoid head-scratching later.

mk-fg commented 9 years ago

Um... and in the previous msg I don't mean that it will upload e.g. 10 GiB files - probably not - but it should be easy to just run split(1) on the files, put the chunks into dir(s) and run the sync, e.g. as sketched below.
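For instance (with ~/OneDrive standing in for whatever local path onedrive-d is configured to sync - an assumed path, of course):

```bash
# chop one huge file into 500M chunks right inside the synced dir
split -b 500M --numeric-suffixes somevideo.avi ~/OneDrive/archive/somevideo.avi.

# ...and once the chunks are synced back down later, reassemble:
cat ~/OneDrive/archive/somevideo.avi.* > somevideo.avi
```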

mk-fg commented 9 years ago

Hope you resolved the thing one way or the other - sorry if too much text just scared you away; I was never good at brevity.