pcloudcom / console-client


Sync folders like on the desktop app #23

Open puchrojo opened 7 years ago

puchrojo commented 7 years ago

Feature request: Could you please add sync folders like on the desktop app? That is a major feature of pCloud. On the Raspberry Pi only this client works, and the FUSE mount point is not sufficient if you want to share the content with Samba or minidlna on your network. You need the data offline and on the normal file system.

Regards, Isaac

jav-montoya commented 7 years ago

A feature very much needed. I use the CLI version on my laptop because my distro is not based on Debian, and it is very sad to lose access to all your files due to a bad Internet connection.

d5c0d3 commented 6 years ago

Could you not just use a command line sync client like for example rsync?

ddssff commented 6 years ago

That does not sound like a solution I would be proud of if I worked for pcloud.

counterpoint commented 6 years ago

Unfortunately there seem to be more problems with pCloud on Raspberry Pi than just the question of linking up to minidlna. It is possible to overcome the technical problems of the link up. I created systemd services for both minidlna and pcloudcc so that both operated as services under the same user. It was then possible to link minidlna to files within pCloud. However, it didn't work very well.
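For reference, the service arrangement described here might be sketched roughly like this. Everything in the unit (user name, mail address, binary path, pcloudcc flags) is an assumption for illustration; the password would need to be saved beforehand, and on a real system the file would go to /etc/systemd/system/ rather than the current directory:

```shell
#!/bin/sh
# Sketch: generate a systemd unit so pcloudcc runs as the same user as
# minidlna.  User, mail address and paths are examples, not real values.
UNIT=./pcloudcc.service
cat > "$UNIT" <<'EOF'
[Unit]
Description=pCloud console client
After=network-online.target

[Service]
Type=simple
User=media
ExecStart=/usr/local/bin/pcloudcc -u user@example.com -d
Restart=on-failure

[Install]
WantedBy=multi-user.target
EOF
echo "wrote $UNIT"
```

A matching unit for minidlna with the same `User=` line is what lets both services see the same files.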

I also found that other things didn't work well on the Raspberry Pi. Although the pCloud files appeared present, I was unable to transfer a selection of them with rsync to a different directory on the Pi: the rsync ran for a while, but then stalled. To verify that this isn't a general problem, I built pcloudcc on a headless Debian server with an Intel Xeon, and found that rsync worked absolutely fine there. So the problems seem to be specific to the Raspberry Pi.

It's good that pcloudcc is available to build, so that in principle it can be used on non-Intel platforms such as Raspberry Pi. But it does appear to need more development to be robust on the Pi. Whether that will get any priority from the pCloud developers I don't know! It would certainly open up interesting possibilities if a robust implementation of pCloud was available on that platform.

bblboy54 commented 5 years ago

This feature is something that I would really like to have as well. All of my videos and photos are automatically synced from my phone to pCloud, and many of those videos I want to copy to my media server. My solution to this when I used Dropbox was to sync a single folder to a directory; from within the Dropbox app I could easily copy a video from my auto-upload folder to the "HomeMovieSync" folder. On my media server I had a cron job set up to issue a mv command a couple of times per day, which would put any videos in that folder into the correct media server directory and then delete them from HomeMovieSync as a result. In theory the pCloud Drive option would work, but there are two concerns:

1) I would really rather this media server not have (easy) access to all of my pCloud Drive.
2) I am concerned about the speed of the copy, mainly because of the possibility of a partial file being copied.

Besides this particular use case, there are likely many other reasons why people would want this feature in a CLI. With the number of Linux users leaving Dropbox right now, and Nextcloud/ownCloud not having a native CLI sync client, I'm sure there are lots of people looking to pCloud as a Dropbox replacement who might need this feature as well.
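The mv half of that old Dropbox workflow might be sketched as follows. The paths are invented, and scratch directories stand in for the real synced folder and media library so the snippet can run anywhere:

```shell
#!/bin/sh
# Sketch of the cron half of the workflow above: move whatever landed in
# the sync folder into the media library, emptying the sync folder as a
# result.  Real paths would point at the synced folder and media directory.
SYNC=./HomeMovieSync
MEDIA=./media/HomeMovies
mkdir -p "$SYNC" "$MEDIA"
touch "$SYNC/clip.mp4"     # stand-in for an auto-uploaded video
# in a crontab, this line would run a couple of times per day
mv "$SYNC"/* "$MEDIA"/ 2>/dev/null
ls "$MEDIA"
```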

counterpoint commented 5 years ago

@bblboy54 I wonder if you could achieve what you want with rsync as an intermediary? Since pCloud doesn't actually copy files on to a client unless they are needed (unlike Dropbox) there is no penalty to having the whole of your pCloud drive on your media server. The pCloud drive cannot be accessed by other users than the pCloud user, so you can achieve separation from other media server activities, and use rsync to keep an ordinary directory sync'd with a pCloud directory. Run your cron job from there. Make sure rsync is configured to sync deletions.

blackjackshellac commented 5 years ago

@counterpoint so the issue of rsync stalling has been resolved for you? Is this project still being maintained?

counterpoint commented 5 years ago

@blackjackshellac Hmm, maybe not. I'd forgotten about that issue. Maybe there is some way to adapt what I am doing. I have pCloud console installed on a headless server that is running on an Intel VPS using Debian. The main purpose of the server isn't connected to pCloud, it just happens to be available. The Raspberry Pi that runs minidlna connects to the VPS using sshfs, the SSH file system. That seems to work. But it does depend on having access to an Intel (or AMD) server. I don't know whether there is any maintenance, certainly there isn't much response to requests.

phocean commented 5 years ago

It seems that synchronization does work, in the sense that the console client is somehow able to read a previous configuration from the desktop client. There is no documentation on how it works or whether there is some configuration file that could be edited.

phocean commented 5 years ago

Ok, I found it. This is doable, but not trivial.

In the $HOME/.pcloud folder, there is a data.db file, which is simply a SQLite database.

You can open it with a GUI like DB Browser for SQLite, or from the sqlite3 CLI.

Then, there is a syncfolder table, with entries like:

"1" "XXX"   "/home/user/MySync" "3" "1" "123"   "456"

There is the local path to sync, and XXX is an ID that points to the remote location.

To find a suitable remote location ID, you need to go to the folder table.

You can find it with a query like:

select * from folder where id = XXX;

which gives back the ID and the remote folder name:

"XXX"   "0" "123"   "15"    "MySync"    "123"   "123"   "0" "1"

So, basically, if you need to add a new sync folder, you have to find a suitable remote location ID, and then create a suitable entry in the syncfolder table.

Not difficult, but hacky. Good luck!
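A minimal sketch of the lookup described above. On a real install the database is $HOME/.pcloud/data.db; a throw-away copy with the column layout reported in this thread is created here so the query can run anywhere, and the folder row is an invented example:

```shell
#!/bin/sh
# Create a scratch database mimicking the folder table, then run the
# remote-location lookup against it.  Values are examples, not real data.
DB=./data_demo.db
rm -f "$DB"
sqlite3 "$DB" <<'SQL'
CREATE TABLE folder (
  id INTEGER PRIMARY KEY, parentfolderid INTEGER, userid INTEGER,
  permissions INTEGER, name TEXT, ctime INTEGER, mtime INTEGER,
  flags INTEGER, subdircnt INTEGER);
INSERT INTO folder VALUES (42, 0, 15913130, 15, 'MySync', 0, 0, 0, 1);
SQL
# look up the remote location ID for the folder you want to sync
sqlite3 "$DB" "SELECT id, name FROM folder WHERE name = 'MySync';"
```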

bateau84 commented 4 years ago

It didn't quite turn out to be that simple. You need the inode and device id and so forth. You wouldn't have a simpler fix or one-liner to create a sync, would you?

phocean commented 4 years ago

@bateau84 why did it work for me then?

coxackie commented 4 years ago

@phocean there is an inode column though in folder - it is the next-to-last one, that you have dubbed "123". how did you get it for the new folder to sync?

AckslD commented 4 years ago

@coxackie I don't see inode in folder. I get the following columns:

sqlite> select * from folder;
id          parentfolderid  userid      permissions  name        ctime       mtime       flags       subdircnt
----------  --------------  ----------  -----------  ----------  ----------  ----------  ----------  ----------
...
alworx commented 3 years ago

> @coxackie I don't see inode in folder. I get the following columns: [...]

Run ls -id /path/to/folder to get the folder's inode.

I'm also giving this a shot ;) Thanks to @phocean for the database insights, looks like an insert to the syncfolder should do the job.

AckslD commented 3 years ago

@alworx I see, thanks! Did you make it work btw?

alworx commented 3 years ago

@AckslD On it, but not full-time... As I'm primarily targeting a simple way to run two-way sync within a Docker container, I'll most probably come up with a script writing a syncfolder connection to the database, but not a lot more... Basically I came here because I needed to change the api-url (EU-hosted space).

Btw, stat -c '%i %d' /path/to/folder shows inode and deviceid!

alworx commented 3 years ago

> @alworx I see, thanks! Did you make it work btw?

No, I'm sort of stuck now. pcloudcc is running inside a docker container (ubuntu:18.04), mounting the pCloudDrive with fuse and syncing in both directions inside the drive just fine. That's the good part.

When I insert a syncfolder in the database and put a file in the local synced folder, it is logged as uploaded but never ends up in the cloud:

pCloud console client v.2.0.1
Down: Everything Downloaded| Up: Everything Uploaded, status is SCANNING
Down: Everything Downloaded| Up: Everything Uploaded, status is READY
Down: Everything Downloaded| Up: Remaining: 1 files, 11.2KB, status is UPLOADING
Down: Everything Downloaded| Up: Everything Uploaded, status is READY

When I add a file via the webapp, an event gets triggered but that file does not show up in my local synced folder.

Could be an API incompatibility or permission/filesystem problem or maybe I missed additional changes in the database necessary for a properly working syncfolder.

Any thoughts anyone?

kchrispens commented 3 years ago

> No, I'm sort of stuck now. [...] Could be an API incompatibility or permission/filesystem problem or maybe I missed additional changes in the database necessary for a properly working syncfolder. Any thoughts anyone?

Did you set the value of the "flags" column to 0?

I struggled a long time to get it working by manually inserting a new entry into the "syncfolder" table. Then I took a look into the code.

I think "flags" has to be set to 0 for new entries.
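As a concrete illustration of that fix (the table layout and row values below are assumptions pieced together from this thread; on a real install the UPDATE would run against $HOME/.pcloud/data.db):

```shell
#!/bin/sh
# Sketch: make sure a hand-inserted syncfolder row has flags = 0.  A
# scratch database with the columns named in this thread is used so the
# statements can run anywhere; the row values are invented examples.
DB=./flags_demo.db
rm -f "$DB"
sqlite3 "$DB" "CREATE TABLE syncfolder (id INTEGER PRIMARY KEY,
  folderid INTEGER, localpath TEXT, synctype INTEGER, flags INTEGER,
  inode INTEGER, deviceid INTEGER);"
sqlite3 "$DB" "INSERT INTO syncfolder VALUES
  (1, 42, '/home/user/MySync', 3, 1, 524920, 2049);"
# reset flags to 0 so the client treats the entry as a fresh sync
sqlite3 "$DB" "UPDATE syncfolder SET flags = 0 WHERE id = 1;"
sqlite3 "$DB" "SELECT flags FROM syncfolder WHERE id = 1;"
```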

alworx commented 3 years ago

> Did you set the value of the "flags" column to 0? [...] I think "flags" has to be set to 0 for new entries.

I guess not, since I overlooked the flags when I had a look at the code.

You're right, I'm confident that this will start the file transfers just like it should. I'll give it a try, thanks for the valuable input!

alworx commented 3 years ago

Thanks @kchrispens, that really helped!

I wrapped up a little convenience script for managing the syncfolders directly in the database; maybe it's of use to someone. The script is also packaged alongside the client binaries in the Docker image that I now use with bind-mounts for the database and local folders.

Feel free to use, but use at your own risk :) https://github.com/alworx/console-client/blob/use-eu-server/syncfolder https://github.com/alworx/console-client/blob/use-eu-server/README-Docker.md

Having CLI options in the client itself for managing the syncfolders would be the far better option imho, but I'm not sure I would get this right in a reasonable amount of time. Especially since the console-client could be dropped by pCloud anytime, I don't feel like investing too much time in it. I appreciate the client existing at all, but I wish it would draw a bit more attention from the pCloud dev team.

Anyway, foldersync works fine so far for me. Changes on remote are transferred pretty much immediately, changes on local folder take some seconds to be detected and synced. First run can take some time, I guess maybe the files are being re-hashed on a new client.

Ithanil commented 2 years ago

@alworx Great work, your script works as expected. Thank you!! But this should really already be a part of the official CLI.

glenndm1000 commented 1 year ago

@all: thanks for this info. Setting flags to 0 seems to start the local download.

My setup was (is) Debian in a VM sharing out the pCloud data over my LAN via Samba, thus avoiding having to install pCloud locally on all devices.

This worked fine as long as the internet connection was up; when it failed, access to the data was lost.

The local sync solves that issue: on the Debian box everything is synced to a local folder and that folder is shared over the LAN. If the internet fails, access to the local copy persists.

A minor issue: it does not seem possible to sync the parent folder (folderid 0?). Workaround: I put everything in a subfolder and synced that.

aemonge commented 8 months ago

https://github.com/rclone/rclone

glenndm1000 commented 3 months ago

Hello all, A further update (and a wave to future-me who will, no doubt, in years to come, need this info because present-me did not document it properly, like in 2022 :) )

I tried to sync a folder shared with me locally, using the syncfolder script kindly provided by @alworx. Whereas this worked very well with folders in my own account, I could not get the script to accept the shared folder; it returned that the folder did not exist. Refused command: "./syncfolder.sh add /Shared /home/myuser/pcloud_Local/Shared sync". Probably my mistake.

Using the info in this thread, I manually added the record in the sqlite syncfolder table. This did work and the sync of the shared folder is up and running.

steps:

0) Run and stop the pCloud sync once in order to create the sqlite3 data.db file.

1) Retrieve the inode and device id for the local target folder:

$ stat -c '%i %d' /home/myuser/pcloud_Local/Shared
524920 2049

2) Retrieve the folderid of the /Shared folder (Shared = the default pCloud top-level folder for files/folders shared with the user; can be customized):

sqlite> select * from folder where name='Shared';
6639371585|0|15913130|15|Shared|1594892813|1710944581|0|3

3) Add a new record to the syncfolder table (this was the first record, so id = 1; flags = 0 to prime the initial download):

sqlite> insert into syncfolder (id,folderid,localpath,synctype,flags,inode,deviceid) values (1,6639371585,'/home/myuser/pcloud_Local/Shared',3,0,524920,2049);

4) Start the pCloud sync:

pcloudcc -u -p -d

HTH
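The steps above can be sketched as one script. On a real install DB would be $HOME/.pcloud/data.db (created by running pcloudcc once) and LOCAL the Samba-shared target folder; scratch paths and an example folder row from this thread are used here so the script can run anywhere:

```shell
#!/bin/sh
# Consolidated sketch of steps 0-4 above, against a scratch database that
# mimics the folder/syncfolder tables.  All ids and paths are examples.
DB=./steps_demo.db
LOCAL=./pcloud_Local/Shared
rm -f "$DB"
mkdir -p "$LOCAL"
sqlite3 "$DB" <<'SQL'
CREATE TABLE folder (id INTEGER PRIMARY KEY, parentfolderid INTEGER,
  userid INTEGER, permissions INTEGER, name TEXT, ctime INTEGER,
  mtime INTEGER, flags INTEGER, subdircnt INTEGER);
INSERT INTO folder VALUES (6639371585, 0, 15913130, 15, 'Shared',
  1594892813, 1710944581, 0, 3);
CREATE TABLE syncfolder (id INTEGER PRIMARY KEY, folderid INTEGER,
  localpath TEXT, synctype INTEGER, flags INTEGER, inode INTEGER,
  deviceid INTEGER);
SQL

# step 1: inode and device id of the local target folder
set -- $(stat -c '%i %d' "$LOCAL")
INODE=$1; DEVID=$2

# step 2: folderid of the /Shared folder
FOLDERID=$(sqlite3 "$DB" "SELECT id FROM folder WHERE name = 'Shared';")

# step 3: insert the sync record; flags = 0 primes the initial download
sqlite3 "$DB" "INSERT INTO syncfolder
  (id, folderid, localpath, synctype, flags, inode, deviceid)
  VALUES (1, $FOLDERID, '$LOCAL', 3, 0, $INODE, $DEVID);"
sqlite3 "$DB" "SELECT folderid, flags FROM syncfolder WHERE id = 1;"

# step 4 on a real system: restart the client in daemon mode
# (pcloudcc with your username and the -d flag)
```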