gilesknap / gphotos-sync

Google Photos and Albums backup with Google Photos Library API
Apache License 2.0
1.97k stars · 161 forks

Unable to sync photos #411

Closed chakrademorue closed 1 year ago

chakrademorue commented 1 year ago

Hello, thank you very much for your plugin, which is exactly what I was looking for. Sorry, I am a novice, but I can't get the sync to work.

I'm using a Synology NAS and I managed to install the container in Docker, and it is running. The application is properly created on the Google side and is published. Unfortunately, I can't synchronize.

I run the commands over SSH from my Mac. The first error when I tried to synchronize from the command line was a message saying "Database is locked".

I then deleted the gphotos.lock file in /storage and ran the command line again.

Here is the second error:

02-20 19:33:30 WARNING  gphotos-sync 3.0.3 2023-02-20 19:33:30.699715
02-20 19:33:30 ERROR    Process failed.
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/gphotos_sync/Main.py", line 496, in main
    self.setup(args, db_path)
  File "/usr/lib/python3.9/site-packages/gphotos_sync/Main.py", line 345, in setup
    self.auth.authorize()
  File "/usr/lib/python3.9/site-packages/gphotos_sync/authorize.py", line 96, in authorize
    flow.run_local_server(open_browser=False, host=self.host, port=self.port)
  File "/usr/lib/python3.9/site-packages/google_auth_oauthlib/flow.py", line 425, in run_local_server
    local_server = wsgiref.simple_server.make_server(
  File "/usr/lib/python3.9/wsgiref/simple_server.py", line 154, in make_server
    server = server_class((host, port), handler_class)
  File "/usr/lib/python3.9/socketserver.py", line 452, in __init__
    self.server_bind()
  File "/usr/lib/python3.9/wsgiref/simple_server.py", line 50, in server_bind
    HTTPServer.server_bind(self)
  File "/usr/lib/python3.9/http/server.py", line 137, in server_bind
    socketserver.TCPServer.server_bind(self)
  File "/usr/lib/python3.9/socketserver.py", line 466, in server_bind
    self.socket.bind(self.server_address)
OSError: [Errno 98] Address in use
02-20 19:33:30 WARNING Done.

Could you tell me what I can do to fix this?

Many thanks for your work, F.

gilesknap commented 1 year ago

I think you are trying to run the initial authentication on the NAS. The Google auth flow sets up an HTTP server to receive a redirect from your browser, and it looks like your NAS is already using the port that the flow uses.

You can change the port on the command line, BUT:

You must run this first time locally, on a machine with a browser, in order to complete the auth flow. Hopefully these instructions will help you do that: https://gilesknap.github.io/gphotos-sync/main/tutorials/installation.html#headless-gphotos-sync-servers
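As a quick check of the port conflict on the NAS before retrying, something like the following sketch may help (the port numbers are only examples; the container name GooglePhotosSync matches the one used later in this thread, so substitute your own):

```shell
# List listeners on the suspect port (run on the NAS over SSH)
netstat -tln | grep 8080

# If the port is taken, tell gphotos-sync to bind to a free one instead
docker exec -it GooglePhotosSync gphotos-sync /storage --port 8091
```

Note that this only works around the port clash; the interactive auth step itself still needs to run on a machine with a browser.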

chakrademorue commented 1 year ago

Thank you very much for your help, but sorry, I am a beginner and I don't know how to do it... My container is configured to use the same network as the Docker host. Should I reinstall my container, configure it in bridge mode, and set a port that is not used by my NAS?


Once this is done, can I run the command again? docker exec -it GooglePhotosSync gphotos-sync /storage

Finally, what URL should I go through to allow authentication? The first time I ran the above command, I had a URL come up in the terminal, which I copied into my browser to authorize the application.

gilesknap commented 1 year ago

Maybe the easiest thing to try is this.

I believe some people have had success with this approach

gilesknap commented 1 year ago

Failing that, maybe you would consider the paid service https://photovaultone.com/, which keeps a backup for you and manages all the secrets on your behalf.

chakrademorue commented 1 year ago

Hello,

Thank you very much for your help. I've tried a lot of things, but it still doesn't work, although it's no longer the same problem. I can't authenticate the app on the first connection. My browser displays an error: either ERR_CONNECTION_REFUSED or, sometimes briefly, ERR_CONNECTION_CLOSED. I made a little video to show the steps: https://drive.google.com/file/d/1Mod-16oDxXCnmdG1JwMPIPJvMBki04X6/view?usp=share_link

I tried with port 8080 (which is not used by any other instance), then with custom ports; nothing works. The firewalls of my Synology NAS, my router, and my Mac are disabled. It doesn't work with http, https, a local URL, or a remote URL. Everything works fine on the Google side, but as soon as I have to finalize the installation by connecting to my container, it fails.

Do you have any idea what is causing the problem?

Also, thanks for the PhotoVaultOne suggestion, but it's not quite what I'm looking for. I would like a folder mounted on my computer, with the photos available almost "live". If I can get gphotos-sync to work, I'll run a sync once a day to get the latest photos directly onto my computer.

gilesknap commented 1 year ago

OK, sorry that did not work out for you. It was a hack that has worked for others, but it depends on how open your NAS is.

The official approach is more fiddly and is what I linked above in https://gilesknap.github.io/gphotos-sync/main/tutorials/installation.html#headless-gphotos-sync-servers.

I've just had a go at following the instructions myself and I realize they are quite difficult. I'll have a go at writing a more step-by-step approach this weekend and post it here for you to try.

chakrademorue commented 1 year ago

Hello,

Thank you again, this is really very generous of you.

For my latest tests, I used this command for installation (found in this tutorial: https://bullyrooks.com/index.php/2021/02/02/backing-up-google-photos-to-your-synology-nas):

sudo docker run -p 8080:8080 \
    -ti \
    --name gphotos-sync \
    -v /volume1/mydirectory/gphotos:/storage \
    -v /volume1/docker/gphotosync:/config \
    gilesknap/gphotos-sync:latest \
    /storage

This is because I would like a permanent container that I can then start from a daily scheduled task: docker container start gphotos-sync
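On Linux, that daily task could be a crontab entry like this sketch (Synology's DSM Task Scheduler can run the same command; the schedule and the docker binary path are assumptions, adjust for your system):

```shell
# Start the existing (stopped) gphotos-sync container every day at 03:00
0 3 * * * /usr/local/bin/docker container start gphotos-sync
```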

However, I also tried with the command line indicated in your documentation: sudo docker run --rm -v /volume1/docker/gphotosync:/config -v /volume1/docker/gphotosync -p 8080:8080 -it ghcr.io/gilesknap/gphotos-sync /storage

But the result is the same: I'm asked to go to a Google URL to authenticate and authorize the app, then when I'm redirected to the local URL, the page doesn't load (error: ERR_CONNECTION_REFUSED, see the end of my video), even if I type the local IP instead of localhost.

gilesknap commented 1 year ago

Hi @Floooowk,

I have just walked through this and will document the steps here. Let me know if this works for you. If all is well I'll add this into the docs.

First you need a Linux workstation with a browser and Docker or Podman installed. If you don't have this, let me know what kind of workstation you do have and I'll see if I can write instructions for your use case. The reason you need this is that the initial OAuth flow from Google expects to run on a single machine and use a browser for the interactive part of the authentication. Once you have done this, you will get a token file that you can copy to your NAS; it should then be renewed indefinitely, so this is a once-only step.

First you need to go through the instructions here to get a secrets file which will end up in $HOME/.config/gphotos-sync. (I believe you have already completed this step)

Next run up gphotos-sync in a local container on your workstation (replace podman with docker if that is what you are using):

mkdir /tmp/storage
podman run --rm -v $HOME/.config/gphotos-sync:/config -v /tmp/storage:/storage -p 8080:8080 -it ghcr.io/gilesknap/gphotos-sync /storage --port 8080 --skip-files --skip-albums

NOTE: if port 8080 is in use on your workstation then change 8080 to an unused port (replace all 3 places 8080 appears in the command line).
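For example, with 8080 replaced by 8090 in all three places (8090 is an arbitrary unused port):

```shell
podman run --rm -v $HOME/.config/gphotos-sync:/config -v /tmp/storage:/storage \
    -p 8090:8090 -it ghcr.io/gilesknap/gphotos-sync /storage --port 8090 \
    --skip-files --skip-albums
```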

You should see output similar to this:

[giles@ws1 ~]$ podman run --rm -v $HOME/.config/gphotos-sync:/config -v /tmp/storage:/storage -p 8080:8080 -it ghcr.io/gilesknap/gphotos-sync /storage --port 8080
02-24 18:53:17 WARNING  gphotos-sync 0.1.dev1+g208a216 2023-02-24 18:53:17.671256 
Please visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=XXXXXXXXXXXXXXXXXX.apps.googleusercontent.com&redirect_uri=http%3A%2F%2Flocalhost%3A8080%2F&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fphotoslibrary.readonly+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fphotoslibrary.sharing&state=XXXXXXXX&access_type=offline

Copy the URL and paste it into your local browser running on the same workstation as the container. This should take you through the standard Google authentication UI in your browser, and when you complete it, gphotos-sync will continue to run. This works because the URL you pasted includes a redirect URL parameter that connects back to an HTTP server running on port 8080 inside the container.

Because you specified --skip-files and --skip-albums, gphotos-sync will exit immediately without downloading anything. However, it will have written your authentication token to /tmp/storage/.gphotos.token.

Now you need to copy the file .gphotos.token to the storage folder you are using on your NAS. When you next run the container on the NAS it should use that token and not ask for any credentials. When the token expires it should be automatically renewed. From now on your NAS should be able to run gphotos-sync indefinitely.
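One way to copy the token is scp from the workstation to the NAS. The hostname, user, and destination path here are placeholders (the path matches the storage folder mounted earlier in this thread); use whatever folder you mount as /storage on the NAS:

```shell
scp /tmp/storage/.gphotos.token admin@my-nas:/volume1/mydirectory/gphotos/
```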

Good luck.

chakrademorue commented 1 year ago

Hello @gilesknap ,

Thank you very much for this super detailed tutorial which helped me a lot. I am pleased to announce that it works now!

I'm on a Mac, so I installed Docker Desktop for Mac, then used the Terminal to set up the container on my computer, then followed the steps of your tutorial up to moving the token to my NAS. Everything is now operational. I had tried before on an Ubuntu VM, but without success.

I also scheduled a daily task that runs the command: docker container start gphotos-sync

I have some last little questions:

1/ When my daily task runs, will it re-download all the photos, or just add the new ones and delete the ones I deleted in Google Photos?

2/ Is it possible to have only folders per year (without subfolders per month)?

Once again, many thanks for your precious help and for your work!

gilesknap commented 1 year ago

@Floooowk I'm glad that worked out!

Regarding your other questions:

  1. The default behaviour is to download new items only and not to remove items that have been deleted in the library. To delete items, add the --delete option.
  2. There are some folder-formatting options that I'm not very familiar with, as they were contributed by another developer. Take a look at --path-format FMT (use --help to see all options); I think it will do what you want.

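As a sketch, folders per year only might look like the following, assuming --path-format accepts strftime-style patterns (check gphotos-sync --help for the exact syntax before relying on this; the container name matches the one used earlier in this thread):

```shell
docker exec -it GooglePhotosSync gphotos-sync /storage --path-format "%Y"
```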
chakrademorue commented 1 year ago

Thanks a lot for your answer. Last question: how do I add the --delete option? I tried it in the command, but it doesn't work: docker container start --delete gphotos-sync

Sorry if my question is stupid; I tried to look in the documentation and to search, but I can't find it.

sebrem commented 1 year ago

Thanks a lot for your answer. Last question: how do I add the --delete option? I tried it in the command, but it doesn't work: docker container start --delete gphotos-sync

With Docker the order of parameters really matters: your --delete is being passed to docker start (I think it would only work for docker run, to remove stopped containers), but you want to pass it to the so-called entrypoint of the container, so parameters intended for gphotos-sync have to be specified after the name of the container image.

I'd always make sure what the entrypoint looks like (e.g. using docker inspect and/or looking at the source code repository of the docker image) before passing parameters.
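For example, the entrypoint and default arguments baked into the image used in this thread can be printed like this:

```shell
docker inspect --format '{{.Config.Entrypoint}} {{.Config.Cmd}}' ghcr.io/gilesknap/gphotos-sync
```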

Hope that helps!

chakrademorue commented 1 year ago

Thanks @sebrem! I've also tried putting --delete at the end, but it doesn't work either.

sebrem commented 1 year ago

You can't modify the entrypoint arguments of an existing container; each run of the entrypoint with new arguments is effectively a new instance. You need a new docker run (which will create a new container, so you either need to use another name or docker rm gphotos-sync first) in order to pass the argument.

This is now moving into the scope of general Docker usage and should probably be continued elsewhere, to keep the notifications down for everyone involved.

chakrademorue commented 1 year ago

Thank you for your time and sorry for the inconvenience. Just a last post to say that I found the solution: the correct name for the option is --do-delete (not --delete).

Which gives this command: docker run -ti --name gphotos-sync -v /volume1/docker/PhotoSync:/config -v /volume1/homes/myfolder/GooglePhotos:/storage -p 8080:8080 ghcr.io/gilesknap/gphotos-sync /storage --port 8080 --do-delete

Unfortunately, this does not work, and when I delete photos in Google Photos, they remain in my local folder.

sebrem commented 1 year ago

If you use --do-delete you also need to rebuild the local index, as far as I know.

I don't have the parameter at hand right now, but --help will tell you. It's something like "flush index", mentioned alongside --do-delete (or was it the --do-delete help text mentioning the index option?).

chakrademorue commented 1 year ago

Thanks again for your help. Adding --flush-index before --do-delete does indeed delete items locally after they have been deleted from Google Photos.

But now I have a new problem that seems unrelated. I will open another thread.

gilesknap commented 1 year ago

@sebrem thanks for your answers. Yes (I forgot): --do-delete requires --flush-index to work. This does mean it takes a bit of extra time to rebuild the index, but it still sees the already-downloaded files and only does an incremental download of content.
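Putting the two flags together, a one-off cleanup run over the paths used earlier in this thread might look like this sketch:

```shell
docker run --rm -v /volume1/docker/PhotoSync:/config \
    -v /volume1/homes/myfolder/GooglePhotos:/storage \
    ghcr.io/gilesknap/gphotos-sync /storage --flush-index --do-delete
```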

JPnux commented 10 months ago

Hello, what is the difference between --flush-index and --rescan?

gilesknap commented 10 months ago

Good question!

The differences are: