LoganJFisher opened 3 weeks ago
Hi @LoganJFisher, thanks for opening this issue. I'll have to think about whether it's worth implementing, but either way it will likely take a while before it lands.
When updating your server, you should back up your entire watcharr data folder to avoid any data loss. If you don't have an automated backup in place, copying the entire data folder somewhere else locally will do; you can then copy it back if you have any issues after updating (IMPORTANT: always ensure watcharr is shut down when copying the data folder, or copying into it).
At the very least, the only thing you really need to back up is your watcharr.db. Starting any new server with a backed-up database file will bring back all your data for all users (I would recommend doing this in the meantime).
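As a concrete sketch of that db-only backup, something like the function below would do. The paths are assumptions for illustration; stop watcharr first (e.g. `docker compose stop`) so the database isn't being written to while you copy it.

```shell
#!/bin/bash
# Minimal sketch of a db-only backup; paths passed in are assumptions.
# Stop watcharr first so the db file isn't written to mid-copy.
backup_db() {
  local data_dir="$1" backup_dir="$2"
  mkdir -p "$backup_dir" || return 1
  # Timestamped copy, e.g. watcharr.db.202501010600
  cp "$data_dir/watcharr.db" "$backup_dir/watcharr.db.$(date +'%Y%m%d%H%M')"
}
```

To restore, you'd just copy the newest file back over watcharr.db (again with the container stopped).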
Edit: I just wanted to add that the export functionality in the profile menu is currently only really meant for exporting to another service (if any provide a watcharr import in the future, for example), or for users that don't have direct access to the database. It's not the best option for a proper backup if you do have access to the database.
I've been looking around, but it seems like Watcharr's folders are all hidden on my Synology NAS. Other containers I have, like Gamevault and Pinchflat, have their folders (including their .db) located in the "docker" directory that I can access, but there's nothing anywhere for Watcharr.
It might be a configuration issue - it sounds like your watcharr data folder isn't mounted as a volume (which would explain your data disappearing after a restart).
Unfortunately I cannot help with that specific software, as I haven't used it before. I would suggest checking how the other containers have their volumes set up, and trying to replicate that for the watcharr one (watcharr data should be located at /data inside the container).
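For reference, a compose file that mounts the data folder would look something like this. The image name, port, and host path here are assumptions for illustration; check Watcharr's own docs/compose example for the exact values for your setup:

```yaml
services:
  watcharr:
    image: ghcr.io/sbondco/watcharr  # assumed image name - verify against the docs
    ports:
      - 3080:3080
    volumes:
      # host folder : container path - /data is where watcharr keeps watcharr.db
      - ./container_data:/data
```

With a mapping like this, the database survives container recreation and is visible on the host in the mapped folder.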
I think I might have sorted it. I probably should have thought to do precisely that before - that was dumb of me not to.
I'll let you know what happens next time there's an update. Hopefully that also fixed #589
I'd also suggest creating a cronjob that handles the backup automatically. Something like:
#!/bin/bash
BACKUP_DIR="backups";
PROJ_DIR="/volume2/docker/Watcharr/" # update this to the right location on your disk
DIR_TO_BACKUP="/volume2/docker/Watcharr/container_data" # update this to the right location on your disk
cd $PROJ_DIR;
currDate=$(date +'%Y%m%d%H%M');
echo "Running update / backup script at $currDate";
docker compose stop > /dev/null 2>&1; # stop the container before doing a backup
mkdir -p $BACKUP_DIR; # create backup dir if not already present
sudo tar -zcpf ./$BACKUP_DIR/data$currDate.tar.gz "$DIR_TO_BACKUP"; # compress data dir; sudo needed if some files have permissions that don't allow the current user to access them
if [ $? -ne 0 ]; then
echo "Failed to create backup. Restarting container and exiting.";
docker compose up -d > /dev/null 2>&1;
exit 1;
fi
echo "Finished creating backup at $BACKUP_DIR/data$currDate.tar.gz";
docker compose up -d > /dev/null 2>&1;
echo "Finished backing up";
Then add it to sudo crontab -e (sudo again needed for file permissions):
0 6 * * * /volume2/docker/Watcharr/update >> /volume2/docker/Watcharr/backups/logs.log
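For completeness, restoring from one of those archives is roughly the reverse. This is a sketch under the same assumptions as the script above: since the archive was created from an absolute path and GNU tar strips the leading "/" on create, extracting with -C / puts the files back where they came from. Stop the container first (`docker compose stop`) before restoring.

```shell
#!/bin/bash
# Minimal restore sketch: extract a backup archive back into place.
# With archives made from an absolute path, use the default dest of "/";
# a second argument lets you extract somewhere else to inspect first.
restore_backup() {
  local archive="$1" dest="${2:-/}"
  tar -zxpf "$archive" -C "$dest"
}
```

It's worth test-extracting into a scratch directory once so you know the archive is good before you ever need it.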
Follow-up on #589. After updating Watcharr and experiencing data loss, it's simple to export the watchlist of a single user. However, when setting Watcharr back up, I have to create a user profile and then import the file, and there's no simple way to export multiple users from a single admin profile.
This leads me to suggest that: