georgzoeller opened this issue 7 months ago
Do you want to do this per user or as a system backup? If it's a system backup, just archiving the 'DATA_DIR' is good enough. Redis data is ephemeral, and the search index can be reconstructed from the admin panel.
For me, a system-wide backup would be enough. I think for less technical users, it would be nice to have some documentation about how to back up the DATA_DIR.
This feels like it will be specific to each users setup and their preferred backup methods and locations. Or are you just thinking of some mention in documentation that backing up the data is enough for a restore or transfer?
Hello, do you mean DATA_DIR is the hoarder_data volume, i.e. /var/lib/docker/volumes/hoarder_data/_data?
It would be nice to have a better solution. I didn't check the contents of that directory, but copying data files on the fly isn't necessarily a great idea, see e.g. https://theorangeone.net/posts/database-backups/ or https://www.sqlite.org/howtocorrupt.html#_backup_or_restore_while_a_transaction_is_active.
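For what it's worth, SQLite ships an online backup API that produces a consistent copy even while a transaction is active, which sidesteps the corruption scenarios those links describe. A minimal sketch in Python (the database file name under DATA_DIR is an assumption; check your own deployment):

```python
import sqlite3

def backup_sqlite(src_path: str, dest_path: str) -> None:
    """Make a consistent copy of a live SQLite database.

    Uses sqlite3.Connection.backup (SQLite's online backup API),
    which is safe even with concurrent writers, unlike a plain
    file copy of the .db file.
    """
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    try:
        src.backup(dest)  # copies all pages consistently
    finally:
        dest.close()
        src.close()
```

Note this only covers the database itself; assets on disk in DATA_DIR would still need to be copied separately.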
You simply have to stop your Docker containers, do the backup, and then restart them afterwards. I'm not sure how many users your Hoarder instance has, but I'm guessing it should be easy enough to find a time frame when people are asleep. In the worst case, you have five minutes of downtime.
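The stop/archive/restart flow could be scripted roughly like this. This is a sketch, assuming a docker-compose deployment; the function name, default paths, and the injectable `compose` argument are illustrative, not part of Hoarder:

```python
import datetime
import pathlib
import subprocess
import tarfile

def cold_backup(data_dir: str, backup_dir: str,
                compose: tuple = ("docker", "compose")) -> pathlib.Path:
    """Stop the stack, archive DATA_DIR, then restart the stack.

    `compose` is injectable so the archiving step can be exercised
    without Docker installed.
    """
    subprocess.run([*compose, "stop"], check=True)   # quiesce SQLite writers
    try:
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        archive = pathlib.Path(backup_dir) / f"hoarder-{stamp}.tar.gz"
        with tarfile.open(archive, "w:gz") as tar:
            tar.add(data_dir, arcname=".")           # the whole DATA_DIR tree
        return archive
    finally:
        subprocess.run([*compose, "start"], check=True)  # always restart
```

Wrapped in a cron job, that gives you a nightly cold backup with a short downtime window.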
A per-user backup would also be nice. This could be something like an SQLite DB file containing the query results for the current user :thinking:
Considering I'm a single user, backing up DATA_DIR is fine for me.
This may seem simple for people who are comfortable working on the backend, but not for those who aren't engineers or coders, like the people who use my instance of Hoarder. A built-in backup would definitely improve the UX for non-technical users.
Agreed, backup and restore would be a good feature, at least as an HTML file (without cached content and screenshots).
I have added support for a per user export of links and notes which is going to be available in the next release (https://github.com/hoarder-app/hoarder/commit/c8a3c1ee02e917b2e553d403b7bf94cbc736f51d). It's not a backup though, it's for the sake of migrating to another platform if you want to.
That's good. Even export is fine. It just means indexing and inference might need to be done again, but as long as one can get a copy of all the links, it's good. :)
Keep up the good work buddy. 👍🏽
Another use case: I'm planning to run this on Google Cloud Run or Amazon Lightsail (the container version, not the VPS version), but I wouldn't feel safe without a backup mechanism in the event of container crashes. So I'd like to be able to periodically save the DATA_DIR (or just the export data) to the respective cloud's object storage. Is this something you'd be open to accepting a PR for?
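A rough sketch of what that periodic job could look like. Everything here is hypothetical, not existing Hoarder code: the `upload` callable stands in for whatever SDK you use, e.g. a thin wrapper around boto3's `upload_file` for S3 or the google-cloud-storage equivalent:

```python
import datetime
import pathlib
import tarfile
import tempfile
from typing import Callable

def backup_to_object_storage(data_dir: str,
                             upload: Callable[[str, str], None],
                             prefix: str = "hoarder") -> str:
    """Archive DATA_DIR and push it to object storage.

    `upload(key, local_path)` is injected so the same code works
    for S3, GCS, etc.  Returns the object key that was written.
    """
    stamp = datetime.datetime.now(datetime.timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    archive = pathlib.Path(tempfile.gettempdir()) / f"{prefix}-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(data_dir, arcname=".")      # the whole DATA_DIR tree
    key = f"{prefix}/{archive.name}"        # object key within the bucket
    upload(key, str(archive))
    archive.unlink()                        # don't leave archives on local disk
    return key
```

Triggered from cron or a sidecar container, with e.g. `upload=lambda key, path: s3.upload_file(path, "my-bucket", key)`, this would give crash-resilient snapshots on ephemeral container hosts.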
It would be nice to be able to back up/export the saved content and restore it (potentially on another machine).