torwag opened this issue 4 years ago
In the meantime, rmapi can upload/manage files (the server address can be set via environment variables).
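Roughly, pointing rmapi at rmfakecloud and pushing a file could look like this Python sketch. RMAPI_HOST is the variable the rmfakecloud docs suggest for a self-hosted address; the server URL and file name below are just placeholders:

```python
import os
import subprocess

# Point rmapi at a self-hosted rmfakecloud instance instead of the official cloud.
# The URL here is only a placeholder for your own server.
env = dict(os.environ, RMAPI_HOST="https://rmfakecloud.example.com")

# Upload a document to the device via rmapi's CLI ("rmapi put <file> [folder]").
subprocess.run(["rmapi", "put", "scan.pdf"], env=env, check=True)
```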
So we could have a Dockerfile which contains rmfakecloud and rmapi, define an import folder and a backup folder, and start rmapi whenever a file arrives in the import folder (by whatever method) to put it into rmfakecloud and hence onto the rM. Furthermore, periodically export all data from rmfakecloud into the backup folder, where other tools can pick it up.
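A minimal sketch of the import side, assuming rmapi is already registered against rmfakecloud and using an example mount point for the import folder:

```python
import subprocess
import time
from pathlib import Path

IMPORT_DIR = Path("/data/import")  # example mount point inside the container

def upload_new_files() -> None:
    """Push every PDF/EPUB found in the import folder to rmfakecloud, then remove it."""
    for f in list(IMPORT_DIR.glob("*.pdf")) + list(IMPORT_DIR.glob("*.epub")):
        subprocess.run(["rmapi", "put", str(f)], check=True)
        f.unlink()

if __name__ == "__main__":
    while True:
        upload_new_files()
        time.sleep(30)  # simple polling; inotify/watchdog would be nicer
```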
I have something very similar (without the need for a fakecloud, as the data is just normal data) running with rclone and Nextcloud for my ebook reader. In the case of the rM, a folder on Nextcloud serves as the entry folder, e.g. "rM_incoming"; all files in there get synced and are deleted afterwards. Another folder, e.g. "rm_backup", contains the entire content of the rM, in case it gets lost or you want to have access to it outside the rM universe.
I got confused :> those are 2 different things, I think.
My thought was to a) observe a folder, upload new content via rmapi to rmfake, and delete the original file in that folder, and b) have rmapi export all documents from rmfake into (another) folder periodically, ideally only updating new content.
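For b), the periodic export could be as simple as this sketch. Whether rmapi can skip unchanged documents I don't know, so it just re-fetches everything into the backup folder on each pass (the path is an example; "rmapi mget" is the recursive download command):

```python
import subprocess
import time
from pathlib import Path

BACKUP_DIR = Path("/data/backup")  # example export target

def export_all() -> None:
    """Mirror the whole rmfakecloud document tree into the backup folder."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    # Recursively download the remote tree into the backup folder.
    subprocess.run(["rmapi", "mget", "/"], cwd=BACKUP_DIR, check=True)

if __name__ == "__main__":
    while True:
        export_all()
        time.sleep(3600)  # hourly snapshot; adjust to taste
```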
What happens around those folders, i.e. how data gets into the import folder and how the output folder gets used, would be out of scope.
Example scenario:
A) Set up an FTP server pointing at the incoming folder and configure the scanner to scan to FTP as PDF. Every scan ends up in the incoming folder -> rmapi -> rmfake -> rM.
B) The output folder is observed by rclone, which points to a Nextcloud server. Different folders there are shared with different coworkers. Thus, on the rM you have a folder "John" and one "Administration", and whenever you copy a file into one of those folders on your rM or change it (e.g. sign it), it ends up in the corresponding Nextcloud share, accessible to your colleagues.
ERROR: 2023/03/12 06:45:17 auth.go:93: failed to create a new device token
This happens while trying to register the current version of rmapi.
The server might always contain just the raw data. However, with rclone and some other tools in mind, it would be nice to have a backend hook so that whenever a data change is reported by the server, post-processing is triggered, e.g. to create a set of readable files and sync them with another service (Nextcloud, borg-backup, git, rsync, etc.). This would allow users to use dedicated tools for other tasks (sync, backup, notification, etc.) without bloating the server itself. For this, a set of events like on_newfile, on_changedfile, on_removedfile to trigger certain scripts would be helpful.
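Just to illustrate, the dispatch could be as simple as this Python sketch; the hooks directory and the call site are hypothetical, nothing like this exists in rmfakecloud today:

```python
import subprocess
from pathlib import Path

HOOK_DIR = Path("/etc/rmfakecloud/hooks")  # hypothetical location for user scripts

def fire(event: str, document_path: str) -> None:
    """Run the user-supplied script for a given event, if one exists.

    event would be one of "on_newfile", "on_changedfile", "on_removedfile";
    the affected document's path is passed as the first argument so the
    script can decide what to sync, back up, or notify about.
    """
    script = HOOK_DIR / event
    if script.is_file():
        subprocess.run([str(script), document_path], check=False)

# Example: the storage layer would call this after persisting a change:
# fire("on_changedfile", "/data/users/alice/documents/notebook.zip")
```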