**peterbarker** opened this issue 7 years ago (status: Open)
@peterbarker My current work adding this feature can be found here. I'm attempting to follow your proposal but have hit a bit of an issue with `cloudsync_dir`. Basically, the log is rsync'd from the `dflogger` directory, and I can't find a good way to rsync it to a non-existent target directory without first creating that directory on the server. For the time being I'm dumping the logs into the `cloudsync_user`'s home directory on the server.
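One common workaround for the non-existent target directory is to override the remote rsync invocation with `--rsync-path`, so the receiving side creates the directory before the transfer starts. A sketch only, using the variable names from the proposal in this thread (server details are placeholders, and `RSYNC` is overridable purely so the command can be inspected without a live server):

```shell
#!/bin/sh
# Sketch: upload one log, letting the remote side create CLOUDSYNC_DIR
# first so a separate ssh/mkdir step is not needed.
upload_log() {
    filepath="$1"
    # --rsync-path replaces the rsync command run on the server; prefixing
    # it with "mkdir -p" creates the target directory if it is missing.
    $RSYNC -aHz \
        --rsync-path="mkdir -p $CLOUDSYNC_DIR && rsync" \
        "$filepath" \
        "$CLOUDSYNC_USER@$CLOUDSYNC_ADDRESS:$CLOUDSYNC_DIR"
}

RSYNC="${RSYNC:-rsync}"
```

This avoids a separate `ssh user@host mkdir -p …` round-trip before each upload, which matters if the upload is triggered automatically on the companion computer.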
I'm using pre-shared keys for authentication with the server. It's quite simple for someone to replicate the server setup if they want to manage their own logs, but I'm assuming that the end goal is to have one server used by many users and vehicles.
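The pre-shared key setup is a one-time step on the companion computer, something like the following (a sketch; the key path and server name are placeholders, not part of any existing APSync setup):

```shell
#!/bin/sh
# Generate a dedicated ed25519 keypair for log uploads.
# KEYFILE is a placeholder path; the public key would then be installed
# in the cloudsync user's authorized_keys on the log server.
KEYFILE="${KEYFILE:-$HOME/.ssh/id_cloudsync}"
mkdir -p "$(dirname "$KEYFILE")"
ssh-keygen -t ed25519 -N "" -f "$KEYFILE"
# Then, for example:
#   ssh-copy-id -i "$KEYFILE.pub" cloudsync_user@logserver.example.com
```

Using a dedicated key (rather than the user's default identity) makes it easy to restrict what that key is allowed to do on the server side.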
One idea is as follows (edit: added some details):

- This can all be automated from a webform served by the companion computer
- `ssh-keygen -t ed25519`
- `~/.ssh/known_hosts`
- The companion computer wants to rsync a log
- When a user wants to retrieve logs they have sent to the datalog server

Making this somewhat secure would be a bit of a challenge, but we could chroot-jail a user to the folder assigned for that particular one-time upload?
https://linuxconfig.org/how-to-automatically-chroot-jail-selected-ssh-user-logins
https://serverfault.com/questions/287578/trying-to-setup-chrootd-rsync
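On the server side, the chroot-jail idea would look roughly like this in `sshd_config` (a sketch following the linked articles; the group name and chroot path are assumptions):

```
# /etc/ssh/sshd_config fragment (sketch; names are assumptions)
Match Group logupload
    ChrootDirectory /srv/chroot/%u
    AllowTcpForwarding no
    X11Forwarding no
```

One caveat: for rsync-over-ssh to work inside a chroot, the rsync binary and its libraries must be present inside the jail (unlike `internal-sftp`, which is built into sshd). The linked serverfault thread discusses exactly this.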
Just ideas, but let me know your thoughts.
@davidbuzz mentioned some really good ideas with regard to making the process easier for users. I'll try to capture those ideas here, but in general:

- Significantly reduces the user setup effort
- www.apsync.cloud, their email and an optional unique vehicle ID
- The companion computer wants to rsync a log
- N/A
- When a user wants to retrieve logs that have been rsync'd with the datalog server
We need some preliminary, rudimentary log upload support. This will be superseded by something more intelligent later.
Currently the DataFlash logs are written to `~apsync/dflogger/dataflash`. We need to get those logs off the companion computer and onto a server somewhere (henceforth known as "the cloud").
Proposal:

- an `identity_file` to be used when trying to upload logs
- `dataflash-${VEHICLE_UNIQUE_ID}-$(date '+%Y%m%d%H%M%S')` (CLOUDSYNC_DIR)
- `rsync -aHz $FILEPATH ${CLOUDSYNC_USER}@${CLOUDSYNC_ADDRESS}:$CLOUDSYNC_DIR`
- the `$HOME/dflogger/dataflash-archive/$CLOUDSYNC_DIR` directory

Doing the files individually has the advantage that we will avoid pushing the same file up multiple times into differently-timestamped directories should the rsync transfer be interrupted by, e.g., the CC machine being rebooted. The lack of `--partial` on the receiver end should mean that we don't end up with partial logs on the server, at the expense of transferring the same data multiple times.
Note that simply running rsync directly on the `$HOME/dflogger/dataflash` directory will seldom succeed, as a file is constantly being written in that directory.
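The per-file approach could be sketched like this (a sketch only; the paths follow this thread, and `TRANSFER` is a stand-in for the real `rsync -aHz … user@host:dir` invocation): skip the newest file in the directory on the assumption that it is the one still being written, and push the remaining, completed logs one at a time.

```shell
#!/bin/sh
# Sketch: upload each completed DataFlash log individually, skipping the
# newest file on the assumption it is still being written to.
LOG_DIR="${LOG_DIR:-$HOME/dflogger/dataflash}"
TRANSFER="${TRANSFER:-rsync -aHz}"

upload_logs() {
    dest="$1"
    # Newest file first; treat it as in-progress and skip it.
    newest=$(ls -t "$LOG_DIR" | head -n 1)
    for f in "$LOG_DIR"/*; do
        [ "$(basename "$f")" = "$newest" ] && continue
        # Transfer each completed log on its own, so an interrupted run
        # only re-sends the file that was cut off.
        $TRANSFER "$f" "$dest" || return 1
    done
}
```

A more robust version would track which files have already been uploaded (the timestamped `CLOUDSYNC_DIR` naming in the proposal helps here), but the skip-the-newest heuristic covers the "file constantly being written" problem described above.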