Very cool!
It will take me a bit of time to find a moment to test this but I will do so as soon as I can.
I'm getting an error when trying to run the init-dev script to test locally. Not sure if it is a common problem, but I will likely look into it later.
```
Creating motuz_database_init_run ... done
/run/secrets
cat: read error: Is a directory
ERROR: 1
```
I tried to implement the pagination feature in https://github.com/nathanthorpe/motuz/tree/transfer_job_listing but cannot test it locally yet.
Try changing SECRETS_DIRECTORY (in deployment/docker/load-secrets.sh) to a directory that exists, or create one.
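If it helps, a quick workaround might look like this (the `/run/secrets` path comes from the error output above; the replacement path is just an example, not the script's actual default):

```sh
# Create the directory that load-secrets.sh appears to read from
# (path taken from the error output above):
mkdir -p /run/secrets

# Alternatively, edit SECRETS_DIRECTORY in deployment/docker/load-secrets.sh
# to point at a directory that already exists, e.g. (hypothetical path):
# SECRETS_DIRECTORY=/path/to/existing/secrets
```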
There seems to be an issue with the backend code. I generated an STS token and verified that I can upload/download from S3 on the command line, using the token plus temporary credentials, on the same machine where the test instance of Motuz is running. I created a cloud connection that uses a session token and verified that it got stored in the database.
I'm able to list buckets with that token, but trying to upload/download fails with:

```
ERROR : : error reading source directory: InvalidAccessKeyId: The AWS Access Key Id you provided does not exist in our records.
ERROR : Attempt 3/3 failed with 1 errors and: InvalidAccessKeyId: The AWS Access Key Id you provided does not exist in our records.
```
Looking in the Celery log, I see the command that was used, and it does not appear to include the session token:

```
motuz_celery | [2022-06-03 18:49:30,354: INFO/ForkPoolWorker-2] RCLONE_CONFIG_DST_TYPE='s3' RCLONE_CONFIG_DST_REGION='us-west-2' RCLONE_CONFIG_DST_ACCESS_KEY_ID='***LWXI' RCLONE_CONFIG_DST_SECRET_ACCESS_KEY='***' sudo -E -u ubuntu /usr/local/bin/rclone --config=/dev/null copyto /home/ubuntu/rclone-v1.55.1-linux-amd64.zip dst:/dtenenba-test/rclone-v1.55.1-linux-amd64.zip --progress --stats 2s
```
I took that command line, added RCLONE_S3_SESSION_TOKEN=&lt;token&gt;, and it worked, so it looks like we need to add that to the command line in Motuz when the cloud connection has a session token.
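For reference, the working invocation was essentially the logged command with the extra variable prepended (credentials elided; `RCLONE_S3_SESSION_TOKEN` is rclone's environment override for the s3 backend's `session_token` option):

```sh
RCLONE_CONFIG_DST_TYPE='s3' \
RCLONE_CONFIG_DST_REGION='us-west-2' \
RCLONE_CONFIG_DST_ACCESS_KEY_ID='<key-id>' \
RCLONE_CONFIG_DST_SECRET_ACCESS_KEY='<secret>' \
RCLONE_S3_SESSION_TOKEN='<token>' \
sudo -E -u ubuntu /usr/local/bin/rclone --config=/dev/null \
  copyto /home/ubuntu/rclone-v1.55.1-linux-amd64.zip \
  dst:/dtenenba-test/rclone-v1.55.1-linux-amd64.zip --progress --stats 2s
```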
Thanks
Oh ok hmm, I will try to see what is wrong now that I have the dev environment working.
Added subtype to the backend so editing works for S3, Azure Blob, and SFTP.
Adds support for S3 credentials that include a session token (from STS). This is useful if you get temporary credentials from a provider.
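For anyone testing this, one way to get such temporary credentials is via the AWS CLI (a sketch; any STS-issued credentials should behave the same):

```sh
# Request temporary credentials from STS; the response includes
# AccessKeyId, SecretAccessKey, and SessionToken, which map onto the
# new connection fields:
aws sts get-session-token --duration-seconds 3600
```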
For the UI, I don't know whether it would be better to have a connection type dropdown for this (like Azure has) or to just make the session token an optional field. It is currently implemented as a dropdown option.
Saves the subtype into the database so users can edit connections for types that have multiple methods (S3, Azure Blob, and SFTP).