schickling / dockerfiles

Collection of lightweight and ready-to-use docker images
https://hub.docker.com/u/schickling/
MIT License
848 stars 367 forks

mysql-backup-s3: custom S3 endpoints not working #133

Closed moracabanas closed 3 years ago

moracabanas commented 3 years ago

S3_ENDPOINT env variable is not working for some reason.

When I set the S3_ENDPOINT value to UpCloud's endpoint, e.g. https://example.fi-hel2.upcloudobjects.com, I inexplicably get an error like this:

upload failed: - to s3://example-bucket/backup/2021-04-26T154437Z.dump.sql.gz 
Could not connect to the endpoint URL: "https://s3.fi-hel2.amazonaws.com/example-bucket/backup/2021-04-26T154437Z.dump.sql.gz?uploads"

where the S3 client falls back to the default Amazon endpoint.

I checked the relevant backup.sh code fragment and didn't find any issues, but for some reason the endpoint is not being picked up:

copy_s3 () {
  SRC_FILE=$1
  DEST_FILE=$2

  if [ "${S3_ENDPOINT}" == "**None**" ]; then
    AWS_ARGS=""
  else
    AWS_ARGS="--endpoint-url ${S3_ENDPOINT}"
  fi

  echo "Uploading ${DEST_FILE} on S3..."

  cat $SRC_FILE | aws $AWS_ARGS s3 cp - s3://$S3_BUCKET/$S3_PREFIX/$DEST_FILE

  if [ $? != 0 ]; then
    >&2 echo "Error uploading ${DEST_FILE} on S3"
  fi

  rm $SRC_FILE
}

AWS_ARGS should be --endpoint-url https://example.fi-hel2.upcloudobjects.com, but it's not being applied.
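The endpoint-selection branch from backup.sh can be checked in isolation. This is a minimal sketch (the `build_aws_args` function name and the debug echo are mine, not part of backup.sh) that prints exactly what would be passed to the aws CLI:

```shell
#!/bin/sh
# Sketch of the endpoint-selection branch from backup.sh.
# In the image, S3_ENDPOINT defaults to the literal string "**None**".

build_aws_args () {
  if [ "${S3_ENDPOINT}" = "**None**" ]; then
    AWS_ARGS=""
  else
    AWS_ARGS="--endpoint-url ${S3_ENDPOINT}"
  fi
  # Debug output: show exactly what the aws CLI would receive
  echo "AWS_ARGS=${AWS_ARGS}"
}

S3_ENDPOINT="**None**" ; build_aws_args
# prints: AWS_ARGS=
S3_ENDPOINT="https://example.fi-hel2.upcloudobjects.com" ; build_aws_args
# prints: AWS_ARGS=--endpoint-url https://example.fi-hel2.upcloudobjects.com
```

If the second call prints an empty AWS_ARGS inside the container, the variable is not reaching the script at all (e.g. a typo in the env name); if it prints the endpoint correctly, the problem lies elsewhere.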

moracabanas commented 3 years ago

I made some progress debugging:

According to the MinIO docs, you should be able to do this:

aws --endpoint-url https://play.min.io:9000 s3 ls s3://mybucket

which in my case would be

aws --endpoint-url https://example.fi-hel2.upcloudobjects.com s3 ls s3://example-bucket

This is returning the following error:

Could not connect to the endpoint URL: "https://s3.fi-hel2.amazonaws.com/example-bucket?delimiter=%2F&prefix=&encoding-type=url"

but if I omit s3://example-bucket, leaving:

aws --endpoint-url https://example.fi-hel2.upcloudobjects.com s3 ls

it works and displays my buckets:

2021-03-09 16:07:36 example-bucket
2021-03-02 12:51:34 other-example-bucket

So, what am I missing here?
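As the follow-up below shows, the missing piece is the region: bucket-level operations consult it, so it can also be supplied explicitly on the CLI. A sketch using the placeholder values from this thread (fi-hel2 is UpCloud's region name, example-bucket is hypothetical):

```shell
# Supplying the region explicitly alongside the custom endpoint.
# Requires valid credentials for the target object store.
aws --endpoint-url https://example.fi-hel2.upcloudobjects.com \
    --region fi-hel2 \
    s3 ls s3://example-bucket
```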

moracabanas commented 3 years ago

SOLVED

First I tried upgrading and customizing the whole image, moving to the latest Alpine, Python 3, and the Python 3 awscli, with no behavior changes at all. (I should still upgrade the image, mainly for security reasons, and open a pull request, since there were no breaking changes for me 😅)

I was missing the S3_REGION=fi-hel2 env variable. Now that I've set it, it works:

# sh backup.sh
Creating dump for --all-databases from mariadb...
-- bla bla things --
Uploading 2021-04-27T025826Z.dump.sql.gz on S3...
SQL backup finished

It was very hard to figure out by trial and error, because the error log output is misleading: it shows the default AWS endpoint rather than the configured one.

I'm glad I found the solution.
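For reference, a minimal sketch of a configuration that should work given these findings. All values are placeholders; the env variable names are the ones the image documents, with S3_ENDPOINT and S3_REGION set together as described above:

```shell
# Hypothetical invocation of the image against UpCloud object storage.
# Replace credentials, bucket, and host names with real values.
docker run \
  -e MYSQL_HOST=mariadb \
  -e MYSQL_USER=root \
  -e MYSQL_PASSWORD=secret \
  -e S3_ACCESS_KEY_ID=key \
  -e S3_SECRET_ACCESS_KEY=secret \
  -e S3_BUCKET=example-bucket \
  -e S3_PREFIX=backup \
  -e S3_ENDPOINT=https://example.fi-hel2.upcloudobjects.com \
  -e S3_REGION=fi-hel2 \
  schickling/mysql-backup-s3
```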