Open ChristianGiupponi opened 6 years ago
I had the same problem and solved it by changing the configuration as follows:
.env:

```
AWS_DEFAULT_REGION=sgp1
AWS_BUCKET=YOUR_BUCKET
AWS_URL=digitaloceanspaces.com
```
config/filesystems.php:

```php
'driver' => 's3',
'version' => '2006-03-01',
'key' => env('AWS_ACCESS_KEY_ID'),
'secret' => env('AWS_SECRET_ACCESS_KEY'),
'region' => env('AWS_DEFAULT_REGION'),
'bucket' => env('AWS_BUCKET'),
'endpoint' => "https://".env('AWS_DEFAULT_REGION').".".env('AWS_URL'),
```
Don't forget to run `php artisan config:cache` if you have cached your configuration.
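For context, the snippet above belongs inside the `disks` array of `config/filesystems.php`. A minimal sketch of the full entry (the disk name `spaces` is my own choice, not something from this thread):

```php
// config/filesystems.php, inside the 'disks' array.
// Sketch only: the disk name 'spaces' is an assumption.
'spaces' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'), // e.g. sgp1
    'bucket' => env('AWS_BUCKET'),
    // DigitalOcean Spaces endpoint, e.g. https://sgp1.digitaloceanspaces.com
    'endpoint' => "https://".env('AWS_DEFAULT_REGION').".".env('AWS_URL'),
],
```

The key point is the `endpoint` entry: without it, the AWS SDK builds the usual `*.amazonaws.com` hostname instead of the Spaces one.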
@patrickriemer
Is this the setup you put in your backup-manager.php config file? Or does the backup-manager package pull from filesystems.php and override those credentials?
I get the concept, but I'm a little confused about the configs, since there is a separate config file for this package.
@patrickriemer I don't get it; the Backup Manager has its own config file, backup-manager.php, like:
```php
's3' => [
    'type' => 'AwsS3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'root' => '',
],
```
I have tried a custom one:
```php
'do_space' => [
    'type' => 'AwsS3',
    'key' => env('MEGALOBIZ_DO_KEY'),
    'secret' => env('MEGALOBIZ_DO_SECRET'),
    'region' => env('MEGALOBIZ_DO_REGION'),
    'bucket' => env('MEGALOBIZ_DO_BUCKET'),
    'endpoint' => env('MEGALOBIZ_DO_ENDPOINT'),
    'root' => 'backups/'.env('MEGALOBIZ_DO_BUCKET_ENV'),
],
```
But I received the error `Could not resolve host: megalobiz.s3.ny3.amazonaws.com`; it won't use the `endpoint` value, which of course is not a standard option (I added it to the list manually).
However, with the filesystems.php config I upload files just fine using the same .env variables.
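One way to narrow this down is to hit the Space directly with the AWS SDK for PHP, bypassing the package entirely. A sketch, assuming `aws/aws-sdk-php` is installed via Composer and reusing the `MEGALOBIZ_DO_*` variable names from the custom config above:

```php
<?php
// Connectivity check against a DigitalOcean Space, bypassing backup-manager.
// Sketch only: env variable names are taken from the 'do_space' config above.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client([
    'version'     => '2006-03-01',
    'region'      => getenv('MEGALOBIZ_DO_REGION'),
    // e.g. https://nyc3.digitaloceanspaces.com
    'endpoint'    => getenv('MEGALOBIZ_DO_ENDPOINT'),
    'credentials' => [
        'key'    => getenv('MEGALOBIZ_DO_KEY'),
        'secret' => getenv('MEGALOBIZ_DO_SECRET'),
    ],
]);

// If this lists objects without a "Could not resolve host" error, the endpoint
// and credentials are fine, and the problem is in how the package builds its client.
$result = $client->listObjectsV2(['Bucket' => getenv('MEGALOBIZ_DO_BUCKET')]);
foreach ($result['Contents'] ?? [] as $object) {
    echo $object['Key'], PHP_EOL;
}
```

If this works while the package still resolves an `amazonaws.com` hostname, that would confirm the package ignores the `endpoint` key in its own config.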
Any plan to support DigitalOcean Spaces? I have seen that it is S3-compatible, but the command for S3 will append Amazon's URL, which causes a problem:

```
artisan db:backup --database=mysql --destination=s3 --destinationPath=my-space/backups/ --timestamp="Ymd-Hi" --compression=gzip
```
This will produce an error: