Closed axelander closed 7 years ago
The WPOS3_SETTINGS value should always take precedence. However, this will only impact new uploads. Existing files will remain in the bucket they were uploaded to.
Alright! So I still need to do a search/replace on the db for the uploaded files? I'm thinking about using the AWS CLI sync command to transfer files between the different buckets and was hoping to avoid a db search/replace.
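For anyone who would rather stay in PHP than shell out to the AWS CLI, a bucket-to-bucket transfer can be sketched with the AWS SDK for PHP v2 (the SDK version the plugin bundles at this point). This is only an illustration: the bucket names, region, and credentials setup are placeholders, not anything from the plugin itself.

```php
<?php
// Sketch: server-side copy of every object from one bucket to another
// using the AWS SDK for PHP v2. Bucket names and region are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = S3Client::factory( array( 'region' => 'us-east-1' ) );

// Iterate over all objects in the source bucket...
foreach ( $client->getIterator( 'ListObjects', array( 'Bucket' => 'stage-bucket' ) ) as $object ) {
	// ...and issue a server-side copy into the destination bucket,
	// so the data never has to pass through this machine.
	$client->copyObject( array(
		'Bucket'     => 'prod-bucket',
		'Key'        => $object['Key'],
		'CopySource' => 'stage-bucket/' . rawurlencode( $object['Key'] ),
	) );
}
```

Note this copies the files only; the db references to the old bucket would still need to be updated, which is the search/replace problem discussed below.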
At the moment yes. We're adding a CLI command in version 1.3, which will allow you to migrate your library between different buckets.
We've received a few requests around multiple environments now, but we're not sure how best to solve the issue. The bucket migration tool may be useful, but it might not be the best solution. Can you describe your workflow? Any thoughts on how you would envision WP Offload S3 working with multiple environments?
At the moment I'm just trying out different approaches to get a nice workflow for developing with WordPress. I don't want to keep the uploads directory in Git, so I'm thinking about using WP Offload S3 in all environments, e.g. dev, stage, prod. This would make it easy for another developer to work on the same project. The databases for all environments are also hosted on a remote server. When pushing a project to stage/prod I would like to use Wordmove or similar to sync databases between environments, and the uploads would be synced with the AWS CLI sync command. The missing piece in this equation is that Wordmove can't search/replace arbitrary strings like the bucket name, so I made a fork of Wordmove to handle this.
So basically I want what you're introducing in 1.3; then I won't have to replace the bucket name in the db with this fork :)
Gotcha. Thanks for the insight.
We added our S3 config info to wp-config and added conditionals to determine which bucket is used.
// This is in wp-config.php. Setting these options here makes the GUI and find/replace on DB syncs unnecessary.
// Determine test or prod based on a custom server variable. For those without custom server variables, you can use $_SERVER['HTTP_HOST'] to determine the environment you are in.
$object_prefix = OUR_APP_NAME . '/' . ( isset( $_SERVER['SGK_ENVIRONMENT'] ) && 'prod' === $_SERVER['SGK_ENVIRONMENT'] ? 'prod' : 'test' ) . '/';
define( 'WPOS3_SETTINGS', serialize( array(
// S3 bucket to upload files
'bucket' => 'our-bucket',
// Automatically copy files to S3 on upload
'copy-to-s3' => TRUE,
// Rewrite file URLs to S3
'serve-from-s3' => FALSE,
// S3 URL format to use ( 'path', 'cloudfront' )
'domain' => 'path',
// Custom domain if 'domain' set to 'cloudfront'
//'cloudfront' => 'cdn.example.com',
// Enable object prefix, useful if you use your bucket for other files
'enable-object-prefix' => TRUE,
// Object prefix to use if 'enable-object-prefix' is 'true'
'object-prefix' => $object_prefix,
// Organize S3 files into YYYY/MM directories
'use-yearmonth-folders' => TRUE,
// Serve files over HTTPS ('request', 'https', 'http' )
'ssl' => 'request',
// Remove the local file version once offloaded to S3
'remove-local-file' => FALSE,
// Append a timestamped folder to path of files offloaded to S3
'object-versioning' => FALSE,
) ) );
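If you want entirely separate buckets per environment rather than a shared bucket with prefixes, the same wp-config approach can be adapted. A minimal sketch, reusing the hypothetical SGK_ENVIRONMENT server variable from the example above (the bucket names here are placeholders):

```php
<?php
// Sketch: pick a bucket per environment instead of (or alongside) an object prefix.
// 'SGK_ENVIRONMENT' and the bucket names are placeholders; adjust to your setup.
$is_prod = isset( $_SERVER['SGK_ENVIRONMENT'] ) && 'prod' === $_SERVER['SGK_ENVIRONMENT'];

define( 'WPOS3_SETTINGS', serialize( array(
	'bucket'        => $is_prod ? 'our-bucket-prod' : 'our-bucket-test',
	'copy-to-s3'    => true,
	'serve-from-s3' => true,
) ) );
```

Keep in mind the caveat from earlier in the thread: the bucket is saved per-attachment in the db, so switching buckets this way only affects new uploads; existing attachments still point at the bucket they were uploaded to.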
I also added a bit of custom code to automatically sync S3 buckets after a DB push/pull:
add_action( 'wpmdb_migration_complete', 'db_s3_transfer', 10, 2 );
function db_s3_transfer( $type, $location ) {
	global $amazon_web_services;

	// Grab the S3 client that WP Offload S3's bundled AWS SDK has already configured.
	$client = $amazon_web_services->get_client();
	$s3     = $client->get( 'S3' );

	// Register the s3:// stream wrapper so an S3 path can act as a "local" directory.
	$s3->registerStreamWrapper();

	// Copy everything under prod-bucket/prod-key into test-bucket under test-key.
	$s3->uploadDirectory( 's3://prod-bucket/prod-key', 'test-bucket', 'test-key' );
}
@ithakaben are you using the same bucket and just changing the $object_prefix depending on environment? I don't see how this solution handles uploads in stage being migrated to prod. The bucket name is saved as metadata in the db when uploading an image. As @A5hleyRich said, only new uploads are affected when changing the bucket (and I guess the same goes for object-prefix). I might be missing something in your example, but it feels like files you uploaded in stage will be placed in bucket/stage, and when in production they will still be served from that path, not bucket/prod.
@axelander we are using the same bucket and just changing the object prefix. The WP Migrate DB Pro settings include a find and replace to change instances of the object prefix in the db.
The function db_s3_transfer takes care of copying prod uploads to staging. This meets the needs of our particular project, but the last line can easily be refactored to accommodate your needs as well.
The purpose of the uploadDirectory() function is to sync content from a local directory to an S3 one, but you can manipulate it into moving content from one S3 location to another.
If you run registerStreamWrapper() first, you can choose an S3 path as your "local" directory (this will be the first arg). The second and third args are the bucket and object prefix of the destination.
This file sync via PHP, combined with the conditionals in wp-config, allows us to gitignore the uploads directory and use different S3 resources depending on the environment. (And if I misread everything, I hope the uploadDirectory() example is of some use anyway!)
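As a concrete example of refactoring that last line, pushing staging uploads up to production instead is just the mirror image. A standalone sketch against the AWS SDK for PHP v2 (bucket names and key prefixes are placeholders):

```php
<?php
// Sketch: the reverse direction -- sync staging uploads up to production.
// Uses the AWS SDK for PHP v2; bucket names and prefixes are placeholders.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = S3Client::factory( array( 'region' => 'us-east-1' ) );

// Registering the wrapper lets s3:// paths stand in wherever a
// filesystem path is expected, including uploadDirectory()'s first arg.
$s3->registerStreamWrapper();

// First arg: the "local" source directory (here an S3 path via the wrapper).
// Second and third args: destination bucket and key prefix.
$s3->uploadDirectory( 's3://test-bucket/test-key', 'prod-bucket', 'prod-key' );
```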
Thanks for the feedback everyone. For anyone looking for a consolidated resource with pros and cons of various approaches to this problem, please see this article on our blog: https://deliciousbrains.com/strategies-handling-large-wordpress-media-libraries-dev-staging-environments/.
If anyone is still experiencing a particular difficulty regarding working in multiple environments with Offload S3, feel free to open a new issue.
Hello,
I would like to have separate buckets for dev, staging and production. I tried defining WPOS3_SETTINGS and setting the bucket parameter from an env variable, but it seems like the first value defined is persisted to the database. Do I have to do a search/replace for the bucket name in the database, or does the defined WPOS3_SETTINGS bucket parameter take precedence so I can just ignore the db value?