STGSC closed this issue 1 year ago
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;
use Aws\S3\Exception\S3Exception;

function S3put($s3, $bucket, $vars = array()) {
    if (is_string($vars)) {
        if (file_exists($vars)) {
            $vars = array('SourceFile' => $vars);
        } else {
            return 'ERROR: S3put($s3, $bucket, $vars)';
        }
    }
    if (empty($vars['Bucket'])) {
        $vars['Bucket'] = $bucket;
    }
    if (empty($vars['Key']) && !empty($vars['SourceFile'])) {
        $vars['Key'] = $vars['SourceFile'];
    }
    if (empty($vars['Bucket'])) {
        return 'ERROR: no Bucket';
    }
    if (empty($vars['Key'])) {
        return 'ERROR: no Key';
    }
    if (empty($vars['ACL'])) {
        $vars['ACL'] = 'private';
    }
    try {
        $uploader = new MultipartUploader(
            $s3,
            $vars['SourceFile'],
            [
                'bucket' => $vars['Bucket'], // was hard-coded to $bucket, ignoring an explicit $vars['Bucket']
                'key'    => $vars['Key'],
                'acl'    => $vars['ACL'],    // was hard-coded to 'private', ignoring $vars['ACL']
            ]
        );
        $result = $uploader->upload();
        if (!empty($result['ObjectURL'])) {
            return 'OK: ObjectURL: ' . $result['ObjectURL'];
        } else {
            return 'ERROR: ' . $vars['Key'] . ' was not uploaded';
        }
    } catch (MultipartUploadException | S3Exception | Exception $e) {
        return 'ERROR: ' . $e->getMessage();
    }
}
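The argument handling at the top of S3put can be illustrated standalone. This is a sketch using a hypothetical helper named normalizeS3Args; the file_exists() check and the actual AWS calls are omitted so it runs without the SDK or credentials:

```php
<?php
// Sketch of S3put's parameter normalization (hypothetical helper,
// no AWS SDK required).
function normalizeS3Args($bucket, $vars) {
    if (is_string($vars)) {
        $vars = array('SourceFile' => $vars);   // a bare path becomes a parameter array
    }
    if (empty($vars['Bucket'])) {
        $vars['Bucket'] = $bucket;              // fall back to the default bucket
    }
    if (empty($vars['Key']) && !empty($vars['SourceFile'])) {
        $vars['Key'] = $vars['SourceFile'];     // default the object key to the source path
    }
    if (empty($vars['ACL'])) {
        $vars['ACL'] = 'private';               // private unless explicitly overridden
    }
    return $vars;
}

// A plain file path expands to a full parameter array:
$a = normalizeS3Args('my-bucket', '/tmp/backup.zip');
echo $a['Bucket'], ' ', $a['Key'], ' ', $a['ACL'], PHP_EOL;
// my-bucket /tmp/backup.zip private

// Explicit values are never overridden:
$b = normalizeS3Args('my-bucket', array('SourceFile' => 'a.txt', 'Key' => 'b.txt', 'ACL' => 'public-read'));
echo $b['Key'], ' ', $b['ACL'], PHP_EOL;
// b.txt public-read
```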
Added support for 'MultipartUploader', could you test?
(Un-comment lines 17 & 18 and keep/set the rest of your config as usual)
PS: let me know how it went (in general)
Thanks for the update. I'm testing it on a 2 TB Nextcloud project, but I'm not sure how to quickly verify that it works. I have to run $TEST=0 to see if it works after the database changes, and the wait is long.....
Yes it'll be a serious wait!
When the script runs and you see at most a warning here and there, and it (eventually :P) completes, then you should be able to simply set $TEST to zero. The database settings will then be converted to S3 usage.
It shouldn't fail, but if it does, simply restore the SQL backup (but first turn maintenance mode ON!) and you are reset to "local usage".. so your data should be safe :)
And, well, yeah.. you will only know it all went well when $TEST=1 has completed.. (and only then re-run with $TEST=0). It might take a few days ;) And if it hangs for some reason (that's why I built a simple progress indication), simply cancel it and restart.. it'll check what has been done and continue where it left off..
Let me know if you run into any (more) trouble
Why does running this script on a suspended project prompt "... on S3, but is older than local, upload..." for some files, and why are these files fixed, i.e. the same ones every time?
Hmm, if the upload succeeds then there must be a discrepancy in your database? The script uploads a file again if the timestamp of the file on disk is more recent than the timestamp of that file in your database..
Have you set $SHOWINFO = 1 and $TEST to anything but 0?
I suspect that's it, since it's the message "XX/YY.zz on S3, but is older than local, upload..."
You could do 'occ files:scan --all' (or set $DO_FILES_SCAN=1), also setting $DO_FILES_CLEAN =1 (occ files:cleanup) might be a good idea.. those are "check & clean" operations, that can help in cleaning up discrepancies..
AD: running those two ("set to 1") once should be enough.. once fixed, the discrepancies stay fixed..
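The re-upload rule described above can be sketched standalone. This is a hypothetical check, not the script's actual code; the real script compares the on-disk mtime against the one recorded in the Nextcloud database:

```php
<?php
// Hypothetical sketch: a file is re-uploaded when its on-disk
// modification time is newer than the timestamp stored in the DB.
function needsReupload($localMtime, $dbMtime) {
    return $localMtime > $dbMtime;
}

echo needsReupload(1700000100, 1700000000) ? 'upload' : 'skip', PHP_EOL; // upload
echo needsReupload(1700000000, 1700000000) ? 'upload' : 'skip', PHP_EOL; // skip
```

A stale database timestamp (the discrepancy mentioned above) makes this check fire for the same files on every run, which is what 'occ files:scan' / 'occ files:cleanup' repair.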
I didn't pay too much attention to this error, because it would just waste some traffic. All data is now synced and finished. Great, I found it working properly! If I want to migrate my Nextcloud web service, in theory I just need to import the SQL and overwrite config.php in the new environment, right?
Ehm, I don't understand that question completely..
If everything went well with $TEST=1, you can set $TEST=0.. then it'll do a final check & sync (which should go quick) and perform all the database changes to point your Nextcloud from local to S3 and the migration is completed!
AD: do keep in mind that the script will put your instance in maintenance mode! Users cannot use the instance while it is in maintenance mode!
Don't do this "by hand", the order "of things" is important! The script does that for you! There are a few ways you can use the script, so do carefully read what the script tells you.. sometimes you need to do something by hand (again, the order of things is important!!)
AD: this is "the big one".. if by any chance it fails, simply restore the backup and you will have reverted to a working local operation (and tell me what went wrong ;)
I have verified that the MultipartUpload is valid. Thank you again for providing this project.
Migration completed? You are now "live with S3"?
Yes, the migration is now complete, and the data is in S3.
Thank you very much for providing this project. I have encountered an issue where it prompts me that the file I am trying to upload exceeds the size limit. After reviewing the AWS documentation, I found that a single upload through the SDK supports a file size of up to 5 GB, but the SDK also provides a multipart upload method. I am looking forward to this feature being implemented in the project. Cheers!
https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/php/example_code/s3/MultipartUpload.php