Closed rohitkhatri closed 3 years ago
Hi, issues are not a support forum. If you can file a complete bug report with logs and a clear reproduction, I can look.
Happy to reopen with adequate info to investigate, but a one-sentence description is not actionable.
Hey, sorry for not being descriptive.
What I'm trying to do is copy all the files from one of the blob containers
of an Azure storage account. It has around 7,000 files, but when I run the script, it only transfers some of the files and then stops, so I have to run it again.
Is this how it's supposed to work, or should it run until all the items are copied from Azure to S3?
This is what my code looks like:
const toS3 = require('./lib');

toS3({
  azure: {
    connection: process.env.AZURE_CONNECTION_STRING,
    container: process.env.AZURE_STORAGE_CONTAINER_NAME
  },
  aws: {
    region: process.env.AWS_REGION,
    bucket: process.env.AWS_S3_BUCKET_NAME,
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
  }
});
Have you tried to reduce to a smaller test case?
It uploads a random number of files on each execution; it doesn't copy all of the items from Azure.
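One plausible cause of a partial copy (this is an assumption, not confirmed from the library's code) is Azure's blob-listing pagination: the list API returns at most 5,000 entries per call, and a tool that stops after the first page will miss the rest of a 7,000-file container. The sketch below shows the general pattern of draining every page by following continuation tokens; `listPage`, `listAllBlobs`, and the fake lister are hypothetical names used for illustration, standing in for an SDK call like Azure's segmented blob listing.

```javascript
// Sketch: drain a paginated listing completely by following continuation
// tokens, instead of stopping after the first page. `listPage` is a
// hypothetical stand-in for an SDK call that returns at most one page of
// entries plus a continuationToken (undefined once the listing is exhausted).
async function listAllBlobs(listPage) {
  const names = [];
  let token = undefined;
  do {
    const page = await listPage(token);  // fetch one page of results
    names.push(...page.entries);         // accumulate this page's blobs
    token = page.continuationToken;      // undefined when no pages remain
  } while (token);
  return names;
}

// Fake lister simulating 7,000 blobs served in 5,000-item pages, so the
// pattern can be exercised without real Azure credentials.
function makeFakeLister(total, pageSize) {
  return async (token) => {
    const start = token || 0;
    const end = Math.min(start + pageSize, total);
    return {
      entries: Array.from({ length: end - start }, (_, i) => `blob-${start + i}`),
      continuationToken: end < total ? end : undefined,
    };
  };
}

listAllBlobs(makeFakeLister(7000, 5000)).then((all) => {
  console.log(all.length); // 7000 — both pages were drained
});
```

If the library only ever fetches the first page, wrapping the copy in a loop like this (or simply re-running the script until the listing comes back empty) would explain why repeated runs eventually move everything.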
Can anyone help me here?