kpfaulkner / azurecopy

copy blobs between azure, s3 and local storage
Apache License 2.0

Another unknown error #23

Open ibagit opened 7 years ago

ibagit commented 7 years ago

Ken, I performed a list of both my Azure Storage and S3 containers prior to attempting the copy from Azure to S3. Both listed contents as expected. Permissions are correct on both.

From Powershell: azurecopy.exe -i https://weststorage.blob.core.windows.net/icobackupwest/filename.bacpac -o https://backup.s3-website-us-west-2.amazonaws.com/ -blobcopy

Getting this when trying to copy:

Unknown error generated. Please report to Github page https://github.com/kpfaulkner/azurecopy/issues . Can view underlying stacktrace by adding -db flag.

Microsoft.WindowsAzure.Storage.StorageException: The remote name could not be resolved: 's3-website-us-west-2.blob.core.windows.net' ---> System.Net.WebException: The remote name could not be resolved: 's3-website-us-west-2.blob.core.windows.net'
   at System.Net.HttpWebRequest.GetResponse()
   at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
   --- End of inner exception stack trace ---
   at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
   at Microsoft.WindowsAzure.Storage.Blob.CloudBlobContainer.<>c__DisplayClassf.b__e(IContinuationToken token)
   at Microsoft.WindowsAzure.Storage.Core.Util.CommonUtility.d__0`1.MoveNext()
   at System.Linq.Enumerable.WhereSelectEnumerableIterator`2.MoveNext()
   at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
   at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)
   at azurecopy.AzureBlobCopyHandler.MonitorBlobCopy(String destinationUrl)
   at azurecopy.AzureBlobCopyHandler.StartCopyList(IEnumerable`1 origBlobList, String destinationUrl, DestinationBlobType destBlobType, Boolean debugMode, Boolean skipIfExists)
   at azurecopycommand.Program.DoNormalCopy(Boolean debugMode)
   at azurecopycommand.Program.Main(String[] args)

kpfaulkner commented 7 years ago

Hi, the S3 URL looks a little different to what I usually try. Will try and reproduce later today. Meanwhile, have you tried the command:

azurecopy.exe -i https://weststorage.blob.core.windows.net/icobackupwest/filename.bacpac -o https://s3-website-us-west-2.amazonaws.com/backup/ -blobcopy

Or, if you've setup the config file correctly, am wondering if the command:

azurecopy.exe -i https://weststorage.blob.core.windows.net/icobackupwest/filename.bacpac -o https://s3.amazonaws.com/backup/ -blobcopy

Would work?

I'll try those out myself later tonight, but if they work for you, please let me know.

kpfaulkner commented 7 years ago

Hi

Ok, just a couple of other notes. The -blobcopy flag is an optimisation that is only useful when copying TO Azure, which isn't what you're doing above. If you just want to perform "regular" copying (i.e. it copies from the source to your machine and then automatically uploads to the destination) then you don't need to supply -blobcopy.

Secondly, please do try the S3 URL as backup.s3.amazonaws.com or s3.amazonaws.com/backup/ and let me know how that goes.

Just to confirm something, is the bucket name you're really using just called "backup" ? Or do you have a more complex bucket name of "backup.s3-website-us-west-2" ?

Thanks

Ken

ibagit commented 7 years ago

Ken,

Moving the container name to the end of the string looks like it resolved the S3 issue, but it looks like it's now having an issue with Azure Blob Storage. I added the -db flag and received this:

GetHandler start
GetHandler retrieved azurecopy.AzureHandler
GetHandler start
GetHandler retrieved azurecopy.S3Handler
Copying blob to https://s3-website-us-west-2.amazonaws.com/backup
ReadBlob .bacpac
Unknown error generated. Please report to Github page https://github.com/kpfaulkner/azurecopy/issues . Can view underlying stacktrace by adding -db flag.

azurecopy.Exceptions.CloudReadException: AzureHandler:ReadBlob unable to read blob ---> azurecopy.Exceptions.CloudReadException: AzureHandler:ReadBlockBlob unable to read blob ---> Microsoft.WindowsAzure.Storage.StorageException: Stream was too long. ---> System.IO.IOException: Stream was too long.
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at Microsoft.WindowsAzure.Storage.Core.Util.StreamExtensions.WriteToSync[T](Stream stream, Stream toStream, Nullable`1 copyLength, Nullable`1 maxLength, Boolean calculateMd5, Boolean syncRead, ExecutionState`1 executionState, StreamDescriptor streamCopyState)
   at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
   --- End of inner exception stack trace ---
   at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
   at Microsoft.WindowsAzure.Storage.Blob.CloudBlob.DownloadToStream(Stream target, AccessCondition accessCondition, BlobRequestOptions options, OperationContext operationContext)
   at azurecopy.AzureHandler.ReadBlockBlob(ICloudBlob blobRef, String fileName)
   --- End of inner exception stack trace ---
   at azurecopy.AzureHandler.ReadBlockBlob(ICloudBlob blobRef, String fileName)
   at azurecopy.AzureHandler.ReadBlob(String containerName, String blobName, String cacheFilePath)
   --- End of inner exception stack trace ---
   at azurecopy.AzureHandler.ReadBlob(String containerName, String blobName, String cacheFilePath)
   at azurecopycommand.Program.DoNormalCopy(Boolean debugMode)
   at azurecopycommand.Program.Main(String[] args)

Steve


kpfaulkner commented 7 years ago

Hi

What's the size of the blob?

Thanks

Ken

ibagit commented 7 years ago

3.65 GiB, and this is one of the smaller ones. Others may be closer to 10 GiB. Database exports from Azure SQL.


kpfaulkner commented 7 years ago

Hi

Ok, that's probably the reason. I see that the MS Azure lib itself is the part that blew up. Let me investigate and see what I can sort out. I've gone to about 2G previously, am wondering if there is a limit over that. Will see what I can fix.
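(For anyone landing here later: a plausible explanation for the "Stream was too long" error, not confirmed in this thread, is that .NET's MemoryStream is backed by a byte[] indexed with Int32, so it can hold at most Int32.MaxValue bytes, roughly 2 GiB. A quick sanity check against the 3.65 GiB blob size mentioned above:)

```python
# Int32.MaxValue: the largest byte[] a .NET MemoryStream can be backed by.
INT32_MAX = 2**31 - 1            # 2,147,483,647 bytes, ~2.0 GiB

# The bacpac from this thread, converted from GiB to bytes.
blob_size = int(3.65 * 2**30)    # ~3,919,157,657 bytes

# The blob is well past the cap, so buffering it in one MemoryStream fails.
print(blob_size > INT32_MAX)     # → True
```

This would also line up with Ken's observation that copies up to about 2G worked previously.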

One thing you might want to try out while I investigate is my other project (basically a cross-platform version of AzureCopy). You might have better luck with larger blobs there (I don't have any huge blobs handy at the moment to test this with). Check out https://github.com/kpfaulkner/azurecopy-go/releases/tag/0.2.3 and grab the Windows AMD64 version. It's far more limited in functionality (still a work in progress) but the Azure/S3 side of things is pretty stable.

Meanwhile I'll keep looking into this issue.

Thanks

Ken

ibagit commented 7 years ago

Will do. Thanks Ken.

Steve


kpfaulkner commented 7 years ago

Hi

As for the AzureCopy that you're already using, could you add the following to the config file?

<add key="AmDownloading" value="true" />

I believe that should fix your problem. By default it caches the file into memory as opposed to storing it onto disk. I reckon 3.5G of memory for that might be a tad excessive ;)
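(For later readers: this is a standard .NET appSettings key, so it would go inside the appSettings section of azurecopy.exe.config. A minimal sketch, with the surrounding elements assumed from the usual .NET config layout rather than taken from the repo:)

```xml
<configuration>
  <appSettings>
    <!-- stream blobs via a temp file on disk instead of caching them in memory -->
    <add key="AmDownloading" value="true" />
  </appSettings>
</configuration>
```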

Please try that out and let me know how you go.

Thanks

Ken

ibagit commented 7 years ago

Does that imply that it’s downloading it to my workstation first, before transferring to S3? I definitely don’t want to do that.


kpfaulkner commented 7 years ago

Hi

Yes, I'm afraid that's the only option. If you're going TO Azure, then Azure itself provides a nice facility to handle data centre to data centre transfer without using ANY of your local bandwidth. No other (to my knowledge) cloud storage provider supplies that.

Of course, since you have accounts with both AWS and Azure, you could just create a VM on one of those systems and run the command from there. That way you can do it without hitting your own local ISP/download limits.

Ken

ibagit commented 7 years ago

OK, thanks Ken.
